Presentation of Brain Data in Everyday Life


What is the role of mental activity data in our daily lives? I investigated this by (1) tracking my own activity data and visualizing its multidimensional nature, and (2) creating an expressive work that reveals the nuances of data usually buried beyond words.

One source of mental data that is connected with every moment of life is our brain activity. What consumes our attention, and how might such brain data be used in everyday life? I (3) built a system for visualizing the most attention-grabbing moments in video clips according to EEG activity, and (4) speculated on the consequences of a world where brain data, like brain waves and functional imaging, is exchanged, mined, and a subject of marvel in mental sports.

Track

How are you doing? We get asked that question all the time, but in EVERYDAY LIFE we have no single-word answer for it. How we feel depends on a set of different questions we ask ourselves, including but not limited to “am I happy?”, “am I fulfilled?”, and “do I feel loved?” I tracked the multidimensionality of this mental feeling we call “mood” by asking myself questions related to states of well-being throughout the day. My goal was a visualization of this complex quantity that lets us intuit data from multiple streams, over multiple days, under multiple influences, following a trajectory full of nuanced semantics, all in a single understandable (interactive) display.

I tracked my responses to a set of 40 yes/no questions over 18 days (10 times a day), then averaged subsets of these questions to produce measures of my current state, including “forlornness,” “emotional balance,” “activity level,” and “positivity,” the last of which averages over all the questions. This sampling turned the binary questions into quantitative measures. Because it is difficult to visualize the evolution of these quantities both within a single day and across the 18 days of tracking, I created a circular plot of the four metrics, with 12 o’clock as the beginning of the day and the day proceeding clockwise. In the next iteration, to make these quantities more concrete, I chose different colors and shapes for their presentation: “emotional balance” is shown as clouds of yellow, because it tends to remain the same throughout the day, and “activity” is shown as circles of green, indicating active pursuits. For the subsequent iteration, I added a legend to make the quantities easier to understand, and labeled days with large discrepancies using the names of the people I was with when the measurements were made. I realized that certain folks made my life beautiful, and hence less lonely.
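The aggregation and clock-face layout described above can be sketched in a few lines. This is a minimal illustration, not my actual tracking code: the question names and metric subsets below are hypothetical stand-ins, and the angle convention simply follows the 12-o’clock-start, clockwise rule.

```python
import math

# Hypothetical question names; each metric averages a chosen subset of the
# binary (1 = yes, 0 = no) answers. "positivity" averages every question.
METRICS = {
    "forlornness": ["lonely", "isolated"],
    "emotional_balance": ["calm", "steady"],
    "activity_level": ["exercised", "worked"],
}

def metric_scores(sample):
    """Turn one dict of yes/no answers into the four quantitative metrics."""
    scores = {m: sum(sample[q] for q in qs) / len(qs) for m, qs in METRICS.items()}
    scores["positivity"] = sum(sample.values()) / len(sample)
    return scores

def clock_angle(hour, minute=0):
    """Map time of day to a clockwise angle, with midnight at the top (12 o'clock)."""
    frac = (hour + minute / 60) / 24   # fraction of the day elapsed
    return frac * 2 * math.pi          # radians, increasing clockwise
```

A sample taken at 6 a.m. would then be plotted a quarter-turn clockwise from the top of the circle, with its marker size or color driven by the metric scores.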

While the paper visualization gave a glance into my mood as a multidimensional visual showing how each day proceeded, it failed to give a sense of how the data was generated. For the interactive iteration, I made the circles evolve in time so that the data is generatively drawn at a desired speed and size. I also let the viewer filter the quantity they are interested in and play with the data by moving the mouse and clicking on the day of interest. Since each day also contained information on where I was, who I was with, and other variables that may predict mood, for the last iteration I showed only one circle, from day 1 to 18 sequentially, labeling the days where people contributed to my life and mood and plotting their names on a circle within the circle, showing their contribution to my “emotional balance.” The result is a sketch that narrates my “so-called” life as defined by “so-called” quantitative variables that vary in color and size over the 18 days of data collection. Perhaps by looking at this 4D visual you get a sense of how my “so-called” mood evolved throughout the process and what people contributed to it.

Remix

A poem comes from the mind of the poet. From the oral traditions of Homer to the printed page of Old English verse, poetry has evolved from a listening-speaking tradition to a physical-symbolic one. In our digital age, what is the manifestation of poetry in EVERYDAY LIFE? In attempting to create new media for poetry, I wrote a new poem about the science of memory based on the structure of a classic by Elizabeth Bishop, and used only HTML and CSS to add typographic, geometric, and interactive elements that support the semantics of the poetry remix. The remix is not only based on content: it surfaces the extra-poetic nature of language, emphases, influences, hidden messages, and subtle correlations within the poem, allowing us to understand and feel it on a new psychological level.

remix remixed: poetic invention based on Elizabeth Bishop’s “One Art.”

Poetry is no longer a stream of words. It is an interactive, multidimensional, nuanced work that goes beyond words, embedding psychological data about how to interact with it, what context it is in, how it is read, and how expression is tied to the way the poem is viewed. As such, one needs to view more than just a poem: variations of the poem in different forms with the same text. The sum total of different interactions with the text of a form becomes One Poem.

Feed

The amount of data we are inundated with in EVERYDAY LIFE is enormous. How do we selectively attend to just that which is important? It turns out our brain acts to filter out unwanted data. Can we get information on how that process occurs?

Our mental activity is a source of data that is with us every moment of our days and nights. The data of our brain activity is fundamental to how we process information, what we pay attention to, and how we organize it. Given the sheer mass of information available to us today, I wanted to gauge what types of information are actually salient to our brains: what captures our attention in this world of increasing data availability, where every idea has a precedent and every question already has an answer online. What is our brain filtering for?

I started by examining electroencephalography (EEG) data recorded noninvasively from people’s brains while they viewed short videos connoting different emotional content such as “fun,” “in love,” “melancholy,” and “horror.” This serves as a model for media like TV commercials and movie trailers, where people’s attention must be captured and maintained in a short time, such as 30 seconds. Which elements in a video lead to sustained arousal, and which to subconscious processing? It turns out that activity in the alpha and beta EEG frequency bands accounts for how subjects pay attention to stimuli. Since visualizing the EEG data in its raw time-varying form gave little information, I took the data from a publicly available dataset of EEG recordings of subjects viewing different music videos and calculated the wavelet transform of the time-varying EEG signal throughout the viewing trial. The wavelets provide a frequency spectrum of the signal over time in the video, which I correlated with characteristic frequencies known to code for dimensions such as attention, arousal, relaxation, and sleep. I found the segments of video where high beta activity (blue) is coupled with low alpha power (red), indicating a high level of attention, and displayed the video alongside the wavelet-generated frequency spectrum in real time, to show which portions of the video are salient.
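The band-power step above can be sketched with a complex Morlet wavelet, the standard choice for time-frequency EEG analysis. This is a simplified stand-in for the actual pipeline: `morlet_power` is my own helper, the 128 Hz sampling rate and the synthetic one-channel signal are assumptions for illustration, and a real analysis would sweep many frequencies per band rather than one.

```python
import numpy as np

def morlet_power(signal, fs, freq, n_cycles=6.0):
    """Instantaneous power at one frequency via convolution with a complex
    Morlet wavelet (Gaussian-windowed complex exponential)."""
    sigma_t = n_cycles / (2 * np.pi * freq)            # wavelet width in seconds
    t = np.arange(-3 * sigma_t, 3 * sigma_t, 1 / fs)
    wavelet = np.exp(2j * np.pi * freq * t) * np.exp(-t**2 / (2 * sigma_t**2))
    wavelet /= np.sqrt(np.sum(np.abs(wavelet) ** 2))   # unit-energy normalization
    conv = np.convolve(signal, wavelet, mode="same")
    return np.abs(conv) ** 2                           # power over time

fs = 128  # assumed sampling rate; adjust for your dataset
t = np.arange(0, 4, 1 / fs)
# synthetic stand-in for one EEG channel: a strong 10 Hz (alpha) component
# mixed with a weaker 20 Hz (beta) component
eeg = np.sin(2 * np.pi * 10 * t) + 0.5 * np.sin(2 * np.pi * 20 * t)

alpha = morlet_power(eeg, fs, 10.0)   # alpha band (~8-12 Hz)
beta = morlet_power(eeg, fs, 20.0)    # beta band (~13-30 Hz)
```

Stacking such power traces across a grid of frequencies yields the time-frequency spectrum that is displayed alongside the video.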

However, this form of visualization can be difficult to intuit for audiences who don’t understand frequency spectra. For the next iteration, I showed the video in 1.8-second segments, each represented by one image. The opacity of the image is determined by the attention signal (the ratio of peak beta to peak alpha, which is high in high-attention states). By looking at the finished sequence of images, you can intuit which parts of the video were salient (and thus have high opacity). As a final exercise, I transformed the video into elliptical pixels. The pixels are most visible when the attention signal is high, because the width and height of the drawn ellipses correspond to the optimal alpha and beta bands. In the ellipse-generated video, the frames are clearest when the attention signal is high, and all white or all black when attention is low, giving us a subjective feel for how the video is experienced in real time. When do we pay attention? Salient information can be as obvious as the appearance of a smiling individual, or as subtle as a turn in the video’s story. Using EEG to identify a clip’s optimal attention-grabbing moments gives us insight into how brain data in everyday life can predict behaviors like paying attention.
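The segment-opacity mapping can be sketched as follows. `attention_opacity` is my own helper, and the min-max normalization to [0, 1] is an assumption for illustration; the source only specifies that opacity follows the ratio of peak beta to peak alpha within each 1.8-second segment.

```python
import numpy as np

def attention_opacity(alpha_power, beta_power, fs, seg_sec=1.8):
    """Per-segment opacity in [0, 1] from the peak-beta / peak-alpha ratio."""
    seg = int(seg_sec * fs)
    n = min(len(alpha_power), len(beta_power)) // seg
    ratios = []
    for i in range(n):
        a = alpha_power[i * seg:(i + 1) * seg].max()
        b = beta_power[i * seg:(i + 1) * seg].max()
        ratios.append(b / a)          # high ratio = high attention
    ratios = np.array(ratios)
    # normalize so the most attention-grabbing segment is fully opaque
    return (ratios - ratios.min()) / (ratios.max() - ratios.min() + 1e-12)
```

Each segment’s representative frame (or its ellipse sizes, in the final iteration) is then drawn with the corresponding opacity.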

What If
