BIT-SIZED EMPATHY

Escobedo’s comment on cultural connectedness underscores another tension at the heart of GIFs. Even though the GIF is a visual language that consists of gestures, the interaction seems to take place between a viewer and a screen. This raises questions about how GIFs might contribute to the social isolation that some believe modern technology can produce.

But the work being done at the MIT Media Lab suggests that it would be a mistake to dismiss GIFs as incapable of eliciting a genuine emotional response. At this lab, MIT graduate students Travis Rich and Kevin Hu aim to map the emotional language of GIFs. Hu and Rich have created a website called GIFGIF, which presents a user with two GIFs and asks which better expresses one of 17 emotions, such as pleasure, disgust, or surprise. “The emotive content of GIFs is very powerful, and people in our age group use them a lot, but they haven’t been approached scientifically,” says Hu, a first-year graduate student in the Media Arts and Sciences program. In Hu’s opinion, empathy is an essential component of the GIF’s unmeasured emotional potential. “How do we know that GIFs are an empathetic medium? Partially because people wouldn’t use [them] if they weren’t,” Hu says.
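
GIFGIF’s mechanic is essentially a pairwise-comparison survey: each vote says that, for a given emotion, one GIF expresses it better than another. The sketch below, in Python with made-up GIF identifiers and an Elo-style update, shows one simple way such votes could be turned into per-emotion rankings; it is an illustration under those assumptions, not Rich and Hu’s actual scoring method.

```python
from collections import defaultdict

K = 32  # step size for the rating update (illustrative value)

# scores[emotion][gif_id] -> rating; everything starts at 1500, as in chess Elo
scores = defaultdict(lambda: defaultdict(lambda: 1500.0))

def expected_win(rating_a, rating_b):
    """Probability that A beats B under the Elo model."""
    return 1.0 / (1.0 + 10 ** ((rating_b - rating_a) / 400.0))

def record_vote(emotion, winner, loser):
    """Update ratings after a voter says `winner` expresses `emotion` better than `loser`."""
    table = scores[emotion]
    gain = 1.0 - expected_win(table[winner], table[loser])
    table[winner] += K * gain
    table[loser] -= K * gain

# Example: three hypothetical votes for the emotion "surprise".
record_vote("surprise", "gif_cookie_monster", "gif_slow_clap")
record_vote("surprise", "gif_cookie_monster", "gif_eye_roll")
record_vote("surprise", "gif_eye_roll", "gif_slow_clap")
print(sorted(scores["surprise"].items(), key=lambda kv: -kv[1]))
```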

Rich, also a first-year doctoral student in the Media Arts and Sciences program, concurs with his colleague about the potential of GIFs to bring empathy to the internet. “GIFs are great for that because there are sites like ‘What Should We Call Me?,’ which let you realize, ‘Oh there is another human who at some point felt this way,’” Rich says.

The pair’s first goal is to create a text-to-GIF translator using their data, but their work already shows promise in exposing the potential of GIFs to tap into fundamental human emotions. One of their latest projects, MirrorMirror, created in collaboration with MIT Media Lab researcher Sophia Brueckner, is an expression-to-GIF translator that uses data from GIFGIF and a facial-feature-tracking library to analyze people’s facial expressions and respond with a GIF that imitates them.
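
The pipeline described above (track facial features, map them to an emotion label, return the GIF that scores highest for that emotion) can be sketched in a few lines. Everything in the snippet below, including the score table, the stub tracker, and the function names, is a hypothetical placeholder rather than MirrorMirror’s real code.

```python
# Hypothetical expression-to-GIF lookup; none of these names come from MirrorMirror itself.

# Emotion -> {gif_id: score}, the rough shape GIFGIF-style data might take.
EMOTION_SCORES = {
    "surprise": {"gif_cookie_monster": 0.91, "gif_gasp": 0.84},
    "happiness": {"gif_laughing_cartoon": 0.95, "gif_thumbs_up": 0.72},
}

def detect_expression(frame) -> str:
    """Stand-in for a facial-feature-tracking library; a real implementation
    would infer an emotion label from the camera frame."""
    return "surprise"  # fixed label so the sketch runs end to end

def gif_for_frame(frame) -> str:
    """Return the GIF id that best 'mirrors' the expression in a frame."""
    ranked = EMOTION_SCORES.get(detect_expression(frame), {})
    return max(ranked, key=ranked.get) if ranked else ""

print(gif_for_frame(frame=None))  # -> gif_cookie_monster
```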

Hu and Rich let me play with the aptly named MirrorMirror. The desktop screen is mirrored so that until I make my first facial expression in front of the web camera, all I can see is my own reflection. I decide to make a surprised expression and find myself face to face with Cookie Monster, who has just been surprised with a giant chocolate chip cookie. I laugh, and MirrorMirror responds with a laughing cartoon character from its database. “If you tilt your face down it will think you’re angrier,” Rich tells me. But all I can do is laugh at my animated reflections.

Perhaps to an outsider this would resemble a haphazard, piecemeal exchange between a human and a screen. It certainly doesn’t resemble how audiences used to interact with their media and each other. “There was a feeling during the ’60s and ’70s before cable hit that…when you were watching television in your own home, what you were watching was being seen by millions and millions of people at the same time,” Hlynsky says. “You could look out the window, and your neighbor’s window was flickering at the same frequency as the light in your living room. That doesn’t happen anymore.”

But projects like MirrorMirror and GIFGIF seem poised to magnify the power of the file format to simulate real human connection. Hu and Rich have found through color analysis that most of the GIFs at the top of the emotional categories are brown and pink in tone; in other words, flesh-colored. “The guess and hypothesis that we made from that was that the reason that the ones towards the top are more human is because humans more easily relate to another human in that GIF,” Rich says. “Whether it’s egotistical or not, we are wired to like ourselves and things that remind us of our species.”
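
For a rough sense of what that kind of color analysis could look like, the sketch below averages a GIF’s frames with Pillow and applies a crude brown/pink test; the specific threshold is an illustrative guess, not the researchers’ actual criterion.

```python
# Illustrative sketch only: average a GIF's frames and test for warm,
# flesh-like tones. Requires Pillow (pip install Pillow).
from PIL import Image, ImageSequence

def mean_rgb(path):
    """Average RGB over every frame of a GIF, downsampled to keep it cheap."""
    totals, count = [0, 0, 0], 0
    with Image.open(path) as gif:
        for frame in ImageSequence.Iterator(gif):
            for r, g, b in frame.convert("RGB").resize((32, 32)).getdata():
                totals[0] += r
                totals[1] += g
                totals[2] += b
                count += 1
    return tuple(t / count for t in totals)

def looks_flesh_toned(rgb):
    """Crude brown/pink test: red leads, blue trails, and the color is not too dark."""
    r, g, b = rgb
    return r > g > b and r > 100
```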
