Mirror, mirror on the wall, Who is the fairest one of all?

The answer: it all depends on the image.


In the last few years, virtual reality, including Second Life, has faded from the technology hype cycle. But as I recently discovered, there is still life in the old dog, as demonstrated in this lunchtime lecture, “Transformed Social Interaction in Virtual Reality,” from Harvard University’s Berkman Center for Internet & Society.

Stanford’s Professor Bailenson examines how social interactions can be altered in a virtual world, where a ‘digital filter’ can modify one’s actual behavior and present something quite different to an audience. For example, the system can render different behaviors to different members of an audience of avatars, or the same behavior to all of them. Or as he states (my paraphrase): “If I am chairing a meeting in the real world, I can only maintain eye contact with one person at a time; in a virtual meeting, I can maintain eye contact with everyone in the meeting.”

One of the most interesting experiments (see image above) draws on the well-known phenomenon that we tend to like people who look, act, or behave like us. One week before the 2004 presidential election, volunteers were shown one of three scenarios:

  1. Untouched photos of George Bush and John Kerry (the control group)
  2. An untouched photo of George Bush and an altered photo of John Kerry (a 60/40 morph: 60% of the untouched photo blended with 40% of the volunteer’s own facial features)
  3. An untouched photo of John Kerry and an altered photo of George Bush (the same 60/40 morph)

Why 40%? Anything below that figure is not perceived as “odd” by the volunteer. So what was the result? It is very, very scary.
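The study presumably used proper facial-morphing software that aligns features before blending, but the arithmetic of a 60/40 morph can be sketched as a simple weighted average of two aligned images. This is a hypothetical illustration, not the researchers’ actual pipeline; the `morph` function and the toy arrays are my own stand-ins.

```python
import numpy as np

def morph(candidate: np.ndarray, volunteer: np.ndarray, alpha: float = 0.4) -> np.ndarray:
    """Cross-dissolve two aligned face images: (1 - alpha) of the candidate's
    photo plus alpha of the volunteer's. alpha = 0.4 gives the 60/40 morph."""
    if candidate.shape != volunteer.shape:
        raise ValueError("images must be aligned and the same size")
    blended = (1.0 - alpha) * candidate.astype(float) + alpha * volunteer.astype(float)
    return blended.round().astype(np.uint8)

# Toy 2x2 grayscale "images" stand in for aligned face photos.
candidate = np.full((2, 2), 200, dtype=np.uint8)
volunteer = np.full((2, 2), 100, dtype=np.uint8)
print(morph(candidate, volunteer))  # every pixel: 0.6*200 + 0.4*100 = 160
```

A real morph also warps the geometry (eye spacing, jawline) toward the volunteer before blending pixels, which is what makes the 40% contribution hard to consciously detect.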


There is a 19% swing between the two morphed alternatives. Just by making a politician’s appearance look a bit like the voter, we could change the outcome of an election.

This capability to manipulate images and behavior (and to hide that fact from the other party) presents a significant ethical dilemma in virtual worlds. As with most things, it can be a force for good, but in the wrong hands it can be very dangerous. Many people already worry about social networks and behavioral targeting (advertising customized to your behavior). What is to stop advertisers from presenting ads on Facebook using pitchmen who deliberately share your characteristics in order to increase your propensity to buy?