Expressive faces

In 1992 I first read Neal Stephenson’s novel Snow Crash. I was influenced by many things in that book, but one thing that struck me in particular was the way the Metaverse (a term Stephenson coined for that novel) relied on good facial expression. In the story, Juanita, an expert on facial expression, implements the key feature that draws people to the on-line social platform — a way for people to accurately convey emotion and intention via the facial expressiveness of their avatars.

In 1996, I set out to build a version of what Juanita had created. I originally presented it as a SIGGRAPH 97 Technical Sketch, which was then incorporated into a traveling museum exhibit by the American Museum of the Moving Image, and was eventually used by kids on the autism spectrum to teach themselves how to interpret other people's facial expressions.

The original demo showed how a real-time autonomous virtual character could express convincing emotions, without using repetitive prebuilt animations, by mixing facial expressions over time to simulate shifting moods and attitudes. Unfortunately it stopped running on the Web in 2013, when Oracle, having acquired Java, disabled unsigned Java applets.

So I partially reimplemented it in JavaScript. My partial rebuild mostly shows how a minimal number of elements of facial expression can start to generate a convincing impression of character and personality. You can see for yourself by running THIS DEMO.
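The core idea of mixing facial expressions over time can be sketched in a few lines of JavaScript. This is not the demo's actual code; the expression names, control parameters, and the sine-wave weight driver below are all illustrative assumptions (a real system might drive the weights with coherent noise instead). Each expression is a set of facial control targets, and the face at any moment is a normalized weighted blend of them, so the character's mood drifts continuously rather than snapping between prebuilt animations.

```javascript
// Hypothetical sketch of time-varying expression blending.
// Expression targets and control names here are assumptions, not
// the original demo's data.
const expressions = {
  neutral:  { browLift: 0.0, mouthCurve: 0.0, eyeOpen: 0.7 },
  smile:    { browLift: 0.2, mouthCurve: 0.9, eyeOpen: 0.6 },
  surprise: { browLift: 1.0, mouthCurve: 0.1, eyeOpen: 1.0 },
};

// Smoothly varying weight for each expression, driven by time.
// A sine wave stands in for a more organic signal such as noise.
function weightAt(t, phase) {
  return 0.5 + 0.5 * Math.sin(t + phase);
}

// Blend all expressions into one facial pose, normalizing the
// weights so each control stays within its target range.
function blendedFace(t) {
  const names = Object.keys(expressions);
  const weights = names.map((_, i) => weightAt(t, i * 2.1));
  const total = weights.reduce((a, b) => a + b, 0);
  const face = { browLift: 0, mouthCurve: 0, eyeOpen: 0 };
  names.forEach((name, i) => {
    const w = weights[i] / total;
    for (const control in face) {
      face[control] += w * expressions[name][control];
    }
  });
  return face;
}

console.log(blendedFace(0.0));
console.log(blendedFace(1.5)); // a moment later, the mood has drifted
```

Sampling `blendedFace` every frame yields a face that never exactly repeats, which is one way to avoid the canned, looping feel of prebuilt animation clips.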

Ken Perlin
