Post Capture Filters
AR has primarily focused on content from the live camera. Post Capture filters instead use photos already in the camera roll, letting users be more deliberate about how they customize their content. I designed and drove the development of procedural tools that allow users to be creative and unique with their photos.
AR Expressive Characters
Role: Art Director, Designer
AR Expressive Characters was an initiative that I led at Facebook. The vision was to offer users a variety of memorable characters they could transform into and use to express themselves. The north-star goal was to make these character experiences fully customizable and unique, so users could apply them as personalized avatars on video calls. The effort served as a major cross-functional workstream for developing facial tracking and other advanced persistent features on Facebook's Spark AR platform.
Two of the initiative's biggest challenges were continually creating new and compelling characters and keeping the experiences engaging within tight tech specs (so they could run on low-end devices). That led to finding creative solutions in the base designs, as well as ways to procedurally build animations and materials.
There was a heavy focus on retention and increasing user engagement. To achieve that, it was extremely important that each experience felt unique and distinct from the one that preceded it. We built in narratives and interactions that led users to want many more of these experiences, as evidenced by user research and metrics.
User research played a major role in the development cycle of the AR Expressive Characters. Through user testing and feedback, we came to understand what users wanted and needed from these experiences. We identified the designs and styles users responded to most, as well as the pain points that guided our iterations. Iterations spanned narratives, visual styles, UX, and interactivity.
Werewolf was the very first Expressive Character, the one that started it all. The initial goals were to bring more of the male youth demographic to the Facebook AR platform and to push the development of Facebook's AR software, Spark. I worked closely with UXR and engineering to inform my design, which evolved from a dark, scary monster to a silly cartoon. The final design blends both ideas, aimed at mass appeal. User research indicated that users were hungry for visually rich and immersive AR experiences, and Werewolf provided that by serving as the testbed for developing facial tracking, animation, and visual quality on Spark AR. The Werewolf transformation helped bring the blend-shape feature to Spark and defined the topology for all future Spark face experiences.
Early Halloween Dragon Sketch Ideas
The goal was to create a more compelling character experience than Werewolf and to push the feature set of Spark AR. The biggest challenge Dragon faced was cramming all the features into a small file package. This led to generating the animated skin texture and fire effects procedurally with code (by the amazing tech artist Eric Larson). User research showed that people loved the accessories on Werewolf, so I brought accessories back. The north star was to offer users many more choices and customization options.
Insights from user research showed that users wanted more expressive characters and content that felt relatable. The goal with Pizza Face was to create a non-humanoid character and a new narrative for the facial transformation. In Pizza Face, users saw their character emerge from a slice of pizza that had been slapped onto their face; the character disappeared after users ate the pizza off their face. Pizza Face was the second character to use FACS and pushed users' overall ability to emote facial expressions. I partnered with the talented tech artist Josh Law on this project, and we used Pizza Face with engineering to improve the face-tracking technology on the Spark AR platform.
The goal with Zombie was to drive user engagement through expression. This character experience was the first to use additional blend shapes (FACS) to let users articulate more character emotion through their facial movements. The result was that users could emote through the character far more comprehensively than in the previous effects (Werewolf, Dragon); expressions such as puckering and puffing cheeks became possible.