Project Gum

Unity

Unity’s core belief was the driving force behind Project Gum

The world is a better place with more creators in it.

With the growing number of aspiring creatives and social influencers, we set out to explore how Unity’s real-time engine, paired with augmented reality, could empower that audience to make their own social creations.

Early AR explorations were aimed at placing objects into a person’s world: anchoring digital elements with physical attributes to create synthetic realities.

World Meshing. Fidelity observations of meshing topology with digital crayons.

Mesh Interactions. Observing the update speed between real and virtual objects.

Listening to our target audience, we heard one consistent request: the ability to change their existing environment. This led to texturing the world mesh with a video using a triplanar shader. Users could test the feature with videos we provided, but they could also use any image or video from their camera roll to tailor the results more closely to their own desires.

World Mesh Texturing

Video ‘Wrapping’ Surfaces
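To make the “wrapping” concrete, here is a minimal Unity C# sketch of the setup described above: a video (or a clip from the camera roll) is streamed into a RenderTexture and handed to the triplanar material applied to the world mesh. The component, the `_VideoTex` property name, and the material reference are illustrative assumptions, not the project’s actual assets.

```csharp
using UnityEngine;
using UnityEngine.Video;

// Streams a user-chosen video into a RenderTexture and assigns it to the
// triplanar material that textures the reconstructed world mesh.
// "_VideoTex" is a placeholder for the shader's texture slot.
public class WorldMeshVideoTexturing : MonoBehaviour
{
    [SerializeField] VideoPlayer videoPlayer;      // plays the clip or a camera-roll URL
    [SerializeField] Material triplanarMaterial;   // material shared by the mesh renderers
    [SerializeField] Vector2Int textureSize = new Vector2Int(1024, 1024);

    RenderTexture videoTarget;

    void Start()
    {
        // Route the video's frames into an offscreen texture.
        videoTarget = new RenderTexture(textureSize.x, textureSize.y, 0);
        videoPlayer.renderMode = VideoRenderMode.RenderTexture;
        videoPlayer.targetTexture = videoTarget;
        videoPlayer.isLooping = true;
        videoPlayer.Play();

        // The triplanar shader samples this texture along world-space axes,
        // so the video "wraps" floors, walls, and furniture without UVs.
        triplanarMaterial.SetTexture("_VideoTex", videoTarget);
    }

    // A video picked from the camera roll can be swapped in by file URL.
    public void SetSource(string fileUrl)
    {
        videoPlayer.url = fileUrl;
        videoPlayer.Play();
    }
}
```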

As we explored different video content, we also explored different blend modes and tiling options. Recognizing the burden of choice, we stuck to a simple mirrored repeat for the tiling while exposing only a small set of blend modes.

Videos with different tiling and blend modes.
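A rough sketch of how that tiling and blend-mode setup could be wired in Unity, assuming the wrapping shader exposes standard `_SrcBlend`/`_DstBlend` properties; the enum, property names, and values are illustrative rather than the project’s actual implementation.

```csharp
using UnityEngine;
using UnityEngine.Rendering;

// Applies a fixed mirrored-repeat tiling and one of a short list of blend
// modes to the video texture used for wrapping the world mesh.
public class VideoBlendSettings : MonoBehaviour
{
    public enum VideoBlend { Normal, Additive, Multiply }

    [SerializeField] Material wrappingMaterial;
    [SerializeField] Texture videoTexture;
    [SerializeField] Vector2 tiling = new Vector2(2f, 2f);

    public void Apply(VideoBlend blend)
    {
        // Mirror the texture at every repeat so seams are less noticeable.
        videoTexture.wrapMode = TextureWrapMode.Mirror;
        wrappingMaterial.mainTextureScale = tiling;

        switch (blend)
        {
            case VideoBlend.Additive:   // src + dst: brightens the room
                SetBlend(BlendMode.One, BlendMode.One);
                break;
            case VideoBlend.Multiply:   // src * dst: tints/darkens the room
                SetBlend(BlendMode.DstColor, BlendMode.Zero);
                break;
            default:                    // straight alpha blend
                SetBlend(BlendMode.SrcAlpha, BlendMode.OneMinusSrcAlpha);
                break;
        }
    }

    void SetBlend(BlendMode src, BlendMode dst)
    {
        wrappingMaterial.SetInt("_SrcBlend", (int)src);
        wrappingMaterial.SetInt("_DstBlend", (int)dst);
    }
}
```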

Answering the recurring request to add fire🔥, I worked on a prototype that placed a flat video on a 2D plane. With the video shot on a black background and the blend mode set to additive, the flat footage fused into the 3D scene, blending analog and virtual quite well.
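A sketch of how such a fire quad might be dropped onto a surface, assuming an AR Foundation raycast against the scanned environment and a prefab (quad + VideoPlayer + additive material) named here purely for illustration:

```csharp
using System.Collections.Generic;
using UnityEngine;
using UnityEngine.Video;
using UnityEngine.XR.ARFoundation;
using UnityEngine.XR.ARSubsystems;

// Places a flat "fire" video quad on a tapped surface. Because the clip sits
// on a black background and the material blends additively, the black pixels
// contribute nothing and only the flames appear over the real world.
public class FireQuadSpawner : MonoBehaviour
{
    [SerializeField] ARRaycastManager raycastManager;
    [SerializeField] GameObject fireQuadPrefab;   // quad + VideoPlayer + additive material

    static readonly List<ARRaycastHit> hits = new List<ARRaycastHit>();

    void Update()
    {
        if (Input.touchCount == 0 || Input.GetTouch(0).phase != TouchPhase.Began)
            return;

        // Raycast against detected planes / the reconstructed environment.
        if (raycastManager.Raycast(Input.GetTouch(0).position, hits, TrackableType.AllTypes))
        {
            Pose pose = hits[0].pose;
            var quad = Instantiate(fireQuadPrefab, pose.position, pose.rotation);
            quad.GetComponent<VideoPlayer>().Play();
        }
    }
}
```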

Body + Surface = Events

Inspired by the magical editing approaches of social creators like Kevin Perry & Zach King, I started to build prototypes that aimed to replace the editing process with a real-time effect using ARKit’s body and scene awareness features. By calculating the distance between a body joint and a surface element every frame, I could trigger an event the moment a threshold was crossed.

Body & Surface Trigger. Exploring the desired threshold of the body event trigger
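The trigger itself reduces to a distance check with a little hysteresis. A minimal sketch, assuming the tracking layer already exposes the body joint and the surface anchor as Transforms (component and event names are placeholders):

```csharp
using UnityEngine;
using UnityEngine.Events;

// Fires a one-shot event when a tracked body joint comes within a threshold
// distance of a surface point (e.g. an anchor on the scene mesh), then
// re-arms once the joint moves away again.
public class BodySurfaceTrigger : MonoBehaviour
{
    [SerializeField] Transform bodyJoint;      // e.g. a hand joint from body tracking
    [SerializeField] Transform surfacePoint;   // anchor on the meshed surface
    [SerializeField] float threshold = 0.15f;  // metres
    [SerializeField] UnityEvent onTriggered;   // effect to run (spawn, sound, etc.)

    bool armed = true;

    void Update()
    {
        float distance = Vector3.Distance(bodyJoint.position, surfacePoint.position);

        if (armed && distance < threshold)
        {
            onTriggered.Invoke();   // threshold surpassed: run the effect once
            armed = false;
        }
        else if (!armed && distance > threshold * 1.5f)
        {
            armed = true;           // hysteresis so the event doesn't re-fire while hovering
        }
    }
}
```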

Looking for ways a user’s interactions and performance can introduce objects into their environment.

Using ARKit Skeletal Tracking to trigger a CREATE event when the hands come into close proximity, then updating the position and scale of the created object while the hands are moving. When the hands come to rest, the object stops updating and the system returns to its neutral, ready state.

Body Triggers

ARKit Skeletal Tracking. Hand Proximity Spawn Trigger

Link to dice hand video (a short; wasn’t able to embed)
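A sketch of that create / update / release loop, assuming the skeletal-tracking layer drives two hand Transforms; the thresholds, names, and prefab are illustrative:

```csharp
using UnityEngine;

// State machine for the hand-proximity spawn: when both hands come close
// together a new object is created, its position and scale follow the hands
// while they keep moving, and it is released once the hands come to rest.
public class HandProximitySpawner : MonoBehaviour
{
    [SerializeField] Transform leftHand;
    [SerializeField] Transform rightHand;
    [SerializeField] GameObject objectPrefab;
    [SerializeField] float createDistance = 0.08f;   // hands "touching", metres
    [SerializeField] float restSpeed = 0.02f;        // m/s considered "at rest"

    GameObject current;
    Vector3 lastMidpoint;

    void Update()
    {
        Vector3 midpoint = (leftHand.position + rightHand.position) * 0.5f;
        float handGap = Vector3.Distance(leftHand.position, rightHand.position);
        float midpointSpeed = (midpoint - lastMidpoint).magnitude / Time.deltaTime;
        lastMidpoint = midpoint;

        if (current == null)
        {
            // CREATE: hands in close proximity spawn a new object between them.
            if (handGap < createDistance)
                current = Instantiate(objectPrefab, midpoint, Quaternion.identity);
            return;
        }

        // UPDATE: while the hands move, the object tracks their midpoint and
        // scales with the gap between them.
        current.transform.position = midpoint;
        current.transform.localScale = Vector3.one * Mathf.Max(handGap, createDistance);

        // RELEASE: once the hands are at rest, stop updating and re-arm.
        if (midpointSpeed < restSpeed)
            current = null;
    }
}
```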

Nostalgia takes over, and the idea of creating Polaroid pictures inside of AR emerges: capturing a still image from the live video and placing it inside a physics-enabled model of a photograph.

AR Photographs

AR Image Capture. Extracting cropped stills from the live camera.
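One way the capture-and-spawn could look in Unity C#, assuming a polaroid prefab with a Rigidbody and a child quad for the photo (all names are placeholders); cropping of the captured still is omitted here:

```csharp
using System.Collections;
using UnityEngine;

// Captures a still from the live camera view and drops it into the scene as a
// physics-enabled "Polaroid" (frame mesh + Rigidbody + a child quad named "Photo").
public class ARPolaroidCapture : MonoBehaviour
{
    [SerializeField] GameObject polaroidPrefab;
    [SerializeField] Transform spawnPoint;   // e.g. just in front of the device camera

    public void TakePhoto()
    {
        StartCoroutine(CaptureAndSpawn());
    }

    IEnumerator CaptureAndSpawn()
    {
        // Wait for rendering to finish, then grab the screen
        // (AR camera background included) as a texture.
        yield return new WaitForEndOfFrame();
        Texture2D still = ScreenCapture.CaptureScreenshotAsTexture();

        var polaroid = Instantiate(polaroidPrefab, spawnPoint.position, spawnPoint.rotation);

        // Put the captured still on the photo area of the frame model.
        var photoQuad = polaroid.transform.Find("Photo").GetComponent<Renderer>();
        photoQuad.material.mainTexture = still;

        // A gentle toss so the physics-enabled photograph tumbles into the scene.
        polaroid.GetComponent<Rigidbody>().AddForce(spawnPoint.forward * 1.5f, ForceMode.Impulse);
    }
}
```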
