ASTEROID.jpg

ASTEROID - 30 Ninjas

Figma + Unity

Role: Associate UX/UI Designer

Involvement: Designing world-space UI and gesture controls in XR, and implementing the UI

ASTEROID is a unique immersive experience created in partnership with a Doug Liman film, built for an emergent technology: the Samsung Galaxy XR headset. Doug Liman’s ASTEROID is a high-concept survival thriller centered on a group of people stranded in space after an asteroid mission goes wrong. As a post-movie interactive experience, ASTEROID casts the player as a Data Analyst piecing together data in a simulation in order to triangulate the location of a lone survivor and launch a rescue.

 

ASTEROID's greatest challenge was its gesture-based controls, which created many new challenges when designing user interactions. A major issue was gesture overlap, where similar gestures would accidentally trigger an unintended action. On this project I created the style guide and user storyboards, implemented the UI and navigational gestures, provided functionality, and ran several visual optimization investigations.

Player Menu Iterations
InitialWristMenu.png
Tablet.png
CondensedTablet.png
MenuOpening.gif

The Wrist Device Menu

Initially the player's menu was part of a wrist device that would open with a fist gesture and a turn of the back of the hand towards the user, as if checking a watch. This interaction replicated the real-world action of interacting with a wrist device and helped prevent gesture overlap because the action was intentional and specific.


In this mock-up, the menu would project over the wrist. The initial design's approach was a tactile one: allowing players to poke and pinch objects makes the experience more immersive overall.


The Tablet Menu

Following reviews with external stakeholders, the gesture was changed to replicate the Galaxy XR's default gesture (palm up and pinch). This action would be performed with the non-dominant hand, as the dominant hand was reserved for accessing the Galaxy XR's operating system menu.


While opening the menu adhered to the Android XR standard, this iteration retained the interaction of physically pushing the buttons down. This menu let the player read the text messages between the Handler character and the player, and access the movie player and database. The palm scanner was an interactive way to exit the menu: instead of a confirmation pop-up, users would place their hand over the scanner for a set duration. If they chose not to leave, they could simply move their hand away.
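As a rough sketch of how a hold-to-exit interaction like the palm scanner can work (the duration value and all names here are illustrative, not taken from the project's code):

```python
# Hypothetical sketch of the palm scanner's hold-to-confirm logic.
# HOLD_DURATION and the class/method names are assumptions for illustration.

HOLD_DURATION = 1.5  # seconds the palm must stay over the scanner

class PalmScanner:
    def __init__(self, hold_duration=HOLD_DURATION):
        self.hold_duration = hold_duration
        self.held_time = 0.0
        self.confirmed = False

    def update(self, palm_over_scanner, dt):
        """Advance the timer each frame; returns True once the exit is confirmed."""
        if self.confirmed:
            return True
        if palm_over_scanner:
            self.held_time += dt
            if self.held_time >= self.hold_duration:
                self.confirmed = True
        else:
            # Moving the hand away cancels the exit and resets progress,
            # replacing the need for a confirmation pop-up.
            self.held_time = 0.0
        return self.confirmed

    def progress(self):
        """Fill amount (0..1) the scanner can display as visual feedback."""
        return min(self.held_time / self.hold_duration, 1.0)
```

The key property is that cancelling is implicit: there is no "No" button, only the absence of the hand.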

Removing Tablet Features

In these iterations, the text messages between the Handler and the player were removed. Instead, the database where the player could review clues became nested within the player menu rather than living on a separate screen.


In these examples, locations could be selected by tabs or cycled through to show all the clues found in each area. The default database view would always be the player's current location, making it efficient to review notes.


We still wanted to retain a tablet-like appearance to boost the player fantasy as if there was a tool players could use as they investigate each area.

Final Pinch Menu

After reviews with external stakeholders, the menu was changed to fully replicate the Galaxy XR's raycast + pinch interaction. Because of this, the menu no longer needed to be within arm's reach. The UI art style also had to pivot following feedback.


The database seen in the previous iterations was cut and replaced with audio logs only. Since there were only a few audio logs, they no longer needed to be grouped by location; instead they became a separate menu.


Interactions were also changed to adhere to the Android XR raycast and pinch method, where the pointer finger projects a cursor onto the UI and a pinch selects objects. While this reduces the immersive, tactile experience, it is consistent with the Android XR system.
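At its core, projecting a cursor from the finger onto a flat UI panel is a ray–plane intersection. A minimal sketch of that math (the parameter names are assumptions, and the real system handles far more, such as hand-tracking noise and curved panels):

```python
# Illustrative sketch of placing a raycast cursor on a flat UI panel,
# in the spirit of Android XR's raycast + pinch interaction.

def dot(a, b):
    return sum(x * y for x, y in zip(a, b))

def ray_to_panel(ray_origin, ray_dir, panel_origin, panel_normal):
    """Return the cursor's world position on the panel plane, or None
    if the ray is parallel to the panel or points away from it."""
    denom = dot(ray_dir, panel_normal)
    if abs(denom) < 1e-6:
        return None  # ray parallel to the panel
    offset = [p - o for p, o in zip(panel_origin, ray_origin)]
    t = dot(offset, panel_normal) / denom
    if t < 0:
        return None  # panel is behind the hand
    return [o + t * d for o, d in zip(ray_origin, ray_dir)]

def try_select(cursor, pinching, hit_test):
    """A pinch selects whatever UI element the cursor is currently over."""
    if cursor is not None and pinching:
        return hit_test(cursor)
    return None
```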

Movie Player

Movie Player Iterations

In the early stages there was an emphasis on utilizing the movie to help find clues the player would use in the interactive experience. Navigating the movie was a crucial element.


Various accessibility options needed to be available, such as volume control, subtitles, and subtitle settings (size, background opacity, and language selection between English and Korean).


An additional challenge was integrating subtitles with the varying depths of a 3D movie. In future projects it would be beneficial to use eye-tracking technology so that the subtitles' depth dynamically updates with the eyes. This way the subtitles adjust with the user's visual cues.

videoplayer_mocks.png

Final Movie Player

After reviews, it was decided that the movie player needed to replicate YouTube XR's video player. The only difference was that chapter select needed to remain as discrete sections (due to technical constraints) rather than a progress bar users could scrub through.

We used a slider and track bar with segments to denote the different chapters.


This included highlighted areas with a thumbnail for each chapter. At one point the idea was to involve the film more, giving context and hints that would guide the player in their current location. This let players quickly navigate to the part of the movie that would help them progress.


In the third iteration, the chapter select was a panel that could be opened from the player, allowing the player to scroll through a list of thumbnails and chapter descriptions.


In this mock, the dots represent each chapter, since the film could not be scrubbed as we had previously assumed due to technical limitations. Here we denoted the film's progress by filling each dot, while still allowing the player to select a chapter.
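The dot behaviour above boils down to a small mapping from the playhead to per-chapter states. A minimal sketch, with made-up chapter timings and names:

```python
# Illustrative sketch of the dot-based chapter progress: given each
# chapter's start time (seconds), fill every dot the playhead has
# reached; pinching a dot jumps to that chapter's start. All values
# and names are assumptions for illustration.

def dot_states(chapter_starts, playhead):
    """One fill value per dot: 1.0 once the playhead has reached that
    chapter's start time, else 0.0."""
    return [1.0 if playhead >= start else 0.0 for start in chapter_starts]

def select_chapter(chapter_starts, index):
    """Chapter select jumps playback to the chosen chapter's start,
    since the film itself cannot be scrubbed."""
    return chapter_starts[index]
```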

Speaking to AI

In this mock-up the idea was that the player could switch between two AI characters to speak to, using the wrist device paired with a gesture. To solve switching between AI characters I created a gesture-based toggle: when the player's dominant hand hovered over the wrist device a toggle would appear, and swiping up with the thumb switched between the two characters. As feedback, the wrist device would change colour and name to communicate which AI character was selected.

The initial player tool was a wrist device that allowed for communicating with the AI and accessing the player menu. The user would hold their wrist up to their mouth to activate the microphone, then speak as if speaking into the wrist device. This ensured that the mic only recorded when the player intended it to, giving the player more agency.
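Under the hood, a wrist-to-mouth activation like this can be a simple proximity gate between two tracked points. A hypothetical sketch (the threshold distance and names are assumptions, not project values):

```python
# Hypothetical sketch of the wrist-to-mouth mic gate: the mic records
# only while the tracked wrist is within a threshold distance of the
# mouth. ACTIVATION_DISTANCE is an illustrative value.

ACTIVATION_DISTANCE = 0.15  # metres

def mic_active(wrist_pos, mouth_pos, threshold=ACTIVATION_DISTANCE):
    """True while the wrist device is held close enough to the mouth.
    Compares squared distances to avoid a square root per frame."""
    dist_sq = sum((w - m) ** 2 for w, m in zip(wrist_pos, mouth_pos))
    return dist_sq <= threshold ** 2
```

In practice a gate like this would also want hysteresis (a slightly larger release distance) so the mic does not flicker at the boundary.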

 


Frame 44.png

 

Later the AI characters were reduced to one, so toggling between characters was no longer needed. In this iteration the mic's toggle was controlled by the gesture: holding the wrist device near the mouth activated the mic, and the wrist device reflected that.

 


 

In our final iteration, stakeholders wanted a more accessible way to use the mic: an open mic. To make this pivot, the mic would be enabled at all times. However, to still retain player agency, we added a mute button to the player menu so there was a way to stop the mic from recording at any time.


Group 640.png
Player.png
Player-1.png
  • LinkedIn

©2022 by Kingsley Ip. Proudly created with Wix.com
