
Mixed Reality + NASA JPL

 
 

Designing Mixed Reality Experiences for the Mars 2020 Rover Mission

 

Currently prototyping mixed reality experiences via HoloLens for the Mars 2020 Rover Mission in collaboration with NASA's Jet Propulsion Laboratory

JANUARY 2017 — PRESENT

TEAM / ALLISON CHAN, JANICE KIM, DILLON BAKER, EMILY PASTERNACK

"What would help is to put everyone in that same environment. It’s all about overlapping and finding the common areas that are worth exploring."

NAT GUY, LEAD UI DEVELOPER @ JPL

 

"The problem is that people have really different stakes in the process. Scientists focus on the science—they won't pay attention to the limitations of the machine. They need to have some sort of risk assessment with engineers, instead of wasting time hypothesizing about what can and can’t happen."

GREG QUETIN, EX-MECHANICAL ENGINEER @ JPL

 
 

ABSTRACT

The Mars 2020 Rover Mission control team at JPL faces a fundamental interaction problem—the team's two-dimensional interfaces cannot provide the level of spatial awareness needed to think through complex three-dimensional problems, like rover navigation. This is especially problematic when navigating uncharted terrain filled with geographical obstacles and occlusions, like steep slopes or sand pits. Spatial misjudgments can undermine the entire mission, and weighing these risks against the demands of the science is only possible through active dialogue and collaboration.

Our capstone explores how emerging mixed reality technologies like HoloLens can bring the Mars topography to life through scalable, interactive, three-dimensional simulations anchored to real space. By contextualizing the rover's spatial workflow in a shared immersive environment, mixed reality can help JPL's cross-functional teams align on how they make sense of complex, high-risk environments—and therefore articulate rover strategy with greater consensus and clarity.

 

OBJECTIVES

Analysis

How can we unlock spatial awareness of the Mars terrain by reimagining the physical and visual interaction model through mixed reality? How can we more effectively translate two-dimensional data into three-dimensional visualizations? How can this better prioritize navigation and synchronize risk analysis across roles and teams?

 

Communication

How can we help teams convey complex ideas to each other in the mixed reality environment? What tools and interfaces can we build to help teams organize and annotate visual data? How can this help teams with conflicting needs reach consensus on rover strategy?

 

Scalability

How can we leverage the holographic capabilities of mixed reality to free collaboration from physical constraints and allow teams to scale visualizations across different spaces, contexts, and use cases? How can we afford teams better control over the magnitude, legibility, location, and sharing of their data?

 

DOWNLOAD RESEARCH PROPOSAL

 
 
 
Here's a lil sneak peek...