
Summary

If an AR system can be thought of as one that combines real and virtual processes, is interactive in real time, and is registered in three dimensions, why do the majority of AR applications utilise primarily visual displays of information? I propose a practice-led compositional approach for developing ‘Multisensory Augmented Reality (MSAR) Experiences’, arguing that, as a medium that combines real and virtual multisensory processes, AR must be explored with a multisensory approach.

This project (polaris~) uses the open-source Project North Star HMD from Leap Motion, whose general documentation can be found in the resources section below. I am building the software in Unity3D using the Project Esky MRTK implementation, developed by Damien Rompapas (massive thank you for all the hours you have spent helping me with errors and bugs).
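To give a flavour of how a multisensory augmentation might be wired up in Unity, here is a minimal sketch of my own. It is hypothetical and not part of Project Esky or MRTK: it assumes the tracking rig exposes a hand-anchored GameObject with a trigger collider, and the "HandAnchor" tag and component name are illustrative. The idea is simply to pair a visual augmentation with a spatialised auditory one when the user's hand enters a virtual zone.

```csharp
using UnityEngine;

// Hypothetical example component: plays a spatialised sound when a
// hand-anchored collider (assumed to be provided by the tracking rig
// and tagged "HandAnchor") enters this virtual zone.
[RequireComponent(typeof(AudioSource))]
public class MultisensoryZone : MonoBehaviour
{
    [Tooltip("Clip to play when the hand enters the zone.")]
    public AudioClip enterClip;

    AudioSource source;

    void Awake()
    {
        source = GetComponent<AudioSource>();
        source.spatialBlend = 1f; // fully 3D, so the cue is registered in space
    }

    void OnTriggerEnter(Collider other)
    {
        // "HandAnchor" is an assumed tag on the hand-tracking proxy object.
        if (other.CompareTag("HandAnchor") && enterClip != null)
        {
            source.PlayOneShot(enterClip);
        }
    }
}
```

Attached to any GameObject with a trigger collider, a component like this lets a single augmentation address sight, sound, and (via the hand) proprioception at once, which is the kind of layering the MSAR approach is after.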

This page outlines my use of the system, which started around June of 2020 and is ongoing. To clarify, the original design has been open sourced by Leap Motion since 2018, but there have been a fair few community revisions and updates to the design (see more here). This page documents the development of the CombineReality Deck X version of the Project North Star HMD. CombineReality is run by Noah Zerkin, who has provided countless hours of support to my own project, so thanks Noah! He is also pretty much the only inexpensive source for the electronic parts needed for the headset.



Leap Motion video from 2018 showcasing through-combiner footage of the robust hand tracking in North Star

Inspiration & Rationale -->



Resources

Headset Documentation: Project North Star

Community: Project North Star Discord Server

Repository: Project Esky Renderer