This project uses Unity Sentis to detect and track objects in mixed reality. When the user grabs a detected object, a door is spawned based on the type of object they grabbed.
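The grab-to-door idea can be sketched roughly as follows. This is a minimal Python sketch for illustration only: the actual project is a Unity/C# app, and the class names and door types below are hypothetical, not taken from the project.

```python
# Hypothetical sketch: each detected object class maps to a door type
# that gets spawned when the user grabs the object. The mapping and
# names are illustrative, not the project's actual data.
DOOR_FOR_CLASS = {
    "chair": "wooden_door",
    "laptop": "sci_fi_door",
    "cup": "small_door",
}

def on_object_grabbed(detected_class: str) -> str:
    """Return the door type to spawn for a grabbed object class."""
    # Fall back to a default door for classes without a mapping.
    return DOOR_FOR_CLASS.get(detected_class, "default_door")
```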
- Download the APK from Releases
- Install the APK onto your Meta Quest 3
- Make sure you have activated the necessary permissions
- Make sure you have completed the environment setup for your room
  (Settings > Environment Setup > Set Up)
- Launch the app
  (Library > Unknown Sources > Object Recognition in MR)
- Make sure that Git LFS is installed on your machine. You can check by running:

  ```sh
  git lfs version
  ```

- Clone the repository:

  ```sh
  git clone https://github.com/mirkosprojects/object-recognition-mr.git
  ```
- Open the Unity project
- Go to `File > Build Profiles > New Android Profile`
- Click `Switch Profile`
- Click `Build`
- If prompted with the warning `Unsupported Input Handling`, select `Yes`
- Install the APK onto your Meta Quest 3
For debugging purposes, it can be useful to enable visualizations. The following visualizations are available:
- Bounding Boxes: Shows the bounding boxes of detected objects after every detection
- Object Tracker Markers: Shows the tracked objects in 3D space
- Hand Poses: Shows particle effects and text for detected hand poses
- MRUK Room Mesh: Shows the room outline
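As background for the bounding-box visualization: object detectors run through an inference engine typically output boxes in normalized image coordinates, which must be scaled to the camera resolution before drawing. The `(cx, cy, w, h)` layout below is an assumption for illustration; the actual tensor format of this project's model is not documented here.

```python
def to_pixel_box(norm_box, width, height):
    """Convert a normalized (cx, cy, w, h) box to pixel (x0, y0, x1, y1).

    norm_box: box center and size, each in [0, 1] (assumed layout).
    width, height: target image resolution in pixels.
    """
    cx, cy, w, h = norm_box
    # Shift from center/size to corner coordinates, then scale to pixels.
    x0 = (cx - w / 2) * width
    y0 = (cy - h / 2) * height
    x1 = (cx + w / 2) * width
    y1 = (cy + h / 2) * height
    return (x0, y0, x1, y1)
```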
To enable the Bounding Boxes visualization:
- In the Unity project, open the scene `Objectdetection Handgrab`
- Open `SentisInferenceManagerPrefab`
- Activate `Sentis Inference Ui Manager` and `Sentis Object Detected Ui Manager`
To enable the Object Tracker Markers visualization:
- In the Unity project, open the scene `Objectdetection Handgrab`
- Open `SentisInferenceManagerPrefab`
- Under `Object Tracker Manager`, go to `Marker Settings` and activate `Visualize Markers`
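For context on what the marker visualization shows: placing a marker for a tracked object in 3D space generally amounts to unprojecting a 2D detection point into 3D using the camera intrinsics and a depth estimate. The sketch below illustrates that geometry only; the intrinsics and depth values are made up, and the project itself does this through Unity's own APIs rather than code like this.

```python
def unproject(u, v, depth, fx, fy, cx, cy):
    """Unproject pixel (u, v) at a given depth (meters) to a
    3D point in camera space, using a simple pinhole model.

    fx, fy: focal lengths in pixels; cx, cy: principal point.
    """
    # Pinhole camera model: pixel offset from the principal point,
    # divided by focal length, scaled by depth.
    x = (u - cx) / fx * depth
    y = (v - cy) / fy * depth
    return (x, y, depth)
```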
To enable the Hand Poses visualization:
- In the Unity project, open the scene `Objectdetection Handgrab`
- Expand `Poses`
- Activate `PoseRecognizedVisuals`
To enable the MRUK Room Mesh visualization:
- In the Unity project, open the scene `Objectdetection Handgrab`
- Activate `EffectMesh`