Augmented Reality — The Future of Immersive Interaction


When I applied for the EIT Digital Master School program, I didn’t expect the universities I got into to offer such a broad range of cutting-edge courses. Two such courses at Université Paris-Saclay were Advanced Immersive Interaction and Mixed Reality, where we got to experience the fascinating concepts of Virtual, Augmented and Mixed Reality. As is usual in the creative sector of Human-Computer Interaction and Design, the topics of the final projects could be chosen by the students. That, I think, is the best part of the education system: providing the scaffolding while giving students the freedom to decide what to build. The theoretical lectures were an inspiring journey, starting from the fundamentals, moving through applications of XR, and ending with discussions on how the coming technology wave will affect our near future.

Augmented Reality Riddle

It’s quite astonishing how our perception can mislead us. Our teacher (a fan of anime) asked a simple question: which do you think is the AR projection in this video? The dog, the bag, or maybe both of them?

Title: Den-noh Coil — Credits to Professor Jeanne Vezien

Just to be clear, the bag thudded when it touched the ground and the dog barked when the bag fell on it, but that is no help in this game: the technology uses spatial sound, so either character, illusory or not, would provide realistic spatial audio feedback. Notice that both girls are wearing glasses, so they can see beyond reality.

The answer is not as obvious as it seems… the bag passes through the dog, right? So the dog must be an illusion. True, but what about AR’s ability to detect an object’s surface and occlude it? Well… then what about the glitch that appears on the dog? The same happens to the bag, so it doesn’t prove which object is real. Maybe the bag’s behaviour, landing on both the seat and the ground like a rigid object? The dog rolls on the floor too, though. Let’s give it one last try: what about the shadows both characters cast? Would Augmented Reality objects cast shadows? According to this paper, it’s all a matter of rendering geometry from the observer’s perspective. AR can do a lot, and it will become present in more and more areas of our lives. But what it cannot do is hold physical objects inside a virtual surface, which finally solves the riddle: only one object in the video is real, and it is the bag. Indeed, what would be the benefit of an illusory container that could at most provide haptic feedback but would have no practical use?

Is The Future Here Yet?

It’s hard to imagine how much Mixed Reality can bring to our environment and what impact it will have on our daily lives. Things are changing: virtual and augmented reality technologies are adding another dimension to the user experience. Medical students can practice surgery in realistic conditions, designers can test multiple interior arrangements, and engineers can receive step-by-step instructions on how to repair a device. These are just a few of the endless applications of this upcoming, world-changing technology.

Thanks to the courses mentioned at the beginning, I realised how much potential there is to be unleashed when it comes to Mixed Reality. It was the first time I could try the Microsoft HoloLens 2 myself and experience what it is like to interact with virtual objects. With the Mixed Reality Toolkit (MRTK) we could test ready-made components for common interactions such as grabbing, touching, rotating, and sliding.
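To give a flavour of how little code those ready-made components require, here is a minimal sketch in Unity C# (assuming MRTK v2; the `flask` object and this helper class are illustrative, not code from our project):

```csharp
using Microsoft.MixedReality.Toolkit.Input;
using Microsoft.MixedReality.Toolkit.UI;
using UnityEngine;

// Sketch: turn any GameObject into a grabbable, touchable hologram
// by attaching MRTK's ready-made interaction components.
public static class InteractableSetup
{
    public static void MakeGrabbable(GameObject flask)
    {
        // A collider is required for both near and far interaction.
        if (flask.GetComponent<Collider>() == null)
            flask.AddComponent<BoxCollider>();

        // Enables grabbing with articulated hands (near interaction).
        flask.AddComponent<NearInteractionGrabbable>();

        // Handles one- and two-handed move/rotate/scale gestures.
        flask.AddComponent<ObjectManipulator>();

        // Lets the object receive poke/touch events.
        flask.AddComponent<NearInteractionTouchableVolume>();
    }
}
```

In practice you would usually add these components in the Unity inspector rather than from code, which is exactly what makes prototyping with the toolkit so fast.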


Immersive Interaction Project

Idea Development

For the final project of Advanced Immersive Interaction, after a brainstorming session with my dear teammate Hiba, we developed an AR Magic Chemistry Lab in Unity (a cross-platform game development engine). We also defined our User Profile:

  • age between 12–40,
  • knowledgeable about how virtual environments work,
  • likes Chemistry / Alchemy.

The Magic Chemistry Lab was initially a concept for a safe environment to test various chemical reactions, but that was dropped along the way due to asset limitations. Instead of a 1:1 reflection of realistic chemical behaviour, we settled for a more fantasy-like one where various unexpected interactions happen.


The prototype went through various changes from the first iteration. Since the time for this project was limited to seven weeks, the approach had to be effective regardless of obstacles. And there were many, starting with the Vuforia engine, which gave us unexpected errors when we tried to project objects onto physical markers. Later came assets incompatible with the HoloLens 2 and bugs in our scripts, not to mention the gazillion software adjustments we had to make to deploy the first build to the device.

One of the project requirements was to mix the real world with the virtual one, and since our plan A (Vuforia) didn’t work, we applied spatial recognition to the surfaces of the room so that they are recognized as rigid bodies. Luckily, this task was relatively easy and took us precisely one YouTube video to learn.
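For reference, most of this is configured in the MRTK profile rather than in code, but the code-side equivalent can be sketched roughly like this (assuming MRTK v2’s spatial awareness API; the class name is illustrative). The mesh observer scans the room and generates mesh colliders, so virtual objects land on real floors and tables instead of falling through them:

```csharp
using Microsoft.MixedReality.Toolkit;
using Microsoft.MixedReality.Toolkit.SpatialAwareness;
using UnityEngine;

// Sketch: make the scanned room geometry act as invisible rigid surfaces.
public class RoomAsColliders : MonoBehaviour
{
    private void Start()
    {
        // Grab the mesh observer that builds colliders from the room scan.
        var observer = CoreServices.GetSpatialAwarenessSystemDataProvider<
            IMixedRealitySpatialAwarenessMeshObserver>();
        if (observer == null) return;

        // Hide the wireframe but keep the generated mesh colliders,
        // so dropped flasks rest on the real floor.
        observer.DisplayOption = SpatialAwarenessMeshDisplayOptions.None;
        CoreServices.SpatialAwarenessSystem.ResumeObservers();
    }
}
```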


Having set up the environment, we proceeded to assemble the assets into a futuristic chemistry lab full of minerals, flasks, acid, fire, and so on. We used MRTK interaction components and scripts to make our elements grabbable, touchable, and pressable. The scripts made elements disappear when touched, expand in size, or break when dropped.
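These per-object behaviours are plain Unity scripts layered on top of the interaction components. As a hedged sketch (the class name, prefab field, and impact threshold are illustrative, not our exact code), a “breaks when dropped” behaviour can watch for hard collisions and swap the intact mesh for a shattered prefab:

```csharp
using UnityEngine;

// Sketch: shatter a mineral when it hits a surface hard enough.
public class BreakOnDrop : MonoBehaviour
{
    [SerializeField] private GameObject shatteredPrefab; // pile of fragments
    [SerializeField] private float breakSpeed = 1.5f;    // m/s impact threshold

    private void OnCollisionEnter(Collision collision)
    {
        // relativeVelocity is the impact speed between the two bodies.
        if (collision.relativeVelocity.magnitude < breakSpeed) return;

        // Replace the intact mineral with its shattered counterpart.
        Instantiate(shatteredPrefab, transform.position, transform.rotation);
        Destroy(gameObject);
    }
}
```

The disappear-on-touch and expand behaviours follow the same pattern, reacting to MRTK touch events instead of physics collisions.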


The application is an AR environment resembling a chemistry lab where you can experiment with interactive elements.

Interaction Table

The minerals on the table can interact with each other, creating visual effects such as flames or glow. Some components let you change their colour at the press of a button. Furthermore, an acid flask can be filled or emptied with a button, and the acid can also be used to melt minerals. A fire, once started, can later be put out with an extinguisher. Finally, a mineral dropped onto the table shatters into small pieces, and a balloon touched with the spray grows in size and pops.
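Under the hood, these pairwise reactions can be sketched as simple trigger checks (again illustrative, assuming each mineral carries a tag and an effect prefab; this is not our literal implementation):

```csharp
using UnityEngine;

// Sketch: spawn a visual effect when two reactive minerals touch.
public class ReactsWith : MonoBehaviour
{
    [SerializeField] private string reactantTag = "Sulfur"; // illustrative tag
    [SerializeField] private GameObject effectPrefab;       // e.g. flame particles

    private void OnTriggerEnter(Collider other)
    {
        if (!other.CompareTag(reactantTag)) return;

        // Spawn the flame/glow effect between the two minerals.
        var midpoint = (transform.position + other.transform.position) / 2f;
        Instantiate(effectPrefab, midpoint, Quaternion.identity);
    }
}
```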


All in all, the outcome of the project was quite satisfying, and we managed to implement most of the interactions we had planned to incorporate into the environment. Even though it is just a prototype, it could already be tested on the HoloLens 2 with a first batch of users to gather initial feedback. There are still some open questions about how to make the scenery more accessible and easier to take in, but given the time limitations and all the obstacles along the way, we consider this project a success.

Below you can see the scene and the interactions recorded in the Unity emulator.

Thank you, and see you wearing your new AR glasses in 10 years!




Natalia Świerz
