Here, put these on and feel what I feel
Technology being developed at the University of Auckland is letting people share each other's experiences in real time
Fifty years ago computers were the size of a house, but in the decades since, the technology has shrunk to fit into the palm of a hand and become thousands of times more powerful.
But despite technology becoming part of everyday life, there is still a divide between the physical and digital worlds. Our phone screens are windows into the digital domain, yet people looking at their phones often ignore the real world around them.
For decades computer scientists have been working on how to seamlessly blend the physical and digital worlds and make computers disappear. One way to do this is through Virtual Reality (VR), where a person wears a head-mounted display (HMD) and views 3D graphics that completely immerse them in a computer-generated world. However, VR also removes people from the real world entirely.
Another way to blend the physical and digital worlds is through Augmented Reality (AR), technology that aims to seamlessly superimpose virtual imagery over a user’s view of the real world. AR has three key properties: 3D virtual content appears over the real world, the content can be fixed in space, and the user can interact with it. Where VR tries to replace a person’s view of the real world, AR tries to enhance it and improve a person’s ability to act in it.
There are many possible applications of AR technology. For example, in the popular mobile phone game Pokémon Go, players can travel to different places and see virtual Pokémon characters standing in the real world before them. New Zealand company QuiverVision has developed an AR application that lets children colour in pictures and then see virtual characters come to life with the colours they have used. Finally, there are many mobile AR advertising applications in which digital marketing content is shown over real printed pages.
AR technology is even more powerful when viewed in a head-mounted display. Rather than holding up a mobile phone to see virtual content overlaid on a live camera view, the user can put on a see-through HMD and use both hands to interact with the 3D digital content that appears in front of them. A good example of this is the Microsoft HoloLens, a head-worn computer and display that uses cameras to track the user’s hands, enabling them to grab and move virtual objects with simple gestures.
One of the most exciting uses of AR is remote collaboration. Using this technology, remote people can appear in the real world in front of a person, or see through another person’s eyes. For example, the HoloLens Remote Assist application streams video from the HoloLens to a remote user’s desktop and lets them place graphics in the HoloLens user’s view to help them complete a real-world task. Using this application, a worker in a factory could fix a broken machine under the guidance of a remote expert who points at and draws on objects in the real world.
The Empathic Computing Laboratory, recently established at the University of Auckland, is using AR and other emerging technologies for even more advanced collaboration. Empathic Computing involves, among other things, research that allows one person to share the live experience of another. The aim is to let a person see what someone else is seeing and understand what they are feeling in real time. AR, wearable computing, and body-worn sensors are key technologies for making this happen.
One example project is the Empathy Glasses, a wearable display that combines see-through AR glasses, eye tracking, and facial expression recognition. The Empathy Glasses show the camera view from the AR glasses on the desktop of a remote person, along with exactly where the wearer is looking and what they are feeling. In this way, a remote helper knows exactly what the person is looking at and can help them with their work.
Previous research in collaboration has often focused on enabling remote people to communicate as easily as if they were face to face. Empathic Computing uses Augmented Reality to go beyond this, helping people see through someone else’s eyes and know what they are feeling. This could be extremely useful in the workplace, where a person may need expert help to do their job. For example, a mechanic trying to fix a new model of car could have an expert from the car company jump into their view and use AR to provide visual cues showing them what to do.
However, before this vision becomes a reality, significant research still needs to be done in a number of areas, such as how to capture and convey what someone is seeing, how to measure emotional state, and how best to share this with another person. Conducting this research at the Empathic Computing Laboratory will enable people to connect in ways never before possible.