The Star Trek: The Next Generation TV show introduced millions of people to the idea of a holodeck: an immersive and realistic 3D holographic projection of a complete environment that you can interact with and even touch.
In the 21st century, holograms are already used in a variety of ways, in fields such as medicine, education, art, security and defense. Scientists are still developing ways to use lasers, modern digital processors and motion-sensing technologies to create several different types of holograms that could change the way we interact.
My colleagues and I, who work in the Bendable Electronics and Sensing Technologies Research Group at the University of Glasgow, have now developed a hologram system that uses “aerohaptics” to create sensations of touch with jets of air. These air jets provide a feeling of touch on people’s fingers, hands and wrists.
Over time, this could be developed to allow you to meet a virtual avatar of a coworker halfway around the world and really feel their handshake. It might even be the first steps towards building something like a holodeck.
To create that touch feeling, we use affordable, commercially available parts to combine computer-generated graphics with carefully directed and controlled jets of air.
In some ways, it’s a step beyond the current generation of virtual reality, which typically requires a headset to deliver 3D graphics and smart gloves or wearable controllers to provide haptic feedback: stimulation that feels like touch. Most approaches based on wearable gadgets are limited to controlling the virtual object that is displayed.
Controlling a virtual object does not give the feeling you would get when two people are touching. Adding an artificial tactile sensation can provide an extra dimension without having to wear gloves to feel objects, and thus gives a much more natural feel.
Using glass and mirrors
Our research uses graphics that give the illusion of a virtual 3D image. It is a modern variation on a 19th-century illusion technique known as Pepper’s Ghost, which delighted Victorian theatergoers with visions of the supernatural on stage.
The system uses glass and mirrors to give the impression that a two-dimensional image is floating in space, without the need for additional equipment. And our haptic feedback is created with nothing but air.
The mirrors that make up our system are arranged in a pyramid with one side open. Users put their hands through the open side and interact with computer-generated objects that appear to be floating in the free space inside the pyramid. Objects are graphics created and controlled by software called the Unity Game Engine, which is often used to create 3D objects and worlds in video games.
Located just below the pyramid is a sensor that tracks the movements of users’ hands and fingers, along with a single air nozzle that directs jets of air towards them to create complex tactile sensations. The whole system is controlled by electronic hardware programmed to steer the nozzle. We developed an algorithm that allows the air nozzle to respond to users’ hand movements with the appropriate combination of direction and force.
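The core of that algorithm can be sketched roughly as follows: aim the nozzle at the tracked hand, and set the jet strength by how far the hand has pressed into a virtual object’s surface. This is a minimal illustrative model, not our actual implementation; the function name, coordinate frame and constants are all assumptions made for the example.

```python
import math

NOZZLE = (0.0, 0.0, 0.0)  # assumed nozzle position in tracker coordinates (metres)
MAX_FLOW = 1.0            # normalised maximum jet strength

def aim_and_strength(hand, ball_centre, ball_radius):
    """Return (pan, tilt, flow): angles (radians) that aim the nozzle at the
    tracked hand, and a jet strength proportional to how deep the hand has
    pressed into the virtual object's surface. Illustrative sketch only."""
    dx, dy, dz = (h - n for h, n in zip(hand, NOZZLE))
    pan = math.atan2(dx, dz)                    # left/right aiming angle
    tilt = math.atan2(dy, math.hypot(dx, dz))   # up/down aiming angle
    dist = math.dist(hand, ball_centre)
    penetration = max(0.0, ball_radius - dist)  # depth inside the virtual surface
    flow = min(MAX_FLOW, penetration / ball_radius)
    return pan, tilt, flow
```

A hand resting just outside the virtual surface gets no airflow; the deeper it presses in, the stronger the jet, so users feel a resistance that grows with contact.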
One way we have demonstrated the capabilities of the “aerohaptic” system is with an interactive projection of a basketball, which can be touched, rolled and bounced convincingly. The tactile feedback from the system’s air jets is also modulated based on the virtual surface of the basketball, allowing users to feel the rounded shape of the ball as it rolls from their fingertips when they bounce it, and the slap in their palm when it returns.
Users can even push the virtual ball with varying force and feel the resulting difference in the feeling of a hard bounce or a soft bounce in their palm. Even something as simple as bouncing a basketball required us to work hard to model the physics of the action and how to replicate that familiar feeling with jets of air.
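One simple way to capture that hard-versus-soft distinction is to scale the return puff of air with the speed of the user’s push, damped by a restitution factor as a real bounce would be. This is a toy model we offer only as an illustration; the function name and numbers are assumptions, not values from our system.

```python
def bounce_feedback(impact_speed, restitution=0.75, max_speed=3.0):
    """Map the speed of the user's push on the virtual ball (m/s) to a
    normalised air-jet pulse strength for the return bounce: a harder push
    yields a faster return and a stronger puff. Illustrative model only."""
    return_speed = restitution * min(impact_speed, max_speed)  # lossy rebound
    return return_speed / (restitution * max_speed)            # normalise to [0, 1]
```

A gentle tap maps to a faint puff, a hard slam to the strongest pulse the nozzle can deliver, which is enough to make the two bounces feel distinctly different in the palm.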
The smells of the future
While we don’t plan to deliver a full Star Trek holodeck experience in the near future, we are already boldly moving in new directions to add additional functions to the system. Soon, we hope to be able to change the temperature of the airflow to allow users to feel hot or cold surfaces. We are also exploring the possibility of adding scents to the airflow, deepening the illusion of virtual objects by allowing users to smell and touch them.
As the system expands and develops, we expect it to find uses across a wide range of industries. Providing more engaging video game experiences without having to carry bulky gear is a no-brainer, but it could also enable more compelling conference calls. You can even take turns adding components to a virtual circuit board while you collaborate on a project.
It could also help clinicians collaborate on treatments for patients, and make patients feel more involved and informed in the process. Doctors could see, feel and discuss the characteristics of tumor cells and show patients blueprints for a medical procedure.
Ravinder Dahiya receives funding from the Engineering and Physical Sciences Research Council (EP/R029644/1 and EP/R511705/1).