VIDEO: Nokia's Vision Of Mobile Mixed Reality
Wednesday, 09 September 2009 00:00
Written by Apocalypso.
As our lives become increasingly digital, and information about our environment becomes both more contextual and more readily available, we will soon want to interact with ever-growing amounts of information, and expect the capabilities of mobile technology to be delivered in more intuitive and convenient ways.
Rather than having to actively initiate a request for information or a service, we will want it to blend seamlessly into our daily routine, providing immediate feedback as we need it, without disrupting our current activity. The emergence of more powerful, media-centric smartphones is accelerating us toward this vision of "augmented" reality, where digital data from the network is visually overlaid onto our view of the real world in real time.
Any time you combine digital information with the real world, you end up somewhere within the spectrum of technologies collectively known as Mixed Reality. From enhancing online maps with real-world photos or other media, to interacting with a video game by simply waving your hands in the air, Nokia believes the line where digital information ends and the real world begins is becoming increasingly blurred.
Mixed Reality in Everyday Life
On one end of the Mixed Reality continuum is augmented reality, a technology that enhances the world around us by overlaying important data, usually in real-time. One can immediately bring to mind the image of a fighter pilot looking through his visor's Heads-Up Display (HUD) at a view of the sky enhanced with real-world information, such as target, altitude and horizon data. This type of technology has been used in specialized areas for years, but is now becoming much more commonplace, though many may not identify it as advanced technology in its everyday context.
In American football, viewers are now treated to a dynamic yellow line drawn across the playing field, which marks the first down line. Players pass over and around this line as if it were actually drawn on the field itself, but it is just an illusion created by banks of computers and geo-synced video cameras bolted to the stadium. In international football, or soccer, this technology can show on-pitch information such as the distance a defender must stand back from a free kick. The technology to do this is relatively new, but has become such an integrated part of the viewing experience that football fans of all types simply can't do without it.
On the other end of the Mixed Reality spectrum, video game makers have started using augmented virtuality in various ways to enhance the gaming experience. New controllers from companies such as Nintendo, Microsoft and Sony enable players to interact with games by tracking real movement in 3D space, providing a more intuitive, natural and overall fun gaming experience.
Mobile Mixed Reality
Researchers at Nokia have started pushing the boundaries of Mixed Reality by making it mobile. A phone becomes a “magic lens” which lets users look through the mobile’s display at a world that has been supplemented with information about the objects that it sees.
New phones incorporate a rich set of sensors (GPS location, wireless signal sensing, compass direction, accelerometer movement, plus sound and image recognition) that enable a new dimension of understanding and interacting with the world around us. Contextually tied to time, place and user, the information they provide will be invaluable. As with other mixed reality implementations, it won't be long before we can't live without it.
The various projects at Nokia Research Center that fall under the Mixed Reality umbrella represent an effort to capitalize on increasingly powerful mobile hardware to enable new ways of interacting with the world around us, in real time. The mobile phone can be used to connect the physical world with vast amounts of relevant online information by gathering rich sensor data and using it to contextualize and filter data depending on the user’s modality.
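Contextual filtering of this kind can be sketched in a few lines. The sketch below is purely illustrative (the `Context` record, the thresholds and the radii are hypothetical, not drawn from Nokia's actual system): sensor readings are folded into a simple context record, a modality is guessed from it, and that modality decides how widely the phone searches for relevant data.

```python
from dataclasses import dataclass

# Hypothetical sketch of sensor-driven context filtering.
# Field names and thresholds are illustrative assumptions.

@dataclass
class Context:
    speed_mps: float   # derived from GPS / accelerometer
    noisy: bool        # derived from the microphone

def modality(ctx: Context) -> str:
    """Guess the user's current modality from sensor-derived context."""
    if ctx.speed_mps > 8.0:
        return "driving"
    if ctx.speed_mps > 1.0:
        return "walking"
    return "stationary"

def filter_radius_m(ctx: Context) -> int:
    """Use a wider search radius when moving fast, narrower when still."""
    return {"driving": 2000, "walking": 500, "stationary": 100}[modality(ctx)]
```

The point of the design is that the user never asks for anything explicitly; the sensors decide how much of the online data stream is surfaced.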
Mixed Reality View
Mixed Reality View is a prototype application created by NRC that combines data from a variety of mobile phone-based sensors (e.g. camera, GPS, compass and accelerometer) to let users browse their surroundings for interesting or useful objects in a live heads-up camera view. Users simply point their phone's camera and look "through" the display, just as when shooting video. Objects of interest visible in the current view are highlighted on-screen, while the presence of peripheral objects is indicated by a top bar, giving users 360° awareness.
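The core geometric test such an application must perform can be sketched simply. This is not Nokia's implementation, just a minimal illustration under stated assumptions: given the user's GPS position, the compass heading, and an assumed horizontal camera field of view, compute the bearing to each point of interest and decide whether it lands on-screen or only on the peripheral top bar.

```python
import math

# Illustrative sketch: classify a point of interest (POI) as visible in
# the camera's field of view, or merely peripheral. The 60-degree FOV
# is an assumption, not a documented device parameter.

def bearing_deg(lat1, lon1, lat2, lon2):
    """Initial compass bearing from point 1 to point 2, in degrees."""
    phi1, phi2 = math.radians(lat1), math.radians(lat2)
    dlon = math.radians(lon2 - lon1)
    x = math.sin(dlon) * math.cos(phi2)
    y = (math.cos(phi1) * math.sin(phi2)
         - math.sin(phi1) * math.cos(phi2) * math.cos(dlon))
    return math.degrees(math.atan2(x, y)) % 360

def classify_poi(user_lat, user_lon, heading_deg, poi_lat, poi_lon,
                 fov_deg=60.0):
    """'on-screen' if the POI lies within the camera's horizontal field
    of view around the compass heading, else 'peripheral'."""
    b = bearing_deg(user_lat, user_lon, poi_lat, poi_lon)
    # Smallest signed angle between the heading and the POI bearing.
    delta = (b - heading_deg + 180) % 360 - 180
    return "on-screen" if abs(delta) <= fov_deg / 2 else "peripheral"
```

For example, with the user facing due north (heading 0°), a POI directly ahead classifies as on-screen, while one directly behind is relegated to the peripheral indicator.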
Objects can be gathered from existing Point-Of-Interest databases, or created by the user. They can be associated with physical objects, like buildings and monuments, or featureless spaces like squares and parks. Once selected by the user, objects provide access to additional information from the Internet and hyperlinks to other related objects, data, applications and services.
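A point-of-interest object of the kind described could be modelled along these lines. The field names, the example data and the URL are all hypothetical; this is only a sketch of how one record might carry a location, its source, and hyperlinks to related objects and services.

```python
from dataclasses import dataclass, field
from typing import List, Optional

# Hypothetical POI model: names, fields and sample values are
# illustrative assumptions, not Nokia's actual data schema.

@dataclass
class PointOfInterest:
    name: str
    lat: float
    lon: float
    source: str = "user-created"        # or an existing POI database
    info_url: Optional[str] = None      # additional information online
    related: List[str] = field(default_factory=list)  # linked objects

# A physical landmark drawn from a (hypothetical) POI database:
senate_square = PointOfInterest(
    name="Senate Square", lat=60.1699, lon=24.9524,
    source="poi-db", info_url="https://example.org/senate-square",
    related=["Helsinki Cathedral"])
```

The same record shape works for featureless spaces such as squares and parks, since only the coordinates and links are required.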
Strengthening Remote Ties
Bringing those who are far away to our sides is a future use case of mixed reality: holding up the mobile's camera so that the video is shared with friends and family online, complete with annotations of interesting places to see, and enabling those friends to provide immediate feedback on what they are seeing.
Mirror worlds will also continue to advance, with more information about the real world being integrated into our online virtual worlds. Meetings that currently take place in cyberspace may become just as accessible to participants in the real world as to those online.
Digital and Physical Fusion
Around the world, people are spending more and more time on the go, outside the home, school or office. Because of this, it's important to find better ways of interacting with our environment, using our mobiles to help manage life's complexities, connect locally, maintain social ties and make the world around us more transparent and enjoyable.
Through haptic feedback and other technologies just being invented, our interaction with the digital world, and with each other, will become more integrated and tangible: a shirt that hugs you so you can "feel" when a friend is close by, a tap on the shoulder from miles away, or shoes that vibrate when you take a wrong turn or walk into a bad part of town.
Entering the Final Phase
The realization of the Mobile Mixed Reality vision depends on continued innovation in both software and hardware. The systems being developed depend on advanced algorithms and capabilities not yet common in mobile devices, such as directional data, haptic feedback or heads-up displays. But what seems close to science fiction now is quickly becoming reality.
The result will be incredible advances in the way we all interact with our mobile devices. Voice, gesture and other multi-modal input methods, combined with new ways of seamlessly viewing information, will enable user interfaces with the potential to enhance the user experience far beyond what we have today.