Thursday, July 16, 2009

Cross Reality: When Sensors Meet Virtual Reality

Food for thought on where the world is heading. Thanks, Mashable.

Written by Richard MacManus / July 14, 2009 12:00 AM

During my recent visit to MIT in Cambridge, Massachusetts, I met with Joseph Paradiso, Associate Professor and Director of the Responsive Environments Group at the MIT Media Lab. He showed me some demos of what his lab is up to, focusing mostly on what is termed "Cross Reality": the point where sensor/actuator networks meet online virtual worlds.

Paradiso co-authored a paper that has just been published in the July-September 2009 issue of IEEE Pervasive Computing magazine. The paper outlines and analyzes Cross Reality experiments conducted within Second Life, the most popular virtual world, with some 15 million registered users. In this post we'll give you a layman's overview of the paper, because we think this trend is important to the Web's future.

What is Cross Reality?

Cross Reality is about connecting "location-specific 3D animated constructs" in virtual worlds to in-building sensor installations.

The paper notes that "the convergence of shared 3D virtual worlds with popular web-based data sources to form a 'Second Earth' has been broadly predicted." It's also been the topic of many a science fiction novel. So it's interesting to see the latest practical experiments in this "hyper reality."

It should be noted that there are already commercial applications. The paper points to IBM's visualization of datacenter operation and VRcontext's ProcessLife technology. The latter "uses high-fidelity 3D virtual replicas of real plants or factories to remotely browse and influence industrial processes in realtime."

Billowing Power Strips

In one of its projects, MIT created a Cross Reality environment called "ShadowLab": a Second Life map of the Media Lab's third floor, animated by data collected from a network of 35 "smart, sensor-laden power strips" (a.k.a. PLUGs). MIT chose power strips "because they are already ubiquitous in offices and homes," and because they already provide power and can be connected to a network.


Virtual DataPond in the Virtual Atrium (left) and a real DataPond in the real Media Lab Atrium (right)

MIT added other features to the power strips via expansion boards - such as motion sensors, temperature sensors, and memory cards for local data logging.
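
Neither the article nor the paper spells out the data path, but conceptually each PLUG periodically samples its sensors and pushes a reading to a bridge that animates the corresponding object in the Second Life map. Below is a minimal Python sketch of that flow; the endpoint URL, field names, and simulated values are our own illustrative assumptions, not details from MIT's implementation.

```python
import json
import random
import time
from urllib import request

# Placeholder endpoint for the bridge that animates the Second Life map;
# the real MIT wire protocol is not described in the article.
BRIDGE_URL = "http://example.org/shadowlab/update"

def sample_plug(strip_id: str) -> dict:
    """Simulate one reading from a sensor-laden power strip (PLUG)."""
    return {
        "strip": strip_id,
        "timestamp": time.time(),
        "power_watts": round(random.uniform(0.0, 600.0), 1),  # simulated load
        "motion": random.random() < 0.3,        # simulated motion-sensor hit
        "temperature_c": round(random.uniform(19.0, 27.0), 1),
    }

def publish(reading: dict) -> None:
    """POST a reading to the virtual-world bridge as JSON."""
    body = json.dumps(reading).encode("utf-8")
    req = request.Request(BRIDGE_URL, data=body,
                          headers={"Content-Type": "application/json"})
    request.urlopen(req)  # would fail against the placeholder URL

if __name__ == "__main__":
    # Print a sample reading locally rather than hitting the placeholder URL.
    print(json.dumps(sample_plug("plug-07"), indent=2))
```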

Ubiquitous Sensor Portals

MIT has also created a whole network of devices that map sensors to virtual worlds, called Ubiquitous Sensor Portals. There are currently 45 portals in the Media Lab, each one featuring a myriad of environmental sensors - such as motion, light and sound level, vibration, temperature, and humidity. Each portal also has a small touch-screen display and an audio speaker for user interaction.

The portals also act as base stations for an 802.15.4 network inside the lab, "enabling wireless communication with a variety of wearable sensors." Each portal has an extension into Second Life, allowing people to visit the Media Lab virtually. This isn't just a one-way process either; as well as affecting virtual worlds, portal interactions can push virtual phenomena into the user's physical space.


Two views of the virtual extension of a portal into Second Life: the first shows sensor data over time, the second streams real-time audio/video into the virtual world.
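
To make that two-way flow concrete, here's a rough Python sketch of a single portal's loop: physical sensor readings go out to its Second Life extension, and virtual events come back to be rendered on the portal's screen and speaker. The endpoints, payloads, and polling approach are hypothetical; the article doesn't describe the actual protocol.

```python
import json
import time
from urllib import request

# Hypothetical endpoints for one portal's Second Life extension; the real
# protocol is not described in the article.
PUSH_URL = "http://example.org/portals/42/sensors"
POLL_URL = "http://example.org/portals/42/events"

def read_sensors() -> dict:
    """Stand-in for the portal's environmental sensor suite."""
    return {"motion": False, "light_lux": 312, "sound_db": 41,
            "vibration": 0.02, "temperature_c": 22.8, "humidity_pct": 47}

def push_readings(readings: dict) -> None:
    """Outbound: physical sensor data animates the portal's virtual extension."""
    body = json.dumps(readings).encode("utf-8")
    req = request.Request(PUSH_URL, data=body,
                          headers={"Content-Type": "application/json"})
    request.urlopen(req)

def poll_virtual_events() -> list:
    """Inbound: virtual phenomena to render in the user's physical space."""
    with request.urlopen(POLL_URL) as resp:
        return json.load(resp)

def render(event: dict) -> None:
    """Stand-in for the portal's touch-screen display and speaker."""
    print(f"portal output: {event}")

if __name__ == "__main__":
    while True:
        push_readings(read_sensors())
        for event in poll_virtual_events():
            render(event)
        time.sleep(1)  # the real system is likely event-driven, not polled
```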

Mobile

MIT expects that handhelds and mobile devices will play an important role in future Cross Reality applications, "both as a source of data to animate their users' environments and avatars and as augmented reality terminals through which local sensor networks can be explored and programmed." The lab has already begun to experiment in this area, with a Star Trek-inspired device it calls the Tricorder and a newer device called the "Ubicorder." Both devices provide a real-time interface to sensor data.
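
As a rough illustration of what a Tricorder-style interface does, the Python sketch below scans for nearby sensor nodes and lists the strongest (and so probably nearest) first. The scan results are simulated, and the ranking-by-signal-strength heuristic is our own assumption about how such a browser might order nodes, not a description of MIT's devices.

```python
import random

def scan_nearby_nodes() -> list:
    """Stand-in for an 802.15.4 neighborhood scan; readings are simulated."""
    return [{"node": f"portal-{i:02d}",
             "rssi_dbm": random.randint(-90, -40),   # signal strength
             "temperature_c": round(random.uniform(19.0, 27.0), 1)}
            for i in range(5)]

def tricorder_view(limit: int = 3) -> None:
    """List the strongest (likely nearest) nodes first, Tricorder-style."""
    nodes = sorted(scan_nearby_nodes(),
                   key=lambda n: n["rssi_dbm"], reverse=True)
    for n in nodes[:limit]:
        print(f"{n['node']}: {n['rssi_dbm']} dBm, {n['temperature_c']} C")

if __name__ == "__main__":
    tricorder_view()
```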

MIT expects the mobile area of Cross Reality to expand rapidly "once smart phone Augmented Reality becomes better established." ReadWriteWeb has been following this trend closely; read our recent Augmented Reality analysis here and here.

Conclusion

The projects of the Responsive Environments Group at MIT are enabling real-world data, increasingly provided by sensor/actuator networks, to be plugged into both virtual and physical interfaces.

The group is still exploring both the technical and practical sides of this, so it's uncertain where it will lead in the commercial world. But we're confident it will fuel startup innovation in the coming year or two. Watch this space!


