RJDJ basically works like this: the phone records sensory data around you (sound and movement) and associates it with a “scene” of your choice. These scenes, each created by a different artist, are more like morphing soundscapes. They react to incoming noises and movements and, depending on the scene you’ve selected, respond differently, through both visuals and sounds. In a way, it’s a lot like a movie soundtrack scored in real time.
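To make the idea concrete, here’s a rough sketch of that mapping in Python: sensor readings go in, and the parameters driving the soundscape come out, with the same input producing different results per scene. The scene names and parameters below are invented for illustration; they aren’t RJDJ’s actual internals.

```python
def scene_params(amplitude: float, movement: float, scene: str = "echo") -> dict:
    """Map microphone loudness and motion magnitude (both normalized
    to 0..1) onto synthesis parameters for a chosen scene.
    Scene names and parameters are hypothetical, for illustration only."""
    if scene == "echo":
        # More movement -> longer delay; louder input -> more feedback.
        return {"delay_s": 0.1 + 0.4 * movement,
                "feedback": min(0.9, amplitude)}
    if scene == "drone":
        # Movement bends the pitch upward; loudness opens the filter.
        return {"pitch_hz": 110 * (1 + movement),
                "cutoff_hz": 200 + 4000 * amplitude}
    raise ValueError(f"unknown scene: {scene}")

# Identical input, different scene, different sound:
print(scene_params(0.5, 0.25, "echo"))
print(scene_params(0.5, 0.25, "drone"))
```

A real scene would run this mapping continuously against live audio and accelerometer streams, which is what makes it feel like a soundtrack scored on the fly.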
Thanks yet again to NGT. I strongly recommend checking out this demo, because it’s easier to appreciate when you see it in action.