The future is near. So near that you can see it.
Among a host of amazing sessions on 27 November, the first day of SIGGRAPH Asia 2017, was Artificial Intelligence Meets Virtual Reality and Augmented Realities.
The objective of the workshop was to invite scholars and practitioners to discuss synergies between VR and AR, AI and machine learning. A cross-disciplinary team of experts with backgrounds in computer science, human-computer interaction (HCI), psychology and cognitive sciences, culture and communication studies, design and art was gathered to develop this fascinating intersection.
Topics ranging from user experience, technologies, applications, methods, cultural implications and communication theories to artistic approaches were discussed from 9 am to 6 pm.
Part of this was a session by Yoshifumi Kitamura on Designing Interactive Content. He started off by explaining how the space around us is becoming more intelligent, reactive and aware of our actions, conditions and intentions. As examples, he cited air conditioners with sensors that detect where people are and direct airflow toward them, and TV screens that turn in the direction of the viewer's gaze.
He then asked the audience to imagine the space 10 years from then, and the scenario would include:
- A wall which changes its colour/texture according to a person’s physical condition, task, and other contents
- A table formed from wall or floor at a favourite place
- A monitor of a person’s physical condition and task engagement
- A window which can be used as a display
- Tele-communication using a hologram
These are dynamic aware interiors, which are more active, enjoyable, efficient and comfortable. Verbal and/or nonverbal behaviours are sensed through motion, bio-information and brain waves. The space, interaction and information are used to create the design of the interiors.
He demonstrated a dynamic aware Photoshop that sorts pictures very flexibly. As an example, Kitamura gathered photographs of famous places around the world on one screen: the Leaning Tower of Pisa, a San Francisco bridge and so on. Using metadata, Photoshop arranged all the pictures of the respective places on a world map.
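The core idea of the demo, grouping photos by their location metadata so each cluster can be pinned on a map, can be sketched in a few lines of Python. This is an illustrative sketch, not the actual demo's implementation; the photo records and place names here are invented for the example.

```python
from collections import defaultdict

# Hypothetical photo records with place-name metadata and coordinates;
# the real demo's data model was not described in detail.
photos = [
    {"file": "pisa_01.jpg", "place": "Pisa", "lat": 43.72, "lon": 10.40},
    {"file": "pisa_02.jpg", "place": "Pisa", "lat": 43.72, "lon": 10.40},
    {"file": "sf_bridge.jpg", "place": "San Francisco", "lat": 37.82, "lon": -122.48},
]

def group_by_place(records):
    """Cluster photos by the place name in their metadata, so each
    cluster can be rendered at its coordinates on a world map."""
    clusters = defaultdict(list)
    for rec in records:
        clusters[rec["place"]].append(rec["file"])
    return dict(clusters)

print(group_by_place(photos))
# {'Pisa': ['pisa_01.jpg', 'pisa_02.jpg'], 'San Francisco': ['sf_bridge.jpg']}
```

Each resulting cluster carries its coordinates via any member record, which is all a map view needs to place it.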
Coming to Impact of Shape of Tables, Kitamura said, “In interiors, shape of table can be changed from round to square to rectangle. The location of the table can be changed according to human activity.” He displayed videos showing the same, as well as how multiple tables can be used to set various choices.
Kitamura highlighted the importance of explicit interaction (gesture-based control) and implicit interaction (content-based control). “By using environment and virtual space, you can change the depth of the room’s wall according to mood or work.”
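The distinction between explicit and implicit interaction can be made concrete with a small sketch: explicit events are deliberate user commands (a gesture), while implicit events are cues sensed from context (a mood). The event names, mood labels and wall-depth values below are assumptions made up for illustration, not part of Kitamura's system.

```python
def handle_event(event, room):
    """Route an event as either an explicit command or an implicit cue,
    adjusting the room's virtual wall depth accordingly."""
    if event["type"] == "gesture":        # explicit: user-issued command
        if event["name"] == "swipe_up":
            room["wall_depth"] += 0.5     # deepen the virtual wall
    elif event["type"] == "context":      # implicit: sensed state
        if event["mood"] == "focused":
            room["wall_depth"] = 0.0      # flatten the wall for concentration
    return room

room = {"wall_depth": 1.0}
room = handle_event({"type": "gesture", "name": "swipe_up"}, room)
room = handle_event({"type": "context", "mood": "focused"}, room)
print(room)  # {'wall_depth': 0.0}
```

The same dispatcher shape extends naturally to other cues, such as bio-information or task engagement, feeding the implicit branch.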
He concluded by demonstrating a video of Light Tracer, a VR adventure game in which the player navigates a princess by casting a ray of light that the character follows.
With the help of videos and images, Kitamura gave a glimpse of the vast possibilities of AI in dynamic aware interiors, opening new windows onto the applications of AI in VR and AR.