Can Unity really be reality?

Today, our group enjoyed the multitude of assets and design elements within Unity. We played around with different prefabs, researched new environmental designs, and added interesting and occasionally ridiculous assets to our barren virtual world. (Did you know you can add a free digital rubber horse mask? Everyone should check it out; it's incredible and incredibly absurd.)

While we discovered more of Unity's new and exciting design elements, we also ran into a logistical issue when one of my group members attempted to pick up an arrow to shoot at a target during a VR demonstration. When we asked Ben how to pick up objects, he said that the program did not support tactile actions and that another program had to be installed to allow the user to pick up and touch objects. Ben, along with Professor Serrano, added that the ability to touch objects was not required in the original project outline. This seemed to contradict the very intent of creating a virtual reality. As we discussed earlier this semester, our interpretation of reality relies on our sensory perceptions, including the ability to touch and feel objects. Without the ability to touch and directly interact with the objects in a Unity environment, the virtual reality feels more like a static museum exhibit. Of course, this is based on my own interpretation of what reality entails.
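
For what it's worth, grabbing objects in Unity VR usually comes from an interaction package layered on top of the base engine rather than from Unity itself. As a rough sketch (assuming Unity's XR Interaction Toolkit, which may or may not be the add-on Ben had in mind), making our arrow grabbable would mostly be a matter of attaching a grab component to it:

```csharp
// Sketch only: assumes the XR Interaction Toolkit package
// (com.unity.xr.interaction.toolkit) is installed and the scene already
// has an XR rig with controller interactors. Namespaces vary by version.
using UnityEngine;
using UnityEngine.XR.Interaction.Toolkit;

// Attach to the arrow prefab to make it grabbable with the VR controllers.
[RequireComponent(typeof(Rigidbody))]
[RequireComponent(typeof(BoxCollider))]
public class GrabbableArrow : MonoBehaviour
{
    void Awake()
    {
        // XRGrabInteractable is the component that lets an XR controller
        // pick the object up; adding it in the Inspector works just as well.
        if (GetComponent<XRGrabInteractable>() == null)
        {
            gameObject.AddComponent<XRGrabInteractable>();
        }
    }
}
```

If I understood Ben correctly, that is roughly the kind of add-on he was describing: the touch interaction isn't impossible in Unity, it just isn't there until you install and wire up something extra.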

Can a user be fully immersed in a Unity reality even if he or she cannot sensorially interact with the environment? And if the environment cannot replicate the sensation of touch, what can Unity offer instead that still contributes to the user's sensory experience in VR?
