
Technology

Need for More Natural Man/Machine User Interfaces

Advances in the fields of 3D computer graphics, sensor technologies, and display technologies are converging to enable the development of highly immersive, fully interactive 3D simulation environments, commonly known as “Virtual Reality”. However, the ability to become physically immersed in these virtual worlds is currently limited by game-controller and window-based user interfaces that require character movements and actions to be controlled with the hands, for example by moving a joystick or mouse, or by pressing keys and buttons. Not only do these hand-based interfaces fail to take advantage of the full capabilities of the human body, but they also limit the number of degrees of freedom a user can control simultaneously. Another limitation is that users must typically face the television or computer screen while interacting with the computer or game system.

For full immersion, user interfaces need to support the fact that people can perform multiple tasks simultaneously with their hands, feet, and voice, as driving a car demonstrates: steer with one hand, change gears with the other; accelerate with the right foot, engage the clutch with the left; all while carrying on a conversation with a friend in the passenger seat. If the goal of virtual reality applications is to fully immerse the user in a computer-generated world, a more appropriate man/machine interface would allow users to control their character or avatar using sensorimotor responses that closely resemble the tasks and actions they would physically perform in the real world.

Virtual Locomotion Controller

The next generation of fully immersive “active games” and “exergames”, in the spirit of the Nintendo Wii and Wii Fit, will require user interfaces that can not only track user body movements, but also map them directly to the actions of an avatar in a 3D simulated world. To address this need, soVoz has developed an “embodied” user interface known as the Virtual Locomotion Controller (“VLController”). The VLController allows users to control characters using body movements that closely resemble the tasks and actions they would physically perform in the real world. By tracking the movements of the feet and waist, the VLController enables users to specify a character’s locomotion style by assuming a body posture normally associated with that type of movement (e.g. walking, running, or crouching), while at the same time controlling locomotion speed through foot forces and/or stepping motions, and movement direction through body orientation. As a result, the VLController provides an immersive user interface/game controller usable with a wide range of application software (e.g. training, simulation, gaming) and hardware platforms (PCs, embedded systems, and game consoles). In the current implementation, inertial orientation sensors are combined with ultrasonic range sensors and force-sensitive foot pads to track user body movements. However, the VLController design is not dependent on any particular sensor technology; any device that can provide body position or orientation data will do.
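
To make the mapping concrete, here is a minimal sketch in Python. It is a hypothetical illustration, not soVoz’s actual implementation: the BodySensor interface, the BodyState fields, and every threshold value below are assumptions; only the overall scheme comes from the description above (posture selects the locomotion style, foot forces and stepping cadence set the speed, waist orientation sets the direction, and all sensing hides behind a sensor-agnostic interface).

    from abc import ABC, abstractmethod
    from dataclasses import dataclass


    @dataclass
    class BodyState:
        """Pose data the controller consumes, regardless of sensor type."""
        waist_yaw_deg: float      # body orientation -> movement direction
        waist_height_m: float     # a lowered waist suggests crouching
        left_foot_force_n: float  # foot-pad forces -> locomotion speed
        right_foot_force_n: float
        step_rate_hz: float       # stepping cadence, if the sensors report it


    class BodySensor(ABC):
        """Sensor-agnostic interface: any device that can deliver a BodyState
        (inertial, ultrasonic, force pads, optical, ...) can plug in here."""

        @abstractmethod
        def read(self) -> BodyState: ...


    @dataclass
    class LocomotionCommand:
        """What gets handed to the avatar/animation layer each frame."""
        style: str        # "stand", "walk", "run", or "crouch"
        speed_mps: float
        heading_deg: float


    def map_body_to_locomotion(state: BodyState) -> LocomotionCommand:
        # Posture selects the style: a lowered waist reads as crouching.
        # (All thresholds are placeholders; a real system would calibrate
        # them per user.)
        crouched = state.waist_height_m < 0.8

        # Foot forces and stepping cadence set the speed.
        total_force = state.left_foot_force_n + state.right_foot_force_n
        speed = min(total_force / 400.0, 1.0) * (1.0 + state.step_rate_hz)

        if speed < 0.05:
            style = "crouch" if crouched else "stand"
        elif crouched:
            style = "crouch"
        elif state.step_rate_hz > 2.0:
            style = "run"
        else:
            style = "walk"

        # Waist orientation sets the movement direction.
        return LocomotionCommand(style=style, speed_mps=speed,
                                 heading_deg=state.waist_yaw_deg)


    if __name__ == "__main__":
        # Example frame: upright posture, moderate foot forces, slow steps.
        state = BodyState(waist_yaw_deg=15.0, waist_height_m=1.0,
                          left_foot_force_n=180.0, right_foot_force_n=160.0,
                          step_rate_hz=1.2)
        print(map_body_to_locomotion(state))  # -> style='walk', heading 15.0

Keeping the sensor interface separate from the mapping function is what would make such a design sensor-agnostic: swapping ultrasonic ranging for optical tracking only means writing a new BodySensor, while the locomotion mapping is left untouched.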
