By simulating motion physics and collisions, we can move the robot realistically through our virtual world. Motion is governed by the forces applied to actuators such as motorized wheels and joints, and rigid-body physics dictates the consequences of collisions with the environment.
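As a rough illustration of force-driven motion, the sketch below integrates a differential-drive robot under wheel forces using simple Euler steps. It assumes point-mass dynamics, and all names (`step`, `wheel_base`, the force parameters) are hypothetical, not part of any existing API.

```python
import math

def step(state, left_force, right_force, mass=5.0, wheel_base=0.4, dt=0.01):
    """Advance (x, y, heading, v, omega) one tick under wheel forces."""
    x, y, heading, v, omega = state
    # Linear acceleration from the sum of the wheel forces.
    a = (left_force + right_force) / mass
    # Angular acceleration from the force differential across the wheel base,
    # using a rod-like moment of inertia as a crude approximation.
    inertia = mass * wheel_base ** 2 / 12.0
    alpha = (right_force - left_force) * (wheel_base / 2.0) / inertia
    v += a * dt
    omega += alpha * dt
    x += v * math.cos(heading) * dt
    y += v * math.sin(heading) * dt
    heading += omega * dt
    return (x, y, heading, v, omega)

state = (0.0, 0.0, 0.0, 0.0, 0.0)
for _ in range(100):  # one simulated second
    state = step(state, 1.0, 1.0)  # equal forces: straight-line motion
```

A real implementation would hand this job to a physics engine, but the structure is the same: forces on actuators in, an updated rigid-body state out, every tick.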
Ultrasonic sonar sensors are simulated with the same geometric collision system, enabling the development of collision-avoidance algorithms as well as environmental mapping.
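One common way to realize this geometrically is to model a sonar ping as a ray cast against the obstacle geometry. The sketch below, a simplification using circular obstacles in 2D, returns the nearest echo distance; the function names are illustrative.

```python
import math

def ray_circle_distance(origin, direction, center, radius):
    """Return the distance along the ray to the circle, or None on a miss."""
    ox, oy = origin
    dx, dy = direction
    cx, cy = center
    # Solve |origin + t*direction - center|^2 = radius^2 for t.
    fx, fy = ox - cx, oy - cy
    a = dx * dx + dy * dy
    b = 2.0 * (fx * dx + fy * dy)
    c = fx * fx + fy * fy - radius * radius
    disc = b * b - 4.0 * a * c
    if disc < 0.0:
        return None
    t = (-b - math.sqrt(disc)) / (2.0 * a)
    return t if t >= 0.0 else None

def sonar_ping(pose, obstacles, max_range=4.0):
    """Return the nearest hit distance within range, emulating an echo."""
    (x, y), heading = pose
    direction = (math.cos(heading), math.sin(heading))
    hits = [d for d in (ray_circle_distance((x, y), direction, c, r)
                        for c, r in obstacles) if d is not None]
    return min(hits) if hits and min(hits) <= max_range else None
```

A fuller model would cast a cone of rays to approximate the sonar's beam width, but a single ray already suffices for basic collision avoidance.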
Visual imaging is supported by virtual cameras that render directly to memory buffers rather than to the screen.
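To show the idea of rendering straight to memory, here is a minimal sketch that rasterizes a top-down view of circular obstacles into an in-memory buffer (a list of pixel rows). A real virtual camera would rasterize a full 3D scene, but the principle, writing pixels into a buffer the user's code can read, is the same. All names here are illustrative.

```python
def render_topdown(obstacles, width=8, height=8, scale=1.0):
    """Rasterize circular obstacles into a grayscale buffer in memory."""
    buffer = [[0] * width for _ in range(height)]
    for (cx, cy), radius in obstacles:
        for row in range(height):
            for col in range(width):
                # Mark any cell whose centre falls inside the circle.
                dx = (col + 0.5) * scale - cx
                dy = (row + 0.5) * scale - cy
                if dx * dx + dy * dy <= radius * radius:
                    buffer[row][col] = 255
    return buffer

image = render_topdown([((4.0, 4.0), 2.0)])
```

Because the result is plain pixel data, vision modules can process it exactly as they would frames from a physical camera.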
3D audio can be used to simulate microphone input for sound detection and processing.
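The two quantities a 3D audio model must produce per microphone are attenuation and arrival delay. The sketch below assumes a simple inverse-distance falloff clamped at a reference distance; the function name and parameters are hypothetical.

```python
import math

SPEED_OF_SOUND = 343.0  # metres per second, at roughly 20 degrees C

def microphone_signal(source, mic, ref_distance=1.0):
    """Return (gain, delay_seconds) heard by a microphone at `mic`."""
    distance = math.dist(source, mic)
    gain = ref_distance / max(distance, ref_distance)  # 1/r falloff, clamped
    delay = distance / SPEED_OF_SOUND
    return gain, delay
```

With two or more simulated microphones, the per-microphone delays give the time differences a sound-localization module would exploit.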
A programming interface will be exposed to users, enabling them to write their own modules that analyze sensor output and issue control commands to the robot's actuators.
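One plausible shape for such an interface is a base class the user subclasses, mapping a tick of sensor readings to actuator commands. The class and method names below (`Controller`, `update`, the sensor and command keys) are illustrative assumptions, not a defined API.

```python
from abc import ABC, abstractmethod

class Controller(ABC):
    """Base class a user subclasses to drive the simulated robot."""

    @abstractmethod
    def update(self, sensors: dict) -> dict:
        """Map one tick of sensor readings to actuator commands."""

class WallAvoider(Controller):
    def update(self, sensors):
        # Turn in place when the front sonar reports a nearby obstacle.
        front = sensors.get("front_sonar")
        if front is not None and front < 0.5:
            return {"left_wheel": -0.2, "right_wheel": 0.2}
        # Otherwise drive straight ahead.
        return {"left_wheel": 0.5, "right_wheel": 0.5}
```

The simulator would call `update` once per tick, feed the returned commands to the actuator model, and repeat, keeping user code cleanly separated from the simulation core.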
The simulation environment will provide display facilities so the user can observe the robot's behavior within the environment, including visualizations of sonar frustums, mapping overlays, and various other indicators.
The simulator will provide a data-driven framework for setting up environments, scenarios, and other simulated entities.
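Data-driven here means that a scenario is described in a data file rather than in code. The sketch below loads a scenario from JSON; the schema (field names like `robot` and `obstacles`) is an illustrative assumption.

```python
import json

SCENARIO = """
{
  "name": "corridor-test",
  "robot": {"start": [0.0, 0.0], "heading": 0.0},
  "obstacles": [
    {"shape": "circle", "center": [3.0, 0.0], "radius": 1.0}
  ]
}
"""

def load_scenario(text):
    """Parse a scenario description and check its minimal structure."""
    data = json.loads(text)
    # Validate the minimum the simulator needs before building the world.
    if "robot" not in data or "obstacles" not in data:
        raise ValueError("scenario must define a robot and obstacles")
    return data

scenario = load_scenario(SCENARIO)
```

New environments and entities can then be added by editing data files, with no changes to the simulator itself.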
The simulation connects to a distributed system that provides the robot with scalability, modularity, and extensibility.
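A common pattern for this kind of decoupling is a topic-based message bus: the simulator publishes sensor data, modules subscribe and publish commands, and an in-process bus can later be swapped for a networked broker without changing module code. The sketch below is a minimal in-process version with illustrative names.

```python
from collections import defaultdict

class MessageBus:
    """Topic-based publish/subscribe, decoupling producers from consumers."""

    def __init__(self):
        self._subscribers = defaultdict(list)

    def subscribe(self, topic, callback):
        """Register a callback for every message on `topic`."""
        self._subscribers[topic].append(callback)

    def publish(self, topic, message):
        """Deliver `message` to every subscriber of `topic`."""
        for callback in self._subscribers[topic]:
            callback(message)
```

Because modules only know topic names, they can be added, removed, or moved to other machines independently, which is where the scalability and extensibility come from.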