New Products
UC-win/Road for RoboCar(R) Ver.2
3D localization function with AR position-fixing system

Scheduled release date: January 2011

 Approach to the robot engineering field
Since its foundation, Forum8's activities have been oriented towards Civil Engineering and Virtual Reality based 3D simulation. This year, the company started to extend its activities towards Robotics. The first project is the simulation of RoboCar(R), a robotic platform developed by the robotics company ZMP(R). The RoboCar(R) is essentially a scale model of a car, enhanced with devices and software algorithms that allow it to behave autonomously. In parallel, Forum8 has started the development of its own mobile robot.

 Functions of the new version
The purpose of the RoboCar(R) is to provide car makers with Robotics-inspired tools to develop new driving-assistance systems that make driving safer and more comfortable. Figure 1 shows the version of the RoboCar(R) currently in use. It is equipped with:
- Incremental encoders mounted on each wheel and on the main motor to estimate the position and orientation of the RoboCar(R).
- A laser range finder, infrared sensors and stereo cameras to estimate the distance to the closest obstacles and to identify known patterns for autonomous navigation.
- A wireless LAN adapter to allow remote access to the RoboCar(R) from a distant PC.
- An accelerometer and a gyroscope.
- Temperature sensors.
■Figure1: Current version of RoboCar(R)

The whole system is managed by a main board based on a 500 MHz CPU running Linux.
Additionally, a 2.7 m x 1.8 m course was built to simulate the actual use of a car, including pedestrian crossings, guard rails and traffic lights. The same course was then simulated, along with the car model, in a 3D VR environment using the UC-win/Road driving simulator developed at Forum8 (Figure 2).

■Figure2: View of UC-win/Road for RoboCar(R) (Top view and user view)

When driving in manual mode, the user controls both the actual RoboCar(R) and its simulated model via a standard input device, for instance a wheel and pedals (acceleration and brake). UC-win/Road sends the user's controls to the RoboCar(R), which in turn returns the data from its sensors in real time (Figure 3).
■Figure3: Communication between RoboCar(R) and UC-win/Road

 2D Localization: Odometry
To ensure a proper simulation of the RoboCar(R), UC-win/Road needs to know its posture (position and orientation) at each sampling time. Until recently, this 2D posture was estimated using the principle of odometry: the new posture is estimated from the posture at the previous sampling time and the measured rotation of the wheels, using the incremental encoder located on each wheel (Figure 5).
This method has the advantage of being simple and straightforward, but its main drawback is that it is affected by errors of different natures: slipping of the wheels on the floor and several types of accumulative errors (due to the integration method, the variability of the actual radius of each wheel, and so on).
The overall effect of these errors is that even when the actual RoboCar(R) and its simulated model start at the same posture, the estimated posture of the model will drift further and further from the actual one as the RoboCar(R) travels a longer distance. A solution to this problem is to use an absolute localization system, which gives the RoboCar(R)'s posture periodically and independently from its previous posture (no accumulative errors). Several types of solutions were investigated, and the simplest and best-performing one was to use tools inspired by Augmented Reality.
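The odometry update described above can be sketched as follows. This is a minimal differential-style sketch that assumes encoder increments have already been converted into travelled distances; the function name, the `track_width` parameter and the simple mid-point integration are illustrative assumptions, not the RoboCar(R)'s actual (car-like) kinematic model.

```python
import math

def update_pose(x, y, theta, d_left, d_right, track_width):
    """One odometry step from incremental encoder measurements.

    d_left / d_right: distance travelled by the left/right wheels since the
    previous sampling time (encoder ticks converted to metres beforehand).
    """
    d = (d_left + d_right) / 2.0                 # distance of the body centre
    d_theta = (d_right - d_left) / track_width   # heading change over the step
    # Integrate with the mid-point heading; simple, but every step still
    # accumulates the integration and wheel-radius errors noted above.
    x += d * math.cos(theta + d_theta / 2.0)
    y += d * math.sin(theta + d_theta / 2.0)
    theta = (theta + d_theta) % (2.0 * math.pi)
    return x, y, theta
```

Because each call depends on the previous result, any per-step error is carried forward forever, which is exactly why an absolute localization system is needed as a complement.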

■Figure4: (Optical) rotary encoder within a RoboCar(R) wheel
■Figure5: Estimation of the current posture (X, Y, Θ) using the odometry principle

 3D Localization: AURELO
Augmented Reality (AR) is a part of mixed reality, which lies between the real world and the virtual world (Figure 6).
■Figure6: Augmented Reality vs. Virtual Reality

Typically, an AR scene presents CG items inserted into a live video feed. Some AR tools rely on markers to locate the 3D posture of the item to track. The AUgmented REality LOcalization (AURELO) system we developed can track several markers simultaneously (Figure 7). Each marker shows an identical boundary (a black square), and each marker is identified by the pattern inside the black square.

■Figure7: 3D localization of 2 markers using AURELO

The estimation error increases with the distance between the marker and the camera. For a distance of less than 1.5 m, the estimation error is less than 7 mm in position and 2 degrees in orientation. This result is quite good for a camera-based system. The AR tools that we used can only identify the transformation matrix of each marker relative to the camera reference frame. As a consequence, two markers were used to track the RoboCar(R)'s absolute posture in the actual world, using a camera located above the RoboCar(R) course. Figure 8 shows the actual view from the top camera.
In the current version, the AURELO system can track the RoboCar(R) over the 2.7 m x 1.8 m course. These promising results encouraged us to think about the next version of the system, in which we hope to extend this tracking performance to a wider area using a multi-camera system.
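Chaining the two camera-relative marker transforms into an absolute posture can be sketched with homogeneous matrices. The sketch assumes, as one plausible reading of the setup, that one marker serves as a fixed world reference on the course and the other is mounted on the RoboCar(R); the function names and the 4x4 representation are illustrative choices, not the actual AURELO implementation.

```python
import numpy as np

def make_transform(rotation, translation):
    """Build a 4x4 homogeneous transform from a 3x3 rotation and a 3-vector."""
    T = np.eye(4)
    T[:3, :3] = rotation
    T[:3, 3] = translation
    return T

def pose_in_reference(T_cam_ref, T_cam_car):
    """Posture of the car marker expressed in the reference-marker frame.

    Both inputs are camera-relative transforms, which is all the AR tools
    provide; chaining them cancels the camera frame out of the result.
    """
    return np.linalg.inv(T_cam_ref) @ T_cam_car
```

Because the camera frame cancels out, the result is independent of where the top camera is mounted, as long as both markers stay in its field of view.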

■Figure8: View from AURELO top camera
(Up&Coming 2011 New Year Issue)


