The goal of the project is the development of a 6-DoF multi-user interactive input system for multi-screen back-projection virtual environments, using infrared laser pointers and infrared-sensitive cameras.
A set of rays produced by infrared lasers fixed to a hand-held interaction device is projected onto the screens, and the image of the projected pattern on the walls of the virtual reality environment is captured by the IR cameras. The positions of the laser dots on the screens are retrieved using image processing algorithms. The resulting positions are used to compute the position and orientation of the input devices relative to the screens, using a mathematical model based on geometric constraints.
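The geometric forward model behind this can be sketched as a ray-plane intersection: given a hypothesized device pose, each laser ray hits a screen at a predictable point. The following is a minimal NumPy sketch; all names and values are illustrative and not taken from the project itself.

```python
import numpy as np

def ray_plane_intersection(origin, direction, plane_point, plane_normal):
    """Intersect a ray with a plane; returns the 3D hit point or None."""
    denom = np.dot(plane_normal, direction)
    if abs(denom) < 1e-9:
        return None  # ray is parallel to the screen plane
    t = np.dot(plane_normal, plane_point - origin) / denom
    if t < 0:
        return None  # screen lies behind the device
    return origin + t * direction

# Toy example: device at (0.5, 0.5, 2.0) pointing straight at the screen
# plane z = 0; the laser dot lands directly "below" the device position.
origin = np.array([0.5, 0.5, 2.0])
direction = np.array([0.0, 0.0, -1.0])
hit = ray_plane_intersection(origin, direction,
                             plane_point=np.array([0.0, 0.0, 0.0]),
                             plane_normal=np.array([0.0, 0.0, 1.0]))
# hit is [0.5, 0.5, 0.0]
```

The pose-estimation problem is the inverse of this mapping: recover the device pose from the observed dot positions.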
The general project algorithm can be described in three main steps: Recognition, Calibration and Reconstruction.
Recognition (image acquisition from one of three input sources):
- using a GigE camera
- using emulator data
- using a video stream
- Blob detection on the 2D camera images
- Lens distortion correction
- Applying the homography transformation and converting into real-world coordinates
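The homography step maps detected pixel coordinates to screen coordinates via a 3×3 projective transform. Below is an illustrative NumPy sketch; the matrix values are toy numbers, whereas the real matrix comes from the screen calibration.

```python
import numpy as np

def apply_homography(H, points):
    """Map 2D pixel coordinates to screen coordinates via a 3x3 homography."""
    pts = np.hstack([points, np.ones((len(points), 1))])  # homogeneous coords
    mapped = pts @ H.T
    return mapped[:, :2] / mapped[:, 2:3]  # divide by the projective scale

# Toy homography: identity plus a translation of (10, 20).
H = np.array([[1.0, 0.0, 10.0],
              [0.0, 1.0, 20.0],
              [0.0, 0.0, 1.0]])
screen_pts = apply_homography(H, np.array([[5.0, 5.0]]))
# screen_pts is [[15. 25.]]
```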
Calibration:
- Intrinsic camera calibration: Determination of the radial and tangential distortion coefficients of the cameras
- Screen calibration: Determining the homography matrix using markers at the screen corners illuminated by an IR flash light
- Pattern-emitting device calibration: Calculating the divergence angles between the different rays emitted by the pointer
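The screen-calibration step amounts to estimating a homography from the four corner-marker correspondences. A standard way to do this is the direct linear transform (DLT); the sketch below is a hypothetical illustration with made-up coordinates, not the project's actual calibration code.

```python
import numpy as np

def homography_from_corners(src, dst):
    """Estimate the 3x3 homography mapping four src points to four dst points (DLT)."""
    A = []
    for (x, y), (u, v) in zip(src, dst):
        A.append([-x, -y, -1, 0, 0, 0, u * x, u * y, u])
        A.append([0, 0, 0, -x, -y, -1, v * x, v * y, v])
    # The homography is the null vector of A (last right-singular vector).
    _, _, Vt = np.linalg.svd(np.asarray(A, dtype=float))
    H = Vt[-1].reshape(3, 3)
    return H / H[2, 2]

# Image positions of the four illuminated corner markers (pixels) ...
src = [(100, 80), (540, 90), (530, 420), (110, 410)]
# ... and their known positions on the screen (metres).
dst = [(0, 0), (3, 0), (3, 2), (0, 2)]
H = homography_from_corners(src, dst)
```

With exactly four correspondences the system has a one-dimensional null space, so the homography is determined exactly; more markers would turn this into a least-squares fit.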
Reconstruction:
- Correspondence problem: Distinguishing between points within the pattern emitted by the same laser pointer
- Pattern separation: Distinguishing between points belonging to different patterns emitted by different laser pointers
- Model-based pose-fitting algorithm: Used to determine the 3D pose of the input device
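The pose-fitting step can be viewed as a nonlinear least-squares problem: find the 6-DoF pose whose predicted dot positions (rays with calibrated divergence angles, intersected with the screen plane) best match the detected ones. The sketch below illustrates this idea with SciPy on synthetic data; the project's actual model and solver are not specified here, and all values are invented.

```python
import numpy as np
from scipy.optimize import least_squares

def rotation_matrix(rvec):
    """Rodrigues formula: axis-angle vector -> 3x3 rotation matrix."""
    theta = np.linalg.norm(rvec)
    if theta < 1e-12:
        return np.eye(3)
    k = rvec / theta
    K = np.array([[0, -k[2], k[1]], [k[2], 0, -k[0]], [-k[1], k[0], 0]])
    return np.eye(3) + np.sin(theta) * K + (1 - np.cos(theta)) * (K @ K)

def residuals(pose, rays_device, observed):
    """Error between predicted and observed dot positions on the plane z = 0."""
    t, rvec = pose[:3], pose[3:]
    R = rotation_matrix(rvec)
    errs = []
    for d, obs in zip(rays_device, observed):
        dw = R @ d                # ray direction in world coordinates
        s = -t[2] / dw[2]         # intersect the ray with the plane z = 0
        hit = t + s * dw
        errs.extend(hit[:2] - obs)
    return np.array(errs)

# Four ray directions in the device frame, forming a cross with a
# divergence angle of 0.1 rad (as obtained from device calibration).
a = 0.1
rays = [np.array([np.sin(a), 0.0, -np.cos(a)]),
        np.array([-np.sin(a), 0.0, -np.cos(a)]),
        np.array([0.0, np.sin(a), -np.cos(a)]),
        np.array([0.0, -np.sin(a), -np.cos(a)])]

# Synthesize noiseless observations from a known ground-truth pose.
true_pose = np.array([0.2, 0.3, 2.0, 0.0, 0.05, 0.0])
observed = residuals(true_pose, rays, np.zeros((4, 2))).reshape(-1, 2)

# Recover the pose from the observed dot positions.
fit = least_squares(residuals, x0=np.array([0.0, 0.0, 1.8, 0.0, 0.0, 0.0]),
                    args=(rays, observed))
```

Because the point-to-ray correspondences are resolved in the previous steps, the fit reduces to a small, well-conditioned optimization per device.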
Contact: Prof. Dr.-Ing. Rainer Herpers, Timur Saitov
Publications: An Optical Laser-based User Interaction System for CAVE-type Virtual Reality Environments, Acceleration of BLOB Detection for Image Processing
Related Work: On Pose Recovery for Generalized Visual Sensors (2004), Augmenting a Laser Pointer with a Diffraction Grating for Monoscopic 6DoF Detection
This project is funded by ‘FHprofUnt’.