Demos on real-time 4D visualization of multi-pedestrian scenarios

The videos on this page demonstrate the i4D reconstruction process for outdoor scenes with multiple walking pedestrians. A Lidar monitors the scene from a fixed position and provides a dynamic point cloud. This data is processed to build a 3D model of the environment and to detect and track the pedestrians, each of whom is represented by a point cluster and a trajectory. Each moving cluster is then substituted by a detailed 4D model created in the studio. The output is a geometrically reconstructed and textured scene with avatars that follow the pedestrians' trajectories in real time. (Click on the images to view the demo videos.)
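The tracking step above associates pedestrian point clusters across consecutive Lidar frames so each person accumulates a trajectory. A minimal sketch of this idea, assuming clusters have already been extracted and reduced to centroids, is greedy nearest-neighbour association (this is an illustration, not the published i4D tracker):

```python
import numpy as np

def track_centroids(frames, max_dist=1.0):
    """Greedy nearest-neighbour association of pedestrian cluster
    centroids across frames. `frames` is a list of (N_i, 3) arrays,
    one array of cluster centroids per Lidar frame. Returns a dict
    mapping track id -> list of centroids (the trajectory)."""
    tracks = {}      # track id -> trajectory (list of centroids)
    active = {}      # track id -> last seen centroid
    next_id = 0
    for centroids in frames:
        unmatched = list(range(len(centroids)))
        new_active = {}
        for tid, last in active.items():
            if not unmatched:
                break
            # distance from this track's last position to each detection
            dists = [np.linalg.norm(centroids[j] - last) for j in unmatched]
            k = int(np.argmin(dists))
            if dists[k] < max_dist:          # gate the association
                j = unmatched.pop(k)
                tracks[tid].append(centroids[j])
                new_active[tid] = centroids[j]
        for j in unmatched:                  # unmatched detection: new track
            tracks[next_id] = [centroids[j]]
            new_active[next_id] = centroids[j]
            next_id += 1
        active = new_active
    return tracks
```

A real tracker would add motion prediction (e.g. Kalman filtering) and handle occlusions, but the data-association core is the same.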

Motion detection and multiple-person tracking using various real-time Lidar sensors (Velodyne VLP16, HDL32, HDL64)

Complete reconstruction flow, showing in parallel the recorded video and the Lidar point cloud sequence of the scene, the output of the Lidar-based pedestrian tracker, and the reconstructed 4D scene.

4D reconstructed scene viewed by a simulated moving camera: the reconstructed scene is shown, without and with texture, from the viewpoint of a simulated moving camera.

Long-term person tracking with re-identification based on a Lidar point cloud sequence. Person re-identification is achieved with static features (see the gait recognition demo for an improved version of this approach).
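Re-identification with static features means describing each pedestrian's point cluster by motion-independent quantities and matching a reappearing cluster to the closest stored descriptor. The feature choice below (height, horizontal spread, vertical point-density histogram) is an illustrative assumption, not the published descriptor:

```python
import numpy as np

def descriptor(points):
    """Static descriptor of a pedestrian point cluster (N, 3):
    body height, horizontal spread, and a vertical point-density
    histogram. Illustrative feature choice, not the published one."""
    z = points[:, 2]
    height = z.max() - z.min()
    spread = np.linalg.norm(points[:, :2].std(axis=0))
    hist, _ = np.histogram(z, bins=8,
                           range=(z.min(), z.max() + 1e-9), density=True)
    return np.concatenate([[height, spread], hist])

def reidentify(query_points, gallery):
    """Match a query cluster against a gallery of stored descriptors
    (dict: person id -> descriptor); return the closest person id."""
    q = descriptor(query_points)
    ids = list(gallery)
    dists = [np.linalg.norm(q - gallery[i]) for i in ids]
    return ids[int(np.argmin(dists))]
```

In practice such static features are weak on their own, which is why the improved version in the gait recognition demo also exploits dynamic gait information.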

Lidar-camera registration: parallel view of the Lidar point cloud and a video stream, with the 3D bounding boxes of people projected into the image.
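Once the Lidar and the camera are registered, the bounding-box corners are drawn by transforming 3D Lidar points into the camera frame and applying the pinhole projection. A minimal sketch, assuming a known intrinsic matrix K and a rigid Lidar-to-camera transform (R, t):

```python
import numpy as np

def project_to_image(pts_lidar, K, R, t):
    """Project 3D Lidar points into the camera image.
    K: 3x3 camera intrinsics; (R, t): Lidar -> camera rigid transform.
    Returns (u, v) pixel coordinates of points in front of the camera."""
    pts_cam = pts_lidar @ R.T + t        # transform into the camera frame
    front = pts_cam[:, 2] > 0            # keep only points with positive depth
    p = pts_cam[front] @ K.T             # homogeneous image coordinates
    return p[:, :2] / p[:, 2:3]          # perspective division
```

Projecting the eight corners of each tracked person's 3D bounding box with this function and connecting them with line segments yields the overlay shown in the video.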



Geo-Information Computing @ Machine Perception Lab.


GeoComp Group leader: Dr. Csaba Benedek

i4D project manager: Dr. Zsolt Jankó

Head of MPLab: Prof. Tamás Szirányi

MPLab administration: Anikó Vágvölgyi


Kende utca 13-17
H-1111 Budapest, Hungary
Tel: +36 1 279 6194
Fax: +36 1 279 6292
