Demos on real-time 4D visualization of multi-pedestrian scenarios

The videos on this page demonstrate the i4D reconstruction process for outdoor scenes with multiple walking pedestrians. The Lidar monitors the scene from a fixed position and provides a dynamic point cloud. This information is processed to build a 3D model of the environment and to detect and track the pedestrians. Each pedestrian is represented by a point cluster and a trajectory. Each moving cluster is then substituted by a detailed 4D model created in the studio. The output is a geometrically reconstructed and textured scene with avatars that follow the pedestrians' trajectories in real time. (Click on the images to view the demo videos.)

Motion detection and multiple-person tracking using various real-time Lidar sensors (Velodyne VLP16, HDL32, HDL64) (new)

Gait-based person identification and activity recognition in multi-pedestrian environments based on real-time Lidar measurements (available for the Velodyne HDL64 sensor; VLP16 version coming soon) (new)

Benchmark: SZTAKI Lidar Gait-and-Activity (SZTAKI-LGA) database

Complete reconstruction flow, showing in parallel the recorded video and Lidar point cloud sequence of the scene, the output of the Lidar-based pedestrian tracker, and the reconstructed 4D scene.

4D reconstructed scene viewed by a simulated moving camera: the reconstructed scene is shown without and with texture from the viewpoint of a moving simulated camera.

Long-term person tracking with re-identification based on a Lidar point cloud sequence. Person re-identification is achieved with static features (an improved version of this approach is shown in the gait recognition demo).
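Re-identification from static features can be sketched as nearest-neighbour matching of per-person descriptors. The descriptor below (estimated height, horizontal extent, mean Lidar intensity) is a hypothetical example of "static features", not the demo's actual feature set.

```python
import numpy as np

def descriptor(cluster_xyz, intensities):
    """Hypothetical static descriptor of a person's point cluster:
    body height, horizontal extent, and mean Lidar intensity."""
    height = cluster_xyz[:, 2].max() - cluster_xyz[:, 2].min()
    width = np.linalg.norm(
        cluster_xyz[:, :2].max(axis=0) - cluster_xyz[:, :2].min(axis=0))
    return np.array([height, width, intensities.mean()])

def reidentify(query, gallery):
    """Return the index of the stored identity whose descriptor
    is closest (Euclidean) to the query descriptor."""
    dists = [np.linalg.norm(query - g) for g in gallery]
    return int(np.argmin(dists))

# Gallery of previously seen people (descriptor values are made up).
gallery = [np.array([1.80, 0.55, 0.30]),   # person A
           np.array([1.62, 0.48, 0.70])]   # person B
query = np.array([1.78, 0.54, 0.32])       # new observation
print(reidentify(query, gallery))          # -> 0 (matches person A)
```

This is what lets a tracker re-assign a consistent identity after a person leaves and re-enters the sensor's field of view.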

Lidar-camera registration: parallel view of the Lidar point cloud and a video stream, with projection of the 3D bounding boxes of people onto the image.
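Projecting 3D bounding boxes into a video frame amounts to a standard pinhole projection once the two sensors are registered. The sketch below assumes placeholder calibration values (`K`, `R`, `t` are invented; real values come from the Lidar-camera calibration).

```python
import numpy as np

# Assumed calibration: intrinsic matrix and Lidar-to-camera extrinsics.
K = np.array([[800.,   0., 320.],
              [  0., 800., 240.],
              [  0.,   0.,   1.]])
R = np.eye(3)              # axes aligned (placeholder assumption)
t = np.array([0., 0., 0.]) # no offset (placeholder assumption)

def project(points_lidar):
    """Project Nx3 Lidar-frame points to pixel coordinates
    using the pinhole camera model."""
    cam = points_lidar @ R.T + t        # Lidar frame -> camera frame
    uvw = cam @ K.T                     # apply intrinsics
    return uvw[:, :2] / uvw[:, 2:3]     # perspective divide

# Eight corners of a pedestrian's 3D bounding box, 4 m in front
# of the sensor (camera convention: y down, z forward).
x0, x1, y0, y1, z0, z1 = -0.4, 0.4, -0.85, 0.85, 4.0, 4.8
corners = np.array([[x, y, z] for x in (x0, x1)
                              for y in (y0, y1)
                              for z in (z0, z1)])
px = project(corners)  # 8 pixel coordinates; draw box edges between them
```

A point on the optical axis, e.g. `[0, 0, 4]`, lands at the principal point `(320, 240)`, which is a quick sanity check for the calibration.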