
Project: Unstructured dynamic environment reconstruction

Description

VinciTech is a robotics company that is developing a lightweight robotic platform for collaborative tasks. This robotic platform is intended to work alongside people, as a COllaborative roBOT (COBOT). The robot arm itself is made of Igus Robolink W components. In a previous MSc project, the TU/e developed an online dynamic environment reconstruction algorithm based on a depth camera and radial basis function (RBF) interpolation. It provides an environment description that can be used in conjunction with a robot controller for executing motion tasks while avoiding static and, in particular, dynamic obstacles (see Figure 1 for an illustration). In this new MSc project, we want to go further by integrating multiple cameras and speeding up the environment reconstruction rate to 30 Hz or more. The use of multiple cameras is essential for removing the unavoidable shadows created by the robot and by moving agents entering the scene.
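
To give a flavour of the RBF-based reconstruction, the sketch below interpolates scattered depth samples with a plain RBF fit in C++. It is only an illustration of the general technique: the Gaussian kernel, its width, the dense solver, and the use of the Eigen library are assumptions made here, not the implementation developed in the previous MSc project.

```cpp
// Minimal RBF interpolation sketch (illustrative, not the project's code).
// Assumes Eigen is available; kernel choice and parameters are placeholders.
#include <Eigen/Dense>
#include <cmath>
#include <vector>

struct RbfField {
    std::vector<Eigen::Vector3d> centers;  // sample points from the depth camera
    Eigen::VectorXd weights;               // solved RBF coefficients
    double eps = 10.0;                     // kernel width (tuning parameter)

    double kernel(const Eigen::Vector3d& a, const Eigen::Vector3d& b) const {
        return std::exp(-eps * (a - b).squaredNorm());  // Gaussian RBF
    }

    // Fit the weights so the field matches the given values at the centers.
    void fit(const std::vector<Eigen::Vector3d>& pts, const Eigen::VectorXd& values) {
        centers = pts;
        const int n = static_cast<int>(pts.size());
        Eigen::MatrixXd A(n, n);
        for (int i = 0; i < n; ++i)
            for (int j = 0; j < n; ++j)
                A(i, j) = kernel(pts[i], pts[j]);
        weights = A.ldlt().solve(values);  // dense solve; fine for modest n
    }

    // Evaluate the interpolated field at an arbitrary query point.
    double operator()(const Eigen::Vector3d& q) const {
        double s = 0.0;
        for (std::size_t i = 0; i < centers.size(); ++i)
            s += weights(static_cast<int>(i)) * kernel(q, centers[i]);
        return s;
    }
};
```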

Details
  • Develop a theoretical framework for using multiple 3D cameras (2 or more) in order to fuse several point clouds online, remove the robot from the scene (the robot geometry and configuration are known in real time), and generate a dynamic environment description suitable for robot control (a rough illustration follows this list).
  • Improve the current C++ software by speeding up computations, allowing for the use of multiple depth cameras, and removing the robot arm from the reconstructed environment.
  • Perform a literature review on the subject, describing pros and cons with respect to other methods for robot-environment collision avoidance based on multiple depth cameras.
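
As a rough illustration of the fusion and robot-removal steps in the first item above, the sketch below transforms each camera's point cloud into a common frame and discards points that fall inside capsules placed along the robot links (as obtained from forward kinematics). The Eigen types, the capsule approximation of the link geometry, and all names here are assumptions for illustration; the actual robot model and camera calibration would come from the project itself.

```cpp
// Sketch: fuse clouds from multiple cameras and mask out the robot arm.
// Assumes Eigen and C++17; extrinsics, link segments, and radii are placeholders.
#include <Eigen/Dense>
#include <algorithm>
#include <utility>
#include <vector>

using Cloud = std::vector<Eigen::Vector3d>;

// Transform each camera's cloud into a common world frame and concatenate.
Cloud fuseClouds(const std::vector<Cloud>& clouds,
                 const std::vector<Eigen::Isometry3d>& camToWorld) {
    Cloud fused;
    for (std::size_t c = 0; c < clouds.size(); ++c)
        for (const auto& p : clouds[c])
            fused.push_back(camToWorld[c] * p);
    return fused;
}

// Crude robot mask: model each link as a capsule (segment + radius) computed
// from forward kinematics, and drop points closer to a link than the radius.
Cloud removeRobot(const Cloud& cloud,
                  const std::vector<std::pair<Eigen::Vector3d, Eigen::Vector3d>>& links,
                  double radius) {
    Cloud out;
    for (const auto& p : cloud) {
        bool onRobot = false;
        for (const auto& seg : links) {
            const Eigen::Vector3d d = seg.second - seg.first;
            const double t = d.squaredNorm() > 0.0
                ? std::clamp((p - seg.first).dot(d) / d.squaredNorm(), 0.0, 1.0)
                : 0.0;
            if ((p - (seg.first + t * d)).norm() < radius) { onRobot = true; break; }
        }
        if (!onRobot) out.push_back(p);
    }
    return out;
}
```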
Requirements
  • Background in robotics (forward kinematics, robot control)
  • Good programming skills, in particular, on C++ programming
  • Background in computer vision (point clouds, depth images)
  • Good communication skills

Previous experience with open-source robotics software (RViz, Gazebo, ROS) and parallel computation (OpenMP, Nvidia CUDA) is highly appreciated.

Details
Supervisor
Andrei Jalba