The Robotics Chair is an interdisciplinary group of researchers led by Prof. Andreas Nüchter. We work with manipulators, mobile robots (drones, ground-based and underwater), and sensor systems such as 3D laser scanners, with applications in every conceivable engineering domain. The focus of our research is 3D point cloud processing, and some of our work is available in our open-source toolkit on SourceForge.

Prof. Dr. Andreas Nüchter (Robotics)

Computer Science building M2, Room BH010, Campus Hubland (South)

The Chair of Robotics also hosts the Computer Engineering group led by Prof. Dr.-Ing. Matthias Jung.

Computer Science building M2, Room B112, Campus Hubland (South)

About the robotics group

The department deals with robotics and automation, cognitive systems, and artificial intelligence. Research topics include, in particular, the 3D capture of surfaces, shapes, and environments using 3D laser scanners and cameras. Much of our work has been and is being done in the context of SLAM (simultaneous localization and mapping), a robotics technique in which a mobile system simultaneously creates a map of its environment and estimates its spatial location within that map. It is fundamental to the autonomy of mobile robots. The chair also works on high-accuracy optical measurement methods and algorithms for surface reconstruction. 3D sensor data are not acquired exclusively by mobile robots: in recent years, applications involving manipulators have been investigated frequently. Here there is less demand for a solution to SLAM; instead, high-precision calibrations are required to fully exploit robot specifications, for example, to perform accurate 3D printing with industrial manipulators.
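A core building block of scan-based SLAM is rigid registration: aligning a newly acquired 3D scan with a previous one to estimate the robot's motion. The sketch below is illustrative only (not code from our toolkit); it shows the closed-form least-squares step (Kabsch/SVD method) used inside ICP-style scan matching, assuming known point correspondences.

```python
import numpy as np

def best_rigid_transform(src, dst):
    """Least-squares rotation R and translation t mapping src onto dst
    (Kabsch/SVD method); the inner step of ICP scan matching."""
    c_src, c_dst = src.mean(axis=0), dst.mean(axis=0)
    # Cross-covariance of the centered point sets.
    H = (src - c_src).T @ (dst - c_dst)
    U, _, Vt = np.linalg.svd(H)
    R = Vt.T @ U.T
    if np.linalg.det(R) < 0:   # guard against a reflection solution
        Vt[-1] *= -1
        R = Vt.T @ U.T
    t = c_dst - R @ c_src
    return R, t

# Toy example: recover a known rotation and translation between two "scans".
rng = np.random.default_rng(0)
scan = rng.random((100, 3))
angle = np.deg2rad(30)
R_true = np.array([[np.cos(angle), -np.sin(angle), 0.0],
                   [np.sin(angle),  np.cos(angle), 0.0],
                   [0.0,            0.0,           1.0]])
t_true = np.array([0.5, -0.2, 0.1])
moved = scan @ R_true.T + t_true

R, t = best_rigid_transform(scan, moved)
```

In a full SLAM pipeline this step is iterated with nearest-neighbor correspondence search (ICP) and embedded in a pose-graph optimization over many scans.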


Akademy 2024 will take place at JMU

Würzburg robots participate in the Mars simulation AMADEE-24.

Paper accepted in IEEE Transactions on Robotics


In this project, the consortium is researching and developing the technologies needed to enable a new generation of underwater sensors.

Underwater VR for astronauts

In this study, we plan to combine diving goggles with a VR headset and to use our small 40-cubic-meter pool to simulate a space environment.


Data from a network of different mapping sensors, together with as-built data from previous surveys, are automatically merged and interpreted into a common, highly accurate three-dimensional map.
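Merging data from multiple sensors presupposes that each sensor's pose in a common map frame is known from calibration. The following minimal sketch (with hypothetical sensor names and identity/offset poses, not actual project data) shows the basic operation: transforming each sensor's point cloud into the map frame and concatenating the results.

```python
import numpy as np

def to_map_frame(cloud, R, t):
    """Transform an N x 3 point cloud from its sensor frame into the map frame."""
    return cloud @ R.T + t

# Hypothetical calibration: each sensor's pose (R, t) in the common map frame.
sensor_poses = {
    "scanner_a": (np.eye(3), np.array([0.0, 0.0, 0.0])),
    "scanner_b": (np.eye(3), np.array([1.0, 0.0, 0.0])),
}
# Hypothetical scans, one per sensor, in the respective sensor frames.
scans = {
    "scanner_a": np.random.default_rng(1).random((50, 3)),
    "scanner_b": np.random.default_rng(2).random((30, 3)),
}

# Merge all scans into a single map-frame point cloud.
merged = np.vstack([to_map_frame(scans[s], *sensor_poses[s]) for s in sensor_poses])
```

Real pipelines additionally filter, deduplicate, and semantically interpret the merged cloud; this sketch covers only the geometric fusion step.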