Welcome to my homepage! My name is Daniel Kuhner. I'm a computer scientist and a PhD student in the "Autonomous Intelligent Systems" lab of Prof. Wolfram Burgard at the University of Freiburg, Germany. My research topics include manipulation planning, computer vision, human-robot interaction, and brain-machine interfaces. Below you will find a list of my publications and projects along with some additional material (e.g., videos, code):

Publications

Projects

If you have any questions don't hesitate to contact me:

Contact me

Under Review

Augmenting Action Model Learning by Non-Geometric Features

Iman Nematollahi, Daniel Kuhner, Tim Welschehold, Wolfram Burgard
Submitted to IEEE Int. Conf. on Robotics and Automation (ICRA), Montreal, Canada, 2019 

Abstract

Learning from demonstration is a powerful tool for teaching manipulation actions to a robot. However, it remains an unsolved problem how to take into account knowledge about the world and action-induced reactions, such as forces imposed on the gripper or liquid levels measured during pouring, without explicit and case-dependent programming. In this paper, we present a novel approach that includes such knowledge directly in the form of measured features. To this end, we use action demonstrations together with external features to learn a dynamical system in a Gaussian Mixture Model (GMM) representation. During action imitation, the system is then able to couple the geometric trajectory of the motion to features measured in the scene. We demonstrate the feasibility of our approach with a broad range of external features in real-world robot experiments, including drinking, handover, and pouring tasks.
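To illustrate the core idea of coupling motion to a measured feature, here is a minimal, hypothetical sketch using Gaussian Mixture Regression: a GMM is fit over (position, feature, velocity) samples, and during imitation the expected velocity is conditioned on the current position and the externally measured feature. The data, feature semantics, and model sizes are made up for illustration; this is not the authors' implementation.

```python
import numpy as np
from sklearn.mixture import GaussianMixture

# Toy demonstration data (hypothetical): 1-D position x, an external feature f
# (e.g., a measured liquid level), and the observed velocity xdot.
rng = np.random.default_rng(0)
x = rng.uniform(0.0, 1.0, 500)
f = rng.uniform(0.0, 1.0, 500)
xdot = -2.0 * x * f + 0.01 * rng.standard_normal(500)  # feature modulates speed

gmm = GaussianMixture(n_components=3, random_state=0)
gmm.fit(np.column_stack([x, f, xdot]))

def predict_velocity(query):
    """Gaussian Mixture Regression: E[xdot | x, f]."""
    q = np.asarray(query, dtype=float)          # input dims: (x, f)
    means, covs, w = gmm.means_, gmm.covariances_, gmm.weights_
    h = np.empty(len(w))
    v = np.empty(len(w))
    for k in range(len(w)):
        mu_in, mu_out = means[k, :2], means[k, 2]
        s_ii = covs[k][:2, :2]                  # input covariance
        s_oi = covs[k][2, :2]                   # output-input covariance
        diff = q - mu_in
        # responsibility of component k for the input (2-D Gaussian density)
        det = np.linalg.det(s_ii)
        h[k] = w[k] * np.exp(-0.5 * diff @ np.linalg.solve(s_ii, diff)) \
               / np.sqrt((2 * np.pi) ** 2 * det)
        # conditional mean of the velocity for component k
        v[k] = mu_out + s_oi @ np.linalg.solve(s_ii, diff)
    h /= h.sum()
    return float(h @ v)

# A high feature value slows the motion down more strongly than a low one.
print(predict_velocity([0.8, 0.9]), predict_velocity([0.8, 0.1]))
```

During imitation the predicted velocity would be integrated to produce the next pose, so the geometric trajectory automatically adapts to the measured feature.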

Deep Learning Based BCI Control of a Robotic Service Assistant Using Intelligent Goal Formulation

Daniel Kuhner, Lukas D.J. Fiederer, Johannes Aldinger, Felix Burget, Martin Völker, Robin T. Schirrmeister, Chau Do, Joschka Boedecker, Bernhard Nebel, Tonio Ball, Wolfram Burgard
Submitted to Robotics and Autonomous Systems
Preprint available at bioRxiv

Download the PDF file of this paper

Abstract

As autonomous service robots become more affordable and thus available to the general public, there is a growing need for user-friendly interfaces to control these systems. Control interfaces typically become more complicated with the increasing complexity of the robotic tasks and the environment. Traditional control modalities such as touch, speech, or gesture commands are not necessarily suited for all users. While non-expert users can make the effort to familiarize themselves with a robotic system, paralyzed users may not be capable of controlling such systems even though they need robotic assistance the most. In this paper, we present a novel framework that allows these users to interact with a robotic service assistant in a closed-loop fashion, using only thoughts. The system is composed of several interacting components: non-invasive neuronal signal recording and co-adaptive deep learning, which form the brain-computer interface (BCI); high-level task planning based on referring expressions; navigation and manipulation planning; and environmental perception. We extensively evaluate the BCI in various tasks, determine the performance of the goal formulation user interface, and investigate its intuitiveness in a user study. Furthermore, we demonstrate the applicability and robustness of the system in real-world scenarios, considering fetch-and-carry tasks and tasks involving human-robot interaction. As our results show, the system is capable of adapting to frequent changes in the environment and reliably accomplishes given tasks within a reasonable amount of time. Combined with high-level planning using referring expressions and autonomous robotic systems, interesting new perspectives open up for non-invasive BCI-based human-robot interaction.

Conference Papers

Closed-Loop Robot Task Planning Based on Referring Expressions

Daniel Kuhner, Johannes Aldinger, Felix Burget, Moritz Göbelbecker, Wolfram Burgard, Bernhard Nebel
Proc. of the IEEE/RSJ International Conference on Intelligent Robots and Systems (IROS), Madrid, Spain, 2018

Download the PDF file of this paper

Abstract

Increasing the accessibility of autonomous robots for inexperienced users requires user-friendly, high-level means of controlling robotic systems. While automated planning is able to decompose a complex task into a sequence of steps that reaches an intended goal, it is difficult to formulate such a goal without knowing the internals of the planning system and the exact capabilities of the robot. This becomes even more important in dynamic environments in which manipulable objects are subject to change. In this paper, we present an adaptive control interface which allows users to specify goals based on an internal world model by incrementally building referring expressions to the objects in the world. We consider fetch-and-carry tasks and automatically deduce potential high-level goals from the world model to make them available to the user. Based on its perceptions, our system can react to changes in the environment by adapting the goal formulation within the domain-independent planning system.
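The incremental construction of a referring expression can be sketched as repeatedly filtering a set of candidate objects by attribute constraints until the referent is unique. The world model, attribute names, and goal syntax below are hypothetical placeholders, not the paper's actual representation:

```python
# Hypothetical world model: objects with type/color/location attributes.
world = [
    {"id": 1, "type": "cup",    "color": "red",  "location": "table"},
    {"id": 2, "type": "cup",    "color": "blue", "location": "table"},
    {"id": 3, "type": "bottle", "color": "red",  "location": "shelf"},
]

def refine(candidates, attribute, value):
    """One incremental step of building a referring expression."""
    return [obj for obj in candidates if obj[attribute] == value]

# The user narrows down the referent step by step.
step1 = refine(world, "type", "cup")    # two candidates remain
step2 = refine(step1, "color", "red")   # unique -> a goal can be formulated

assert len(step2) == 1
goal = f"(fetch obj{step2[0]['id']})"   # hypothetical planner-goal syntax
print(goal)  # (fetch obj1)
```

If the perceived world changes (e.g., the red cup disappears), re-running the same filters against the updated model naturally invalidates or adapts the goal, which mirrors the closed-loop behavior described above.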

Bibtex
@inproceedings{kuhnerd18iros,
   author = {Daniel Kuhner and Johannes Aldinger and Felix Burget and Moritz Göbelbecker and Wolfram Burgard and Bernhard Nebel}, 
   title = {Closed-Loop Robot Task Planning Based on Referring Expressions}, 
   booktitle = {Proc. of the IEEE/RSJ International Conference on Intelligent Robots and Systems (IROS)}, 
   url = {http://ais.informatik.uni-freiburg.de/publications/papers/kuhnerd18iros.pdf},  
   address = {Madrid, Spain}, 
   year = {2018} 
}

Acting Thoughts: Towards a Mobile Robotic Service Assistant for Users with Limited Communication Skills

Felix Burget, Lukas Dominique Josef Fiederer, Daniel Kuhner, Martin Völker, Johannes Aldinger, Robin Tibor Schirrmeister, Chau Do, Joschka Boedecker, Bernhard Nebel, Tonio Ball, Wolfram Burgard
Proceedings of the IEEE European Conference on Mobile Robotics (ECMR), Paris, France, 2017

Download the PDF file of this paper

Abstract

As autonomous service robots become more affordable and thus also available to the general public, there is a growing need for user-friendly interfaces to control the robotic system. Currently available control modalities typically expect users to be able to express their desire through touch, speech, or gesture commands. While this requirement is fulfilled for the majority of users, paralyzed users may not be able to use such systems. In this paper, we present a novel framework that allows these users to interact with a robotic service assistant in a closed-loop fashion, using only thoughts. The brain-computer interface (BCI) system is composed of several interacting components, i.e., non-invasive neuronal signal recording and decoding, high-level task planning, motion and manipulation planning, as well as environment perception. In various experiments, we demonstrate its applicability and robustness in real-world scenarios, considering fetch-and-carry tasks and tasks involving human-robot interaction. As our results demonstrate, our system is capable of adapting to frequent changes in the environment and reliably completing given tasks within a reasonable amount of time. Combined with high-level planning and autonomous robotic systems, interesting new perspectives open up for non-invasive BCI-based human-robot interaction.

Bibtex
@inproceedings{burget17ecmr,
   author = {Felix Burget and Lukas Dominique Josef Fiederer and Daniel Kuhner and Martin Völker and Johannes Aldinger and Robin Tibor Schirrmeister and Chau Do and Joschka Boedecker and Bernhard Nebel and Tonio Ball and Wolfram Burgard},
   title = {Acting Thoughts: Towards a Mobile Robotic Service Assistant for Users with Limited Communication Skills},
   booktitle = {Proc.~of the IEEE European Conference on Mobile Robotics (ECMR)},
   address = {Paris, France},
   year = {2017},
   url = {http://ais.informatik.uni-freiburg.de/publications/papers/burget17ecmr.pdf}
}

Poisson-Driven Dirt Maps for Efficient Robot Cleaning

Jürgen Hess, Maximilian Beinhofer, Daniel Kuhner, Philipp Ruchti, Wolfram Burgard
Proc. of the IEEE Int. Conf. on Robotics and Automation (ICRA), Karlsruhe, Germany, May 2013

Download the PDF file of this paper

Abstract

Being able to estimate the dirt distribution in an environment makes it possible to compute efficient cleaning paths for robotic cleaners. In this paper, we present a novel approach for modeling and estimating the dynamics of the dirt generation in an environment. Our model uses cell-wise Poisson processes on a regular grid to represent the dirt in the environment, which allows for an effective estimation of the dynamics of the dirt generation and for making predictions about the absolute dirt values. We propose two efficient cleaning policies which are based on the estimated dirt distributions and can easily be adapted to different needs of potential users. In extensive experiments carried out in simulation and with a modified iRobot Roomba vacuum cleaning robot, we demonstrate the effectiveness of our approach.
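The cell-wise Poisson model admits a very compact sketch: the maximum-likelihood rate of a Poisson process is the observed count divided by the observation time, the expected dirt since the last cleaning scales linearly with that rate, and a greedy policy visits the cells with the highest expectation first. The grid, counts, and policy below are illustrative toy values, not data or code from the paper:

```python
import numpy as np

# Hypothetical dirt observations: total counts per grid cell, collected over
# several runs; each cell is modeled as an independent Poisson process.
counts = np.array([[3, 0, 7],
                   [1, 9, 2],
                   [0, 4, 1]])
num_runs = 3                        # observation intervals per cell

# Maximum-likelihood Poisson rate per cell: lambda = counts / time.
rates = counts / num_runs

# Expected dirt accumulated since the last cleaning (t time units ago).
t_since_clean = 2.0
expected_dirt = rates * t_since_clean

# A simple greedy policy: clean the k cells with the highest expected dirt.
k = 3
order = np.argsort(expected_dirt, axis=None)[::-1][:k]
plan = [tuple(int(j) for j in np.unravel_index(i, counts.shape))
        for i in order]
print(plan)  # [(1, 1), (0, 2), (2, 1)]
```

A full policy would additionally weigh travel costs between cells against the expected dirt, which is where the trade-offs discussed in the paper come in.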

Bibtex
@inproceedings{hess13icra,
   author = {J{\"u}rgen Hess and Maximilian Beinhofer and Daniel Kuhner and Philipp Ruchti and Wolfram Burgard},
   title = {Poisson-Driven Dirt Maps for Efficient Robot Cleaning},
   booktitle = {Proc.~of the IEEE Int. Conf. on Robotics and Automation (ICRA)},
   month = {May},
   year = 2013,
   address = {Karlsruhe, Germany},
   url = {http://ais.informatik.uni-freiburg.de/publications/papers/hess13icra.pdf},
}

Task Space Motion Planner

This is an implementation of a motion planner that uses inverse differential kinematics to determine a trajectory for a robotic manipulator. The planner constructs a graph of task poses, e.g., the end-effector poses of the robot. Using the A* path planning algorithm, the framework tries to connect the start and goal states while evaluating the feasibility of the motion with inverse differential kinematics and collision checks. It is especially intended for constrained end-effector motions such as those needed in our Neurobots project, in which we manipulate objects such as cups and bottles containing liquids. However, it is not limited to such applications and can also be used for platform or combined arm-platform planning.

We tested the framework with KUKA omniRob and iiwa robots. The planner consists of several ROS packages; the code is available on GitHub.
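The search strategy described above can be sketched in miniature: A* expands a graph of discretized task poses, and every candidate pose is accepted only if a feasibility check passes. In the real planner that check runs inverse differential kinematics and collision tests; here it is a stub over a toy 2-D grid with a hypothetical blocked region, not the actual ROS implementation:

```python
import heapq

def feasible(pose):
    # Stand-in for the IK + collision check; hypothetical obstacle column.
    x, y = pose
    return not (x == 2 and y < 4)

def astar(start, goal, size=5):
    """A* over a size x size grid of discretized task poses."""
    def h(p):  # admissible heuristic: Manhattan distance to the goal
        return abs(p[0] - goal[0]) + abs(p[1] - goal[1])

    open_set = [(h(start), 0, start, [start])]   # (f, g, pose, path)
    seen = set()
    while open_set:
        _, g, pose, path = heapq.heappop(open_set)
        if pose == goal:
            return path
        if pose in seen:
            continue
        seen.add(pose)
        x, y = pose
        for n in ((x + 1, y), (x - 1, y), (x, y + 1), (x, y - 1)):
            if (0 <= n[0] < size and 0 <= n[1] < size
                    and feasible(n) and n not in seen):
                heapq.heappush(open_set, (g + 1 + h(n), g + 1, n, path + [n]))
    return None  # start and goal could not be connected

path = astar((0, 0), (4, 0))
print(len(path))  # 13 poses: the path detours around the blocked column
```

In the actual planner each graph edge corresponds to an end-effector displacement, so the same search transparently handles the constrained motions (e.g., keeping a cup upright) that the feasibility check enforces.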

GitHub

Private:

Daniel Kuhner
Ingeborg-Drewitz-Allee 16
79111 Freiburg
E-Mail: (address protected from spambots)

University:

Daniel Kuhner
Georges-Koehler-Allee 80
79110 Freiburg
E-Mail: (address protected from spambots)