Intelligent Systems


2024


Natural and Robust Walking using Reinforcement Learning without Demonstrations in High-Dimensional Musculoskeletal Models
2024 (misc)

Abstract
Humans excel at robust bipedal walking in complex natural environments. In each step, they adequately tune the interaction of biomechanical muscle dynamics and neuronal signals to be robust against uncertainties in ground conditions. However, it is still not fully understood how the nervous system resolves the musculoskeletal redundancy to solve the multi-objective control problem considering stability, robustness, and energy efficiency. In computer simulations, energy minimization has been shown to be a successful optimization target, reproducing natural walking with trajectory optimization or reflex-based control methods. However, these methods focus on one particular motion at a time, and the resulting controllers are limited in their ability to compensate for perturbations. In robotics, reinforcement learning (RL) methods recently achieved highly stable (and efficient) locomotion on quadruped systems, but the generation of human-like walking with bipedal biomechanical models has required extensive use of expert data sets. This strong reliance on demonstrations often results in brittle policies and limits the application to new behaviors, especially considering the potential variety of movements for high-dimensional musculoskeletal models in 3D. Achieving natural locomotion with RL without sacrificing its incredible robustness might pave the way for a novel approach to studying human walking in complex natural environments. Videos: https://sites.google.com/view/naturalwalkingrl
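As a rough illustration of the multi-objective control problem described above, the sketch below combines velocity tracking with an energy (muscle-effort) penalty in a single reward, assuming a Gym-style musculoskeletal environment. All names and weights (forward_velocity, muscle_activations, w_energy, ...) are hypothetical and not taken from the paper.

```python
# Minimal sketch of a multi-objective locomotion reward (illustrative only).
import numpy as np

def locomotion_reward(forward_velocity, muscle_activations,
                      target_velocity=1.25, w_vel=1.0, w_energy=0.05,
                      w_alive=0.1):
    """Trade off speed tracking, muscle effort, and staying upright."""
    vel_term = np.exp(-(forward_velocity - target_velocity) ** 2)  # track target speed
    energy_term = np.sum(np.square(muscle_activations))            # effort/energy proxy
    return w_vel * vel_term - w_energy * energy_term + w_alive     # alive bonus each step

# Example: near-target speed with moderate muscle activation.
print(locomotion_reward(1.2, np.full(80, 0.1)))
```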

link (url) [BibTex]

2022


A Sequential Group VAE for Robot Learning of Haptic Representations

Richardson, B. A., Kuchenbecker, K. J., Martius, G.

Workshop paper (8 pages) presented at the CoRL Workshop on Aligning Robot Representations with Humans, Auckland, New Zealand, December 2022 (misc)

Abstract
Haptic representation learning is a difficult task in robotics because information can be gathered only by actively exploring the environment over time, and because different actions elicit different object properties. We propose a Sequential Group VAE that leverages object persistence to learn and update latent general representations of multimodal haptic data. As a robot performs sequences of exploratory procedures on an object, the model accumulates data and learns to distinguish between general object properties, such as size and mass, and trial-to-trial variations, such as initial object position. We demonstrate that after very few observations, the general latent representations are sufficiently refined to accurately encode many haptic object properties.
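The core idea, an object-level latent shared across a sequence of trials on the same object plus a trial-level latent that varies, can be sketched as follows. This is a minimal PyTorch illustration under assumed dimensions, not the authors' implementation.

```python
# Toy group VAE: split the latent code into a per-object "content" part,
# shared across trials on one object, and a per-trial "variation" part.
import torch
import torch.nn as nn

class GroupVAE(nn.Module):
    def __init__(self, x_dim=64, z_obj=8, z_trial=4):
        super().__init__()
        self.enc = nn.Sequential(nn.Linear(x_dim, 128), nn.ReLU(),
                                 nn.Linear(128, 2 * (z_obj + z_trial)))
        self.dec = nn.Sequential(nn.Linear(z_obj + z_trial, 128), nn.ReLU(),
                                 nn.Linear(128, x_dim))
        self.z_obj = z_obj

    def forward(self, x):                       # x: (trials, x_dim), one object
        mu, logvar = self.enc(x).chunk(2, dim=-1)
        # Tie the object part of the posterior mean across the whole group.
        mu_obj = mu[:, :self.z_obj].mean(0, keepdim=True).expand_as(mu[:, :self.z_obj])
        mu = torch.cat([mu_obj, mu[:, self.z_obj:]], dim=-1)
        z = mu + torch.randn_like(mu) * (0.5 * logvar).exp()   # reparameterize
        x_hat = self.dec(z)
        kl = -0.5 * (1 + logvar - mu.pow(2) - logvar.exp()).sum()
        return ((x_hat - x) ** 2).sum() + kl    # negative ELBO (Gaussian terms)

loss = GroupVAE()(torch.randn(5, 64))           # 5 exploratory trials, one object
loss.backward()
```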

link (url) Project Page [BibTex]



A Soft Vision-Based Tactile Sensor for Robotic Fingertip Manipulation

Andrussow, I., Sun, H., Kuchenbecker, K. J., Martius, G.

Workshop paper (1 page) presented at the IROS Workshop on Large-Scale Robotic Skin: Perception, Interaction and Control, Kyoto, Japan, October 2022 (misc)

Abstract
For robots to become fully dexterous, their hardware needs to provide rich sensory feedback. High-resolution haptic sensing similar to the human fingertip can enable robots to execute delicate manipulation tasks like picking up small objects, inserting a key into a lock, or handing a cup of coffee to a human. Many tactile sensors have emerged in recent years; one especially promising direction is vision-based tactile sensing due to its low cost, low wiring complexity, and high-resolution sensing capabilities. In this work, we build on previous findings to create a soft fingertip-sized tactile sensor. It can sense normal and shear contact forces all around its 3D surface with an average prediction error of 0.05 N, and it localizes contact on its shell with an average prediction error of 0.5 mm. The software of this sensor uses a data-efficient machine-learning pipeline to run in real time on hardware with low computational power like a Raspberry Pi. It provides a maximum data frame rate of 60 Hz via USB.
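A hypothetical sketch of the kind of real-time loop such a vision-based sensor could run on low-power hardware: capture a frame of the deformed shell, reduce it to features, and regress contact force and location. The placeholder linear model and file name (regressor_weights.npy) are illustrative assumptions, not the sensor's actual pipeline.

```python
# Illustrative real-time inference loop for a vision-based tactile sensor.
import cv2
import numpy as np

cap = cv2.VideoCapture(0)                 # internal camera viewing the soft shell
W = np.load("regressor_weights.npy")      # assumed pretrained linear map, shape (6, 1025)

while True:
    ok, frame = cap.read()
    if not ok:
        break
    gray = cv2.cvtColor(frame, cv2.COLOR_BGR2GRAY)
    small = cv2.resize(gray, (32, 32))                   # cheap feature image
    feats = np.append(small.ravel() / 255.0, 1.0)        # flattened pixels + bias
    force_xyz, contact_xyz = np.split(W @ feats, 2)      # 3D force (N), location (mm)
    print(f"force {force_xyz} N at {contact_xyz} mm")

cap.release()
```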

link (url) Project Page [BibTex]



Machine-Learning-Driven Haptic Sensor Design

Sun, H.

University of Tuebingen, Library, 2022 (phdthesis)

Abstract
Similar to biological systems, robots may need skin-like sensing ability to perceive interactions in complex, changing, and human-involved environments. Current skin-like sensing technologies are still far behind their biological counterparts when considering resolution, dynamic range, robustness, and surface coverage together. One key challenge is the wiring of sensing elements. During my Ph.D. study, I explore how machine learning can enable the design of a new kind of haptic sensor to deal with this challenge. On the one hand, I propose super-resolution-oriented tactile skins, reducing the number of physical sensing elements while achieving high spatial accuracy. On the other hand, I explore vision-based haptic sensor designs. In this thesis, I present four types of machine-learning-driven haptic sensors that I designed for coarse and fine robotic applications, ranging from large-surface sensing (robot limbs) to small-surface sensing (robot fingers). Moreover, I propose a super-resolution theory to guide sensor design at all levels: hardware design (material/structure/transduction), data collection (real/simulated), and signal-processing methods (analytical/data-driven). I investigate two designs for large-scale coarse-resolution sensing, e.g., robotic limbs. HapDef sparsely attaches a few strain gauges to the inside of a large curved surface to measure deformation over the whole surface. ERT-DNN wraps a large surface with a piece of multi-layered conductive fabric, whose conductivity varies under contact. I also conceive two approaches for small-scale fine-resolution sensing, e.g., robotic fingertips. BaroDome sparsely embeds a few barometers inside a soft elastomer to measure internal pressure changes caused by external contact. Insight encloses a high-resolution camera that views a soft shell from within. Generically, an inverse problem needs to be solved when trying to obtain high-resolution sensing with a few physical sensing elements. I develop machine-learning frameworks suitable for solving this inverse problem; they process raw sensor data of various kinds and extract useful haptic information in practice. The machine-learning methods rely on data collected by an automated robotic stimulation device or synthesized using finite element methods. I build several physical testbeds and finite element models to collect copious data, and I propose machine-learning frameworks that combine data from different sources, cope with the noise in real data, and generalize well from seen to unseen situations. While developing my prototype sensors, I have faced recurring design choices. To support my developments and guide future research, I propose a unified theory built on the concept of taxel-value-isolines. It captures the physical effects required for super-resolution, ties them to all parts of the sensor design, and allows us to assess them quantitatively. The theory explains the physically achievable accuracy of localizing and quantifying contact, given the uncertainty introduced by measurement noise in the sensing elements. The theoretical analysis aims to predict the best performance before a physical prototype is built and helps to evaluate the hardware design, data collection, and data-processing methods during implementation. This thesis presents a new perspective on haptic sensor design: using machine learning to replace the entire data-processing pipeline, I present several haptic sensor designs for applications ranging from large-surface skins to high-resolution tactile fingertip sensors, and the developed theory of optimal super-resolution can guide future sensor designs.
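As an illustration of the super-resolution inverse problem described in the abstract, the toy sketch below trains a small network to recover a continuous contact location from a handful of simulated taxel readings. The Gaussian taxel-response model, sensor layout, and network size are assumptions made for the example, not the thesis's actual designs.

```python
# Toy haptic super-resolution: 8 taxels, continuous contact localization.
import numpy as np
from sklearn.neural_network import MLPRegressor

rng = np.random.default_rng(0)
taxels = np.linspace(0.0, 100.0, 8)            # 8 sensing elements along 100 mm

def readings(contact_mm, noise=0.01):
    """Each taxel responds smoothly to nearby contact (taxel-value curves)."""
    r = np.exp(-((taxels - contact_mm) / 20.0) ** 2)
    return r + rng.normal(0.0, noise, r.shape)

contacts = rng.uniform(0.0, 100.0, 5000)       # synthetic training contacts
X = np.stack([readings(c) for c in contacts])
model = MLPRegressor(hidden_layer_sizes=(64, 64), max_iter=300).fit(X, contacts)

test = rng.uniform(0.0, 100.0, 500)
pred = model.predict(np.stack([readings(c) for c in test]))
print("mean localization error: %.2f mm" % np.mean(np.abs(pred - test)))
```

With smooth, overlapping taxel responses, the learned model localizes contact far more finely than the 14 mm taxel spacing, which is the essence of the super-resolution argument.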

link (url) [BibTex]


2014


Robot Learning by Guided Self-Organization

Martius, G., Der, R., Herrmann, J. M.

In Guided Self-Organization: Inception, 9, pages: 223-260, Emergence, Complexity and Computation, Springer Berlin Heidelberg, 2014 (incollection)

link (url) DOI [BibTex]


2013


Behavior as broken symmetry in embodied self-organizing robots

Der, R., Martius, G.

In Advances in Artificial Life, ECAL 2013, pages: 601-608, MIT Press, 2013 (incollection)

[BibTex]


2011


Tipping the Scales: Guidance and Intrinsically Motivated Behavior

Martius, G., Herrmann, J. M.

In Advances in Artificial Life, ECAL 2011, pages: 506-513, (Editors: Tom Lenaerts and Mario Giacobini and Hugues Bersini and Paul Bourgine and Marco Dorigo and René Doursat), MIT Press, 2011 (incollection)

[BibTex]


2010


LpzRobots: A free and powerful robot simulator

Martius, G., Hesse, F., Güttler, F., Der, R.

http://robot.informatik.uni-leipzig.de/software, 2010 (misc)

[BibTex]



Playful Machines: Tutorial

Der, R., Martius, G.

http://robot.informatik.uni-leipzig.de/tutorial?lang=en, 2010 (misc)

[BibTex]



Taming the Beast: Guided Self-organization of Behavior in Autonomous Robots

Martius, G., Herrmann, J. M.

In From Animals to Animats 11, 6226, pages: 50-61, LNCS, Springer, 2010 (incollection)

link (url) DOI [BibTex]
