Tamim Asfour, Karlsruhe Institute of Technology, Germany
Title: Generation of multi-contact whole-body motions based on natural language models learned from human motion data
Abstract: Generating multi-contact whole-body motions for humanoid robots and whole-body exoskeletons constitutes an open and, due to its high dimensionality, very challenging problem of vital interest for humanoid robotics research. In this talk, we present a data-driven approach for generating sequences of whole-body poses with multiple contacts, inspired by techniques from natural language processing. The approach uses human motion data for the autonomous generation of sequences of whole-body pose transitions for whole-body tasks that use the environment to enhance balance. To this end, we present a large-scale whole-body human motion database together with techniques for the systematic organization, annotation, and classification of human motion data, as well as for the contact-based segmentation of whole-body motion into support poses. These poses are subsequently used to train an n-gram language model, whose words are whole-body poses and whose sentences are sequences of these poses that characterize a motion. Using this language model, a sequence of whole-body pose transitions that satisfies the constraints imposed by the task is generated as the sequence of transitions with the highest probability according to the language model.
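The n-gram idea above can be illustrated with a minimal bigram sketch in Python. This is an illustration only, not the authors' implementation: the pose labels ("stand", "lean", etc.) are invented, and the greedy decoder is a simplification of searching for the globally most probable sequence under task constraints.

```python
from collections import defaultdict

def train_bigram(sequences):
    """Estimate pose-transition probabilities from 'sentences' of poses.

    Each sequence is a list of pose labels (the 'words'); each list is
    a 'sentence' describing one motion.
    """
    counts = defaultdict(lambda: defaultdict(int))
    for seq in sequences:
        for a, b in zip(seq, seq[1:]):
            counts[a][b] += 1
    # Normalize transition counts into probabilities.
    return {a: {b: n / sum(nxt.values()) for b, n in nxt.items()}
            for a, nxt in counts.items()}

def most_likely_path(model, start, length):
    """Greedy decode: repeatedly follow the most probable transition."""
    path = [start]
    for _ in range(length - 1):
        nxt = model.get(path[-1])
        if not nxt:
            break
        path.append(max(nxt, key=nxt.get))
    return path
```

Trained on example sentences such as ["stand", "lean", "hand-contact"], the model assigns each observed pose transition a probability, and decoding returns the highest-probability pose sequence of a given length.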
Bio: Tamim Asfour is a full Professor at the Institute for Anthropomatics and Robotics, Karlsruhe Institute of Technology (KIT), where he holds the chair of Humanoid Robotics Systems and heads the High Performance Humanoid Technologies Lab (H2T). His current research interest is high-performance 24/7 humanoid robotics. Specifically, his research focuses on engineering humanoid robot systems that are able to perform grasping and dexterous manipulation tasks and learn from human observation and sensorimotor experience, as well as on the mechano-informatics of humanoids: the synergetic integration of mechatronics, informatics, and artificial intelligence methods into complete humanoid robot systems. He is the developer of the ARMAR humanoid robot family and has led the humanoid research group at KIT since 2003. Tamim Asfour received his diploma degree in Electrical Engineering in 1994 and his PhD in Computer Science in 2003 from the University of Karlsruhe (TH).
Mehdi Benallegue, JRL, Japan
Title: Bipedal locomotion: a continuous tradeoff between robustness and energy-efficiency
Abstract: Walking is a mechanical process involving the body and subject to balance constraints. The structure of the body can allow stable walking to emerge with little to no energy input, but such motion is sensitive to external perturbations and prone to falls. On the other hand, with strong actuation and anticipation, many more disturbances can be overcome, but at the cost of high energy consumption. The choice between these two modes of locomotion is more than a context-dependent binary selection: it is a continuous tradeoff, which not only provides a span of different walking control policies but also builds a framework for a precise classification of these controllers. In this talk, we discuss, for humans and robots, the two extreme walking paradigms and how to sort the solutions ranging between them. Finally, we analyze the incentives behind the decision to switch to a more robust or a more efficient control policy. The development of these subjects is expected to provide a paradigm for the design of humanoid robots and new walking controllers.
Bio: Mehdi Benallegue received the ingénieur degree from the Institut National d’Informatique (INI), Algeria, in 2007, the M.Sc. degree from the University of Paris 7, France, in 2008, and the Ph.D. degree from the Université de Montpellier 2, France, in 2011. He has been a postdoctoral researcher in a neurophysiology laboratory at the Collège de France and at LAAS-CNRS. He is currently a Research Associate with the Humanoid Research Group at the National Institute of Advanced Industrial Science and Technology, Japan. His research interests include estimation and control of legged locomotion, biomechanics, neuroscience, and computational geometry.
Etienne Burdet, Imperial College London, UK
Title: Interaction control: in humans, for robots
Abstract: How do we interact with unknown environments and with other humans? By investigating how humans deal with unstable dynamics, we could elucidate how humans adapt to the environment in order to carry out unstable tasks typical of tool use, and how they learn to coordinate their muscles appropriately through practice. The first part of this talk will review these results and show how they led to a model of motor adaptation, which can also be used as a new robot behaviour bringing flexibility to the control of interactive tasks. Recently, we investigated interactive motor tasks between humans, using a dual robotic interface that enables us to modulate the interaction between the partners. The second part of the presentation will describe some of the surprising results elucidating involuntary coordination patterns between humans and analyse the underlying control mechanisms.
Bio: Dr. Etienne Burdet is Chair of Human Robotics at The Imperial College of Science, Technology and Medicine in the UK. He is also a visiting Professor at Nanyang Technological University in Singapore and at University College London. He holds an MSc in Mathematics (1990), an MSc in Physics (1991), and a PhD in Robotics (1996), all from ETH-Zürich. He was a postdoctoral fellow with TE Milner of McGill University, Canada, JE Colgate of Northwestern University, USA, and Mitsuo Kawato of ATR in Japan. Professor Burdet’s group uses an integrative approach of neuroscience and robotics to: i) investigate human motor control, and ii) design efficient systems for training and rehabilitation, which are tested in clinical trials.
Robert Gregg, University of Texas at Dallas, USA
Title: A continuous parameterization of human locomotion over time and task for unified control of powered prosthetic legs
Abstract: The human gait cycle is typically viewed as a periodic sequence of discrete events, starting with heel contact during initial stance and ending with knee extension during late swing. This convention has informed the design of control strategies for powered prosthetic legs, which almost universally switch between multiple distinct controllers through the gait cycle based on a finite state machine. Human locomotion is further discretized by activity, resulting in task-level state machines with a small set of task-specific control modes, e.g., one for uphill and one for downhill. However, this discrete methodology cannot synchronize to the continuous motions of the user or adapt to the continuum of user activities. Instead of discretely representing human locomotion, this talk will present a continuous parameterization of human locomotion across measurable phase and task variables. Two studies with 10 able-bodied human subjects identify 1) a phase variable that robustly parameterizes knee and ankle patterns across perturbations to the gait cycle, and 2) task variables to parameterize kinematic adaptations to ground slope and walking speed. A unifying prosthetic leg controller is then designed around this continuous parameterization of human gait to synchronize prosthetic joint patterns with the timing and activity of the human user. The viability of this approach is demonstrated by experiments with above-knee amputee subjects walking on a powered knee-ankle prosthesis at variable speeds and inclines.
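As a sketch of what a continuous phase variable can look like, the construction below maps a joint state to a phase in [0, 1) via the polar angle of its phase portrait. This is one common choice in the gait-control literature, assumed here purely for illustration; it is not necessarily the exact variable identified in the studies above.

```python
import math

def phase_variable(theta, dtheta, k=1.0):
    """Map a joint state to a continuous gait phase in [0, 1).

    theta is a zero-centered joint angle (rad, e.g. thigh angle) and
    dtheta its angular velocity (rad/s). The phase is the polar angle
    of the point (theta, -k * dtheta) in the phase portrait, normalized
    to [0, 1); the gain k scales the portrait to be roughly circular.
    All names and the choice of joint are illustrative assumptions.
    """
    return (math.atan2(-k * dtheta, theta) / (2 * math.pi)) % 1.0
```

Because the phase portrait of a periodic gait traces a closed loop, this angle advances monotonically through each stride, giving the controller a continuous clock that stays synchronized with the user instead of a finite state machine.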
Bio: Robert D. Gregg IV received the B.S. degree in electrical engineering and computer sciences from the University of California, Berkeley in 2006 and the M.S. and Ph.D. degrees in electrical and computer engineering from the University of Illinois at Urbana-Champaign in 2007 and 2010, respectively. He joined the Departments of Bioengineering and Mechanical Engineering at the University of Texas at Dallas (UTD) as an Assistant Professor in June 2013 with an adjunct appointment at the UT Southwestern Medical Center. Prior to joining UTD, he was a Research Scientist at the Rehabilitation Institute of Chicago and a Postdoctoral Fellow at Northwestern University. His research concerns the control mechanisms of bipedal locomotion with application to wearable control systems, including prostheses and orthoses. Dr. Gregg is a recipient of the NSF CAREER Award, the NIH Director’s New Innovator Award, and the Career Award at the Scientific Interface from the Burroughs Wellcome Fund. His work has been recognized with the Best Student Paper Award of the 2008 American Control Conference and the 2015 IEEE Conference on Decision & Control, the Best Technical Paper Award of the 2011 CLAWAR Conference, and the 2009 O. Hugo Schuck Award from the IFAC American Automatic Control Council. Dr. Gregg is a Senior Member of the IEEE Control Systems Society and the IEEE Robotics & Automation Society.
Herman van der Kooij, University of Twente, Netherlands
Title: How do humans cope with unexpected perturbations of gait?
Abstract: How humans maintain balance during walking is still poorly understood. To unravel the underlying mechanics of balance control during gait, we systematically applied perturbations to the waist in different directions, with different amplitudes, and at different phases of the gait cycle. Muscle activity, full-body kinematics, and ground reaction forces were recorded. Joint kinematics and kinetics were calculated with OpenSim. Following backward and forward perturbations, step length and step time did not change, while following sideways perturbations strong modulations in step length, width, and time were observed. The extrapolated centre of mass model could predict the CoP relative to the CoM at the end of the double-stance phase from the CoM velocity at the beginning of the same double-stance phase. With a linearised inverted pendulum model and energy cost functions, the observed changes in step parameters could be globally predicted. With a more complex neuromuscular model, the observed joint torques could be well predicted, in particular at the hip and knee joints. The models derived from the unique data set we collected can be used in the control of humanoids and wearable robots. We have implemented these models successfully in exoskeletons and a haptic gait trainer.
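The extrapolated centre of mass mentioned above has a compact closed form under the linear inverted pendulum assumption: XcoM = x + v / ω₀, with ω₀ = sqrt(g / l). A minimal sketch of that quantity (the numbers and the function name are illustrative, not the study's data or code):

```python
import math

G = 9.81  # gravitational acceleration, m/s^2

def extrapolated_com(x_com, v_com, pendulum_length):
    """Extrapolated centre of mass (XcoM) of a linear inverted pendulum.

    x_com: horizontal CoM position (m); v_com: horizontal CoM velocity
    (m/s); pendulum_length: effective pendulum length l (m). The XcoM
    adds to the CoM position the distance the CoM would travel given its
    current velocity, scaled by the pendulum's natural frequency omega0.
    """
    omega0 = math.sqrt(G / pendulum_length)
    return x_com + v_com / omega0
```

Placing the CoP at (or beyond) the XcoM brings the pendulum to rest, which is why the CoM velocity at the start of double stance suffices to predict the CoP at its end.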
Bio: Prof. Dr. ir. Herman van der Kooij (1970) received his PhD with honors (cum laude) in 2000 and is Professor of Biomechatronics and Rehabilitation Technology at the Department of Biomechanical Engineering at the University of Twente (0.8 fte) and at Delft University of Technology (0.2 fte), the Netherlands. His expertise and interests are in the fields of human motor control, adaptation, and learning; rehabilitation robots; diagnostic and assistive robotics; virtual reality; rehabilitation medicine; and neurocomputational modeling. He has published over 150 publications in the areas of biomechatronics and human motor control. He was awarded the prestigious Dutch VIDI and VICI personal grants in 2001 and 2015, respectively. He is associate editor of IEEE TBME and IEEE Robotics and Automation Letters, a member of the IEEE EMBS technical committee on Biorobotics, and has been a member of several scientific program committees in the fields of rehabilitation robotics, biorobotics, and assistive devices. He was co-leader of the workgroup on technology development for new rehabilitation robotics of the COST Action European Network on Robotics for NeuroRehabilitation. He is a member of the program committee of the Dutch IMDI core on Neurocontrol and of the NeuroSipe program, and is the coordinator of the FP7 project Symbitron.
Dana Kulic, University of Waterloo, Canada
Title: Human Motion Understanding for Performance Feedback
Abstract: In this talk, we will describe our work developing systems for on-line measurement and analysis of human movement that can be used to provide feedback to patients and clinicians/coaches during the performance of rehabilitation and sports training exercises. The system consists of wearable inertial measurement unit (IMU) sensors attached to the patient’s limbs. The IMU data is processed to estimate joint positions. We will describe an approach to improve the accuracy of pose estimation via on-line learning of the dynamic model of the movement, customized to each patient. Next, the pose data is segmented into exercise segments, identifying the start and end of each motion repetition automatically. The pose and segmentation data is visualized in a user interface, allowing the patient to simultaneously view their own movement overlaid with an animation of the ideal movement. We will present results of user studies analyzing the system capabilities for gait measurement of stroke patients undergoing gait rehabilitation, and demonstrating the significant benefits of feedback with patients undergoing rehabilitation following hip and knee replacement surgery.
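For intuition, automatic repetition segmentation can be reduced to its simplest possible form: detecting threshold crossings on a one-dimensional joint-angle stream. The sketch below is a hypothetical stand-in for illustration only, not the segmentation algorithm actually used in the system described above.

```python
def segment_repetitions(signal, threshold):
    """Split a 1-D joint-angle stream into repetition segments.

    A new repetition is assumed to start at each upward crossing of
    `threshold` (an illustrative simplification). Returns a list of
    (start, end) sample-index pairs, one per detected repetition.
    """
    starts = [i for i in range(1, len(signal))
              if signal[i - 1] < threshold <= signal[i]]
    return list(zip(starts, starts[1:] + [len(signal)]))
```

Real IMU-based pipelines segment in a learned feature space rather than on a single raw channel, but the output contract is the same: per-repetition index ranges that downstream visualization and feedback can consume.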
Bio: Dana Kulic received the combined B.A.Sc. and M.Eng. degree in electro-mechanical engineering, and the Ph.D. degree in mechanical engineering, from the University of British Columbia, Canada, in 1998 and 2005, respectively. From 2002 to 2006, Dr. Kulic worked with Dr. Elizabeth Croft as a Ph.D. student and a post-doctoral researcher at the CARIS Lab at the University of British Columbia, developing human-robot interaction strategies to quantify and maximize safety during the interaction. From 2006 to 2009, Dr. Kulic was a JSPS Post-doctoral Fellow and a Project Assistant Professor at the Nakamura-Yamane Laboratory at the University of Tokyo, Japan, working on algorithms for incremental learning of human motion patterns for humanoid robots. Dr. Kulic is currently an Associate Professor at the Electrical and Computer Engineering Department at the University of Waterloo, Canada. Her research interests include robot learning, humanoid robots, human-robot interaction, and mechatronics.
Katja Mombaur, University of Heidelberg, Germany
Title: Optimizing wearable robots for lower back support
Abstract: Understanding human movement and how it changes under different levels of support is an important topic for the design of wearable robots. The European project SPEXOR aims at developing spinal exoskeletons for low back pain prevention and vocational reintegration. In this talk, we present research performed at Heidelberg University in the context of the SPEXOR project, in which we use model-based optimization to predict human movement and optimize exoskeleton design to provide the best possible support. We consider scooping, lifting, squatting, and turning motions. The development of good parameterized models of the subject and the exoskeleton, and of the coupling between these two sub-models, is central to our research. The optimization goals range from mimicking motions observed in human motion capture recordings to optimizing performance- or effort-related criteria. We present results for optimized passive exoskeletons using springs in the hip and lumbar regions, as well as active exoskeletons with motors that can produce more general torque histories.
Bio: Katja Mombaur is a full professor at the Institute of Computer Engineering (ZITI) of Heidelberg University and head of the Optimization in Robotics & Biomechanics (ORB) group as well as the Robotics Lab. She holds a diploma degree in Aerospace Engineering from the University of Stuttgart and a Ph.D. degree in Mathematics from Heidelberg University. She was a postdoctoral researcher in the Robotics Lab at Seoul National University, South Korea, and also spent two years as a visiting researcher in the Robotics department of LAAS-CNRS in Toulouse. Katja Mombaur is coordinator of the newly founded Heidelberg Center for Motion Research. She is also a PI in the European H2020 project SPEXOR and the Graduate School HGS MathComp, as well as in several national projects. Until recently, she coordinated the EU FP7 project KoroiBot and was a PI in the EU projects MOBOT and ECHORD–GOP. She is founding chair of the IEEE RAS technical committee on Model-Based Optimization for Robotics.
Gentiane Venture, TUAT, Japan
Title: From the motion capture studio to daily life environments
Abstract: In this presentation, we will describe the technologies we have developed, based on lab-grade motion capture devices, to create personalized dynamics models and analyze the dynamics of human motion. We will also present our efforts to transfer these technologies to low-cost, highly flexible devices that can be used at home or in outpatient clinics, and the challenges that remain to be overcome to make these technologies fully accessible.
Bio: Gentiane Venture is a French roboticist who has been working in academia in Tokyo, Japan for more than 10 years. She is a distinguished professor at the Tokyo University of Agriculture and Technology. After graduating from the Ecole Centrale de Nantes and obtaining a PhD from the University of Nantes, in 2000 and 2003 respectively, she worked at the French Nuclear Agency and at the University of Tokyo. In 2009, she joined the Tokyo University of Agriculture and Technology, where she has established an international research group working on human science and robotics. The researchers in her group aim to encompass human motion dynamics and non-verbal communication in the design of complex intelligent robot behaviors, to achieve personalized human-machine interaction. The work of her group is highly interdisciplinary, involving collaborations with therapists, sociologists, psychologists, physiologists, philosophers, neuroscientists, ergonomists, biomechanists, and designers.
Katsu Yamane, Disney Research, USA
Title: Human Movement Understanding for Physical Human-Robot Interaction
Abstract: Physical human-robot interaction (pHRI) requires understanding human movements at two levels: 1) understanding the intention or emotion conveyed through the movement of the human with whom the robot is interacting, and 2) understanding how the robot should react through its own movement to create plausible interaction. In this talk, we present how we handle these issues in our past and current pHRI projects. The “ball catch” project addresses the issue of providing natural reactions through robot motions, while the “human-to-robot object hand-off” project focuses on the issue of estimating the timing and location of the object hand-off. Finally, we will present some preliminary findings on what information is required to create a genuine human-robot hugging experience.
Bio: Dr. Katsu Yamane is a Senior Research Scientist at Disney Research, Pittsburgh and an Adjunct Associate Professor at the Robotics Institute, Carnegie Mellon University. He received his B.S., M.S., and Ph.D. degrees in Mechanical Engineering in 1997, 1999, and 2002 respectively from the University of Tokyo, Japan. Prior to joining Disney in 2008, he was an Associate Professor at the University of Tokyo and a postdoctoral fellow at Carnegie Mellon University. Dr. Yamane is a recipient of the King-Sun Fu Best Transactions Paper Award and the Early Academic Career Award from the IEEE Robotics and Automation Society, and the Young Scientist Award from the Ministry of Education, Japan. His research interests include humanoid robot control and motion synthesis, physical human-robot interaction, character animation, and human motion simulation.
Eiichi Yoshida, AIST, Japan
Title: Product design and evaluation by integrated motion synthesis for digital human and humanoid
Abstract: We present an integrated approach to motion synthesis for human simulation and humanoid robotics that is useful for product design and evaluation. The first axis is to develop a system for human-centered product design through understanding the principles of human motion, using a digital human model that captures human shape, musculo-skeletal structure, and motions, as well as interactions with devices. Another main research direction is the development of a humanoid robot that can reproduce various human behaviors to serve as an evaluator of products such as assistive devices. This allows estimating their mechanical supportive effects in a quantitative manner, which is difficult with human measurement. We also introduce applications of this research to the standardization of wearable lumbar-support assistive devices.
Bio: Eiichi Yoshida received M.E. and Ph.D. degrees from the Graduate School of Engineering, the University of Tokyo, in 1993 and 1996 respectively. Since 2009, he has been serving as Co-Director of CNRS-AIST JRL (Joint Robotics Laboratory), UMI3218/RL, and was appointed Director in 2017. He has also served as Deputy Director of AIST/IS since 2015, after serving as Co-Director of LIA JRL-France at LAAS-CNRS, Toulouse, France, from 2004 to 2008. He was awarded the title of Chevalier de l’Ordre National du Mérite by the French Government in 2016. His research interests include robot task and motion planning, human modeling, and humanoid robots.