Evan Ackerman reports in IEEE Spectrum:
Humanoid robots have a very distinctive walk. Knees bent, torso as stationary as possible. Even Boston Dynamics’ own Atlas uses this crouching sort of squat-walk to get around, because those perpetually bent legs are how it keeps from falling over. This sort of gait is so common with humanoid robots that it’s become the “normal” robot gait, but it’s also not at all the way that humans walk. We walk with straight legs, locking our knees with each stride, because it’s much easier to support our weight that way. You can try it for yourself: that bent knee “bipedal robot” walk gets tiring to keep up, because your leg muscles always have to be engaged.
At IHMC, roboticists are busy solving this problem by teaching Atlas to walk more like we do. In addition to making humanoid robots more efficient, a more natural walking gait could make them more versatile as well, able to handle more rugged terrain than robots that spend all of their time in a squat.
Most robots don’t walk this way because bent knees give them a lot more control. With a bent knee, you can either bend it more or straighten it to help keep your robot stable, whereas straight legs mean that balancing has to be done mostly with the ankles instead. Another difference you might notice is that bent-legged walking robots tend to lift their feet straight up and put them straight down again, while the IHMC Atlas uses a “toe-off” motion like humans do. Walking like this is a much harder problem to solve, which is why it looks weird to see Atlas doing it—with a very few exceptions, DURUS being one of them, humanoid robots just don’t walk with straight legs.
To get Atlas to walk this way, IHMC developed a new whole-body control framework, and then nudged the controller towards giving the robot straight legs without explicitly requiring the legs to exert a force on the ground. By leaving other motions (like the motion of the robot’s feet) unconstrained, natural gait behaviors emerge on their own, like the toe-off motion. In other words, IHMC didn’t have to program that in specifically, it was just the best solution that the controller came up with for walking while trying to maintain straight legs. It certainly seems promising that a controller with fewer constraints resulted in a more natural gait, and in real-world testing, Atlas was able to walk over a variety of terrain and even react to mildly aggressive shoves with its legs straight.
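To make the “nudging” idea concrete, here is a minimal sketch of a weighted task-space controller: high-weight tasks (center of mass and swing foot) are tracked tightly, while a low-weight knee-straightening objective only shapes the solution where it doesn’t conflict with them. The Jacobians, weights, and dimensions below are illustrative placeholders, not IHMC’s actual whole-body framework.

```python
# Sketch: a weak "straighten the knee" objective biases the solution without
# constraining it, so it yields whenever the high-weight tasks need the knee.
# All Jacobians, weights, and dimensions here are made up for illustration.
import numpy as np

def solve_weighted_tasks(tasks):
    """Solve  min sum_i w_i * ||J_i qdd - b_i||^2  for joint accelerations qdd."""
    A = np.vstack([np.sqrt(w) * J for J, b, w in tasks])
    y = np.hstack([np.sqrt(w) * b for J, b, w in tasks])
    qdd, *_ = np.linalg.lstsq(A, y, rcond=None)
    return qdd

n_dof = 6                                     # toy leg + base (placeholder)
rng = np.random.default_rng(0)

J_com  = rng.standard_normal((2, n_dof))      # CoM task Jacobian (placeholder)
J_foot = rng.standard_normal((2, n_dof))      # swing-foot Jacobian (placeholder)
J_knee = np.zeros((1, n_dof)); J_knee[0, 3] = 1.0   # selects the knee joint

tasks = [
    (J_com,  np.array([0.1, 0.0]), 100.0),    # track desired CoM acceleration
    (J_foot, np.array([0.0, 0.2]), 100.0),    # track desired swing-foot acceleration
    (J_knee, np.array([0.5]),        1.0),    # weak bias: accelerate knee toward straight
]

print("joint accelerations:", solve_weighted_tasks(tasks))
```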
For more details, we spoke with the first author of the 2018 ICRA paper where this research was presented, Robert Griffin, via email.
IEEE Spectrum: Can you summarize why most bipedal robots walk in a squat and why most humans don’t?
Robert Griffin: Humans likely walk with straighter legs because walking with bent legs is much less efficient. Standing straight requires almost no forces at the knees, while bending your knees requires significant forces. However, most walking models used for robots assume a constant center of mass height throughout the gait as part of the model. The only way for them to achieve this constant height is to walk with a squatted posture. Additionally, planning a height trajectory is a pretty challenging problem, particularly for an online application where the robot has to compute the trajectory in real time, so most researchers avoid that problem for online applications. Lastly, walking squatted allows the robot to avoid straightening out its legs and potentially running into a singularity at the knee joint, which can be tricky to handle.
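The “constant center of mass height” assumption Griffin mentions is the basis of the widely used linear inverted pendulum (LIP) model. The sketch below (with illustrative parameters, not IHMC’s code) shows why that assumption is attractive for online planning: it makes the horizontal CoM dynamics linear and cheap to compute, at the cost of forcing the legs to hold one fixed height, i.e., a crouch.

```python
# Sketch of the linear inverted pendulum (LIP) model: the CoM is assumed to
# stay at a constant height z0, which linearizes the horizontal dynamics.
# Parameters and the simple Euler integration are illustrative only.
import math

g, z0 = 9.81, 0.85            # gravity, assumed constant CoM height [m]
omega = math.sqrt(g / z0)     # natural frequency of the LIP

x, xd = 0.02, 0.1             # CoM offset from the foot [m] and velocity [m/s]
p = 0.0                       # support point (foot) position
dt = 0.005

for _ in range(200):          # one second of single support
    xdd = omega**2 * (x - p)  # linear dynamics: only valid if height is constant
    xd += xdd * dt
    x  += xd * dt

# The "capture point" tells a planner where to step to bring the robot to rest.
capture_point = x + xd / omega
print(f"CoM offset after 1 s: {x:.3f} m, capture point: {capture_point:.3f} m")
```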
What are the disadvantages of walking with straight legs?
When walking with straight legs, the robot has less control authority. Think of an athlete: they bend their knees slightly in an "athletic stance" so that they can quickly change directions. When your knee is straight, you can’t use it to push against the ground. When it’s bent, you can achieve larger ground reaction forces, allowing greater control authority. Additionally, impacts when taking a step become unavoidable. This can be tough for the robot to handle.
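As a rough illustration of the “can’t push against the ground with a straight knee” point, the sketch below uses a planar two-link leg (link lengths and angles are made up) to show that the knee joint’s leverage along the hip-to-foot axis shrinks to zero as the leg straightens, which is the loss of control authority Griffin describes.

```python
# Sketch (not from the paper): for a planar 2-link leg, the foot velocity
# produced by the knee joint has no component along the hip-to-foot axis when
# the knee is straight, so (by duality, tau = J^T F) knee torque cannot
# modulate the ground reaction force along the leg.
import numpy as np

l1 = l2 = 0.4                                 # thigh and shank length [m] (placeholder)

def leg_axis_authority(q1, q2):
    """Magnitude of knee-induced foot velocity along the hip-to-foot axis."""
    foot = np.array([l1 * np.sin(q1) + l2 * np.sin(q1 + q2),
                     -l1 * np.cos(q1) - l2 * np.cos(q1 + q2)])
    axis = foot / np.linalg.norm(foot)        # hip-to-foot direction
    knee_col = np.array([l2 * np.cos(q1 + q2),    # d(foot)/d(knee): Jacobian column
                         l2 * np.sin(q1 + q2)])
    return abs(float(knee_col @ axis))

for knee_deg in (40, 20, 10, 0):              # progressively straighter knee
    a = leg_axis_authority(q1=np.deg2rad(10), q2=np.deg2rad(knee_deg))
    print(f"knee bend {knee_deg:2d} deg -> leg-axis authority {a:.3f}")
# The value shrinks to 0.000 at a straight knee: balancing must shift to the ankles.
```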
The way Atlas walks in the video is similar to the way we’ve seen DURUS walk. How is your research related to what they’ve been working on?
The gait that DURUS shows is excellent, and I’m glad to be compared to it, as both our research and theirs are trying to achieve a more natural, human-like gait. To my knowledge, they take a significantly different approach than ours, running an offline optimization using the hybrid zero dynamics of the robot to determine joint setpoints and generate a specific gait. In our approach, we’re not defining joint commands for the robot directly, and we calculate everything online, including the dynamic trajectory.
To what extent did fewer constraints lead to the emergence of more natural walking behaviors? Did that surprise you?
In this work, we didn’t specify a desired height, and when toeing off, we didn’t specify anything about the foot pitch. This allowed the robot to walk with straighter legs. It also allowed the robot to use a toe-off motion as it needed. I think the toe-off motion is really what surprised me at first. I also was surprised at how well the robot was able to bend its knees when necessary and lower its height to achieve horizontal tasks. In simulation, we were able to have it take much longer single steps (starting from rest and ending at rest after a single step), about 1.5 m, and much longer step downs (approximately 0.5 m) than we’d otherwise been capable of. We’re pretty excited to try this on hardware, but haven’t had the chance yet.
When you say that you’d like to get the robot “to traverse similar terrains as humans,” can you describe what kinds of specific capabilities you’re thinking of?
The long term vision is to make robots that are capable of the same locomotion feats as humans, so they can function as true human avatars. What we’d really like to try next with this algorithm is larger step downs, especially with uncertainty, where the robot doesn’t know that it has to step down. Humans are really adept at this, with something about our gaits just being naturally robust to uncertainty. We’re trying now to design approaches that are capable of both precise footstep placement, such as when walking over a rock field with few, sparse footholds, and are robust to when this precision fails, such as really compliant terrain with lots of subtle height variations, using a single algorithm. What we have currently is capable of really good precision, whereas robots like ATRIAS have shown impressive levels of robustness to uncertainty. We think that, with a few tricks, we can combine this robustness with precision to make a truly capable robot.
This idea of combining precise footstep placement with robustness is a very interesting one: ATRIAS is famous for being able to walk across rough terrain with no sensors at all, relying entirely on its nimble legs and sense of balance to stay upright no matter what its feet end up doing. But sometimes you need precise foot placement to perform specific tasks, and it would be more than a little bit useful to be able to do both of these things at once.