Virtual Obstacle Courses are being used to ‘train’ robots to walk.

Virtual simulations are being used to teach robots, specifically four-legged, dog-like robots, how to walk properly.

These robots aren’t exactly real, according to Ars Technica, because they’re CGI simulations. They are, however, virtual versions of robots developed by a team of researchers at ETH Zurich in Switzerland in partnership with NVIDIA.

The four-legged robots, dubbed ANYmals, are being trained to conquer simulated obstacles such as steps, slopes, and sharp drops.

Every time the robots succeeded at a task, they were given a more difficult one. The researchers achieved this by gradually raising the difficulty of the simulated obstacles each time the control algorithm mastered the previous ones.
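
A minimal sketch of that kind of success-driven curriculum might look like the following. Note that the difficulty scale, the success rule, and the run_training_episode helper are hypothetical placeholders for illustration, not the researchers’ actual training code.

```python
import random


def run_training_episode(difficulty: float) -> bool:
    """Hypothetical stand-in for one simulated rollout.

    Returns True if the simulated robot crossed the obstacle course at the
    given difficulty. Here success is simply faked with a probability that
    drops as the terrain gets harder.
    """
    return random.random() < max(0.1, 1.0 - difficulty)


def curriculum_training(episodes: int = 1000) -> float:
    """Raise the terrain difficulty a little each time the robot succeeds."""
    difficulty = 0.0  # start with flat, easy terrain
    for _ in range(episodes):
        if run_training_episode(difficulty):
            # Success: make the next obstacle slightly harder.
            difficulty = min(1.0, difficulty + 0.01)
    return difficulty


if __name__ == "__main__":
    print(f"Final difficulty reached: {curriculum_training():.2f}")
```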

According to the researchers, the main purpose of this experiment is to develop a machine learning algorithm that can be used to control robot legs in the real world.

A video of the experiment, titled “Massively Parallel Deep Reinforcement Learning,” is available on the Robotic Systems Lab’s YouTube channel.

ANYmals are designed to “deliver unprecedented movement” when overcoming real-world obstacles, according to ANYbotics. That includes stairs, steps, gaps, and even confined spaces.

Furthermore, the four-legged robots appear to have been engineered to be extremely tough. ANYbotics promises “consistent performance” even in harsh conditions such as rain, wind, snow, dust, and water splashes.

What Does It Take for These Robots to ‘Learn’?

For many years, roboticists have struggled to replicate natural walking gaits, particularly for bipedal robots. Even the most advanced of these machines (such as Boston Dynamics’ parkour robots) can walk on two legs yet still have limited mobility.

These four-legged machines, on the other hand, take a cue from nature to solve the challenge of learning to walk: reinforcement learning. Animals in the real world pick up many of their skills through this kind of trial-and-error feedback. The researchers used the same AI technique on the ANYmals, giving the simulated robots positive and negative feedback as they practiced.

The program “judges” how effectively the robots performed the task as they navigated the virtual course, then tweaks the controller’s settings accordingly.
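
In code, that judge-and-tweak loop boils down to a score driving a parameter update. The sketch below is purely illustrative, not the lab’s actual controller or training code: the evaluate scoring function, the weight layout, and the random-search update are invented stand-ins for the real physics simulator and deep reinforcement learning algorithm.

```python
import numpy as np


def evaluate(weights: np.ndarray) -> float:
    """Hypothetical 'judge': score one simulated run of a walking controller.

    A real setup would roll the policy out in a physics simulator and reward
    forward progress while penalising falls; here we simply score how close
    the weights are to an arbitrary target gait.
    """
    target = np.ones_like(weights)
    return -float(np.sum((weights - target) ** 2))


def tweak_settings(weights: np.ndarray, iterations: int = 200) -> np.ndarray:
    """Crude stand-in for the learning update: keep random tweaks that score better."""
    best_score = evaluate(weights)
    for _ in range(iterations):
        candidate = weights + 0.1 * np.random.randn(*weights.shape)
        score = evaluate(candidate)
        if score > best_score:  # positive feedback: keep the better walker
            weights, best_score = candidate, score
    return weights


if __name__ == "__main__":
    initial = np.zeros((4, 8))  # e.g. 4 leg controllers x 8 observation features
    trained = tweak_settings(initial)
    print(f"Score before: {evaluate(initial):.2f}, after: {evaluate(trained):.2f}")
```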

NVIDIA’s AI Technology Is Proving Effective

The ANYmals are being trained on NVIDIA’s AI hardware to help them reach these goals. The researchers claim that running the simulations on this hardware has cut the time spent “training” the robots to less than one-hundredth of what it took previously.
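
The speed-up comes from stepping thousands of simulated robots in lockstep rather than one at a time. The batched NumPy sketch below only illustrates that idea; the environment dynamics, reward, and dimensions are made up, and the actual work runs on NVIDIA’s GPU-accelerated physics simulation rather than NumPy.

```python
import numpy as np

NUM_ENVS = 4096          # thousands of simulated robots stepped in lockstep
OBS_DIM, ACT_DIM = 48, 12


def step_batch(obs: np.ndarray, actions: np.ndarray):
    """Placeholder physics step applied to every environment at once.

    Advancing all robots in a single batched call is what cuts the
    wall-clock training time so dramatically.
    """
    next_obs = obs + 0.01 * np.random.randn(*obs.shape)        # fake dynamics
    rewards = actions.mean(axis=1) - np.abs(obs).mean(axis=1)  # fake reward
    return next_obs, rewards


obs = np.zeros((NUM_ENVS, OBS_DIM))
policy_weights = np.zeros((ACT_DIM, OBS_DIM))

for _ in range(10):                          # a few batched training steps
    actions = obs @ policy_weights.T         # one matrix multiply controls every robot
    obs, rewards = step_batch(obs, actions)
    # ...a policy update would consume the whole batch of rewards here...

print(f"Mean reward across {NUM_ENVS} parallel robots: {rewards.mean():.3f}")
```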

It’s this AI technology that is making the difference.
