AI does not mean the robots are coming


Pepper the humanoid robot was born in 2014. It enjoyed a brief wave of hype, including a visit to the Financial Times to meet the editor. “This is a robot that behaves freely, driven by love,” said Masayoshi Son, the head of its main backer, SoftBank. Alibaba and Foxconn also invested hundreds of millions of dollars in the effort to make robots part of everyday life. It was not to be. You still find the occasional Pepper in a Japanese public library, unplugged, head bowed, like a four-foot-tall Pinocchio that dreamed of becoming a real boy but never made it. Production ceased in 2021, and only 27,000 units were ever made.

Yet the vision of humanoid robots – machines like us that can do all the work we don’t want to do – is too enticing to let go of for long. Recent, impressive advances in artificial intelligence have sparked a new wave of enthusiasm for robots. “The next wave of AI is physical AI. AI that understands the laws of physics, AI that can work among us,” said Jensen Huang, chief executive of chipmaker Nvidia, earlier this year. Nvidia’s chips, which are used to train AI models, have propelled it to become the second-largest company in the world by market capitalisation.

Billions of dollars in venture capital are pouring into robotics start-ups. They intend to use the same kind of training methods that let computers predict how a protein will fold or produce startlingly plausible text. The aim is, first, to let robots understand what they see in the physical world and interact with it naturally, and second, to break large tasks down into simple actions such as picking up and manipulating objects.

It is a seductive dream. Yet investors and early-stage entrepreneurs may end up as disillusioned as those who backed Pepper. That is not because AI is useless. Rather, it is because the hurdles to making an economical robot that can cook dinner and clean the bathroom lie in hardware as much as software, and AI by itself does not address them, let alone solve them.

The physical problems are many and severe. A human arm or leg is powered by muscle, for example, whereas a robot’s limb must be driven by motors. Each axis of movement the limb has to travel through requires its own motor. All of this is possible, as robotic arms in factories show, but the precision motors, gears and transmissions involved add mass, cost and power requirements, along with many moving parts that wear out.

Having produced the desired movement, there is the challenge of sensation. Pick up a piece of fruit, for example, and the nerves in your hand tell you how soft it is and how hard you can squeeze it. You can taste whether food is cooked and smell whether it is burning. None of those senses is easy to give to a robot, and to the extent that they can be, they add to the cost. Machine vision and AI can compensate, by judging whether fruit is ripe or whether food in a pan is the right colour, but only imperfectly.

Then there is the matter of power. Any autonomous machine needs its own energy source. Robot arms in factories are hooked up to thick cables; they cannot walk around. A humanoid robot would have to run on a battery, with all the trade-offs that implies in weight, power output, flexibility, operating time, usable life and cost. These are only some of the problems. Many smart people are working to solve them and are making progress. But the point is that these are physical, long-standing and difficult challenges. Even a revolution in AI does not make them go away.

So what will AI make possible in the physical world? Rather than imagining the new machines the technology will enable, it is easier to imagine how existing machines will change once AI is applied to them.

The obvious example is self-driving cars. Here the machine barely needs to change at all: the car’s movement through the physical world and its source of energy work as they always have, while the senses involved in driving can be replicated almost completely. Amid the new vogue for AI, autonomous cars have rather fallen out of the hype cycle. It should be the other way round: driving is a huge market and a real-world challenge that AI is well placed to tackle, a point anyone tempted to invest in other kinds of robot should think about.

It is also reasonable to think about how existing robots – from industrial robot arms to vacuum cleaners – will change. AI-powered machine vision will steadily expand the range of tasks a robotic arm can perform and make it safer for one to work alongside humans. Lightweight, single-purpose devices such as cleaning robots will keep getting better. In Chinese hotels, for example, it has already become common for a robot to deliver items to your room. That kind of limited, controlled autonomy is far easier to achieve.

In this way, AI will slowly bring us closer to androids. But as for a robot like Pepper that can clean the toilet – sadly, it will remain far easier to make one that writes bad poetry, and that is not likely to change any time soon.

robin.harding@ft.com


