The battle between robots and humans reaches a ‘turning point’

Jean J. Sanders


Warehouse robots are finally reaching their holy grail moment: picking and sorting objects with the dexterity of human hands.

Amazon has robotic arms that can pick and sort cumbersome items like headphones or plushy toys before they’ve been boxed. FedEx has piloted a similar system, which it uses in some warehouses to sort mail of various sizes.

And other companies are making progress, too.

For decades, training a robot to be more humanlike has stumped engineers, who couldn’t replicate the ability to grip and move items. But now gains in artificial intelligence technology, cameras and engineering are bearing fruit, allowing robots to see objects of varying shapes and sizes and adjust their grasp accordingly.

The technology, computer scientists say, is finally getting reliable enough that companies find it feasible to deploy.

“This moment is a turning point,” said Kris Hauser, a robotics expert and computer science professor at the University of Illinois at Urbana-Champaign. “They’re competent enough at this point.”

But the technology has also sparked contentious debate. Critics worry robots will take people’s jobs, though boosters say automation will simply create different ones. Others note that more robots could lead to higher rates of worker injury or tighter surveillance of human workers to ensure they’re hitting targets.

Beth Gutelius, an economic development professor at the University of Illinois at Chicago, said the way companies unleash these robots without much testing or regard to worker safety is concerning.

“Shouldn’t we all want these things to work better for more people?” she said.

Amazon founder Jeff Bezos owns The Washington Post.


Robots have been on the scene for years, but it’s been a slog for scientists to get them to replicate tasks as well as humans, particularly when it comes to hands. Amazon has Kiva robots, which look like Roombas and ferry shelves of products around the warehouse floor, but humans are still needed to pick, pack and sort the items.

Elon Musk has notoriously said he would automate Tesla’s manufacturing, but humans are still needed on the assembly line at the company’s Fremont, Calif., factory. He also recently unveiled Tesla’s prototype humanoid robot, Optimus, which aims to reshape physical work.

Google recently unveiled robots that are fueled by artificial intelligence to help humans with everyday tasks. Some robots are even learning how to cook fries.

Despite the advances, the hardest challenge for researchers has been teaching robots to adjust their grips to different sizes and shapes, said Ken Goldberg, an industrial engineering professor at the University of California at Berkeley.

But in the past decade, things have started to change, he said. 3D camera technology, spurred by Microsoft’s Kinect motion-sensing cameras, has become better at capturing the shape and position of objects. Deep learning, a field of artificial intelligence that uses algorithms loosely modeled on the brain, allows computers to learn from far more images. Researchers also started to better understand the physics of grasping and incorporated that knowledge into robotic suction cups and pickers.

The result: modern-day robotic machines that often look like long arms. Their vision is powered by software that uses machine-learning algorithms to analyze an object’s shape and instruct the robot on how to grip it. The suction cups or claws then adjust pressure and control with the finesse humans take for granted.
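In rough terms, that software evaluates many candidate grasps on a depth image and executes the highest-scoring one. The sketch below is a minimal, hypothetical illustration of that loop in Python; the helper names (candidate_grasps, score_grasp, pick_best_grasp) and the simple flatness-and-proximity heuristic are stand-ins for the trained deep-learning models real systems use, and none of it reflects any particular company’s software.

```python
# Minimal, illustrative grasp-selection loop. Everything here is hypothetical:
# real systems use trained deep-learning models and calibrated 3D cameras,
# not the stand-ins below.
import numpy as np

def candidate_grasps(depth_image: np.ndarray, num_candidates: int = 32):
    """Sample candidate grasp poses (row, col, gripper angle) from a depth image."""
    h, w = depth_image.shape
    rng = np.random.default_rng(0)
    rows = rng.integers(0, h, num_candidates)
    cols = rng.integers(0, w, num_candidates)
    angles = rng.uniform(0, np.pi, num_candidates)
    return list(zip(rows, cols, angles))

def score_grasp(depth_image: np.ndarray, grasp) -> float:
    """Stand-in for a learned grasp-quality model: prefer points on locally flat
    surfaces (low depth variance) that sit close to the camera."""
    r, c, _ = grasp
    patch = depth_image[max(r - 2, 0): r + 3, max(c - 2, 0): c + 3]
    flatness = 1.0 / (1.0 + patch.var())
    proximity = 1.0 / (1.0 + depth_image[r, c])
    return flatness * proximity

def pick_best_grasp(depth_image: np.ndarray):
    """Score every candidate and return the highest-ranked grasp pose."""
    grasps = candidate_grasps(depth_image)
    return max(grasps, key=lambda g: score_grasp(depth_image, g))

if __name__ == "__main__":
    # Fake depth image: a box-like object sitting on a flat surface.
    depth = np.full((64, 64), 1.0)
    depth[20:40, 25:45] = 0.6  # the object is closer to the camera
    best = pick_best_grasp(depth)
    print(f"Grasp at pixel ({best[0]}, {best[1]}), gripper angle {best[2]:.2f} rad")
```

In a production system, the scoring step would be a neural network trained on huge numbers of real or simulated grasp attempts, and the chosen pose would be handed to a motion planner that also modulates suction or gripper pressure.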

Amazon in particular has been chasing the technology, industry experts said. As one of the world’s largest retailers, plagued by high turnover and bound by promises to deliver packages quickly, the company has a strong financial incentive to automate its warehouse processes as much as possible.

In 2012, the company acquired mobile robotics company Kiva for $775 million in cash. In 2014, it announced a “picking challenge,” inviting scientists to create robots that could pick up assorted items, ranging from Sharpies to Oreo cookie packages, from a mobile shelf.

Last month, Amazon unveiled its picking-and-sorting robot called Sparrow, a long robotic arm that can grab items before they are packed in boxes. It’s being researched and developed in Massachusetts and in operation at an Amazon facility in Dallas, officials said. It can sort roughly 65 percent of products in its inventory, according to company officials, but nationwide expansion plans aren’t set yet.

The robot fits into a broader automation strategy, according to Amazon. If mastered, Sparrow could pick up products after they’ve been offloaded from trucks and before they’re wrapped and placed onto mobile shelving. Once items are boxed, Amazon’s robotic system called Robin could sort them by destination, and Cardinal, another robotic machine, could place them into a waiting cart to be loaded onto a truck.

Amazon has consistently said more machines will allow people to find better jobs. Robots are “taking on some of the highly repetitive tasks within our operations, freeing up our employees to work on other tasks that are more engaging,” said Xavier Van Chau, a spokesman for the company.

In March, mailing giant Pitney Bowes inked a $23 million deal with Ambi Robotics to use the company’s picking-and-sorting robots to help sort packages of various shapes, sizes and packaging materials. In August, FedEx agreed to purchase $200 million in warehouse robotics from Berkshire Grey to do similar tasks. A few months before that, FedEx launched an AI-fueled mail-sorting robot in China.

Although the bulk of the technology started to appear a few years ago, it has taken time to get these systems’ error rates below 1 percent, Hauser said, which is crucial for company bottom lines.

“Each mistake is costly,” he added. “But now, [robots] are at a point where we can actually show: ‘Hey, this is going to be as reliable as your conveyor belt.’”
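To see why pushing error rates below 1 percent matters, a rough, purely illustrative calculation helps; the daily volume below is a made-up figure, not a company statistic.

```python
# Back-of-the-envelope illustration of why small error rates matter at warehouse scale.
daily_picks = 1_000_000  # hypothetical number of items a large facility might handle per day

for error_rate in (0.05, 0.01, 0.005):
    mistakes = daily_picks * error_rate
    print(f"At a {error_rate:.1%} error rate: ~{mistakes:,.0f} mishandled items per day")
```

At that scale, even a one-point drop in the error rate removes thousands of costly mistakes a day, which is why reliability, not raw dexterity, has gated deployment.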


Revenue generated by companies making picking-and-sorting robots is skyrocketing, said Ash Sharma, a robotics and warehouse industry expert at Interact Analysis, a market research firm.

The research firm estimates companies that make these products will rake in $365 million this year and more than $640 million next year. That’s a jump from the roughly $200 million these companies generated last year and $50 million in 2020, the firm’s forecasts show.

A big factor is the labor shortage, he said.

Gutelius, of the University of Illinois at Chicago, said that although the technology is interesting, it comes with risks. With more robots on warehouse floors, the workers alongside them will have to keep up a quicker pace, risking more injuries.

The Washington Post has reported that Amazon warehouses can be more dangerous than those of its rivals. Experts say that adding robots to the process can increase injuries.

Van Chau said machines doing repetitive tasks will help workers. “We can take some of that strain away from employees,” he said.


But Gutelius says companies’ claims that these robots will help workers need to be scrutinized, because firms tend to deploy the technology too quickly.

“It’s sort of classic ‘move fast and break things,’” she said. “And in this case, I think ‘breaking things,’ it ends up being people.”
