It just so happens that the phrase “turn the lights on” sounds a lot like the word “tenderloin.” That seemingly unimportant phonetic connection became an early challenge for Amazon’s Alexa Shopping team. After all, the world’s largest online store didn’t want to ship its customers surprise packages of meat when all they wanted was to flick on a light switch.
So the company devised a ranking system for its voice commands, placing a request for the lights, which is used a lot, high above a request for tenderloin, which isn’t. To hone this system, the company gave Alexa contextual awareness too, so the voice assistant could tell if a conversation is related to groceries and not smart home controls.
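The idea of combining a recognizer's best guesses with a context-dependent prior can be sketched in a few lines. This is a hypothetical illustration, not Amazon's actual system — the function name, scores, and priors are all invented for the example.

```python
# Hypothetical sketch of context-aware reranking of speech hypotheses.
# All names, scores, and priors here are illustrative.

def rerank(hypotheses, context):
    """Combine the recognizer's acoustic score with a context prior.

    hypotheses: list of (phrase, acoustic_score) pairs
    context:    dict mapping phrase -> prior probability in this context
    """
    scored = [
        (phrase, acoustic * context.get(phrase, 0.01))
        for phrase, acoustic in hypotheses
    ]
    return max(scored, key=lambda pair: pair[1])[0]

# Two near-homophones with similar acoustic scores.
hypotheses = [("turn the lights on", 0.48), ("tenderloin", 0.52)]

smart_home_context = {"turn the lights on": 0.9, "tenderloin": 0.01}
grocery_context = {"turn the lights on": 0.05, "tenderloin": 0.8}

print(rerank(hypotheses, smart_home_context))  # → turn the lights on
print(rerank(hypotheses, grocery_context))     # → tenderloin
```

Even though "tenderloin" scores slightly higher acoustically, the smart-home prior flips the decision toward the lights — the same mechanism, at much larger scale, that keeps surprise meat deliveries off your doorstep.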
“When we identify that’s the context of your dialogue, we then do the ranking within that context and recognize that the word you actually said was ‘tenderloin,’” Chuck Moore, vice president of Alexa Shopping, told me during Amazon’s re:MARS artificial intelligence and robotics conference in Las Vegas earlier this month.
This precise voice-recognition processing is part of Amazon’s push to bring its AI expertise and automation to just about every layer of its business, including its warehouse robots, cashierless retail stores and, of course, Alexa. This behind-the-scenes tech is already providing Amazon’s customers with faster deliveries and helping people streamline their errands, like creating a shopping list or picking up a gallon of milk.
The retail giant is just one of many tech heavy-hitters pouring resources into AI, which allows computers and bots to perform higher-level tasks like decision-making and predicting customers’ needs. Microsoft, Google, Apple and Facebook are also touting how the technology can change and improve our lives.
At re:MARS, CNET spoke to four Amazon executives representing a wide range of its businesses. They provided an exclusive look into some of the inner workings of Amazon’s AI development, showing how the tech has become the critical ingredient they use to compete against rival retailers like Walmart and cloud-service providers like Microsoft and Google.
“[AI] is sprinkled everywhere,” Creative Strategies analyst Carolina Milanesi said after attending re:MARS. “It’s an integral part of every service they offer, every product they make and every business they run.”
But automation and AI have also become dirty words for plenty of people, with the terms dredging up worries about robots stealing people’s jobs. Though automation of tasks has happened for centuries, the rapid development of new technology has the potential to disrupt huge chunks of the economy. Analysts at Oxford Economics now predict up to 20 million global manufacturing jobs could be automated out of existence by 2030. Other studies say tens of millions of US jobs are at high risk, too, particularly low-skill, repetitive work in areas like transportation and warehousing.
Amazon executives say they don’t see gloom and doom in AI and automation, noting that they continue to hire thousands more people to work alongside their warehouse bots and to create the latest machine-learning code.
“We’re not particularly worried about job displacement,” Brad Porter, an Amazon robotics vice president, said. “It’s not, ‘Oh, do we have too many people?’ That’s never the problem we’re trying to solve. We’re growing, we need to hire more people.”
Milanesi noted that the company’s leaders made an effort to talk at the conference about both the benefits of AI and the many potential problems.
“The fact that they are acknowledging that there is complexity that needs to be addressed, and needs to be addressed right, is the critical first step,” she said.
Inside Amazon Go’s AI brain
Onstage in front of thousands of re:MARS attendees at the Aria Resort & Casino, Dilip Kumar, vice president of Amazon Go, showed a bird’s-eye video of what Amazon’s hundreds of AI-infused cameras see in its Go stores.
These convenience stores let customers check in at turnstiles, grab what they want and walk out without having to stop at a register. The cameras perceive the store floor as a jumble of bubbles — each representing a shopper — vibrating and moving around, some of them clumped together.
Amazon relies on AI software to make sense of this information and ensure customers are charged for only what they take out of the store.
Later, in an Aria conference room, Kumar told me the Go store software was fed loads of videos to train it on all kinds of potential situations it may come across. When engineers identified weaknesses in the system, they shot footage of real people at a store performing specific actions, such as two shoppers grabbing for the same product at the same time, then used those images to improve the AI. Kumar’s team added into the mix videos of computer-generated stores and customers, too.
Figuring out what items were picked up was another complicated problem. Potato chip bags are often crumpled, obscuring their labels. Packages of the same brand’s coffee can look nearly identical. Plus, new products get added in all the time, and existing ones get new packaging. Add to that dozens of customers milling around, picking up and moving things, sometimes blocking the view of the Go cameras.
Kumar said his team found that pointing out products’ distinguishing characteristics for the AI didn’t work. For example, telling the machines to check the labels to decipher the difference between raspberry and strawberry jam falls apart pretty quickly.
“As you increase the number of items, or as there are even small changes in packaging, it can just become very brittle,” he said.
So Kumar’s team used deep-learning algorithms that figured out on their own what characteristics to pay attention to, and it worked.
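The contrast between hand-coded rules and learned features can be shown with a toy model. The sketch below is not Amazon's system — the four-"pixel" crops and the two jam classes are invented — but it captures the principle: rather than telling the machine which label region to check, you give it examples and let the weights work out which inputs matter.

```python
# Toy illustration: instead of hand-coding rules ("check this pixel for
# the red label"), a model learns its own weights from labeled examples.
# The data and dimensions are invented for the sketch.
import math

def train(examples, epochs=200, lr=0.5):
    """Fit a tiny logistic-regression classifier by gradient descent."""
    w = [0.0] * len(examples[0][0])
    b = 0.0
    for _ in range(epochs):
        for x, y in examples:
            z = sum(wi * xi for wi, xi in zip(w, x)) + b
            p = 1 / (1 + math.exp(-z))  # sigmoid probability of class 1
            g = p - y                   # gradient of the log-loss
            w = [wi - lr * g * xi for wi, xi in zip(w, x)]
            b -= lr * g
    return w, b

def predict(w, b, x):
    return 1 if sum(wi * xi for wi, xi in zip(w, x)) + b > 0 else 0

# Fake 4-"pixel" crops: class 1 ("raspberry") is bright in pixel 0,
# class 0 ("strawberry") is bright in pixel 3.
data = [([0.9, 0.1, 0.2, 0.1], 1), ([0.8, 0.2, 0.1, 0.2], 1),
        ([0.1, 0.2, 0.1, 0.9], 0), ([0.2, 0.1, 0.2, 0.8], 0)]

w, b = train(data)
print(predict(w, b, [0.85, 0.15, 0.1, 0.1]))  # → 1 ("raspberry")
```

After training, the model has assigned a large positive weight to pixel 0 and a negative weight to pixel 3 on its own — nobody told it which pixels distinguish the jams, which is why this approach is less brittle when packaging changes.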
Though the Go stores are jammed with cameras, Kumar said only a small percentage of the footage is stored by Amazon. There’s just too much video to keep and the AI doesn’t get any smarter being fed more of the same activities. After new footage is shown to the software to keep training it, that data is usually deleted.
Factory robots break out of their cages
Porter, the robotics executive, is hard at work giving Amazon’s warehouse robots eyes.
These bots, which look like souped-up, orange Roombas, are used in Amazon’s fulfillment centers to pick up and bring products stacked in thin yellow containers to their human co-workers. Amazon already uses 200,000 of them. Up until now, these robots figured out how to move around by reading barcodes taped to the warehouse floor. It’s a crude system, but it works well in cagelike enclosures, where humans can’t wander in and get knocked over by the bots.
But after Amazon purchased the autonomous-driving company Canvas Technology earlier this year, it was able to create new warehouse bots with sophisticated computer vision. Those new eyes will allow the robots for the first time to work in more areas of the warehouse floor alongside Amazon workers.
The activities these bots will perform are the unsexy tasks humans would rather not do, Porter said, including carting empty totes to the loading dock, sorting packages and taking out the trash. That’ll let workers focus on more-important jobs like packing deliveries, which should help customers’ orders arrive faster.
The introduction of more robots and automation has been harshly criticized by unions and advocacy groups, which are raising concerns about job losses.
“(Amazon CEO) Jeff Bezos’ vision for our economy is focused on driving up profits at any cost by replacing talented employees with automation,” Marc Perrone, president of the United Food and Commercial Workers International Union, said in a statement. “While Amazon is raking in billions in tax cuts from cities desperate for new jobs, the company is ruthlessly working to eliminate the jobs of thousands of its current employees.”
Porter said Amazon continues to grow and that demand from customers keeps increasing, so more people are getting hired, not fewer. He added that these bots do change the nature of warehouse work — employees used to walk down aisles to gather orders instead of waiting for a robot to bring the products to them — but these facilities go through changes all the time.
Kumar said he expects people will always be needed at Go stores to help answer customers’ questions and restock shelves.
“I view this less about job loss,” Kumar said. “I view this more about shifting what associates were previously doing in the store.”
Alexa does your shopping
Moore, the head of Alexa Shopping, said the goal of his group is to allow customers to buy whatever they want through Amazon’s voice assistant. But with a catalog of roughly 500 million products globally, 400 million of which Amazon sells, there are far too many items to record ahead of time to teach the AI.
Instead, Amazon uses your purchasing history on Amazon’s website and your past Alexa utterances to figure out whether you asked for L’Occitane hand cream or a movie starring Michael Caine.
Moore said Amazon plans to use Alexa Shopping to help people quickly take care of “low-consideration items,” like reorders or stuff they forget to pick up at the supermarket. That could become a huge moneymaking opportunity for Amazon, but so far voice shopping hasn’t taken off with the public, since the technology is still developing, said Vivek Pandya, lead analyst for Adobe Digital Insights.
“The promise of voice is that level of convenience,” he said. “If it’s convenient, you do it more frequently.”
Going forward, Alexa engineers are working on making the voice assistant more proactive, with plans to have Alexa suggest reorders to you after noticing your shopping patterns, Moore said.
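One simple way to "notice shopping patterns" is to look at how regularly a household buys an item and flag when the next purchase is due. The sketch below is a hedged illustration of that idea — the function, the slack threshold, and the milk data are all invented, not a description of Alexa's actual logic.

```python
# Hedged sketch of pattern-based reorder suggestions: if an item is
# bought at a roughly regular interval, flag it when the next purchase
# is due. The threshold and data are illustrative.
from datetime import date

def reorder_due(purchase_dates, today, slack_days=3):
    """Return True if the usual repurchase interval has nearly elapsed."""
    if len(purchase_dates) < 2:
        return False  # not enough history to see a pattern
    dates = sorted(purchase_dates)
    gaps = [(b - a).days for a, b in zip(dates, dates[1:])]
    typical_gap = sum(gaps) / len(gaps)   # average days between purchases
    days_since_last = (today - dates[-1]).days
    return days_since_last >= typical_gap - slack_days

# Milk bought weekly; six days after the last purchase, suggest a reorder.
milk = [date(2019, 6, 1), date(2019, 6, 8), date(2019, 6, 15)]
print(reorder_due(milk, today=date(2019, 6, 21)))  # → True
print(reorder_due(milk, today=date(2019, 6, 16)))  # → False
```

A production system would weigh far more signals, but the core trade-off is visible even here: suggest too early and the assistant is noise, too late and it adds no convenience.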
Toni Reid, vice president of Alexa experience and devices, suggested Alexa may become more predictive, too, with it automatically creating routines for people, like opening the blinds, firing up the coffee maker and turning on the traffic report in the morning.
The current Hunches feature is an early example of this work, with the assistant suggesting things to people if it notices a break in a pattern, like the back door being unlocked at night.
If the thought of proactive voice assistants, robots with eyes and camera-filled smart stores makes you a little queasy, brace yourself: all three are likely to keep getting even more sophisticated as Amazon strengthens its AI muscles.
But even if Amazon convinces people these technologies offer more convenience without sacrificing jobs, its more proactive AIs will still have to tackle the challenge of understanding people’s wants with less human input to steer them in the right direction.
“It has to be right often,” Reid said, “or we end up annoying customers.”
First published on June 28 at 5 a.m. PT.
Updated at 11:40 a.m. PT, and on June 29: Adds more background.