Fired by bot at Amazon: ‘It’s you against the machine’

Stephen Normandin delivered packages in Phoenix, Ariz., until he was fired by Amazon via automated email.  (Bloomberg)
By Spencer Soper, Bloomberg

Stephen Normandin spent almost four years racing around Phoenix, Ariz., delivering packages as a contract driver for Amazon.com Inc.

Then one day, he received an automated email. The algorithms tracking him had decided he wasn’t doing his job properly.

The 63-year-old Army veteran was stunned. He’d been fired by a machine.

Normandin says Amazon punished him for things beyond his control, such as locked apartment complexes, that had prevented him from completing his deliveries.

He said he took the termination hard and, priding himself on a strong work ethic, recalled that during his military career he helped cook for 250,000 Vietnamese refugees at Fort Chaffee in Arkansas.

“I’m an old-school kind of guy, and I give every job 110%,” he said. “This really upset me because we’re talking about my reputation. They say I didn’t do the job when I know damn well I did.”

Normandin’s experience is a twist on the decades-old prediction that robots will replace workers.

At Amazon, machines are often the boss – hiring, rating and firing millions of people with little or no human oversight.

Amazon became the world’s largest online retailer in part by outsourcing its sprawling operations to algorithms – sets of computer instructions designed to solve specific problems.

For years, the company has used algorithms to manage the millions of third-party merchants on its online marketplace, drawing complaints that sellers have been booted off after being falsely accused of selling counterfeit goods and jacking up prices.

Increasingly, the company is ceding its human-resources operation to machines as well, using software not only to manage workers in its warehouses but to oversee contract drivers, independent delivery companies and even the performance of its office workers.

People familiar with the strategy say Chief Executive Officer Jeff Bezos believes machines make decisions more quickly and accurately than people, reducing costs and giving Amazon a competitive advantage.

Amazon started its gig-style Flex delivery service in 2015, and the army of contract drivers quickly became a critical part of the company’s delivery machine.

Typically, Flex drivers handle packages that didn’t make it onto an Amazon van before it left the delivery station. Rather than making the customer wait, Flex drivers ensure the packages are delivered the same day.

They also handle a large number of same-day grocery deliveries from Amazon’s Whole Foods Market chain. Flex drivers helped keep Amazon humming during the pandemic and were only too happy to earn about $25 an hour shuttling packages after their Uber and Lyft gigs dried up.

But the moment they sign on, Flex drivers discover algorithms are monitoring their every move.

Did they get to the delivery station when they said they would? Did they complete their route in the prescribed window? Did they leave a package in full view of porch pirates instead of hidden behind a planter as requested?

Amazon algorithms scan the gusher of incoming data for performance patterns and decide which drivers get more routes and which are deactivated.

Human feedback is rare. Drivers occasionally receive automated emails, but mostly they’re left to obsess over their ratings, which fall into one of four categories: Fantastic, Great, Fair or At Risk.

Bloomberg interviewed 15 Flex drivers, including four who say they were wrongly terminated, as well as former Amazon managers who say the largely automated system is insufficiently attuned to the real-world challenges drivers face every day.

Amazon knew delegating work to machines would lead to mistakes and damaging headlines, these former managers said, but decided it was cheaper to trust the algorithms than pay people to investigate mistaken firings so long as the drivers could be replaced easily.

So far, Amazon has had no trouble finding Flex contractors. Globally, some 4 million drivers have downloaded the app, including 2.9 million in the U.S., according to app tracker App Annie.

And more than 660,000 people in the U.S. downloaded it in the first five months of this year, up 21% from the same period a year ago, according to Sensor Tower, another app tracker.

Inside Amazon, the Flex program is considered a great success, whose benefits far outweigh the collateral damage, said a former engineer who helped design the system.

In a statement, Amazon spokeswoman Kate Kudrna called drivers’ claims of poor treatment and unfair termination anecdotal and said they don’t represent the experience of the vast majority of Flex drivers.

“We have invested heavily in technology and resources to provide drivers visibility into their standing and eligibility to continue delivering, and investigate all driver appeals,” she said.

As independent contractors, Flex drivers have little recourse when they believe they’ve been deactivated unfairly.

There’s no paid administrative leave during an appeal. Drivers can pay $200 to take their dispute to arbitration, but few do, seeing it as a waste of time and money.

When Ryan Cope was deactivated in 2019, he didn’t bother arguing or consider paying for arbitration. By then, Cope had already decided there was no way he could meet the algorithms’ demands.

Driving miles along winding dirt roads outside Denver in the snow, he often shook his head in disbelief that Amazon expected the customer to get the package within two hours.

“Whenever there’s an issue, there’s no support,” said Cope, 29. “It’s you against the machine, so you don’t even try.”

When drivers do challenge poor ratings, they can’t tell if they’re communicating with real people.

Responses often include just a first name or no name at all, and the replies typically apply to a variety of situations rather than a specific problem.

Even if a name is attached, a machine most likely generated the first few email responses, according to people familiar with the matter.

When human managers get involved, they typically conduct a hasty review – if they do one at all – because they must meet their own performance standards.

A former employee at a driver support call center said dozens of part-time seasonal workers with little training were assigned to oversee issues for millions of drivers.

“Amazon doesn’t care,” the former Amazon employee said. “They know most people will get their packages and the 2 or 3 percent who don’t will get something eventually.”

Amazon has automated its human-resources operation more than most companies.

But the use of algorithms to make decisions affecting people’s lives is increasingly common.

Machines can approve loan applications and even decide whether someone gets parole or stays behind bars.

Computer science experts have called for regulations forcing companies to be transparent about how algorithms affect people, giving them the information they need to call out and correct mistakes.

Legislators have studied the matter but have been slow to enact rules to prevent harm.

In December, Sen. Chris Coons, D-Del., introduced the Algorithmic Fairness Act.

It would require the Federal Trade Commission to create rules that ensure algorithms are being used equitably and that those affected by their decisions are informed and have the opportunity to reverse mistakes. So far his proposal has gone nowhere.

Neddra Lira, of Arlington, Texas, started making deliveries through the Amazon Flex app in 2017.

A 42-year-old school-bus driver and mother of three, she took the side job during holiday breaks and summers to earn extra money, which she used to pay for her daughter’s gymnastics lessons.

When the pandemic hit and schools closed, Lira turned to Flex as her primary source of income, delivering packages as well as groceries from Whole Foods.

She liked the flexibility and opportunity to pocket about $80 for a four-hour route, after subtracting gas for her Chevrolet Trax crossover.

Lira estimates she delivered about 8,000 packages and had a “great” performance rating most of the time.

Amazon algorithms rate drivers on reliability and delivery quality, measured mostly by whether they picked up packages on time, made deliveries within the expected window and followed customers’ special requests.

Flex metrics focus mostly on punctuality, unlike ride-hailing services such as Uber and Lyft, which also prioritize things like a car’s cleanliness or driver courtesy.

Moreover, Uber and Lyft passengers know when they’re stuck in traffic, so drivers are less likely to be penalized for circumstances beyond their control.

An Amazon customer has no idea what obstacles a Flex driver encounters on the way to their door, and neither do the algorithms clocking the driver.
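The article doesn’t disclose how Amazon actually computes a driver’s standing, but a minimal sketch helps show how the criteria and rating tiers it names could fit together. The weights, thresholds and names below are invented assumptions for illustration only, not Amazon’s logic.

```python
# Purely illustrative: the weights, thresholds and field names here are
# assumptions, not Amazon's actual Flex scoring logic.
from dataclasses import dataclass

@dataclass
class RouteRecord:
    picked_up_on_time: bool      # arrived at the delivery station on time
    delivered_in_window: bool    # finished the route within the expected window
    followed_instructions: bool  # honored the customer's special requests

def driver_score(history: list[RouteRecord]) -> float:
    """Average per-route score across a driver's recent routes (0.0 to 1.0)."""
    if not history:
        return 1.0
    per_route = [
        0.4 * r.picked_up_on_time
        + 0.4 * r.delivered_in_window
        + 0.2 * r.followed_instructions
        for r in history
    ]
    return sum(per_route) / len(per_route)

def standing(score: float) -> str:
    """Map a score onto the four tiers the article names."""
    if score >= 0.95:
        return "Fantastic"
    if score >= 0.85:
        return "Great"
    if score >= 0.70:
        return "Fair"
    return "At Risk"

# Example: one clean route and one late delivery drag the rating down.
history = [RouteRecord(True, True, True), RouteRecord(True, False, True)]
print(standing(driver_score(history)))  # -> "Fair" under these made-up thresholds
```

Notice that a scheme like this has no input for a locked gate, a broken locker or an hour-long line at the delivery station; everything the driver can’t control still lands on the driver’s side of the ledger. That is exactly the gap the drivers interviewed for this story describe.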

Lira says sometimes there were so many drivers lined up outside the delivery station that she waited as long as an hour to retrieve her packages, putting her behind schedule before she even started her route.

When she spotted a nail in her tire, Amazon didn’t offer to come retrieve the packages but asked her to return them to the delivery station.

Lira was afraid the tire would go flat but complied to protect her standing.

Even though she explained the situation, her rating dropped to “at risk” from “great” for abandoning the route and took several weeks to recover.

Time and again, Lira was reassured that her rating was fine. A typical email arrived on Oct. 1.

“Your standing is currently great, which means you’re one of our best delivery partners,” said the message signed “Madhu S.”

But the very next day, “Bhanu Prakash” emailed to say she had violated Flex’s terms of service.

“As a result, you’re no longer eligible to participate in the Amazon Flex program and won’t be able to sign in to the Amazon Flex app.”

Lira was provided an email address and invited to appeal the termination within 10 days.

She did so and asked why she was deactivated so she could tell Flex driver support what went wrong. She never got further specifics.

She followed up Oct. 18, explaining that she was a single mother laid off from her regular job due to the pandemic and that Flex was the only thing keeping her afloat.

Lira received what appears to be an automated response from “The Amazon Flex Team” apologizing for the delay and assuring her that her situation would be investigated by the appropriate team.

Three days later, on Oct. 21, she received a message from “Margaret” saying “we are still reviewing your appeal.” Then a week later, on Oct. 28, an email signed “SYAM” said, “We’ve reviewed your information and taken another look at your history. Our position has not changed and we won’t be reinstating your access to the Amazon Flex program. … We wish you success in your future endeavors.”

Without the driving gig, Lira began to struggle financially. She stopped paying her mortgage, and her car was repossessed two days after Christmas with donated presents for her kids still inside.

Lira was forced to take a government handout to pay her electric, gas and water bills.

Eventually she started driving the school bus again and used most of a pandemic stimulus check to get her car back, paying $2,800 in missed payments, repossession and storage fees.

“It just wasn’t fair,” Lira said. “I nearly lost my house.”

The computer engineers who designed Flex worked hard to make it fair and to account for variables the system can’t detect on its own, such as traffic jams and problems accessing apartment complexes, former employees said.

But no algorithm is perfect, and at Amazon’s size even a small margin of error can be considered a huge success internally and still inflict a lot of pain on drivers.

Amazon Flex drivers deliver about 95% of all packages on time and without issues, according to a person familiar with the program. Algorithms examine that remaining 5% for problematic patterns.

The Flex algorithms began as blunt instruments and were refined over time. Early on, according to a person familiar with the situation, designers set too tight a time period for drivers to get to the delivery station.

They had failed to factor in human nature. Drivers eager for work would promise to arrive by a certain time when they were too far away to make it.

The flaw set good drivers up to fail, the person said, and was fixed only after a widespread plunge in ratings.

The system also uses GPS to estimate how long it should take to reach a specific address but sometimes fails to account for the fact that navigating a rural road in the snow takes a lot longer than it does on a sunny day.

The system worked fine for Normandin for years. An Arizona native who previously delivered pizzas at night and newspapers in the morning, he knew all the shortcuts and traffic choke points.

He also drove for Uber and Lyft, but took on more Flex work during the pandemic when demand for rides dropped and it became riskier ferrying passengers than carting packages.

Normandin enjoyed stellar ratings and was even asked if he’d like to train other drivers.

He had a well-honed system: sorting packages before leaving the station, putting his first deliveries in the front seat, the next several packages in the rear and tucking the last batch deep in the back of his 2002 Toyota Corolla.

Normandin has been medically disabled for more than a decade due to a stomach ailment and back problems that prevent him from sitting or standing in one place for prolonged periods. He liked gig work because he could work a few hours at a time.

Then, starting last August, Normandin had a string of setbacks he maintains were beyond his control.

Amazon assigned him some predawn deliveries at apartment complexes when their gates were still locked, a common complaint among Flex drivers.

The algorithm instructs drivers in such instances to deliver packages to the main office, but that wasn’t open either. Normandin called the customer, as instructed, though that was a long shot because most people don’t answer calls from unfamiliar numbers, especially early in the morning.

He called driver support, which couldn’t get through to the customer either. Meanwhile, the clock was ticking, and the algorithm was taking note.

“There are a lot of things the algorithms don’t take into consideration and the right hand doesn’t know what the left hand is doing,” Normandin said.

Around the same time, he was asked to deliver packages to an Amazon locker in an apartment complex but couldn’t open it.

After 30 minutes on the phone with support he was told to return the packages to the delivery station. Then his rating crashed. Normandin called support again to explain that a malfunctioning locker was responsible and says he was told the problem would be remedied.

“They never fixed it,” he said, “and it took six weeks for my rating to go back up.”

On Oct. 2, Normandin woke at 3 a.m., showered and grabbed his phone to find a Flex route but couldn’t log on. He checked his email and found a generic message from Amazon signed by “Gangardhar M.”

It said Normandin’s standing had “dropped below an acceptable level” and that he was being terminated.

Then began a process familiar to anyone who has found themselves trapped in an automated customer-service loop, except in this case Normandin wasn’t seeking a refund for a damaged product. He was fighting to get his job back.

Offered Amazon’s standard 10 days to appeal, Normandin emailed Flex support and asked that his termination be reversed.

He explained that he had already flagged Amazon about circumstances beyond his control and had been promised the infractions wouldn’t be held against him.

Normandin received a response the next day from “Pavani G,” thanking him for “providing more context about your history with Amazon Flex.”

Normandin responded to that email with additional information and received an identical reply promising to look into the issue, but this time it was signed by “Bitan Banerjee.”

The email pledged to provide an answer within six days. Seven days later, “Arnab” emailed to apologize for the delay and promised an update as soon as possible.

Meanwhile, Normandin wasn’t making any money.

He was counting on Amazon’s annual Prime Day sale, which had been pushed back to October, to make money he needed to pay bills.

With no response by Oct. 19, Normandin messaged Amazon again, this time copying Bezos.

“I am asking for specific details on how this decision for deactivation of my account was reached,” he wrote. “I am confident after a thorough review of my entire delivery history as an Amazon Flex driver will show a consistent history of performing at the highest level, of a reasonable and prudent person.”

About 12 hours later, he got a response informing him that Bezos had received the email and instructed “Taylor F” to research the issue and respond on his behalf.

On Oct. 23, Normandin received an email from “Raquel” on the Amazon Flex Support Team to tell him they were still reviewing his appeal.

Former Amazon employees who worked on Flex said escalating to Bezos is a common tactic among deactivated drivers but seldom helps them.

The verdict arrived on Oct. 28 from “SYAM,” the same name in the final message to Lira.

The email didn’t directly respond to Normandin’s claims but acknowledged the job’s challenges, saying: “We understand that every delivery partner has difficult days and that you may sometimes experience delays, and we have already taken this into account.”

But Normandin still wasn’t getting his gig back.

After the shock subsided, he tried a couple of other delivery services before deciding to use his pandemic stimulus money to start a small-engine repair business.

It was time to deal directly with human beings again. Of the people who designed the algorithms that tracked, rated and eventually fired him, Normandin said, “It seems they don’t have any common sense about how the real world works.”