I Got a Job at an Amazon Warehouse Without Talking to a Single Human

What the experience taught me about automating the hiring process


Ryan Fan


A few weeks ago, I completed an application to work in an Amazon warehouse. I watched a video and completed a quiz showing that I knew how to stow items: heavy goes on the bottom, light goes on top.

About 20 minutes later, Amazon emailed me that I had the job, for the shift I desired. The email said to come to the warehouse recruiting office in Baltimore to take a photo for my ID and to have official documents, like my Social Security card and passport, ready to be scanned.

I was conflicted. It was the easiest and most streamlined hiring process I’d ever gone through, and I was certainly happier with the job than without it. At the same time, I had just gotten a job at one of the world’s biggest companies without ever speaking to a single human being.

Applying to work at Amazon was so easy, but it made me take the notion of automation taking over the world seriously for the first time.

A few days later, I went through with the drug test and checked some boxes for the background check, and I was set to start work at Amazon in just over two weeks.

When I brought my documents to the warehouse, I interacted with two human beings, and I spent less than five minutes in the recruiting office of the warehouse where I would eventually work.

“That’s it?” I asked.

“That’s it,” the guy said.

Even though I got the job, and quickly, I have mixed feelings about automating the job application process. It’s unsettling to get a job so quickly and without talking to a single person.

Did they really not care about any human aspects of my application? Did they care about quality and my ability to actually do the work? And could they learn any of that by meeting me?

Completing the orientation process involved an hour of online training videos, covering everything from proper attire to safety, along with some quizzes.

As I write this, I’ve just completed my first 40-hour week and I’ve enjoyed my job as a warehouse picker, where I pull orders off of shelves and put them out for delivery.

The biggest lesson so far is that if you buy clothes from Amazon or any other online retailer, wash them before you wear them because a lot of people have to touch those clothes before they get to you.

Automating the hiring process as Amazon has done for its hourly warehouse workers saves a lot of time, especially for large companies (Amazon topped 935,000 employees as of April 30).

Some argue that automating the hiring process also eliminates bias. Writing in the Harvard Business Review, Frida Polli says that without automation, many applicants are never even seen, while A.I. can assess the entire pipeline of candidates. In addition, with the right amount of auditing, it can eliminate unconscious human biases.

However, Cornell researchers Manish Raghavan and Solon Barocas, writing for the Brookings Institution, caution that automating the hiring process as an anti-bias intervention can sometimes exacerbate bias.

In a paper posted to the Social Science Research Network (SSRN), law professor Ifeoma Ajunwa notes that the tools prospective employees might be required to use in an automated hiring process can be more restrictive than filling out an application and handing it to a hiring manager, especially for low-wage and hourly jobs.

Amazon, in fact, has already discovered bias in its automated hiring tools. In 2018, the company’s machine-learning specialists discovered that their recruiting engine favored men.

The computer models were trained to vet applicants by observing patterns in résumés submitted to the company over a 10-year period, and most of those résumés came from men. As a result, the tool learned to penalize résumés that included the word “women,” and applicants who attended all-women’s colleges were put at a disadvantage.

Engineers’ attempts to fix the problem were unsuccessful.

Amazon’s tool was biased because it had been trained using an overwhelmingly male sample pool of résumés.
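To see how that kind of bias emerges, here is a toy sketch (not Amazon’s actual system, and all résumé snippets are invented): a crude word-frequency scorer trained on a skewed pool of past hires ends up penalizing words that merely correlate with the underrepresented group, even though gender never appears as a label.

```python
# Toy illustration: a scorer trained on a skewed historical pool of
# résumés learns to penalize words correlated with the group that was
# rarely hired, with no explicit gender label anywhere in the data.
from collections import Counter

# Hypothetical training data: past hires were overwhelmingly men.
hired = [
    "software engineer python aws",
    "software engineer java leadership",
    "engineer chess club captain",
    "software developer aws java",
    "engineer women in tech mentor",
]
rejected = [
    "women's chess club captain software engineer",
    "software engineer women in engineering society",
    "developer python women's college",
    "sales associate retail",
]

def word_scores(hired, rejected):
    """Score each word by how much more often it appears in hired
    résumés than in rejected ones (a crude naive-Bayes-style signal,
    with +1 smoothing so unseen words don't blow up the ratio)."""
    h, r = Counter(), Counter()
    for doc in hired:
        h.update(doc.split())
    for doc in rejected:
        r.update(doc.split())
    vocab = set(h) | set(r)
    return {w: (h[w] + 1) / (r[w] + 1) for w in vocab}

scores = word_scores(hired, rejected)

def score_resume(text):
    """Average word score: higher means 'looks more like past hires'."""
    words = text.split()
    return sum(scores.get(w, 1.0) for w in words) / len(words)

# "women's" scores far below "engineer" purely because of who happened
# to be hired in the skewed historical data.
print(scores["women's"], scores["engineer"])
```

The model never sees gender; it simply reproduces the pattern in its training data, which is exactly the failure mode Amazon’s engineers reportedly could not fix by patching individual words.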

Ajunwa also notes that online hiring algorithms can be restrictive for those applying to white-collar jobs. Goldman Sachs, for instance, embraced automated interviewing in 2016 as an initiative to hire a more diverse workforce.

But in a 2019 New York Times opinion article, Ajunwa argued that too much automation creates a closed-loop system with no accountability or transparency.

Advertisements created by algorithms encourage certain people to send in their résumés. After the résumés have undergone automated culling, a lucky few are hired and then subjected to automated evaluation, the results of which are looped back to establish criteria for future job advertisements and selections.

She cites two examples of automated hiring discrimination. In one, a 2017 lawsuit opened by then-Illinois Attorney General Lisa Madigan found that automated hiring platforms discriminated against older applicants through the design of the tools themselves.

Drop-down menus for the years a person attended college didn’t go back far enough to accommodate all workers, even though people are staying in the workforce longer. In the other, a 2016 class-action lawsuit eventually pressured Facebook to change its paid ad platform to comply with anti-discrimination laws.

The Lookalike Audiences feature allowed employers to choose only Facebook users that “looked like” its current employees to see job advertisements. So if a company only had white employees, only white Facebook users would see the ads, or if the company only had women employees, Facebook would selectively target women users.

Ajunwa argues that since employers have the discretion to choose a “cultural fit,” they can discriminate further. They could, for instance, use pre-employment personality assessments, which the Equal Employment Opportunity Commission (EEOC) found in 2018 were likely creating a pattern of discrimination against racial or ethnic minorities.

The EEOC found that Best Buy’s personality tests violated Title VII of the Civil Rights Act, a ruling that forced Best Buy to stop using the assessments.

These personality tests, part of Best Buy’s automated hiring process, were used to predict how workers would perform. In addition, prioritizing a “lack of gaps in employment” hurts women who take leave from the workplace to care for children.

Automation is not slowing down anytime soon, and especially not in the hiring process, but safeguards are needed to prevent exacerbating workplace discrimination.

According to employment attorney Mark Girouard and technologist Maciej Ceglowski, EEOC regulations hold companies accountable for their hiring decisions and tools, and even require that companies keep the underlying data in case of a bias claim.

So companies can be liable for bias even if they don’t know why an algorithm chose one group over another. In her SSRN paper, Ajunwa also argues for shifting more of the burden to the employer when plaintiffs bring discrimination suits.

If a plaintiff was screened through a hiring platform with a problematic design feature, such as a personality test, the employer would have to provide statistical proof, through audits, that its platform is not unlawfully discriminating against certain groups of candidates.

In a separate paper in SSRN, Ajunwa suggests requiring employers to conduct internal and external audits. These audits would make sure that no applicants are disproportionately excluded.
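The audits Ajunwa describes have a well-established statistical starting point: the EEOC’s “four-fifths rule” of thumb, under which a selection rate for any group below 80% of the highest group’s rate is generally regarded as evidence of adverse impact. A minimal sketch of that check, with hypothetical applicant numbers:

```python
# Minimal adverse-impact check based on the EEOC four-fifths rule:
# flag any group whose selection rate falls below 80% of the rate of
# the most-selected group. Applicant numbers here are hypothetical.

def adverse_impact(groups, threshold=0.8):
    """groups: {name: (selected, applicants)}.
    Returns {name: ratio} for groups whose selection rate, relative
    to the best group's rate, falls below the threshold."""
    rates = {g: s / a for g, (s, a) in groups.items()}
    best = max(rates.values())
    return {g: r / best for g, r in rates.items() if r / best < threshold}

# Hypothetical outcomes from an automated screening tool:
flagged = adverse_impact({
    "group_a": (48, 100),   # 48% selected
    "group_b": (30, 100),   # 30% selected, i.e. 62.5% of group_a's rate
})
print(flagged)  # group_b falls below the four-fifths line
```

A real audit would go further, with significance testing and repeated internal and external review as Ajunwa proposes, but even this simple ratio makes disproportionate exclusion visible in the data an employer is already required to keep.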

The Occupational Safety and Health Administration (OSHA) already recommends audits to ensure safe working conditions for employees, and has a safety certification system that gives marks to employers who fulfill those standards.

Ajunwa argues for the same audit-and-certification system for automated hiring platforms, and for union collective bargaining that can work with employers to determine the criteria actually necessary for job fit and to protect applicant data.

Covid-19 has drastically accelerated the use of A.I. in the hiring process, with more recruiters moving job interviews and other recruiting interactions online. In an effort to curb face-to-face interactions during the pandemic, many more companies are relying on A.I. and videoconferencing platforms like Zoom.

But we still need humans to ensure that the hiring process is fair and equitable.

I can only assume that I look pretty good to an automated A.I. hiring platform: I’m a 23-year-old man with a bachelor’s degree, no criminal record, and no disabilities.

An automated hiring process got me a job at Amazon in less than 20 minutes. I had a great, fast hiring experience and was on the warehouse floor soon after filling out my application. But how can we ensure that everyone else has an equal opportunity?

