March 27, 2023

Getting a Job: Working for AI

By Karen Sternheimer

I have been fortunate to have had my job for more than twenty years, and I have never looked for a job in the twenty-first century. If I did, the process would be very different from what it was in the 1990s. Monster.com, the first online resume database, only launched in 1999. And while the internet might have had job listings, old-fashioned snail mail remained the main way to apply for a job for many years after that.

Back in the twentieth century, writing a good resume was key. It still is today, but now an algorithm is likely to be the first to “see” your resume. In theory, this is meant to streamline the hiring process and perhaps even produce better candidates. Even a first interview might be submitted as a video, screened by a bot that reads a candidate’s facial expressions and the keywords they use.

This practice reflects what Max Weber might have seen as a form of rationalization, a means of creating more efficiency within large bureaucratic organizations. Contrast this with the time-consuming process our department goes through when conducting a job search: a committee of three or more faculty members reviews hundreds of applications for a single faculty position. It is a long process, and it demands the time of several people who have many other responsibilities.

Some candidates can be immediately disqualified for lacking minimum qualifications. When I have reviewed applications, occasionally someone applies to be a sociology professor without a degree of any kind, not even a bachelor’s degree. But for the most part, we read through packets of material for each qualified candidate, including letters of recommendation and publications.

While this work is time consuming, we have not considered turning it over to an algorithm, even to get the process started. According to a Harvard Business Review report, hiring algorithms can be deeply problematic:

To attract applicants, many employers use algorithmic ad platforms and job boards to reach the most “relevant” job seekers. These systems, which promise employers more efficient use of recruitment budgets, are often making highly superficial predictions: they predict not who will be successful in the role, but who is most likely to click on that job ad.

These predictions can lead jobs ads to be delivered in a way that reinforces gender and racial stereotypes, even when employers have no such intent. In a recent study we conducted together with colleagues from Northeastern University and USC, we found, among other things, that broadly targeted ads on Facebook for supermarket cashier positions were shown to an audience of 85% women, while jobs with taxi companies went to an audience that was approximately 75% black. This is a quintessential case of an algorithm reproducing bias from the real world, without human intervention.

The World Economic Forum noted similar problems:

It has been shown that in the US labor market, African-American names are systematically discriminated against, while white names receive more callbacks for interviews. However, we observe bias not only because of human error, but also because the algorithms increasingly used by recruiters are not neutral; rather, they reproduce the same human errors they are supposed to eliminate. For example, the algorithm that Amazon employed between 2014 and 2017 to screen job applicants reportedly penalized words such as ‘women’ or the names of women’s colleges on applicants’ CVs.

These forms of artificial intelligence, or AI, reflect and even amplify existing biases embedded in the workforce. And the problems don’t end with the hiring process. As a Los Angeles Times column recently discussed, drivers for Uber and Lyft have found themselves “deactivated” (bot-speak for terminated), presumably by an algorithm:

A new survey of 810 Uber and Lyft drivers in California shows that two-thirds have been deactivated at least once. Of those, 40% of Uber drivers and 24% of Lyft drivers were terminated permanently. A third never got an explanation from the gig app companies.

Drivers of color saw a higher rate of deactivation than white drivers — 69% to 57%, respectively. A vast majority of the drivers (86%) faced economic hardship after getting fired by the app, and 12% lost their homes.

Deactivation hit even the most experienced drivers: The report, conducted by Rideshare Drivers United and the Asian Law Caucus, found that drivers who were deactivated had worked, on average, 4 1/2 years for Uber and four years for Lyft.

The World Economic Forum article concludes with suggestions for workers on how to craft a resume that an AI will read favorably, but the implication is that it is up to workers to somehow outsmart the algorithm. The article’s link title, “AI Assisted Recruitment is Biased: Here’s How to Beat it,” implies that it can be beaten. That seems unlikely: the inner workings of any such algorithm are largely proprietary, meaning applicants typically don’t know exactly what it is looking for.

Of course, AI isn’t going away, and it can potentially enhance our lives. New research into using AI for medical diagnoses might save lives…or it might reflect existing inequalities in healthcare. Rather than blaming AI, the solution is a human one: we need to examine systemic inequalities and consider how they might be kept out of the algorithms we build. Maybe someone will develop an algorithm for that.



