Employee Survival Guide®
The Employee Survival Guide® is an employment law podcast only for employees, covering everything related to work and your career. We share the employment law information your employer and Human Resources do not want you to know, and guide you through various work and employment law issues. This is an employee podcast.
The Employee Survival Guide® podcast is hosted by seasoned employment law attorney Mark Carey, who has practiced exclusively in employment law for the past 29 years. Mark has seen just about every type of employment law and work dispute there is and has filed several hundred work-related lawsuits in state and federal courts around the country, including class action suits. He has a no-frills and blunt approach to employment law and work issues faced by millions of workers nationwide. Mark endeavors to present both sides of every issue discussed on the podcast so you can make an informed decision. Again, this is a podcast only for employees.
Subscribe to our employee podcast show in your favorite podcast app including Apple Podcasts and Spotify.
You can also subscribe to our feed via RSS or XML.
If you enjoyed this episode of the Employee Survival Guide®, please like us on Facebook, X and LinkedIn. We would really appreciate it if you could leave a review of this employee podcast on your favorite podcast player such as Apple Podcasts and Spotify. Thank you!
For more information, please contact Carey & Associates, P.C. at 203-255-4150, or email at info@capclaw.com.
Also go to our website EmployeeSurvival.com for more helpful information about work and working.
Employee Survival Guide®
Algorithmic Bias or AI Hiring Bias: The Case of Derek Mobley vs. Workday, Inc.
Comment on the Show by Sending Mark a Text Message.
What happens when the technology designed to streamline hiring processes inadvertently perpetuates discrimination? In this eye-opening episode of the Employee Survival Guide®, Mark Carey dives deep into the landmark legal case of Derek Mobley versus Workday, Inc., a case that raises urgent questions about algorithmic bias, or AI hiring bias, and the responsibilities of tech companies in the employment landscape. Mobley, a Black man over 40, alleges he was unjustly rejected from over 100 job applications by the very algorithms intended to make hiring fairer. He has presented the first and best case for algorithmic bias, or AI hiring bias.
Join us as we dissect the implications of this case, not only for Mobley but for countless job seekers navigating the complex world of employment discrimination. This episode sheds light on critical algorithmic bias, or AI hiring bias, issues surrounding employee rights, including the often-overlooked concept of disparate impact discrimination. How can seemingly neutral hiring practices disproportionately affect marginalized groups? As we explore these themes, we'll also consider whether Workday can be deemed an employment agency liable for the discriminatory outcomes generated by its software. Mobley v. Workday represents the first legal case of algorithmic bias, or AI hiring bias, in the country.
Mark and his guest provide a thorough analysis of the legal arguments at play in algorithmic bias or AI hiring bias, examining the broader ramifications for the HR tech industry. With the rise of AI bias in hiring, the stakes have never been higher. This discussion is essential for anyone concerned about workplace culture, employee empowerment, and the fight against discrimination in the workplace. We tackle pressing topics such as age discrimination, race discrimination, and the need for transparency and accountability in AI hiring technologies.
As we navigate through the complexities of employment law issues, listeners will gain valuable insights into the legal rights at work and the importance of advocating for oneself in the face of systemic challenges. This episode is not just a legal analysis; it's a call to action for employees everywhere to understand their rights and to demand fair treatment in the AI hiring process.
Whether you're an employee facing discrimination, a job seeker navigating remote work challenges, or a tech company looking to improve your hiring practices, this episode of the Employee Survival Guide® is packed with insider tips and essential knowledge. Tune in for a discussion that is not only informative but also empowering, as we equip you with the tools to survive and thrive in today's competitive job market.
Don't miss out on this critical conversation about employee survival, workplace rights, and the future of hiring technology.
If you enjoyed this episode of the Employee Survival Guide please like us on Facebook, X and LinkedIn.
We would really appreciate it if you could leave a review of this podcast on your favorite podcast player such as Apple Podcasts and Spotify. Leaving a review will help other employees find the Employee Survival Guide.
For more information, please contact our employment attorneys at Carey & Associates, P.C. at 203-255-4150, www.capclaw.com.
Disclaimer: For educational use only, not intended to be legal advice.
Algorithmic Bias in Hiring
Speaker 1Welcome back. Today we're taking a deep dive into a case that's making waves in the world of tech and hiring: Derek Mobley versus Workday, Inc. It's not just a legal battle. It really gets you thinking: how are algorithms used in hiring, and can companies like Workday be held responsible if there's bias?
Speaker 2Yeah, it's a really interesting case, isn't it? It shows just how much AI is affecting our lives now, like even finding a job.
Speaker 1And it all revolves around Derek Mobley, a black man over 40 who claims he was rejected from over 100 jobs and all of them used Workday software for some part of hiring.
Speaker 2What's really striking is he wasn't just applying to one company or even one industry. All sorts of jobs, different sectors, and every time he hit this Workday wall.
Speaker 1OK, so before we jump in too deep, can you give us some background on Workday? What exactly do they do? Why are they the focus here?
Speaker 2So Workday is a big name in HR tech, cloud-based software, stuff like HR, payroll and, this is important for the case, talent management. They work with tons of companies, lots of Fortune 500 firms even, so their software could be affecting a huge number of people applying for jobs.
Speaker 1Wow, they're not just some small startup then. They're a major player in this hiring space.
Speaker 2Exactly, and that's partly why this case is so big. It's not just about one guy looking for work. It's about the potential for bias, you know, algorithmic bias on a massive scale.
Speaker 1Okay, Back to Mobley. He was applying for all these jobs each time running into Workday. What was that experience like for him?
Speaker 2Well, he'd find postings on LinkedIn pretty standard stuff. But clicking apply he'd get redirected to a Workday platform on the company's website.
Speaker 1So, even though he's applying to different companies, it's always Workday behind the scenes handling his application.
Speaker 2That's right. Every time new Workday account, upload his resume, sometimes even these Workday assessments like personality tests.
Speaker 1Workday is collecting a lot of data on that.
Speaker 2And that's crucial right. Algorithms need data to learn, and that data, it's not just what's on your resume.
Speaker 1Like what else?
Speaker 2Well, think about it when you create a Workday account, maybe your age, location, education, history and those personality tests they might show your personality traits. Are you emotionally stable? How about risk aversion?
Speaker 1Hmm, yeah, I see your point. It's all data that an algorithm could use to make decisions about you.
Speaker 2Exactly, and this is where Mobley's concerns start. He argues Workday's tools are discriminatory. They use biased data, things like personality tests that might put certain people at a disadvantage.
Speaker 1So he's not saying he's just unlucky. There's something wrong with how Workday's algorithms are making decisions.
Speaker 2Yeah, inherently biased, that's what he's claiming. Biased against people like him: Black, over 40, and with mental health conditions like anxiety and depression.
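To make the mechanism concrete, here's a minimal sketch of how a seemingly neutral screening score can act as a proxy for protected traits like age or disability. This is an invented toy model with made-up feature names and weights, not Workday's actual software, just an illustration of the kind of bias being alleged.

```python
# Hypothetical resume-screening score. Invented example only;
# this is NOT Workday's actual model or criteria.

def screening_score(candidate: dict) -> float:
    score = 0.0
    # "Years since graduation" looks neutral, but it tracks age
    # almost perfectly, so penalizing it disadvantages older applicants.
    years_out = 2024 - candidate["grad_year"]
    score += max(0, 10 - 0.5 * years_out)
    # A "culture fit" personality score can disadvantage applicants
    # with anxiety or depression, potentially implicating the ADA.
    score += candidate["personality_fit"] * 5
    return score

older = {"grad_year": 1998, "personality_fit": 0.9}
younger = {"grad_year": 2020, "personality_fit": 0.9}
# Identical personality scores, but the older candidate is ranked
# far lower purely because of the graduation-year proxy.
print(screening_score(older), screening_score(younger))  # 4.5 12.5
```

Neither feature mentions age, yet the older applicant loses on every comparison, which is exactly the "facially neutral but biased in effect" pattern the case turns on.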
Speaker 1He was rejected from over 100 jobs. It's not just a few rejections here and there.
Speaker 2And get this, some of those rejection emails came in the middle of the night, like 2 am.
Speaker 1That is kind of creepy, got to admit. Definitely sounds like automation was involved.
Speaker 2Makes you wonder how much human judgment was really there, versus an automated decision made by Workday's software.
Speaker 1That's a big question. Goes to the heart of this case. But before we go further, what does Mobley mean when he says Workday's tools are discriminatory? Does he mean they're designed to discriminate against certain groups?
Speaker 2Not necessarily. He's arguing it's what's called disparate impact discrimination.
Speaker 1OK, disparate impact Sounds like legal jargon. Can you explain that for us?
Speaker 2It is legal stuff, but super important here. Disparate impact means, even if a practice doesn't mean to discriminate, if it ends up disproportionately harming a protected group, well, legally that can still be discrimination.
Speaker 1Ah, so even if Workday wasn't trying to discriminate, if their algorithms have that effect, they could still be in trouble.
Speaker 2Exactly, that's Mobley's point. Even if companies using Workday mean well, the software itself can lead to bad outcomes, discrimination.
Speaker 1Interesting, so it's not just about intent, but the actual impact.
Speaker 2Right, and in this case it makes us face the possibility of algorithmic bias in a system that's relying more and more on AI for big decisions.
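As a rough illustration of how disparate impact is screened for in practice, here's a minimal sketch of the EEOC's "four-fifths rule": if a protected group's selection rate falls below 80% of the highest group's rate, the practice may be flagged for adverse impact. The numbers below are hypothetical, and the rule is a screening heuristic, not a legal determination.

```python
def selection_rates(outcomes: dict) -> dict:
    """Selection rate (hired / applied) per group."""
    return {g: hired / applied for g, (hired, applied) in outcomes.items()}

def four_fifths_flags(outcomes: dict) -> dict:
    """Flag groups whose selection rate is below 80% of the top group's.

    Implements the EEOC four-fifths heuristic for adverse impact;
    a True flag means "warrants scrutiny", not "illegal".
    """
    rates = selection_rates(outcomes)
    top = max(rates.values())
    return {g: rate / top < 0.8 for g, rate in rates.items()}

# Hypothetical applicant data: group -> (hired, applied)
outcomes = {
    "under_40": (50, 100),  # 50% selection rate
    "over_40":  (15, 100),  # 15% selection rate
}
print(four_fifths_flags(outcomes))
# over_40 is flagged: 0.15 / 0.50 = 0.3, well under the 0.8 threshold
```

Notice that nothing in this check asks about intent; it looks only at outcomes, which is the essence of disparate impact.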
Speaker 1Okay, we've got the background on Workday, Mobley's experience of constantly being judged by their algorithms and this idea of disparate impact. What are Mobley's actual legal claims? What is he arguing in court?
Speaker 2Actually a couple of different arguments, and they both hinge on whether Workday can be held liable for the discrimination, not just the individual employers.
Speaker 1OK, now I'm really interested. What are those arguments?
Speaker 2First one: Workday is an employment agency under laws like Title VII of the Civil Rights Act, the Age Discrimination in Employment Act, the ADA, that kind of thing.
Speaker 1So he's saying they're in the business of finding people jobs like a regular employment agency.
Speaker 2That was his initial argument, yeah. Because Workday is so deep in the hiring process, they're the gatekeepers, they should be subject to the same anti-discrimination rules as any other agency.
Speaker 1Makes sense. I mean, they are screening candidates, right?
Speaker 2They are. But the court actually dismissed that specific claim. They said Workday doesn't technically procure employees, legally speaking: they're not actively finding people to fill jobs, they just provide the software, the platform.
Speaker 1So Workday's off the hook then.
Speaker 2Not entirely. Here's where it gets a bit tricky legally. The court did say that, while Workday isn't an employment agency, Mobley's got a plausible case that Workday was acting as an agent of those employers.
Speaker 1Hold on. Employment agency and agent, what's the difference? They both seem to be involved in helping companies find employees.
Speaker 2It's a subtle but important difference. An employment agency's main business is connecting job seekers and employers. Think headhunting firms, temp agencies: they actively go out and recruit and place people.
Speaker 1So Workday isn't doing that, they're giving the software.
Speaker 2Right. But as an agent of the employer, they take on some of the employer's responsibilities. The court's view: if Workday's performing core hiring functions on the employers' behalf, it can share their liability. That's exactly why this case is so big for the whole HR tech world. If the court sides with Mobley on this agent idea, big precedent. Software companies could be held accountable for algorithmic bias in their hiring tools.
Speaker 1That is huge. But Workday is fighting back hard, I bet. Not just going to accept liability.
Speaker 2Of course not. They've got their legal team working on their defense. Main argument: we're just the software provider, a neutral platform. Basically saying our customers, the employers, set the hiring criteria and make the decisions.
Speaker 1So don't blame us, blame the companies using our software.
Speaker 2That's the gist, but it's not quite that simple.
Speaker 1Why not? It can't be that easy, can it? What's Mobley's counterargument?
Speaker 2Well, think of it this way. Imagine buying a car, but the brakes are faulty. You get in an accident. You wouldn't just blame yourself, would you? You'd hold the carmaker responsible too.
Speaker 1Yeah, for sure, especially if they knew about the bad brakes and didn't do anything to fix them.
Speaker 2Exactly. And that's part of what Mobley's arguing. He's saying Workday knows their algorithms can be biased. There are studies out there showing how AI can carry over biases from society, discriminate based on race, gender, all sorts of things.
Speaker 1So he's saying Workday is aware, or should be aware, that their software could lead to discrimination. They can't just play dumb.
Speaker 2That's right. And he says they haven't done enough to deal with those potential biases.
Speaker 1So we've got this back and forth right Workday saying we're just the software guys. Up to the employers to use it fairly. Mobley's side is no, you built the tool. You knew it could be biased. You're responsible for what it does, even if you didn't want to discriminate.
Speaker 2You've got it. It's a really complex situation, not black and white at all. Legal stuff, ethical questions the court's got to figure it all out.
Speaker 1OK, the case hinges on whether Workday's an agent of the employers and if they can be held responsible for any discrimination. But there's something else I'm wondering. Mobley hasn't said which companies he thinks actually discriminate against him right.
Speaker 2That's true, and Workday's using that in their defense: okay, you say you were discriminated against, but by whom? Show us proof that specific employers were biased against you because you're Black, over 40, or have a disability.
Speaker 1So Mobley's got a challenge he has to show the connection Workday's algorithms plus what specific employers did led to those unfair rejections.
Speaker 2You got it. It's not enough to say Workday's software might be biased generally. He needs to show how that bias played out for him across all those job applications. And proving discrimination Never easy, but proving it's from algorithmic bias even tougher.
Speaker 1Totally agree. So what are the hurdles he's facing in proving his case?
Speaker 2Well, for starters, he needs data showing exactly how Workday's algorithms were used in his specific applications. What were the screening criteria? What factors were weighted more heavily? How did he score on those Workday assessments? That kind of thing.
Speaker 1I bet getting that data from workday is an uphill battle.
Speaker 2Probably, yeah. Companies guard their algorithms closely, trade secrets, that whole thing. Mobley might have to fight tooth and nail for that information.
Speaker 1Okay, let's say he gets the data. What does he do with it to prove his case?
Speaker 2He has to show a pattern of rejections that can't be explained by anything other than his race, age or disability. For instance, if he consistently aced the skills assessments but kept getting rejected for jobs needing those skills, that could be evidence of bias.
Speaker 1It all comes down to showing a clear connection Workday's algorithms, what the employers did and the discrimination he faced.
Speaker 2Exactly, it's a high bar to clear. But if he can do it, big implications not just for him but for the whole industry.
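The "aced the assessments but still rejected" pattern described above can be sketched as a simple check over application records. The data, field names and thresholds below are entirely hypothetical; the point is only to show what that kind of statistical showing looks like.

```python
# Hypothetical application records for one candidate.
# Fields are invented for illustration, not from the actual case.
applications = [
    {"skills_score": 92, "required_score": 70, "rejected": True},
    {"skills_score": 88, "required_score": 75, "rejected": True},
    {"skills_score": 95, "required_score": 80, "rejected": True},
    {"skills_score": 60, "required_score": 80, "rejected": True},
]

# Keep only applications where he met or beat the stated skills bar.
qualified = [a for a in applications
             if a["skills_score"] >= a["required_score"]]

# If he's rejected every time even when qualified, the rejections
# can't be explained by the stated criteria alone.
qualified_rejection_rate = (
    sum(a["rejected"] for a in qualified) / len(qualified)
)
print(qualified_rejection_rate)  # 1.0 -> rejected in every qualified case
```

On its own this pattern doesn't prove discrimination; it's the kind of evidence that, combined with data about how the algorithms weighted his protected characteristics, would support the inference.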
Speaker 1Okay, let's play this out. He wins the case. What then? What happens to Workday?
Speaker 2Well, there's the money, of course. If the court says they're liable for discrimination, they could have to pay Mobley damages, could be a lot of money. But the bigger thing is the legal precedent.
Speaker 1What do you mean by legal precedent?
Speaker 2If Workday loses, this case could open the floodgates for lawsuits against other HR tech companies. Sends a message: you can't just say we're a neutral platform. You've got a responsibility to make sure your algorithms are fair and don't lead to discrimination.
Speaker 1So a win for Mobley could change the whole game for the industry.
Speaker 2Definitely possible. Companies like Workday might have to be way more transparent about how their algorithms work, more proactive about checking for bias and taking responsibility for the decisions their software is involved in.
Speaker 1That's a huge shift, shows how important this case really is. It's not just one guy and his job search. It's about the role of algorithms in all our lives. Can technology make inequality worse or can it challenge it?
Speaker 2Exactly, and as AI gets more and more powerful, that debate's only going to get more intense.
Speaker 1So back to this specific case. What's next? Where do things stand now?
Speaker 2The court's given Mobley a chance to revise his complaint, provide more specific evidence to back up his claims. It's a crucial moment for him to bolster his case and make those connections we've been talking about.
Speaker 1He's got to show that concrete link between Workday's algorithms and the rejections he faced.
Speaker 2Right. He needs to prove Workday's actions, as that agent of the employers directly led to him being rejected from those jobs.
Speaker 1He's got a lot to do, but if he pulls it off, the impact could be massive.
Speaker 2Absolutely A case worth keeping an eye on.
Speaker 1This deep dive has been fascinating. I can't wait to see how it all plays out. Thanks for helping us understand all the intricacies.
Speaker 2Happy to do it. Where law, tech and ethics meet, Always a lot to think about.
Speaker 1Okay, we've covered a lot Derek Mobley's story, Workday's role, disparate impact, the legal arguments. We even touched on what this case could mean for the whole HR tech world and algorithms in general.
Speaker 2But let's take a step back for a second. What does this case really mean?
Speaker 1Good point.
Speaker 2It raises big questions going beyond just the legal stuff. How does tech fit into society? Can it do good or can it do harm?
Speaker 1Yeah, like, what does fairness mean when algorithms are involved and who's responsible when these systems make decisions that have real consequences for people?
Speaker 2Exactly the questions we need to be asking. We can't just embrace every new technology without thinking critically about how it might affect people.
Speaker 1Right. It's not about saying no to technology. It's about using it responsibly, ethically, in a way that benefits everyone.
Speaker 2Couldn't agree more and cases like this, as messy and complicated as they are, they help us have those conversations, figure out how to navigate this new world.
Speaker 1OK, we've looked at Mobley's claims Workday's defense, what this case might mean for the whole industry. But, as we wrap up, what are the key takeaways for our listeners, especially when it comes to understanding how algorithms might be affecting their own job searches?
Speaker 2Be aware that's the biggest thing Know that algorithms are being used more and more in hiring and that those algorithms can be biased, even if they weren't meant to discriminate.
Speaker 1So not paranoia, just being informed.
Speaker 2Right, know how these systems work, what data they're using, what blind spots they might have, and speak up, demand transparency and accountability from the companies using this technology.
Speaker 1That's a great point. As job seekers, we have a right to know how these decisions are made.
Speaker 2Absolutely, and the more we know about these systems, the better we can navigate them, make sure they're being used fairly and ethically.
Speaker 1It's about being empowered, not just letting the algorithms decide for us.
Speaker 2Exactly, Technology is a tool. Any tool can be used for good or bad. It's up to us to decide how it's used to make sure it reflects our values, our goals.
Speaker 1Well said, really thought-provoking deep dive. Thanks for sharing your insights.
Speaker 2My pleasure.
Speaker 1And to all of you listening, thanks for joining us on the Deep Dive. We'll be back soon with another deep dive into a topic that'll get you thinking. So, to prove his case, Mobley really needs to paint a clear picture for the court. What kind of evidence will they be looking for specifically?
Speaker 2They need to see a direct link, you know, from those Workday algorithms to the rejections he got. Like did the system red flag something about Mobley that caused his applications to be automatically tossed out? Did he get consistently lower scores on Workday's assessments, scores that don't match his actual qualifications? That's what the court needs to figure out.
Speaker 1And on top of that his legal team has to counter Workday's argument the whole we're just a neutral platform thing that it's the employers who should be held responsible for using the software fairly.
Speaker 2Exactly. They have to make a strong case that Workday was more than just a software provider, that they were acting on behalf of those employers like an agent and therefore share the blame for any discrimination that happened.
Speaker 1It all boils down to proving that connection Workday's actions, the algorithms they created and the negative impact it had on Mobley's job search.
Speaker 2That's the heart of the matter. And if Mobley wins could send shockwaves through the whole HR tech world.
Speaker 1What kind of impact are we talking about? Paint us a picture.
Speaker 2Imagine a future where companies like Workday they're required to check their algorithms for bias regularly, to be open about the criteria they're using to screen candidates and to be held accountable for any unequal impact their software might be having. That's the kind of change this could bring.
Speaker 1So this case, it could really change how these companies do business, how they even design their products.
Speaker 2It's definitely within the realm of possibility, and it could give job seekers more power too. They could start demanding more transparency and fairness from the companies they apply to.
Speaker 1It sounds like this case could be a real turning point in this whole debate about making algorithms accountable in the hiring process, but I'm sure there are some people who worry about too much regulation in this area. What are some of those concerns?
Speaker 2Well, some folks argue that too much regulation could stifle innovation, you know, hold back progress in the HR tech sector. The worry is that if companies are constantly looking over their shoulder, afraid of lawsuits about algorithmic bias, they might be less likely to create new and innovative tools.
Speaker 1So it's a delicate balance, protecting job seekers from being treated unfairly but also not squashing progress in the field.
Speaker 2Exactly that's what makes this case so complicated and so important. We have to face these tough questions head on and find solutions that encourage innovation while ensuring fairness and justice. It's not an easy task.
Speaker 1So, as we wrap up this deep dive, what's the one key takeaway you'd want our listeners to remember about the Mobley versus Workday case?
Speaker 2The big takeaway Don't just sit back and watch. We can't be passive in this new age of algorithms. We need to stay informed, stay engaged and be willing to ask the hard questions how is this technology being used? How is it affecting our lives? Those are the questions we need to be asking.
Speaker 1Well said. It's about understanding the role technology plays in our world. Thanks again for walking us through this complicated and fascinating case.
Speaker 2It's been my pleasure Always enjoy these conversations.
Speaker 1And to all our listeners, thanks for joining us on the Deep Dive. We'll catch you next time with another Deep Dive into a topic that'll get those brain cells firing. Until then, keep those questions coming.