Employee Survival Guide®
The Employee Survival Guide® is an employment law podcast only for employees about everything related to work and your career. We will share with you all the employment law information your employer and Human Resources do not want you to know about working and guide you through various work and employment law issues. This is an employee podcast.
The Employee Survival Guide® podcast is hosted by seasoned Employment Law Attorney Mark Carey, who has practiced exclusively in the area of Employment Law for the past 29 years. Mark has seen just about every type of employment law and work dispute there is and has filed several hundred work-related lawsuits in state and federal courts around the country, including class action suits. He has a no-frills and blunt approach to employment law and work issues faced by millions of workers nationwide. Mark endeavors to provide both sides to each and every issue discussed on the podcast so you can make an informed decision. Again, this is a podcast only for employees.
Subscribe to our employee podcast show in your favorite podcast app including Apple Podcasts and Spotify.
You can also subscribe to our feed via RSS or XML.
If you enjoyed this episode of the Employee Survival Guide®, please like us on Facebook, X and LinkedIn. We would really appreciate it if you could leave a review of this employee podcast on your favorite podcast player such as Apple Podcasts and Spotify. Thank you!
For more information, please contact Carey & Associates, P.C. at 203-255-4150, or email at info@capclaw.com.
Also go to our website EmployeeSurvival.com for more helpful information about work and working.
Inside The AI Dash Cam Biometric Data Lawsuit Transforming Workplace Privacy: Joshua Lewis v. Maverick Transportation LLC and Lytx, Inc.
Comment on the Show by Sending Mark a Text Message.
What happens when cutting-edge AI technology collides with the fundamental right to privacy in the workplace? Join Mark Carey and his co-hosts as they unravel the intricate web of artificial intelligence (AI) and workplace privacy through a gripping legal saga involving Lytx, Inc. and its AI-enabled dashcam system for commercial trucking. This episode of the Employee Survival Guide® dives deep into a federal class action lawsuit that has sent shockwaves through the industry, raising crucial questions about employee rights, surveillance, and the ethical implications of biometric data collection.
As the hosts dissect the allegations surrounding the Illinois Biometric Information Privacy Act (BIPA), they illuminate the contentious debate over workplace surveillance and the potential risks of unregulated technology. With Lytx's dashcams reportedly scanning and analyzing drivers' faces to identify risky behaviors, the implications are staggering. How does this technology affect employee rights? What does it mean for the future of work? The discussion culminates in a $4.25 million settlement, but the questions linger: are we sacrificing our privacy for safety?
Throughout this episode, we explore the broader implications of workplace surveillance, touching on critical themes like discrimination, employee empowerment, and the evolving legal landscape of employment law. As we navigate through the complexities of employment contracts, severance negotiations, and workplace rights, you'll gain invaluable insights into how technology is reshaping the workplace culture. Whether you're grappling with hostile work environments, retaliation claims, or performance monitoring, this episode offers essential survival tips for employees facing the challenges of modern work.
Discover how to advocate for yourself in an era where AI and biometric data collection are becoming the norm, and learn about your rights when it comes to workplace discrimination, including race, gender, and disability discrimination. With insider tips on negotiating severance packages and understanding employment law issues, this episode is a must-listen for anyone looking to navigate the complexities of their career with confidence.
So, if you're ready to empower yourself and take control of your workplace experience, tune in to the Employee Survival Guide® and arm yourself with the knowledge you need to thrive in today's rapidly evolving job market. Don't let technology dictate your rights—be informed, be prepared, and be ready to survive!
If you enjoyed this episode of the Employee Survival Guide, please like us on Facebook, Twitter and LinkedIn. We would really appreciate it if you could leave a review of this podcast on your favorite podcast player such as Apple Podcasts and Spotify. Leaving a review will inform other listeners that the content of this podcast is important in the area of employment law in the United States.
For more information, please contact our employment attorneys at Carey & Associates, P.C. at 203-255-4150, www.capclaw.com.
Disclaimer: For educational use only, not intended to be legal advice.
Welcome back to the deep dive. Today we are opening up a stack of documents that sit right at the collision point of modern artificial intelligence and your privacy at work.
SPEAKER_00And a massive federal class action lawsuit.
SPEAKER_01Right, exactly. If you have ever wondered exactly who owns the mathematical map of your face or how the tools we use to stay safe might be fundamentally changing the definition of surveillance, you are in the exact right place.
SPEAKER_00You really are. It is a fascinating space.
SPEAKER_01Because we are looking at a dispute that on the surface is about commercial truck drivers and dash cams, but underneath it is a battle over the raw materials of the AI age.
Sources And The Neutral Ground Rules
SPEAKER_00It is a remarkable convergence of technology and constitutional theory and state law.
SPEAKER_01Yeah.
SPEAKER_00The source material driving our deep dive today consists of highly detailed legal filings from a federal class action lawsuit that played out in the United States District Court for the Southern District of Illinois.
SPEAKER_01We are pulling directly from three primary documents, right?
SPEAKER_00Yes, three main ones. First, the amended class action complaint, which lays out the plaintiff's grievances and their technical analysis of the surveillance in extensive detail.
SPEAKER_01Which is fascinating reading, by the way.
SPEAKER_00It really is. Second, we had the defendant's official answer to that complaint, where the tech company at the center of this pushes back with a veritable wall of robust legal and technical defenses.
SPEAKER_01A massive wall.
SPEAKER_00And finally, we are examining the memorandum in support of the final $4.25 million class action settlement.
SPEAKER_01The mission for our deep dive today for you, the learner listening, is to take these dense legal filings and translate them.
SPEAKER_00Unpack them completely.
SPEAKER_01We want to unpack exactly how an AI-enabled dash cam system, a piece of hardware placed inside commercial trucking fleets strictly to monitor safety, ended up triggering a massive legal war under the Illinois Biometric Information Privacy Act, which you'll hear us refer to as BIPA today.
SPEAKER_00We will be using that acronym a lot.
SPEAKER_01We are going to explore the granular mechanics of the technology, the incredibly strict legal frameworks trying to contain it, and the high-stakes risk assessments that lead to multi-million dollar payouts.
SPEAKER_00And before we get into the weeds, a crucial point to establish regarding our approach to these sources. We are going to be presenting the factual allegations made by the plaintiffs.
SPEAKER_01Right, the claims they are making.
SPEAKER_00Exactly. And we're also going to present the vigorous categorical denials and counterarguments made by the defendants.
SPEAKER_01Just laying it all out.
SPEAKER_00We are entirely impartial here. Our goal is not to take a side or act as a judge, but simply to analyze the fascinating, sometimes contradictory arguments presented in these source documents. We are your guides through the complexity of the claims.
The Tech: MV Plus AI Explained
SPEAKER_01Okay, let's unpack this. Let's start with the technology at the center of the storm because without understanding the hardware, the legal arguments just don't make sense.
SPEAKER_00They fall apart completely.
SPEAKER_01The defendant in this case is a company called Lytx, Inc. They are a San Diego-based video telematics and fleet management systems corporation.
SPEAKER_00Which is a very long way of saying they provide high-tech video and analytics services to the transportation industry.
SPEAKER_01Right, to put it in plain English. And according to the complaint, they are an absolute giant in this space. The sources note that Lytx technology is used by more than 4,000 fleets across the country.
SPEAKER_00It is a massive footprint.
SPEAKER_01They claim to hold data based on over 100 billion miles of driving.
SPEAKER_00That 100 billion miles figure is critical. I mean, it is not just a marketing statistic, it is the foundational data set they continually use to refine the accuracy of their software.
SPEAKER_01Because that is how machine learning works, right?
SPEAKER_00Exactly. The more miles their cameras observe, the more edge cases their algorithms encounter, and the more robust their models become over time.
SPEAKER_01It just keeps learning.
SPEAKER_00The specific piece of hardware facilitating this massive data gathering and the device at the center of the lawsuit is the SF-300 DriveCam.
SPEAKER_01Which is a specialized camera system retrofitted into commercial trucks.
SPEAKER_00But we need to be clear, it is entirely different from a standard dash cam you might buy for your personal car to record fender benders.
SPEAKER_01Right. You just stick a GoPro on your dash. This is not that.
SPEAKER_00No. The SF-300 monitors both the outside of the vehicle and the interior cab of the truck simultaneously. It is looking out at the highway ahead and it is looking directly inward at the driver.
SPEAKER_01And it is what the camera does with that inward-facing video that kicks off this entire legal saga. Lytx markets their premier technology under a specific acronym. They call it MV Plus AI.
SPEAKER_00MV plus AI stands for machine vision plus artificial intelligence. And Lytx has a very distinct way of explaining this to their clients, which the plaintiffs actually quoted directly in their complaint to build their case.
SPEAKER_01Using their own marketing against them.
SPEAKER_00According to Lytx's own documentation, the machine vision component sees and recognizes, while the artificial intelligence component interprets and decides.
SPEAKER_01Which is a really interesting distinction.
SPEAKER_00The stated purpose of this combined system is to detect risky driving behaviors in real time. We are talking about detecting inattentive driving or speeding or failing to wear a seatbelt.
SPEAKER_01The standard safety stuff.
SPEAKER_00Right, but it goes much further into behavioral monitoring. The system is looking for drivers who are smoking, eating, drinking, or using a mobile device while behind the wheel.
SPEAKER_01When the system detects these behaviors, it issues an in-cab audio alert, usually a beep or a spoken warning to the driver.
SPEAKER_00The goal is to prompt the driver to self-correct in the moment, theoretically preventing a catastrophic highway accident before it happens.
From Safety Tool To Face Scanning Claims
SPEAKER_01Which sounds undeniably great on paper. I mean, preventing accidents involving massive commercial trucks is a goal everyone shares. Nobody wants tired or distracted truck drivers on the road. Absolutely. But this is where the plaintiff's technical claims come in. And this forms the entire biometric foundation of the lawsuit. The plaintiffs, who are commercial truck drivers, allege that this system is not just a passive video camera recording footage for safety reviews.
SPEAKER_00They allege it is an active face scanning surveillance tool.
SPEAKER_01The crux of the plaintiff's argument is basically an engineering assumption. They claim that in order for this drive cam to know with high algorithmic certainty that a driver is holding a cell phone to their ear or putting a cup of coffee to their lips, the camera first has to mathematically map the driver's face.
SPEAKER_00The allegation is that the camera scans the driver's face geometry, identifying unique anchor points.
SPEAKER_01Like what kind of points?
SPEAKER_00Things like the distance between the eyes, the curve of the mouth, the tip of the nose, the edge of the lips. The plaintiffs argue that the system isn't merely detecting a general human shape. They allege it is extracting highly specific geometric data points from that individual human's face to calculate exactly what that human is doing at any given millisecond.
SPEAKER_01To make this concrete for you listening, I really want to describe the visual exhibits the plaintiffs submitted in this complaint because they are wild to look at.
SPEAKER_00They are striking images.
SPEAKER_01The plaintiffs pulled these illustrations directly from Lytx's own marketing materials and technical white papers. They show images of a driver sitting in the cab and overlaying the driver's face is this glowing digital wireframe mesh.
SPEAKER_00It looks exactly like the motion capture technology they use in Hollywood sci-fi movies or video games.
SPEAKER_01Exactly. The complaint shows these wireframes locked onto the driver's head movements. In one image, a driver is holding a phone, and the system is allegedly tracking the proximity of the phone's coordinates to the facial landmark coordinates.
SPEAKER_00There is another showing a driver with a cup and another with a cigarette.
SPEAKER_01And the plaintiffs point to these images and say, look, you cannot accurately track a cigarette going into a mouth unless you have mathematically mapped exactly where that specific mouth is using facial geometry.
SPEAKER_00It is a very compelling visual argument.
SPEAKER_01I have to admit, seeing those wireframes makes the concept of workplace monitoring feel incredibly intimate. I was joking earlier that if I had one of these in my car during my morning commute, the AI would be screaming at me every 10 seconds for sipping my coffee or adjusting my radio.
SPEAKER_00It is a level of scrutiny that most people have never experienced. To really understand the mechanics of what the plaintiffs are alleging, the complaint includes several highly technical exhibits, specifically Exhibits 8 through 11.
SPEAKER_01Which detail the fundamental underlying mechanics of computer vision and face detection technology.
SPEAKER_00And the very first concept we have to untangle here, which is absolutely crucial in biometric law, is the stark difference between face detection and face recognition.
SPEAKER_01I am glad we are pausing on this because to anyone outside of a computer science lab, face detection and face recognition sound like two ways of saying the exact same thing.
SPEAKER_00They really do sound synonymous.
SPEAKER_01But in this legal battle, the distinction is everything.
SPEAKER_00It truly is. Face detection is the process of an algorithm looking at a digital image and answering a very basic question. Is there a human face anywhere in this picture?
Face Detection vs Face Recognition
SPEAKER_01It just finds a face, any face.
SPEAKER_00Exactly. It finds a face, it does not care who the face belongs to. Face recognition takes it a massive step further and answers the question: whose face is this?
SPEAKER_01It identifies the person.
SPEAKER_00Right. It takes the detected face, measures it, and matches it to a specific known identity in a database, like unlocking your smartphone with your face.
SPEAKER_01Now the plaintiffs are alleging that Lytx's system uses sophisticated face detection algorithms that inherently cross the line into biometric collection.
SPEAKER_00Specifically, the exhibits detail foundational computer vision models like the Viola-Jones framework and Haar cascade detection.
SPEAKER_01Let's break down Haar cascade detection because the way these algorithms actually see is totally alien to how a human brain processes a picture. An algorithm doesn't look at a photo and see a person.
SPEAKER_00No, not at all.
SPEAKER_01It sees a grid of millions of pixels with different numerical color values.
SPEAKER_00Exactly. A face detection algorithm starts by scanning the pixels of an image for the easiest, most universal human feature to detect, which is the eyes.
SPEAKER_01Why the eyes specifically?
SPEAKER_00Because in a digital image, the area across the eyes is generally darker than the area across the upper cheeks and the forehead, simply due to the way shadows fall on the human skull. Oh, that makes sense. The algorithm isn't looking for an eyeball. It is looking for that specific mathematical contrast pattern in the pixel data. Dark, light, dark. Once it finds that contrast of the eyes, it attempts to detect the eyebrows, the mouth, the bridge of the nose, the nostrils.
SPEAKER_01It applies a cascading series of mathematical tests.
SPEAKER_00That is the cascade in Haar cascade.
SPEAKER_01I was reading through this part of the source material, and the best way I could visualize it is to imagine the algorithm as an incredibly strict, very fast bouncer at a nightclub doing a visual pat down.
SPEAKER_00I like that analogy.
SPEAKER_01The bouncer isn't looking at your whole outfit at once. First, they check if you have an ID in your hand, that is the eyes. If you don't have an ID, they instantly reject you and move to the next person in line.
SPEAKER_00They don't even bother looking at anything else.
SPEAKER_01Right. If you do have an ID, they move to the second test. Does the picture match? That is the nose and mouth. If it fails at any step, the bouncer immediately stops processing you and moves on. It is a rapid cascading series of yes or no tests.
SPEAKER_00And if a region of pixels passes every single stage of the bouncer's checks, the algorithm concludes, mathematically, that it has found a human face.
SPEAKER_01That highlights the speed and the sequential nature of the processing. It happens in fractions of a second continually, frame by frame.
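To make the bouncer analogy concrete, here is a toy Python sketch of cascade-style detection. This is purely illustrative, not Lytx's actual software: each "stage" is a cheap yes/no contrast test on a patch of pixel values, in the Haar-like dark-over-light spirit, and a patch is rejected the moment any stage fails.

```python
# Toy illustration of cascade-style face detection (not Lytx's code):
# each stage is a cheap yes/no test, applied in sequence, with early rejection.

def region_mean(patch, rows):
    """Average brightness of the given rows of a pixel patch (0=black, 255=white)."""
    vals = [v for r in rows for v in patch[r]]
    return sum(vals) / len(vals)

def eye_band_test(patch):
    # Haar-like "dark over light" contrast: the eye band (row 1)
    # should be darker than the cheek band (row 2) below it.
    return region_mean(patch, [1]) < region_mean(patch, [2])

def mouth_band_test(patch):
    # A second test further down the cascade: the mouth band (row 3)
    # should be darker than the chin band (row 4).
    return region_mean(patch, [3]) < region_mean(patch, [4])

def cascade_detect(patch, stages):
    """Run stages in order; bail out at the first failure -- the 'bouncer'."""
    for stage in stages:
        if not stage(patch):
            return False          # rejected early, no further work done
    return True                   # passed every stage: a "face-like" patch

# A 5x4 patch with dark eye and mouth bands, light cheeks and chin.
face_like = [
    [200, 200, 200, 200],  # forehead (light)
    [ 40,  40,  40,  40],  # eyes (dark)
    [180, 180, 180, 180],  # cheeks (light)
    [ 60,  60,  60,  60],  # mouth (dark)
    [190, 190, 190, 190],  # chin (light)
]
# A uniform patch (say, the road or the dashboard) fails the very first test.
blank = [[128] * 4 for _ in range(5)]

print(cascade_detect(face_like, [eye_band_test, mouth_band_test]))  # True
print(cascade_detect(blank, [eye_band_test, mouth_band_test]))      # False
```

A production detector like Viola-Jones runs thousands of such tests over many window positions per frame, but the shape of the logic, cheap tests first and instant rejection on any failure, is exactly what makes it fast enough to run frame by frame in a moving truck.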
SPEAKER_00But this brings us to the part of the process that the plaintiffs argue crosses the legal line. Feature extraction.
SPEAKER_01Right. This is the core of their claim.
SPEAKER_00The plaintiffs' exhibits explain that once the model is trained and operating, it extracts these specific features, these geometric data points representing the eyes, nose, and mouth, and it compares them to stored patterns to verify the face and track its real-time movements.
SPEAKER_01The plaintiffs argue that this extraction process, by its very nature, involves collecting biometric identifiers.
SPEAKER_00Their argument is that you cannot run a mathematical algorithm that plots the distance between my eyes and my mouth to see if I am drinking coffee without simultaneously capturing my unique face geometry.
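The plaintiffs' feature-extraction argument can be sketched in a few lines of illustrative Python. This is a hypothetical toy, not Lytx's actual pipeline, and the landmark names and coordinates are invented for the example: once a detector has plotted facial landmarks as (x, y) coordinates, the same numbers it uses to judge behavior, such as how close a cup is to the mouth, also yield face-geometry measurements.

```python
import math

# Hypothetical landmark coordinates a detector might plot on one video frame.
landmarks = {
    "left_eye":  (310, 220),
    "right_eye": (370, 222),
    "nose_tip":  (340, 260),
    "mouth":     (340, 300),
}

def distance(a, b):
    """Euclidean distance between two named landmarks."""
    return math.dist(landmarks[a], landmarks[b])

# Behavioral use: is a detected object (a cup) near the mouth landmark?
cup = (352, 310)
near_mouth = math.dist(cup, landmarks["mouth"]) < 25

# Biometric byproduct: the very same coordinates produce geometric
# measurements of that specific face -- the plaintiffs' point.
signature = {
    "eye_to_eye":    round(distance("left_eye", "right_eye"), 1),
    "nose_to_mouth": round(distance("nose_tip", "mouth"), 1),
}
print(near_mouth, signature)
```

Whether computing such measurements in passing amounts to "collecting a scan of face geometry" under BIPA is, of course, precisely the question the parties dispute.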
SPEAKER_01This is where I want you to step into the shoes of the drivers for a moment. Imagine your own workplace, whatever that looks like. Imagine you are sitting at your desk or operating a forklift or stocking inventory.
SPEAKER_00Just doing your normal daily tasks.
How Algorithms See: Haar Cascades
SPEAKER_01Right. How would it change your psychological state if a company-mandated camera was pointed at you every single second of your workday and an algorithm was mathematically plotting your facial landmarks?
SPEAKER_00It is a lot of pressure.
SPEAKER_01It is checking the geometry of your eyes and mouth over and over again, running those bouncer tests constantly just to ensure you aren't taking a sip of water or glancing down at a text message. It fundamentally changes the feeling of being on the clock. It is a profound shift in what it means to be monitored by your employer.
SPEAKER_00It certainly is. And that profound shift in surveillance brings us directly into the legal framework that this entire lawsuit is built upon.
SPEAKER_01Let's talk about that framework. We need to look at BIPA, the Illinois Biometric Information Privacy Act. If you follow technology or privacy news, you've probably heard this acronym whispered in terror by corporate lawyers.
SPEAKER_00It is widely considered the strictest biometric privacy law in the United States.
SPEAKER_01What is genuinely remarkable about BIPA is its history. It was enacted all the way back in 2008.
SPEAKER_00Which is crazy to think about. In terms of modern artificial intelligence, deep learning, and smartphone technology, 2008 is practically the Stone Age. Exactly. But the Illinois legislature was incredibly prescient. They enacted BIPA because they recognized early on the severe, unique, and irreversible risks of compromised biometric data.
SPEAKER_01The core philosophy behind BIPA is actually really elegant when you strip away the legalese. Think about standard data breaches. If someone hacks a corporate server and steals your password or your social security number or your credit card details, that is a massive headache.
SPEAKER_00Huge hassle. You have to make phone calls, freeze your credit, maybe deal with some fraud.
SPEAKER_01But you can change your password. You can cancel your credit card. You can be issued a new PIN. The damage can be mitigated.
SPEAKER_00You can reset those things.
SPEAKER_01Right. But if a server is hacked and cyber criminals steal the exact mathematical map of your face geometry or your fingerprints or your retina scan, you cannot change your face.
SPEAKER_00No, you're stuck with it.
SPEAKER_01You cannot get new fingerprints. You are permanently compromised. That data is tied to your immutable physical biology forever.
SPEAKER_00That concept of biological immutability is the engine of the law. And because the stakes of losing that data are so unimaginably high, BIPA lays out very strict definitions to protect it.
SPEAKER_01How does it actually define the data?
SPEAKER_00The statute defines a biometric identifier as a retina or iris scan, fingerprint, voiceprint, or a scan of hand or face geometry.
SPEAKER_01So very specific physical traits.
SPEAKER_00Yes. But it also casts a wider net by defining biometric information as any information, regardless of how it is captured, converted, stored, or shared, that is based on an individual's biometric identifier and used to identify them.
SPEAKER_01So armed with those strict definitions, the plaintiffs in this class action bring the hammer down. Their complaint hits Lytx with three specific counts. Three distinct ways they allege this AI dash cam system violated Illinois law.
SPEAKER_00Let's walk through them because each one builds on the last.
SPEAKER_01Count I focuses entirely on the retention schedule, which falls under section 15A of BIPA. What exactly does this section demand from a company?
SPEAKER_00Section 15A is about data lifecycle management. The rule states that any private entity in possession of biometric identifiers must have a written policy, and crucially, that policy must be made available to the public.
SPEAKER_01It can't be a secret internal memo.
SPEAKER_00No, it has to be public. This policy has to establish a clear retention schedule and specific guidelines for permanently destroying that biometric data once the initial purpose for collecting it has been satisfied. Or within three years of the person's last interaction with the company, whichever comes first.
Feature Extraction As Biometric Capture
SPEAKER_01So you can't just quietly hoard the data in a server farm forever building up a shadow profile of a person over decades. You have to tell the public exactly when and how you're going to delete it.
SPEAKER_00Precisely. The transparency is mandatory. And the plaintiff's core allegation for Count I is that Lytx completely failed to maintain such a policy.
SPEAKER_01But didn't they have a privacy policy on their website?
SPEAKER_00The complaint notes that while Lytx obviously had a standard privacy policy on its website, that policy actively disclaimed responsibility for the data practices of its clients, the trucking companies.
SPEAKER_01Oh, so they're pointing the finger at the employers.
SPEAKER_00Yes. Lytx's policy stated that they act only as a service provider, processing personal information on behalf of their clients, and directed anyone reading it to go look at the client's own individual privacy policies.
SPEAKER_01So they were basically saying, don't look at us, look at the trucking fleet you work for.
SPEAKER_00Exactly. The plaintiffs argue this passing of the buck completely fails to meet BIPA's explicit requirement for a publicly available, definitive destruction schedule maintained by the entity actually holding the data, which they argue is Lytx.
SPEAKER_01Okay, so they allegedly didn't have a specific retention schedule. To a layman, that might sound like a bureaucratic oversight. Maybe a failure of the compliance department to update a webpage. Is a missing public policy really enough to trigger a multimillion dollar federal lawsuit?
SPEAKER_00On its own, a technical violation of 15A is serious, but it is the second count that truly creates the massive legal exposure. The plaintiffs follow up with Count II, which is the bedrock of BIPA: informed written consent under section 15B.
SPEAKER_01This is the section that gives corporate compliance officers nightmares. Walk us through the mechanics of 15B.
SPEAKER_00The rule is absolute: a company cannot collect, capture, purchase, receive through trade, or otherwise obtain your biometrics without doing three specific things.
SPEAKER_01What are the three steps?
SPEAKER_00First, they must inform you in writing that your biometric identifiers or information are being collected or stored. Second, they must inform you in writing of the specific purpose and the length of term for which your biometrics are being collected, stored, and used.
SPEAKER_01Okay, so full disclosure.
SPEAKER_00Full disclosure. And third, and this is where companies always trip up, they must receive a written release executed by you.
SPEAKER_01An actual signature.
SPEAKER_00Yes. You have to physically or digitally sign a document, giving them explicit informed permission before a single piece of biometric data is collected.
SPEAKER_01And the allegation here from the drivers is a total across-the-board failure on all three fronts. The drivers claim they were never informed in writing by Lytx that their face geometry was being mathematically scanned by these inward-facing dash cams.
SPEAKER_00They claim they were completely in the dark.
SPEAKER_01They claim they were never told how long the resulting data would be kept on Lytx's servers. And most importantly, they allege they absolutely never signed any kind of written release allowing Lytx to capture their biometrics in the first place. They just showed up for work, got into their assigned trucks, and the cameras were already rolling.
SPEAKER_00Building on that lack of consent, we reach Count 3, which is perhaps the most fascinating from a modern tech industry perspective. Count 3 looks at the commercialization of this data under Section 15C.
SPEAKER_01What does 15C say?
SPEAKER_00The rule in BIPA is incredibly strict. No private entity in possession of a biometric identifier may sell, lease, trade, or otherwise profit from a person's biometric identifiers or biometric information.
BIPA’s Origins And Core Logic
SPEAKER_01The way the plaintiffs structure this allegation is really clever because they aren't accusing Lytx of taking a truck driver's face map and literally selling it to a third-party data broker for a quick buck. That would be a cartoonish violation.
SPEAKER_00Right. Nobody's accusing them of selling faces on the black market.
SPEAKER_01Instead, the plaintiffs look at the economics of how AI is built. They claim that Lytx uses this massive database of continuously scanned drivers, again, drawing on that hundred billion miles of driving data, to relentlessly train and refine its artificial intelligence.
SPEAKER_00By continually feeding their machine learning algorithms with this allegedly biometric data, Lytx engineers newer, better, more accurate behavioral detection products.
SPEAKER_01They then turn around and market this supposedly superior, highly trained technology to new corporate clients to win market share.
SPEAKER_00That is the core of their argument. The plaintiffs are arguing that using biometric data as the raw material to build a better commercial product, and then selling that resulting product to increase market dominance is a direct, albeit indirect, violation of BIPA's no-profit rule.
SPEAKER_01They are essentially saying you are profiting off our faces by using them to make your AI smarter so you can sell more dash cams.
SPEAKER_00What's fascinating here is how the defense chose to respond to this massive multi-layered attack. Because Lytx did not just roll over and try to settle quietly on day one.
SPEAKER_01Not at all.
SPEAKER_00When you read Defendant Lytx's official answer to the complaint, you don't just see a disagreement over how to read a statute. You see a masterclass in aggressive, comprehensive, scorched-earth legal defense.
SPEAKER_01The answer document is formidable. Lytx didn't just quibble with the plaintiff's interpretation of Illinois law. They flat out denied the entire foundational technological premise of the lawsuit. It is a blanket, categorical denial of the core allegation.
SPEAKER_00It is a fundamental clash of reality. The plaintiffs are saying your machine vision mathematically maps our faces to know if we were smoking a cigarette, and Lytx is saying no, it doesn't. You fundamentally misunderstand how our proprietary software works.
SPEAKER_01It is two completely different stories.
SPEAKER_00Throughout their answer, Lytx freely admits that their MV plus AI technology detects risky driving behaviors. They proudly admit they sell this technology to improve fleet safety and save lives.
SPEAKER_01They stand behind the product.
SPEAKER_00But they vehemently, repeatedly deny that the DriveCam scans drivers' face geometry. They deny that it harnesses or extracts biometric data points as defined by BIPA. And they expressly, unequivocally deny maintaining a trove of biometrics on their servers.
SPEAKER_01So how do they explain what the camera is actually doing then?
SPEAKER_00They argue that their system detects the objects and the behaviors, the presence of a cell phone, the motion of an arm, the contrast of an eye closing, but it does not create or store a geometric map of the driver's unique facial identity.
Count I: Retention Policy Failures
SPEAKER_01And that technological denial, as forceful as it is, is just the opening salvo. Lytx's legal team doesn't put all their eggs in the "we don't scan faces" basket. In their answer, they list 23 separate affirmative defenses.
SPEAKER_0023 is a lot.
SPEAKER_01Now, for those who might not spend their free time reading civil litigation filings, an affirmative defense is essentially a legal pivot. It is a way for a defendant to say, even if a jury decides that everything the plaintiff says is 100% true, which, for the record, we strongly deny, the plaintiff still cannot win this lawsuit because of this entirely separate legal reason.
SPEAKER_00Categorizing and analyzing these 23 defenses reveals a brilliant multi-pronged strategy designed to attack the lawsuit from every conceivable angle. Let's dig into the major categories of these defenses because they show how complex it is to apply a state privacy law to interstate technology.
SPEAKER_01First, we have what we could loosely call the we're just the vendor defense.
SPEAKER_00Right. Lytx argues that it is a third-party technology provider. They do not employ these truck drivers. The trucking companies, the fleets that hire the drivers, are the ones who purchase the Lytx system and mandate the installation of the cameras in their trucks.
SPEAKER_01Lytx argues that BIPA should not apply to a third-party software vendor that has no direct contractual employment relationship with the individuals being recorded.
SPEAKER_00They claim it would result in an absurd and overly burdensome application of the statute to hold the software developer liable for how an employer chooses to monitor its own employees.
SPEAKER_01It's like suing Microsoft because your boss uses Excel to track your bathroom breaks.
SPEAKER_00That is exactly the logic they are using. Building on the relationship between the driver and the employer, Lytx brings up the implied consent and clean hands defense.
SPEAKER_01How does implied consent work in this context?
SPEAKER_00This defense posits that even if biometric data was collected, which, again, they fiercely deny, the drivers implicitly consented to that collection through their actions.
SPEAKER_01How exactly does one implicitly consent to a biometric scan?
SPEAKER_00Lytx's argument is that these drivers voluntarily got into the cabs of these commercial trucks day after day, week after week. They continuously drove trucks that they knew full well were equipped with inward-facing Lytx technology.
SPEAKER_01There is a camera staring right at them.
SPEAKER_00Yes. Lytx argues you cannot knowingly drive a monitored truck for months, reap the financial benefits and wages of that employment, and then turn around years later and claim your privacy was secretly invaded without your consent. By doing the job under those known conditions, they argue, the drivers accepted the terms.
SPEAKER_01Lytx also brings up the no actual harm argument, which is a classic, enduring defense in privacy and data breach litigation.
SPEAKER_00It is the so what defense. Lytx points out, quite accurately, that the plaintiffs didn't suffer any actual concrete damages.
SPEAKER_01No one's identity was stolen.
SPEAKER_00No. No one's bank account was drained by a hacker. No one lost a job or suffered reputational damage because a cyber criminal stole a wireframe map of their face from a Lytics server.
SPEAKER_01Lytx argues that BIPA was designed to prevent harm, and without actual demonstrable harm, the plaintiffs are not entitled to the massive, ruinous financial damages they are seeking.
SPEAKER_00But the defense doesn't stop at state law interpretations or procedural arguments. They go all the way to the top. They pull out the United States Constitution and federal law to mount their counterattacks.
SPEAKER_01These are the truly heavy legal theories that threaten to invalidate the plaintiff's entire case.
SPEAKER_00Indeed. Lytx raises the doctrine of federal preemption. In the hierarchy of American law, federal law trumps state law. Lytx argues that BIPA, which is a state-level privacy law passed in Illinois, is preempted by the massive body of federal laws governing the interstate trucking industry.
SPEAKER_01They point specifically to federal trucking safety regulations and the Federal Aviation Administration Authorization Act, or FAAAA.
SPEAKER_00Which, despite its name, heavily deregulated the motor carrier industry and still governs it today.
SPEAKER_01The argument is straightforward. The federal government heavily regulates commercial trucking to ensure safety across all 50 states.
SPEAKER_00Right. If an AI dash cam is proven to improve federal safety standards and reduce highway fatalities, the state of Illinois cannot use a local privacy law to effectively ban, penalize, or severely restrict that safety technology. The federal mandate for safety overrides the state mandate for privacy.
SPEAKER_01Then they invoke the dormant commerce clause of the U.S. Constitution. I want to spend a minute on this because it is a fascinating application of a very old constitutional principle to cutting-edge AI.
SPEAKER_00It is a brilliant stretch of the law.
SPEAKER_01Historically, the framers of the Constitution included the Commerce Clause because they didn't want the individual states getting into trade wars with each other, setting up tariffs at their borders, and destroying the national economy.
SPEAKER_00They needed free trade between the states.
SPEAKER_01The dormant aspect of this clause prohibits states from passing legislation that improperly burdens or discriminates against interstate commerce. Applying that to AI dash cams is a brilliant legal maneuver.
SPEAKER_00Lytx argues that applying an Illinois privacy law to interstate commercial truck drivers, drivers who are constantly crossing state lines carrying essential goods across the country, creates an unconstitutional, undue burden on the national supply chain.
SPEAKER_01Just imagine the logistical nightmare this would create. Let's say a truck driver is hauling a load of goods from Indiana through Illinois and into Iowa.
SPEAKER_00Following a standard route.
SPEAKER_01The safety camera is on in Indiana, but the moment the truck's tires cross the state line into Illinois, the trucking company would have to somehow geofence the truck and automatically disable the safety features of the AI camera to avoid BIPA liability, only to turn it back on the moment the truck enters Iowa.
SPEAKER_00Lytx argues that Illinois simply does not have the constitutional authority to impose its hyperlocal privacy standards on the national interstate supply chain in a way that forces companies to degrade their safety protocols.
SPEAKER_01And lastly, if all of those defenses fail, Lytx brings up the excessive fines clause of the Eighth Amendment. Let's do the math on this because this is where the sheer scale of the risk becomes terrifying for a corporation.
SPEAKER_00The numbers get huge very fast.
SPEAKER_01BIPA allows for statutory damages of $1,000 for every single negligent violation and $5,000 for every single intentional or reckless violation.
SPEAKER_00Is it one violation per driver, or is it one violation every single time the camera mathematically scans the face?
SPEAKER_01If an algorithm runs at 30 frames per second, checking facial geometry continually for an eight-hour driving shift, you can see how the math quickly becomes astronomical.
SPEAKER_00Exactly. Even using conservative estimates, if you multiply a $5,000 intentional violation penalty by thousands of truck drivers, the resulting damages don't just punish the company, they obliterate it.
SPEAKER_01Lytx argues that using a state statute to impose hundreds of millions or potentially billions of dollars in damages for a technical privacy violation where no actual data was hacked or stolen operates as an unconstitutional, disproportionate punishment that violates the excessive fines clause.
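To see why the per-scan interpretation is so frightening, here is a quick back-of-the-envelope sketch in Python. The 30 frames per second and eight-hour shift come from the discussion above; treating every frame as a separate violation is the plaintiffs'-best-case assumption being described here, not a holding in the case.

```python
# Illustrative per-scan BIPA damages math. The frame rate and shift length
# are the hypothetical figures from the discussion, not case findings.

NEGLIGENT_PER_VIOLATION = 1_000   # BIPA statutory damages, negligent
RECKLESS_PER_VIOLATION = 5_000    # BIPA statutory damages, intentional/reckless

def per_scan_violations(fps: int, shift_hours: float) -> int:
    """Scans in one driving shift if facial geometry is checked every frame."""
    return int(fps * 3600 * shift_hours)

scans = per_scan_violations(fps=30, shift_hours=8)      # 864,000 scans
exposure = scans * RECKLESS_PER_VIOLATION               # $4,320,000,000

print(f"{scans:,} scans in a single shift")
print(f"${exposure:,} exposure for ONE driver, ONE shift")
```

Under that reading, a single shift for a single driver already dwarfs the entire settlement fund, which is exactly the disproportionality Lytx's Eighth Amendment argument leans on.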
SPEAKER_00This wall of defenses really crystallizes the tension of the whole case. The technology scales effortlessly across state lines, across borders, and across corporate boundaries. But the legal accountability and the laws governing that tech are fragmented, localized, and incredibly rigid.
SPEAKER_01It's easy to get lost in the constitutional law, the dormant commerce clause, and the mathematical mechanics of these AI algorithms, but we have to remember there are actual human beings sitting inside these truck cabs.
SPEAKER_00We can't lose sight of the people.
SPEAKER_01Let's pivot and look at the reality of the people living under this technology. The amended complaint brings forward three named plaintiffs to represent the massive class of drivers: Joshua Lewis, who drove for Maverick Transportation; James Kavanaugh, who drove for Quick Crete; and Nathaniel Timmins, who drove for Gemini Motor Transport.
SPEAKER_00The deep dive into their daily lives, as outlined in the complaint, paints a very specific and isolating picture of the alleged surveillance reality. We have to contextualize their workspace.
SPEAKER_01Right. This isn't an office.
SPEAKER_00These aren't people sitting in an office cubicle from nine to five where they can get up, walk to a break room, or step outside for a private phone call. They're over-the-road commercial truck drivers. The cab of that truck is their entire world for hours and days on end.
SPEAKER_01It is their workspace while driving their break room while parked and often their dining room and bedroom.
SPEAKER_00And according to the complaint, the DriveCam is a constant, unblinking presence in that confined space.
SPEAKER_01The psychological toll of knowing an AI lens is pointed at you analyzing your micro movements while you are just trying to eat a sandwich or drink a coffee on a grueling 10-hour haul is immense.
SPEAKER_00It creates a pressure cooker environment.
SPEAKER_01But there is a specific technical detail from the sources that brings this entire federal lawsuit into the jurisdiction of Illinois, and it has to do with tracking.
SPEAKER_00Yes. The geolocation data. The inward-facing camera doesn't exist in a vacuum. The Lytx system is a comprehensive telematics unit, meaning it is constantly tracking the GPS coordinates, the speed, and the physical location of the trucks in real time.
SPEAKER_01And that GPS ping is the trap being sprung.
SPEAKER_00Exactly. These drivers work national or regional routes. But because their designated routes inevitably took them across the border and physically into the state of Illinois, BIPA's jurisdiction was triggered.
SPEAKER_01So they didn't even have to be Illinois residents.
SPEAKER_00No, not at all. The plaintiffs construct a very logical argument. Because Lytx's system was actively tracking the truck's precise geolocation, Lytx knew exactly where the drivers were at all times. Therefore, Lytx knew full well that the alleged facial scanning was occurring while the drivers were physically present within the geographic borders of Illinois.
SPEAKER_01It didn't matter if the trucking company employing the driver was based in Arkansas or if Lytx's server farms were based in California.
SPEAKER_00The moment those tires rolled onto Illinois asphalt with the inward-facing camera running, the plaintiffs argue, Illinois law applied and Lytx was on the hook.
SPEAKER_01Here's where it gets really interesting because you have this unstoppable technological force meeting an immovable legal object. You have a massive, well-funded tech company vigorously denying the claims and raising 23 formidable constitutional-level defenses.
SPEAKER_00And on the other side, you have plaintiffs armed with the strictest biometric privacy law in the country and a very sympathetic narrative of intense workplace surveillance.
SPEAKER_01It is a legal Mexican standoff. How does this end?
SPEAKER_00It ends with a $4.25 million settlement. But to understand why both sides agreed to lay down their arms, we have to look at the sheer attrition of the litigation. The settlement memorandum outlines a grueling timeline.
SPEAKER_01This wasn't a quick payout or a nuisance settlement.
SPEAKER_00It was three years of grinding, complex, highly adversarial litigation. Lytx fought incredibly hard. They filed a motion to dismiss the case entirely early on, which the plaintiffs successfully survived.
SPEAKER_01Surviving a motion to dismiss is a huge hurdle. It means the judge looked at the complaint and said, yes, there is enough legal merit here to proceed to discovery.
SPEAKER_00And discovery in a case involving proprietary AI algorithms is an absolute nightmare. It is the grueling process of demanding internal emails, technical specifications, source code summaries, and engineering documents from each other.
SPEAKER_01They fought over what documents had to be produced.
SPEAKER_00Finally, after years of spending money on corporate lawyers, it culminated in a full-day intense mediation session with a respected independent mediator.
SPEAKER_01There is also a quick procedural note in the documents regarding Maverick Transportation, the employer of the first named plaintiff, Joshua Lewis.
SPEAKER_00That's an important detail. Early on in the process, Maverick settled their portion of the claims out of court. The documents note they were dismissed from the lawsuit with prejudice.
SPEAKER_01Wait, what does with prejudice actually mean in plain English? Because it sounds like the judge was angry at them.
SPEAKER_00It's a common legal phrase that simply means the dismissal is final and permanent. When a case is dismissed with prejudice, it means the plaintiff is legally barred from ever bringing that exact same claim against that exact same defendant back to court.
SPEAKER_01The book is permanently closed on Maverick.
SPEAKER_00That dismissal left Lytx standing entirely alone to face the music as the sole remaining defendant against the entire class of drivers.
SPEAKER_01So let's look at the final deal they struck in that mediation room. The memorandum lays it out clearly. The settlement establishes a non-reversionary cash fund of $4,250,000.
SPEAKER_00Non-reversionary is a vital term here. It means that Lytx writes the check and puts the money in an escrow account. And no matter how many drivers actually fill out the paperwork to claim their share of the money, Lytx doesn't get a single penny back.
SPEAKER_01If only 10 drivers claim the money, those 10 drivers get a massive payout. Any leftover funds do not revert back to the company.
SPEAKER_00The scale of the class size is staggering. The settlement covers an estimated 85,000 unique class members. That means there are 85,000 individual commercial truck drivers who operated a Lytx-equipped vehicle within the state of Illinois during the specified class period.
SPEAKER_01The court-appointed settlement administrator executed a massive notice program to try and track down these drivers, sending emails, physical mail, and digital ads, and successfully provided direct notice to 22.5% of the massive class.
SPEAKER_00And notably, when the court asked for objections, not a single class member out of the thousands notified objected, and no party opposed the final approval of this settlement.
SPEAKER_01But the big question, the million-dollar question, or I guess the $4.25 million question, is why settle? If Lytx had 23 amazing defenses, if they believed they could win on the dormant commerce clause, and they swore up and down that their tech fundamentally does not scan face geometry, why write a check for $4.25 million?
SPEAKER_00And conversely, if the plaintiffs truly felt their fundamental human privacy was deeply violated by an illegal biometric surveillance tool, why let Lytx off the hook for what, if you divide $4.25 million by 85,000 drivers, amounts to about 50 bucks a driver before the lawyers take their cut?
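That 50-bucks-a-driver figure is just the fund divided by the class size. A minimal sketch, using only the two numbers stated in the episode:

```python
# Gross per-driver share of the settlement fund, before fees and costs.
settlement_fund = 4_250_000   # non-reversionary cash fund
class_size = 85_000           # estimated unique class members

per_driver_gross = settlement_fund / class_size
print(f"${per_driver_gross:.2f} per driver before fees")  # $50.00
```

Actual per-claimant checks would differ, since attorneys' fees and administration costs come out of the fund and not every class member files a claim.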
SPEAKER_01It all comes down to the brutal reality of risk assessment. Both sides faced absolute existential threats if they refused to settle and took this to a jury trial.
SPEAKER_00For the plaintiffs, the risk was that a jury might actually listen to Lytx's engineers and believe them. What if Lytx successfully proved at trial, using expert computer science testimony, that their MV plus AI algorithm genuinely does not scan face geometry as strictly defined by BIPA?
SPEAKER_01What if they proved it only detects shapes and contrast patterns without ever extracting unique biometric identifiers?
SPEAKER_00If a jury believed that technical distinction, the plaintiffs lose everything. They get zero dollars after years of work. Alternatively, what if the judge eventually agreed with Lytx's federal preemption defense? If the court ruled that federal trucking safety laws completely override Illinois state privacy laws, the plaintiff's case is instantly destroyed on constitutional grounds.
SPEAKER_01And for Lytx, the risk was apocalyptic.
SPEAKER_00Apocalyptic is the exact right word. Let's revisit the math on the statutory damages in BIPA. Let's assume the best-case scenario for the plaintiffs and the worst case for Lytx. A jury finds that Lytx intentionally or recklessly violated BIPA. That is $5,000 per violation.
SPEAKER_01Multiply that $5,000 by 85,000 drivers. The damages immediately start at $425 million, nearly half a billion dollars just as a baseline.
SPEAKER_00But it gets worse because BIPA damages have often been interpreted by courts to accrue per scan, not per person. If the AI dash cam mathematically scans a driver's face geometry multiple times a minute for an eight-hour driving shift for three years, the math creates a financial black hole that would bankrupt almost any tech company on Earth.
SPEAKER_01A trial loss could have been a fatal company-ending event.
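The baseline exposure the hosts describe is simple multiplication, but it is worth seeing next to the settlement figure. A sketch using only numbers from this episode:

```python
# Class-wide exposure under the simplest reading: one reckless violation
# per driver. All figures are the ones stated in this episode.

reckless_per_violation = 5_000   # BIPA intentional/reckless statutory damages
class_size = 85_000              # estimated class members
settlement = 4_250_000           # the actual non-reversionary fund

baseline = reckless_per_violation * class_size
print(f"One-violation-per-driver baseline: ${baseline:,}")              # $425,000,000
print(f"Settlement as share of baseline: {settlement / baseline:.1%}")  # 1.0%
```

In other words, Lytx paid roughly one percent of even the most conservative damages theory to make the case disappear, before the per-scan interpretation multiplies that baseline further.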
SPEAKER_00When you look at an exposure of potentially billions of dollars, a $4.25 million settlement is not an admission of guilt. It is a highly calculated, pragmatic business expense to permanently eliminate a catastrophic risk. If we connect this to the bigger picture, this settlement serves as a massive warning beacon for the entire tech and transportation industry.
SPEAKER_01In the age of AI, data is an incredibly valuable asset. It is the raw material that builds the future.
SPEAKER_00But this case proves that unregulated, improperly consented biometric data is a multi-million dollar liability waiting to detonate. It shows that deploying AI surveillance tools, even tools designed with the noble, necessary goal of saving lives on the highway, carries immense, unquantifiable legal risks if those tools intersect with aggressive state privacy laws.
SPEAKER_01So, what does this all mean? We started this deep dive looking at a small plastic camera mounted to the windshield of a truck, and we end up looking at the fragile high-wire balancing act of modern society.
SPEAKER_00On one hand, we possess advanced artificial intelligence that can genuinely prevent horrific accidents. It can catch a tired driver falling asleep at the wheel. It can sound an alarm before a distracted driver rear-ends a family minivan. It definitively saves lives.
SPEAKER_01But on the other hand, we have the strict legal boundaries of human privacy and the right to exist without being mathematically quantified by our employers. Do we have to sacrifice the sovereign ownership of our own biological face geometry to be safe on the highway? This lawsuit proves that we as a society have absolutely not figured out the answer to that question.
SPEAKER_00And that is the ultimate takeaway for you listening to this. The very definition of what constitutes you in a digital space is still being actively fought over in the courts. Is "you" just a traditional photograph? Is it a glowing wireframe map of your facial landmarks? Is it a statistical model of your eye movements?
SPEAKER_01Technology is evolving exponentially faster than the law can comprehend, and every day judges and juries are being forced to draw legal boundaries around concepts that didn't even exist 20 years ago.
SPEAKER_00I want to leave you with a brand new concept to chew on. Something that builds on this entire debate, but looks just over the horizon. Right now, privacy laws like BIPA focus strictly on physical biological identifiers: your face geometry, your fingerprints, your retina.
SPEAKER_01Well, what happens in three or five years when AI systems become so hyperadvanced that they don't even need to see your face to know exactly who you are and what you are doing?
SPEAKER_00What if a smart seat in a truck cab or your office chair could identify you and predict your level of distraction or fatigue solely by the unique microscopic way your body shifts its weight over time? What if sensors embedded in a steering wheel or a keyboard could establish your unique identity based on the distinct rhythm, temperature, and pressure of how your hands grip the surface?
SPEAKER_01When your very behavior and movement become your biometric signature, will our current rigid laws be able to protect that data signature? Or will the entire legal framework have to be torn down and rewritten all over again?
SPEAKER_00That is a wild, slightly terrifying thought to end on, and exactly why we love digging into these sources. Thank you for joining us on this deep dive into the legal and technological frontier.
SPEAKER_01We hope we unpacked the complexity and gave you the insight you were looking for. Keep asking the big questions, keep looking past the headlines, and above all, stay insanely curious. We'll catch you on the next one.