The compliance landscape for job descriptions has never been more complex. Recent enforcement actions paint a clear picture: Tesla and News Corp faced complaints from New York City's Commission on Human Rights for pay transparency violations. Google paid $11 million to settle age discrimination lawsuits tied to recruiting practices. Colorado has issued over 200 compliance notices and assessed $238,000 in fines for job posting violations since 2021.
These aren't isolated incidents. They represent a fundamental shift in how governments regulate hiring practices. Today's HR leaders must navigate three converging forces: traditional federal anti-discrimination laws, new state AI hiring regulations, and expanding pay transparency requirements. This convergence creates both legal risk and competitive opportunity for organizations that get it right.
Most compliance violations are unintentional, stemming from outdated language patterns and evolving legal standards. This guide provides practical, actionable compliance strategies to help HR leaders protect their organizations while attracting diverse talent.
Federal Compliance: EEOC and ADA Requirements
Understanding Protected Classes — Federal law enforced by the Equal Employment Opportunity Commission (EEOC) prohibits discrimination based on race, color, religion, sex (including gender identity, sexual orientation, and pregnancy), national origin, age (40 or older), disability, and genetic information. Job descriptions cannot show preference for, or discourage applications from, any protected class.
The Core Principle: Job requirements must be job-related and consistent with business necessity. If neutral requirements have adverse impact on protected groups, they must represent the least discriminatory alternative available.
Age Discrimination: The Silent Barrier
Age-biased language is one of the most common compliance violations, yet often the hardest for organizations to recognize. Terms that seem innocuous can create legal liability.
Problematic Terms:
- "Digital native" — Implies preference for younger workers
- "Recent graduate" — Discourages older candidates
- "Young and energetic" — Direct age reference
- "Tech-savvy millennial" — Targets specific generation
Compliant Alternatives: Replace "digital native" with "proficient in social media platforms." Change "recent graduate" to "Bachelor's degree in related field." Use "collaborative, fast-paced environment" instead of "young, energetic team."
Gender-Coded Language: Hidden Bias
Research reveals that 60% of businesses show significant male bias in job advertisements. Masculine-coded words significantly deter female applicants, while feminine-coded words don't have the same deterrent effect on male applicants.
Masculine-Coded Terms (May Deter Women):
- Competitive, aggressive, dominant, decisive
- Rock star, ninja, guru, superhero (job titles)
- Strong, assertive, independent, ambitious
Better Alternatives: Replace "rockstar developer" with "Senior Software Developer." Use "results-oriented" instead of "aggressive." Change "competitive environment" to "performance-driven culture."
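Teams that review postings at scale sometimes automate a first-pass screen for the kinds of terms listed above. Below is a minimal sketch of such a scan; the term lists are illustrative, drawn only from the examples in this guide, and a production scanner would use a maintained, legally reviewed lexicon.

```python
import re

# Illustrative term lists taken from the examples above; not exhaustive.
AGE_CODED = ["digital native", "recent graduate", "young and energetic", "tech-savvy millennial"]
MASCULINE_CODED = ["rock star", "rockstar", "ninja", "guru", "superhero", "aggressive", "dominant"]

def flag_coded_language(posting: str) -> dict[str, list[str]]:
    """Return flagged terms grouped by category (case-insensitive, whole-phrase match)."""
    text = posting.lower()
    def hits(terms):
        return [t for t in terms if re.search(r"\b" + re.escape(t) + r"\b", text)]
    return {"age_coded": hits(AGE_CODED), "masculine_coded": hits(MASCULINE_CODED)}

posting = "We need a rockstar developer, young and energetic, a true digital native."
print(flag_coded_language(posting))
# {'age_coded': ['digital native', 'young and energetic'], 'masculine_coded': ['rockstar']}
```

A scan like this is a prompt for human review, not a verdict: context matters (for example, "competitive salary review process" is not the same signal as "competitive, aggressive culture").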
ADA Compliance: Essential vs. Non-Essential Functions
The Americans with Disabilities Act (ADA) protects qualified individuals with disabilities: those who can perform the essential functions of a job with or without reasonable accommodation. The key is distinguishing truly essential functions from marginal tasks.
Common ADA Violations in Physical Requirements:
- Vague requirements: "Must be able to walk long distances"
- Overly broad: "Required to stand for 8-hour shifts"
- Imprecise: "Ability to lift 50 lbs required"
Compliant Language Focuses on Outcomes: Instead of "must be able to walk," use "ability to move throughout facility to access equipment" (allows for wheelchairs/mobility aids). Replace "required to stand for 8-hour shifts" with "position requires standing with ability to take brief sitting breaks." Change "ability to lift 50 lbs" to "occasionally lifts equipment up to 50 lbs; lifting aids available."
Case Study: In a 2024 lawsuit against Amazon, the court found that simply listing 'working while standing' and 'ability to lift 50 lbs' without specifying duration, frequency, or alternatives created ambiguity. The employer couldn't prove these were truly essential as written. The lesson: detailed job descriptions with specific parameters are legally stronger than vague statements.
The New Frontier: State AI Hiring Laws
If your organization uses any form of automated system to screen resumes, rank candidates, or assist with hiring decisions—even basic Applicant Tracking Systems—you may be subject to new state AI regulations. These laws represent the fastest-growing area of hiring compliance.
New York City Local Law 144: The First Mover
Effective July 5, 2023, NYC Local Law 144 established the first comprehensive regulation of AI in hiring. It applies to any employer or employment agency using "Automated Employment Decision Tools" (AEDTs) for positions in NYC or for NYC residents.
Three Core Requirements:
- Annual Bias Audit — An independent third party must have audited the tool for disparate impact by race/ethnicity and sex within the preceding 12 months
- Public Disclosure — Audit summary must be publicly available on company website with no login barriers
- Candidate Notice — At least 10 business days before use, notify candidates about AI use, what qualifications will be assessed, data collection practices, and how to request alternatives
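The 10-business-day notice window is easy to miscount by hand. The sketch below computes the latest permissible notice date by counting weekdays only; this is an assumption for illustration, since "business days" may also exclude applicable holidays, which a real implementation should handle per legal guidance.

```python
from datetime import date, timedelta

def latest_notice_date(tool_use_date: date, business_days: int = 10) -> date:
    """Walk backward from the planned AEDT use date, counting weekdays only."""
    d = tool_use_date
    remaining = business_days
    while remaining > 0:
        d -= timedelta(days=1)
        if d.weekday() < 5:  # Monday=0 .. Friday=4; skips weekends
            remaining -= 1
    return d

# If the AEDT will first be used on Friday 2025-03-21, candidate notice must
# go out on or before:
print(latest_notice_date(date(2025, 3, 21)))  # 2025-03-07
```

Anchoring the calculation to the first planned use of the tool, rather than the posting date, matches the way the requirement is framed above.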
Penalties: $500 for a first violation and $500 to $1,500 for each subsequent violation, with each day of non-compliant use counting as a separate violation. This can quickly escalate to $10,000+ per week of continued non-compliance.
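The weekly figure cited above follows from simple arithmetic. The sketch below assumes each day of non-compliant use is a separate violation, the first at $500 and each subsequent one at the $1,500 maximum; actual assessments depend on the facts of the case.

```python
def max_penalty(days: int, first: int = 500, subsequent: int = 1500) -> int:
    """Upper-bound penalty for `days` consecutive daily violations."""
    if days <= 0:
        return 0
    return first + (days - 1) * subsequent

print(max_penalty(7))                    # first week of non-compliance: 9500
print(max_penalty(14) - max_penalty(7))  # each week thereafter: 10500
```

At the maximum rate, every full week after the first day adds $10,500, which is where the "$10,000+ per week" exposure comes from.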
California: Broader Scope, Stricter Standards
California's regulations, effective October 1, 2025, take a more expansive approach. The state regulates "Automated Decision Systems" (ADS)—a broader category that includes not just AI and machine learning, but also simple algorithms, statistical analysis, and any "selection criteria" used in employment decisions.
Key Compliance Obligations:
- Cannot use ADS that discriminates based on California Fair Employment and Housing Act protected characteristics—even unintentionally
- Employers who conduct anti-bias testing have stronger legal defense
- Must maintain automated-decision data for 4 years (up from 2 years)
- Third-party vendors who provide AI tools can be held directly liable as "agents" of the employer
Critical Note: California's definition is so broad that even basic screening questionnaires and productivity scoring systems may qualify. If any part of your hiring process involves computational analysis, legal review is warranted.
Illinois: Video Interview Protections
Illinois' Artificial Intelligence Video Interview Act (effective January 1, 2020) specifically regulates AI analysis of video interviews for positions in Illinois.
Requirements:
- Must notify applicants before interview that AI will be used
- Must explain how AI works and what characteristics it evaluates
- Must obtain consent before interview
- Cannot share videos except with evaluators
- Must delete videos within 30 days of applicant request
Additional Consideration: If AI uses facial recognition or other biometric data, Illinois' Biometric Information Privacy Act (BIPA) also applies, requiring written consent and strict data handling procedures.
Breaking: Major Class Action Lawsuits Against AI Hiring Platforms
As AI hiring tools proliferate, courts are now addressing their legal implications through two landmark class action lawsuits. These cases demonstrate that compliance risks are not theoretical—they are active, nationwide legal battles involving platforms used by thousands of employers.
Workday: Discrimination and Disparate Impact
Filed in February 2023 and granted nationwide class action status in May 2025, Mobley v. Workday alleges that the platform's AI-powered screening tools systematically discriminate against applicants based on age, race, and disability. The lead plaintiff, Derek Mobley, applied to over 100 positions through companies using Workday and was rejected every time—sometimes within minutes.
The Allegations: Workday's algorithms create disparate impact by learning from historical hiring patterns. If an employer historically disfavored candidates from protected classes, the system allegedly decreases recommendation rates for similar candidates going forward. The lawsuit claims Workday's tools operate as the employer's "agent," making the company directly liable for discrimination—not just the employers using the platform.
Current Status: The case now proceeds as a collective action potentially covering millions of job applicants aged 40 and over who applied through Workday since September 2020. In December 2025, the court ordered Workday to provide a comprehensive list of customers who enabled its AI features, meaning those employers' names will be disclosed to potential class members.
Why It Matters: This case establishes precedent that AI vendors can be held directly liable for discriminatory outcomes. Courts rejected Workday's argument that it merely provides software—the judge found Workday "sufficiently involved in the hiring process" to face liability. The EEOC filed an amicus brief supporting this theory of vendor liability.
Eightfold AI: Transparency and Consent Violations
Filed in January 2026, this class action presents a different legal challenge. Rather than focusing on discriminatory outcomes, the lawsuit alleges Eightfold violated the Fair Credit Reporting Act (FCRA) by compiling secretive reports about job candidates without their knowledge or consent.
The Allegations: Eightfold allegedly collects data from over 1.5 billion sources including LinkedIn, social media, location data, cookies, and internet activity to generate "match scores" ranging from 0-5. These scores predict a candidate's "likelihood of success" and employers allegedly focus only on top-scoring candidates. The lawsuit claims this constitutes a consumer report under FCRA, requiring disclosure, consent, and the right to dispute inaccuracies—none of which Eightfold provides.
The Scale: Eightfold's clients include Microsoft, PayPal, BNY, Morgan Stanley, and numerous Fortune 500 companies. Plaintiffs claim they applied to hundreds of positions and were screened out before human review, with no knowledge that an AI system was evaluating them or ability to correct potentially inaccurate data.
Why It Matters: This lawsuit shifts focus from bias to transparency. Even if an AI system doesn't discriminate, it may violate consumer protection laws if candidates don't know they're being assessed, can't access the data used, and have no mechanism to dispute errors. As one attorney noted: "There is no AI-exemption to these laws."
Two Distinct Accountability Gaps
Together, these lawsuits expose two vulnerabilities in AI hiring systems:
- Gap One: Unfair Outcomes — Do AI tools produce discriminatory results through disparate impact, even without intent? (Workday case)
- Gap Two: Invisible Processes — Do candidates know they're being assessed by AI, what data is used, and have rights to access and correct information? (Eightfold case)
For HR leaders, these cases demand urgent action. It's no longer sufficient to ask whether your AI tools have been tested for bias. You must also ask: Are candidates aware of AI assessment? Can you explain how these systems work? If a candidate requested their assessment data, could you provide it? Do you even have access to the vendor's outputs?
Critical Takeaway: Even if you believe your organization doesn't use AI hiring tools, you may be wrong. Basic applicant tracking systems, resume parsers, and candidate ranking features often incorporate algorithmic decision-making. The legal definition of "AI hiring tools" is broader than most HR teams realize, and courts are actively testing where liability lies—with vendors, employers, or both.
Pay Transparency: The Expanding Mandate
As of 2024-2025, thirteen states require salary disclosure in job postings, with more jurisdictions adding requirements regularly. Pay transparency laws aim to close gender and racial pay gaps by making compensation information accessible during the hiring process.
Colorado: The Gold Standard
Colorado's Equal Pay for Equal Work Act represents the most comprehensive pay transparency requirements in the nation. Effective since 2021 with significant amendments in 2024, it applies to any employer with at least one employee in Colorado.
Required Disclosures in ALL Job Postings:
- Hourly rate or salary compensation (or range)
- General description of bonuses, commissions, other compensation
- General description of benefits (health, retirement, PTO)
- Application deadline date
Beyond Postings: Colorado also requires internal job opportunity notices to ALL employees on the same day, before selection decisions. Within 30 days of any hire or promotion, employers must notify Colorado employees who will work regularly with the new hire.
Enforcement Reality: Colorado has processed over 1,600 complaints and assessed $238,000 in fines through mid-2024. Penalties range from $500 to $10,000 per posting violation. The state actively enforces these requirements.
Other Key Jurisdictions
New York State and NYC: Employers with 4+ employees must include compensation or range in job postings. NYC's Commission on Human Rights filed complaints against over 30 employers in February 2024, including Tesla and News Corp, marking the first major enforcement wave.
California: Employers with 15+ employees must include pay scale in job postings. The requirement applies to positions that can or will be performed in California, including remote roles.
Illinois: Effective January 1, 2025, employers with 15+ employees must include pay scales and benefits in job postings and maintain records for 5 years.
Multi-State Compliance Challenge: For employers operating across multiple states, the complexity multiplies. Each jurisdiction has different thresholds, disclosure requirements, and penalties. Many organizations adopt a "highest common denominator" approach, applying the strictest standard across all postings.
Common Compliance Mistakes and How to Avoid Them
Most violations stem from five recurring patterns. Understanding these mistakes helps organizations proactively address compliance gaps.
1. Vague Salary Language
Problem: Using terms like "competitive pay" or "commensurate with experience" doesn't meet pay transparency requirements. Solution: Provide specific ranges: "$55,000 - $70,000 annually" or "$26 - $34 per hour."
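This check is also easy to automate as part of a posting review. The regex patterns below are illustrative, not exhaustive; jurisdiction-specific rules (for example, whether both ends of a range are required) still need legal review.

```python
import re

# Vague pay phrasings flagged in this guide; extend as needed.
VAGUE_PAY = re.compile(r"competitive (pay|salary)|commensurate with experience", re.IGNORECASE)
# A concrete dollar range such as "$55,000 - $70,000" or "$26 - $34".
SALARY_RANGE = re.compile(r"\$\d[\d,]*(\.\d{2})?\s*-\s*\$\d[\d,]*(\.\d{2})?")

def check_pay_disclosure(posting: str) -> list[str]:
    """Return a list of pay-transparency issues found in the posting text."""
    issues = []
    if VAGUE_PAY.search(posting):
        issues.append("vague pay language")
    if not SALARY_RANGE.search(posting):
        issues.append("no specific salary range found")
    return issues

print(check_pay_disclosure("Competitive pay, commensurate with experience."))
# ['vague pay language', 'no specific salary range found']
print(check_pay_disclosure("Salary: $55,000 - $70,000 annually"))
# []
```

An empty result means only that these two patterns passed; it does not confirm compliance with any particular state's disclosure rules.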
2. Age-Biased Terminology
Problem: Terms like "digital native" or "recent graduate" open age discrimination claims. Solution: Focus on skills: "Proficiency in social media platforms" or "Bachelor's degree in related field."
3. Gender-Coded Job Titles
Problem: Titles like "rockstar developer" or "sales ninja" deter diverse applicants. Solution: Use standard, descriptive titles: "Senior Developer" or "Sales Manager."
4. Overly Broad Physical Requirements
Problem: Stating "must be able to stand for long periods" without specifics violates ADA. Solution: Specify actual requirements: "Position requires standing for 4-6 hours daily with scheduled breaks; seating available."
5. Missing Required Statements
Problem: Omitting EEO statements, reasonable accommodation language, or AI disclosures. Solution: Include standard compliance statements in all job postings (see Practical Compliance Checklist below).
Practical Compliance Checklist: Before You Post
Use this checklist before posting any job description to catch common compliance issues:
Federal Compliance Review
- Remove age-biased terms (digital native, recent grad, youthful)
- Use gender-neutral language (avoid rockstar, ninja, guru)
- Check for masculine/feminine-coded words
- Eliminate unnecessary racial/cultural references
ADA Compliance Review
- Clearly identify essential vs. non-essential functions
- Use outcome-focused physical requirements language
- Specify frequency/duration for physical demands
- Include reasonable accommodation statement
State AI Compliance (if using AI/ATS)
- NYC: Ensure annual bias audit completed and posted
- NYC: Provide 10-day notice to candidates
- California: Document anti-bias testing efforts
- Illinois: Obtain consent for AI video interviews
- Verify vendor compliance: Can you access AI-generated assessments? Do candidates know about AI use?
Pay Transparency Compliance
- Include specific salary range or rate
- List benefits (health, retirement, PTO)
- Add application deadline (if required by state)
- Verify compliance with specific state requirements
Required Statements
- EEO statement
- Reasonable accommodation language
- AI disclosure (if applicable)
Compliance as Competitive Advantage
Job description compliance isn't merely about avoiding legal risk—it's about building competitive advantage in talent acquisition. Organizations with compliant job descriptions attract broader, more diverse candidate pools. They demonstrate commitment to equity and inclusion from the first touchpoint. They reduce time-to-hire by clearly communicating expectations and eliminating barriers for qualified candidates.
The compliance landscape will continue evolving. More states will adopt AI hiring regulations. Pay transparency requirements will expand. Federal agencies may introduce new guidance. The Workday and Eightfold lawsuits signal that courts will scrutinize both discriminatory outcomes and lack of transparency in AI hiring systems. Organizations that build proactive compliance processes today position themselves for sustainable success tomorrow.
At Workforce Transition Partners, we understand that compliance isn't just a legal checkbox—it's fundamental to making authentic qualifications visible to both automated systems and human reviewers. We help organizations navigate the intersection of ATS optimization and compliance requirements, ensuring job descriptions attract the best talent while meeting all applicable legal standards. Our approach emphasizes transparency, human oversight, and ethical AI practices—the same principles courts are now demanding through active litigation.
The organizations that thrive in 2025 and beyond will be those that recognize compliance not as burden but as opportunity—the opportunity to build more inclusive, effective, and legally sound hiring practices that serve both business objectives and societal good.
Ready to Make Your Talent Visible?
Whether you're a professional navigating a career transition or an HR leader looking to optimize your talent pipeline, WTP can help.
Schedule a Consultation