What If Your Shot At The Job Ended Before It Even Began?

You had the experience. The skills. The cover letter you had three friends proofread.

But somehow… you never even made it to the interview.

What if the problem wasn’t your résumé—but how it was filtered?

There’s no such thing as a neutral sort. Every hiring platform has to decide which résumés to show first, and which to leave buried at the bottom of the stack. That’s choice architecture. And whether intentional or not, it shapes careers.

Filtering résumés isn’t just a technical decision. It’s a moral one.

That’s where Indeed’s Fairness Filter comes in.

Before a hiring manager sees a single candidate, they’re prompted to choose a definition of fairness that the algorithm will use to filter applicants.

Because fairness isn’t just one thing. Researchers have catalogued at least 21 distinct mathematical definitions of it, each with dramatically different consequences. Some require that the same percentage of candidates be surfaced from every group (demographic parity). Others require that qualified candidates from every group have an equal chance of being surfaced (equal opportunity), or that the candidates the algorithm does surface are equally likely to be genuinely qualified, whatever their group (predictive parity).
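To make the contrast concrete, here is a minimal sketch in Python using a made-up eight-person applicant pool with hypothetical group labels, qualification flags, and filter decisions. None of this reflects Indeed’s actual system; it only shows how each of the three definitions above would be measured on the same set of decisions.

```python
# Illustrative only: a toy applicant pool with a hypothetical "group" attribute,
# a ground-truth "qualified" flag, and the filter's "selected" decision.
from dataclasses import dataclass

@dataclass
class Applicant:
    group: str       # hypothetical demographic group label
    qualified: bool  # whether the applicant is actually qualified
    selected: bool   # whether the filter surfaced them to the hiring manager

pool = [
    Applicant("A", True,  True),  Applicant("A", True,  True),
    Applicant("A", False, True),  Applicant("A", False, False),
    Applicant("B", True,  True),  Applicant("B", True,  True),
    Applicant("B", False, True),  Applicant("B", True,  False),
]

def rate(numerator, denominator):
    return len(numerator) / len(denominator) if denominator else 0.0

for g in ("A", "B"):
    members   = [a for a in pool if a.group == g]
    selected  = [a for a in members if a.selected]
    qualified = [a for a in members if a.qualified]

    # Demographic parity: selection rates should match across groups.
    selection_rate = rate(selected, members)
    # Equal opportunity: qualified candidates should be selected at equal rates.
    true_positive_rate = rate([a for a in qualified if a.selected], qualified)
    # Predictive parity: selected candidates should be equally likely to be qualified.
    precision = rate([a for a in selected if a.qualified], selected)

    print(f"Group {g}: selection={selection_rate:.2f}, "
          f"TPR={true_positive_rate:.2f}, precision={precision:.2f}")
```

On this made-up pool, the filter satisfies demographic parity (0.75 selection rate for both groups) and predictive parity (two thirds of selected candidates are qualified in both groups), yet fails equal opportunity: every qualified candidate in group A gets through, while a qualified candidate in group B has only a two-in-three chance. Deciding which of those numbers must be equal is exactly the judgment the Fairness Filter puts in front of the hiring manager.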

Which one is right? That depends on your values.

But here’s the catch: most platforms don’t ask. They just pick for you.

So unless your hiring manager moonlights as an AI ethicist, and your recruiter as a philosopher, those moral decisions are likely being made by whoever built the algorithm: designers and engineers who may never have considered the ethical stakes of their sort logic, let alone been trained in AI ethics.

The Behavioral Science Behind It

Behavioral science tells us that invisible defaults are powerful. When people don’t know a choice exists, they don’t question its consequences. But even the simplest sort order embeds a worldview—and that worldview can decide whose résumé rises to the top.

By surfacing this moral choice up front, Indeed’s Fairness Filter interrupts that invisibility. It transforms an implicit value judgment into an explicit one.
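In implementation terms, one way to read that shift is: stop shipping a default. The sketch below is purely hypothetical (it is not Indeed’s API); it shows a filtering function whose fairness definition is a required, keyword-only argument, so no caller can inherit a value judgment without naming it.

```python
from enum import Enum

class FairnessDefinition(Enum):
    # Three of the many possible definitions, matching the ones described above.
    DEMOGRAPHIC_PARITY = "demographic_parity"
    EQUAL_OPPORTUNITY = "equal_opportunity"
    PREDICTIVE_PARITY = "predictive_parity"

def filter_applicants(applicants, *, fairness_definition: FairnessDefinition):
    """Surface applicants under an explicitly chosen fairness definition.

    The keyword-only argument deliberately has no default: the caller must
    name the value judgment being applied rather than inherit one silently.
    """
    if not isinstance(fairness_definition, FairnessDefinition):
        raise TypeError("Pick a FairnessDefinition explicitly; none is assumed.")
    # The ranking logic that enforces the chosen definition would go here.
    return list(applicants)
```

Calling filter_applicants(pool) without naming a definition simply fails, which is the point: the moral choice cannot stay invisible.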

A Nuanced Take

Algorithms don’t just filter résumés. They filter opportunity. And if someone’s going to decide what “fair” looks like, it should be someone who understands what’s at stake—because the cost of a hidden moral default isn’t just bad UX. It’s lost potential.

Until platforms design with fairness and visibility in mind, qualified candidates will continue to be left behind—without ever knowing why.