JobLabs

Editorial Standards & Methodology

I've spent 12 years placing candidates, and I've watched too many career sites publish thin, AI-generated filler with no accountability. This page is my answer: exactly how JobLabs is written, tested, and reviewed, in plain English.

Who writes this site

My name is Alex. I'm a real recruiter with 12 years on the desk, mostly in UK commercial and tech hiring, and I'm the person behind almost every word on JobLabs. I'm not a pseudonym, and I'm not a content team hiding behind a brand name. Every article on this site carries an author byline at the top, and if it says "By Alex", I wrote or edited it personally before it went live.

You can read my full background, including the kinds of roles I've placed and the hiring markets I know best, on my author page. If a contributor ever writes here, they'll have their own byline and their own page, with their own professional history. No ghostwriters, no fake experts.

How we test AI tools

Most "best AI tool" roundups you'll read are written from press releases and a 30-second demo. That's not how I work. When I review a tool on JobLabs, I put it through a real testing process designed to catch the things that actually matter to a job seeker.

Every reviewed tool is tested on real, anonymised candidate CVs, used with explicit consent from the candidate. I don't rely on the sample CVs the tool's marketing team provides, because those are tuned to make the product look good. I want to see how the tool behaves on a messy real-world CV with a three-year career gap, a non-linear path, or an overseas qualification.

I run every tool against at least three different candidate profiles: a graduate with no experience, a mid-career professional with a clear trajectory, and a senior candidate changing industries. One pass on one CV is not a review; it's a guess. Three profiles catch the failure modes.

Every output is compared line by line against the source CV. If a tool invents a metric ("increased revenue by 32%") that wasn't in the original material, I mark it down heavily and say so in the review. Hallucinated numbers are one of the biggest reasons candidates get rejected at reference-check stage, and a tool that encourages them is dangerous, not helpful.
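That line-by-line comparison can be partly automated. The sketch below is illustrative only, not my actual tooling: it pulls every numeric token out of the tool's output and flags any figure that doesn't appear in the source CV as a candidate hallucination.

```python
import re

def extract_numbers(text: str) -> set[str]:
    """Pull every numeric token (including percentages) out of a block of text."""
    return set(re.findall(r"\d+(?:\.\d+)?%?", text))

def hallucinated_metrics(source_cv: str, tool_output: str) -> set[str]:
    """Return numbers present in the tool's output but absent from the source CV.

    Anything flagged here is a figure the tool may have invented, and it
    needs a human check before the CV goes anywhere near an employer.
    """
    return extract_numbers(tool_output) - extract_numbers(source_cv)

source = "Managed a team of 4. Grew the account base over three years."
output = "Managed a team of 4 and increased revenue by 32%."
print(hallucinated_metrics(output, output))   # empty: output checked against itself
print(hallucinated_metrics(source, output))   # flags the invented '32%'
```

A flagged number isn't automatically a lie, since a tool may legitimately reword "a third" as "33%", which is why the final call is always made by a human reading both documents side by side.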

Where a tool claims ATS compatibility, I test it against the actual systems it names: Workday, Greenhouse, Taleo, iCIMS. I don't take the claim on faith. If a tool says "ATS-friendly" and then produces a two-column layout Workday can't parse, you'll read about it here first.

Our AI usage disclosure

I use AI on this site. I'd rather be honest about that than pretend I don't, because readers can tell the difference and Google can too.

Here's the split. AI helps me with first-pass drafting, article structure, and rewriting awkward paragraphs. It's a writing tool, the same way a spell-checker is. What AI does not do is decide what the article says, pick the examples, or choose the editorial angle. That's all me.

Every draft is edited, fact-checked, and voice-corrected by me before it publishes. That editing pass is where the real work happens, and it's non-negotiable on every article. I also run every draft through an AI-isms filter that strips out the recognisable ChatGPT cadence: words like "delve", "landscape", "journey", and the other verbal tics that mark text as machine-written. If it still reads like AI after that, it doesn't go live.
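For the curious, the AI-isms filter is conceptually nothing more than a wordlist scan. Here's a minimal sketch; the wordlist shown is a tiny illustrative sample (the entries beyond the three named above are my own examples), and the real filter is longer and maintained by hand.

```python
import re

# Illustrative sample only -- the real wordlist is longer and curated by hand.
AI_ISMS = {"delve", "landscape", "journey", "tapestry", "elevate"}

def flag_ai_isms(draft: str) -> dict[str, int]:
    """Count occurrences of known AI-ism words in a draft, case-insensitively."""
    words = re.findall(r"[a-z]+", draft.lower())
    return {w: words.count(w) for w in sorted(AI_ISMS) if w in words}

draft = "Let's delve into the evolving landscape of AI hiring tools."
print(flag_ai_isms(draft))  # {'delve': 1, 'landscape': 1}
```

A non-empty result doesn't fail the draft automatically; it just tells me which sentences to reread and rewrite in my own voice before the piece goes live.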

The original reporting, the specific examples, and the recruiter's-eye insights come from my own placement work, anonymised. Google's own guidelines explicitly allow AI-assisted content when it meets E-E-A-T standards. Ours does, because a real recruiter with a real track record is the one signing off on every article.

How we choose what to review

There are hundreds of AI career tools on the market, and most of them aren't worth your time. My selection criteria are deliberately strict.

First, the tool has to be available to UK job seekers. I won't review a US-only platform with no UK equivalent, because the advice isn't actionable for most of my readers. If the tool is US-first but works internationally, I'll say so clearly.

Second, the tool must have a genuine free tier or a no-risk trial. That way you can verify my claims yourself instead of taking my word for it. I don't review tools that gate everything behind a paid upfront commitment.

Third, I prioritise tools with real traction, typically 10,000 or more user reviews, or inclusion in a recognised industry ranking. New tools can earn coverage if they solve a problem the established ones don't, but they need a reason to be reviewed beyond a press launch.

Finally, I actively review tools I expect to criticise. Affiliate potential is never the selection driver. Some of the worst tools I've reviewed had generous commission structures.

Affiliate relationships and independence

JobLabs earns some of its revenue through affiliate links. I'm transparent about that, and the structure is designed so it never compromises a review.

Affiliate links are disclosed inline on every review that contains them. You'll see a clear note, usually near the top of the article and again near the link itself, confirming that I may earn a commission if you sign up. No hidden tracking, no obscure disclosure buried in a footer.

Editorial ranking is never affiliate-weighted. A tool with a 5% commission has outranked a tool with a 50% commission on this site, because the 5% tool did the job better. The ranking order in any "best of" comparison is built from my testing scores, not from what pays out.

I've also published negative reviews of tools whose affiliate programs I'm a member of. If a tool hallucinates metrics or fails my ATS testing, the review says so, even when that means recommending readers try a non-affiliate competitor instead. That's the only way reviews are worth anything.

Non-affiliate tools like Google's Interview Warmup and the free tier of ChatGPT get the same full editorial treatment as paid ones. I cover what works, not just what pays.

Fact-checking and corrections policy

Every article on JobLabs shows a publish date and, where relevant, an updated date in its header. You can see when something was written and when it was last reviewed.

When I correct a factual error, two things happen. The updated date changes to reflect the edit, and a correction note is added at the bottom of the article explaining what was wrong and what changed. I don't quietly swap sentences and hope nobody notices. Visible corrections are how trust gets earned.

Reader-submitted corrections are welcome. Email hello@joblabs.ai with the article URL and the error, and I'll review it within two business days. If you're right, the correction goes in with credit to the reader where appropriate.

I won't silently delete articles either. If something is outdated, I update it. If something is wrong, I correct it visibly. If an article is genuinely obsolete, I'll add a clear note explaining why rather than making it vanish.

Advertising and sponsored content

JobLabs plans to run non-intrusive display advertising through Google AdSense once the site meets AdSense's content and traffic thresholds. Ads will be clearly separated from editorial content and never styled to look like article recommendations.

I do not accept paid product placements inside editorial articles. A tool cannot pay to appear in a review, cannot pay for a higher ranking in a comparison, and cannot pay to have a negative mention removed.

If I ever publish sponsored content, it will be labelled "Sponsored" prominently above the article title, disclosed in the meta description, and kept separate from the editorial feed. Sponsors will never see review drafts, and they have no right of reply on independent reviews.

Privacy and candidate data

Anonymisation is the default for any candidate example used on this site. I never publish identifying details of candidates I've worked with: no names, specific employers, exact job titles, or locations. Where I describe a placement story, any identifying detail has been changed or removed.

Privacy requests and data-subject access requests go to privacy@joblabs.ai, with a typical response within two business days. Full details, including our legal basis for processing and data-retention periods, are in the privacy policy.

Report a problem

Found a factual error, a broken link, or an outdated recommendation? I want to know.

Email hello@joblabs.ai with the article URL and the correction. I'll respond within two business days and, if you're right, update the article visibly with a correction note. That's how this site stays worth reading.