For the last four years we have heard how artificial intelligence is supposed to save time and make us more efficient; in many ways, it does. It helps people draft faster, research faster, summarize faster and produce content at a speed that once took far longer. But there is a give and take to everything: what AI saves in speed, it is now costing in verification.
We are entering an age where the markers of credibility are getting harder to trust. Authenticity is starting to mean something much bigger than whether a piece of content was technically created by a human. Now the question is whether the person, product, expertise or information in front of you is real at all. That is where the rise of fake experts becomes more than a media story; it becomes a broader warning.
The National Union of Journalists recently warned of “a surge in fake AI-generated experts appearing in mainstream news” and said it demonstrates the threat AI poses to trust in journalism. PRWeek has also described the wider environment as one filled with “fauxfessionals,” people performing authority without the substance behind it, while publisher Reach has responded by whitelisting hundreds of PR firms as part of its crackdown on fake AI experts.
That matters because once fabricated expertise starts entering mainstream channels, the damage goes far beyond one bad quote or one questionable source. It starts weakening trust in the entire information environment. If a manufactured source is quoted, featured, interviewed or presented as credible, the problem is no longer just the fraudulent profile itself. The problem is that every credible platform that fails to verify that person helps legitimize the fraud and that is the part people need to pay attention to.
AI is making it easier to create manufactured authority; a website can look legitimate, a profile can be curated, a bio can sound impressive, a headshot can appear professional and a quote can read smoothly. Not to mention, social media presence can be manufactured to look established. Press Gazette’s recent reporting found that fake and AI-generated experts are continuing to con their way into media coverage and one analysis it cited suggested there may be a meaningful chance that a quoted expert does not even exist.
Beyond Journalistic Scope
This is why the issue is bigger than journalism alone.
Who Is Affected
- Journalists sourcing experts for stories.
- Professionals pitching spokespeople.
- Marketers trying to identify credible voices.
- Communications and public affairs professionals looking for reliable sources.
The Business Risk
If an account is fraudulent, a company may choose the wrong partner. If a profile is fabricated, someone may quote that person, collaborate with that person, promote that person or hire that person based on authority that was never earned. The issue is not just that these sources are false; it is that they can move through real systems and influence real choices before anyone stops to ask whether they are legitimate. That is why due diligence is important.
One of the promises of AI was that it would remove friction. It would shorten research time, speed up production and help people move faster. But speed without verification creates a different kind of risk. If AI is removing effort on one end, then some of that saved effort has to be redirected on the other end into checking sources, verifying expertise and making sure the person in front of you is real. In other words, AI may be saving time, but it is also increasing the amount of scrutiny required, and that applies across society.
You can already see the broader authenticity problem taking shape. Resumes can be polished beyond the actual skill level of the person behind them. Job interviews can be gamed by deepfake candidates. Products can be made to look one way in a video and arrive looking completely different, or not arrive at all. Expertise can be staged; credibility can be simulated. Researchers and institutions are also warning more broadly that AI is intensifying distrust around media and enabling harms through fabricated content, scams and impersonation.
The Cultural Cost
This is not only a communications issue; it is a cultural one. It is about what happens when visibility begins to outrun credibility, when people want authority without earning it, when speed becomes more valuable than substance and surface-level presentation starts carrying more weight than proof. That is the danger: not just that fraudulent sources exist, but that people and institutions who should know better can end up helping validate them. The moment a fraudulent profile is treated like the real thing by a trusted outlet, a credible brand or a respected professional, the deception gains legitimacy it did not earn. That puts everyone at risk, including real experts.
The more fake authority enters the system, the harder it becomes for honest people to be trusted. Real communicators, real specialists and real subject matter experts are going to have to work harder to prove themselves in an environment where skepticism is rising and appearances are easier to manufacture. That is one of the real costs of this shift: trust does not just disappear for the fraudulent; it becomes harder to earn for the real as well. The broader news and industry response this year reflects exactly that concern: trust is being strained by AI-enabled fakery, and verification standards are tightening in response.
What Should People Take From This?
A simple rule: practice due diligence.
- If you use AI for research, check the work.
- If you rely on AI-assisted content, review it closely.
- If you are quoting an expert, verify the person.
- If you are considering a partner, look beyond the polish.
- If a profile looks impressive, do not stop there.
If you are interested in an expert, see whether they have prior interviews and listen to them. Review their past work and look for a consistent track record. Check whether the expertise holds up outside of one clean bio or one polished website. Schedule an informational interview to ask questions before you proceed. In this environment, due diligence is no longer a bonus skill; it is part of being responsible.
Keep In Mind
The one consistent truth about AI is that no matter how advanced the tools become, they do not remove the need for human accountability; in many cases, they increase it. In Edgar Allan Poe's short story "The System of Doctor Tarr and Professor Fether," a character named Monsieur Maillard says, "Believe nothing you hear, and only one half that you see." In the age of AI, that no longer reads like old wisdom; it reads like an operational rule.
Frequently Asked Questions
What is a fake AI-generated expert?
A fake AI-generated expert is a fabricated persona created using artificial intelligence tools complete with a manufactured biography, AI-generated headshot, and invented credentials designed to appear as a legitimate authority in a given field.
Why is due diligence more important in the age of AI?
AI tools make it faster and cheaper to manufacture convincing authority. Websites, biographies, headshots, and social media presence can all be generated or curated artificially. This means that surface-level credibility is no longer a reliable signal, and verification requires deliberate effort beyond first impressions.
How can you verify whether an expert is real?
Check for prior interviews and listen to them. Review past published work. Look for a consistent track record across multiple independent sources, not just a single polished website or biography. When possible, schedule a direct conversation and ask questions that require genuine subject matter knowledge.
Who is most at risk from fake AI-generated experts?
Journalists sourcing expert commentary, communications professionals vetting spokespeople, businesses evaluating potential partners, and marketers seeking authoritative voices are all directly at risk. Any institution that presents unverified sources as credible can inadvertently legitimize fraudulent authority.
Does the fake expert problem only affect journalism?
While journalism is a visible entry point, the problem extends to hiring decisions, business partnerships, marketing, public affairs, and broader cultural trust. Fabricated authority can influence real decisions across every sector before anyone stops to question its legitimacy.
_________________________________________________________________________________________________________________________
Shaunta Garth is a Strategic Communications & Visibility Architect known for making complex topics clear, relevant and understandable through digital storytelling, media strategy and public affairs.
