AI generates article with ‘serious’ YMYL content issues

Men’s Journal is the latest publication to be called out for using AI to generate content that contained several “serious” errors.

What happened. Eighteen specific errors were identified in the first AI-generated article published on Men’s Journal, titled “What All Men Should Know About Low Testosterone.” As Futurism reported:

Like most AI-generated content, the article was written with the confident authority of an actual expert. It sported academic-looking citations, and a disclosure at the top lent extra credibility by assuring readers that it had been “reviewed and fact-checked by our editorial team.” 

The publication ended up making substantial changes. But as the article noted, publishing inaccurate content on health could have serious implications.

E-E-A-T and YMYL. E-E-A-T stands for experience, expertise, authoritativeness and trustworthiness. It is a concept – a way for Google to evaluate the signals associated with your business, your website and its content for ranking purposes.

As Hyung-Jin Kim, the VP of Search at Google, told us at SMX Next in November (before Google added “experience” as a component of E-A-T):

“E-A-T is a template for how we rate an individual site. We do it to every single query and every single result. It’s pervasive throughout every single thing we do.”

YMYL is short for Your Money or Your Life. YMYL is in play whenever topics or pages might impact a person’s future happiness, health, financial stability or safety if presented inaccurately.

In this case, YMYL applies because inaccurate information can impact someone’s health. Something like this could potentially hurt the E-E-A-T – and eventually the rankings – of Men’s Journal in the future.

Dig deeper: How to improve E-A-T for YMYL pages

However, as Glenn Gabe pointed out on Twitter, in this case the article was noindexed.
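For readers unfamiliar with the term: noindexing is typically done with a robots meta tag in the page’s HTML (or an equivalent `X-Robots-Tag` HTTP header), which tells compliant search engine crawlers not to include that URL in their index:

```html
<!-- Placed in the page's <head>: instructs compliant crawlers
     (e.g., Googlebot) not to index this URL -->
<meta name="robots" content="noindex">
```

The page can still be crawled and viewed by visitors; it simply should not appear in search results.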

While AI content can rank (especially with some minor editing), just remember that Google’s helpful content system is designed to detect low-quality content created for search engines.

We know Google doesn’t oppose AI-generated content entirely. After all, it’s hard to do so when you’re planning to use it as a core feature of your search results.

Why we care. Content accuracy matters. The real and online worlds are confusing and noisy, and your brand’s content must be trustworthy – a beacon of understanding in an ocean of noise. Make sure you are providing the helpful answers and accurate information people are searching for.

Others using AI. Red Ventures brands, including CNET and Bankrate, were also called out previously for publishing poor AI-generated content. Half of CNET’s AI-written content contained errors, according to The Verge.

And there will be plenty more AI content to come. We know BuzzFeed is diving into AI content. And at least 10% of Fortune 500 companies plan to invest in AI-supported digital content creation, according to Forrester.

Human error and AI error. It’s also important to remember that, while AI content can be generated quickly, you need an editorial review process in place to ensure any information you publish is correct.

There will be lots more AI-generated stories to come. AI won’t be perfect because the dataset it was trained on (the web) is full of errors, misinformation and inaccuracies.

And let’s be honest – content written by humans can also contain serious errors. Mistakes happen all the time, from small, niche publishers all the way to The New York Times.

Also, Futurism repeatedly referred to AI content as “garbage.” But let’s not forget that plenty of human-written “garbage” has been published for as long as there have been search engines. It’s up to the spam-fighting teams at search engines to make sure this stuff doesn’t rank. And it’s nowhere near as bad as it was in the earliest days of search 20 years ago.

AI hallucination. Another thing to watch out for is AI making up answers.

“This kind of artificial intelligence we’re talking about right now can sometimes lead to something we call hallucination. This then expresses itself in such a way that a machine provides a convincing but completely made-up answer.”

– Prabhakar Raghavan, a senior vice president at Google and head of Google Search, as quoted by Welt am Sonntag (a German Sunday newspaper)

Bottom line: AI is in early days and there are a lot of ways to hurt yourself as a content publisher right now.

The post AI generates article with ‘serious’ YMYL content issues appeared first on Search Engine Land.
