Low-background steel and AI
AI-generated content is changing everything
A few months ago I heard the term low-background steel for the first time.
It turns out that steel produced before the Second World War is more valuable for certain applications. The reason is that steel produced after the first nuclear tests was contaminated with traces of radioactive material those tests released into the atmosphere. Radiation levels have since fallen back almost to natural levels, but it is still a fascinating fact.
Today that piece of trivia came to mind while reading the following tweet:
Oct 2022: I ask GPT-3, who won the Super Bowl the year Justin Bieber was born? GPT-3 says Packers. It’s the Cowboys.
— Riley Goodside (@goodside) November 12, 2023
Example is added to LangChain. Other LLMs answer. Answers are added to GitHub.
Now, citing GPT4All, Google says it’s the Buffalo Bills — who have never won: pic.twitter.com/atS8wlEzPt
In 2022 the author asked GPT-3 an evaluation question, and GPT-3 gave an incorrect answer. The question was then added to a site that compiles questions for evaluating LLMs (AI models), and other models started giving incorrect answers to that same question.
Those answers were posted to a GitHub Gist, and Google crawled them.
Now, Google returns one of those incorrect answers.
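The feedback loop behind this can be sketched as a toy simulation. Everything here is illustrative (the corpus, the function names, the number of cycles are my own assumptions, not anything from the tweet): a corpus starts with one correct published answer, a model repeatedly publishes a wrong one, and a search engine that surfaces the most frequent answer eventually flips.

```python
from collections import Counter

# Toy corpus of published answers to the question
# "Who won the Super Bowl the year Justin Bieber was born?"
# The correct answer (the Cowboys) starts as the only entry.
corpus = ["Cowboys"]

def search_answer(corpus):
    # Stand-in for a search engine that surfaces the most
    # frequently published answer.
    return Counter(corpus).most_common(1)[0][0]

def model_answer():
    # Stand-in for an LLM that confidently returns a wrong answer.
    return "Packers"

# Each cycle: a model answers, the answer is posted online
# (a Gist, a benchmark repo), and the crawler indexes it.
for _ in range(3):
    corpus.append(model_answer())

print(search_answer(corpus))  # the wrong answer now wins the vote
```

The point of the sketch is only the mechanism: once model output is published back into the same pool that search (or training) draws from, frequency stops tracking truth.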
This example is probably not an isolated case in Google results. In my experience, I am seeing more and more generated content in my Google searches.
AI-generated content is on the rise, and our tools are not prepared for it.
Content pre-AI will be treasured.
Like low-background steel.