Catherine Thorbecke: Alpha slang is brain rot for artificial intelligence, which can’t keep up

In the breakout Netflix crime drama Adolescence, British police scramble for clues after a 13-year-old boy is accused of killing his classmate. Scouring Instagram, investigators initially assume he and the victim were friends - until an officer’s teenage son explains that the emojis under his posts are cloaked references to something more nefarious.
Now, some authorities are hoping technology can help crack this code. Australian Federal Police are working with Microsoft Corp. to develop a prototype artificial intelligence tool that will “interpret emojis and Gen Z and Alpha slang in encrypted communications and chat groups to identify sadistic online exploitation,” according to Commissioner Krissy Barrett. She said the goal was to fight back against an online ring of “crimefluencers” and make it “quicker for our teams to save children from harm much earlier.”
Can AI keep pace with the warp-speed evolution of digital slang? No cap: That’s a tall order. The technology might be able to parse balance sheets, but the idea of teaching it to speak teen is giving boomer.
So far, there are more questions than answers about the initiative. The AFP told me that Barrett’s speech is the only information they can provide at this time.
Microsoft didn’t immediately respond to an emailed request for further details. The effort comes before a law banning children under the age of 16 from social media takes effect next month. (The legislation doesn’t apply to gaming and chat platforms.)
In theory, AI can be trained on social media data or past investigations to spot when an emoji’s meaning turns from innocent to illicit. But that would still make it a lagging indicator. It will be incredibly difficult to keep up with nonsensical internet slang, acronyms or emojis that take off in days and whose meanings then keep shifting within weeks. Take “skibidi toilet”, for example.
A July study led in part by a California teen (and co-authored by an Italian professor of computer science) found that Gen Alpha - those born between 2010 and 2024 - understood 98 per cent of their own online language at a basic level, while parents grasped 68 per cent.
Large language models, meanwhile, comprehended between 58 per cent and 64 per cent. These gaps create “dangerous blind spots where concerning interactions may go undetected,” the researchers warned, particularly in the context of content moderation.
Researchers in India found that AI translation systems fail to adequately parse Gen Alpha slang, due in part to its extensive cultural blending from gaming and meme ecosystems and “rapid semantic evolution”.
Part of the difficulty around decoding this digital-native lexicon stems from how much the meaning of a phrase or emoji can shift depending on the context or platform.
Moreover, as soon as a term becomes “infiltrated” or goes mainstream, teens abandon it. It doesn’t follow a pattern that a computer program can easily comprehend.
As one emoji researcher (yes, that’s a thing) writes, open communication with young people about digital behaviour and which platforms they use will “always be more effective than trying to stay ahead of an ever-shifting symbolic paralanguage”.
In the Emmy-winning Adolescence, it wasn’t an algorithm that cracked the code - it was another teenager.
Tech giants like Microsoft would be wise to invest more research into studying youth linguistics, not just for content moderation but to understand the online worlds kids live in, especially if “crimefluencers” are actually on the rise. Still, AI will never be a substitute for inter-generational dialogue. Parents, caregivers and authorities trying to keep children safe shouldn’t rely on it as a panacea. An LLM’s ability to identify patterns in text doesn’t equate to a true understanding of what that text means.
Take Dictionary.com’s new word of the year, “67”, pronounced “six-seven”. The announcement came with a disclaimer: “Don’t worry, because we’re all still trying to figure out exactly what it means”.
As it turns out, it’s “impossible to define”, the online lexicon added, describing the term as “meaningless, ubiquitous, and nonsensical”.
It’s essentially a catchall that chronically online youth picked up to mess with the older generation. But despite taking over the internet, when I asked OpenAI’s GPT-5 for a definition, it described it simply as a whole number “coming after 66 and before 68”.
Perhaps that’s the point. Gen Alpha’s language seems designed to confuse both parents and machines - and for now, it’s winning. The real challenge might not be teaching machines to talk like kids, but getting adults to listen to them.
Catherine Thorbecke is a Bloomberg Opinion columnist covering Asia tech. Previously, she was a tech reporter at CNN and ABC News.