If Millennials had 'dude-bros' with 'entrepreneur' in their Instagram bios, Gen-Z has 'Artificial Intelligence (AI) expert'. The appeal of the title is that anyone can claim authority in a field that is still rapidly evolving, since there is no settled standard to measure them against. For Gen-Z, the practical upshot is that any AI tool ends up being judged on its own merits rather than on the broader implications of AI use. This isn't to deny that genuine AI experts exist, but the actual engineers, expert users, and developers of machine learning tools, the ones with a strong foundation in mathematics, statistics, computer science, and programming, are far fewer than LinkedIn profiles might suggest.
The current AI boom, while a digital frontier worth exploring, is also fertile ground for con artists, grifters, and hustlers. It seems that everyone with a business degree and a six-month coding course now calls themselves an 'AI business expert' or 'AI moneymaker'. The point isn't to state the obvious, but to stress how important it is to approach AI with as much nuance and honesty as possible. Someone who isn't a computer scientist, coder, or expert in linear algebra has little to offer beyond selling courses on how to use ChatGPT, or pointing people toward free resources already available on YouTube.
As an elder Gen-Z, I find the widespread use of such tools distasteful. They're fine for a small business that needs brochure copy, or for writing basic code to make a job easier. But these tools are essentially crutches: they boost productivity in the short term while reinforcing the scarcity mindset that drives most industries. Large Language Models (LLMs) are emerging as tools to service the growing markets of capitalism, rather than to address the problems they were originally created to solve.
I have firsthand experience with this. In my journalism degree, one of our courses involved pitching a hypothetical journalism startup. My group proposed an AI-based fact-checking tool, believing it could speed up the fact-checking stage of news publication. We lost marks because we couldn't explain the intricate coding techniques such a tool would require, even though this detail wasn't part of the rubric. We explained to the panel, two of whom were genuine AI experts, that we were researchers and writers, budding journalists, not the tech professionals who could answer their questions, especially after only a few months of research.
None of this addressed why fact-checking takes so long in the first place. The frustratingly nuanced reason is that not all data can be publicly available at all times, and no model can verify what it cannot access. This is precisely why 'AI experts' keep popping up everywhere: people treat AI as a panacea, when in reality it's just a stop-gap.