I’ve been working with data for the past year and a half while building my own company, so I can’t ignore the noise around the launch of OpenAI’s ChatGPT and the fears it has spurred.
Fear. That’s the first word that comes to mind when I hear about ChatGPT, GPT-3, and other large language models. Journalists and bloggers love reporting disaster scenarios where AI takes over the world and puts everyone out of a job.
Leaving that Luddite assertion aside, there have been a lot of conversations about how writers, marketers, PR folks, coders, and people in other creative jobs are at risk from these language models. I can’t think of anything more wrong. Truly creative work can’t be fully replaced (yet!) by any of these models, because, well, the creativity part.
Let me explain.
A language model works off existing patterns and biases, so it can’t stray far from existing work. Even when its output looks new, it is actually an extension of words, images, and strings seen in the past.
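To make that "extension of seen patterns" idea concrete, here is a deliberately tiny toy sketch (my own illustration, not how GPT-3 actually works internally): a bigram model that, by construction, can only emit word pairs it has already observed in its training text.

```python
# Toy bigram "language model": every generated word pair must have
# appeared somewhere in the training text. A real LLM is vastly more
# sophisticated, but the core idea of recombining seen patterns is similar.
from collections import defaultdict
import random

training_text = "the model repeats patterns the model has seen before"

# Record which words were observed following each word.
followers = defaultdict(list)
words = training_text.split()
for prev, nxt in zip(words, words[1:]):
    followers[prev].append(nxt)

def generate(start, length=6, seed=0):
    """Generate text by always picking a follower observed in training."""
    random.seed(seed)
    out = [start]
    for _ in range(length):
        options = followers.get(out[-1])
        if not options:  # dead end: this word never had a follower
            break
        out.append(random.choice(options))
    return " ".join(out)

print(generate("the"))
```

However long you let it run, every adjacent word pair in the output already existed in the training data; the model never invents a genuinely new combination, which is the point being made above.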
If you try to have the model write about topics that require deep expertise, or that are controversial, you will be frustrated with the result. It will put out mostly platitudes if you’re lucky, or plainly wrong advice if you’re not.
But if a job requires gathering symptoms, fixed inputs, or other clear instructions, and producing a similarly clear outcome, then it’s much easier for a model to execute the same task, often faster and with better results.
A few examples of jobs at risk of automation, and where a model could provide a better experience:
- Primary care doctor who gathers your symptoms and either recommends a quick solution to try or routes you to a specialist for further investigation
- L1 customer care specialist who responds to simple inquiries, handles quick refunds, or routes you to the right higher-level specialist as needed
- L1 IT support / debugging based on error messages
- Inbound sales specialist asking qualifying questions of a prospect, given a well-known list of product use cases
- Content moderators
- Bookkeeping and most accounting tasks, as well as taxes for common situations
- Business analysts / data analysts who pull reports from existing data based on a manager’s ask
- SEO strategist choosing keywords to focus on in well-known/saturated markets
Here are a few more that surprised me, but made sense after I read the reasoning behind them: sound engineers, animators, and support engineers.
But we’re not quite there yet. For now, GPT-3 and other LLMs are just writing aids.