Demand for a Hot New AI Role Is Skyrocketing — but Some Say It’s a Fad

  • Prompt engineers are the hot job of the AI age.
  • Demand for the role has skyrocketed amid the ChatGPT hype.
  • But despite the money, noise, and excitement — some say the role is merely a passing fad.


Kelly Daniel loves her job.

A former media executive, Daniel now works as a prompt director at software company Lazarus AI.

Day to day, she coaxes AI models like OpenAI's GPT, using natural language to steer the model into producing exactly the content she wants. It's like there's a hidden code waiting to be cracked, she says.

“I just find it really interesting, really fun — it’s like solving a word puzzle,” Daniel, whose résumé also includes a prompt engineering role at LinkedIn, said.

Daniel is part of a new wave of AI workers — those without formal tech skills. Known as prompt engineers, anyone from marketing professionals to commercial lawyers can qualify for the buzzy new job.

UK-based Tanya Thomas, a lawyer turned prompt engineer, views her job as a creative one.

“It’s exactly like a Rubik’s cube,” Thomas said regarding the prompting of AI models. “You get one side all green, and then the other side’s messed up, and you have to change it in a constant, iterative process.”

She says the role has even brought out a creative side she didn’t know she had when working in the prescriptive, often black-and-white legal world.

Demand for prompt engineers has been skyrocketing amid the AI hype, and salaries are echoing the excitement. Last year, AI lab Anthropic made headlines after advertising a $300,000 salary for its in-house prompt engineers.

The rise of the role has become an emblem of job positivity amid the AI boom, an example often wheeled out to support the "AI will create new jobs too" rhetoric that tech companies emphasize in response to job-stealing fears.

Yet, despite the money, noise, and excitement — some say the role is merely a passing fad.

A catch-all role

Prompt engineering is a complicated term.

The title comes from a more traditional tech job, one that includes fine-tuning large language models. Since OpenAI’s ChatGPT sparked an increase in AI products prompted by natural language, the job title has expanded to include many other responsibilities.

Some are critical of the new role, particularly when occupied by non-technical candidates.

Critics have likened the role to hiring an expert in Google Search at the turn of the century, predicting the expertise will become so normalized it will no longer be considered a skill.

Conor Grennan, the Dean of Students at NYU Stern School of Business, is among the critics.

“There’s a sense you have to know something you just don’t need to know,” Grennan said in an interview following a passionate LinkedIn post about the rise of non-technical prompt engineering.

The job is particularly frustrating because Grennan, who helps companies incorporate AI, views tools like ChatGPT as among the most democratizing technologies in history.

He said the skill set and experience of the employee using the AI is what really matters, not how people are simply prompting the model. For Grennan, the main way to improve AI-generated content is to learn how to talk to it like a human, which means getting over the mental block of chatting with a machine.

Grennan said bringing in external workers with no knowledge of the company simply to prompt models may even be counterproductive, slowing down workflows by creating an unnecessary block between the AI and the workers with specific skills.

“It’s a bit like hiring someone that understands how to Google better,” he said. “The problem is that the person already in the company and doing the work is the one who has the expertise necessary.”

Leave the prompting up to the AI

Some research suggests an even deeper problem with the trendy new job.

A recent research paper from Rick Battle and Teja Gollapudi at cloud computing company VMware suggests that prompt optimization may be best left up to AI models rather than the humans using them.

The pair found that automatic prompt optimization — getting AI models to refine prompts themselves — produced far better results than anything a human engineer could achieve.

“You should never handwrite a prompt again,” Battle said of the research. “Just write basic instructions, and then let the model optimize the prompt for you.”
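The optimization loop the VMware researchers describe can be illustrated in miniature. The sketch below is a hypothetical, simplified version of the pattern: propose prompt variants, score each against some evaluation, and keep the best. In real systems, `propose_variant` and `score` would be calls to an LLM and a benchmark; here they are stand-ins invented for this example.

```python
# Minimal sketch of automatic prompt optimization (hypothetical example).
# A proposer generates prompt variants, a scorer rates each candidate,
# and a greedy loop keeps whichever variant scores highest.

TWEAKS = [
    "Be concise.",
    "Think step by step.",
    "Answer in one sentence.",
]

def propose_variants(prompt: str) -> list[str]:
    """Stand-in for asking a model to rewrite a prompt in several ways."""
    return [prompt + " " + tweak for tweak in TWEAKS]

def score(prompt: str) -> float:
    """Stand-in for evaluating a prompt against a benchmark task.

    Here we simply reward prompts that ask for step-by-step reasoning
    and lightly penalize length; a real scorer would run the prompt
    against held-out examples and measure accuracy.
    """
    bonus = 10.0 if "step by step" in prompt else 0.0
    return bonus - 0.01 * len(prompt)

def optimize_prompt(seed_prompt: str, rounds: int = 3) -> str:
    """Greedy hill-climb: each round, try every variant of the current best."""
    best, best_score = seed_prompt, score(seed_prompt)
    for _ in range(rounds):
        for candidate in propose_variants(best):
            s = score(candidate)
            if s > best_score:
                best, best_score = candidate, s
    return best

print(optimize_prompt("Summarize the article."))
```

Under this toy scorer, the loop converges on the variant asking for step-by-step reasoning, which mirrors the paper's point: the search over phrasings is mechanical, so a model-driven loop can explore it far more cheaply than a human.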

Battle called prompt engineering roles “ridiculous” and a flash in the pan. He said prompting AI models is an important skill, not a job.

“The only people who should be working on this kind of stuff are people with formal data science backgrounds because the fundamentals here have not changed one iota,” he said.

Separate research from Vasudev Lal, an AI scientist at Intel Labs, reached a similar conclusion.

According to Lal, automated prompt engineering consistently outperforms metrics achieved by human engineers.

“That’s not very surprising because human engineers will only be able to explore a small fraction of the parameter space,” he said.

AI prompt optimization is still expensive, but it’s getting cheaper as the models get more advanced, Lal said.

Lal called the need for human prompt engineering a “temporary blip.”

The more people engage and give feedback to the models, the faster they will get to a point where they can understand human intent, he added.

A headstart for workers

Most prompt engineers aren’t immune to the controversy surrounding their roles.

Yinuo Chen, a prompt engineer at an advertising company in Beijing, says she already views her role as uncertain given the pace of AI development.

“These tools are available to everyone — even my dad uses them sometimes,” she said. “I don’t know if this role will be so technical that it would require an actual official role for it in the future.”

While Chen thinks there’ll always be a need for human agency in AI prompting, she’s nervous about the technology’s effect on the job market in general.

“I was definitely scared that generative AI would change the way that content creation works in general,” she said. “AI changes so fast, and it’s very difficult to pinpoint where it’s going even in the next month.”

Daniel also knows what the critics think of her new job — but she isn’t bothered.

“I definitely have a drive to stay ahead of the curve, and part of that is getting in this early,” she said of her role.

Working in layoff-prone industries like tech and media has taught her that no job is 100% stable. (Daniel was laid off from a role at Meta in the first round of Mark Zuckerberg's cuts.)

She said she's even encouraged by some of the criticism, including the comparison with Google Search.

"I think different standards will emerge between what everybody can do and what the expert prompt engineers can do," she said.

All in all, she’s feeling confident.

“Everybody can run a Google Search now, but I can run a better one,” she said.