How important has prompt engineering become among IT skills?
So important that Erick Brethenoux, an adjunct professor at the Illinois Institute of Technology, said all of his engineering graduate students have put artificial intelligence (AI) prompt-engineering skills on their resumes and LinkedIn profiles. They know that’s what businesses now want.
“It’s critical. Prompt engineering is a very important part of what’s coming, not just for OpenAI, Google, Amazon, and Microsoft, but also for all the open-source models,” Brethenoux said. “They’re going to go into the workforce, and they need to master that technology one way or another.”
That anecdotal evidence is echoed by LinkedIn data, which shows a sharp rise in demand for prompt engineering: more than 10,000 job postings mention it for various positions, and around 100 explicitly seek a "prompt engineer."
Since the start of the year, LinkedIn has seen, on average, a 75% increase each month in members worldwide adding terms like “GAI,” “ChatGPT,” “Prompt Engineering,” and “Prompt Crafting” to their profiles, a spokesperson said. The number of LinkedIn members who hold, or have held, "head of AI" positions has nearly tripled in the last five years.
AI and Machine Learning Specialists top the list of fast-growing jobs, followed by Sustainability Specialists, Business Intelligence Analysts, Information Security Analysts, and Renewable Energy Engineers, according to a survey by the World Economic Forum.
Prompt engineers can earn a healthy six-figure salary, in some cases up to $335,000 a year. That’s because prompt engineering is needed to improve human-machine interaction with genAI tools; understanding the foundational technology, large language models (LLMs), is what makes it possible to elicit the best responses to queries.
The need for skilled prompt engineers is growing rapidly — as fast “as GenAI applications are being considered and piloted,” according to Avivah Litan, a vice president and distinguished analyst at Gartner Research.
“We think it’s important for enterprises to train their existing developers and software engineers on new prompt-engineering techniques," she said. "It isn’t easy to get good results out of genAI models unless you are skilled at data preparation and process flows for prompting models effectively.
“But you don’t need to run out and hire new staff for it — you likely need just a handful of specialized prompt engineers to jumpstart the development process and to help train other motivated and competent engineers and developers already in your organization.”
While every university around the world will eventually be training engineers, developers and others on prompt engineering, Brethenoux, who is also a distinguished vice president analyst at Gartner, agrees that organizations today should focus on upskilling existing employees. “People you already have have domain experience," he said. "You already have technology experts there. There are people working together who already know your business problems."
At the moment, online learning platforms such as Udemy, Coursera, and Codecademy are the best places to turn for upskilling or reskilling employees, Brethenoux said.
Over the past five months, Coursera has launched four courses on the topic, and it has enrolled more than 170,000 students in them, according to a spokesperson. Coursera’s programs include Prompt Engineering for ChatGPT from Vanderbilt University; ChatGPT Prompt Engineering for Developers from DeepLearning.AI; Prompt Engineering for Web Developers from Scrimba; and AI Foundations: Prompt Engineering with ChatGPT from Arizona State University.
Interest in the subject is so intense, in fact, that ChatGPT creator OpenAI and well-known AI scientist Andrew Ng, cofounder and former head of Google Brain, also launched a course called ChatGPT Prompt Engineering for Developers.
And in the business world, companies such as EY, NTT DATA, Datasumi and eDreams have already created prompt engineer job titles, according to Forbes.
Kim Curley, vice president of People & Organization at NTT DATA, said her company has always created new jobs in response to disruptive technologies, and the position of prompt engineer will allow it to take advantage of emerging tech, particularly in the AI space.
"As all of us have seen, there’s a wide variety in the quality of the outputs from AI, and that quality has a lot to do with how we ask the questions we’re looking for answers to," Curley said. "Prompt engineers need to understand not only the underlying data that they’re querying but also the context of the answer that they need. This is a hot job and one that will increase in importance as more organizations are able to use AI."
Upskilling an existing employee, Curley said, is also the way to go when creating the position of prompt engineer.
"The need to understand context for the question and answer gives people who are already working with your data and in your business a big head start to be successful, particularly as AI has a way to go in terms of consistent quality," Curley said. "That said, we see these skills starting to be taught in traditional educational channels now, so expect this role to expand to newer hires who are earlier on in their careers."
So, what are LLMs and why do they need training?
LLMs are deep-learning algorithms, neural networks most often characterized by their massive storehouses of information; they can have millions, billions, or even trillions of parameters, or variables. Essentially, LLMs are next-word generators, and guiding them to produce the most appropriate response for a given query is the job of a prompt engineer.
“Think of it as the process of interacting with a machine to get it to produce the results you’d like,” said Sameer Maskey, a Columbia University AI professor and CEO of Fusemachines, an AI consultancy.
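For readers who want to see what that interaction looks like in practice, here is a minimal sketch using OpenAI’s Python client; the model name, system message, and prompt are illustrative assumptions, not an example supplied by Maskey.

```python
# Minimal sketch of prompting an LLM, assuming the OpenAI Python client (openai>=1.0)
# and an OPENAI_API_KEY environment variable. Model name and prompts are illustrative.
from openai import OpenAI

client = OpenAI()

response = client.chat.completions.create(
    model="gpt-4o-mini",  # assumed model; any chat-capable model works
    messages=[
        # The "system" message frames how the model should behave.
        {"role": "system", "content": "You are a concise analyst for a healthcare insurer."},
        # The "user" message is the prompt itself: the question or task.
        {"role": "user", "content": "List three factors that drive hospital readmission rates, one line each."},
    ],
)

# The model's next-word-by-next-word output comes back as a single message.
print(response.choices[0].message.content)
```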
While most LLMs, such as OpenAI's GPT-4, Google’s LaMDA, or the Hugging Face-backed BLOOM, are pre-trained on massive amounts of data, prompt engineering allows genAI tools to be tailored for specific industry or even organizational use.
Over time, massive, amorphous LLMs such as GPT-4 are expected to give way to smaller models that are less compute intensive and more domain specific, allowing more compact LLMs to gain traction in any number of vertical industries. When that happens, prompt engineering will become even more critical.
“One of the biggest focus areas from a research perspective is how to get similar accuracy out of an LLM without having to use millions, billions, or even trillions of parameters,” Maskey said. “GPT is not going away, but on the back end, it may get smaller.”
In the most basic sense, prompt engineering, or designing questions or tasks for AI, requires five “high-level” considerations, according to Maskey; a short sketch putting them together follows the list.
- Context — Provide the AI engine with a specific sphere of vertical tasks, such as financial services, healthcare, or manufacturing.
- Task — State the task itself; for example, ask it to return a list of all geographies where diabetes is found in 20% or more of the population.
- Specificity — Narrow the scope of answers. Ask the AI tool to create a list versus an image, for example.
- Fine-tuning process — Ask AI to elaborate on its answers to determine accuracy.
- Retuning — If answers are not right, ask it for more information or to elaborate even more.
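To make the list concrete, here is a small, plain-Python sketch of how those five considerations might be assembled into a first prompt and two follow-ups; the build_prompt helper and all of its wording are hypothetical, not a standard template.

```python
# Illustrative sketch: composing a prompt from the five considerations above.
# The build_prompt helper and all wording are hypothetical.

def build_prompt(context: str, task: str, specificity: str) -> str:
    """Combine context, task, and specificity into a single prompt string."""
    return (
        f"Context: {context}\n"        # 1. Context: the vertical or domain
        f"Task: {task}\n"              # 2. Task: what the model should do
        f"Constraints: {specificity}"  # 3. Specificity: narrow the scope and format
    )

first_prompt = build_prompt(
    context="You are assisting a public-health analytics team.",
    task="Return a list of all geographies where diabetes is found in 20% or more of the population.",
    specificity="Answer as a plain list of region names only, no prose.",
)

# 4. Fine-tuning: ask the model to elaborate so its accuracy can be checked.
fine_tune_followup = "For each region listed, cite the prevalence figure and its source."

# 5. Retuning: if the answers are not right, ask for more information or further elaboration.
retune_followup = "Some of those figures look inconsistent; re-check them and flag any uncertainty."

print(first_prompt)
```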
Prompts can also be multimodal. For example, a prompt can pair text with an image in a two-stage framework. In that scenario, prompt engineering could be used to gain insights about radiological images: a prompt engineer inputs radiological images, and the AI engine parses them to flag acute conditions that need greater analysis by radiologists and physicians.
“Based on what you’re entering, it could be a text-based window or a coding platform where you're using snippets of code as prompts," Maskey said. "You could be uploading images or videos, and so forth."
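As a rough sketch of what a multimodal prompt can look like, the snippet below pairs a text instruction with an image URL using OpenAI’s chat API; the model name, URL, and radiology-flavored wording are illustrative assumptions, not a clinical tool.

```python
# Rough sketch of a multimodal (image + text) prompt, assuming the OpenAI Python
# client and a vision-capable chat model. The URL and wording are placeholders.
from openai import OpenAI

client = OpenAI()

response = client.chat.completions.create(
    model="gpt-4o",  # assumed vision-capable model
    messages=[
        {
            "role": "user",
            "content": [
                # Text part of the prompt: what to do with the image.
                {"type": "text",
                 "text": "Describe any regions in this image that merit closer review by a specialist."},
                # Image part of the prompt, passed by URL.
                {"type": "image_url",
                 "image_url": {"url": "https://example.com/sample-scan.png"}},
            ],
        }
    ],
)

print(response.choices[0].message.content)
```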
One advantage of genAI is its accessibility to non-tech employees — that is, employees don’t necessarily need coding chops, Maskey said.
“For example, if you want to build a linear regression model on your system or some kind of statistical analysis on a given data set — before, business people couldn’t, because they didn’t know Python, for example. There are a lot of instances where non-engineers would get a lot of advantages out of being able to do statistical analysis on datasets. Now, they can.”
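As an illustration of that point, the snippet below is the kind of Python a genAI assistant might hand back to a business user who asks, in plain English, to fit a linear regression to monthly ad spend and sales; the variable names and numbers are made up for the sketch.

```python
# Illustrative example of code a genAI tool might generate from a plain-English
# request such as "fit a linear regression to my monthly ad spend and sales."
# The data and variable names are made up.
import numpy as np

ad_spend = np.array([10, 15, 20, 25, 30, 35], dtype=float)  # e.g., $ thousands
sales = np.array([52, 61, 73, 79, 90, 102], dtype=float)    # e.g., units sold

# Fit a straight line: sales ~ slope * ad_spend + intercept
slope, intercept = np.polyfit(ad_spend, sales, deg=1)

print(f"slope={slope:.2f}, intercept={intercept:.2f}")
print(f"predicted sales at $40k spend: {slope * 40 + intercept:.1f}")
```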
Prompt engineering, Maskey said, will someday be akin to knowing how to use an Excel spreadsheet, a skill that today is useful across many business units and departments.
Gartner’s Litan believes prompt engineering will eventually be folded into application engineering and software developer career streams. “It will be a required skill for the future, but it will not be a separate career stream,” she said.
Cloud providers are also expected to launch prompt-engineering services, according to a new report from Forrester Research.
“In 2024, all of the hyperscalers will announce prompt engineering," Forrester said. "However, enterprise adoption will be limited. Due to incomplete contextual data and limited experience in natural language and prompt engineering among data scientists, the cloud providers’ first-gen prompt-engineering services will not suffice to address tailored fine-tuning needs."
Even so, developing prompt-engineering skills is mostly “learning by doing at this point, because there’s no book on prompt engineering," said Brethenoux. “What kind of expertise do we have with just nine months’ experience [with genAI tools]?” he said. “So, the main thing is about building an AI literacy program within an organization, and then start upskilling people.”