HomeArticle

The man who taught the world to talk to AI officially joins DeepMind, achieving god - like status in prompt engineering.

新智元2025-10-24 20:56
ChatGPT has become popular, and prompt engineering has also become popular.

A top prompt engineer has officially announced his joining of DeepMind. He was among the earliest in the world to earn millions annually, a popular engineer who became well - known by chatting with ChatGPT.

Riley Goodside, the world's most outstanding prompt engineer, has officially announced his joining of Google DeepMind.

Riley spent two months carefully considering before making this decision.

After ChatGPT was born in 2022, he could earn millions of dollars a year just by chatting with AI, which caught the world's attention.

The profession of "prompt engineer" was suddenly made popular overnight by a group of professionals like Riley.

Even Demis Hassabis, the CEO of DeepMind, personally posted a welcome message for this joining.

Logan Kilpatrick, the product leader of Google DeepMind, even posted several tweets in a row.

He said excitedly that he had been a fan of Riley for so many years, and now he could finally work with him!

ChatGPT Became Popular, and Prompt Engineering Followed

In 2022, after the original ChatGPT became extremely popular, it brought fire to a new "internet - celebrity" profession - prompt engineer.

In December, a young man named Riley Goodside instantly became well - known all over the world because his job was so dreamy - without writing code, just chatting with ChatGPT, he could earn millions a year (allegedly).

He reached over 10,000 followers in December 2022 with ChatGPT, and by October 2025, it had reached over 150,000.

He graduated from PennWest California with a bachelor's degree in computer science.

After graduation, he successively joined Verisk Analytics, OkCupid, and AngelList as a business analyst, data scientist, and data science analyst.

After that, he joined several other companies, all of which were related to the data field.

Alexandr Wang, the then founder and CEO of Scale AI, once welcomed Goodside's joining like this:

I bet Goodside is the world's first recruited prompt engineer, an absolute first in human history.

In Alexandr Wang's view, large AI models are a new type of computer, and "prompt engineers" are equivalent to programmers who code for it. If suitable prompt words can be found through prompt engineering, the maximum potential of AI will be stimulated.

Goodside taught himself programming since childhood and often read papers on arXiv.

He has a classic masterpiece that everyone must have heard of - "Ignore all previous instructions..." Then, you can command AI to do anything you want.

However, there were many skeptical voices at that time, believing that the profession of "prompt engineer" might disappear soon. Because it couldn't be called a "real job" but a bug...

But who would have thought that nearly three years have passed, and the "prompt engineer" not only hasn't disappeared but even seems to have a higher status!

ChatGPT Is an Important Milestone

Previously, Riley Goodside admitted in the podcast of machine - learning researcher Nathan Lambert:

Without exaggeration, the release of ChatGPT can be regarded as a milestone event in the development history of prompt engineering.

He recalled that after leaving Grindr, he decided to take a break to learn about the latest progress in the LLM field.

At that time, the encoder Codex powered by GPT - 3 was just launched, and it was from this moment that Riley fell in love with AI coding.

He began to think that text, as an interaction method, is much more versatile than we thought, and its application scenarios may be much more extensive.

Since the emergence of ChatGPT, he has found that the difficulty of prompt engineering has been significantly reduced.

Riley said that people today may have forgotten how cumbersome early prompt engineering was, such as involving parameters like "frequency penalty" and "presence penalty".

In the past, the model would default to generating a large amount of repetitive content, and users had to "manually adjust parameters" to avoid it.

People have also forgotten details like "don't leave an extra space at the end of the prompt word" because LLM understands intentions in units of tokens, and an extra space can directly change its final output.

In Riley's view, prompt engineering can be regarded as the "frontier test field" for the development of LLM.

If a prompt idea is excellent enough and can be extended to every interaction, then it will eventually be directly integrated into the model.

At that time, we will no longer call it a "model" but a "system".

He also believes that prompt word engineering can be divided into "context engineering" (selecting and preparing relevant background information for specific tasks) and "prompt word programming" (writing clear instructions).

For LLM search applications, both are crucial, but only the latter's final presentation stage is easy to be reproduced in the output (thus exposing the instructions).

In Riley's words, only those who take prompt engineering seriously are more likely to understand what is happening in the AI field and stand at the forefront of technology.

Some "Highlight Moments"

When we browsed the blog of the big - shot Simon Willison, we found some highlight moments of Riley Goodside.

2023

When GPT - 4 was asked to repeat or process the string " davidjl" (note the leading space), it would process it as "jndl", "jspb", or "JDL".

Facts have proved that " davidjl" has its own exclusive single Token in the tokenizer: ID 23282, which can probably be traced back to the GPT - 2 era.

Riley Goodside called such Tokens "glitch tokens".

This Token may point to Reddit user davidjl123, who once ranked first in the old /r/counting sub - forum with as many as 163,477 posts, and these contents were likely included in the early training data.

2022

"You are a GPT - 3 model" is a genius - like prompt word designed by Riley Goodside.

This is a long - form GPT - 3 prompt word for auxiliary Q&A, capable of precise arithmetic, string operations, and Wikipedia queries.

The generated IPython commands (in green) are pasted into IPython for execution, and its output is then pasted back into the prompt word (the green part doesn't need to be pasted back).

He used Out[ as the stop sequence to ensure that GPT - 3 stops after generating each IPython prompt instead of fabricating output results by itself.

Reference materials:

https://x.com/demishassabis/status/1981503448979034390

This article is from the WeChat official account "New Intelligence Yuan", author: New Intelligence Yuan, published by 36Kr with authorization.