New course — ChatGPT Prompt Engineering for Developers: Learn how to use ChatGPT's API to build applications for text processing, robotic process automation, coaching, and more


Dear friends,

Last week, we released a new course, ChatGPT Prompt Engineering for Developers, created in collaboration with OpenAI. This short, 1.5-hour course is taught by OpenAI’s Isa Fulford and me. This has been the fastest-growing course I’ve ever taught, with over 300,000 sign-ups in under a week. Please sign up to take it for free!

Many people have shared tips on how to use ChatGPT’s web interface, often for one-off tasks. In contrast, there has been little material on best practices for developers who want to build AI applications using API access to these hugely powerful large language models (LLMs).

LLMs have emerged as a new AI application development platform that makes it easier to build applications in robotic process automation, text processing, assistance for writing or other creative work, coaching, custom chatbots, and many other areas. This short course will help you learn what you can do with these tools and how to do it.

Say you want to build a classifier to extract names of people from text. In the traditional machine learning approach, you would have to collect and label data, train a model, and figure out how to deploy it to get inferences. This can take weeks. But using an LLM API like OpenAI’s, you can write a prompt to extract names in minutes.
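As a rough sketch of that minutes-not-weeks workflow: the whole "classifier" is just a prompt wrapped around the input text. The exact wording, the `<text>` delimiters, and the model name below are illustrative assumptions, not the course's exact code; the guarded API call uses the OpenAI Python library.

```python
# Sketch: extracting person names with a prompt instead of a trained model.
# The prompt wording and <text> delimiters are illustrative assumptions.
import os


def build_name_extraction_messages(text):
    """Build a chat-style payload asking the model to list person names."""
    prompt = (
        "Extract the names of all people mentioned in the text "
        "delimited by <text> tags. Return them as a comma-separated "
        "list, or 'none' if no names appear.\n\n"
        f"<text>{text}</text>"
    )
    return [{"role": "user", "content": prompt}]


def extract_names(text):
    """Send the prompt to the API if a key is configured; otherwise skip."""
    messages = build_name_extraction_messages(text)
    if os.environ.get("OPENAI_API_KEY"):
        import openai  # requires the openai package and an API key
        response = openai.ChatCompletion.create(
            model="gpt-3.5-turbo",  # model name is an assumption
            messages=messages,
            temperature=0,
        )
        return response.choices[0].message["content"]
    return None  # no key set; the payload above shows the prompt itself


print(build_name_extraction_messages("Isa Fulford taught the course.")[0]["content"])
```

Compared with collecting labels and training a model, changing the task here means changing one string.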

In this short course, Isa and I share best practices for prompting. We cover common use cases such as:

  • Summarizing, such as taking a long text and distilling it
  • Inferring, such as classifying texts or extracting keywords
  • Transforming, such as translation or grammar/spelling correction
  • Expanding, such as using a short prompt to generate a custom email
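To make the first of these concrete, a summarizing prompt is often just a template around the input text: an instruction, a length limit, and a delimiter so the model knows where the text starts. The `<text>` tags and the 30-word limit below are illustrative choices, not the course's exact wording.

```python
# Sketch of a summarization prompt: an instruction plus delimited input.
# The delimiter style and default word limit are illustrative choices.

def summarize_prompt(text, max_words=30):
    """Build a prompt asking the model to distill `text` into a summary."""
    return (
        f"Summarize the text delimited by <text> tags "
        f"in at most {max_words} words.\n\n"
        f"<text>{text}</text>"
    )


review = "Got this panda plush toy for my daughter's birthday, and she loves it."
print(summarize_prompt(review, max_words=20))
```

The other use cases follow the same pattern; only the instruction changes (classify, translate, expand).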

We also cover how to build a custom chatbot and show how to construct API calls to build a fun pizza order-taking bot.
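In outline, such a chatbot comes down to managing a growing list of role-tagged messages that is sent to the API in full on every turn. The system prompt, menu, and helper functions below are assumptions sketching that general pattern, not the course's exact code.

```python
# Sketch: a chatbot is a list of role-tagged messages that grows each turn.
# The persona, menu, and prices here are illustrative, not the course's bot.

def make_order_bot():
    """Start a conversation with a system message defining the bot's persona."""
    return [{
        "role": "system",
        "content": (
            "You are a friendly bot that collects pizza orders. "
            "Greet the customer, take the order, then summarize it. "
            "Menu: pepperoni pizza $12.95, cheese pizza $10.95, fries $4.50."
        ),
    }]


def add_turn(messages, role, content):
    """Append one turn; the whole list is resent to the API each call."""
    messages.append({"role": role, "content": content})
    return messages


conversation = make_order_bot()
add_turn(conversation, "user", "Hi, I'd like a cheese pizza.")
# In a real app you would now send `conversation` to the chat completions
# API and append the model's reply with role "assistant".
print(len(conversation))
```

Because the model is stateless, the accumulated list is what gives the bot its memory of the order so far.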

In this course, we describe best practices for developing prompts. Then you can try them out yourself via the built-in Jupyter notebook. If you want to run the provided code, you can hit Shift-Enter all the way through the notebook to see its output, or you can edit the code to gain hands-on practice with variations on the prompts.

Many applications that were very hard to build can now be built quickly and easily by prompting an LLM. So I hope you’ll check out the course and gain the important skill of using prompts in development. Hopefully you’ll also come away with new ideas for fun things that you want to build yourself!

Keep learning!


