Merge pull request microsoft#313 from nitya/feat/add-sketchnotes
Feat/add sketchnotes (Lessons 4 and 18)
koreyspace committed Mar 5, 2024
2 parents f198fac + 229bd8a commit 0089beb
Showing 13 changed files with 32 additions and 19 deletions.
9 changes: 8 additions & 1 deletion 04-prompt-engineering-fundamentals/README.md
Original file line number Diff line number Diff line change
@@ -1,6 +1,6 @@
# Prompt Engineering Fundamentals

[![Prompt Engineering Fundamentals](./images/04-lesson-banner.png?WT.mc_id=academic-105485-koreyst)](https://learn.microsoft.com/_themes/docs.theme/master/en-us/_themes/global/video-embed.html?id=d54c0c69-b183-4a6c-80ed-8b1a8f299cff?WT.mc_id=academic-105485-koreyst)
[![Prompt Engineering Fundamentals](./images/04-lesson-banner.png?WT.mc_id=academic-105485-koreyst)](https://learn.microsoft.com/_themes/docs.theme/master/_themes/global/video-embed.html?id=d54c0c69-b183-4a6c-80ed-8b1a8f299cff?WT.mc_id=academic-105485-koreyst)

The way you write your prompt to an LLM matters. A carefully crafted prompt can achieve a better quality of response. But what exactly do terms like _prompt_ and _prompt engineering_ mean? And how do I improve the prompt _input_ that I send to the LLM? These are the questions we'll try to answer in this chapter and the next.

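To make the idea of a prompt _input_ concrete, here is a minimal sketch (not from the lesson itself) of how a chat-style prompt is typically structured before it is sent to an LLM API. The function name and prompt text are illustrative placeholders:

```python
# Minimal sketch: structuring a chat-style prompt before sending it to an LLM.
# The helper name and example text are hypothetical, not from this lesson.
def build_prompt(user_input: str, system_hint: str = "You are a helpful teaching assistant."):
    """Return a chat-completion style message list for the given user input."""
    return [
        {"role": "system", "content": system_hint},  # sets the model's behavior
        {"role": "user", "content": user_input},     # the actual prompt text
    ]

messages = build_prompt("Explain photosynthesis to a 10-year-old.")
print(messages[1]["content"])  # the user prompt the LLM will see
```

Improving the prompt _input_ then amounts to iterating on the `content` strings in this structure.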
@@ -33,6 +33,13 @@ The Jupyter Notebook accompanying this lesson provides a _sandbox_ environment w

The notebook comes with _starter_ exercises - but you are encouraged to add your own _Markdown_ (description) and _Code_ (prompt requests) sections to try out more examples or ideas - and build your intuition for prompt design.

## Illustrated Guide

Want to get the big picture of what this lesson covers before you dive in? Check out this illustrated guide which gives you a sense of the main topics covered, and the key takeaways for you to think about in each one. The lesson roadmap takes you from understanding the core concepts and challenges, to addressing them with relevant prompt engineering techniques and best practices. Note that the "Advanced Techniques" section in this guide refers to content covered in the _next_ chapter of this curriculum.

![Illustrated Guide to Prompt Engineering](./images/04-prompt-engineering-sketchnote.png?WT.mc_id=academic-105485-koreyst)


## Our Startup

Now, let's talk about how _this topic_ relates to our startup mission to [bring AI innovation to education](https://educationblog.microsoft.com/2023/06/collaborating-to-bring-ai-innovation-to-education?WT.mc_id=academic-105485-koreyst). We want to build AI-powered applications of _personalized learning_ - so let's think about how different users of our application might "design" prompts:
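One way to picture how different users might "design" prompts is a template parameterized by persona. This is a hypothetical sketch; the personas and template strings are invented for illustration, not taken from the lesson:

```python
# Hypothetical sketch: a prompt template parameterized by user persona,
# illustrating how different users of a learning app might "design" prompts.
def design_prompt(persona: str, topic: str) -> str:
    templates = {
        "student": "Explain {topic} in simple terms and give one worked example.",
        "teacher": "Create a short lesson plan outline for teaching {topic}.",
    }
    return templates[persona].format(topic=topic)

print(design_prompt("student", "fractions"))
```

The same topic yields a different prompt per persona, which is exactly the kind of per-user prompt design a personalized-learning app needs.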
(File could not be displayed in this view.)
@@ -1,6 +1,6 @@
# Chapter 4: Prompt Engineering Fundamentals

[![Prompt Engineering Fundamentals](../../images/04-lesson-banner.png?WT.mc_id=academic-105485-koreyst)](https://learn.microsoft.com/_themes/docs.theme/master/en-us/_themes/global/video-embed.html?id=d54c0c69-b183-4a6c-80ed-8b1a8f299cff?WT.mc_id=academic-105485-koreyst)
[![Prompt Engineering Fundamentals](../../images/04-lesson-banner.png?WT.mc_id=academic-105485-koreyst)](https://learn.microsoft.com/_themes/docs.theme/master/_themes/global/video-embed.html?id=d54c0c69-b183-4a6c-80ed-8b1a8f299cff?WT.mc_id=academic-105485-koreyst)

How you write your prompt to an LLM matters; a carefully designed prompt can achieve better results than one that is not. But what exactly are these concepts of prompt and prompt engineering, and how do I improve what I send to the LLM? Questions like these are what this chapter and the next aim to answer.

Expand Down
@@ -1,6 +1,6 @@
# Prompt Engineering Fundamentals

[![Prompt Engineering Fundamentals](../../images/04-lesson-banner.png?WT.mc_id=academic-105485-yoterada)](https://learn.microsoft.com/_themes/docs.theme/master/en-us/_themes/global/video-embed.html?id=d54c0c69-b183-4a6c-80ed-8b1a8f299cff?WT.mc_id=academic-105485-yoterada)
[![Prompt Engineering Fundamentals](../../images/04-lesson-banner.png?WT.mc_id=academic-105485-yoterada)](https://learn.microsoft.com/_themes/docs.theme/master/_themes/global/video-embed.html?id=d54c0c69-b183-4a6c-80ed-8b1a8f299cff?WT.mc_id=academic-105485-yoterada)

With large language models (LLMs), how you write your prompt matters a great deal: a carefully crafted prompt yields better results than one that is not. But what exactly are prompts and prompt engineering? And how can I improve what I send to the LLM? This chapter and the next aim to answer those questions.

Expand Down
@@ -1,6 +1,6 @@
# Prompt Engineering Fundamentals

[![Prompt Engineering Fundamentals](../../images/04-lesson-banner.png?WT.mc_id=academic-105485-koreyst)](https://learn.microsoft.com/_themes/docs.theme/master/en-us/_themes/global/video-embed.html?id=d54c0c69-b183-4a6c-80ed-8b1a8f299cff?WT.mc_id=academic-105485-koreyst)
[![Prompt Engineering Fundamentals](../../images/04-lesson-banner.png?WT.mc_id=academic-105485-koreyst)](https://learn.microsoft.com/_themes/docs.theme/master/_themes/global/video-embed.html?id=d54c0c69-b183-4a6c-80ed-8b1a8f299cff?WT.mc_id=academic-105485-koreyst)

The way you write your prompt for the LLM matters. A carefully crafted prompt can achieve a better result than one that is not. But what are these concepts of prompt and prompt engineering, and how can I improve what I send to the LLM? Questions like these are what this chapter and the next seek to answer.

Expand Down
2 changes: 1 addition & 1 deletion 05-advanced-prompts/README.md
@@ -615,7 +615,7 @@ Please attempt to solve the assignment by adding suitable prompts to the code.
> [!TIP]
> When you phrase a prompt asking for improvements, it's a good idea to limit how many improvements you ask for. You can also ask for improvement in a specific area, for example architecture, performance, security, etc.
[Solution](./python/aoai-solution.py)
[Solution](./python/aoai-solution.py?WT.mc_id=academic-105485-koreyst)

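The tip above can be sketched as a small prompt-building helper. This is an illustrative sketch, not the solution file; the function name and defaults are invented:

```python
# Hypothetical sketch of the tip above: ask for a bounded number of
# improvements in a specific area rather than an open-ended "improve this".
def improvement_prompt(code: str, focus: str = "performance", max_items: int = 3) -> str:
    return (
        f"Suggest at most {max_items} improvements to the following code, "
        f"focusing on {focus}:\n\n{code}"
    )

print(improvement_prompt("def add(a, b): return a + b", focus="security"))
```

Bounding the number of suggestions keeps the model's response focused and easy to review.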
## Knowledge check

Expand Down
8 changes: 4 additions & 4 deletions 16-open-source-models/README.md
@@ -30,12 +30,12 @@ For this lesson, we will refer to the models as "open models" going forward as t

**Cost** - The cost per token for using and deploying these models is lower than that of proprietary models. When building Generative AI applications, you should weigh performance against price for these models on your use case.

![Model Cost](./images/model-price.png)
![Model Cost](./images/model-price.png?WT.mc_id=academic-105485-koreyst)
Source: Artificial Analysis
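The performance-versus-price comparison comes down to simple per-token arithmetic. The sketch below uses made-up placeholder prices, not figures from Artificial Analysis or any provider:

```python
# Illustrative sketch only: the per-1K-token prices below are invented
# placeholders, not real quotes from any model provider.
def request_cost(tokens_in: int, tokens_out: int,
                 usd_in_per_1k: float, usd_out_per_1k: float) -> float:
    """Estimated USD cost of one request: input and output tokens priced separately."""
    return tokens_in / 1000 * usd_in_per_1k + tokens_out / 1000 * usd_out_per_1k

# Comparing a hypothetical open model against a hypothetical proprietary one
# for the same request shape (2000 input tokens, 500 output tokens):
open_model = request_cost(2000, 500, 0.0002, 0.0002)
proprietary = request_cost(2000, 500, 0.0030, 0.0060)
print(open_model, proprietary)
```

Multiplying the per-request cost by expected traffic gives the monthly figure to weigh against any quality gap.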

**Flexibility** - Working with open models enables you to be flexible in terms of using different models or combining them. An example of this is [HuggingChat Assistants](https://huggingface.co/chat?WT.mc_id=academic-105485-koreyst), where users can select the model being used directly in the user interface:

![Choose Model](./images/choose-model.png)
![Choose Model](./images/choose-model.png?WT.mc_id=academic-105485-koreyst)

## Exploring Different Open Models

@@ -63,8 +63,8 @@ There is no one answer for choosing an open model. A good place to start is by u

When looking to compare LLMs across the different types, [Artificial Analysis](https://artificialanalysis.ai/?WT.mc_id=academic-105485-koreyst) is another great resource:

![Model Quality](./images/model-quality.png)
Source: Artifical Anayslsis
![Model Quality](./images/model-quality.png?WT.mc_id=academic-105485-koreyst)
Source: Artificial Analysis

If working on a specific use case, searching for fine-tuned versions that are focused on the same area can be effective. Experimenting with multiple open models to see how they perform according to your and your users' expectations is another good practice.

Expand Down
12 changes: 9 additions & 3 deletions 18-fine-tuning/README.md
@@ -17,6 +17,12 @@ By the end of this lesson, you should be able to answer the following questions:

Ready? Let's get started.

## Illustrated Guide

Want to get the big picture of what we'll cover before we dive in? Check out this illustrated guide that describes the learning journey for this lesson - from learning the core concepts and motivation for fine-tuning, to understanding the process and best practices for executing the fine-tuning task. This is a fascinating topic for exploration, so don't forget to check out the [Resources](./RESOURCES.md?WT.mc_id=academic-105485-koreyst) page for additional links to support your self-guided learning journey!

![Illustrated Guide to Fine Tuning Language Models](./img/18-fine-tuning-sketchnote.png?WT.mc_id=academic-105485-koreyst)

## What is fine-tuning for language models?

By definition, large language models are _pre-trained_ on large quantities of text sourced from diverse sources including the internet. As we've learned in previous lessons, we need techniques like _prompt engineering_ and _retrieval-augmented generation_ to improve the quality of the model's responses to the user's questions ("prompts").
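Before the tutorials below, it helps to see what fine-tuning training data typically looks like. The sketch below shows the JSONL format used by OpenAI-style chat fine-tuning (one JSON object per line, each a short example conversation); the "recipe assistant" content is an invented example in the spirit of the OpenAI tutorial listed later:

```python
# Sketch of one line of an OpenAI-style chat fine-tuning JSONL file.
# The conversation content is a made-up example, not from the tutorials below.
import json

example = {"messages": [
    {"role": "system", "content": "You are a recipe assistant."},
    {"role": "user", "content": "How do I make pancakes?"},
    {"role": "assistant", "content": "Mix flour, eggs, and milk, then fry small rounds of batter."},
]}

# Each training example becomes one line in the .jsonl file:
jsonl_line = json.dumps(example)
print(jsonl_line)
```

A training file is simply many such lines, each demonstrating the behavior you want the fine-tuned model to learn.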
@@ -70,7 +76,7 @@ The following resources provide step-by-step tutorials to walk you through a rea
| Provider | Tutorial | Description |
| ------------- | ------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------ | ---------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------- |
| OpenAI | [How to fine-tune chat models](https://github.com/openai/openai-cookbook/blob/main/examples/How_to_finetune_chat_models.ipynb?WT.mc_id=academic-105485-koreyst) | Learn to fine-tune a `gpt-35-turbo` for a specific domain ("recipe assistant") by preparing training data, running the fine-tuning job, and using the fine-tuned model for inference. |
| Azure Open AI | [GPT 3.5 Turbo fine-tuning tutorial](https://learn.microsoft.com/en-us/azure/ai-services/openai/tutorials/fine-tune?tabs=python-new%2Ccommand-line?WT.mc_id=academic-105485-koreyst) | Learn to fine-tune a `gpt-35-turbo-0613` model **on Azure** by taking steps to create & upload training data, run the fine-tuning job. Deploy & use the new model. |
| Azure Open AI | [GPT 3.5 Turbo fine-tuning tutorial](https://learn.microsoft.com/azure/ai-services/openai/tutorials/fine-tune?tabs=python-new%2Ccommand-line?WT.mc_id=academic-105485-koreyst) | Learn to fine-tune a `gpt-35-turbo-0613` model **on Azure**: create and upload training data, run the fine-tuning job, then deploy and use the fine-tuned model. |
| Hugging Face | [Fine-tuning LLMs with Hugging Face](https://www.philschmid.de/fine-tune-llms-in-2024-with-trl?WT.mc_id=academic-105485-koreyst) | This blog post walks you through fine-tuning an _open LLM_ (ex: `CodeLlama 7B`) using the [transformers](https://huggingface.co/docs/transformers/index?WT.mc_id=academic-105485-koreyst) library & [Transformer Reinforcement Learning (TRL)](https://huggingface.co/docs/trl/index?WT.mc_id=academic-105485-koreyst) with open [datasets](https://huggingface.co/docs/datasets/index?WT.mc_id=academic-105485-koreyst) on Hugging Face. |
| | | |

@@ -82,6 +88,6 @@ Select one of the tutorials above and walk through them. _We may replicate a ver

After completing this lesson, check out our [Generative AI Learning collection](https://aka.ms/genai-collection?WT.mc_id=academic-105485-koreyst) to continue leveling up your Generative AI knowledge!

Congratulations!! You have completed the final lesson from the v2 series for this course! Don't stop learning and building. \*\*Check out the [RESOURCES](RESOURCES.md) page for a list of additional suggestions for just this topic.
Congratulations! You have completed the final lesson from the v2 series for this course! Don't stop learning and building. Check out the [RESOURCES](RESOURCES.md?WT.mc_id=academic-105485-koreyst) page for a list of additional suggestions for just this topic.

Our v1 series of lessons have also been updated with more assignments and concepts. So take a minute to refresh your knowledge - and please [share your questions and feedback](https://github.com/microsoft/generative-ai-for-beginners/issues) to help us improve these lessons for the community.
Our v1 series of lessons has also been updated with more assignments and concepts. So take a minute to refresh your knowledge - and please [share your questions and feedback](https://github.com/microsoft/generative-ai-for-beginners/issues?WT.mc_id=academic-105485-koreyst) to help us improve these lessons for the community.