forked from QwenLM/Qwen2.5
Merge pull request QwenLM#128 from bug-orz/main
Add LlamaIndex.rst
Showing 8 changed files with 701 additions and 19 deletions.
@@ -0,0 +1,65 @@
# SOME DESCRIPTIVE TITLE.
# Copyright (C) 2024, Qwen Team
# This file is distributed under the same license as the Qwen package.
# FIRST AUTHOR <EMAIL@ADDRESS>, 2024.
#
#, fuzzy
msgid ""
msgstr ""
"Project-Id-Version: Qwen \n"
"Report-Msgid-Bugs-To: \n"
"POT-Creation-Date: 2024-03-18 18:18+0800\n"
"PO-Revision-Date: YEAR-MO-DA HO:MI+ZONE\n"
"Last-Translator: FULL NAME <EMAIL@ADDRESS>\n"
"Language: zh_CN\n"
"Language-Team: zh_CN <[email protected]>\n"
"Plural-Forms: nplurals=1; plural=0;\n"
"MIME-Version: 1.0\n"
"Content-Type: text/plain; charset=utf-8\n"
"Content-Transfer-Encoding: 8bit\n"
"Generated-By: Babel 2.14.0\n"

#: ../../source/framework/Langchain.rst:2 a17522f19b824ce78e0cabb8f5fba043
msgid "Using Langchain to Retrieval"
msgstr ""

#: ../../source/framework/Langchain.rst:4 fdea42f976c34db3b503c9d134e26427
msgid ""
"This guide helps you to build a question-answering application based on a"
" local knowledge base using ``Qwen1.5-7B-Chat`` with ``langchain``. The "
"goal is to establish a knowledge base Q&A solution that is friendly to "
"many scenarios and open-source models, and that can run offline."
msgstr "本教程旨在帮助您利用``Qwen1.5-7B-Chat``与``langchain``,基于本地知识库构建问答应用。目标是建立一个适用于多种场景的、友好的、基于开源模型的知识库问答解决方案,且能够离线运行。"

#: ../../source/framework/Langchain.rst:10 be3ee3fc413b441e806681bf4c321cc2
msgid "Basic Usage"
msgstr "基础用法"

#: ../../source/framework/Langchain.rst:12 46a7949d695548f3a5a4820e1804415a
msgid ""
"You can just use your document with ``langchain`` to build a question-"
"answering application. The implementation process of this project "
"includes loading files -> reading text -> segmenting text -> vectorizing "
"text -> vectorizing questions -> matching the top k most similar text "
"vectors with the question vectors -> incorporating the matched text as "
"context along with the question into the prompt -> submitting to the "
"Qwen1.5-7B-Chat to generate an answer. Below is an example:"
msgstr "您可以仅使用您的文档配合``langchain``来构建一个问答应用。该项目的实现流程包括加载文件 -> 阅读文本 -> 文本分段 -> 文本向量化 -> 问题向量化 -> 将最相似的前k个文本向量与问题向量匹配 -> 将匹配的文本作为上下文连同问题一起纳入提示 -> 提交给Qwen1.5-7B-Chat生成答案。以下是一个示例:"
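The actual code example lives in the referenced Langchain.rst, not in this translation catalog. As a self-contained illustration of the flow the entry above describes (segment text, vectorize chunks and the question, match top-k chunks, fold them into the prompt), here is a minimal sketch; the bag-of-words "embedding" and all names are stand-ins, not the tutorial's real langchain code:

```python
# Toy sketch of the retrieval flow: chunk vectors + question vector ->
# top-k match -> prompt assembly. A real pipeline would use an embedding
# model and a vector store instead of bag-of-words counts.
import math
import re
from collections import Counter

def embed(text: str) -> Counter:
    """Toy 'embedding': lowercase bag-of-words counts."""
    return Counter(re.findall(r"[\w.\-]+", text.lower()))

def cosine(a: Counter, b: Counter) -> float:
    dot = sum(a[t] * b[t] for t in a)
    na = math.sqrt(sum(v * v for v in a.values()))
    nb = math.sqrt(sum(v * v for v in b.values()))
    return dot / (na * nb) if na and nb else 0.0

def top_k_chunks(question: str, chunks: list[str], k: int = 2) -> list[str]:
    """Match the top-k most similar text chunks to the question vector."""
    q = embed(question)
    return sorted(chunks, key=lambda c: cosine(q, embed(c)), reverse=True)[:k]

def build_prompt(question: str, chunks: list[str]) -> str:
    """Incorporate matched chunks as context; the result would go to the chat model."""
    context = "\n".join(top_k_chunks(question, chunks))
    return f"Context:\n{context}\n\nQuestion: {question}\nAnswer:"

chunks = [
    "Qwen1.5 is a series of open-source chat models.",
    "The capital of France is Paris.",
    "Langchain helps build question-answering applications.",
]
prompt = build_prompt("What is Qwen1.5?", chunks)
```

The final step, submitting `prompt` to Qwen1.5-7B-Chat, is exactly where the real tutorial plugs in the model.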
#: ../../source/framework/Langchain.rst:92 12a2e495b2d94489ab7ac846c51a8011
msgid ""
"After load the Qwen1.5-7B-Chat model, you should specify txt file that "
"needs retrieval for knowledge-based Q&A."
msgstr "加载Qwen1.5-7B-Chat模型后,您可以指定需要用于知识库问答的txt文件。"

#: ../../source/framework/Langchain.rst:248 5ffc73b03a8246059d17e8581ca7c93c
msgid "Next Step"
msgstr "下一步"

#: ../../source/framework/Langchain.rst:250 96f3640a21fa44a4992ba90024be61b6
msgid ""
"Now you can chat with Qwen1.5 use your own document. Continue to read the"
" documentation and try to figure out more advanced usages of model "
"retrieval!"
msgstr "现在,您可以在您自己的文档上与Qwen1.5进行交流。继续阅读文档,尝试探索模型检索的更多高级用法!"
@@ -0,0 +1,129 @@
# SOME DESCRIPTIVE TITLE.
# Copyright (C) 2024, Qwen Team
# This file is distributed under the same license as the Qwen package.
# FIRST AUTHOR <EMAIL@ADDRESS>, 2024.
#
#, fuzzy
msgid ""
msgstr ""
"Project-Id-Version: Qwen \n"
"Report-Msgid-Bugs-To: \n"
"POT-Creation-Date: 2024-03-18 18:47+0800\n"
"PO-Revision-Date: YEAR-MO-DA HO:MI+ZONE\n"
"Last-Translator: FULL NAME <EMAIL@ADDRESS>\n"
"Language: zh_CN\n"
"Language-Team: zh_CN <[email protected]>\n"
"Plural-Forms: nplurals=1; plural=0;\n"
"MIME-Version: 1.0\n"
"Content-Type: text/plain; charset=utf-8\n"
"Content-Transfer-Encoding: 8bit\n"
"Generated-By: Babel 2.14.0\n"

#: ../../source/framework/LlamaIndex.rst:2 fd8cb627291b4337a5343514cb875ce3
msgid "LlamaIndex"
msgstr "LlamaIndex"

#: ../../source/framework/LlamaIndex.rst:4 3814141ca63942118893f4f20a549b28
msgid ""
"To connect Qwen1.5. with external data, such as documents, web pages, "
"etc., we offer a tutorial on `LlamaIndex <https://www.llamaindex.ai/>`__."
" This guide helps you quickly implement retrieval-augmented generation "
"(RAG) using LlamaIndex with Qwen1.5."
msgstr "为了实现 Qwen1.5 与外部数据(例如文档、网页等)的连接,我们提供了 `LlamaIndex <https://www.llamaindex.ai/>`__ 的详细教程。本指南旨在帮助用户利用 LlamaIndex 与 Qwen1.5 快速部署检索增强生成(RAG)技术。"

#: ../../source/framework/LlamaIndex.rst:8 9d50f10bb8b2432785823a536b89b902
msgid "Preparation"
msgstr "环境准备"

#: ../../source/framework/LlamaIndex.rst:10 fbb180e615474ca49345c4e6ace1bb74
msgid ""
"To implement RAG, we advise you to install the LlamaIndex-related "
"packages first."
msgstr "为实现检索增强生成(RAG),我们建议您首先安装与 LlamaIndex 相关的软件包。"

#: ../../source/framework/LlamaIndex.rst:13 6b09fffd6967464fb87848e130421243
msgid "The following is a simple code snippet showing how to do this:"
msgstr "以下是一个简单的代码示例:"

#: ../../source/framework/LlamaIndex.rst:22 34c29b67ddcb4d5997c429bf7a6ecee8
msgid "Set Parameters"
msgstr "设置参数"

#: ../../source/framework/LlamaIndex.rst:24 0e6d3c9deb384da3b328fae2199c0239
msgid ""
"Now we can set up LLM, embedding model, and the related configurations. "
"Qwen1.5-Chat supports conversations in multiple languages, including "
"English and Chinese. You can use the ``bge-base-en-v1.5`` model to "
"retrieve from English documents, and you can download the ``bge-base-"
"zh-v1.5`` model to retrieve from Chinese documents. You can also choose "
"``bge-large`` or ``bge-small`` as the embedding model or modify the "
"context window size or text chunk size depending on your computing "
"resources. Qwen 1.5 model families support a maximum of 32K context "
"window size."
msgstr "现在,我们可以设置语言模型和向量模型。Qwen1.5-Chat支持包括英语和中文在内的多种语言对话。您可以使用``bge-base-en-v1.5``模型来检索英文文档,下载``bge-base-zh-v1.5``模型以检索中文文档。根据您的计算资源,您还可以选择``bge-large``或``bge-small``作为向量模型,或调整上下文窗口大小或文本块大小。Qwen 1.5模型系列支持最大32K上下文窗口大小。"

#: ../../source/framework/LlamaIndex.rst:82 1243ca1a38c34d1797bae5518410374b
msgid "Build Index"
msgstr "构建索引"
#: ../../source/framework/LlamaIndex.rst:84 5fba4b593336404199cb633cb37290ea
msgid "Now we can build index from documents or websites."
msgstr "现在我们可以从文档或网站构建索引。"

#: ../../source/framework/LlamaIndex.rst:86 6824493c7a0b4bc593374050a816b1bf
msgid ""
"The following code snippet demonstrates how to build an index for files "
"(regardless of whether they are in PDF or TXT format) in a local folder "
"named 'document'."
msgstr "以下代码片段展示了如何为本地名为'document'的文件夹中的文件(无论是PDF格式还是TXT格式)构建索引。"

#: ../../source/framework/LlamaIndex.rst:99 ba2dd451ed214fce8b5dcd7d2605edad
msgid ""
"The following code snippet demonstrates how to build an index for the "
"content in a list of websites."
msgstr "以下代码片段展示了如何为一系列网站的内容构建索引。"

#: ../../source/framework/LlamaIndex.rst:115 0caf80e9b56344ba8551739c2c6e36d1
msgid "To save and load the index, you can use the following code snippet."
msgstr "要保存和加载已构建的索引,您可以使用以下代码示例。"
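The save/load snippet these entries point to is in LlamaIndex.rst itself. As a self-contained stand-in for the pattern — persist a built index to disk once, then reload it instead of re-indexing — this sketch serializes a toy chunk index to JSON; LlamaIndex's real persistence API (a storage context) differs, and all names here are illustrative:

```python
# Toy "persist and reload the index" pattern: write once, load on restart.
import json
import tempfile
from pathlib import Path

def save_index(chunks: list[str], path: Path) -> None:
    # Serialize the toy index so later runs can skip re-indexing.
    path.write_text(json.dumps({"chunks": chunks}), encoding="utf-8")

def load_index(path: Path) -> list[str]:
    # Restore the previously persisted index from disk.
    return json.loads(path.read_text(encoding="utf-8"))["chunks"]

index_path = Path(tempfile.mkdtemp()) / "index.json"
save_index(["doc one", "doc two"], index_path)
restored = load_index(index_path)
```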
#: ../../source/framework/LlamaIndex.rst:129 9d6e83c2ffbd407cb88c383d88a396da
msgid "RAG"
msgstr "检索增强(RAG)"

#: ../../source/framework/LlamaIndex.rst:131 49c900e2323d4e3ca121fe25bdc30b38
msgid ""
"Now you can perform queries, and Qwen1.5 will answer based on the content"
" of the indexed documents."
msgstr "现在您可以输入查询,Qwen1.5 将基于索引文档的内容提供答案。"

#~ msgid ""
#~ "To connect Qwen1.5. with external data,"
#~ " such as documents, web pages, etc.,"
#~ " we recommend using `LlamaIndex "
#~ "<https://www.llamaindex.ai/>`__. This guide helps"
#~ " you quickly implement retrieval-augmented"
#~ " generation (RAG) using LlamaIndex with "
#~ "Qwen1.5."
#~ msgstr ""

#~ msgid ""
#~ "Now we can set up LLM, embedding"
#~ " model, and the related configurations. "
#~ "Qwen1.5-Chat supports conversations in "
#~ "multiple languages, including English and "
#~ "Chinese. We recommend using the "
#~ "``bge-base-en-v1.5`` model to retrieve "
#~ "from English documents, and you can "
#~ "download the ``bge-base-zh-v1.5`` model"
#~ " to retrieve from Chinese documents. "
#~ "You can also choose ``bge-large`` "
#~ "or ``bge-small`` as the embedding "
#~ "model or modify the context window "
#~ "size or text chunk size depending "
#~ "on your computing resources. Qwen 1.5"
#~ " model families support a maximum of"
#~ " 32K context window size."
#~ msgstr ""
@@ -0,0 +1,35 @@
# SOME DESCRIPTIVE TITLE.
# Copyright (C) 2024, Qwen Team
# This file is distributed under the same license as the Qwen package.
# FIRST AUTHOR <EMAIL@ADDRESS>, 2024.
#
#, fuzzy
msgid ""
msgstr ""
"Project-Id-Version: Qwen \n"
"Report-Msgid-Bugs-To: \n"
"POT-Creation-Date: 2024-03-18 18:18+0800\n"
"PO-Revision-Date: YEAR-MO-DA HO:MI+ZONE\n"
"Last-Translator: FULL NAME <EMAIL@ADDRESS>\n"
"Language: zh_CN\n"
"Language-Team: zh_CN <[email protected]>\n"
"Plural-Forms: nplurals=1; plural=0;\n"
"MIME-Version: 1.0\n"
"Content-Type: text/plain; charset=utf-8\n"
"Content-Transfer-Encoding: 8bit\n"
"Generated-By: Babel 2.14.0\n"

#: ../../source/framework/function_call.rst:2 1d7dd558cf874ea5abe483ea0b2a69f4
msgid "Function Calling"
msgstr "函数调用"

#: ../../source/framework/function_call.rst:4 7611540418aa4051beedae463ed1fdba
msgid ""
"We offer a wrapper for function calling over the dashscope API and the "
"OpenAI API in `Qwen-Agent <https://github.com/QwenLM/Qwen-Agent>`__."
msgstr "在 `Qwen-Agent <https://github.com/QwenLM/Qwen-Agent>`__ 中,我们提供了一个专用封装器,旨在实现通过 dashscope API 与 OpenAI API 进行的函数调用。"

#: ../../source/framework/function_call.rst:8 ab450be5b90d45809e1450be5e648062
msgid "Use Case"
msgstr "使用示例"
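The use case itself is in function_call.rst. To make the concept concrete, here is a hypothetical sketch of the loop such a wrapper automates: the model emits a function-call payload, the client dispatches it to a registered Python tool, and the tool's result is fed back. The tool name and payload shape are illustrative, not the Qwen-Agent or OpenAI wire format:

```python
# Minimal tool registry + dispatcher for a model-emitted function call.
import json

TOOLS = {}

def register_tool(name):
    """Decorator that records a callable under a tool name."""
    def wrap(fn):
        TOOLS[name] = fn
        return fn
    return wrap

@register_tool("get_current_weather")
def get_current_weather(location: str) -> str:
    # A real tool would query a weather service; this returns canned data.
    return f"Sunny in {location}"

def dispatch(function_call: dict) -> str:
    """Route a {'name': ..., 'arguments': json-string} payload to its tool."""
    fn = TOOLS[function_call["name"]]
    args = json.loads(function_call["arguments"])
    return fn(**args)

result = dispatch({"name": "get_current_weather",
                   "arguments": json.dumps({"location": "Beijing"})})
```

In practice the dispatch result would be appended to the conversation as a function/tool message so the model can compose its final answer.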
@@ -0,0 +1,57 @@
# SOME DESCRIPTIVE TITLE.
# Copyright (C) 2024, Qwen Team
# This file is distributed under the same license as the Qwen package.
# FIRST AUTHOR <EMAIL@ADDRESS>, 2024.
#
#, fuzzy
msgid ""
msgstr ""
"Project-Id-Version: Qwen \n"
"Report-Msgid-Bugs-To: \n"
"POT-Creation-Date: 2024-03-18 18:18+0800\n"
"PO-Revision-Date: YEAR-MO-DA HO:MI+ZONE\n"
"Last-Translator: FULL NAME <EMAIL@ADDRESS>\n"
"Language: zh_CN\n"
"Language-Team: zh_CN <[email protected]>\n"
"Plural-Forms: nplurals=1; plural=0;\n"
"MIME-Version: 1.0\n"
"Content-Type: text/plain; charset=utf-8\n"
"Content-Transfer-Encoding: 8bit\n"
"Generated-By: Babel 2.14.0\n"

#: ../../source/framework/qwen_agent.rst:2 53aa75a8daf2481cbeef43478a28e8a0
msgid "Qwen-Agent"
msgstr "Qwen-Agent"

#: ../../source/framework/qwen_agent.rst:4 8d7f40e131654f2789d382223137d371
msgid ""
"`Qwen-Agent <https://github.com/QwenLM/Qwen-Agent>`__ is a framework for "
"developing LLM applications based on the instruction following, tool "
"usage, planning, and memory capabilities of Qwen. It also comes with "
"example applications such as Browser Assistant, Code Interpreter, and "
"Custom Assistant."
msgstr "`Qwen-Agent <https://github.com/QwenLM/Qwen-Agent>`__ 是一个基于 Qwen 的指令跟随、工具使用、计划和记忆能力来开发 LLM 应用程序的框架。它还附带了一些示例应用程序,例如浏览器助手、代码解释器和自定义助手。"

#: ../../source/framework/qwen_agent.rst:11 b070f438dec14dbf817b1d4fb84799b4
msgid "Installation"
msgstr "安装"

#: ../../source/framework/qwen_agent.rst:20 16ce1e0a61514df786d75fd335d96013
msgid "Developing Your Own Agent"
msgstr "开发您自己的智能体"

#: ../../source/framework/qwen_agent.rst:22 e84fb00cbe94490f8d3ae7d78debf7bd
msgid ""
"Qwen-Agent provides atomic components such as LLMs and prompts, as well "
"as high-level components such as Agents. The example below uses the "
"Assistant component as an illustration, demonstrating how to add custom "
"tools and quickly develop an agent that uses tools."
msgstr "Qwen-Agent 提供包括语言模型和提示词等原子级组件,及智能体等高级组件在内的多种组件。以下示例选取助理组件进行展示,阐述了如何整合自定义工具以及如何迅速开发出一个能够应用这些工具的代理程序。"
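The Assistant example referenced above lives in qwen_agent.rst. As a minimal, hypothetical sketch of the composition pattern it describes — an agent assembled from atomic parts (an LLM callable, a tool registry, and message memory) — here is a toy assistant; the class and method names are illustrative and are not the Qwen-Agent API:

```python
# Toy agent composed from atomic components: pluggable LLM, tools, memory.
from typing import Callable

class ToyAssistant:
    def __init__(self, llm: Callable[[list[dict]], str],
                 tools: dict[str, Callable[[str], str]]):
        self.llm = llm
        self.tools = tools
        self.memory: list[dict] = []  # running message history

    def run(self, user_msg: str) -> str:
        self.memory.append({"role": "user", "content": user_msg})
        # Naive tool routing: a "toolname: arg" message invokes that tool;
        # anything else goes to the LLM with the full history.
        if ":" in user_msg and user_msg.split(":", 1)[0] in self.tools:
            name, arg = user_msg.split(":", 1)
            reply = self.tools[name](arg.strip())
        else:
            reply = self.llm(self.memory)
        self.memory.append({"role": "assistant", "content": reply})
        return reply

def echo_llm(messages: list[dict]) -> str:
    # Stub LLM: echoes the latest user message.
    return f"(echo) {messages[-1]['content']}"

bot = ToyAssistant(echo_llm, {"upper": str.upper})
```

Swapping `echo_llm` for a real model client and the string router for model-driven tool selection is exactly the gap the framework's high-level components fill.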
#: ../../source/framework/qwen_agent.rst:91 a6dcb5bae37748f78b738bcf10ed049d
msgid ""
"The framework also provides more atomic components for developers to "
"combine. For additional showcases, please refer to `examples "
"<https://github.com/QwenLM/Qwen-Agent/tree/main/examples>`__."
msgstr "该框架还为开发者提供了更多的原子组件以供组合使用。欲了解更多示例,请参见 `examples <https://github.com/QwenLM/Qwen-Agent/tree/main/examples>`__。"