Prompt Templates

These are the reference prompt templates.

We first show links to the default prompts.

We then show the base prompt class, derived from `LangChain <https://langchain.readthedocs.io/en/latest/modules/prompt.html>`_.

Default Prompts

The list of default prompts can be found `here <https://github.com/jerryjliu/llama_index/blob/main/llama_index/prompts/default_prompts.py>`_.

NOTE: we also curate a set of refined prompts for ChatGPT use cases. The list of ChatGPT refine prompts can be found `here <https://github.com/jerryjliu/llama_index/blob/main/llama_index/prompts/chat_prompts.py>`_.

Prompts

Subclasses of the base prompt class.

llama_index.prompts.prompts.KeywordExtractPrompt

Keyword extract prompt.

Prompt to extract keywords from a text text with a maximum of max_keywords keywords.

Required template variables: text, max_keywords

llama_index.prompts.prompts.KnowledgeGraphPrompt

Define the knowledge graph triplet extraction prompt.

llama_index.prompts.prompts.PandasPrompt

Pandas prompt. Convert query to python code.

Required template variables: query_str, df_str, instruction_str.

llama_index.prompts.prompts.QueryKeywordExtractPrompt

Query keyword extract prompt.

Prompt to extract keywords from a query query_str with a maximum of max_keywords keywords.

Required template variables: query_str, max_keywords

llama_index.prompts.prompts.QuestionAnswerPrompt

Question Answer prompt.

Prompt to answer a question query_str given a context context_str.

Required template variables: context_str, query_str
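
To make "required template variables" concrete, here is a sketch of how a question-answer style template is filled. The template text itself is hypothetical (the actual default template shipped with llama_index differs); only the variable names match the entry above.

```python
# Illustration only: a hypothetical QA template using the required
# variables of QuestionAnswerPrompt ({context_str}, {query_str}).
qa_template = (
    "Context information is below.\n"
    "---------------------\n"
    "{context_str}\n"
    "---------------------\n"
    "Given the context information, answer the question: {query_str}\n"
)

# Filling both required variables yields the final prompt string.
filled = qa_template.format(
    context_str="LlamaIndex loads documents into an index.",
    query_str="What does LlamaIndex load documents into?",
)
print(filled)
```

A custom template string with the same two variables can be passed to QuestionAnswerPrompt to override the default.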

llama_index.prompts.prompts.RefinePrompt

Refine prompt.

Prompt to refine an existing answer existing_answer given a context context_msg, and a query query_str.

Required template variables: query_str, existing_answer, context_msg

llama_index.prompts.prompts.RefineTableContextPrompt

Refine Table context prompt.

Prompt to refine a table context given a table schema schema, as well as unstructured text context context_msg, and a task query_str. This includes both a high-level description of the table as well as a description of each column in the table.

llama_index.prompts.prompts.SchemaExtractPrompt

Schema extract prompt.

Prompt to extract schema from unstructured text text.

Required template variables: text, schema

llama_index.prompts.prompts.SimpleInputPrompt

Simple Input prompt.

Required template variables: query_str.

llama_index.prompts.prompts.SummaryPrompt

Summary prompt.

Prompt to summarize the provided context_str.

Required template variables: context_str

llama_index.prompts.prompts.TableContextPrompt

Table context prompt.

Prompt to generate a table context given a table schema schema, as well as unstructured text context context_str, and a task query_str. This includes both a high-level description of the table as well as a description of each column in the table.

llama_index.prompts.prompts.TextToSQLPrompt

Text to SQL prompt.

Prompt to translate a natural language query into SQL in the dialect dialect given a schema schema.

Required template variables: query_str, schema, dialect

llama_index.prompts.prompts.TreeInsertPrompt

Tree Insert prompt.

Prompt to insert a new chunk of text new_chunk_text into the tree index. More specifically, this prompt has the LLM select the relevant candidate child node to continue tree traversal.

Required template variables: num_chunks, context_list, new_chunk_text

llama_index.prompts.prompts.TreeSelectMultiplePrompt

Tree select multiple prompt.

Prompt to select multiple candidate child nodes out of all child nodes provided in context_list, given a query query_str. branching_factor refers to the number of child nodes to select, and num_chunks is the number of child nodes in context_list.

Required template variables: num_chunks, context_list, query_str, branching_factor

llama_index.prompts.prompts.TreeSelectPrompt

Tree select prompt.

Prompt to select a candidate child node out of all child nodes provided in context_list, given a query query_str. num_chunks is the number of child nodes in context_list.

Required template variables: num_chunks, context_list, query_str
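
To show how the tree-select variables fit together, here is a sketch of filling num_chunks, context_list, and query_str from candidate child-node summaries. The template wording and the numbering scheme are assumptions for illustration, not the actual llama_index template.

```python
# Sketch (assumed formatting): building the context_list and num_chunks
# variables of a tree-select style prompt from child-node summaries.
chunks = [
    "Node 1: overview of prompt templates",
    "Node 2: details of the base Prompt class",
    "Node 3: list of default prompts",
]

# Number each candidate child node so the LLM can refer to it by index.
context_list = "\n".join(f"({i}) {text}" for i, text in enumerate(chunks, start=1))

tree_select_template = (
    "Some choices are given below ({num_chunks} in total):\n"
    "{context_list}\n"
    "Return the single choice most relevant to the question: {query_str}\n"
)
prompt_text = tree_select_template.format(
    num_chunks=len(chunks),
    context_list=context_list,
    query_str="How does the base Prompt class work?",
)
print(prompt_text)
```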

Base Prompt Class

Prompt class.

class llama_index.prompts.Prompt(template: Optional[str] = None, langchain_prompt: Optional[BasePromptTemplate] = None, langchain_prompt_selector: Optional[ConditionalPromptSelector] = None, stop_token: Optional[str] = None, output_parser: Optional[BaseOutputParser] = None, prompt_type: str = PromptType.CUSTOM, metadata: Optional[Dict[str, Any]] = None, **prompt_kwargs: Any)

Prompt class for LlamaIndex.

Wrapper around langchain's prompt class. Adds ability to:
  • enforce certain prompt types

  • partially fill values

  • define stop token

format(llm: Optional[BaseLanguageModel] = None, **kwargs: Any) → str

Format the prompt.

classmethod from_langchain_prompt(prompt: BasePromptTemplate, **kwargs: Any) → Prompt

Load prompt from LangChain prompt.

classmethod from_langchain_prompt_selector(prompt_selector: ConditionalPromptSelector, **kwargs: Any) → Prompt

Load prompt from LangChain prompt selector.

classmethod from_prompt(prompt: Prompt, llm: Optional[BaseLanguageModel] = None, prompt_type: Optional[PromptType] = None) → Prompt

Create a prompt from an existing prompt.

Use case: If the existing prompt is already partially filled, and the remaining fields satisfy the requirements of the prompt class, then we can create a new prompt from the existing partially filled prompt.

get_langchain_prompt(llm: Optional[BaseLanguageModel] = None) → BasePromptTemplate

Get langchain prompt.

property original_template: str

Return the originally specified template, if supplied.

partial_format(**kwargs: Any) → Prompt

Format the prompt partially.

Return an instance of itself.
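
The semantics of partial_format can be sketched in plain Python. This mimics the behaviour described above with str.format-style templates; it is an illustration, not the actual llama_index implementation.

```python
# Minimal sketch of partial_format semantics: pre-fill some template
# variables now, supply the rest at format() time.
class SketchPrompt:
    def __init__(self, template: str, **partial_kwargs):
        self.template = template
        self.partial_kwargs = partial_kwargs

    def partial_format(self, **kwargs):
        # Return a new instance carrying the pre-filled values.
        return SketchPrompt(self.template, **{**self.partial_kwargs, **kwargs})

    def format(self, **kwargs):
        # Merge pre-filled and call-time values (call-time wins) and render.
        return self.template.format(**{**self.partial_kwargs, **kwargs})

prompt = SketchPrompt("Answer {query_str} using only: {context_str}")
partially_filled = prompt.partial_format(context_str="the indexed documents")
result = partially_filled.format(query_str="what is a Prompt?")
print(result)
```

Returning a new instance rather than mutating in place means a partially filled prompt can be reused safely across queries.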