Chat Engines

A chat engine is a high-level interface for having a conversation with your data (multiple back-and-forth exchanges instead of a single question and answer).
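As a minimal sketch (assuming a llama_index release where index.as_chat_engine() is available; the ./data path and questions are illustrative), a chat engine can be built from an index and then queried across several turns:

    from llama_index import SimpleDirectoryReader, VectorStoreIndex

    # Build an index over local documents (path is illustrative).
    documents = SimpleDirectoryReader("./data").load_data()
    index = VectorStoreIndex.from_documents(documents)

    # Create a chat engine that keeps conversation state across turns.
    chat_engine = index.as_chat_engine()
    print(chat_engine.chat("What does the document say about pricing?"))
    # The follow-up question is answered using the prior turn as context.
    print(chat_engine.chat("How does that compare to the previous year?"))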

Chat Engine Implementations

Below we show the concrete chat engine implementations.

Chat Engine Types

class llama_index.chat_engine.types.BaseChatEngine

Base Chat Engine.

abstract async achat(message: str) -> Union[Response, StreamingResponse]

Async version of main chat interface.

abstract chat(message: str) -> Union[Response, StreamingResponse]

Main chat interface.

chat_repl() -> None

Enter interactive chat REPL.

abstract reset() -> None

Reset conversation state.
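Putting the interface together (a sketch that assumes chat_engine was obtained from an index as in the example above): chat handles a single turn, achat is its async counterpart, reset clears the conversation state, and chat_repl starts an interactive terminal loop.

    # Each call to chat() adds to the conversation history.
    response = chat_engine.chat("Summarize the document in two sentences.")
    print(response)

    # Discard the accumulated history before changing topics.
    chat_engine.reset()

    # Async variant of the main chat interface (inside an async function):
    # response = await chat_engine.achat("Summarize the document.")

    # Enter an interactive read-eval-print loop in the terminal.
    chat_engine.chat_repl()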

class llama_index.chat_engine.types.ChatMode(value)

Chat Engine Modes.

CONDENSE_QUESTION = 'condense_question'

Corresponds to CondenseQuestionChatEngine.

First generates a standalone question from the conversation context and the last message, then queries the query engine for a response (see the usage sketch below).

REACT = 'react'

Corresponds to ReActChatEngine.

Uses a ReAct agent loop with query engine tools, implemented via a LangChain agent.

SIMPLE = 'simple'

Corresponds to SimpleChatEngine.

Chat with the LLM, without making use of a knowledge base.
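A mode is usually chosen when the chat engine is constructed from an index, either via the ChatMode enum or its string value (a sketch assuming the index-level as_chat_engine helper, which accepts a chat_mode argument, and the index built in the first example):

    from llama_index.chat_engine.types import ChatMode

    # Retrieval-augmented chat: condense the history and new message into a
    # standalone question, then query the underlying query engine.
    condense_engine = index.as_chat_engine(chat_mode=ChatMode.CONDENSE_QUESTION)

    # Agent-style chat: ReAct loop over query engine tools.
    react_engine = index.as_chat_engine(chat_mode="react")

    # Plain LLM chat with no knowledge base.
    simple_engine = index.as_chat_engine(chat_mode=ChatMode.SIMPLE)

    print(condense_engine.chat("What were the key findings?"))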