Simple Chat Engine

class llama_index.chat_engine.simple.SimpleChatEngine(service_context: ServiceContext, prompt: Prompt, chat_history: List[Tuple[str, str]])

Simple Chat Engine.

Have a conversation with the LLM. This does not make use of a knowledge base.
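
A minimal usage sketch follows. It assumes an LLM is already configured for the default ServiceContext (for example via an OpenAI API key in the environment); the prompts and variable names are illustrative, not part of the API.

```python
from llama_index.chat_engine.simple import SimpleChatEngine

# Build an engine from defaults; the default ServiceContext supplies the LLM.
chat_engine = SimpleChatEngine.from_defaults()

# Each call continues the same conversation; state accumulates between calls.
response = chat_engine.chat("Tell me a joke.")
print(response)

response = chat_engine.chat("Explain why that joke is funny.")
print(response)
```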

async achat(message: str) → Union[Response, StreamingResponse]

Async version of the main chat interface.
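
A sketch of calling achat from an asyncio event loop, under the same assumptions as above (a configured default LLM); the coroutine name and prompt are illustrative.

```python
import asyncio

from llama_index.chat_engine.simple import SimpleChatEngine


async def main() -> None:
    # achat mirrors chat() but awaits the underlying LLM call.
    chat_engine = SimpleChatEngine.from_defaults()
    response = await chat_engine.achat("Summarize the rules of chess in one sentence.")
    print(response)


asyncio.run(main())
```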

chat(message: str) → Union[Response, StreamingResponse]

Main chat interface.

chat_repl() → None

Enter interactive chat REPL.
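
For interactive experimentation, the REPL can be started directly from a terminal session, as in this brief sketch.

```python
from llama_index.chat_engine.simple import SimpleChatEngine

# Opens an interactive loop that reads prompts from stdin and prints responses.
SimpleChatEngine.from_defaults().chat_repl()
```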

classmethod from_defaults(service_context: Optional[ServiceContext] = None, prompt: Optional[Prompt] = None, chat_history: Optional[List[Tuple[str, str]]] = None, **kwargs: Any) → SimpleChatEngine

Initialize a SimpleChatEngine from default parameters.
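
The sketch below seeds the engine with prior turns via chat_history. It assumes each tuple is a (user message, assistant message) pair; the example conversation is illustrative.

```python
from llama_index.chat_engine.simple import SimpleChatEngine

# Assumed here: each tuple is a prior (user, assistant) exchange.
history = [
    ("Hi, my name is Ada.", "Hello Ada! How can I help you today?"),
]

chat_engine = SimpleChatEngine.from_defaults(chat_history=history)
print(chat_engine.chat("What is my name?"))
```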

reset() → None

Reset conversation state.
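
Resetting clears the accumulated conversation so the next chat() starts fresh. The sketch below only illustrates the effect; the specific prompts are illustrative.

```python
from llama_index.chat_engine.simple import SimpleChatEngine

chat_engine = SimpleChatEngine.from_defaults()
chat_engine.chat("Remember the number 42.")

# Drop the conversation history; earlier turns no longer inform the next reply.
chat_engine.reset()
print(chat_engine.chat("What number did I ask you to remember?"))
```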