Reference: Response

Response

Response schema.

class llama_index.response.schema.Response(response: Optional[str], source_nodes: List[NodeWithScore] = <factory>, extra_info: Optional[Dict[str, Any]] = None)

Response object.

Returned if streaming=False.

response

The response text.

Type: Optional[str]

get_formatted_sources(length: int = 100) → str

Get formatted sources text.
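
A minimal usage sketch for the Response object. It assumes a query engine has already been built from an index elsewhere in the docs (the index/engine construction and the query text below are illustrative only, and the exact construction API may differ across LlamaIndex versions):

# Assumes `index` was built elsewhere; names here are placeholders.
query_engine = index.as_query_engine()  # streaming defaults to False
response = query_engine.query("What did the author do growing up?")

# `response` is a llama_index.response.schema.Response
print(response.response)                            # Optional[str]: the answer text
print(response.get_formatted_sources(length=200))   # truncated source snippets

# Inspect the retrieved nodes and their scores directly
for node_with_score in response.source_nodes:
    print(node_with_score.score, node_with_score.node.get_text()[:80])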

class llama_index.response.schema.StreamingResponse(response_gen: Optional[Generator], source_nodes: List[NodeWithScore] = <factory>, extra_info: Optional[Dict[str, Any]] = None, response_txt: Optional[str] = None)

StreamingResponse object.

Returned if streaming=True.

response_gen

The response generator.

Type: Optional[Generator]

get_formatted_sources(length: int = 100) → str

Get formatted sources text.

get_response() → Response

Get a standard response object.

print_response_stream() → None

Print the response stream.
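
A minimal sketch of consuming a StreamingResponse. It assumes a query engine created with streaming enabled (the engine construction and query text are placeholders, not part of this schema reference):

# Assumes `index` was built elsewhere; names here are placeholders.
query_engine = index.as_query_engine(streaming=True)
streaming_response = query_engine.query("Summarize the document.")

# Print tokens to stdout as they arrive
streaming_response.print_response_stream()

# Alternatively, consume the generator manually instead of printing:
# for token in streaming_response.response_gen:
#     print(token, end="")

# Once the stream has been consumed, materialize a standard Response
response = streaming_response.get_response()
print(response.get_formatted_sources())

Note that response_gen is a single-use generator: either print the stream or iterate it yourself, then call get_response() to obtain the accumulated text and sources.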