LlamaChatContentFormatter

class langchain_community.chat_models.azureml_endpoint.LlamaChatContentFormatter

Deprecated: kept for backwards compatibility.

Chat content formatter for Llama.

Attributes

SUPPORTED_ROLES

accepts

The MIME type of the response data returned from the endpoint.

content_type

The MIME type of the input data passed to the endpoint.

format_error_msg

supported_api_types

Supported APIs for the given formatter.
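
Example (a minimal sketch that only inspects the formatter's attributes; the printed values depend on the installed langchain-community version):

    from langchain_community.chat_models.azureml_endpoint import LlamaChatContentFormatter

    formatter = LlamaChatContentFormatter()
    print(formatter.content_type)         # MIME type sent with the request body
    print(formatter.accepts)              # MIME type expected back from the endpoint
    print(formatter.supported_api_types)  # Azure ML endpoint API types this formatter supports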

Methods

__init__()

escape_special_characters(prompt)

Escapes any special characters in the prompt.

format_messages_request_payload(messages, ...)

Formats the request according to the chosen API.

format_request_payload(prompt, model_kwargs)

Formats the request body according to the input schema of the model.

format_response_payload(output[, api_type])

Formats the response.
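
Example (a minimal usage sketch; the endpoint URL and API key are placeholders for a deployed Azure ML online endpoint, and this deprecated formatter is shown only for backwards compatibility):

    from langchain_community.chat_models.azureml_endpoint import (
        AzureMLChatOnlineEndpoint,
        AzureMLEndpointApiType,
        LlamaChatContentFormatter,
    )
    from langchain_core.messages import HumanMessage

    chat = AzureMLChatOnlineEndpoint(
        endpoint_url="https://<your-endpoint>.<region>.inference.ml.azure.com/score",
        endpoint_api_type=AzureMLEndpointApiType.dedicated,
        endpoint_api_key="<your-api-key>",
        content_formatter=LlamaChatContentFormatter(),
    )
    response = chat.invoke([HumanMessage(content="What does a content formatter do?")])
    print(response.content)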

__init__() → None
Return type:

None

static escape_special_characters(prompt: str) → str

Escapes any special characters in the prompt.

Parameters:

prompt (str) – The prompt string to escape.

Return type:

str
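
Example (the exact escaping rules are an implementation detail; this only shows the call shape of the static method):

    from langchain_community.chat_models.azureml_endpoint import LlamaChatContentFormatter

    raw = 'She said "hello",\nthen left.'
    escaped = LlamaChatContentFormatter.escape_special_characters(raw)
    print(escaped)  # quotes, newlines, and similar characters are escaped for the JSON payload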

format_messages_request_payload(messages: List[BaseMessage], model_kwargs: Dict, api_type: AzureMLEndpointApiType) → bytes

Formats the request according to the chosen API.

Parameters:

messages (List[BaseMessage]) – The chat messages to format into the request body.

model_kwargs (Dict) – Extra model parameters to include in the payload.

api_type (AzureMLEndpointApiType) – The endpoint API type the payload targets.

Return type:

bytes
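
Example (an offline sketch that only builds the request bytes without calling any endpoint; the model_kwargs values are illustrative):

    from langchain_community.chat_models.azureml_endpoint import (
        AzureMLEndpointApiType,
        LlamaChatContentFormatter,
    )
    from langchain_core.messages import HumanMessage, SystemMessage

    formatter = LlamaChatContentFormatter()
    payload = formatter.format_messages_request_payload(
        messages=[
            SystemMessage(content="You are a terse assistant."),
            HumanMessage(content="What is Azure Machine Learning?"),
        ],
        model_kwargs={"temperature": 0.2, "max_new_tokens": 128},
        api_type=AzureMLEndpointApiType.dedicated,
    )
    print(payload)  # JSON-encoded bytes ready to POST to the endpoint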

format_request_payload(prompt: str, model_kwargs: Dict, api_type: AzureMLEndpointApiType = AzureMLEndpointApiType.dedicated) → Any

Formats the request body according to the input schema of the model. Returns bytes or a seekable file-like object in the format specified in the content_type request header.

Parameters:

prompt (str) – The prompt string to format into the request body.

model_kwargs (Dict) – Extra model parameters to include in the payload.

api_type (AzureMLEndpointApiType) – The endpoint API type the payload targets.

Return type:

Any

format_response_payload(output: bytes, api_type: AzureMLEndpointApiType = AzureMLEndpointApiType.dedicated) → ChatGeneration

Formats the response.

Parameters:

output (bytes) – The raw response body returned by the endpoint.

api_type (AzureMLEndpointApiType) – The endpoint API type the response came from.

Return type:

ChatGeneration
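
Example (an offline sketch; the mocked response body assumes a dedicated endpoint returning a JSON object with an "output" field, which is an illustrative assumption rather than a guaranteed schema):

    import json

    from langchain_community.chat_models.azureml_endpoint import (
        AzureMLEndpointApiType,
        LlamaChatContentFormatter,
    )

    formatter = LlamaChatContentFormatter()
    # Mocked endpoint response; the {"output": ...} shape is assumed for illustration.
    mock_output = json.dumps({"output": "Azure ML serves deployed models."}).encode()
    generation = formatter.format_response_payload(
        output=mock_output,
        api_type=AzureMLEndpointApiType.dedicated,
    )
    print(generation.message.content)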