llms

LLM classes provide access to the large language model (LLM) APIs and services.

Class hierarchy:

BaseLanguageModel --> BaseLLM --> LLM --> <name>  # Examples: AI21, HuggingFaceHub, OpenAI
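
A minimal sketch of where a concrete <name> class sits in this hierarchy: it subclasses LLM and implements _llm_type and _call. EchoLLM below is a hypothetical toy class, not one shipped in this module.

from typing import Any, List, Optional

from langchain_core.callbacks import CallbackManagerForLLMRun
from langchain_core.language_models.llms import LLM


class EchoLLM(LLM):
    """Hypothetical toy LLM that echoes the prompt back."""

    @property
    def _llm_type(self) -> str:
        return "echo"

    def _call(
        self,
        prompt: str,
        stop: Optional[List[str]] = None,
        run_manager: Optional[CallbackManagerForLLMRun] = None,
        **kwargs: Any,
    ) -> str:
        # A real integration would call its provider's API here.
        return prompt


llm = EchoLLM()
print(llm.invoke("Hello"))  # -> "Hello"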

Main helpers:

LLMResult, PromptValue,
CallbackManagerForLLMRun, AsyncCallbackManagerForLLMRun,
CallbackManager, AsyncCallbackManager,
AIMessage, BaseMessage
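
A brief sketch of the main helpers in use, driven by the in-module FakeListLLM so it runs without credentials (assuming langchain-community is installed):

from langchain_community.llms.fake import FakeListLLM

llm = FakeListLLM(responses=["canned answer"])
result = llm.generate(["Tell me a joke"])  # returns an LLMResult
print(result.generations[0][0].text)       # -> "canned answer"
print(result.llm_output)                   # provider-specific metadata; None for the fake model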

Classes

llms.ai21.AI21

AI21 large language models.

llms.ai21.AI21PenaltyData

Parameters for AI21 penalty data.

llms.aleph_alpha.AlephAlpha

Aleph Alpha large language models.

llms.amazon_api_gateway.AmazonAPIGateway

Amazon API Gateway to access LLM models hosted on AWS.

llms.amazon_api_gateway.ContentHandlerAmazonAPIGateway()

Adapter to prepare inputs from LangChain into the format that the LLM model expects.

llms.anyscale.Anyscale

Anyscale large language models.

llms.aphrodite.Aphrodite

Aphrodite language model.

llms.arcee.Arcee

Arcee's Domain Adapted Language Models (DALMs).

llms.aviary.Aviary

Aviary hosted models.

llms.aviary.AviaryBackend(backend_url, bearer)

Aviary backend.

llms.azureml_endpoint.AzureMLBaseEndpoint

Azure ML Online Endpoint models.

llms.azureml_endpoint.AzureMLEndpointApiType(value)

Azure ML endpoints API types.

llms.azureml_endpoint.AzureMLEndpointClient(...)

AzureML Managed Endpoint client.

llms.azureml_endpoint.AzureMLOnlineEndpoint

Azure ML Online Endpoint models.

llms.azureml_endpoint.ContentFormatterBase()

Transform request and response of AzureML endpoint to match with required schema.

llms.azureml_endpoint.CustomOpenAIContentFormatter()

Content formatter for models that use an OpenAI-like API scheme.

llms.azureml_endpoint.DollyContentFormatter()

Content handler for the Dolly-v2-12b model.

llms.azureml_endpoint.GPT2ContentFormatter()

Content handler for GPT2.

llms.azureml_endpoint.HFContentFormatter()

Content handler for LLMs from the HuggingFace catalog.

llms.azureml_endpoint.LlamaContentFormatter()

Deprecated: Kept for backwards compatibility.

llms.azureml_endpoint.OSSContentFormatter()

Deprecated: Kept for backwards compatibility.

llms.baichuan.BaichuanLLM

Baichuan large language models.

llms.baidu_qianfan_endpoint.QianfanLLMEndpoint

Baidu Qianfan completion model integration.

llms.bananadev.Banana

Banana large language models.

llms.baseten.Baseten

Baseten model.

llms.beam.Beam

Beam API for the gpt2 large language model.

llms.bedrock.BedrockBase

Base class for Bedrock models.

llms.bedrock.LLMInputOutputAdapter()

Adapter class to prepare inputs from LangChain into the format that the LLM model expects.

llms.bigdl_llm.BigdlLLM

Wrapper around the BigdlLLM model.

llms.bittensor.NIBittensorLLM

NIBittensor LLMs.

llms.cerebriumai.CerebriumAI

CerebriumAI large language models.

llms.chatglm.ChatGLM

ChatGLM LLM service.

llms.chatglm3.ChatGLM3

ChatGLM3 LLM service.

llms.clarifai.Clarifai

Clarifai large language models.

llms.cloudflare_workersai.CloudflareWorkersAI

Cloudflare Workers AI service.

llms.ctransformers.CTransformers

C Transformers LLM models.

llms.ctranslate2.CTranslate2

CTranslate2 language model.

llms.databricks.Databricks

Databricks serving endpoint or a cluster driver proxy app serving an LLM.

llms.deepinfra.DeepInfra

DeepInfra models.

llms.deepsparse.DeepSparse

Neural Magic DeepSparse LLM interface.

llms.edenai.EdenAI

EdenAI models.

llms.exllamav2.ExLlamaV2

ExLlamaV2 API.

llms.fake.FakeListLLM

Fake LLM for testing purposes.

llms.fake.FakeStreamingListLLM

Fake streaming list LLM for testing purposes.
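
A hedged sketch of using these fakes in tests: FakeListLLM returns its canned responses in order, and FakeStreamingListLLM additionally streams each response chunk by chunk (behavior assumed from the current test utilities).

from langchain_community.llms.fake import FakeListLLM, FakeStreamingListLLM

llm = FakeListLLM(responses=["first", "second"])
assert llm.invoke("any prompt") == "first"
assert llm.invoke("any prompt") == "second"

streaming_llm = FakeStreamingListLLM(responses=["hello"])
chunks = list(streaming_llm.stream("any prompt"))
print("".join(chunks))  # -> "hello"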

llms.forefrontai.ForefrontAI

ForefrontAI large language models.

llms.friendli.BaseFriendli

Base class of Friendli.

llms.friendli.Friendli

Friendli LLM.

llms.gigachat.GigaChat

GigaChat large language models API.

llms.gooseai.GooseAI

GooseAI large language models.

llms.gpt4all.GPT4All

GPT4All language models.

llms.gradient_ai.GradientLLM

Gradient.ai LLM Endpoints.

llms.gradient_ai.TrainResult

Train result.

llms.human.HumanInputLLM

User input as the response.

llms.ipex_llm.IpexLLM

IpexLLM model.

llms.javelin_ai_gateway.JavelinAIGateway

Javelin AI Gateway LLMs.

llms.javelin_ai_gateway.Params

Parameters for the Javelin AI Gateway LLM.

llms.koboldai.KoboldApiLLM

Kobold API language model.

llms.konko.Konko

Konko AI models.

llms.layerup_security.LayerupSecurity

Layerup Security LLM service.

llms.llamacpp.LlamaCpp

llama.cpp model.

llms.llamafile.Llamafile

Llamafile lets you distribute and run large language models with a single file.

llms.manifest.ManifestWrapper

HazyResearch's Manifest library.

llms.minimax.Minimax

Minimax large language models.

llms.minimax.MinimaxCommon

Common parameters for Minimax large language models.

llms.mlflow.Mlflow

MLflow LLM service.

llms.mlflow_ai_gateway.MlflowAIGateway

MLflow AI Gateway LLMs.

llms.mlflow_ai_gateway.Params

Parameters for the MLflow AI Gateway LLM.

llms.mlx_pipeline.MLXPipeline

MLX Pipeline API.

llms.modal.Modal

Modal large language models.

llms.moonshot.Moonshot

Moonshot large language models.

llms.moonshot.MoonshotCommon

Common parameters for Moonshot LLMs.

llms.mosaicml.MosaicML

MosaicML LLM service.

llms.nlpcloud.NLPCloud

NLPCloud large language models.

llms.oci_data_science_model_deployment_endpoint.OCIModelDeploymentLLM

Base class for LLM deployed on OCI Data Science Model Deployment.

llms.oci_data_science_model_deployment_endpoint.OCIModelDeploymentTGI

OCI Data Science Model Deployment TGI Endpoint.

llms.oci_data_science_model_deployment_endpoint.OCIModelDeploymentVLLM

VLLM deployed on OCI Data Science Model Deployment.

llms.oci_generative_ai.CohereProvider()

llms.oci_generative_ai.MetaProvider()

llms.oci_generative_ai.OCIAuthType(value[, ...])

OCI authentication types as enumerator.

llms.oci_generative_ai.OCIGenAI

OCI large language models.

llms.oci_generative_ai.OCIGenAIBase

Base class for OCI GenAI models.

llms.oci_generative_ai.Provider()

llms.octoai_endpoint.OctoAIEndpoint

OctoAI LLM Endpoints - OpenAI compatible.

llms.ollama.Ollama

Ollama locally runs large language models.

llms.ollama.OllamaEndpointNotFoundError

Raised when the Ollama endpoint is not found.

llms.opaqueprompts.OpaquePrompts

LLM that uses OpaquePrompts to sanitize prompts.

llms.openai.BaseOpenAI

Base OpenAI large language model class.

llms.openllm.IdentifyingParams

Parameters for identifying a model as a typed dict.

llms.openllm.OpenLLM

OpenLLM, supporting both in-process model instance and remote OpenLLM servers.

llms.openlm.OpenLM

OpenLM models.

llms.pai_eas_endpoint.PaiEasEndpoint

LangChain LLM class for accessing LLM services hosted on Alibaba Cloud PAI-EAS.

llms.petals.Petals

Petals Bloom models.

llms.pipelineai.PipelineAI

PipelineAI large language models.

llms.predibase.Predibase

Use your Predibase models with LangChain.

llms.predictionguard.PredictionGuard

Prediction Guard large language models.

llms.promptlayer_openai.PromptLayerOpenAI

PromptLayer OpenAI large language models.

llms.promptlayer_openai.PromptLayerOpenAIChat

PromptLayer OpenAI large language models.

llms.replicate.Replicate

Replicate models.

llms.rwkv.RWKV

RWKV language models.

llms.sagemaker_endpoint.ContentHandlerBase()

Handler class to transform LLM input into the format that the SageMaker endpoint expects.

llms.sagemaker_endpoint.LLMContentHandler()

Content handler for LLM class.

llms.sagemaker_endpoint.LineIterator(stream)

Parse the byte stream input.

llms.sagemaker_endpoint.SagemakerEndpoint

Sagemaker Inference Endpoint models.

llms.sambanova.SSEndpointHandler(host_url, ...)

SambaNova Systems Interface for SambaStudio model endpoints.

llms.sambanova.SVEndpointHandler(host_url)

SambaNova Systems Interface for Sambaverse endpoint.

llms.sambanova.SambaStudio

SambaStudio large language models.

llms.sambanova.Sambaverse

Sambaverse large language models.

llms.self_hosted.SelfHostedPipeline

Model inference on self-hosted remote hardware.

llms.self_hosted_hugging_face.SelfHostedHuggingFaceLLM

HuggingFace Pipeline API to run on self-hosted remote hardware.

llms.solar.Solar

Solar large language models.

llms.solar.SolarCommon

Common configuration for Solar LLMs.

llms.sparkllm.SparkLLM

iFlyTek Spark completion model integration.

llms.stochasticai.StochasticAI

StochasticAI large language models.

llms.symblai_nebula.Nebula

Nebula Service models.

llms.textgen.TextGen

Text generation models served by text-generation-webui.

llms.titan_takeoff.Device(value[, names, ...])

The device to use for inference: cuda or cpu.

llms.titan_takeoff.ReaderConfig

Configuration for the reader to be deployed in Titan Takeoff API.

llms.titan_takeoff.TitanTakeoff

Titan Takeoff API LLMs.

llms.tongyi.Tongyi

Tongyi completion model integration.

llms.vllm.VLLM

VLLM language model.

llms.vllm.VLLMOpenAI

vLLM OpenAI-compatible API client.

llms.volcengine_maas.VolcEngineMaasBase

Base class for VolcEngineMaas models.

llms.volcengine_maas.VolcEngineMaasLLM

VolcEngine MaaS hosts a plethora of models.

llms.weight_only_quantization.WeightOnlyQuantPipeline

Weight-only quantized model.

llms.writer.Writer

Writer large language models.

llms.xinference.Xinference

Xinference large-scale model inference service.

llms.yandex.YandexGPT

Yandex large language models.

llms.yi.YiLLM

Yi large language models.

llms.you.You

Wrapper around You.com's conversational Smart and Research APIs.

llms.yuan2.Yuan2

Yuan2.0 language models.

Functions

llms.anyscale.create_llm_result(choices, ...)

Create the LLMResult from the choices and prompts.

llms.anyscale.update_token_usage(keys, ...)

Update token usage.

llms.aviary.get_completions(model, prompt[, ...])

Get completions from Aviary models.

llms.aviary.get_models()

List available models.

llms.cohere.acompletion_with_retry(llm, **kwargs)

Use tenacity to retry the completion call.

llms.cohere.completion_with_retry(llm, **kwargs)

Use tenacity to retry the completion call.

llms.databricks.get_default_api_token()

Get the default Databricks personal access token.

llms.databricks.get_default_host()

Get the default Databricks workspace hostname.

llms.databricks.get_repl_context()

Get the notebook REPL context if running inside a Databricks notebook.

llms.fireworks.acompletion_with_retry(llm, ...)

Use tenacity to retry the completion call.

llms.fireworks.acompletion_with_retry_batching(...)

Use tenacity to retry the completion call.

llms.fireworks.acompletion_with_retry_streaming(...)

Use tenacity to retry the completion call for streaming.

llms.fireworks.completion_with_retry(llm, ...)

Use tenacity to retry the completion call.

llms.fireworks.completion_with_retry_batching(...)

Use tenacity to retry the completion call.

llms.fireworks.conditional_decorator(...)

Conditionally apply a decorator.

llms.google_palm.completion_with_retry(llm, ...)

Use tenacity to retry the completion call.

llms.koboldai.clean_url(url)

Remove trailing slash and /api from url if present.

llms.layerup_security.default_guardrail_violation_handler(...)

Default guardrail violation handler.

llms.loading.load_llm(file, **kwargs)

Load LLM from a file.

llms.loading.load_llm_from_config(config, ...)

Load LLM from Config Dict.
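
A minimal sketch of the loading helpers, assuming the target LLM class supports serialization via its save method (the file name is illustrative):

from langchain_community.llms.loading import load_llm

# Assuming "my_llm.json" was previously produced by calling .save("my_llm.json")
# on a serializable LLM instance.
llm = load_llm("my_llm.json")

load_llm_from_config works the same way but takes the already-parsed config dict instead of a file path.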

llms.openai.acompletion_with_retry(llm[, ...])

Use tenacity to retry the async completion call.

llms.openai.completion_with_retry(llm[, ...])

Use tenacity to retry the completion call.

llms.openai.update_token_usage(keys, ...)

Update token usage.

llms.symblai_nebula.completion_with_retry(...)

Use tenacity to retry the completion call.

llms.symblai_nebula.make_request(self, prompt)

Generate text from the model.

llms.tongyi.agenerate_with_last_element_mark(...)

Yield elements from an async iterable, each paired with a boolean indicating whether it is the last element.

llms.tongyi.astream_generate_with_retry(llm, ...)

Async version of stream_generate_with_retry.

llms.tongyi.check_response(resp)

Check the response from the completion call.

llms.tongyi.generate_with_last_element_mark(...)

Yield elements from an iterable, each paired with a boolean indicating whether it is the last element.

llms.tongyi.generate_with_retry(llm, **kwargs)

Use tenacity to retry the completion call.

llms.tongyi.stream_generate_with_retry(llm, ...)

Use tenacity to retry the completion call.

llms.utils.enforce_stop_tokens(text, stop)

Cut off the text as soon as any stop words occur.
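
A small sketch of what this utility does (the sample strings are illustrative):

from langchain_community.llms.utils import enforce_stop_tokens

text = "Answer: 42\nObservation: everything after the stop sequence is discarded"
# Stop sequences are matched as a regex alternation, so regex metacharacters
# in stop strings are treated specially.
print(enforce_stop_tokens(text, ["\nObservation:"]))  # -> "Answer: 42"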

llms.vertexai.acompletion_with_retry(llm, prompt)

Use tenacity to retry the completion call.

llms.vertexai.completion_with_retry(llm, prompt)

Use tenacity to retry the completion call.

llms.vertexai.is_codey_model(model_name)

Return True if the model name is a Codey model.

llms.vertexai.is_gemini_model(model_name)

Return True if the model name is a Gemini model.

llms.yandex.acompletion_with_retry(llm, **kwargs)

Use tenacity to retry the async completion call.

llms.yandex.completion_with_retry(llm, **kwargs)

Use tenacity to retry the completion call.

Deprecated classes

llms.anthropic.Anthropic

Deprecated since version 0.0.28: Use langchain_anthropic.AnthropicLLM instead.

llms.bedrock.Bedrock

Deprecated since version 0.0.34: Use langchain_aws.BedrockLLM instead.

llms.cohere.BaseCohere

Deprecated since version 0.0.30: Use langchain_cohere.BaseCohere instead.

llms.cohere.Cohere

Deprecated since version 0.1.14: Use langchain_cohere.Cohere instead.

llms.fireworks.Fireworks

Deprecated since version 0.0.26: Use langchain_fireworks.Fireworks instead.

llms.google_palm.GooglePalm

Deprecated since version 0.0.12: Use langchain_google_genai.GoogleGenerativeAI instead.

llms.huggingface_endpoint.HuggingFaceEndpoint

Deprecated since version 0.0.37: Use langchain_huggingface.HuggingFaceEndpoint instead.

llms.huggingface_hub.HuggingFaceHub

Deprecated since version 0.0.21: Use langchain_huggingface.HuggingFaceEndpoint instead.

llms.huggingface_pipeline.HuggingFacePipeline

Deprecated since version 0.0.37: Use langchain_huggingface.HuggingFacePipeline instead.

llms.huggingface_text_gen_inference.HuggingFaceTextGenInference

Deprecated since version 0.0.21: Use langchain_huggingface.HuggingFaceEndpoint instead.

llms.openai.AzureOpenAI

Deprecated since version 0.0.10: Use langchain_openai.AzureOpenAI instead.

llms.openai.OpenAI

Deprecated since version 0.0.10: Use langchain_openai.OpenAI instead.

llms.openai.OpenAIChat

Deprecated since version 0.0.1: Use langchain_openai.ChatOpenAI instead.

llms.together.Together

Deprecated since version 0.0.12: Use langchain_together.Together instead.

llms.vertexai.VertexAI

Deprecated since version 0.0.12: Use langchain_google_vertexai.VertexAI instead.

llms.vertexai.VertexAIModelGarden

Deprecated since version 0.0.12: Use langchain_google_vertexai.VertexAIModelGarden instead.

llms.watsonxllm.WatsonxLLM

Deprecated since version 0.0.18: Use langchain_ibm.WatsonxLLM instead.