LLM()

MindsDB provides the LLM() function, which lets users incorporate LLM-generated output directly into data queries.
Prerequisites
The LLM() function requires a large language model, which can be defined in the following ways:
- By setting the default_llm parameter in the MindsDB configuration file (see the sketch after this list).
- By saving the default model in the MindsDB Editor under Settings.
- By defining the environment variables as shown below, choosing one of the available model providers. Note that the values stored in the environment variables are specific to each provider.
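A minimal sketch of the first option, assuming the default_llm entry takes provider, model_name, and api_key fields (the exact schema is given in the MindsDB configuration reference), could look like:

```json
{
  "default_llm": {
    "provider": "openai",
    "model_name": "gpt-4o",
    "api_key": "sk-..."
  }
}
```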
OpenAI
Here are the environment variables for the OpenAI provider:
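The variable names below are an assumption (an illustrative LLM_FUNCTION_* pattern) rather than a verbatim listing; a minimal OpenAI setup might look like:

```bash
# Assumed variable names, shown for illustration; verify against the current reference.
export LLM_FUNCTION_MODEL_NAME="gpt-4o"   # OpenAI model to use
export LLM_FUNCTION_API_KEY="sk-..."      # your OpenAI API key
```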
Anthropic

Here are the environment variables for the Anthropic provider:
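Under the same assumed naming pattern, an Anthropic setup might look like:

```bash
# Assumed variable names; the key is issued in the Anthropic console.
export LLM_FUNCTION_MODEL_NAME="claude-3-5-sonnet-20240620"  # Anthropic model to use
export LLM_FUNCTION_API_KEY="sk-ant-..."                     # your Anthropic API key
```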
LiteLLM

Here are the environment variables for the LiteLLM provider:
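LiteLLM addresses models with provider-prefixed names; under the same assumed naming pattern:

```bash
# Assumed variable names; LiteLLM prefixes the model name with the target provider.
export LLM_FUNCTION_MODEL_NAME="anthropic/claude-3-5-sonnet-20240620"
export LLM_FUNCTION_API_KEY="..."   # API key of the underlying provider
```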
Ollama

Here are the environment variables for the Ollama provider:
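Ollama serves local models without an API key, listening on port 11434 by default; under the same assumed naming pattern:

```bash
# Assumed variable names; the URL points at a locally running Ollama server.
export LLM_FUNCTION_MODEL_NAME="llama3"               # a model pulled with `ollama pull llama3`
export LLM_FUNCTION_BASE_URL="http://localhost:11434" # default local Ollama endpoint
```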
Nvidia NIMs

Here are the environment variables for the Nvidia NIMs provider:
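NVIDIA NIMs expose an OpenAI-compatible endpoint; under the same assumed naming pattern:

```bash
# Assumed variable names; the base URL is NVIDIA's hosted NIM endpoint.
export LLM_FUNCTION_MODEL_NAME="meta/llama3-8b-instruct"            # model served by the NIM
export LLM_FUNCTION_API_KEY="nvapi-..."                             # NVIDIA API key
export LLM_FUNCTION_BASE_URL="https://integrate.api.nvidia.com/v1"  # hosted NIM endpoint
```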
OpenAI-compatible model providers can be used in the same way as OpenAI models. There are a number of OpenAI-compatible providers, including OpenRouter and vLLM. To use models via these providers, users need to define the provider's base URL and API key. Here is an example of using OpenRouter:
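Under the same assumed naming pattern, pointing the base URL at OpenRouter's OpenAI-compatible endpoint might look like:

```bash
# Assumed variable names; OpenRouter serves an OpenAI-compatible API at this URL.
export LLM_FUNCTION_MODEL_NAME="openai/gpt-4o"              # model ID as listed on OpenRouter
export LLM_FUNCTION_API_KEY="sk-or-..."                     # OpenRouter API key
export LLM_FUNCTION_BASE_URL="https://openrouter.ai/api/v1" # OpenAI-compatible endpoint
```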
Usage
You can use the LLM() function to simply ask a question and get an answer.
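For example, passing the question as a single string argument:

```sql
SELECT LLM('How many planets are there in the solar system?');
```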
You can also use the LLM() function with your data to swiftly complete tasks such as text generation or summarization.
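As an illustrative sketch (the example_db.user_comments table and its comment column are hypothetical, and standard || string concatenation is assumed), a per-row summarization could look like:

```sql
SELECT
    comment,
    LLM('Summarize this comment in one sentence: ' || comment) AS summary
FROM example_db.user_comments;
```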