MindsDB provides the LLM() function, which lets users incorporate LLM-generated output directly into their data queries.

Prerequisites

The LLM() function requires a large language model, which can be defined in the following ways:
  • By setting the default_llm parameter in the MindsDB configuration file (see the sketch after this list).
  • By saving the default model in the MindsDB Editor under Settings.
  • By defining the environment variables shown below for one of the available model providers.
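For the first option, the default model can be declared in the MindsDB configuration file rather than via environment variables. The snippet below is a minimal sketch only: the default_llm field names used here (provider, model_name, api_key) and the placeholder key are assumptions, so check the configuration reference for your MindsDB version for the exact layout.
{
    "default_llm": {
        "provider": "openai",
        "model_name": "gpt-4o",
        "api_key": "openai-api-key"
    }
}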
OpenAI-compatible model providers can be used in the same way as OpenAI models. There are a number of OpenAI-compatible model providers, including OpenRouter and vLLM. To use models via these providers, users need to define the base URL and the API key of the provider. Here is an example of using OpenRouter.
LLM_FUNCTION_MODEL_NAME = "mistralai/devstral-small-2505"
LLM_FUNCTION_BASE_URL = "https://openrouter.ai/api/v1"
OPENAI_API_KEY = "openrouter-api-key"
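A self-hosted vLLM server can be used in the same way through its OpenAI-compatible endpoint. The values below are a sketch: the base URL assumes a vLLM server running locally on its default port, the model name must match the model the server was started with, and the API key is a placeholder unless the server enforces one.
LLM_FUNCTION_MODEL_NAME = "mistralai/Mistral-7B-Instruct-v0.3"
LLM_FUNCTION_BASE_URL = "http://localhost:8000/v1"
OPENAI_API_KEY = "any-value"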

Usage

You can use the LLM() function to simply ask a question and get an answer.
SELECT LLM('How many planets are there in the solar system?');
Here is the output:
+------------------------------------------+
| llm                                      |
+------------------------------------------+
| There are 8 planets in the solar system. |
+------------------------------------------+
Moreover, you can use the LLM() function with your data to swiftly complete tasks such as text generation or summarization.
SELECT
    comment,
    LLM('Describe the comment''s category in one word: ' || comment) AS category
FROM example_db.user_comments;
Here is the output:
+--------------------------+----------+
| comment                  | category |
+--------------------------+----------+
| I hate tacos             | Dislike  |
| I want to dance          | Desire   |
| Baking is not a big deal | Opinion  |
+--------------------------+----------+
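As another sketch, the prompt can be changed to produce summaries instead of categories. The query below reuses the example_db.user_comments table from above; the exact wording of the results depends on the configured model.
SELECT
    comment,
    LLM('Summarize the following comment in five words or fewer: ' || comment) AS summary
FROM example_db.user_comments;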