The LLM() function requires a large language model, which can be defined in the MindsDB configuration under default_llm or via environment variables, as shown below.
To use the LLM() function with MindsDB, choose one of the available model providers and define the following environment variables.
OpenAI
Here are the environment variables for the OpenAI provider:
Note that the values stored in the environment variables are specific to each provider.
MindsDB
Here are the environment variables for the MindsDB provider:
LLM_FUNCTION_MODEL_NAME
LLM_FUNCTION_PROJECT_NAME
To use MindsDB as a provider, create a model in a project within MindsDB and use its name in the LLM_FUNCTION_MODEL_NAME environment variable and the project name in the LLM_FUNCTION_PROJECT_NAME environment variable.
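For illustration, here is a minimal sketch of that setup; it assumes a project named my_project and an OpenAI ML engine named openai_engine already exist, and the model name, engine name, and prompt are placeholders rather than required values.

-- Create a model inside my_project; its name goes into LLM_FUNCTION_MODEL_NAME.
CREATE MODEL my_project.my_llm_model
PREDICT answer
USING
    engine = 'openai_engine',
    model_name = 'gpt-4o',
    prompt_template = 'Answer the following question: {{question}}';

With this model created, set LLM_FUNCTION_MODEL_NAME to my_llm_model and LLM_FUNCTION_PROJECT_NAME to my_project.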
You can use the LLM() function to simply ask a question and get an answer.
SELECT LLM('How many planets are there in the solar system?');
Here is the output:
+------------------------------------------+
| llm                                      |
+------------------------------------------+
| There are 8 planets in the solar system. |
+------------------------------------------+
Moreover, you can use the LLM() function with your data to swiftly complete tasks such as text generation or summarization.
SELECT comment, LLM('Describe the comment''s category in one word: ' || comment) AS category
FROM example_db.user_comments;
Here is the output:
+--------------------------+----------+
| comment                  | category |
+--------------------------+----------+
| I hate tacos             | Dislike  |
| I want to dance          | Desire   |
| Baking is not a big deal | Opinion  |
+--------------------------+----------+
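To persist such results, one option is MindsDB's CREATE TABLE syntax for writing query output back to a connected data source; the sketch below assumes that example_db supports table creation and that comment_categories is a new, hypothetical table name.

-- Store each comment with its generated category in a new table.
CREATE TABLE example_db.comment_categories (
    SELECT comment,
           LLM('Describe the comment''s category in one word: ' || comment) AS category
    FROM example_db.user_comments
);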