MindsDB provides the `LLM()` function that lets users incorporate LLM-generated output directly into data queries.
The `LLM()` function requires a large language model, which can be defined via the `default_llm` parameter in the MindsDB configuration file. The following model providers are supported:
- OpenAI
- Anthropic
- LiteLLM
- Ollama
- Nvidia NIMs
Use the `LLM()` function to ask a question and get an answer.
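For example, a standalone question can be passed as the prompt. This is a minimal sketch; the single-argument `LLM('<prompt>')` call and the `answer` alias are illustrative:

```sql
-- Ask a one-off question; the model's reply is returned as a single value.
SELECT LLM('How many planets are there in the solar system?') AS answer;
```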
Use the `LLM()` function with your data to swiftly complete tasks such as text generation or summarization.
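For example, the prompt can be built from column values so the model processes each row. This sketch assumes a hypothetical `example_db.user_comments` table with a `comment` column and uses `CONCAT()` to assemble the per-row prompt:

```sql
-- Summarize each comment; the table and column names are hypothetical.
SELECT
  comment,
  LLM(CONCAT('Summarize the following comment in one sentence: ', comment)) AS summary
FROM example_db.user_comments;
```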