Prerequisites
Before proceeding, ensure the following prerequisites are met:

- Install MindsDB locally via Docker or Docker Desktop.
- To use LangChain within MindsDB, install the required dependencies following this instruction.
- Obtain the API key for a selected model provider that you want to use through LangChain.

Available models include the following:
- OpenAI (how to get the API key)
- Anthropic (how to get the API key)
- Google (how to get the API key)
- Ollama (how to download Ollama)
- LiteLLM (use the API key of the model used via LiteLLM)
- MindsDB (use any model created within MindsDB)

Setup
Create an AI engine from the LangChain handler.
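
A minimal sketch of the engine creation statement follows, assuming the handler is registered under the name langchain; the engine name langchain_engine is the one referenced throughout this section.

```sql
-- Create an ML engine from the LangChain handler.
CREATE ML_ENGINE langchain_engine
FROM langchain;
```
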
This handler supports tracing features for LangChain via LangFuse. To use it, provide the following parameters in the USING clause: langfuse_host, langfuse_public_key, langfuse_secret_key.
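
As a sketch, these tracing parameters sit alongside the model's other parameters in the USING clause; the model name and the placeholder values below are assumptions, not values from this guide.

```sql
-- Sketch: enabling LangFuse tracing on a model (model name and values are placeholders).
CREATE MODEL traced_langchain_model
PREDICT answer
USING
    engine = 'langchain_engine',
    prompt_template = 'Answer the user input in a helpful way',
    -- provider-specific parameters (provider, API key, model name) would also go here
    langfuse_host = 'https://cloud.langfuse.com',         -- assumed host value
    langfuse_public_key = 'your-langfuse-public-key',
    langfuse_secret_key = 'your-langfuse-secret-key';
```
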
This agent utilizes three different tools:
- MindsDB is the internal MindsDB executor.
- Metadata fetches the metadata for the available tables.
- Write writes agent responses into a MindsDB data source.

Create a model using langchain_engine as an engine and a selected model provider.
The supported providers are OpenAI, Anthropic, Ollama, LiteLLM, Google, and MindsDB.
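
For illustration, a model backed by the OpenAI provider might be created as sketched below; the model name, column names, and parameter values are assumptions, and other providers follow the same pattern with their own provider value and API key parameter.

```sql
-- Sketch: a conversational model using the OpenAI provider (names and values are illustrative).
CREATE MODEL langchain_openai_model
PREDICT answer
USING
    engine = 'langchain_engine',      -- the engine created in the Setup step
    provider = 'openai',
    openai_api_key = 'your-openai-api-key',
    model_name = 'gpt-4',
    mode = 'conversational',
    user_column = 'question',
    assistant_column = 'answer',
    prompt_template = 'Answer the user input in a helpful way';
```
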
Usage
The following usage examples utilize langchain_engine to create a model with the CREATE MODEL statement.
Create a model that will be used to ask questions.
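
A minimal sketch follows, assuming a model named tool_based_agent that answers free-form questions; the name, prompt, and question text are illustrative.

```sql
-- Sketch: a question-answering model on the LangChain engine (name and prompt are illustrative).
CREATE MODEL tool_based_agent
PREDICT completion
USING
    engine = 'langchain_engine',
    -- provider parameters (provider, API key) as shown in the Setup section
    prompt_template = 'Answer the user input in a helpful way';
```

Once the model is created, it can be queried with a standard SELECT statement, passing the question in the WHERE clause:

```sql
-- Sketch: ask the model a question; the question text is a placeholder.
SELECT question, completion
FROM tool_based_agent
WHERE question = 'How many tables are available in my connected database?';
```
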
Next Steps
Go to the Use Cases section to see more examples.