LangChain Embedding
This documentation describes the integration of MindsDB with LangChain, a framework for developing applications powered by language models. The integration allows for the deployment of LangChain models within MindsDB, providing the models with access to data from various data sources.
Prerequisites
Before proceeding, ensure the following prerequisites are met:
- Install MindsDB locally via Docker or Docker Desktop.
- To use LangChain within MindsDB, install the required dependencies following this instruction.
Setup
Create an AI/ML engine from the LangChain Embedding handler.
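A minimal sketch of this step is shown below; it assumes the handler is registered as langchain_embedding, and the engine name embedding is illustrative (it matches the name used in the examples on this page).

```sql
-- Create an ML engine named `embedding` from the LangChain Embedding handler
CREATE ML_ENGINE embedding
FROM langchain_embedding;
```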
Create a model using embedding as an engine and providing your OpenAI API key.
The following are the required and optional parameters:
- engine is a required parameter. It defines the AI engine, as created with the CREATE ML_ENGINE statement, to be used.
- class is a required parameter. It defines the model provider, such as "OpenAI" or "HuggingFace".
- model is a required parameter. It defines the embedding model, such as text-embedding-3-small.
- openai_api_key is a required parameter when using OpenAI as a provider.
- input_columns is an optional parameter. It defines the column(s) to be processed by the embedding model.
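Putting these parameters together, a CREATE MODEL statement could look roughly as follows. This is a sketch: the model name, the PREDICT target, and the content input column are illustrative assumptions, not fixed names required by the handler.

```sql
-- Create an embedding model on top of the `embedding` engine created above
CREATE MODEL embedding_model
PREDICT embeddings                            -- illustrative target column name
USING
    engine = 'embedding',                     -- the ML engine created with CREATE ML_ENGINE
    class = 'OpenAI',                         -- model provider
    model = 'text-embedding-3-small',         -- embedding model to use
    openai_api_key = 'your-openai-api-key',   -- required when using OpenAI
    input_columns = ['content'];              -- optional: column(s) to embed
```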
Usage
The following usage examples utilize embedding to create a model with the CREATE MODEL statement.
- Using the OpenAI models, as shown in the first sketch below.
- Using the HuggingFace models, as shown in the second sketch below.
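Both sketches are illustrative rather than verbatim: the model names, the PREDICT target, the content input column, and the HuggingFace model sentence-transformers/all-mpnet-base-v2 are assumptions.

```sql
-- OpenAI provider
CREATE MODEL openai_embedding_model
PREDICT embeddings
USING
    engine = 'embedding',
    class = 'OpenAI',
    model = 'text-embedding-3-small',
    openai_api_key = 'your-openai-api-key',
    input_columns = ['content'];
```

```sql
-- HuggingFace provider
CREATE MODEL hf_embedding_model
PREDICT embeddings
USING
    engine = 'embedding',
    class = 'HuggingFace',
    model = 'sentence-transformers/all-mpnet-base-v2',
    input_columns = ['content'];
```

Once a model is ready, embeddings can be generated by querying it; for example, assuming the content input column from the sketches above:

```sql
SELECT *
FROM openai_embedding_model
WHERE content = 'Sample text to embed';
```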
Ensure that the model has been created successfully before using it. To do that, use the DESCRIBE command and look at the status column.
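For example, assuming the model name from the sketches above:

```sql
-- The status column reads 'complete' once the model is ready
DESCRIBE openai_embedding_model;
```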
Next Steps
Go to the Use Cases section to see more examples.