Question Answering with MindsDB and OpenAI using MQL
Introduction
In this blog post, we show how to create OpenAI models within MindsDB. In this example, we ask the model a question and get an answer. The input data comes from our sample MongoDB database.
Prerequisites
To follow along, install MindsDB locally via Docker or Docker Desktop.
How to Connect MindsDB to a Database
We use a collection from our MongoDB public demo database, so let’s start by connecting MindsDB to it.
You can use MongoDB Compass or MongoDB Shell to connect MindsDB to our sample database like this:
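The command below is a rough sketch: it assumes you connect MongoDB Shell to MindsDB's MongoDB API (exposed on port 47336 by default for a local installation) and register the sample database through the `databases` collection. The `mongo_demo_db` name and the connection string are placeholders.

```
// Connect MongoDB Shell to the MindsDB MongoDB API, for example:
//   mongosh "mongodb://127.0.0.1:47336"

use mindsdb

// Register the sample MongoDB database as a data source.
// The connection string and database name below are placeholders - replace
// them with the details of the demo database you want to use.
db.databases.insertOne({
    name: "mongo_demo_db",
    engine: "mongodb",
    connection_args: {
        host: "mongodb+srv://<user>:<password>@<cluster-host>/",
        database: "public"
    }
});
```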
Tutorial
In this tutorial, we create a predictive model to answer questions in a specified domain.
Now that we’ve connected our database to MindsDB, let’s query the data to be used in the example:
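As an illustration, a query along these lines previews a few documents from the input collection; the data source and collection names (`mongo_demo_db` and `questions`) are assumptions based on the names used later in this tutorial.

```
// Switch to the connected data source and preview the input collection.
use mongo_demo_db

db.questions.find({}).limit(3);
```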
Here is the output:
Let’s create a model collection to answer all questions from the input dataset:
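Here is a minimal sketch of that command, assuming models are created through the `models` collection of the `mindsdb` project; the prompt wording and the `openai_engine` name are illustrative placeholders.

```
use mindsdb

// Create a model that predicts the answer field using the OpenAI engine.
db.models.insertOne({
    name: "question_answering",
    predict: "answer",
    training_options: {
        // The engine name must match the ML engine created in the next step.
        engine: "openai_engine",
        // Placeholders in double curly braces are filled in from the input fields.
        prompt_template: "Answer the question {{question}} about the article titled {{article_title}}"
    }
});
```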
Note that you need to create an OpenAI engine before deploying the OpenAI model within MindsDB.
Here is how to create this engine:
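The sketch below assumes ML engines are registered through the `ml_engines` collection; the engine name and API key are placeholders, and the `connection_args` key follows the pattern used for data sources, so check the MindsDB docs for the exact parameters of your version.

```
use mindsdb

// Register an OpenAI ML engine with your API key.
// Use the same engine name when creating the model above.
db.ml_engines.insertOne({
    name: "openai_engine",
    handler: "openai",
    connection_args: {
        openai_api_key: "your-openai-api-key"
    }
});
```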
In practice, the `insertOne` method that creates the model triggers MindsDB to generate an AI collection called `question_answering` that uses the OpenAI integration to predict a field named `answer`. The model is created inside the default `mindsdb` project. In MindsDB, projects are a natural way to keep artifacts, such as models or views, separate according to the predictive task they solve. You can learn more about MindsDB projects here.
The `training_options` key specifies the parameters required by this handler.

- The `engine` parameter defines that we use the OpenAI engine.
- The `prompt_template` parameter conveys the structure of the message to be completed with additional text generated by the model.
Follow these instructions to set up the OpenAI integration in MindsDB.
Once the `insertOne` method has started execution, we can check the status of the creation process with the following query:
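One possibility, assuming model metadata is exposed through the `models` collection of the `mindsdb` project:

```
use mindsdb

// Inspect the model record; the status field should eventually read "complete".
db.models.find({name: "question_answering"});
```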
Depending on your internet connection, it may take a while for the status to register as complete. Once the creation is complete, the model behaves like any other AI collection – you can query it either by specifying synthetic data directly in the query:
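For example, a query of this shape, where the article title and question values are made-up placeholders:

```
use mindsdb

// Query the model directly, providing the input fields as synthetic data.
db.question_answering.find({
    article_title: "Alan Turing",
    question: "What is Alan Turing best known for?"
});
```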
Here is the output data:
Or by joining with a collection for batch predictions:
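A sketch of such a query is shown below; it assumes MindsDB's MongoDB API accepts a reference to the source collection inside the model's `find()` call, and it reuses the placeholder data source name `mongo_demo_db`.

```
use mindsdb

// Join the model with the questions collection to get batch predictions.
db.question_answering.find({
    collection: "mongo_demo_db.questions",
    query: {}
});
```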
Here is the output data:
The `questions` collection is used to make batch predictions. Upon joining the `question_answering` model with the `questions` collection, the model uses all values from the `article_title` and `question` fields.