
How to Set Up a Kili Project with an LLM Model and Create a Conversation

In this tutorial, you'll learn how to set up a project in Kili Technology that integrates a Large Language Model (LLM), associate the LLM with your project, and create a conversation using the Kili Python SDK. By the end of this guide, you'll have a functional project ready to collect and label LLM outputs for comparison and evaluation.

Here are the steps we will follow:

  1. Creating a Kili project with a custom interface
  2. Creating an LLM model
  3. Associating the model with the project
  4. Creating a conversation

Creating a Kili Project with a Custom Interface

We will create a Kili project with a custom interface that includes a comparison job and a classification job. This interface will be used for labeling and comparing LLM outputs.

Here's the JSON interface we will use:

interface = {
    "jobs": {
        "COMPARISON_JOB": {
            "content": {
                "options": {
                    "IS_MUCH_BETTER": {"children": [], "name": "Is much better", "id": "option1"},
                    "IS_BETTER": {"children": [], "name": "Is better", "id": "option2"},
                    "IS_SLIGHTLY_BETTER": {
                        "children": [],
                        "name": "Is slightly better",
                        "id": "option3",
                    },
                    "TIE": {"children": [], "name": "Tie", "id": "option4", "mutual": True},
                },
                "input": "radio",
            },
            "instruction": "Pick the best answer",
            "mlTask": "COMPARISON",
            "required": 1,
            "isChild": False,
            "isNew": False,
        },
        "CLASSIFICATION_JOB": {
            "content": {
                "categories": {
                    "BOTH_ARE_GOOD": {"children": [], "name": "Both are good", "id": "category1"},
                    "BOTH_ARE_BAD": {"children": [], "name": "Both are bad", "id": "category2"},
                },
                "input": "radio",
            },
            "instruction": "Overall quality",
            "mlTask": "CLASSIFICATION",
            "required": 0,
            "isChild": False,
            "isNew": False,
        },
    }
}
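Interface mistakes only surface once the project is created, so it can help to sanity-check the dictionary first. The helper below is a minimal, illustrative check written in plain Python; it is not part of the Kili SDK and only verifies the fields used in the interface above.

```python
def check_interface(interface: dict) -> list:
    """Return a list of problems found in a Kili-style json_interface.
    Illustrative helper only, not part of the Kili SDK."""
    problems = []
    for job_name, job in interface.get("jobs", {}).items():
        if "mlTask" not in job:
            problems.append(f"{job_name}: missing mlTask")
        content = job.get("content", {})
        if content.get("input") not in ("radio", "checkbox"):
            problems.append(f"{job_name}: unexpected input type")
        # COMPARISON jobs use "options", CLASSIFICATION jobs use "categories".
        if not (content.get("options") or content.get("categories")):
            problems.append(f"{job_name}: no options/categories defined")
    return problems

# Example: a job without options is flagged.
broken = {
    "jobs": {
        "COMPARISON_JOB": {"content": {"input": "radio"}, "mlTask": "COMPARISON"}
    }
}
print(check_interface(broken))  # ['COMPARISON_JOB: no options/categories defined']
```

Running it on the full interface defined above should return an empty list.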

Now, we create the project using the create_project method, with type LLM_INSTR_FOLLOWING:

from kili.client import Kili

kili = Kili(
    # api_endpoint="https://cloud.kili-technology.com/api/label/v2/graphql",
)
project = kili.create_project(
    title="[Kili SDK Notebook]: LLM Project",
    description="Project Description",
    input_type="LLM_INSTR_FOLLOWING",
    json_interface=interface,
)
project_id = project["id"]

Creating an LLM Model

We will now create an LLM model in Kili by specifying the model's credentials and connector type. In this example, we will use the OpenAI SDK as the connector type.

Note: Replace api_key and endpoint with your model's actual credentials.

model_response = kili.llm.create_model(
    organization_id="<YOUR_ORGANIZATION_ID>",
    model={
        "credentials": {
            "api_key": "<YOUR_OPEN_AI_API_KEY>",
            "endpoint": "<your_desired_open_ai_endpoint>",
        },
        "name": "My Model",
        "type": "OPEN_AI_SDK",
    },
)

model_id = model_response["id"]

You can now see the model integration by clicking Manage organization:

Model Integration

Associating the Model with the Project

Next, we will associate the created model with our project by creating project models with different configurations. Each time you create a prompt, two of these project models will be chosen to generate the responses to compare.

In this example, we compare GPT-4o and GPT-4o mini with different temperature settings:

# First project model with a fixed temperature
first_project_model = kili.llm.create_project_model(
    project_id=project_id,
    model_id=model_id,
    configuration={
        "model": "gpt-4o",
        "temperature": 0.5,
    },
)

# Second project model with a temperature range
second_project_model = kili.llm.create_project_model(
    project_id=project_id,
    model_id=model_id,
    configuration={
        "model": "gpt-4o-mini",
        "temperature": {"min": 0.2, "max": 0.8},
    },
)
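Note the two temperature shapes: a fixed float, or a `{"min": ..., "max": ...}` range that lets the temperature vary per generation. To illustrate what these two configurations imply (the sampling strategy below is an assumption for illustration, not Kili's actual implementation), a per-request temperature could be resolved like this:

```python
import random

def resolve_temperature(setting):
    """Resolve a temperature configuration to a concrete value.
    Hypothetical helper mirroring the fixed-value vs. {"min", "max"}
    shapes accepted by create_project_model; not Kili's actual code."""
    if isinstance(setting, dict):
        # Range form: draw a value uniformly between min and max.
        return random.uniform(setting["min"], setting["max"])
    # Fixed form: use the value as-is.
    return float(setting)

print(resolve_temperature(0.5))  # always 0.5
t = resolve_temperature({"min": 0.2, "max": 0.8})
print(0.2 <= t <= 0.8)  # True: drawn somewhere within the range
```

A range is useful when you want more diverse outputs from the same underlying model across conversations.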

You can now see the project models in the project settings:

Project Models

Creating a Conversation

Now, we'll generate a conversation by providing a prompt.

conversation = kili.llm.create_conversation(
    project_id=project_id, prompt="Give me the Schrödinger equation."
)

This will add an asset to your project, and you'll be ready to start labeling the conversation:

Conversation

Summary

In this tutorial, we've:

  • Created a Kili project with a custom interface for LLM output comparison.
  • Registered an LLM model in Kili with the necessary credentials.
  • Associated the model with the project by creating project models with different configurations.
  • Generated a conversation using a prompt, adding it to the project for labeling.