Working with Prompt Templates

Creating Your First Prompt Template

Prompt templates lie at the heart of LLM testing and experimentation.


What is a Prompt Template?

If you are used to interacting with LLMs through a chat user interface rather than through an API, the concept of a Prompt Template may be new to you.

A Prompt Template acts as a formatted blueprint for a series of prompts and their interactions with the chosen LLM. It provides shared structure and context each time an individual prompt is sent.

For our purposes, a Prompt Template consists of three primary parts.

Name

This one is straightforward: in Libretto, each prompt template needs a unique name. The name is used to create the template key, which you can reference when making API calls through our SDK.

Chat Template

Chat templates can vary slightly across LLM providers. In general, though, a Chat Template consists of a series of messages, each assigned one of three roles: System, User, or Assistant.

  1. System: This is the context and set of instructions you'd like the model to have before it responds to the User. By default, a single System message is set to "You are a helpful assistant". System instructions can range from generic and open-ended to hyper-specific, specifying tone, format, intended audience, and overall mindset.

  2. User: This is the input from you or the user, and probably the portion of the AI interaction most people are familiar with. It includes the core piece of content to which the model will respond.

    • For our purposes, this will most likely include the prompt template variable(s), which we will discuss in the next section.
  3. Assistant: This is the response from the LLM to the user's last prompt. Note that you can add Assistant messages to a chat that were never actual responses from an LLM.

You can also add a Chat History input, which is itself a variable consisting of a series of role-tagged messages, to provide greater context for the conversation.
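
To make the structure concrete, here is a minimal sketch of how these roles typically map onto an OpenAI-style chat request. The model name, messages, and hotel-review wording are purely illustrative, and the exact wire format depends on your provider.

```python
from openai import OpenAI

client = OpenAI()  # assumes OPENAI_API_KEY is set in your environment

# Each message is tagged with a role: system, user, or assistant.
messages = [
    {"role": "system", "content": "You are a helpful assistant that classifies hotel reviews."},
    {"role": "user", "content": "Classify this review: The room was spotless and the staff were lovely."},
    # An Assistant message you author yourself; it never has to come from a real model call.
    {"role": "assistant", "content": "Positive"},
    {"role": "user", "content": "Classify this review: The pool was closed and nobody told us."},
]

response = client.chat.completions.create(model="gpt-4o-mini", messages=messages)
print(response.choices[0].message.content)
```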

Variables

The prompt template variable is probably the least familiar of the three parts. If you are coming from a UI-based tool like ChatGPT, most of your prompts are probably one-offs. If you've already been integrating LLMs via an API, you might simply be passing along users' prompts verbatim.

A variable is the portion of the prompt that changes from call to call. For instance, for a prompt template that performs sentiment classification on hotel reviews, the variable may be the review itself, or {review}. For a prompt template that drafts emails, you might want {recipientName}, {howWeMet}, and {senderName} variables.
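
At its simplest, rendering a template with its variables is just string substitution. The sketch below uses plain Python formatting and the example variable names from above; Libretto fills variables in for you, so this is only meant to illustrate the idea.

```python
# Hypothetical email template using the example variables from above.
email_template = (
    "Write a short, friendly email from {senderName} to {recipientName}. "
    "Mention that you met at {howWeMet}."
)

# Each call fills the same template with different values.
prompt = email_template.format(
    senderName="Priya",
    recipientName="Sam",
    howWeMet="the hotel industry conference in Boston",
)
print(prompt)
```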


How to Create a Prompt Template

Once you've created a Project, you can create a new prompt template in two ways: by integrating the SDK with a new prompt template, or by clicking the "Create Prompt Template" button on your project page. For the former method, see our SDK guide; the latter is described below.

Choose a Prompt Type

Currently, you can choose between two prompt types: Chat and Assistant.

Chat covers the most common use cases: you specify a series of System, User, and Assistant messages. Use this prompt type even if you intend to use a completion-style model, such as GPT-3.

Assistant lets you include reference files within your prompt template and leverages OpenAI's Assistants API. Note that Assistant is currently supported only for OpenAI models, so if you use this prompt type, you won't be able to compare models from other providers, such as Anthropic or Google.
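
For context, the sketch below shows roughly what the underlying OpenAI Assistants API call looks like: an assistant bundles its own instructions and can search files you attach. The name, instructions, and model here are illustrative, and Libretto manages this for you when you pick the Assistant prompt type.

```python
from openai import OpenAI

client = OpenAI()  # assumes OPENAI_API_KEY is set in your environment

# Illustrative only: an Assistant carries instructions, a model, and tools
# such as file search over documents you attach.
assistant = client.beta.assistants.create(
    name="Hotel review analyst",
    instructions="Answer questions using the attached hotel review export.",
    model="gpt-4o",
    tools=[{"type": "file_search"}],
)
print(assistant.id)
```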

Add your Prompt Template

  1. Give your prompt template a unique name.

  2. Build out your prompt template.

    • By default, the System message will include "You are a helpful assistant". Modify this field to give the model guidance on how to respond to a User's message.
    • Add as many User and Assistant messages as you would like.
  3. Specify at least one variable, denoted by curly braces (e.g. {variableName}), within one of the chat messages (see the sketch after these steps).

  4. To give Libretto extra context, you can optionally add a description for each variable as well.
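
As a rough sketch of the end result, here is how the hotel-review example from earlier might look once built out. The wording, the {review} variable, and the dict layout are illustrative only; Libretto's internal representation of a saved template may differ.

```python
# Illustrative sketch of a finished prompt template (not Libretto's actual schema).
prompt_template = {
    "name": "hotel-review-sentiment",
    "chat": [
        {"role": "system", "content": "You are a helpful assistant that classifies hotel reviews as Positive, Negative, or Mixed."},
        {"role": "user", "content": "Classify the sentiment of this review: {review}"},
    ],
    # Optional descriptions give Libretto extra context about each variable.
    "variables": {"review": "The full text of a single hotel review."},
}
```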

Next Steps

Click Save and you'll have successfully created your first Prompt Template! From here you can upload a .CSV file of test cases, which is covered in our next section, or continue exploring by clicking on your newly created prompt template's name on the Project portal.
