This post is sponsored by Maxim AI. Maxim AI is an end-to-end observability and evaluation platform for your LLMs and AI agents. It offers a powerful playground for your experiments, agent simulations to test against complex scenarios, and observability to see what's going on with your agents.
Try it free for 14 days: https://getmax.im/web
Introduction
The essential part of your LLM-powered application is your prompt - it makes or breaks your app.
But some people are still sleeping on it: prompts are often not tracked or versioned properly, and at best end up in a text file versioned with Git.
Unfortunately, this doesn't hold up for real-world applications:
1. What if you want to change your prompt quickly? You would need to redeploy your app every time.
2. What if your users are unhappy with the new results coming from the LLM? You need to manually hunt down the last working prompt (good luck finding it).
For this, you need a proper Prompt Registry.
Similar to a Model Registry, a Prompt Registry allows you to easily manage prompts.
The charming thing about a Prompt Registry is that you can version your prompts and let your app retrieve them dynamically at runtime.
One tool to use for Prompt Registry is MLflow.
But isn't MLflow just for Machine Learning?
Not anymore! MLflow's latest release includes a Prompt Registry, which means you can register, load, and delete prompts, treating them like real artifacts instead of easy-to-overlook text files in your repository.
In this post, you will learn how to use MLflow's new Prompt Registry and its core methods to upgrade your LLM-powered app.
Setting Up MLflow
You need to install MLflow's latest release to make use of its newest features.
$ pip install mlflow==2.21.0
And run an MLflow server:
$ mlflow server --host 127.0.0.1 --port 8080
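The client snippets below assume they can reach this server. One way to wire that up is via the MLFLOW_TRACKING_URI environment variable (calling mlflow.set_tracking_uri() in code works equally well):

```python
import os

# Point the MLflow client at the local server started above.
# Equivalent: mlflow.set_tracking_uri("http://127.0.0.1:8080")
os.environ["MLFLOW_TRACKING_URI"] = "http://127.0.0.1:8080"
```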
Working with Prompts
MLflow offers 5 main functions to work with prompts:
register_prompt()
load_prompt()
delete_prompt()
set_prompt_alias()
delete_prompt_alias()
Let's look at them:
To create a prompt you use register_prompt():
import mlflow

template = """\
Write a kind message to {{ name }}.
"""

prompt = mlflow.register_prompt(
    name="message-prompt",
    template=template,
    commit_message="Initial commit",
    version_metadata={
        "author": "baniasbaabe@example.com",
    },
    tags={
        "task": "text_generation",
        "language": "en",
    },
)
1. The template can contain input fields to be filled in later, noted with double curly braces.
2. You give the prompt a name.
3. You can include a commit message, similar to a Git commit message.
4. You can add version metadata, like the author of the prompt.
5. You can add additional tags.
This was enough to create our first prompt.
Now, when we want to update the prompt, we can simply change the template and call register_prompt() again.
import mlflow

template = """\
Write a kind message to {{ name }}. It should also include a funny joke at the beginning.
"""

updated_prompt = mlflow.register_prompt(
    name="message-prompt",
    template=template,
    commit_message="Improve prompt by including a joke to make the message more kind.",
    version_metadata={
        "author": "baniasbaabe@example.com",
    },
)
This will create a new version for our prompt.
You can check it out in the UI under http://localhost:8080.
MLflow shows you the difference between your first and second version with all the details.
If you want to use the prompt in your application, you need the load_prompt() function.
import mlflow
prompt = mlflow.load_prompt("prompts:/message-prompt/2")
print(prompt.format(name="John Doe"))
By calling .format() on the prompt, you can fill in the input fields we specified with double curly braces.
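Under the hood, this is plain template substitution. Here is a minimal stand-in to build intuition for what .format() does (illustrative only, not MLflow's actual implementation):

```python
import re


def fill(template: str, **values) -> str:
    # Replace each {{ var }} with the given value; leave unknown
    # placeholders untouched. Illustrative stand-in, not MLflow code.
    return re.sub(
        r"\{\{\s*(\w+)\s*\}\}",
        lambda m: str(values.get(m.group(1), m.group(0))),
        template,
    )


print(fill("Write a kind message to {{ name }}.", name="John Doe"))
# → Write a kind message to John Doe.
```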
Deleting a prompt is also straightforward, by calling delete_prompt() and passing the name and version of the prompt.
import mlflow
mlflow.delete_prompt("message-prompt", version=2)
One cool feature is aliases. An alias is a named reference to a prompt version. E.g., you can create an alias named `production` or `staging` to refer to the version used in the respective environment.
You can create a prompt alias like so:
import mlflow
mlflow.set_prompt_alias("message-prompt", alias="production", version=2)
And using it:
import mlflow
prompt = mlflow.load_prompt("prompts:/message-prompt@production")
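Since the prompt now lives outside your codebase, it can be worth guarding the lookup: if the registry is unreachable at runtime, fall back to a baked-in default. A hedged sketch (the helper name and fallback behavior are my own, not an MLflow API):

```python
def load_prompt_text(uri: str, fallback: str) -> str:
    """Fetch a prompt template from the MLflow registry, with a local fallback."""
    try:
        import mlflow
        return mlflow.load_prompt(uri).template
    except Exception:
        # Registry (or mlflow itself) unavailable: use the baked-in default.
        return fallback


DEFAULT_TEMPLATE = "Write a kind message to {{ name }}."
template = load_prompt_text("prompts:/message-prompt@production", DEFAULT_TEMPLATE)
```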
If you want to get rid of the alias, you can delete it:
import mlflow
mlflow.delete_prompt_alias(name="message-prompt", alias="production")
Conclusion
Prompts are the heart of your LLM application, so treat them appropriately. You learned how to manage prompts easily with MLflow's new Prompt Registry in Python.