Using prompts

Overview

To facilitate easier iteration and usage of prompts within steps, Sandgarden provides a prompt library. This document covers storing, linking, and using prompts within Sandgarden steps.

Storing prompts

Once you have a prompt you're satisfied with for calling an LLM within a step, store it in Sandgarden's prompt library. You can do this by pasting it directly into the web UI or by saving it with the CLI:

$ sand prompts create --name myPrompt --content prompt.txt

✅ Prompt created successfully!
ID: prm_01jn4e3zv7fgyoicc0gt11lp3q
Name: myPrompt
Version: 1
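
For example, prompt.txt might contain ordinary prompt text, optionally including placeholders for variable substitution (covered in the next section). The content below is purely illustrative:

Summarize the following support ticket in two sentences, focusing on the customer's issue and desired outcome:

{{variable.content}}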

Variable substitution

Sandgarden-managed prompts can contain special fields using mustache notation for variable substitution inside a step. This is useful for retrieval-augmented generation (RAG): data retrieved elsewhere in the step can be injected into the prompt at render time. It works as follows:

  • Within the prompt, write placeholders like this: {{variable.content}}
  • In your step, define a dictionary containing the variables to substitute
  • Render the prompt using sandgarden.render_prompt:
ticket_content = sandgarden.render_prompt('prompt-with-substitutions', variable_dictionary)
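
For example, if the stored prompt prompt-with-substitutions contains the {{variable.content}} placeholder from above, the step can build the dictionary and render it as in the sketch below. The nested dictionary shape is an assumption based on standard mustache dotted-name lookup; match it to the placeholders your prompt actually uses.

# Illustrative sketch: the nested shape mirrors the {{variable.content}}
# placeholder; adapt it to your prompt's actual placeholders.
variable_dictionary = {
    'variable': {
        'content': 'text retrieved earlier in the step, e.g. a ticket body'
    }
}

ticket_content = sandgarden.render_prompt('prompt-with-substitutions', variable_dictionary)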

Using prompts in steps

Linking the prompt to the step

Prompts are linked to steps when the step is created or updated, and a specific version of the prompt must be chosen at link time. Multiple prompts can be linked to a step, but multiple versions of the same prompt cannot be linked to the same step.

At the CLI:

sand steps push docker --name myStep --file step.py --prompt prompt:1 --prompt anotherprompt:1

Referencing prompts in a step

When a prompt is linked, it is referenced in step code via sandgarden.get_prompt, or via sandgarden.render_prompt if it contains variables. Reference the prompt by name only; the version is determined by the link:

a_prompt = sandgarden.get_prompt('prompt')
another_prompt = sandgarden.render_prompt('anotherprompt', dictionary)
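
Putting the two calls together, a step might fetch a static prompt and a rendered prompt and hand both to its LLM. The sketch below is illustrative only: get_prompt and render_prompt are the calls documented above, while the message layout and the eventual LLM call depend on how your step talks to its model.

# Illustrative sketch: only get_prompt and render_prompt come from the
# examples above; the message layout and LLM call are assumptions.
a_prompt = sandgarden.get_prompt('prompt')          # static prompt, used verbatim
another_prompt = sandgarden.render_prompt('anotherprompt', {
    'variable': {'content': 'data gathered earlier in the step'}
})

messages = [
    {'role': 'system', 'content': a_prompt},
    {'role': 'user', 'content': another_prompt},
]
# messages can now be passed to whichever LLM client the step uses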

Iterating on prompts

A major benefit of storing prompts within Sandgarden is that it decouples prompt content from step logic, so each can be iterated on independently. When a prompt is pushed again (or edited in the web UI), a new version is created and all previous versions remain available. Once the new version is confirmed to be an improvement, it can be linked to a step without any code changes. In the web UI, this means creating a new version of the step, changing the linked prompt, and saving. At the CLI, the link is updated by pushing a new version of the step:

sand steps push docker --name myStep --file step.py --prompt prompt:2 --prompt anotherprompt:3