More on Prompts
Full Specification of Prompt Sections
Prompts are characterized by unique names that start with `prompt.`, such as `[prompt.generate_joke]` or `[prompt.write_recipe]`.
Each prompt section can have the following attributes:
Core Attributes
- `model_name`: (Optional) Dictates the AI model used for the prompt. In its absence, the value from the `[default]` section or the built-in default model (`identity`) is chosen.
- `description`: (Optional) A brief overview of new user input fields used in the prompt. For every input field, provide a description in the first prompt where it appears. Specifying the data type is optional, with the default being `str`; use the `|` delimiter only when pinpointing the data type. Typically the `ask` section should be used for user input unless using an attribute supported only by prompt sections (currently `continue_if` and `break`).
- `display`: (Optional, defaults to `True`) If set to `True`, the model's response for that specific prompt is shown to the user. Otherwise, it remains hidden.
- `display_option`: (Optional) Denotes the mode of response display (e.g., "text", "image", "video", "markdown", "audio", etc.). In its absence, "text" is chosen.
- `output_type`: (Optional) Stipulates the typecasting method for the response (e.g., "json", "csv", "ini", etc.). Leveraging structured output makes the most of the model's capabilities! An output type should not be specified unless you are typecasting, i.e., treating raw LLM output as "json", "ini", or another structured format.
- `continue_if`: (Optional, Boolean) Dictates the condition to continue with the subsequent prompt. If set to `True`, the workflow advances to the next step based on the defined condition.
- `break`: (Optional, Boolean) Represents a condition under which the workflow should halt. If set to `True`, the workflow ceases based on the set condition.
- `change_page`: (Optional, Boolean) If set to `True`, a new page is started in the app. Typically the `next_page` section should be used for changing the page unless using an attribute supported only by prompt sections (currently `continue_if` and `break`).
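Putting several of these attributes together, a control-flow prompt might look like the following sketch. This is illustrative only: the section name, message wording, `{user_reply}` placeholder, and boolean-literal casing are assumptions, and the exact way `continue_if` evaluates the model's response depends on the tool.

```toml
# Hypothetical control-flow prompt: the model's answer is hidden
# from the user and used only to decide whether the workflow advances.
[prompt.wants_another]
model_name = "gpt-3.5-turbo"
message = "Did the user ask for another joke? Answer only true or false: {user_reply}"
display = false        # hide this intermediate response
continue_if = true     # advance only when the condition holds
```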
Model-Specific Attributes
Models can bring their own unique attributes. For instance, chat models recognize attributes like `system_role`, `message`, and `priming`.
Example 1
An example prompt section that uses a user input called `topic` and sends a message to `gpt-3.5-turbo` to generate a hilarious joke on that topic.
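A minimal sketch of such a section, assuming the TOML-style syntax used for section headers above; the exact `description` format, `message` attribute name, and `{topic}` placeholder style are assumptions:

```toml
[prompt.generate_joke]
model_name = "gpt-3.5-turbo"
description = "topic: the subject to joke about"
message = "Tell me a hilarious joke about {topic}."
```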
Example 2
An example of a prompt section that uses two user inputs, `dish_name` and `number_of_people`, and sends a message to `gpt-4` to generate a recipe based on those inputs.
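A hedged sketch of this section. The `number_of_people|int` line illustrates the `|` data-type delimiter described under `description` above; the multi-line string form and `{…}` placeholder style are assumptions:

```toml
[prompt.write_recipe]
model_name = "gpt-4"
description = """
dish_name: the dish to prepare
number_of_people|int: how many servings are needed
"""
message = "Write a recipe for {dish_name} that serves {number_of_people} people."
```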
Example 3
An example of a prompt section that takes a user input `genre` and sends a message to StabilityAI's `stable-diffusion-xl-beta-v2-2-2` to create an image of the Taj Mahal in that genre.
In this section, taking the user input `genre_2` is conditional on whether the user wants a second image. In such a case, an inline user input inside a prompt section is needed, as the `ask` section currently doesn't support conditionals (this will change soon).
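A sketch of the first image prompt, assuming the same TOML-style syntax as above; the attribute names and prompt wording are assumptions, and the inline-input syntax for the conditional `genre_2` is not shown since its exact form isn't specified here:

```toml
[prompt.create_image]
model_name = "stable-diffusion-xl-beta-v2-2-2"
description = "genre: the artistic genre for the image"
message = "The Taj Mahal, rendered in the style of {genre}"
display_option = "image"
```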