Prompt - PromptTemplate
Prompts to LLM Models
Prompts can be given with or without templating.
Simple Prompt Without Template
!pip install langchain-openai
from langchain_openai import OpenAI

llm = OpenAI(
    openai_api_key=openai_api_key  # assumes openai_api_key is already defined
)
response = llm.invoke("Translate from English to French the text <hello>")
print(response)
LLMs vs Chat Models
An LLM takes a single text prompt and returns a completion; it does not require a chat-style interaction. Chat models expect a chat-like scenario in which the conversation history is passed along as a list of messages.
PromptTemplate
The PromptTemplate class is used with LLMs, not chat models (chat models use ChatPromptTemplate).
PromptTemplate is a string template. It accepts a set of variables as input from the user, and these values are used to generate prompts for language models.
Steps
1. Use from_template() to create a PromptTemplate object
2. Create the prompt string using format()
Step 1 : Using from_template()
cls - from_template() is a class method, so it is called on the class itself: PromptTemplate.from_template().
template - the template string with variables enclosed in {}.
template_format - the template can be written in either f-string (default) or jinja2 syntax; the default is preferred.
Example
template = PromptTemplate.from_template(
    template="Define {topic} for a {person} person",
    template_format="f-string",  # default
)
Step 2 : Create Prompt as String
Template with no arguments
The previous example showed a template with 2 arguments. Templates may have zero or more arguments. The following example shows a template with no arguments.
from langchain_core.prompts import PromptTemplate
template = PromptTemplate.from_template("What is GAI")
print(type(template), template)
prompt = template.format()
print(type(prompt), prompt)
Passing Multiple Values in prompt variables
The point of templating is to pass different values for the variables. The following program executes the prompt for different values of the variables.
from langchain_core.prompts import PromptTemplate
# Define a simple prompt template
prompt_template = PromptTemplate.from_template(template="What is {topic}?")
# Generate formatted prompts with different topics
for topic in ["GAI", "NLP", "ML"]:
    formatted_prompt = prompt_template.format(topic=topic)
    print(formatted_prompt)
Pipeline
Consider the scenario where you have multiple prompts and parts of a prompt may be reused. To compose and reuse multiple prompts, we use PipelinePromptTemplate.
from langchain.prompts.pipeline import PipelinePromptTemplate
Steps
Create PipelinePromptTemplate object with 2 prompts as input.
pipeline_prompt = PipelinePromptTemplate(
    final_prompt=full_prompt, pipeline_prompts=input_prompts
)
full_prompt should have placeholders for the prompts it is composed of, and input_prompts is a list of (name, prompt) tuples whose outputs fill those placeholders.
full_template = """{type}
{person}
{how}"""
full_prompt = PromptTemplate.from_template(full_template)
input_prompts = [
    ("type", type_prompt),
    ("person", audience_prompt),
    ("how", tone_prompt),
]
The individual prompts must also be defined as PromptTemplate objects.
type_template = """Write a {content_type}"""
type_prompt = PromptTemplate.from_template(type_template)
audience_template = """The intended audience are {audience}"""
audience_prompt = PromptTemplate.from_template(audience_template)
tone_template = """The tone should be {tone}"""
tone_prompt = PromptTemplate.from_template(tone_template)
The final prompt string is created as usual with format(), passing values for the input variables of the individual input prompts.
final_prompt = pipeline_prompt.format(
    content_type="email",
    audience="client",
    tone="formal",
)
Using invoke()
PromptTemplate and ChatPromptTemplate implement the Runnable interface. This means we can call invoke() and other LCEL methods.
The input parameter for PromptTemplate's invoke() is a dictionary of the input variables, and the output is a StringPromptValue.
from langchain_core.prompts import PromptTemplate
template = PromptTemplate.from_template(
    template="Write about {topic} for {person}",
    template_format="f-string",
)
prompt = template.format(topic="GAI", person="beginner")
print(prompt)
prompt_val = template.invoke({"person": "experienced", "topic": "GAI"})
print(prompt_val)
from langchain_openai import OpenAI
llm = OpenAI(
    openai_api_key=openai_api_key,
    max_tokens=20,
)
response = llm.invoke(prompt_val)
print(response)
Invoking using prompt string directly
from langchain_core.prompts import PromptTemplate
template = PromptTemplate.from_template(
    template="Write about {topic} for {person}",
    template_format="f-string",
)
prompt = template.format(topic="GAI", person="beginner")
print(prompt)
from langchain_openai import OpenAI
llm = OpenAI(
    openai_api_key=openai_api_key,
    max_tokens=20,
)
response = llm.invoke(prompt)
print(response)
langchain/Prompts.ipynb at main · lovelynrose/langchain (github.com)

