# Customizing prompts in RAG templates
In the RAG templates, you can customize the LLM Q&A prompt directly in the YAML configuration file. You just need to set the `prompt_template` argument of the `BaseRAGQuestionAnswerer`.
Using `demo-question-answering` as an example, let's see how to customize the prompt.
By default, the `BaseRAGQuestionAnswerer` in `app.yaml` is initialized with:
```yaml
question_answerer: !pw.xpacks.llm.question_answering.BaseRAGQuestionAnswerer
  llm: $llm
  indexer: $document_store
```
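For reference, the `!pw.xpacks.llm.question_answering.BaseRAGQuestionAnswerer` tag instantiates the corresponding Python class, so the YAML above is roughly equivalent to the following constructor call. This is only a sketch: `llm` and `document_store` stand in for the objects built from the `$llm` and `$document_store` entries defined elsewhere in `app.yaml`.

```python
from pathway.xpacks.llm.question_answering import BaseRAGQuestionAnswerer

# llm and document_store are placeholders for the objects created from
# the $llm and $document_store variables in app.yaml.
question_answerer = BaseRAGQuestionAnswerer(
    llm=llm,
    indexer=document_store,
)
```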
Set `prompt_template` to the prompt you want to use. Your prompt needs to contain two placeholders: `{query}` and `{context}`. `{query}` will be replaced with the question being asked, whereas `{context}` will be replaced with the list of context documents.
```yaml
question_answerer: !pw.xpacks.llm.question_answering.BaseRAGQuestionAnswerer
  llm: $llm
  indexer: $document_store
  prompt_template: "Given these documents: {context}, please answer the question: {query}"
```
If you plan to use a longer prompt, it may be more convenient to use a multiline string and store it in a variable (the syntax for variables is described in https://pathway.com/developers/ai-pipelines/configure-yaml).
```yaml
$prompt_template: |
  Answer the question based on the given documents.
  If you can't find the answer in the documents, say that you don't know.
  Context documents: {context}
  Question: {query}
  Your answer:

question_answerer: !pw.xpacks.llm.question_answering.BaseRAGQuestionAnswerer
  llm: $llm
  indexer: $document_store
  prompt_template: $prompt_template
```
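Once the pipeline is running, you can send a question to it to check that answers now follow your custom prompt. The sketch below assumes the demo's defaults: a local REST server on port 8000 with a question-answering endpoint at `/v2/answer` that accepts a JSON body with a `prompt` field. Verify the exact endpoint path, port, and payload against your template's README before relying on them.

```python
# Quick smoke test of the customized prompt against a running pipeline.
# Endpoint path, port, and payload shape are assumptions; check the
# demo-question-answering README for the values used by your version.
import requests

response = requests.post(
    "http://localhost:8000/v2/answer",
    json={"prompt": "What do the documents say about pricing?"},
    timeout=60,
)
print(response.json())
```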