Is your feature request related to a problem? Please describe.
There is currently no way for a ChatGenerator to connect directly to AnswerBuilder(), despite its functionality being similar to a regular Generator's.
Describe the solution you'd like
We can extract the text from a ChatMessage and use it as the str that AnswerBuilder already expects.
Describe alternatives you've considered
A clear and concise description of any alternative solutions or features you've considered.
Additional context
ChatGenerators that rely on non-OpenAI models require a specific chat format to properly differentiate between system/user/assistant messages. With regular Generators this is done manually (and is error-prone). Here is an example I pulled from a Haystack notebook:
prompt_template = """
<|begin_of_text|><|start_header_id|>user<|end_header_id|>
Using the information contained in the context, give a comprehensive answer to the question.
If the answer cannot be deduced from the context, do not give an answer.
Context:
{% for doc in documents %}
{{ doc.content }} URL:{{ doc.meta['url'] }}
{% endfor %};
Question: {{query}}<|eot_id|>
<|start_header_id|>assistant<|end_header_id|>
"""
prompt_builder = PromptBuilder(template=prompt_template)
This can be accomplished by using a ChatGenerator as follows:
system_message = ChatMessage.from_system(
"""
Read the context provided and answer the question if possible. If you cannot form an answer from the context, reply with "Nah".
Context:
{% for doc in documents %}
{{ doc.content }}
{% endfor %};
"""
)
user_message = ChatMessage.from_user("query: {{query}}")
assistant_message = ChatMessage.from_assistant("Answer: ")
We can then feed the output to an AnswerBuilder to complete a pipeline, ensuring the proper template is applied.
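As a workaround today, the plain strings can be pulled out of the ChatGenerator's replies before they reach AnswerBuilder. A minimal sketch, using a stand-in dataclass in place of Haystack's real ChatMessage (the real class also carries a role, name, and meta):

```python
from dataclasses import dataclass

# Illustrative stand-in for haystack.dataclasses.ChatMessage;
# the real class has more fields, but `text` is what AnswerBuilder needs.
@dataclass
class ChatMessage:
    role: str
    text: str

# A ChatGenerator returns its replies as ChatMessage objects.
replies = [ChatMessage(role="assistant", text="Paris is the capital of France.")]

# Until AnswerBuilder accepts ChatMessage directly, extract the plain
# strings and pass those along instead.
reply_texts = [m.text for m in replies]
print(reply_texts)  # ['Paris is the capital of France.']
```

This extraction step is exactly what the feature request proposes moving inside AnswerBuilder itself.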
I would also like to add that I have implemented this locally for my hybrid pipeline, which uses LlamaCppChatGenerator and the new ChatPromptBuilder. Everything seems to work; it's just lacking .meta support, but that can be added later if needed.
@anakin87 There is no need to write an additional component. A ChatMessage already contains the necessary information, and support would only require 3-4 new lines in AnswerBuilder. My quick-and-dirty implementation allows either ChatMessage or str (the default) as input: if a str is passed in, we keep the existing AnswerBuilder logic; if a ChatMessage is passed in, we extract the message string and reuse the same str logic.
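The dual-input logic described above can be sketched as a small normalization step at the top of AnswerBuilder's run method. This is only an illustration of the idea, again using a stand-in dataclass rather than Haystack's real ChatMessage, and `normalize_replies` is a hypothetical helper name:

```python
from dataclasses import dataclass

# Illustrative stand-in for haystack.dataclasses.ChatMessage.
@dataclass
class ChatMessage:
    role: str
    text: str

def normalize_replies(replies):
    """Sketch of the few extra lines proposed for AnswerBuilder:
    accept either str replies (current behavior) or ChatMessage replies,
    then fall through to the existing str-based logic."""
    out = []
    for reply in replies:
        if isinstance(reply, ChatMessage):
            out.append(reply.text)  # extract the message string
        else:
            out.append(reply)       # already a str: keep existing behavior
    return out

mixed = ["plain answer", ChatMessage(role="assistant", text="chat answer")]
print(normalize_replies(mixed))  # ['plain answer', 'chat answer']
```

After this normalization, the rest of AnswerBuilder would operate on plain strings exactly as it does today.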