Notebook optional user input #2961

Open · wants to merge 21 commits into 0.2 from notebook-optional-user-input

Commits (the diff below reflects 15 of the 21 commits):
5da2247  Added 'role' as a summary_args and to the reflection_with_llm flow to… (MarianoMolina, Apr 26, 2024)
fa927f1  Added 'role' as a summary_args and to the reflection_with_llm flow to… (MarianoMolina, Apr 26, 2024)
04b0597  Added test for summary prompt role assignment (MarianoMolina, Apr 27, 2024)
eff19ac  Merge branch 'main' into add-role-to-reflection-with-llm (sonichi, Apr 29, 2024)
953464a  Fixed docstrings and mocked llm-config in the test (MarianoMolina, Apr 30, 2024)
e973ac3  Fixed docstrings and mocked llm-config in the test (MarianoMolina, Apr 30, 2024)
d309e15  Update autogen/agentchat/conversable_agent.py (MarianoMolina, May 2, 2024)
9b3555e  Merge branch 'main' into add-role-to-reflection-with-llm (MarianoMolina, May 2, 2024)
2dd6b14  ran pre-commit (MarianoMolina, May 3, 2024)
bf32bf0  ran pre-commit2 (MarianoMolina, May 3, 2024)
72a0e38  Merge branch 'main' into add-role-to-reflection-with-llm (sonichi, May 4, 2024)
0f3f5d5  fixed old arg name (MarianoMolina, May 5, 2024)
ed1cdf2  Delete dasdaasd (MarianoMolina, May 11, 2024)
b23b4ba  Merge branch 'main' into add-role-to-reflection-with-llm (ekzhu, May 13, 2024)
ba63386  Added notebook with stateflow example of optional user input (MarianoMolina, Jun 18, 2024)
3b8598b  Merge branch '0.2' into notebook-optional-user-input (ekzhu, Oct 2, 2024)
7320811  Update notebook/agentchat_groupchat_user_input_stateflow.ipynb (ekzhu, Oct 2, 2024)
8acbb19  Update notebook/agentchat_groupchat_user_input_stateflow.ipynb (ekzhu, Oct 2, 2024)
6faadcf  Merge branch '0.2' into notebook-optional-user-input (rysweet, Oct 12, 2024)
d102543  Merge branch '0.2' into notebook-optional-user-input (rysweet, Oct 18, 2024)
75b33cb  Merge branch '0.2' into notebook-optional-user-input (rysweet, Nov 21, 2024)
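
The diff view below covers only the `conversable_agent.py` change; the notebook added in ba63386 (`notebook/agentchat_groupchat_user_input_stateflow.ipynb`) is not rendered on this page. As a rough sketch of the pattern its title describes, the snippet below wires a group chat whose speaker-selection callable hands control to the human only when input is requested, making user input optional. The agent names, the hand-off condition, and the `llm_config` placeholder are assumptions, not the notebook's actual code; the callable `speaker_selection_method` is autogen 0.2's documented extension point.

```python
# Sketch only: a stateflow-style group chat where the human speaks only on demand.
from autogen import AssistantAgent, GroupChat, GroupChatManager, UserProxyAgent

llm_config = {"config_list": [{"model": "gpt-4", "api_key": "..."}]}  # placeholder

writer = AssistantAgent("writer", llm_config=llm_config)
reviewer = AssistantAgent("reviewer", llm_config=llm_config)
user = UserProxyAgent("user", human_input_mode="ALWAYS", code_execution_config=False)


def state_transition(last_speaker, groupchat):
    """Deterministic flow: writer -> reviewer -> (user only if asked) -> writer."""
    if last_speaker is user:
        return writer
    if last_speaker is writer:
        return reviewer
    # After the reviewer speaks, route to the human only when input is requested
    # (the sentinel string is an illustrative convention); otherwise loop back.
    if "NEED_USER_INPUT" in (groupchat.messages[-1].get("content") or ""):
        return user
    return writer


groupchat = GroupChat(
    agents=[user, writer, reviewer],
    messages=[],
    max_round=12,
    speaker_selection_method=state_transition,
)
manager = GroupChatManager(groupchat=groupchat, llm_config=llm_config)
user.initiate_chat(manager, message="Draft a short release note.")
```
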
23 changes: 19 additions & 4 deletions in autogen/agentchat/conversable_agent.py

```diff
@@ -937,6 +937,7 @@ def my_summary_method(
                 One example key is "summary_prompt", and value is a string of text used to prompt an LLM-based agent (the sender or receiver agent) to reflect
                 on the conversation and extract a summary when summary_method is "reflection_with_llm".
                 The default summary_prompt is DEFAULT_SUMMARY_PROMPT, i.e., "Summarize the takeaway from the conversation. Do not add any introductory phrases. If the intended request is NOT properly addressed, please point it out."
+                Another available key is "summary_role", which is the role of the message sent to the agent in charge of summarizing. Default is "system".
             message (str, dict or Callable): the initial message to be sent to the recipient. Needs to be provided. Otherwise, input() will be called to get the initial message.
                 - If a string or a dict is provided, it will be used as the initial message. `generate_init_message` is called to generate the initial message for the agent based on this string and the context.
                   If dict, it may contain the following reserved fields (either content or tool_calls need to be provided).
@@ -1168,8 +1169,13 @@ def _reflection_with_llm_as_summary(sender, recipient, summary_args):
             raise ValueError("The summary_prompt must be a string.")
         msg_list = recipient.chat_messages_for_summary(sender)
         agent = sender if recipient is None else recipient
+        role = summary_args.get("summary_role", None)
+        if role and not isinstance(role, str):
+            raise ValueError("The summary_role in summary_args must be a string.")
         try:
-            summary = sender._reflection_with_llm(prompt, msg_list, llm_agent=agent, cache=summary_args.get("cache"))
+            summary = sender._reflection_with_llm(
+                prompt, msg_list, llm_agent=agent, cache=summary_args.get("cache"), role=role
+            )
         except BadRequestError as e:
             warnings.warn(
                 f"Cannot extract summary using reflection_with_llm: {e}. Using an empty str as summary.", UserWarning
@@ -1178,7 +1184,12 @@ def _reflection_with_llm_as_summary(sender, recipient, summary_args):
         return summary

     def _reflection_with_llm(
-        self, prompt, messages, llm_agent: Optional[Agent] = None, cache: Optional[AbstractCache] = None
+        self,
+        prompt,
+        messages,
+        llm_agent: Optional[Agent] = None,
+        cache: Optional[AbstractCache] = None,
+        role: Union[str, None] = None,
     ) -> str:
         """Get a chat summary using reflection with an llm client based on the conversation history.

@@ -1187,10 +1198,14 @@ def _reflection_with_llm(
             messages (list): The messages generated as part of a chat conversation.
             llm_agent: the agent with an llm client.
             cache (AbstractCache or None): the cache client to be used for this conversation.
+            role (str): the role of the message, usually "system" or "user". Default is "system".
         """
+        if not role:
+            role = "system"
+
         system_msg = [
             {
-                "role": "system",
+                "role": role,
                 "content": prompt,
             }
         ]
@@ -1203,7 +1218,7 @@ def _reflection_with_llm(
         else:
             raise ValueError("No OpenAIWrapper client is found.")
         response = self._generate_oai_reply_from_client(llm_client=llm_client, messages=messages, cache=cache)
-        return response
+        return self.generate_oai_reply(messages=messages, config=llm_client)

     def _check_chat_queue_for_sender(self, chat_queue: List[Dict[str, Any]]) -> List[Dict[str, Any]]:
         """
```