Commit

Use correct reference to prompt_generator in autogpt/llm/chat.py (Sig…
TKasperczyk authored May 8, 2023
1 parent 33a3e6f commit 0166eac
Showing 1 changed file with 1 addition and 1 deletion.
2 changes: 1 addition & 1 deletion autogpt/llm/chat.py
@@ -200,7 +200,7 @@ def chat_with_ai(
             if not plugin.can_handle_on_planning():
                 continue
             plugin_response = plugin.on_planning(
-                agent.prompt_generator, current_context
+                agent.config.prompt_generator, current_context
             )
             if not plugin_response or plugin_response == "":
                 continue
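The fix above changes how the planning hook receives the prompt generator: it now reads it from `agent.config` rather than from the agent directly. The sketch below illustrates that hook loop in isolation. The `Agent`, `AgentConfig`, and `EchoPlugin` classes here are hypothetical stand-ins, not the real Auto-GPT classes; only the `on_planning` call shape and the `agent.config.prompt_generator` attribute path mirror the diff.

```python
class PromptGenerator:
    """Stand-in for Auto-GPT's prompt generator object."""


class AgentConfig:
    def __init__(self):
        # After the fix, the prompt generator is reached via agent.config.
        self.prompt_generator = PromptGenerator()


class Agent:
    def __init__(self):
        self.config = AgentConfig()


class EchoPlugin:
    """Hypothetical plugin implementing the planning hook."""

    def can_handle_on_planning(self):
        return True

    def on_planning(self, prompt_generator, current_context):
        # A real plugin could inspect prompt_generator or the context
        # and return extra text to inject into the planning prompt.
        return f"planning hook saw {len(current_context)} messages"


def run_planning_hooks(agent, plugins, current_context):
    """Mirror of the loop in chat_with_ai around the changed line."""
    responses = []
    for plugin in plugins:
        if not plugin.can_handle_on_planning():
            continue
        # The commit's fix: pass agent.config.prompt_generator,
        # not agent.prompt_generator.
        plugin_response = plugin.on_planning(
            agent.config.prompt_generator, current_context
        )
        if not plugin_response or plugin_response == "":
            continue
        responses.append(plugin_response)
    return responses
```

Because the real `Agent` no longer exposes `prompt_generator` as a direct attribute, the pre-fix call `agent.prompt_generator` would raise `AttributeError` the first time a planning-capable plugin was loaded, which is why the one-line change matters.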
