
Fix llama 3 code halucination #1250

Merged

Conversation

Collaborator

@Notnaton Notnaton commented May 1, 2024

Builds on code by CyanideByte; I just moved it to a different location.
The backtick will still be displayed within the message, but it will be removed before the code is run.

Describe the changes you have made:

Reference any relevant issues (e.g. "Fixes #000"):

Pre-Submission Checklist (optional but appreciated):

  • I have included relevant documentation updates (stored in /docs)
  • I have read docs/CONTRIBUTING.md
  • I have read docs/ROADMAP.md

OS Tests (optional but appreciated):

  • Tested on Windows
  • Tested on MacOS
  • Tested on Linux


Co-Authored-By: CyanideByte <[email protected]>
@KillianLucas KillianLucas merged commit d316a99 into OpenInterpreter:main May 4, 2024
0 of 2 checks passed
@KillianLucas
Collaborator

Nice, agreed that spot is a better place for it, since similar fixes for LLM quirks live nearby. Merged. Great work @Notnaton, and @CyanideByte for the original fix!

code = code[2:].strip()
if interpreter.verbose:
    print("Removing `\n")

if language == "text":


interpreter/core/respond.py
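For readers outside the codebase, the idea behind the excerpt above can be sketched as a standalone helper. This is a minimal sketch, not the actual Open Interpreter implementation; the function name and the loop-based stripping are hypothetical, while the real fix in interpreter/core/respond.py strips the leading backtick before the code block is executed.

```python
def strip_hallucinated_backticks(code: str) -> str:
    """Remove stray leading backticks that some models (e.g. Llama 3)
    hallucinate at the start of generated code. The displayed message
    is left untouched; only the code passed to execution is cleaned.
    NOTE: hypothetical helper, not the upstream implementation."""
    # Drop any leading backticks the model prepended to the code.
    while code.startswith("`"):
        code = code[1:]
    # Strip the leftover newline/whitespace so execution sees clean code.
    return code.strip()

print(strip_hallucinated_backticks("`\nprint('hello')"))  # print('hello')
```

Doing this just before execution, rather than in the display path, matches the intent described in the PR: the user still sees the raw model output, but the interpreter runs sanitized code.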

@hossain666

#1250 (comment)

@Notnaton Notnaton deleted the fix-llama3-code-halucination branch May 15, 2024 06:41

3 participants