support Bedrock #81
base: master
Conversation
hi @JackHopkins How can I contact you? ...etc (discord invite is invalid) thanks
Hey Kevin -
You can join here: https://discord.gg/ymmV6FKg
LMK if you can't and I can add you via email!
Daniel Kwak, CEO
Paperplane ( https://www.paperplane.ai/ )
404-704-2999
…On Tue, Nov 21, 2023 at 6:01 PM, Kevin Truong < ***@***.*** > wrote:
hi @JackHopkins (https://github.com/JackHopkins)
WIP for bedrock support
I'm testing on my side, and it's working. But I need to align with you to make good progress on the MR.
How can I contact you? ...etc (discord invite is invalid)
thanks
Can you have a look at these comments and let me know what you think?
Additionally, it'd be really helpful to have some automated tests for Bedrock (possibly mocking out the Bedrock client) so we can ensure that we don't break anything as we continue development.
if ("error" in response and
        "code" in response["error"] and
        response["error"]["code"] == 'invalid_api_key'):
    raise Exception(f"The supplied OpenAI API key {self.api_key} is invalid")
Please change to "Anthropic API key"
continue

if not choice:
    raise Exception("OpenAI API failed to generate a response")
Same again here.
modelId=model,
contentType="application/json",
accept="application/json")
return json.loads(response.get('body').read().decode())['completion']
We shouldn't be returning this here - we should loop the counter below in the case where the `bedrock_runtime` throws an exception.
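A minimal sketch of what this retry shape might look like, assuming a `bedrock_runtime` client and a `max_retries` bound; the function name and retry policy are hypothetical, not taken from this PR:

```python
import json

def generate_with_retries(bedrock_runtime, body, model, max_retries=3):
    """Sketch of the reviewer's suggestion: catch runtime errors and loop
    the retry counter instead of returning/propagating inside the try."""
    counter = 0
    while counter < max_retries:
        try:
            response = bedrock_runtime.invoke_model(
                body=body,
                modelId=model,
                contentType="application/json",
                accept="application/json")
            return json.loads(response.get('body').read().decode())['completion']
        except Exception:
            counter += 1  # loop the counter, as suggested, rather than failing immediately
    raise Exception(f"Bedrock API failed to generate a response after {max_retries} attempts")
```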
return 'anthropic.claude-instant-v1'

elif api_model_name == 'openai':
    return ""
Let me double check to ensure whether this works.
"Content-Type": "application/json",
}
response = requests.post(
    OPENAI_URL, headers=openai_headers, json=params, timeout=50
We should replace this with the `invoke_model` you have above.
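A hedged sketch of what reusing `invoke_model` here might look like instead of posting to the OpenAI endpoint. The `Human:`/`Assistant:` prompt framing and the `max_tokens_to_sample` value are assumptions about the Anthropic-on-Bedrock request format, not taken from this PR:

```python
import json

def invoke_claude(bedrock_runtime, prompt, model='anthropic.claude-instant-v1'):
    """Build an Anthropic-style request body and call invoke_model,
    mirroring the invoke_model call used earlier in this file."""
    body = json.dumps({
        "prompt": f"\n\nHuman: {prompt}\n\nAssistant:",  # assumed prompt framing
        "max_tokens_to_sample": 512,                      # assumed limit
    })
    response = bedrock_runtime.invoke_model(
        body=body,
        modelId=model,
        contentType="application/json",
        accept="application/json")
    return json.loads(response.get('body').read().decode())['completion']
```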
align_dataset_size = self.dataset_sizes["alignments"][func_hash] if func_hash in self.dataset_sizes[
    "alignments"] else 0
patch_dataset_size = self.dataset_sizes["patches"][func_hash] if func_hash in self.dataset_sizes[
    "patches"] else 0
total_dataset_size = align_dataset_size + patch_dataset_size
training_file_id = response["id"]
# submit the finetuning job
try:
    finetuning_response = openai.FineTuningJob.create(training_file=training_file_id, model="gpt-3.5-turbo",
We need to figure out a way to call the logic that triggers a finetune in the relevant LLM provider class (i.e. OpenAI, Bedrock, etc.) - this is not the right place for this logic to live now that we are adding support for more providers.
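One possible shape for this, sketched under assumptions: an abstract finetune method on a per-provider interface, so the caller only dispatches. The class and method names below are hypothetical, not from this PR:

```python
from abc import ABC, abstractmethod

class LLM_Finetune_Api(ABC):
    """Hypothetical interface: each provider class (OpenAI, Bedrock, ...)
    owns its own finetuning trigger."""

    @abstractmethod
    def finetune(self, training_file_id: str, **kwargs):
        """Submit a finetuning job and return the provider's job handle."""

class OpenAIFinetuneApi(LLM_Finetune_Api):
    def finetune(self, training_file_id, model="gpt-3.5-turbo", **kwargs):
        import openai  # imported lazily; assumes the pre-1.0 openai SDK used in this PR
        return openai.FineTuningJob.create(training_file=training_file_id, model=model, **kwargs)

class BedrockFinetuneApi(LLM_Finetune_Api):
    def finetune(self, training_file_id, **kwargs):
        raise NotImplementedError("Bedrock finetuning is not supported yet")
```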
from monkey_patch.language_models.llm_api_abc import LLM_Api
import os

OPENAI_URL = "https://api.openai.com/v1/chat/completions"
This can be removed.
OPENAI_URL = "https://api.openai.com/v1/chat/completions"
import requests

AWS_REGION = os.environ.get("AWS_DEFAULT_REGION", "us-east-1")
We should instantiate all envvars / Bedrock-specific logic inside the `__init__` of the Bedrock class - catching and re-raising any thrown exceptions.
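A minimal sketch of that suggestion, under assumptions: the class name and the injectable `client_factory` parameter (added here only to make the sketch testable without AWS credentials) are hypothetical:

```python
import os

class BedrockClient:
    """Hypothetical sketch: resolve envvars and build the runtime client
    inside __init__, re-raising construction failures with context."""

    def __init__(self, client_factory=None):
        self.aws_region = os.environ.get("AWS_DEFAULT_REGION", "us-east-1")
        try:
            if client_factory is None:
                import boto3  # imported here so a missing dependency fails at construction time
                client_factory = lambda: boto3.client("bedrock-runtime",
                                                      region_name=self.aws_region)
            self.bedrock_runtime = client_factory()
        except Exception as exc:
            raise RuntimeError(f"Failed to initialise Bedrock client: {exc}") from exc
```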
from monkey_patch.language_models.openai_api import Openai_API
from monkey_patch.models.language_model_output import LanguageModelOutput
from monkey_patch.utils import approximate_token_count


class ApiModelFactory:
Can we please have this factory inside its own file.
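One way a standalone `api_model_factory.py` could look, sketched as an assumption; the registry pattern and method names are hypothetical, not from this PR:

```python
# Hypothetical api_model_factory.py: the factory moved into its own module,
# with a registry so adding a provider doesn't require editing the factory.

class ApiModelFactory:
    _registry = {}

    @classmethod
    def register(cls, name, provider_cls):
        """Associate an API_MODEL name with a provider class."""
        cls._registry[name] = provider_cls

    @classmethod
    def create(cls, name):
        """Instantiate the provider registered under the given name."""
        if name not in cls._registry:
            raise ValueError(f"Unknown API model: {name}")
        return cls._registry[name]()
```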
src/monkey_patch/monkey.py
Outdated
@@ -68,10 +68,9 @@ class Monkey:
     logging.addLevelName(PATCH_LEVEL_NUM, "PATCH")
     logging.basicConfig(level=ALIGN_LEVEL_NUM)
     logger = logger_factory(__name__)
-    language_modeler = LanguageModel()
+    language_modeler = LanguageModel(api_model=os.getenv('API_MODEL'))
We should add the `API_MODEL` envvar to `.example.env` to make it obvious that it is required.
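A hedged sketch of that `.example.env` entry; the value shown is an assumption based on the `api_model_name == 'openai'` check earlier in this diff:

```
# Selects the LLM provider at startup; read via os.getenv('API_MODEL')
API_MODEL=openai
```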
Force-pushed from 4422bae to b411af8.