poetry project reports 2.0 as learning rate #520

Open
radiodee1 opened this issue Jul 31, 2019 · 0 comments

I am following the code found here:

https://cloud.google.com/blog/products/gcp/cloud-poetry-training-and-hyperparameter-tuning-custom-text-models-on-cloud-ml-engine

The code can be found on GitHub in a notebook here:

https://github.com/GoogleCloudPlatform/training-data-analyst/blob/master/courses/machine_learning/deepdive/09_sequence/poetry.ipynb

I have copied the code fairly closely, but have changed the names of some things. When I run my code, a message reports that the learning rate has been set to 2.00. Is this right? Shouldn't it be something like 0.05?

Below is some code from my `problem.py` file.

from tensor2tensor.models import transformer
from tensor2tensor.utils import registry

@registry.register_hparams
def transformer_chat():
    """Smaller transformer_base with more dropout and a lower learning rate."""
    hparams = transformer.transformer_base()
    hparams.num_hidden_layers = 2
    hparams.hidden_size = 128
    hparams.filter_size = 512
    hparams.num_heads = 4
    hparams.attention_dropout = 0.6
    hparams.layer_prepostprocess_dropout = 0.6
    hparams.learning_rate = 0.05
    return hparams

# hyperparameter tuning ranges
@registry.register_ranged_hparams
def transformer_chat_range(rhp):
    rhp.set_float("learning_rate", 0.05, 0.25, scale=rhp.LOG_SCALE)
    rhp.set_int("num_hidden_layers", 2, 4)
    rhp.set_discrete("hidden_size", [128, 256, 512])
    rhp.set_float("attention_dropout", 0.4, 0.7)

This is a snippet of the output that shows the learning rate being reported as 2.0:

I0731 11:12:03.723451 140080027969344 learning_rate.py:29] Base learning rate: 2.000000
I0731 11:12:03.892279 140080027969344 optimize.py:327] Trainable Variables Total size: 1972992
I0731 11:12:03.892893 140080027969344 optimize.py:327] Non-trainable variables Total size: 5
I0731 11:12:03.893257 140080027969344 optimize.py:182] Using optimizer adam
I0731 11:12:07.278682 140080027969344 estimator.py:1147] Done calling model_fn.
I0731 11:12:07.279783 140080027969344 basic_session_run_hooks.py:541] Create CheckpointSaverHook.
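
For what it's worth, that log line seems to come from the "constant" factor in tensor2tensor/utils/learning_rate.py, which logs and uses hparams.learning_rate_constant (transformer_base appears to set that to 2.0, together with a "constant*..." learning_rate_schedule), not hparams.learning_rate. If that is what is happening, would the right override be something like the sketch below? (transformer_chat_lr is just a hypothetical name, and 0.05 is simply the rate I'm after.)

from tensor2tensor.models import transformer
from tensor2tensor.utils import registry

@registry.register_hparams
def transformer_chat_lr():
    """Hypothetical variant that overrides the value the schedule actually reads."""
    hparams = transformer.transformer_base()
    hparams.num_hidden_layers = 2
    hparams.hidden_size = 128
    hparams.filter_size = 512
    hparams.num_heads = 4
    hparams.attention_dropout = 0.6
    hparams.layer_prepostprocess_dropout = 0.6
    # transformer_base's learning_rate_schedule contains a "constant" factor,
    # and that factor multiplies in learning_rate_constant -- the number that
    # gets logged as "Base learning rate".
    hparams.learning_rate_constant = 0.05
    return hparams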

I suspect I'm doing something wrong. Can you point it out? Thanks for your time.
