I've been doing the Machine Learning with TensorFlow on Google Cloud Platform course on Coursera. While working through the labs, I have noticed that the strategies for configuring the gcloud SDK are not very robust. Perhaps that is because the labs are intended to be run on GCP in Datalab, but I like doing them on my own computer or on VMs: Datalab itself has been showing a non-responsive UI, which may be caused by poor network latency or by my persistent use of Firefox.
Anyhow, moving onward: there doesn't seem to be any place in the documentation that recommends a way of automating the setup of a GCP configuration, and I have broken quite a few gcloud configurations by running scripts like the one here. The script changes the project ID, bucket, and region in my currently active config. These configurations are proving quite tedious to keep an eye on.
I know Terraform and other DevOps tools offer partial solutions, but this really feels like something that should be native. Does anyone have suggestions on how we could improve the scripts used for setting up GCP environment variables, so that instead of being applied on top of existing configs they use a temporary configuration that belongs exclusively to the script?
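One mechanism that might work (a rough sketch, untested: it relies on the documented CLOUDSDK_ACTIVE_CONFIG_NAME environment variable, and the project ID and region below are placeholders I made up) is to create a throwaway named configuration and select it only for the script's own process:

```python
# Rough sketch: route every gcloud call through a throwaway configuration
# selected by CLOUDSDK_ACTIVE_CONFIG_NAME, so the user's own configuration
# is never modified.
import os
import subprocess
import uuid

config_name = f"lab-temp-{uuid.uuid4().hex[:8]}"
lab_env = {**os.environ, "CLOUDSDK_ACTIVE_CONFIG_NAME": config_name}

def gcloud(*args, env=lab_env):
    subprocess.run(["gcloud", *args], env=env, check=True)

# --no-activate keeps other shells on their existing configuration; this
# process picks up the new one purely through the environment variable.
gcloud("config", "configurations", "create", config_name, "--no-activate")
gcloud("config", "set", "core/project", "my-lab-project")   # placeholder
gcloud("config", "set", "compute/region", "us-central1")    # placeholder

# ... the lab's gcloud commands would go here, all routed through gcloud() ...

# Delete with the plain environment, since gcloud will not delete the
# configuration it currently considers active.
gcloud("config", "configurations", "delete", config_name, "--quiet",
       env=os.environ)
```

Exporting CLOUDSDK_ACTIVE_CONFIG_NAME at the top of a plain bash script should have the same effect, and property-level overrides like CLOUDSDK_CORE_PROJECT and CLOUDSDK_COMPUTE_REGION are lighter still, since they never write anything to disk.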
Perhaps it is also possible to set all of these variables with the Python API, and avoid changing any of the configs that are used for bash calls.
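For example (a minimal sketch, assuming the google-cloud-storage client library; the project and bucket names are placeholders), the client can be configured entirely in Python, so nothing is read from or written to the configs the shell uses:

```python
# Minimal sketch: configure the client wholly in Python; the gcloud CLI
# configuration used by bash calls is never read or modified.
from google.cloud import storage

PROJECT = "my-lab-project"   # placeholder
BUCKET = "my-lab-bucket"     # placeholder

client = storage.Client(project=PROJECT)
bucket = client.bucket(BUCKET)   # lazy handle, no gcloud config involved
print(bucket.name)
```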
I have not found any solution that I am very happy with. However, I would like to suggest finding a way to automatically generate a client_secret for an OAuth 2.0 approach, as outlined here.
However, I think this requires some DevOps work from the Qwiklabs group. They would have to generate these credentials per instance of a lab, or provide the user with instructions on how to create such credentials. If users are asked to create the credentials themselves, a service account might be easier; a sketch of that route follows.
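For the service-account route, something like this (a rough sketch, assuming the google-auth and google-cloud-storage libraries, a key file downloaded as key.json, and a placeholder project ID) keeps the credentials scoped to the script rather than to any gcloud configuration:

```python
# Rough sketch: authenticate with a user-created service-account key, so
# neither gcloud's stored credentials nor its configurations are involved.
from google.oauth2 import service_account
from google.cloud import storage

credentials = service_account.Credentials.from_service_account_file(
    "key.json")                                    # placeholder path
client = storage.Client(project="my-lab-project",  # placeholder project
                        credentials=credentials)
```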