logging #32
Hi!
Is it possible to turn off logging, or change the logging level, from a Python script that uses the nlu library?
Even a simple `import nlu` produces lines of logs, and loading models generates tons of them...
Before importing nlu, I tried creating a PySpark context and setting the desired log level, as suggested by the message printed during `import nlu`:
Setting default log level to "WARN". To adjust logging level use sc.setLogLevel(newLevel)
But it doesn't seem to help. In fact it makes things worse: I can no longer load models or run predictions...
My other approach was to set all the loggers that seemed relevant — `nlu`, `py4j`, and `py4j.java_gateway` — to CRITICAL.
That didn't help either.
Messages such as `WARN SparkSession$Builder`, `WARN ApacheUtils`, `I tensorflow/core/platform/cpu_feature_guard.cc:142`, etc. still appear.
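For reference, the second attempt looked roughly like this (a minimal sketch; the logger names are the ones I guessed at, and this only touches Python-side loggers — it would not affect whatever the JVM or TensorFlow print directly):

```python
import logging

# Raise the Python-side loggers I could find to CRITICAL.
# "nlu", "py4j", and "py4j.java_gateway" are the names I tried;
# JVM-side log4j output is not routed through these loggers.
for name in ("nlu", "py4j", "py4j.java_gateway"):
    logging.getLogger(name).setLevel(logging.CRITICAL)

# A WARNING record on these loggers is now filtered out on the Python side.
assert not logging.getLogger("py4j.java_gateway").isEnabledFor(logging.WARNING)
```

Since the surviving `WARN SparkSession$Builder` lines come from the JVM rather than Python's `logging`, I suspect that is why this approach has no effect on them.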