
Notebook 03a, SparkContext() will not run without modifying /etc/hosts on OSX. #2

Open
vestuto opened this issue Jul 12, 2016 · 5 comments



vestuto commented Jul 12, 2016

Students in your tutorial may encounter this: the following code cell raises an error unless /etc/hosts is updated first.

sc = SparkContext('local[4]')

The error is as follows:

Py4JJavaError: An error occurred while calling None.org.apache.spark.api.java.JavaSparkContext.
: java.net.BindException: Can't assign requested address: Service 'sparkDriver' failed after 16 retries!
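One commonly suggested workaround for this BindException (an aside, not part of the original report) is to pin the Spark driver to the loopback interface via Spark's SPARK_LOCAL_IP environment variable, which avoids editing /etc/hosts. A sketch, assuming a bash-like shell:

```shell
# Commonly suggested workaround: pin the Spark driver to the loopback
# interface so it never tries to bind to an unresolvable hostname.
export SPARK_LOCAL_IP=127.0.0.1
echo "SPARK_LOCAL_IP=$SPARK_LOCAL_IP"
```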

Adding the following to the end of /etc/hosts enabled me to run the cell successfully:

127.0.0.1 <hostname>

where <hostname> is the output of running hostname in a shell.
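The fix above can be sketched as a short shell snippet (assumptions: macOS with sudo rights; the line that actually writes to /etc/hosts is left commented out so the sketch is safe to run as-is):

```shell
# Build the /etc/hosts line mapping this machine's hostname to loopback.
line="127.0.0.1 $(hostname)"
echo "$line"
# sudo sh -c "echo '$line' >> /etc/hosts"   # uncomment to actually apply
```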


vestuto commented Jul 12, 2016

UPDATE: regardless of the contents of /etc/hosts, I cannot get notebook 06 to run the following cell without hitting the same error shown above:

sc = SparkContext('spark://schedulers:7077')
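A quick diagnostic sketch (an aside; 'schedulers' is the master hostname taken from the cell above): the spark:// master URL can only work if that hostname resolves, which can be checked from a shell before starting the context:

```shell
# Check whether the Spark master hostname resolves before connecting.
# getent is not available on macOS; python3's socket module works on both.
host="schedulers"
if python3 -c "import socket, sys; socket.gethostbyname(sys.argv[1])" "$host" 2>/dev/null; then
  echo "$host resolves"
else
  echo "$host does not resolve; add it to /etc/hosts or fix DNS"
fi
```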

mrocklin (Collaborator) commented

According to @quasiben and @ahmadia, this may be fixed by setting your JAVA_HOME and JRE_HOME environment variables.


vestuto commented Jul 12, 2016

Setting the following in my ~/.bashrc allows me to run pyspark from the conda environment I've set up...

export JAVA_HOME=$(/usr/libexec/java_home)
export JRE_HOME=$JAVA_HOME/jre

... but I still get the same error from notebook 06
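A quick way to confirm those variables actually point at a usable JDK (a diagnostic sketch, assuming macOS's /usr/libexec/java_home helper):

```shell
# Verify JAVA_HOME resolves to a directory containing a runnable java binary.
JAVA_HOME=$("/usr/libexec/java_home" 2>/dev/null || true)
if [ -n "$JAVA_HOME" ] && [ -x "$JAVA_HOME/bin/java" ]; then
  echo "JAVA_HOME looks good: $JAVA_HOME"
else
  echo "JAVA_HOME is missing or broken"
fi
```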

ahmadia (Collaborator) commented Jul 12, 2016

This was a VPN issue for me. Are you running one?


vestuto commented Jul 12, 2016

Nope, just trying to run locally on my laptop.
