'JavaPackage' object is not callable #581
Hey @radi2015, I checked the same steps in a fresh conda environment, and this is the output that I see: $ python3
Python 3.9.10 | packaged by conda-forge | (main, Feb 1 2022, 21:28:27)
[Clang 11.1.0 ] on darwin
Type "help", "copyright", "credits" or "license" for more information.
>>> import pyrasterframes
>>> from pyrasterframes.utils import create_rf_spark_session
>>> spark = create_rf_spark_session()
2022-04-02 00:07:56,589 WARN util.Utils: Your hostname, macbook.local resolves to a loopback address: 127.0.0.1; using 10.0.0.224 instead (on interface en0)
2022-04-02 00:07:56,590 WARN util.Utils: Set SPARK_LOCAL_IP if you need to bind to another address
WARNING: An illegal reflective access operation has occurred
WARNING: Illegal reflective access by org.apache.spark.unsafe.Platform (file:/Users/.../miniconda/envs/geoparquet/lib/python3.9/site-packages/pyspark/jars/spark-unsafe_2.12-3.1.2.jar) to constructor java.nio.DirectByteBuffer(long,int)
WARNING: Please consider reporting this to the maintainers of org.apache.spark.unsafe.Platform
WARNING: Use --illegal-access=warn to enable warnings of further illegal reflective access operations
WARNING: All illegal access operations will be denied in a future release
2022-04-02 00:07:57,056 WARN util.NativeCodeLoader: Unable to load native-hadoop library for your platform... using builtin-java classes where applicable
Setting default log level to "WARN".
To adjust logging level use sc.setLogLevel(newLevel). For SparkR, use setLogLevel(newLevel).
>>>

I'll assume that either you don't have Java properly installed, or there is something else specific to your infrastructure (e.g. Windows) that you haven't shared in the issue description. Please try to follow the Getting Started guide one more time; to write a more comprehensive response I need to know more, since, as I said, the setup in a fresh conda environment just works for me. In the worst case you may try it in a prepared Docker environment. We're open to PRs, and if you'd like to improve the docs, you're more than welcome. I'd also like to remind you that this is an open-source project, and the people on it are often volunteering their time to maintain it. If you have any questions or issues, feel free to reach us here or in the Gitter channel for interactive chat; but please redirect all questions and comments related to other projects to Stack Overflow or those projects' repositories instead.
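The fresh-environment check described above can be reproduced with something like the following sketch (the environment name and Python version are illustrative assumptions, and a working Java installation is still required):

```shell
# Illustrative sketch of the fresh conda environment check (names are assumptions)
conda create -n rf-check python=3.9 -y
conda activate rf-check
python3 -m pip install pyrasterframes
# Smoke test: should create a Spark session without the 'JavaPackage' error
python3 -c "from pyrasterframes.utils import create_rf_spark_session; create_rf_spark_session()"
```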
Sorry, my wording came out wrong. This is a great project and I really want to learn to use it; the problem was probably with my steps. From your Python interpreter session above I saw how to start it, and it succeeded.
I shouldn't start pyspark first; I should start Python first. Thank you very much.
@radi2015 Great that it works now! How did you try it before, and what is the difference? In this case, Spark session creation makes your Python interpreter process the driver; if you want to try the same in the…
At the beginning, I started with the pyspark command, which created a SparkSession. Then I executed the create_rf_spark_session() function and it reported an error. When I saw the error, I thought the RasterFrames jar had not been put into Spark's lib directory, but putting the jar there still produced an error. Now it seems the reason is that when the pyspark command creates a SparkSession first, executing create_rf_spark_session() to create a session again raises the error.
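A minimal pure-Python sketch (no Spark required) of why this happens, assuming SparkSession's documented getOrCreate semantics: once a session exists in the process, later "create" calls return the same session and silently ignore new configuration, so the RasterFrames jar never reaches the JVM classpath. The class below is a mock for illustration, not the real pyspark API:

```python
# Mock of SparkSession.getOrCreate() singleton semantics (illustrative only).
class FakeSparkSession:
    _active = None  # process-wide singleton, like the real SparkSession

    def __init__(self, conf):
        self.conf = conf

    @classmethod
    def get_or_create(cls, conf):
        # If a session already exists, extra conf (e.g. spark.jars) is ignored.
        if cls._active is None:
            cls._active = cls(conf)
        return cls._active

plain = FakeSparkSession.get_or_create({})                     # what `pyspark` does at startup
rf = FakeSparkSession.get_or_create({"spark.jars": "rf.jar"})  # what a later create attempts

assert rf is plain                   # the same session came back...
assert "spark.jars" not in rf.conf   # ...and the RasterFrames jar conf was dropped
```

This is why the session must be created via create_rf_spark_session() first, rather than after the pyspark shell has already built one.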
@radi2015 Oh, that makes sense. I think you'd need to try something like…
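The suggested snippet above was cut off in the thread; as a hedged sketch of the general mechanism, the assembly jar can be handed to the pyspark launcher up front instead of calling create_rf_spark_session() afterwards. The helper find_pyrasterframes_assembly is an assumption here; if it is not available in your version, locate the assembly jar shipped inside the installed pyrasterframes package instead:

```shell
# Sketch: start pyspark with the RasterFrames assembly already on the classpath.
# find_pyrasterframes_assembly is assumed to exist in pyrasterframes.utils.
RF_JAR=$(python3 -c "from pyrasterframes.utils import find_pyrasterframes_assembly; print(find_pyrasterframes_assembly())")
pyspark --jars "$RF_JAR" --driver-class-path "$RF_JAR"
```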
I have the same issue when trying to install on Databricks. When installing…
Also, it is unclear to me based on the documentation, whether it is necessary to install these java packages in order to use pyrasterframes:
The installation of the last two packages fails with the error:
Hey @mathiaskvist, that's due to stac4s in the dependencies and the JSON schema validation library we use; add the following…
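The exact snippet was elided above; as a general, hedged sketch, Spark can be pointed at additional Maven-style repositories when transitive dependencies (such as those pulled in via stac4s) are not on Maven Central. The repository URL and package coordinates below are placeholders, not the resolver from this thread:

```shell
# Placeholder sketch: add an extra Maven repository for dependency resolution.
# The URL and coordinates are illustrative; substitute the resolver that hosts
# the missing artifacts and your actual application entry point.
spark-submit \
  --repositories https://example.org/extra-maven-repo \
  --packages org.example:some-artifact_2.12:1.0.0 \
  your_app.py
```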
Following the steps in Getting Started:
python3 -m pip install pyrasterframes
import pyrasterframes
from pyrasterframes.utils import create_rf_spark_session
spark = create_rf_spark_session()
Error setting up SparkSession; cannot find the pyrasterframes assembly jar
'JavaPackage' object is not callable
I haven't been able to get pyrasterframes installed for a day.