
gradlew run issue #933

Open
MohamedRagabAnas opened this issue Sep 16, 2019 · 3 comments

@MohamedRagabAnas

I'm trying to run one of the examples inside Morpheus on a Spark cluster. I have edited the creation of the Morpheus session using the following lines of code:

// Imports as used by the Morpheus DataFrameInputExample (Morpheus/okapi package paths assumed from the 0.4.x sources)
import java.util.UUID

import org.apache.spark.SparkConf
import org.apache.spark.sql.SparkSession
import org.opencypher.morpheus.api.MorpheusSession
import org.opencypher.morpheus.api.io.{MorpheusNodeTable, MorpheusRelationshipTable}

// Spark session configured against the standalone cluster instead of local mode
val conf = new SparkConf(true)
conf.set("spark.sql.codegen.wholeStage", "true")
conf.set("spark.sql.shuffle.partitions", "12")
conf.set("spark.default.parallelism", "8")

val spark = SparkSession
  .builder()
  .config(conf)
  .master("spark://172.17.67.122:7077")
  .appName(s"morpheus-local-${UUID.randomUUID()}")
  .enableHiveSupport()
  .getOrCreate()
spark.sparkContext.setLogLevel("error")

implicit val morpheus: MorpheusSession = MorpheusSession.create(spark)

import spark.sqlContext.implicits._

// Node and relationship data as plain DataFrames
val nodesDF = spark.createDataset(Seq(
  (0L, "Alice", 42L),
  (1L, "Bob", 23L),
  (2L, "Eve", 84L)
)).toDF("id", "name", "age")

val relsDF = spark.createDataset(Seq(
  (0L, 0L, 1L, "23/01/1987"),
  (1L, 1L, 2L, "12/12/2009")
)).toDF("id", "source", "target", "since")

// Wrap the DataFrames as Morpheus tables and build a property graph
val personTable = MorpheusNodeTable(Set("Person"), nodesDF)
val friendsTable = MorpheusRelationshipTable("KNOWS", relsDF)

val graph = morpheus.readFrom(personTable, friendsTable)
val result = graph.cypher("MATCH (n:Person) RETURN n.name")
result.show

The only change is pointing the SparkSession at the master URL 'spark://172.17.67.122:7077' rather than 'local'. I run into a problem when running the example via gradlew:
./gradlew morpheus-examples:runApp -PmainClass=org.opencypher.morpheus.examples.DataFrameInputExample

While debugging, the problem turns out to be with the result.show line:

Caused by: java.lang.NoSuchMethodError: scala.Predef$.refArrayOps([Ljava/lang/Object;)Lscala/collection/mutable/ArrayOps;
        at org.opencypher.okapi.trees.AbstractTreeNode.<init>(AbstractTreeNode.scala:69)
        at org.opencypher.okapi.ir.api.expr.Expr.<init>(Expr.scala:50)
        ... 88 more

But when I change the example back to 'local', it runs correctly and no problems arise.
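For reference, a minimal sketch of the working local run, assuming the master URL really is the only difference (it reuses conf and the imports above; 'local[*]' stands in for whichever local master string the example uses):

// Same configuration, but running against a local master instead of the cluster
val sparkLocal = SparkSession
  .builder()
  .config(conf)
  .master("local[*]") // instead of spark://172.17.67.122:7077
  .appName(s"morpheus-local-${UUID.randomUUID()}")
  .enableHiveSupport()
  .getOrCreate()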

From searching around, it seems the problem is with the Scala version, although I have the following on my cluster of 3 machines:

OS: CentOS 7
Spark: version 2.4.2
Scala: version 2.12.8

I have also put the required Morpheus jars on the Spark classpath (the spark/jars directory) on all the cluster machines, master and workers, including (see the check sketched below the list):

  • morpheus-spark-cypher-0.4.3-SNAPSHOT.jar
  • okapi-api-0.4.3-SNAPSHOT.jar
  • okapi-relational-0.4.3-SNAPSHOT.jar
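One way to double-check that the class from the stack trace is actually picked up from those jars on the driver is to ask the JVM where it loaded it from; a small sketch (run e.g. in spark-shell, class name taken from the stack trace above):

// Prints the jar (or directory) that org.opencypher.okapi.trees.AbstractTreeNode was loaded from
val location = Class.forName("org.opencypher.okapi.trees.AbstractTreeNode")
  .getProtectionDomain
  .getCodeSource
  .getLocation
println(location)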

Please help me figure out the problem; I have spent a lot of time trying to solve this issue.
Thanks in advance for your help and support!

@rbramley

Hi Mohamed,

Which version of Scala was your distribution of Spark compiled against?
Look at the Spark jars in the jars directory, e.g. spark-sql_2.11-2.4.3.jar.
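A quick way to check both versions from a running spark-shell (a sketch; both need to line up with what Morpheus was built against):

// Scala version the shell/driver is running on
println(scala.util.Properties.versionNumberString) // e.g. 2.12.8
// Spark version of the cluster
println(spark.version) // e.g. 2.4.3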

I ended up compiling Spark 3.0.0-SNAPSHOT against Scala 2.12 to get a working Morpheus install. This should also solve #932 for you.

@MohamedRagabAnas (Author)

Actually, I have Spark 2.4.2, which is compiled against Scala 2.12.8 (it ships spark-sql_2.12-2.4.2.jar), but the same problem occurs. Which version of Spark do you have?
Could you help, please?

@soerenreichardt (Contributor)

Hi Mohamed,
the latest release of Morpheus uses Spark 2.4.3 with Scala 2.12.8. Could you try it with that version? You can download the correct Spark version here: https://archive.apache.org/dist/spark/spark-2.4.3/spark-2.4.3-bin-without-hadoop-scala-2.12.tgz
Unfortunately, this build does not include Hadoop, so you have to add the missing packages yourself if you need them, or build Spark directly from source.
