I'm trying to run a custom HDFS reader class in PySpark. This class is written in Java and I need to access it from PySpark, either from the shell or with spark-submit.
In PySpark, I retrieve the JavaGateway from the SparkContext (sc._gateway).
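In other words, the handles I'm working with are roughly:

gateway = sc._gateway  # the Py4J JavaGateway that PySpark maintains internally
jvm = gateway.jvm      # JVM view, used below to reach Java classes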
Say I have a class:
package org.foo.module;

public class Foo {
    public int fooMethod() {
        return 1;
    }
}
I've tried packaging it into a jar and passing it with the --jars option to pyspark.
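Concretely, the jar is built and the shell is launched roughly like this (the file names are placeholders for my actual setup):

javac org/foo/module/Foo.java
jar cf foo.jar org/foo/module/Foo.class
pyspark --jars foo.jar

Then, in the shell, I run: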
from py4j.java_gateway import java_import
jvm = sc._gateway.jvm
java_import(jvm, "org.foo.module.*")
foo = jvm.org.foo.module.Foo()
But I get the error:
Py4JError: Trying to call a package.
Can someone help with this? Thanks.