I am facing an issue that I am not able to resolve. I am writing Java Spark code with MySQL integration, and I suspect only a small fix is needed.
package JavaSpark.Javs.SQL;

import java.util.Properties;

import org.apache.spark.sql.Dataset;
import org.apache.spark.sql.Row;
import org.apache.spark.sql.SparkSession;

public class sparkSqlMysql {

    private static final org.apache.log4j.Logger LOGGER =
            org.apache.log4j.Logger.getLogger(sparkSqlMysql.class);

    private static final String MYSQL_CONNECTION_URL = "jdbc:mysql://localhost:3306/BCG";
    private static final String MYSQL_USERNAME = "root";
    private static final String MYSQL_PWD = "mypassword";

    private static final SparkSession sparkSession =
            SparkSession.builder().master("local[*]").appName("Spark2JdbcDs").getOrCreate();

    public static void main(String[] args) {
        // JDBC connection properties
        final Properties connectionProperties = new Properties();
        connectionProperties.put("user", MYSQL_USERNAME);
        connectionProperties.put("password", MYSQL_PWD);
        connectionProperties.put("driver", "com.mysql.jdbc.Driver");

        final String dbTable = "select * from testLoad";

        // Load MySQL query result as Dataset
        Dataset<Row> jdbcDF =
                sparkSession.read()
                        .jdbc(MYSQL_CONNECTION_URL, dbTable, connectionProperties);
    }
}
When I execute this, it gives me the following error:
You have an error in your SQL syntax; check the manual that corresponds to your MySQL server version for the right syntax to use near '' at line 1
at com.mysql.cj.jdbc.exceptions.SQLError.createSQLException(SQLError.java:120)
What is wrong and how can I fix it?
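From what I can tell, the second argument to read().jdbc() is treated as a table name that Spark wraps in its own SELECT ... FROM query, so passing a raw SELECT statement there may be what produces the invalid SQL. Here is a minimal sketch of what I think the fix might look like, assuming the argument has to be either a plain table name or a parenthesized subquery with an alias (the alias name testLoad_alias is just a placeholder I made up). Since the stack trace mentions com.mysql.cj, I also wonder whether the driver class should be com.mysql.cj.jdbc.Driver for Connector/J 8.

// Pass either the plain table name or a parenthesized subquery with an alias,
// so Spark's generated "SELECT * FROM <dbTable>" stays valid MySQL.
final String dbTable = "(select * from testLoad) as testLoad_alias"; // or simply "testLoad"

Dataset<Row> jdbcDF =
        sparkSession.read()
                .jdbc(MYSQL_CONNECTION_URL, dbTable, connectionProperties);

// Print a few rows to confirm the load worked.
jdbcDF.show();

Is that the right direction, or is something else going on?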