
How to use local IntelliJ IDEA to write Spark (Scala)? #11

@dukechain2333

Description


Hello! I'm new to Docker and big data development.

I want to use my local IntelliJ IDEA to connect to the master and write Spark code in Scala. I've already created a Scala project in IDEA; however, when I try to run the code below, an error occurs.

import org.apache.spark.sql.SparkSession

object test1 {
  def main(args: Array[String]): Unit = {

    val spark = SparkSession
      .builder()
      .appName("Spark Hive Example")
      .master("localhost:7077")
      .config("spark.sql.warehouse.dir", "/usr/hive/warehouse")
      .enableHiveSupport()
      .getOrCreate()
    spark.sparkContext.setLogLevel("WARN")

    val data = spark.sql("show databases")
    data.show(10)
  }
}

Here is the error.

Exception in thread "main" java.lang.IllegalArgumentException: Unable to instantiate SparkSession with Hive support because Hive classes are not found.
	at org.apache.spark.sql.SparkSession$Builder.enableHiveSupport(SparkSession.scala:869)
	at test1$.main(test1.scala:12)
	at test1.main(test1.scala)

I think I may need some help:(
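
The `IllegalArgumentException` in the trace means `enableHiveSupport()` could not find the Hive classes on the project classpath — they ship in the separate `spark-hive` module, which the IDEA project has to declare as a dependency. A minimal `build.sbt` sketch, assuming Spark 3.5.x on Scala 2.12 (the version numbers are placeholders; they must match the Spark and Scala versions the cluster runs):

```scala
// build.sbt — versions are assumptions; align them with the cluster
ThisBuild / scalaVersion := "2.12.18"

libraryDependencies ++= Seq(
  "org.apache.spark" %% "spark-core" % "3.5.1",
  "org.apache.spark" %% "spark-sql"  % "3.5.1",
  // spark-hive provides the classes that enableHiveSupport() checks for
  "org.apache.spark" %% "spark-hive" % "3.5.1"
)
```

Separately, a standalone master URL needs the `spark://` scheme, e.g. `.master("spark://localhost:7077")`; a bare `localhost:7077` will fail master-URL parsing once the Hive classes are found.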
