• java.lang.NoSuchFieldError: JAVA_9

Problem: this error occurs mainly because Spark 3.x depends on the commons-lang3 package.

Solution: try the usual fixes first: resolve the version conflict, add commons-lang3 if it is missing, and so on.

If that still does not work, check whether a hive-exec dependency is present. Open the jar and you will find that it also contains commons-lang3 inside, but its JavaVersion class is different!

If that is your case, simply exclude hive-exec.
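Because hive-exec is a fat jar that bundles commons-lang3 classes directly, excluding the whole hive-exec artifact from whichever dependency drags it in is the reliable fix. A minimal sketch, assuming hive-exec arrives transitively (the spark-hive parent and versions shown here are placeholders; check `mvn dependency:tree` for the real culprit):

```xml
<!-- Hypothetical parent; replace with whichever dependency actually
     pulls in hive-exec. hive-exec bundles its own commons-lang3
     classes, so excluding the whole artifact is the reliable fix. -->
<dependency>
    <groupId>org.apache.spark</groupId>
    <artifactId>spark-hive_2.12</artifactId>
    <version>3.0.0</version>
    <exclusions>
        <exclusion>
            <groupId>org.apache.hive</groupId>
            <artifactId>hive-exec</artifactId>
        </exclusion>
    </exclusions>
</dependency>
```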

<!-- https://mvnrepository.com/artifact/org.apache.hadoop/hadoop-common -->
<dependency>
    <groupId>org.apache.hadoop</groupId>
    <artifactId>hadoop-common</artifactId>
    <version>3.1.1</version>
    <exclusions>
        <exclusion>
            <artifactId>commons-lang3</artifactId>
            <groupId>org.apache.commons</groupId>
        </exclusion>
        <exclusion>
            <groupId>com.fasterxml.jackson.module</groupId>
            <artifactId>*</artifactId>
        </exclusion>
        <exclusion>
            <groupId>com.fasterxml.jackson.core</groupId>
            <artifactId>*</artifactId>
        </exclusion>
    </exclusions>
</dependency>
  • com.fasterxml.jackson.databind.JsonMappingException: Scala module
    2.12.0 requires Jackson Databind version >= 2.12.0 and < 2.13.0

Problem: a Jackson multi-version conflict.
Solution: exclude Jackson from all Hadoop components and, at the same time, upgrade Jackson in the pom.

<dependency>
    <groupId>org.apache.hadoop</groupId>
    <artifactId>hadoop-common</artifactId>
    <version>3.1.1</version>
    <exclusions>
        <exclusion>
            <artifactId>commons-lang3</artifactId>
            <groupId>org.apache.commons</groupId>
        </exclusion>
        <exclusion>
            <groupId>com.fasterxml.jackson.module</groupId>
            <artifactId>*</artifactId>
        </exclusion>
        <exclusion>
            <groupId>com.fasterxml.jackson.core</groupId>
            <artifactId>*</artifactId>
        </exclusion>
    </exclusions>
</dependency>

<dependency>
    <groupId>com.fasterxml.jackson.core</groupId>
    <artifactId>jackson-annotations</artifactId>
    <version>2.12.3</version>
</dependency>
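Note that the exception complains about jackson-databind specifically, so pinning only jackson-annotations is often not enough. A sketch that also pins databind and core to the same 2.12.x line (2.12.3 is chosen to match jackson-annotations; adjust to whatever your Scala module requires):

```xml
<dependency>
    <groupId>com.fasterxml.jackson.core</groupId>
    <artifactId>jackson-databind</artifactId>
    <version>2.12.3</version>
</dependency>
<dependency>
    <groupId>com.fasterxml.jackson.core</groupId>
    <artifactId>jackson-core</artifactId>
    <version>2.12.3</version>
</dependency>
```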
  • Reading a Hudi table throws: java.lang.NoSuchMethodError:
    org.json4s.JsonDSL$.seq2jvalue(Lscala/collection/Iterable;Lscala/Function1;)Lorg/json4s/JsonAST$JArray;

Problem: the org.json4s package declared in the pom conflicts with the copy that Spark 3.0 ships with.
Solution: comment out all org.json4s dependencies in the pom.
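A minimal sketch of what commenting it out looks like, assuming the pom declared json4s-jackson (the artifact name and version here are hypothetical; use whatever your pom actually contains):

```xml
<!-- Commented out: Spark 3.x already provides json4s on its classpath,
     and a second copy causes the NoSuchMethodError above.
<dependency>
    <groupId>org.json4s</groupId>
    <artifactId>json4s-jackson_2.12</artifactId>
    <version>3.6.6</version>
</dependency>
-->
```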

Fix for javax.net.ssl.SSLException: closing inbound before receiving peer's close_notify: add useSSL=false to the MySQL JDBC URL, e.g. jdbc:mysql://localhost:3306/<database name>?useUnicode=true&characterEncoding=utf-8&serverTimezone=GMT%2B8&useSSL=false
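The fix boils down to appending useSSL=false to the connection URL. A minimal standalone sketch of assembling such a URL in Java (the class name, host, port, and database name are placeholders; no MySQL driver is needed just to build the string):

```java
public class JdbcUrl {
    // Builds a MySQL JDBC URL with useSSL=false, which avoids the
    // "closing inbound before receiving peer's close_notify" SSLException
    // raised when the server and Connector/J disagree on TLS shutdown.
    static String url(String host, int port, String db) {
        return "jdbc:mysql://" + host + ":" + port + "/" + db
                + "?useUnicode=true&characterEncoding=utf-8"
                + "&serverTimezone=GMT%2B8&useSSL=false";
    }

    public static void main(String[] args) {
        System.out.println(url("localhost", 3306, "mydb"));
    }
}
```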


  • Reading Elasticsearch data that contains empty-string fields with Spark 3.0 throws:
    java.lang.RuntimeException: scala.None$ is not a valid external type for schema of string

Problem: the elasticsearch-spark-30_2.12 connector

<dependency>
    <groupId>org.elasticsearch</groupId>
    <artifactId>elasticsearch-spark-30_2.12</artifactId>
    <version>7.14.0</version>
    <exclusions>
        <exclusion>
            <groupId>org.slf4j</groupId>
            <artifactId>slf4j-log4j12</artifactId>
        </exclusion>
        <exclusion>
            <groupId>log4j</groupId>
            <artifactId>log4j</artifactId>
        </exclusion>
    </exclusions>
</dependency>

There is a publicly tracked defect for this on GitHub: https://github.com/elastic/elasticsearch-hadoop/issues/1635

Solution:
.config("es.field.read.empty.as.null", "no")
es.field.read.empty.as.null (default: yes) controls whether elasticsearch-hadoop treats empty fields as null.
Note that with this setting, an empty field is read as the empty string '' rather than null.
https://github.com/cjuexuan/mynote/issues/72
Alternatively, create the same package path in your own project and override ScalaValueReader.scala, changing

def nullValue() = { None }

to

def nullValue() = { null }

  • java.lang.ClassNotFoundException:
    org.apache.logging.log4j.core.pattern.ArrayPatternConverter

Problem: a log4j jar conflict.
Solution: comment out the log4j dependency in the pom and use the jars bundled with Spark.

  • You may get a different result due to the upgrading of Spark 3.0: Fail to parse '2012-8-7 13:41:28' in the new parser. You can set spark.sql.legacy.timeParserPolicy to LEGACY to restore the behavior before Spark 3.0, or set to CORRECTED and treat it as an invalid datetime string.

Problem: Spark 3.0 replaced the legacy datetime parser with a stricter one that rejects dates like '2012-8-7 13:41:28'.
Solution: config("spark.sql.legacy.timeParserPolicy", "LEGACY")
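Spark 3.0's new parser is built on java.time and enforces field widths. A small standalone Java sketch (the class name is illustrative) showing why the single-digit month and day in '2012-8-7' fail a strict yyyy-MM-dd pattern:

```java
import java.time.LocalDateTime;
import java.time.format.DateTimeFormatter;
import java.time.format.DateTimeParseException;

public class StrictParseDemo {
    // java.time parses "MM" and "dd" as fixed-width two-digit fields,
    // much like Spark 3.0's new parser, so single-digit months fail.
    static boolean parses(String s) {
        try {
            LocalDateTime.parse(s, DateTimeFormatter.ofPattern("yyyy-MM-dd HH:mm:ss"));
            return true;
        } catch (DateTimeParseException e) {
            return false;
        }
    }

    public static void main(String[] args) {
        System.out.println(parses("2012-08-07 13:41:28")); // true
        System.out.println(parses("2012-8-7 13:41:28"));   // false
    }
}
```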
