Spark 2.3 build fails with a net.alchim31.maven:scala-maven-plugin NullPointerException


While compiling the Spark project with mvn clean package -DskipTests, the build fails with the error in the title.

Re-running with mvn clean package -DskipTests -e to get the full details shows the error below.

Scala project using sbt throws NullPointerException

java.lang.NullPointerException
    at java.base/java.util.regex.Matcher.getTextLength(Matcher.java:1769)
    at java.base/java.util.regex.Matcher.reset(Matcher.java:416)
    at java.base/java.util.regex.Matcher.<init>(Matcher.java:253)
    at java.base/java.util.regex.Pattern.matcher(Pattern.java:1130)
    at java.base/java.util.regex.Pattern.split(Pattern.java:1249)
    at java.base/java.util.regex.Pattern.split(Pattern.java:1322)
    at sbt.IO$.pathSplit(IO.scala:797)
    at sbt.IO$.parseClasspath(IO.scala:912)
    at sbt.compiler.CompilerArguments.extClasspath(CompilerArguments.scala:66)
    ...
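Before changing anything, it helps to confirm which JDK Maven is actually picking up. A quick check (the output paths will of course differ on your machine):

    # the "Java version" and "Java home" lines show the JDK Maven resolves
    mvn -version

    # the JDK on the PATH and the one JAVA_HOME points to
    java -version
    echo $JAVA_HOME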

It turned out that JDK 11 was being used; switching to JDK 8 solved the problem. The likely reason is that the older sbt/zinc compiler bundled with this version of scala-maven-plugin builds the extension classpath from the java.ext.dirs system property, which JDK 9+ removed, so Pattern.split is handed a null string. One thing worth stressing: if a machine has two JDK versions installed, you must explicitly select the JDK for the IDEA project (File -> Project Structure, then set the Project SDK).
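For the command-line build, the same idea applies: point JAVA_HOME at a JDK 8 installation before invoking Maven. A minimal sketch, where the install path is only an example and must be adjusted to your environment:

    # switch the shell to JDK 8 (example path; adjust to your install)
    export JAVA_HOME=/usr/lib/jvm/java-8-openjdk-amd64
    export PATH=$JAVA_HOME/bin:$PATH

    # confirm Maven now reports Java version 1.8.x
    mvn -version

    # rebuild Spark
    mvn clean package -DskipTests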

 

References:

https://stackoverflow.com/questions/50559843/scala-project-using-sbt-throws-nullpointerexception

https://stackoverflow.com/questions/28004552/problems-while-compiling-spark-with-maven
