NoClassDefFoundError for embedded-kafka in scala project


I'm trying to integrate embedded Kafka into my tests in a Scala 2.11 project. However, I'm getting the following error, which I don't understand:

java.lang.NoClassDefFoundError: org/apache/zookeeper/AsyncCallback$MultiCallback

All the Kafka library versions are the same. Can you please tell me what is wrong in my code?

Here are my dependencies:

<dependency>
    <groupId>org.apache.spark</groupId>
    <artifactId>spark-sql-kafka-0-10_2.11</artifactId>
    <version>2.4.7</version>
</dependency>
<dependency>
    <groupId>org.apache.kafka</groupId>
    <artifactId>kafka-clients</artifactId>
    <version>2.4.0</version>
</dependency>
<dependency>
    <groupId>io.github.embeddedkafka</groupId>
    <artifactId>embedded-kafka_2.11</artifactId>
    <version>2.4.0</version>
    <scope>test</scope>
</dependency>

Here is how I start and stop the embedded Kafka server:

import net.manub.embeddedkafka.{EmbeddedKafka, EmbeddedKafkaConfig}
import org.apache.kafka.common.serialization.{StringDeserializer, StringSerializer}
// plus the Cucumber Before/After/Scenario imports

implicit val embedKafkaConfig = EmbeddedKafkaConfig(kafkaPort = 7000, zooKeeperPort = 7001)
implicit val stringDeserializer = new StringDeserializer()
implicit val stringSerializer = new StringSerializer()

Before("") { _: Scenario =>
  EmbeddedKafka.start()(embedKafkaConfig)
}
After("") { _: Scenario =>
  EmbeddedKafka.stop()
}

And the error:

kafka.server.KafkaServer - Fatal error during KafkaServer startup. Prepare to shutdown
java.lang.NoClassDefFoundError: org/apache/zookeeper/AsyncCallback$MultiCallback
    at kafka.zk.KafkaZkClient$.apply(KafkaZkClient.scala:1857)
    at kafka.server.KafkaServer.kafka$server$KafkaServer$$createZkClient$1(KafkaServer.scala:374)
    at kafka.server.KafkaServer.initZkClient(KafkaServer.scala:399)
    at kafka.server.KafkaServer.startup(KafkaServer.scala:207)
    at net.manub.embeddedkafka.ops.KafkaOps$class.startKafka(kafkaOps.scala:52)
    at net.manub.embeddedkafka.EmbeddedKafka$.startKafka(EmbeddedKafka.scala:50)
    at net.manub.embeddedkafka.ops.KafkaOps$class.startKafka(kafkaOps.scala:60)
    at net.manub.embeddedkafka.EmbeddedKafka$.startKafka(EmbeddedKafka.scala:50)
    at net.manub.embeddedkafka.ops.RunningKafkaOps$class.startKafka(kafkaOps.scala:88)
    at net.manub.embeddedkafka.EmbeddedKafka$.startKafka(EmbeddedKafka.scala:50)
    at net.manub.embeddedkafka.EmbeddedKafka$.start(EmbeddedKafka.scala:68)


There are 2 answers

Answer by Dmytro Mitin (best answer):

The interface org.apache.zookeeper.AsyncCallback.MultiCallback was introduced in ZooKeeper 3.4.7:

https://github.com/apache/zookeeper/blob/release-3.4.7/src/java/main/org/apache/zookeeper/AsyncCallback.java#L271

It was absent in ZooKeeper 3.4.6 and earlier:

https://github.com/apache/zookeeper/blob/release-3.4.6/src/java/main/org/apache/zookeeper/AsyncCallback.java

Based on the part of pom.xml you provided, you're using ZooKeeper 3.5.6, and that should be OK.

You should check whether, in your actual project, some dependency is still pulling in ZooKeeper 3.4.6 or earlier.


I suspect you haven't provided enough information to reproduce the problem. I can't reproduce it: https://scastie.scala-lang.org/DmytroMitin/q4BRo4taQAey47JufBc87w/1 (see also the "Build settings" tab). See How to create a Minimal, Reproducible Example. You can check this yourself in the following way: create an empty project with just your build file and your code. If you can't reproduce the behavior there, then we won't be able to either.

General reasons for a NoClassDefFoundError:

Why am I getting a NoClassDefFoundError in Java?

What causes and what are the differences between NoClassDefFoundError and ClassNotFoundException?

NoClassDefFoundError can signal incompatibilities between dependencies:

Unable to write DF in delta format on hdfs

ClassNotFoundException: breeze.storage.Zero$DoubleZero$

NoClassDefFoundError: scala/collection/TraversableOnce (Using Phantom Library To Fetch Data From ScyllaDB Cluster)

How to fix dependency injection when upgrading play framework from 2.5.x to 2.6.x Scala


  • You can run sbt dependencyTree. In Maven the equivalent is mvn dependency:tree from the maven-dependency-plugin (but note that in sbt this doesn't show provided dependencies).

How to get a dependency tree for an artifact?

  • Or print System.getProperty("java.class.path") (but this shows only the classpath as it was at JVM startup).
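As a minimal sketch (the object name is mine), splitting that property on the platform path separator makes the entries readable one per line:

```scala
// Minimal sketch: print each runtime classpath entry on its own line.
// This only reflects the classpath the JVM was started with, not
// anything loaded later by custom classloaders.
object PrintClasspath extends App {
  System.getProperty("java.class.path")
    .split(java.io.File.pathSeparator)
    .foreach(println)
}
```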

  • Or add scalacOptions += "-Ylog-classpath" to build.sbt. In Maven the equivalent should be <arg>-Ylog-classpath</arg>:

Specifying Scalac Compile-Time Option with maven-scala-plugin
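In Maven, a sketch of passing that flag via the scala-maven-plugin (plugin version omitted; adapt to your build):

```xml
<plugin>
    <groupId>net.alchim31.maven</groupId>
    <artifactId>scala-maven-plugin</artifactId>
    <configuration>
        <args>
            <!-- Logs the full compile classpath, useful for spotting stray ZooKeeper jars -->
            <arg>-Ylog-classpath</arg>
        </args>
    </configuration>
</plugin>
```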

  • Or run the following script inside the actual environment you're using:

import java.net.URLClassLoader

// Walk up the classloader hierarchy, printing the URLs each loader serves.
var cl: ClassLoader = getClass.getClassLoader
while (cl != null) {
  println(s"classloader: ${cl.getClass.getName}")
  cl match {
    case cl: URLClassLoader =>
      println("classloader urls:")
      cl.getURLs.foreach(println)
    case _ =>
      println("not URLClassLoader")
  }
  cl = cl.getParent
}

Based on the part of the dependency tree you posted, adding the following was enough to reproduce the error:

    <dependencies>
        ...
        <dependency>
            <groupId>org.apache.spark</groupId>
            <artifactId>spark-core_2.11</artifactId>
            <version>2.4.7</version>
            <scope>provided</scope>
        </dependency>
    </dependencies>

How can I override ZooKeeper 3.4.6 with 3.5.6 in test scope?

Try adding the following to pom.xml:

    <dependencies>
        ...
        <dependency>
            <groupId>org.apache.zookeeper</groupId>
            <artifactId>zookeeper</artifactId>
            <version>3.5.6</version>
            <scope>test</scope>
        </dependency>
    </dependencies>
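If a direct test-scoped dependency doesn't win Maven's conflict resolution, a `dependencyManagement` section pins the version for transitive dependencies as well (a sketch; match the version to your Kafka release):

```xml
<dependencyManagement>
    <dependencies>
        <dependency>
            <!-- Forces every transitive reference to ZooKeeper onto 3.5.6 -->
            <groupId>org.apache.zookeeper</groupId>
            <artifactId>zookeeper</artifactId>
            <version>3.5.6</version>
        </dependency>
    </dependencies>
</dependencyManagement>
```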

How to force a specific version of dependency? (sbt)

Maven: how to override the dependency added by a library (maven)

Answer by Madhawa Gunasekara:

I think a similar issue was reported earlier, caused by conflicting Kafka and ZooKeeper dependency versions; it was tracked in these old Apache issues:

https://issues.apache.org/jira/browse/BIGTOP-3208

https://github.com/apache/bigtop/pull/536

I suggest generating a dependency tree and checking that the versions are compatible in order to resolve the version conflicts.