When I try to add Repository support to my Spring Boot application with Spark integrated, the application fails to start and I get the exception trace below.
org.apache.spark.SparkException: Task not serializable
    at org.apache.spark.util.ClosureCleaner$.ensureSerializable(ClosureCleaner.scala:416)
    at org.apache.spark.util.ClosureCleaner$.clean(ClosureCleaner.scala:406)
Caused by: java.io.NotSerializableException: org.springframework.data.jpa.repository.support.CrudMethodMetadataPostProcessor$CrudMethodMetadataPopulatingMethodInterceptor
    at java.io.ObjectOutputStream.writeObject0(ObjectOutputStream.java:1184)
This is thrown when the code below executes:
JavaDStream<String> changePayStream = records.map(new Function<ConsumerRecord<String, String>, String>() {

    private static final long serialVersionUID = 1L;

    @Override
    public String call(ConsumerRecord<String, String> kafkaRecord) throws Exception {
        return kafkaRecord.offset() + "~>" + kafkaRecord.partition() + "~>" + kafkaRecord.key() + "~>"
                + kafkaRecord.value();
    }
});
If I remove the Spring Boot out-of-the-box (OOTB) Repository code, this exception is not thrown.
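For context, the repository and the Spark code live in the same Spring component, roughly as in the sketch below; the class, field, and method names here are simplified placeholders, not my exact code.

import org.apache.kafka.clients.consumer.ConsumerRecord;
import org.apache.spark.api.java.function.Function;
import org.apache.spark.streaming.api.java.JavaDStream;
import org.springframework.beans.factory.annotation.Autowired;
import org.springframework.stereotype.Service;

@Service
public class ChangePayStreamService {

    // Spring Data JPA repository proxy injected by Boot
    @Autowired
    private XYZRepository xyzRepository;

    public void buildStream(JavaDStream<ConsumerRecord<String, String>> records) {
        // The anonymous Function is declared inside this Spring bean, so (as far as
        // I understand it) Spark has to serialize the enclosing bean, including the
        // injected repository proxy, when it ships this closure to the executors.
        JavaDStream<String> changePayStream = records.map(
                new Function<ConsumerRecord<String, String>, String>() {
                    private static final long serialVersionUID = 1L;

                    @Override
                    public String call(ConsumerRecord<String, String> kafkaRecord) throws Exception {
                        return kafkaRecord.offset() + "~>" + kafkaRecord.partition()
                                + "~>" + kafkaRecord.key() + "~>" + kafkaRecord.value();
                    }
                });
        changePayStream.print();
    }
}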
Repository Interface:

@Repository
public interface XYZRepository extends JpaRepository<XYZRepositoryEntity, Integer> {

    @Query(value = QueryConstants.QUERY_ABC, nativeQuery = true)
    Collection<AsgnCtgySubCtgyRepositoryEntity> findAllXyzNative();
}
Entity Class:

import java.io.Serializable;

import javax.persistence.Entity;
import javax.persistence.Id;

@Entity
public class XYZRepositoryEntity implements Serializable {

    private static final long serialVersionUID = 1L;

    @Id
    private Integer id;

    ......
Could you please let me know how to solve this?
Thanks, Venkatesh
I tried adding implements Serializable (both the java.io.Serializable and scala.Serializable variants) to all the entity classes and the other classes where it seemed applicable. But the exception is still reported on the OOTB class org.springframework.data.jpa.repository.support.CrudMethodMetadataPostProcessor$CrudMethodMetadataPopulatingMethodInterceptor, which is not a class I can modify.
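Concretely, that attempt on the class holding the Spark code looked roughly like the sketch below (again with placeholder names); even with that class marked Serializable, the stack trace still points at the interceptor carried by the injected repository proxy.

import java.io.Serializable;

import org.springframework.beans.factory.annotation.Autowired;
import org.springframework.stereotype.Service;

@Service
public class ChangePayStreamService implements Serializable {

    private static final long serialVersionUID = 1L;

    // Marking the bean Serializable does not help: the field below is a Spring Data
    // proxy that carries the CrudMethodMetadataPopulatingMethodInterceptor named in
    // the stack trace, and that interceptor is not serializable.
    @Autowired
    private XYZRepository xyzRepository;

    // ... the records.map(...) code shown above goes here ...
}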