I've made an application with the following three bundles:

- myservice-api: contains the model class used to transport data (de.myservice.api.model.MyObject) and the service interface de.myservice.api.services.MyService with the function void save(MyObject o) (both sketched below)
- myservice-impl: contains an implementation of MyService and provides it via DOSGi
- frontend: just some code that references the MyService implementation via Blueprint and calls the save function
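For reference, a minimal sketch of the two API classes as described above (the field of MyObject is only an illustration; the class implements Serializable, which matches the Java serialization calls in the stack trace below):

    // de/myservice/api/model/MyObject.java
    package de.myservice.api.model;

    import java.io.Serializable;

    // Transport object; Serializable so it can be shipped between nodes.
    public class MyObject implements Serializable {
        private static final long serialVersionUID = 1L;

        private String name; // example field, just for illustration

        public String getName() { return name; }
        public void setName(String name) { this.name = name; }
    }

    // de/myservice/api/services/MyService.java
    package de.myservice.api.services;

    import de.myservice.api.model.MyObject;

    public interface MyService {
        void save(MyObject o);
    }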
I created a cluster using the Apache Cellar feature with two Karaf servers on different machines. The bundles are deployed like this and show the status Active:
- machine001.company.int: myservice-api, frontend
- machine002.company.int: myservice-api, myservice-impl
cluster:list-nodes shows that the cluster is set up, and with cluster:service-list I can see that my service is provided by machine002:
Service Class | Provider Node
-----------------------------------------------------------------
de.myservice.api.services.MyService | machine002.company.int:5701
The OSGi wiring itself works, but when the frontend calls the save function, I get the following exception on the service-providing instance machine002:
2017-03-14 12:55:02,250 | WARN | pool-65-thread-1 | QueueConsumer | 328 - org.apache.karaf.cellar.hazelcast - 4.0.3 |
CELLAR HAZELCAST: consumer task failed to poll the queue com.hazelcast.nio.serialization.HazelcastSerializationException: java.lang.ClassNotFoundException: de.myservice.api.model.MyObject
at com.hazelcast.internal.serialization.impl.JavaDefaultSerializers$JavaSerializer.read(JavaDefaultSerializers.java:224)[322:com.hazelcast:3.7.2]
at com.hazelcast.internal.serialization.impl.StreamSerializerAdapter.read(StreamSerializerAdapter.java:46)[322:com.hazelcast:3.7.2]
at com.hazelcast.internal.serialization.impl.AbstractSerializationService.toObject(AbstractSerializationService.java:172)[322:com.hazelcast:3.7.2]
at com.hazelcast.spi.impl.operationservice.impl.InvocationFuture.resolve(InvocationFuture.java:103)[322:com.hazelcast:3.7.2]
at com.hazelcast.spi.impl.operationservice.impl.InvocationFuture.resolveAndThrow(InvocationFuture.java:74)[322:com.hazelcast:3.7.2]
at com.hazelcast.spi.impl.AbstractInvocationFuture.get(AbstractInvocationFuture.java:158)[322:com.hazelcast:3.7.2]
at com.hazelcast.collection.impl.queue.QueueProxySupport.invokeAndGet(QueueProxySupport.java:177)[322:com.hazelcast:3.7.2]
at com.hazelcast.collection.impl.queue.QueueProxySupport.pollInternal(QueueProxySupport.java:124)[322:com.hazelcast:3.7.2]
at com.hazelcast.collection.impl.queue.QueueProxyImpl.poll(QueueProxyImpl.java:88)[322:com.hazelcast:3.7.2]
at org.apache.karaf.cellar.hazelcast.QueueConsumer.run(QueueConsumer.java:93)[328:org.apache.karaf.cellar.hazelcast:4.0.3]
at java.util.concurrent.ThreadPoolExecutor.runWorker(ThreadPoolExecutor.java:1142)[:1.8.0_45]
at java.util.concurrent.ThreadPoolExecutor$Worker.run(ThreadPoolExecutor.java:617)[:1.8.0_45]
at java.lang.Thread.run(Thread.java:745)[:1.8.0_45]
Caused by: java.lang.ClassNotFoundException: de.myservice.api.model.MyObject
at org.apache.karaf.cellar.core.utils.CombinedClassLoader.findClass(CombinedClassLoader.java:79)[326:org.apache.karaf.cellar.core:4.0.3]
at java.lang.ClassLoader.loadClass(ClassLoader.java:424)[:1.8.0_45]
at java.lang.ClassLoader.loadClass(ClassLoader.java:357)[:1.8.0_45]
at com.hazelcast.nio.ClassLoaderUtil.tryLoadClass(ClassLoaderUtil.java:151)
at com.hazelcast.nio.ClassLoaderUtil.loadClass(ClassLoaderUtil.java:120)
at com.hazelcast.nio.IOUtil$ClassLoaderAwareObjectInputStream.resolveClass(IOUtil.java:358)
at java.io.ObjectInputStream.readNonProxyDesc(ObjectInputStream.java:1613)[:1.8.0_45]
at java.io.ObjectInputStream.readClassDesc(ObjectInputStream.java:1518)[:1.8.0_45]
at java.io.ObjectInputStream.readOrdinaryObject(ObjectInputStream.java:1774)[:1.8.0_45]
at java.io.ObjectInputStream.readObject0(ObjectInputStream.java:1351)[:1.8.0_45]
at java.io.ObjectInputStream.readObject(ObjectInputStream.java:371)[:1.8.0_45]
at java.util.LinkedList.readObject(LinkedList.java:1149)[:1.8.0_45]
at sun.reflect.GeneratedMethodAccessor27.invoke(Unknown Source)
at sun.reflect.DelegatingMethodAccessorImpl.invoke(DelegatingMethodAccessorImpl.java:43)[:1.8.0_45]
at java.lang.reflect.Method.invoke(Method.java:497)[:1.8.0_45]
at java.io.ObjectStreamClass.invokeReadObject(ObjectStreamClass.java:1017)[:1.8.0_45]
at java.io.ObjectInputStream.readSerialData(ObjectInputStream.java:1896)[:1.8.0_45]
at java.io.ObjectInputStream.readOrdinaryObject(ObjectInputStream.java:1801)[:1.8.0_45]
at java.io.ObjectInputStream.readObject0(ObjectInputStream.java:1351)[:1.8.0_45]
at java.io.ObjectInputStream.defaultReadFields(ObjectInputStream.java:1993)[:1.8.0_45]
at java.io.ObjectInputStream.readSerialData(ObjectInputStream.java:1918)[:1.8.0_45]
at java.io.ObjectInputStream.readOrdinaryObject(ObjectInputStream.java:1801)[:1.8.0_45]
at java.io.ObjectInputStream.readObject0(ObjectInputStream.java:1351)[:1.8.0_45]
at java.io.ObjectInputStream.readObject(ObjectInputStream.java:371)[:1.8.0_45]
at com.hazelcast.internal.serialization.impl.JavaDefaultSerializers$JavaSerializer.read(JavaDefaultSerializers.java:219)[322:com.hazelcast:3.7.2]
... 12 more
The myservice-api bundle is deployed on both machines, so the class should be available. Is there something else I must configure so that Hazelcast knows where to load the classes from for (de)serialization?
It looks like your API bundle doesn't actually export the required packages, so make sure it really does. And since things work on the "server" side, your implementation bundle probably has those API packages embedded instead of importing them. So make sure the exports/imports of your bundles are correct.
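Concretely, a rough sketch of the relevant manifest headers (bundle symbolic names taken from the question, version attributes omitted): the API packages should be exported only by the API bundle and imported, not embedded, by the other bundles.

myservice-api (MANIFEST.MF, relevant headers only):

    Bundle-SymbolicName: myservice-api
    Export-Package: de.myservice.api.model,de.myservice.api.services

myservice-impl and frontend (import the API packages instead of embedding them):

    Bundle-SymbolicName: myservice-impl
    Import-Package: de.myservice.api.model,de.myservice.api.services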