Serve multiple Spark UIs from the same pod, host, and port

I have a problem with Apache Spark (version 3.3.1) on K8s.

In short

I want to make multiple Spark web UIs accessible from outside the K8s cluster when they run on the same pod. In addition, when I run this:

print(sc.uiWebUrl)

it should print a "clickable" URL that I can access from outside the cluster.
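
For illustration, a minimal sketch of the behaviour I am after (the hostname workspace.apps.example.com is a made-up placeholder for whatever external host would be exposed):

    from pyspark.sql import SparkSession

    # A client starts a Spark program inside the workspace pod as usual.
    spark = SparkSession.builder.appName("job-a").getOrCreate()
    sc = spark.sparkContext

    # Today this prints something like http://<pod-ip>:4040, which is only
    # reachable from inside the cluster. I want it to print an externally
    # reachable URL instead, e.g. http://workspace.apps.example.com:<port>.
    print(sc.uiWebUrl)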

Long story

I am building a workspace for people who work with Apache Spark. The workspace is a pod in K8s. I want to let them run multiple Apache Spark programs, and I want those programs' UIs to be displayed and accessible to the clients (for UX reasons).

To display it, I need the value of the Spark context's UI web URL.

The UI's web URL would look like this:

http://{driver's host or public DNS if defined}:{driver's UI port}
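
If I understand the Spark docs correctly, the two parts of that URL map to settings roughly like this (a sketch; the hostname is a placeholder, and I am assuming the SPARK_PUBLIC_DNS environment variable is picked up as long as it is set before the JVM starts):

    import os
    from pyspark.sql import SparkSession

    # Host part: if SPARK_PUBLIC_DNS is set in the driver's environment,
    # the driver advertises it in uiWebUrl instead of its own hostname/IP.
    # It has to be set before getOrCreate() launches the JVM.
    os.environ["SPARK_PUBLIC_DNS"] = "my-ingress-host.example.com"

    # Port part: spark.ui.port fixes the UI port instead of the default
    # 4040 (+1 for each extra driver already running on the same host).
    spark = (
        SparkSession.builder
        .appName("job-a")
        .config("spark.ui.port", "4040")
        .getOrCreate()
    )

    print(spark.sparkContext.uiWebUrl)
    # -> http://my-ingress-host.example.com:4040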

I have tried setting the public DNS to the host value of an Ingress I defined, but an Ingress can only serve ports 80 (HTTP) and/or 443 (HTTPS).

So when I run multiple Apache Spark programs, it does not work: I can't serve multiple ports under the same host.
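
The only Spark-side knob I have found that points towards path-based routing instead of port-based routing is spark.ui.proxyBase (a sketch below; the /job-a prefix is a placeholder, and I don't know how to wire this up to an OpenShift Route, which is part of my question):

    from pyspark.sql import SparkSession

    # Sketch: give each program a URL path prefix instead of its own port,
    # hoping one host on 443 could then fan out by path. One prefix per program.
    spark = (
        SparkSession.builder
        .appName("job-a")
        .config("spark.ui.proxyBase", "/job-a")  # links in the UI become /job-a/...
        .getOrCreate()
    )

    # The UI itself still binds its own port (4040, 4041, ...); proxyBase only
    # rewrites the links inside the pages, so something still has to proxy
    # https://host/job-a -> pod:4040.
    print(spark.sparkContext.uiWebUrl)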

Other points

  • I don't have an NGINX ingress controller in my K8s cluster; I have an OpenShift cluster.
  • I would prefer to use K8s objects as much as possible.
  • I would prefer that the clients configure Apache Spark as little as possible.
  • I would prefer that the UI web URL of the Spark context be set when the context is created, i.e. before the pyspark-shell displays its welcome screen (see the sketch after this list).
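
To make the last two points concrete, this is the kind of helper I imagine baking into the workspace image so that clients configure nothing themselves (a sketch; workspace_session, the hostname, and the fixed port are placeholders I made up):

    import os
    from pyspark.sql import SparkSession

    def workspace_session(app_name: str, ui_port: int) -> SparkSession:
        # Advertise the externally visible host before the JVM starts, so
        # that sc.uiWebUrl is already correct on the welcome screen.
        os.environ.setdefault("SPARK_PUBLIC_DNS", "workspace.apps.example.com")
        return (
            SparkSession.builder
            .appName(app_name)
            # One fixed, known UI port per program instead of the 4040+n default.
            .config("spark.ui.port", str(ui_port))
            .getOrCreate()
        )

    spark = workspace_session("job-a", 4041)
    print(spark.sparkContext.uiWebUrl)  # http://workspace.apps.example.com:4041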

Thanks

Thanks ahead for reading my problem and for helping me. Any help would be appreciated.
