Ports Used by Spark

To run a Spark job from a client node, the cluster must have ephemeral ports open to the client from which you submit the job, because Spark services bind to random ephemeral ports by default.

If you do not want to open the entire ephemeral port range, you can use configuration properties to restrict Spark to a specific range of ports.

To set the ports to specific values, use the spark.driver.port, spark.blockManager.port, and spark.port.maxRetries properties. The spark.port.maxRetries property, which defaults to 16, controls how many successive ports Spark tries if the configured port is already in use.

For example, to restrict the block manager to a range of 200 ports starting at 40000, set spark.blockManager.port = 40000 and spark.port.maxRetries = 200. Spark first tries port 40000 and, if it is unavailable, retries successively higher port numbers until it finds a free port or exhausts the retry limit, so only that range needs to be open in the firewall.
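The same properties can be set programmatically before the Spark session starts. The following is a minimal sketch in Scala, assuming the example values above; the driver port and application name are illustrative placeholders, not values from this documentation.

```scala
import org.apache.spark.SparkConf
import org.apache.spark.sql.SparkSession

// Confine Spark's services to a known port range so the firewall does
// not need the whole ephemeral range open. Values are illustrative.
val conf = new SparkConf()
  .set("spark.driver.port", "41000")       // illustrative driver port (assumption)
  .set("spark.blockManager.port", "40000") // first port the block manager tries
  .set("spark.port.maxRetries", "200")     // retry up to 200 successive ports

val spark = SparkSession.builder()
  .appName("port-range-example")           // placeholder application name
  .config(conf)
  .getOrCreate()
```

These properties can also be passed on the command line with spark-submit --conf, or set cluster-wide in spark-defaults.conf.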

Refer to the Apache Spark configuration documentation for more information.