@@ -8,7 +8,7 @@ By default, SparkSession is created with ``spark.master: local``, including all
 
 It is possible to alter the default [Spark Session configuration](https://spark.apache.org/docs/latest/configuration.html) using worker settings:
 
-```yaml
+```yaml title="config.yaml"
 worker:
   spark_session_default_config:
     spark.master: local
@@ -20,7 +20,7 @@ worker:
 
 For example, to run SyncMaster on Spark-on-K8s, you can use the worker image for Spark executor containers:
 
-```yaml
+```yaml title="config.yaml"
 worker:
   spark_session_default_config:
     spark.master: k8s://https://kubernetes.default.svc
@@ -41,7 +41,7 @@ worker:
 
 It is also possible to use a custom function which returns a ``SparkSession`` object:
 
-```yaml
+```yaml title="config.yaml"
 worker:
   create_spark_session_function: my_worker.spark.create_custom_spark_session
 ```
@@ -60,8 +60,8 @@ def create_custom_spark_session(
     target: ConnectionDTO,
     settings: WorkerSettings,
 ) -> SparkSession:
-    # any custom code returning SparkSession object
-    return SparkSession.builde.config(...).getOrCreate()
+    # any custom code returning SparkSession object
+    return SparkSession.builder.config(...).getOrCreate()
 ```
 
 The module with the custom function should be placed in the same Docker image or Python virtual environment used by the SyncMaster worker.
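For illustration, the ``spark_session_default_config`` block can be thought of as a plain mapping of Spark option keys to values that a custom factory merges with any per-run overrides before handing them to the builder. A minimal, pyspark-free sketch under that assumption (the ``build_spark_config`` helper and the merge behavior are hypothetical, not SyncMaster's actual API):

```python
# Hypothetical sketch: merging worker-level default Spark options with
# per-run overrides. Later (override) values win, mirroring how repeated
# SparkSession.builder.config(key, value) calls keep the last value set
# for a key.

def build_spark_config(defaults: dict, overrides: dict) -> dict:
    merged = dict(defaults)   # start from worker-level defaults
    merged.update(overrides)  # per-run overrides take precedence
    return merged

defaults = {
    "spark.master": "local",
    "spark.driver.memory": "1g",
}
overrides = {"spark.master": "k8s://https://kubernetes.default.svc"}

config = build_spark_config(defaults, overrides)
print(config["spark.master"])
```

In a real custom ``create_spark_session`` function, the resulting dict would be applied with ``SparkSession.builder.config(key, value)`` for each entry before calling ``getOrCreate()``.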