The following sections explain how Spark performance can be monitored via JMX.
Setting up JMX to monitor Spark
To enable Spark monitoring via JMX, follow the steps below:
If your DAS deployment is a cluster, the following configurations must be added to all nodes of the cluster.
- Stop the DAS server(s). For detailed instructions, see Running the Product.
- Create a file named metrics.properties, include the required metrics configuration in it, and save it in the <DAS_HOME>/repository/conf/analytics/spark directory.
- In the <DAS_HOME>/repository/conf/analytics/spark/spark-defaults.conf file, add a reference to the metrics.properties file you created in the previous step, under the appropriate section of that file.
- Start the DAS server(s).
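To make the steps above concrete, the following is a sketch of what the two files might contain. The JmxSink class name and the spark.metrics.conf property come from the standard Apache Spark metrics system; verify the exact values against the documentation for your DAS version.

```properties
# metrics.properties — enable Spark's JMX sink for all metric instances
# (master, worker, driver, executor)
*.sink.jmx.class=org.apache.spark.metrics.sink.JmxSink
```

```properties
# spark-defaults.conf — reference the metrics.properties file created above
spark.metrics.conf <DAS_HOME>/repository/conf/analytics/spark/metrics.properties
```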
Viewing information in JConsole
Once you have applied the configurations required to enable Spark monitoring via JMX as instructed in Setting up JMX to monitor Spark, you can view information relating to Spark performance as described in the WSO2 Product Administration Guide - JMX-Based Monitoring.
If your DAS deployment is a cluster, you can view information for the following additional process:
- org.apache.spark.executor.CoarseGrainedExecutorBackend: This displays information related to the Spark worker JVM.
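Beyond browsing these MBeans interactively in JConsole, they can also be read programmatically through the standard javax.management API. The sketch below queries the local platform MBean server; when run inside (or attached to) a Spark JVM with the JMX sink enabled, the Spark meters and gauges appear under the "metrics" domain. The domain name follows Spark's default JmxSink behavior and is an assumption here; adjust it if your deployment namespaces metrics differently.

```java
import java.lang.management.ManagementFactory;
import java.util.Set;
import javax.management.MBeanServerConnection;
import javax.management.ObjectName;

public class JmxQueryDemo {
    public static void main(String[] args) throws Exception {
        // For a remote DAS node you would connect with JMXConnectorFactory and a
        // JMXServiceURL; here we query the local platform MBean server so the
        // sketch stays self-contained and runnable.
        MBeanServerConnection mbsc = ManagementFactory.getPlatformMBeanServer();

        // Spark's JmxSink registers its meters and gauges as MBeans; with the
        // default configuration they are registered under the "metrics" domain
        // (assumed here). In a plain JVM this query simply returns an empty set.
        Set<ObjectName> sparkBeans = mbsc.queryNames(new ObjectName("metrics:*"), null);
        System.out.println("Spark metric MBeans visible: " + sparkBeans.size());

        // Standard JVM domains (memory, threads, GC) are always present and are
        // the same beans JConsole shows on its MBeans tab.
        for (String domain : mbsc.getDomains()) {
            System.out.println(domain);
        }
    }
}
```

The same queryNames call works unchanged against a remote MBeanServerConnection, so the snippet can be pointed at a DAS worker node to inspect the org.apache.spark.executor.CoarseGrainedExecutorBackend process as well.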