spark-monitoring
Log Analytics query for memory usage
SparkMetric_CL
| where name_s contains "driver.jvm.total."
| where executorId_s == "driver"
| extend memUsed_GB = value_d / 1e9
| project TimeGenerated, name_s, memUsed_GB
| summarize max(memUsed_GB) by tostring(name_s), bin(TimeGenerated, 1m)
Hi, I know the driver's memory consumption can be queried with the above. I am now using Azure Log Analytics to monitor Databricks on a cluster with one driver and one worker node. What about the worker node? How can I query memory usage for the worker node (or the total memory usage across the cluster)?
Many thanks!
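For what it's worth, here is a rough sketch of what I imagine an executor-side query might look like, adapted from the driver query above. I am assuming executor JVM metrics land in the same SparkMetric_CL table with numeric executorId_s values (e.g. "0", "1") rather than "driver", and that the metric names follow the same "jvm.total." pattern; I have not verified these metric names.

```
SparkMetric_CL
// assumption: executor metrics use the same jvm.total.* names as the driver
| where name_s contains "jvm.total.used"
// assumption: executors report with numeric IDs, so exclude the driver
| where executorId_s != "driver"
| extend memUsed_GB = value_d / 1e9
| summarize max(memUsed_GB) by executorId_s, bin(TimeGenerated, 1m)
```

If that holds, a cluster-wide total could presumably be obtained by summing the per-executor (plus driver) values in each time bin instead of splitting by executorId_s.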