hadoop_exporter
A Hadoop exporter for Prometheus that scrapes Hadoop metrics (HDFS, YARN, MapReduce, HBase, etc.) from the JMX URLs of Hadoop components.
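As context for the issues below: Hadoop daemons expose their metrics as JSON at a /jmx endpoint, and the exporter reads the "beans" array from that payload. A minimal sketch of that fetch-and-parse step (the hostname and port are illustrative assumptions, not taken from this project):

```python
import json
import urllib.request


def parse_jmx(payload):
    """Extract the list of MBeans from a Hadoop /jmx JSON payload."""
    return json.loads(payload)["beans"]


def fetch_jmx_beans(url):
    """Fetch a Hadoop component's /jmx endpoint and return its beans.

    Example URL (assumed, not from this repo): http://namenode:50070/jmx
    """
    with urllib.request.urlopen(url) as resp:
        return parse_jmx(resp.read())
```

The exporter then walks each bean's attributes and turns them into Prometheus gauges; the sketch above only covers the scrape side.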
In my Dockerfile I can only trace one HDFS service per hadoop_exporter container, which means if I need to get all the metrics from a 12-container HDFS...
Is the code still unfinished? It looks like a separate service is expected to be running to serve requests to /alert/getservicesbyhost.
  File "/home/gw/module/hadoop_exporter-master/cmd/hive_server.py", line 52, in collect
    self._get_metrics(beans)
  File "/home/gw/module/hadoop_exporter-master/cmd/hive_server.py", line 160, in _get_metrics
    self._get_node_metrics(beans[i], service, host)
UnboundLocalError: local variable 'host' referenced before assignment
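This traceback is the classic UnboundLocalError pattern: a variable is only assigned inside a conditional branch, so when no input matches, a later use of the variable fails. A hedged sketch of the likely shape of the bug and its fix (the function and key names are hypothetical, not copied from hive_server.py):

```python
def get_host_buggy(beans):
    # 'host' is only bound when some bean carries a "Hostname" key,
    # so an empty or non-matching beans list triggers UnboundLocalError.
    for bean in beans:
        if "Hostname" in bean:
            host = bean["Hostname"]
    return host


def get_host_fixed(beans):
    # Initialize 'host' before the loop so it always has a value,
    # even when no bean provides a hostname.
    host = "unknown"
    for bean in beans:
        if "Hostname" in bean:
            host = bean["Hostname"]
    return host
```

If the actual code in _get_metrics follows this shape, initializing host before the bean loop (or skipping the metric when no hostname is found) would resolve the crash.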
Hi! Could you please specify the license? I am wondering if I can use it in our project. Thanks!
Hello, may I ask how the visualization is done? How well does it work with Grafana? Thanks.
Running the command produces an error:
[root@rhel6-140 hadoop_exporter-master]# python hadoop_exporter.py -c bigdata -hdfs http://XXX:50070/jmx
usage: hadoop_exporter.py [-h] [-c cluster_name] -s services_api
                          [-hdfs namenode_jmx_url] [-rm resourcemanager_jmx_url]
                          [-dn datanode_jmx_url] [-jn journalnode_jmx_url]
                          [-mr mapreduce2_jmx_url] [-nm nodemanager_jmx_url]
                          [-hbase hbase_jmx_url]...
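The usage line itself explains the failure: -s services_api is printed without brackets, which in argparse notation means it is required, so omitting it makes the script exit with this message. A minimal sketch of how such a parser behaves (the option names follow the usage output above; the defaults are assumptions):

```python
import argparse

# Hypothetical reconstruction of the exporter's argument parser:
# '-s' is declared required, matching the bracket-free "-s services_api"
# in the printed usage string.
parser = argparse.ArgumentParser(prog="hadoop_exporter.py")
parser.add_argument("-c", metavar="cluster_name", default="cluster")
parser.add_argument("-s", metavar="services_api", required=True)
parser.add_argument("-hdfs", metavar="namenode_jmx_url")

# Calling parser.parse_args(["-c", "bigdata", "-hdfs", "http://XXX:50070/jmx"])
# would exit with the usage message because -s is missing.
```

So the fix on the user's side is to pass -s with the services API URL in addition to -c and -hdfs.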