To add, the server version:
[root@node0002 server]# ./n9e-server -v
version: 5.0.0-rc6
prometheus-exporter-collector is the directly downloaded release binary.
In nightingale 5.0.0 the data structure changed, so this is no longer supported.
Version 5.0 no longer needs prometheus-exporter-collector: 5.0 uses Prometheus directly as its storage, so you can configure the scrape rules directly in prometheus.yml.
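As a minimal sketch of what that scrape configuration might look like (the job name and the exporter address `localhost:9121` are assumptions, not taken from this thread; adjust them to your deployment):

```yaml
# prometheus.yml (fragment) — hypothetical scrape job for a redis_exporter
scrape_configs:
  - job_name: "redis"
    scrape_interval: 15s
    static_configs:
      - targets: ["localhost:9121"]  # assumed redis_exporter address
```

With a job like this in place, Prometheus scrapes the exporter itself and no intermediate collector is needed.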
1. The data collected via prometheus-exporter-collector looks like this:
[{
"metric": "redis_commands_processed_total",
"endpoint": "127.0.0.1",
"timestamp": 1631013680,
"step": 0,
"value": 1970,
"counterType": "COUNTER",
"tags": "aaa=hellw,dept=cloud",
"tagsMap": {},
"extra": ""
},
{
"metric": "redis_memory_used_peak_bytes",
"endpoint": "127.0.0.1",
"timestamp": 1631013680,
"step": 0,
"value": 813360,
"counterType": "GAUGE",
"tags": "aaa=hellw,dept=cloud",
"tagsMap": {},
"extra": ""
}
]
2. But the server log complains that tags should be a map. Is this a known issue?
json: cannot unmarshal string into Go struct field MetricValue.tags of type map[string]string