Is it possible to access Elasticsearch's internal stats via Kibana

I can see from querying our Elasticsearch nodes that they contain internal statistics showing, for example, disk, memory and CPU usage (via the GET _nodes/stats API).
Is there any way to access these in Kibana 4?

Not directly, as Elasticsearch doesn't natively push its internal statistics to an index. However, you could easily set something like this up on a *nix box:
Poll your Elasticsearch box via REST periodically (say, once a minute). The /_nodes/stats or /_cluster/health endpoints probably contain what you're after.
Pipe these to a log file in a simple CSV format along with a timestamp (a minimal sketch follows these steps).
Point logstash to these log files and forward the output to your ElasticSearch box.
Graph your data.
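A minimal sketch of the polling step, assuming curl and jq are available and Elasticsearch listens on localhost:9200; run it from cron once a minute. The stats field names vary between ES versions, so treat the jq paths as illustrative and adjust them to your cluster's actual _nodes/stats output:

#!/bin/sh
# Append one CSV row per node: timestamp, node name, heap used, CPU percent.
TS=$(date +%s)
curl -s 'http://localhost:9200/_nodes/stats' \
  | jq -r --arg ts "$TS" \
      '.nodes[] | [$ts, .name, .jvm.mem.heap_used_in_bytes, .os.cpu.percent] | @csv' \
  >> /var/log/es-stats.csv

Logstash can then tail /var/log/es-stats.csv with a csv filter and ship the rows back into Elasticsearch for graphing in Kibana.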

Related

how to log nginx.workers.fds_count

I'm trying to log the number of file descriptors nginx is using. The docs suggest I can access this data with nginx.workers.fds_count (docs), but that doesn't result in any useful data. How do I access it?

After a certain limit it should point to a different VM

I have a file system in which files are stored under a numeric ID (a SQL index), but my VM is full and I can't shift my files to a different cloud or anything.
My file URLs look like
https://example.com/5374/randomstring.jpg
where 5374 is the file number saved in the SQL DB and the rest is a generated random string.
What I'm planning to do is redirect with nginx: right now I have files up to 56770 on one VM; when a user uploads, the new file will be saved on a different VM, and when a user wants to access 56771, nginx should point to that VM.
You will make your life easier by choosing the cutoff point yourself; it's not essential, but it will make the regular expression a lot more concise.
If you said 56000 and above was on VM2, then your regex is as simple as /(5[6-9][0-9]{3}|[6-9][0-9]{4})/ (the first branch matches 56000-59999, the second 60000-99999).
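A hedged sketch of how that might look in nginx config; the upstream address and root path are placeholders, not from the original answer:

# Requests for file numbers 56000-99999 are proxied to VM2;
# everything below the cutoff is served locally.
location ~ ^/(5[6-9][0-9]{3}|[6-9][0-9]{4})/ {
    proxy_pass http://vm2.example.internal:8080;
}
location / {
    root /var/www/files;  # files 1-55999 stay on this VM
}

Note that IDs of six or more digits would need a further branch in the regex once the counter grows past 99999.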

Is there any way to "ETL" data out of Graphite DB (Grafana)

We are trying to move some of our monitoring away from Grafana / Graphite DB into another system. Is there any way to pull the full DB data into a SQL DB?
You can use the tools shipped with Whisper to extract data from the .wsp files and then upload it into your DB, e.g.:
[boss#DPU101 gn_bytes_total_card0]$ whisper-fetch gauge.wsp | head
1499842740 51993482526.000000
1499842800 51014501995.000000
1499842860 51011637567.000000
1499842920 51301789613.000000
1499842980 50994189020.000000
1499843040 50986821344.000000
This tool also allows you to extract data in JSON:
$ whisper-fetch --help
[...]
--json Output results in JSON form
You can use the provided whisper utilities. You need to install them separately, using the following command on Ubuntu 14.04:
apt-get install python-whisper
The whisper-fetch.py program will let you dump the data in JSON format (or in a pretty format, separated by tabs).
The data points will be one per 60 seconds, as in the example above.
Whisper: https://github.com/graphite-project/whisper
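A minimal sketch of a full export, assuming a stock Graphite install under /opt/graphite and SQLite as the target SQL DB (both assumptions; adapt the paths and the INSERT to your own schema):

#!/bin/sh
DB=metrics.db
sqlite3 "$DB" 'CREATE TABLE IF NOT EXISTS metrics (metric TEXT, ts INTEGER, value REAL);'
find /opt/graphite/storage/whisper -name '*.wsp' | while read -r wsp; do
    # Derive the dotted metric name from the file path
    metric=$(echo "$wsp" | sed -e 's|^/opt/graphite/storage/whisper/||' -e 's|\.wsp$||' -e 's|/|.|g')
    # whisper-fetch prints "timestamp value" pairs; skip empty (None) slots
    whisper-fetch "$wsp" | awk -v m="$metric" -v q="'" \
        '$2 != "None" { printf "INSERT INTO metrics VALUES (%s%s%s, %s, %s);\n", q, m, q, $1, $2 }' \
        | sqlite3 "$DB"
done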

How could Bosun fit for my usecase?

I need an alerting system where I could have my own metrics and thresholds to report anomalies (basically alerting on the basis of logs and data in a DB). I explored Bosun but I'm not sure how to make it work. I have the following issues:
There are pre-defined items which are all system level, but I couldn't find a way to add new items, i.e. custom items.
How will Bosun ingest data other than via scollector? As I understand it, could I use Logstash as a data source and skip OpenTSDB entirely? (I really don't like the HBase dependency.)
By items I think you mean metrics. Bosun learns about metrics and their tag relationships when you do one of the following (a minimal curl sketch follows below):
Relay OpenTSDB data through Bosun (http://bosun.org/api#sending-data)
Get copies of metrics sent to the api/index route (http://bosun.org/api#apiindex)
There are also metadata routes, which tell Bosun about the metric, such as counter/gauge, unit, and description.
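For example, a hypothetical custom data point pushed with curl to Bosun's OpenTSDB-compatible /api/put route; the host, port and metric name are placeholders (:8070 is Bosun's usual listen address, and the relay assumes an OpenTSDB backend is configured):

curl -X POST http://bosun.example.com:8070/api/put \
  -H 'Content-Type: application/json' \
  -d '[{"metric": "my.custom.metric", "timestamp": '"$(date +%s)"', "value": 42, "tags": {"host": "web01"}}]'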
The Logstash datasource will be deprecated in favor of an elastic datasource in the coming 0.5.0 release. The elastic one is better, but requires ES 2+. To use those expressions, see the raw documentation (the bosun.org docs will be updated next release): https://raw.githubusercontent.com/bosun-monitor/bosun/master/docs/expressions.md. To add it you would have something like the following in the config:
elasticHosts=http://ny-lselastic01.ds.stackexchange.com:9200,http://ny-lselastic02.ds.stackexchange.com:9200,http://ny-lselastic03.ds.stackexchange.com:9200
The functions to query various backends are only loaded into the expression library when the backend is configured.

how do you connect and retrieve data from graphite (whisper)

Is there an R package to connect to graphite (whisper)?
It seems I am looking for the same thing. For now I see only these ways:
Using jsonlite within R to access the Graphite render URL API and get JSON- or CSV-formatted data (see the curl sketch below).
Getting whisper data via whisper-fetch (example usage is described in a Russian IT blog, automatically translated to English by Google).
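A minimal sketch of that render URL API, with a placeholder host and target; format=csv works the same way, and from R the identical URL can be read with jsonlite::fromJSON():

curl -s 'http://graphite.example.com/render?target=app.requests.count&from=-1h&format=json'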
