I’ve read the Kibana website, so I know in theory what Kibana can be used for, but I’m interested in real-world stories. Is Kibana used solely for log analysis? Can it be used as a kind of BI tool across your actual data set? I’m interested to hear what kinds of applications it’s useful in.
Kibana is very useful for visualizing mixed types of data, not just numeric metrics but also text and geo data. You can use Kibana to visualize:
real-time data about visitors of your webpage
number of sales per region
locations from sensor data
emails sent, server load, most frequent errors
... and many more. There's a plethora of use cases; you only need to feed your data into Elasticsearch (and pick an appropriate visualization), as sketched below.
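Feeding data in is a one-liner with the official Python client. A minimal sketch, assuming a local Elasticsearch node on localhost:9200 and a recent `elasticsearch` client; the `pageviews` index name and its fields are made up for illustration:

```python
# Minimal sketch: index one document into Elasticsearch so Kibana can chart it.
# Assumes a local node at localhost:9200; index name and fields are hypothetical.
from datetime import datetime, timezone

from elasticsearch import Elasticsearch

es = Elasticsearch("http://localhost:9200")

# One page-view event; Kibana can then break these down by time, region, browser, etc.
doc = {
    "@timestamp": datetime.now(timezone.utc).isoformat(),
    "page": "/pricing",
    "region": "EU",
    "browser": "Firefox",
    "location": {"lat": 52.52, "lon": 13.405},  # geo data for map visualizations
}

es.index(index="pageviews", document=doc)
```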
Kibana is basically an analytics and visualization platform that lets you easily visualize data from Elasticsearch and analyze it to make sense of it. You can think of Kibana as an Elasticsearch dashboard where you create visualizations such as pie charts, line charts, and many others.
There is a practically infinite number of use cases for Kibana. For example, you can plot your website’s visitors onto a map and show traffic in real time. Kibana is also where you configure change detection and forecasting. You can aggregate website traffic by browser and find out which browsers are important to support based on your particular audience. Kibana also provides an interface to manage authentication and authorization for Elasticsearch. You can think of Kibana as a web interface to the data stored in Elasticsearch.
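Under the hood, a "traffic by browser" chart is just an Elasticsearch terms aggregation. A hedged sketch against the hypothetical `pageviews` index from the earlier example (the `.keyword` subfield assumes default dynamic mapping):

```python
# Sketch of the terms aggregation behind a "traffic by browser" breakdown.
# Reuses the hypothetical `pageviews` index from the previous example.
from elasticsearch import Elasticsearch

es = Elasticsearch("http://localhost:9200")

resp = es.search(
    index="pageviews",
    size=0,  # we only want the aggregation, not individual hits
    aggs={"by_browser": {"terms": {"field": "browser.keyword", "size": 10}}},
)

for bucket in resp["aggregations"]["by_browser"]["buckets"]:
    print(bucket["key"], bucket["doc_count"])
```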
I have an object delivery service on Akamai, and I would like to fetch the usage data for our endpoints, send it to Graphite, and create some graphs in Grafana.
I would appreciate any suggestions or ideas on how to achieve this.
We have built a Prometheus exporter for Akamai data, the cloudmonitor_exporter. If you couple this with the graphite_exporter, you should be able to get the data into Graphite (albeit a bit roundabout). If you want to go directly to Graphite, you can probably pick up some inspiration from how we built it for Prometheus.
(Note: This requires that you have the "Cloud Monitor" product in your Akamai account)
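If you do end up writing your own bridge straight to Graphite, its plaintext protocol is trivial to speak: one `<path> <value> <timestamp>` line per datapoint over TCP, by default on port 2003. A minimal sketch; the host and metric path are assumptions:

```python
# Minimal sketch of Graphite's plaintext protocol: "<path> <value> <timestamp>\n"
# sent over TCP to the default plaintext port 2003.
import socket
import time

GRAPHITE_HOST = "graphite.example.com"  # hypothetical host
GRAPHITE_PORT = 2003                    # Graphite's default plaintext listener

def send_metric(path, value, timestamp=None):
    """Send one datapoint to Graphite's plaintext listener."""
    ts = timestamp or int(time.time())
    line = f"{path} {value} {ts}\n"
    with socket.create_connection((GRAPHITE_HOST, GRAPHITE_PORT)) as sock:
        sock.sendall(line.encode("ascii"))

# e.g. a request count extracted from Akamai Cloud Monitor data
send_metric("akamai.edge.requests", 1234)
```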
As far as I know, both Kibana and OpenTSDB can be used in conjunction with the Elasticsearch-Logstash-Kibana (ELK) stack.
OpenTSDB offers a built-in, simple user interface for selecting one or more metrics and tags to generate a graph as an image. Alternatively, an HTTP API is available to tie OpenTSDB into external systems such as monitoring frameworks, dashboards, statistics packages, or automation tools. Each time-series daemon (TSD) uses the open-source database HBase to store and retrieve time-series data.
Kibana can also be used for plotting metrics from access logs and custom logs, so how does OpenTSDB help in this system?
OpenTSDB is a time-series database. You could use OpenTSDB alongside Elasticsearch, for the metrics component. You would use Elasticsearch for text (logs perhaps) and OpenTSDB for metrics.
If you are using Logstash, there is an output that can export metrics from logs into OpenTSDB.
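Independent of Logstash, the HTTP API mentioned above is easy to use directly. A hedged sketch of writing one datapoint via OpenTSDB's documented /api/put endpoint; the host, metric, and tag names are assumptions:

```python
# Sketch: write one datapoint to OpenTSDB via its documented /api/put endpoint
# (default port 4242). Host, metric, and tag names are hypothetical.
import time

import requests

datapoint = {
    "metric": "web.hits",                  # hypothetical metric name
    "timestamp": int(time.time()),         # Unix epoch seconds
    "value": 42,
    "tags": {"host": "web01", "dc": "eu"}, # OpenTSDB requires at least one tag
}

resp = requests.post("http://opentsdb.example.com:4242/api/put", json=datapoint)
resp.raise_for_status()
```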
Kibana is a visualization tool for Elasticsearch queries. It assists you with searching and viewing the data that is stored in Elasticsearch. It is only used with Elasticsearch.
If you would like a unified front-end for Elasticsearch and OpenTSDB, you could consider Grafana, which has support for both Elasticsearch and OpenTSDB, but less functionality than Kibana in regard to Elasticsearch.
I am trying to track some basic data from multiple domains (monthly hits, session length, etc.). Each domain is owned by a client, and it is a business requirement for me to have access to their statistics.
I have the access needed to add in a tracking code to each domain, similar to GA's tracking code. Each of these should transmit data to my master server, which consolidates and stores the info.
What would be the best way to go about doing this? Is this sort of tracker best built from scratch, or is there an open source analytics provider which can meet this specific use case? (Many tracking end-points, single database)
I will be aggregating the visits, duration, etc for each user separately, so as to compare and rank them against each other.
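To make the requirement concrete, here is roughly the shape of the "master server" side I have in mind: an endpoint that receives beacons from the tracking code on each client domain and stores them with the owning client's ID. This is only a sketch; the framework (Flask) and all field names are placeholders:

```python
# Hedged sketch of a collection endpoint on the master server. The framework
# choice and the beacon fields are assumptions, not a finished design.
from datetime import datetime, timezone

from flask import Flask, request

app = Flask(__name__)
events = []  # stand-in for a real database table keyed by client/domain

@app.route("/track", methods=["POST"])
def track():
    beacon = request.get_json(force=True)
    events.append({
        "client_id": beacon.get("client_id"),    # which customer's domain
        "domain": beacon.get("domain"),
        "session_id": beacon.get("session_id"),  # lets you compute session length
        "path": beacon.get("path"),
        "received_at": datetime.now(timezone.utc).isoformat(),
    })
    return "", 204  # empty response, like a tracking pixel

if __name__ == "__main__":
    app.run(port=8080)
```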
Hello Sainath Krishnan,
In general terms, it all depends on how you want to see this data: whether you want to aggregate all the sites together, whether you want to treat users as a single user across these multiple sites, and so on.
For instance, since you've installed the proper tracking code, which you administer, on these sites, you will be able to see all sorts of traffic data that these tracking codes send to your Google Analytics account and properties.
Could you be more specific about the way you want to see the data, and whether you'll deal with goals or specific KPIs?
As we know, Elasticsearch stores, searches, and analyzes data, which Kibana then displays. But my data is already stored in PostgreSQL, and we have to deal with huge amounts of it (full traffic from a telecom company), so copying it into Elasticsearch just to see a graph in Kibana is not good: the same data would be duplicated in both Postgres and Elasticsearch. We want to build a reporting tool.
Kibana has all the features that we want, but we don't want this duplication of data. In other words, we want to use only Kibana. Is that possible? What should I do to avoid this problem, and what are the alternatives?
My opinion: if you have all this data and it is not in a NoSQL document database, you are going about it the wrong way. Whether it's Elasticsearch or MongoDB, you should use that kind of database.
As far as I know, there is no way of using Kibana to display information from something other than Elasticsearch.
You could check out Grafana (http://grafana.org/); it has that and more.
Good luck.
For connecting to SQL databases, Tableau is one of the best options. Having worked with both Tableau and Kibana, I can say that Tableau supports almost all the operations supported by Kibana, and it can also generate graphs for complex visualizations like
sum(field1) / sum(field2) over the values of field3,
which cannot be generated in Kibana.
This is way late, but the way I would tackle this is to write an app that pulls data out of your database and publishes it to Elasticsearch; a sketch follows. Sure, there is duplication of data, but you can focus on only the data you care about. You also wouldn't be querying a production database when displaying charts in Kibana, which can add its own complications.
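A hedged sketch of such a sync app, copying only the columns you actually chart; the connection details, table, and column names are all assumptions:

```python
# Hedged sketch: pull rows from PostgreSQL and bulk-index them into
# Elasticsearch. Table, columns, and connection strings are hypothetical.
import psycopg2
from elasticsearch import Elasticsearch
from elasticsearch.helpers import bulk

pg = psycopg2.connect("dbname=telecom user=report host=localhost")  # placeholder DSN
es = Elasticsearch("http://localhost:9200")

def rows():
    # Named (server-side) cursor so huge tables stream instead of loading into RAM.
    with pg.cursor(name="traffic_cursor") as cur:
        cur.execute("SELECT id, ts, caller, callee, duration_s FROM traffic")
        for id_, ts, caller, callee, duration in cur:
            yield {
                "_index": "traffic",
                "_id": id_,  # reusing the primary key makes re-runs idempotent
                "ts": ts.isoformat(),
                "caller": caller,
                "callee": callee,
                "duration_s": duration,
            }

bulk(es, rows())
```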
My GA account has a number (50) of profiles associated with it, and I am trying to build an API which shows me basic information like visits, bounce rates, etc. for each profile.
This query gets me what I want from GA, but for each profile:
URL ="https://www.google.com/analytics/feeds/data?ids=ga:11111&start-date=2011-07-01&end-date=2011-07-02&metrics=ga:visitors&prettyprint=true&alt=json"
The ids parameter is the table ID, and the metrics parameter gives me the information I want.
Now the problem is that I want to show all the information together. So every time, I would have to send 50 requests to the API, which just doesn't work out. Is there a way I can get the information for all the profiles associated with my account in a single request?
You unfortunately will be required to perform 50 requests if you want metrics for 50 different profiles. You can easily automate this, however, by using a combination of the Management API and the Data Export API.
The Management API allows you to pull information about the account. For example, you can very easily pull all profile IDs and names associated with an Analytics account through this API for use in an automated query.
The Data Export API, which I am sure you already are familiar with, is the only way to pull collected data/statistics for individual profiles.
If you are concerned about speed, you might want to build an automated process that uses both the Management API and the Data Export API. Pull all of the profiles associated with your account with the Management API, then loop through each and pull the basic data you'd like through the Data Export API. Have this run at regular intervals based on your needs and cache it between runs. This way it won't execute every time the page is hit (though you honestly might be fine, depending on your traffic - I've found it to be extremely quick).
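A hedged sketch of that loop against the legacy v2 data feed shown in the question; the profile IDs and the OAuth access token are placeholders, and in practice you would first fetch the profile list via the Management API:

```python
# Hedged sketch: loop over profile IDs and query the legacy GA v2 data feed
# for each. The token and IDs are placeholders for illustration only.
import requests

ACCESS_TOKEN = "..."  # obtained via your usual Google OAuth flow (placeholder)
PROFILE_IDS = ["11111", "22222"]  # normally pulled from the Management API

FEED = "https://www.google.com/analytics/feeds/data"

results = {}
for profile_id in PROFILE_IDS:
    resp = requests.get(
        FEED,
        params={
            "ids": f"ga:{profile_id}",
            "start-date": "2011-07-01",
            "end-date": "2011-07-02",
            "metrics": "ga:visitors",
            "alt": "json",
        },
        headers={"Authorization": f"Bearer {ACCESS_TOKEN}"},
    )
    resp.raise_for_status()
    results[profile_id] = resp.json()

# Cache `results` between scheduled runs so page loads never wait on 50 requests.
```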