Very new to this. I am importing a CSV file with Logstash; it has 2 columns, like this:
lun 16 feb 15; 3,00
mar 17 feb 15; 4,00
...
The 1st column is a date, the 2nd is humidity.
I then want to produce a very simple graph with Kibana showing the date on X and the humidity value on Y; super basic stuff.
It looks like I am not able to get the 2 fields imported properly, or recognized by Kibana as field1: date, field2: number.
Here is what I get:
{
  "_index": "prova-2015.02.12",
  "_type": "logs",
  "_id": "AUt9lYFzON9412qlRdDl",
  "_score": 1,
  "_source": {
    "message": [
      "lun 16 feb 15;3,00"
    ],
    "@version": "1",
    "@timestamp": "2015-02-12T11:38:43.283Z",
    "host": "ELK-Dev-and-Demo",
    "path": "/home/elkadmin/Documenti/Analytics/data-value.csv",
    "Data": "lun 16 feb 15",
    "HUM": "3,00"
  },
  "fields": {
    "@timestamp": [
      1423741123283
    ],
    "Data": [
      "15"
    ]
  }
}
Still, in Kibana 4 it looks like the numeric value is interpreted as a string. What am I doing wrong in the import?
Logstash conf file:
input {
  file {
    path => "/home/elkadmin/Documenti/Analytics/data-value.csv"
    start_position => "beginning"
  }
}

filter {
  csv {
    columns => ["Data", "HUM"]
    separator => ";"
  }
}

output {
  elasticsearch {
    action => "index"
    protocol => "http"
    host => "localhost"
    index => "prova-%{+YYYY.MM.dd}"
    workers => 1
  }
  stdout {
    codec => rubydebug
  }
}
The data file was saved to CSV from an Excel file (with proper cell types set for date and number):
mer 11 feb 15;1,00
gio 12 feb 15;4,00
ven 13 feb 15;5,60
sab 14 feb 15;8,00
dom 15 feb 15;12,50
lun 16 feb 15;3,00
mar 17 feb 15;4,60
mer 18 feb 15;7,00
gio 19 feb 15;2,20
ven 20 feb 15;5,00
sab 21 feb 15;4,50
dom 22 feb 15;2,35
lun 23 feb 15;3,00
mar 24 feb 15;6,00
mer 25 feb 15;9,10
gio 26 feb 15;2,20
A final question: how do I define the proper visualization to show the dates and values? Why on Y do I always get aggregation options and not the specific value for a date?
A date histogram on X does not work; I get a fatal error when I select it and apply (more details in a further question once I have understood how to get Kibana to recognize dates and numbers).
Thanks in advance
F
I don't see that you're doing anything to make these fields be interpreted as anything other than a string.
Two options come to mind:
1. Set a mapping for the index that specifies Data as a date field and HUM as a float.
2. Use Logstash's mutate/convert filter (for the number) and date {} (for the date) to get the fields into the correct format before inserting into Elasticsearch (see the sketch below).
If you use #1, note that Elasticsearch will drop any record that can't be coerced into the right type.
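A minimal sketch of option 2, assuming the date filter accepts the Italian day/month abbreviations via its locale option, and that the decimal comma has to be rewritten to a dot before the float conversion:

filter {
  csv {
    columns => ["Data", "HUM"]
    separator => ";"
  }
  mutate {
    # "3,00" -> "3.00" so the value can be converted to a float
    gsub => ["HUM", ",", "."]
  }
  mutate {
    convert => ["HUM", "float"]
  }
  date {
    # e.g. "lun 16 feb 15"
    match => ["Data", "EEE dd MMM yy"]
    locale => "it"
  }
}

After changing the filter, reindex (delete the old prova-* index first, since an existing string mapping will not change on its own); HUM should then show up as a number in Kibana.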
Related
I am having some trouble searching for an entity in my CosmicMind/Graph database.
Here is the code, which is pretty self-explanatory:
//User 1
//data: 14 Febbraio 2017
//ora : 15:40
//name: Paolo
//User 2
//data: 14 Febbraio 2017
//ora : 12:40
//name: Ernesto
//User 3
//data: 13 Febbraio 2017
//ora : 16:40
//name: Paolo
/*Search Parameters*/
//dataSearch = 13 Febbraio 2017
//oraSearch = 16:40
//nameSearch = Paolo
var search = Search<Entity>(graph: graph)
    .for(types: "Users")
    .where(properties: (key: "data", value: dataSearch))
    .where(properties: (key: "ora", value: oraSearch))
    .where(properties: (key: "name", value: nameSearch))
//returns [User1, User3]
I am expecting [User3] from the search, since the search parameters coincide only with that entity, but instead the search returns [User1, User3], as if the dataSearch and oraSearch parameters were being ignored and only the last parameter, nameSearch, were being used.
What am I doing wrong?
You have too many where statements. Each successive where call replaces the previous one.
this:
.where(properties: (key: "data", value: dataSearch)).where(properties: (key: "ora", value: oraSearch)).where(properties: (key: "name", value: nameSearch))
should be:
.where(properties: (key: "data", value: dataSearch), (key:"ora", value: oraSearch), (key:"name", value: nameSearch))
or shorthand:
.where(properties: ("data", dataSearch), ("ora", oraSearch), ("name", nameSearch))
That's it :)
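Putting the corrected call together, a minimal end-to-end sketch built only from the calls shown above (graph, dataSearch, oraSearch, and nameSearch are assumed to be defined as in the question):

let search = Search<Entity>(graph: graph)
    .for(types: "Users")
    .where(properties: ("data", dataSearch), ("ora", oraSearch), ("name", nameSearch))
let results = search.sync() // expected: [User3]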
EDIT: I "solved" using another parameter for retrieving the records, a String type. It worked perfectly as searchingParameter, instead of Date type.Then, i did some operation on the data to isolate the correct result.
Maybe my xcode console can help you, but first i show you my code:
let search = Search<Entity>(graph: DataManager.shared.graph)
    .for(types: DataManager.shared.entityType)
    .where(properties: (key: "data", value: (DataManager.shared.datasource[0][0]["data"] as! Date)))

print("The date i am searching for->", DataManager.shared.datasource[0][0]["data"]!)

for (index, res) in search.sync().enumerated() {
    print("Result #\(index)->\(res["data"]!)")
}

print("Total results found->", search.sync().count)
print("But only 10 records meet the requirement, not 22")
The date i am searching for-> 2017-02-16 11:19:14 +0000
Result #0->2017-02-15 22:31:28 +0000
Result #1->2017-02-15 22:21:51 +0000
Result #2->2017-02-15 22:31:43 +0000
Result #3->2017-02-15 22:44:31 +0000
Result #4->2017-02-16 10:56:37 +0000
Result #5->2017-02-16 10:56:48 +0000
Result #6->2017-02-16 10:59:23 +0000
Result #7->2017-02-15 22:32:01 +0000
Result #8->2017-02-16 10:56:21 +0000
Result #9->2017-02-15 22:23:06 +0000
Result #10->2017-02-16 11:16:00 +0000
Result #11->2017-02-16 11:19:14 +0000
Result #12->2017-02-15 22:32:12 +0000
Result #13->2017-02-15 22:42:12 +0000
Result #14->2017-02-16 11:18:07 +0000
Result #15->2017-02-16 10:59:59 +0000
Result #16->2017-02-15 22:31:36 +0000
Result #17->2017-02-16 10:58:24 +0000
Result #18->2017-02-16 10:59:07 +0000
Result #19->2017-02-15 22:23:22 +0000
Result #20->2017-02-15 22:31:49 +0000
Result #21->2017-02-15 22:32:18 +0000
Total results found-> 22
But only 10 records meet the requirement, not 22
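For reference, a sketch of the workaround described in the EDIT, assuming the date is stored and queried as a formatted string (the formatter, property key, and entity type are illustrative, not from the original code):

import Foundation

let formatter = DateFormatter()
// Minute resolution: equality on raw Date values fails on sub-second differences
formatter.dateFormat = "yyyy-MM-dd HH:mm"

// Store the date as a string property instead of a Date
let user = Entity(type: "Users")
user["data"] = formatter.string(from: Date())

// Query with the same string representation
let targetDate = Date() // the moment you are searching for
let dataSearch = formatter.string(from: targetDate)
let results = Search<Entity>(graph: graph)
    .for(types: "Users")
    .where(properties: ("data", dataSearch))
    .sync()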
I'm working on a parser that parses log files from a game so I can do analysis on auctions made within the game. However, the date format written by the logger seems to be causing problems, as the format seems to be custom to this logger. An example datetime stamp looks like [Wed Nov 23 23:26:10 2016]. I try to parse it with:
func (r *AuctionReader) extractSaleInformation(line string) {
    fmt.Println("Extracting information from: ", line)
    // Format mask for output
    layout := "DD-MM-YYYY hh:mm:ss"
    // Replace the square brackets so we're just left with the date-time string
    date := strings.TrimSpace(strings.Replace((strings.Split(line, "]")[0]), "[", "", -1))
    fmt.Println(time.Parse(date, layout))
}
When I attempt to parse the above date-time string I get the following error:
0001-01-01 00:00:00 +0000 UTC parsing time "DD-MM-YYYY hh:mm:ss" as "Wed Nov 23 23:26:10 2016": cannot parse "DD-MM-YYYY hh:mm:ss" as "Wed Nov "
How can I get the parser to recognise this seemingly custom format? I will be saving this data to Mongo, so I don't want to store the auction time as a string, because I want to query the timestamps individually.
Go handles all date formatting in a unique way: it uses the reference time Mon Jan 2 15:04:05 MST 2006 (01/02 03:04:05PM '06 -0700) to express the pattern with which to format/parse a given time/string.
So, to read the format "Wed Nov 23 23:26:10 2016" you would put the reference date into that format: "Mon Jan 2 15:04:05 2006", and then do:
t, _ := time.Parse("Mon Jan 2 15:04:05 2006", "Wed Nov 23 23:26:10 2016")
Then, to output it in the given format, if you wanted the format DD-MM-YYYY hh:mm:ss, you would put the reference time into that format: 02-01-2006 15:04:05, and then do:
t.Format("02-01-2006 15:04:05")
https://play.golang.org/p/VO5413Z7-z
So basically, the main change is
// Format mask for output
layout := "DD-MM-YYYY hh:mm:ss"
should be
// Format mask for output
layout := "02-01-2006 15:04:05"
and
time.Parse(date, layout)
should be
time.Parse(layout, date)
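Putting it all together, a self-contained sketch (the log line and the surrounding main are made up for illustration):

package main

import (
    "fmt"
    "strings"
    "time"
)

func main() {
    line := "[Wed Nov 23 23:26:10 2016] You have won the auction!"
    // Strip the brackets so we're left with just the date-time string
    raw := strings.TrimSpace(strings.Replace(strings.Split(line, "]")[0], "[", "", -1))
    // The layout is the reference time written in the log's own format
    t, err := time.Parse("Mon Jan 2 15:04:05 2006", raw)
    if err != nil {
        fmt.Println("parse error:", err)
        return
    }
    // t is a time.Time, which a Mongo driver can store natively;
    // Format only matters when you want a string for display
    fmt.Println(t.Format("02-01-2006 15:04:05")) // 23-11-2016 23:26:10
}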
I am working on Twitter data and have a field, user_created_at, that looks like Thu Jun 11 16:41:35 +0000 2015.
I am not sure what the type of the field is, since I got the fields using Elephant Bird. To convert it into a datetime type, I did:
ToDate(user_created_at, 'yyyy.MM.dd') as user_created_at
but it failed with an error:
ERROR 0: Exception while executing [POUserFunc (Name: POUserFunc(org.apache.pig.builtin.ToDate2ARGS)[datetime] - scope-148 Operator Key: scope-148) children: null at []]: java.lang.IllegalArgumentException: Invalid format: "Thu Jun 11 16:41:35 +0000 2015".
What is wrong? I am using Pig version 0.15. Appreciate any help. Thanks!
Match the datetime format to the input datetime string. Something like this:
ToDate(user_created_at, 'EEE MMM dd HH:mm:ss Z yyyy')
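For example (the tweets relation here is hypothetical; only the format string comes from the answer):

-- Convert the raw Twitter timestamp string into a Pig datetime
tweets_typed = FOREACH tweets GENERATE
    ToDate(user_created_at, 'EEE MMM dd HH:mm:ss Z yyyy') AS created_at;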
I wanted to start playing with Graphite and test it, so I put in this config:
storage-schemas.conf:
[short2]
pattern = ^short2\.
retentions = 10s:1m
storage-aggregation.conf:
[sum]
pattern = \.count$
xFilesFactor = 0
aggregationMethod = sum
What I think my config says:
get data every 10 seconds and save it for 1 minute, so a total of 10 points will be saved.
Now if I go to
http://localhost/render/?target=short2.sum&format=json&from=-1h
I see a lot of data with null values, a lot more than 10 points.
OK, so I gave up on that. Then I said, let's try to feed it data once every 10 seconds. If I do
echo "short2.sum 22 `date +%s`" | nc -q0 127.0.0.1 2003
wait 11 seconds
echo "short2.sum 23 `date +%s`" | nc -q0 127.0.0.1 2003
Now, looking at the API, I can see that only the last point gets registered, like:
[null, 1464781920],
[null, 1464781980],
[null, 1464782040],
[null, 1464782100],
[23, 1464782160],
Now if I send it another point (a lot more than 10 seconds later):
echo "short2.sum 24 `date +%s`" | nc -q0 127.0.0.1 2003
This is what I get:
[null, 1464781920],
[null, 1464781980],
[null, 1464782040],
[null, 1464782100],
[24, 1464782160],
Only once in a couple of tries will I see them counted as new, but they just overwrite each other instead of acting like new data.
Actually:
[short2]
pattern = ^short2\.
retentions = 10s:1m
means: keep all metrics starting with short2. for 1 minute at 10-second resolution (each datapoint represents 10s). It also means that if no other storage schemas are defined for short2., it will only have values for the last 1 minute.
http://graphite.readthedocs.io/en/latest/config-carbon.html#storage-schemas-conf
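So to actually keep history, the retention list needs longer archives. A sketch, assuming you want an hour of 10-second detail plus a week of per-minute rollups (note that already-created .wsp files keep their old schema and must be resized, e.g. with whisper-resize.py, or deleted before a new schema takes effect):

[short2]
pattern = ^short2\.
retentions = 10s:1h,1m:7d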
xAxis: {
    type: 'datetime',
    maxZoom: 14 * 24 * 3600000,
    dateTimeLabelFormats: {
        day: '%e-%b-%Y',
        week: '%e-%b-%Y',
        month: '%b \'%Y',
        year: '%Y'
    },
    title: {
        text: 'Days'
    },
    labels: {
        y: 40,
        rotation: 60
    },
    tickmarkPlacement: 'on',
    startOnTick: true,
    endOnTick: true
}
I have added a column chart using the Highcharts gallery, with an x-axis of datetime type. The chart shows the previous 30 days' report. I give it a starting date, but when the chart is rendered it shows 1 or 2 extra dates at the beginning and the same at the end. For example, if I give 1 May as the starting date it should show 1 May to 30 May, but it shows 30 Apr to 1 June, or 30 Apr to 31 May.
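A plausible cause: startOnTick and endOnTick force the axis to extend to the nearest full tick, which can pull in an extra day at each end. A minimal sketch that disables them and pins the range with min and max instead (both are standard Highcharts xAxis options; the concrete dates are hypothetical):

xAxis: {
    type: 'datetime',
    startOnTick: false, // don't stretch the axis to the nearest full tick
    endOnTick: false,
    min: Date.UTC(2016, 4, 1), // 1 May (JavaScript months are zero-based)
    max: Date.UTC(2016, 4, 30) // 30 May
}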