Datetime xAxis in Highcharts showing extra labels

xAxis: {
    type: 'datetime',
    maxZoom: 14 * 24 * 3600000,
    dateTimeLabelFormats: {
        day: '%e-%b-%Y',
        week: '%e-%b-%Y',
        month: '%b \'%Y',
        year: '%Y'
    },
    title: {
        text: 'Days'
    },
    labels: {
        y: 40,
        rotation: 60
    },
    tickmarkPlacement: 'on',
    startOnTick: true,
    endOnTick: true
}
I have added a column chart from the Highcharts gallery, with a datetime xAxis. The chart shows a report for the previous 30 days. I give it a starting date, but when the chart is rendered it shows one or two extra dates at the start and at the end. For example, if I give 1 May as the starting date it should show 1 May to 30 May, but it shows 30 Apr to 1 June, or 30 Apr to 31 May.
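One way to keep the axis from padding out to extra days is to give it an explicit range and turn off the tick snapping. A minimal sketch, assuming the padding comes from startOnTick/endOnTick and the automatic tick interval; the container id and start date below are hypothetical:
// Hypothetical 30-day window starting 1 May
var start = Date.UTC(2019, 4, 1);

Highcharts.chart('container', {
    chart: { type: 'column' },
    xAxis: {
        type: 'datetime',
        min: start,                          // first day shown
        max: start + 29 * 24 * 3600 * 1000,  // 30th day shown
        tickInterval: 24 * 3600 * 1000,      // one tick per day
        startOnTick: false,                  // don't pad back to a whole tick
        endOnTick: false                     // don't pad forward to a whole tick
    },
    series: [{ data: [] }]                   // series data omitted
});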

Related

How to get Friday-to-Friday weeks from a date range using moment

I am trying to get the week start as Friday and the week end as Friday. I tried startOf/endOf with 'week' and 'isoWeek' but failed. Is there any way to get Friday as the start of the week and Friday as the end of the week using moment?
moment(date).startOf('week'); // or 'isoWeek'
The output should be the date of a Friday.
Request data:
First date = 05-09-2019
End date = 05-15-2019 (current date)
Expected output:
[
    {
        Weekstart: 05-03-2019,
        Weekend: 05-10-2019
    },
    {
        Weekstart: 05-10-2019,
        Weekend: 05-17-2019
    }
]
There is no option for setting the start day of the week, but you can fetch the last Friday's date using
moment().weekday(-2).format("YYYY-DD-MM")
You can update the week start for a locale using something like:
moment.updateLocale(moment.locale(), { week: { dow: 5 } })
moment().startOf('week').toString(); // Fri May 10 2019 00:00:00 GMT+0100
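A minimal sketch that combines this with the question's expected output, assuming the MM-DD-YYYY strings from the request data:
// Assumption: with dow set to Friday, startOf('week') is the previous (or same)
// Friday, and adding 7 days gives the closing Friday from the expected output.
moment.updateLocale(moment.locale(), { week: { dow: 5 } });

let toWeek = d => {
    let start = moment(d, 'MM-DD-YYYY').startOf('week');
    return {
        Weekstart: start.format('MM-DD-YYYY'),
        Weekend: moment(start).add(7, 'days').format('MM-DD-YYYY')
    };
};

console.log(['05-09-2019', '05-15-2019'].map(toWeek));
// [{ Weekstart: '05-03-2019', Weekend: '05-10-2019' },
//  { Weekstart: '05-10-2019', Weekend: '05-17-2019' }]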
You can write a short custom function that always gives you the start and end of the week based on a passed ISO weekday:
let getCustomWeek = (dayOfWeek = 7, date = new Date()) => {
    // Move the given date to the requested ISO weekday of its week
    let passedDay = moment(date).isoWeekday(dayOfWeek)
    // The week runs from the previous occurrence of that weekday...
    let firstDay = moment(passedDay).subtract(1, 'week')
    // ...to its occurrence in the current week
    let lastDay = moment(firstDay).add(1, 'week')
    return { start: firstDay.format(), end: lastDay.format() }
}
let dateRange = ['05-09-2019', '05-15-2019']
console.log('Friday: ', dateRange.map(d => getCustomWeek(5, new Date(d))))
console.log('Wednesday: ', dateRange.map(d => getCustomWeek(3, new Date(d))))
<script src="https://cdnjs.cloudflare.com/ajax/libs/moment.js/2.24.0/moment.js"></script>

JavaScript array values to a Chart.js graph as a stacked graph

I have the following data
new Chart(ctx2, {
    type: 'bar',
    data: {
        labels: $scope.teamGraphAssociateName,
        datasets: [{
            data: $scope.teamGraphAgileRewards,
            backgroundColor: $scope.backgroundColors,
            borderWidth: 1.5
        }]
    }
});
This successfully puts all the associates' names on the X axis from labels.
For $scope.teamGraphAgileRewards, in the console I'm getting data like this:
(3) [Array(1), Array(2), Array(1)]
0: [7]
1: (2) [2, 3]
2: [10]
length: 3
I'm getting the graph like this (only the last array, [10], is visible on the graph):
Y
|
|
|
| 10
|_______________ X
ASS1 ASS2 ASS3
(labels)
But I want this data to be visible on a stacked bar graph like this:
Y
|
| 3
|
| 7 2 10
|_______________ X
ASS1 ASS2 ASS3
(labels)
Well, you will need to rearrange your data. The data structure the chart expects is different from the one you are sending to it. See this example:
var ctx = document.getElementById("myChart");
var myChart = new Chart(ctx, {
    type: 'bar',
    data: {
        labels: ["ASS1", "ASS2", "ASS3"],
        datasets: [{
            stack: 'Stack 0',
            data: [7, 3, 10],
            backgroundColor: 'blue'
        }, {
            stack: 'Stack 0',
            data: [0, 2, 0],
            backgroundColor: 'green'
        }]
    },
    options: {
        legend: {
            display: false
        },
        responsive: false,
        scales: {
            xAxes: [{
                stacked: true
            }],
            yAxes: [{
                stacked: true
            }]
        }
    }
});
That's how your chart should look. Each data array you add to a dataset is mapped onto the labels, so when you send arrays inside an array the chart can't tell that you want multiple values for one label. Group your values so that each dataset has one value per label (['ASS1value', 'ASS2value', 'ASS3value']), like I've done in the sample.
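A minimal sketch of that restructuring, assuming the source data has the [[7], [2, 3], [10]] shape from the question; toStackedDatasets is a hypothetical helper, not part of Chart.js:
// Turns per-label value arrays, e.g. [[7], [2, 3], [10]], into stacked
// datasets where dataset i holds the i-th value for each label (0 if absent).
function toStackedDatasets(valuesPerLabel) {
    var depth = Math.max.apply(null, valuesPerLabel.map(function (v) { return v.length; }));
    var datasets = [];
    for (var i = 0; i < depth; i++) {
        datasets.push({
            stack: 'Stack 0',
            data: valuesPerLabel.map(function (v) { return v[i] || 0; })
        });
    }
    return datasets;
}

// toStackedDatasets([[7], [2, 3], [10]])
// -> [{ data: [7, 2, 10], ... }, { data: [0, 3, 0], ... }]
// which stacks to the same totals as the example above.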

Kibana 4 - numeric fields

Very new to this. I'm importing a CSV file with two columns into Logstash, like this:
lun 16 feb 15; 3,00
mar 17 feb 15; 4,00
...
The 1st column is a date, the 2nd is humidity.
I then want to produce a very simple graph with Kibana showing Data on X and the Humidity value on Y - super basic stuff.
It looks like I am not able to get the two fields imported properly, or recognized by Kibana as field1: date, field2: number.
Here is what I get:
{
    "_index": "prova-2015.02.12",
    "_type": "logs",
    "_id": "AUt9lYFzON9412qlRdDl",
    "_score": 1,
    "_source": {
        "message": [
            "lun 16 feb 15;3,00"
        ],
        "#version": "1",
        "#timestamp": "2015-02-12T11:38:43.283Z",
        "host": "ELK-Dev-and-Demo",
        "path": "/home/elkadmin/Documenti/Analytics/data-value.csv",
        "Data": "lun 16 feb 15",
        "HUM": "3,00"
    },
    "fields": {
        "#timestamp": [
            1423741123283
        ],
        "Data": [
            "15"
        ]
    }
}
Still, in Kibana 4 it looks like the numeric value is interpreted as a string. What am I doing wrong in importing it?
Logstash conf file:
input {
    file {
        path => "/home/elkadmin/Documenti/Analytics/data-value.csv"
        start_position => "beginning"
    }
}
filter {
    csv {
        columns => ["Data", "HUM"]
        separator => ";"
    }
}
output {
    elasticsearch {
        action => "index"
        protocol => "http"
        host => "localhost"
        index => "prova-%{+YYYY.MM.dd}"
        workers => 1
    }
    stdout {
        codec => rubydebug
    }
}
The data file has been saved to CSV from an Excel file (with the proper cell types set for date and number):
mer 11 feb 15;1,00
gio 12 feb 15;4,00
ven 13 feb 15;5,60
sab 14 feb 15;8,00
dom 15 feb 15;12,50
lun 16 feb 15;3,00
mar 17 feb 15;4,60
mer 18 feb 15;7,00
gio 19 feb 15;2,20
ven 20 feb 15;5,00
sab 21 feb 15;4,50
dom 22 feb 15;2,35
lun 23 feb 15;3,00
mar 24 feb 15;6,00
mer 25 feb 15;9,10
gio 26 feb 15;2,20
A final question: how do I define the proper Visualization to show dates and values? Why on Y do I always get options for an aggregation and not the specific value for a date?
A date histogram on X does not work; I get a fatal error when I select it and apply (more details in a further question once I've understood how to have Kibana recognize dates and numbers).
Thanks in advance,
F
I don't see that you're doing anything to make these fields be interpreted as anything other than strings.
Two options come to mind (a sketch of #2 follows below):
1. Set a mapping for the index that specifies 'date' as a date field and 'hum' as an integer.
2. Use Logstash's mutate->convert feature (for the number) and date{} (for the date) to get the fields into the correct format before inserting into Elasticsearch.
If you use #1, note that Elasticsearch will drop any record that can't be coerced into the right type.
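A hedged sketch of option #2, extending the filter block from the conf file above; it assumes the comma decimal separator in HUM and Italian day/month abbreviations in Data, so the date pattern and locale may need adjusting:
filter {
    csv {
        columns => ["Data", "HUM"]
        separator => ";"
    }
    mutate {
        # assumption: replace the comma decimal separator before converting
        gsub => ["HUM", ",", "."]
    }
    mutate {
        # index HUM as a number instead of a string
        convert => { "HUM" => "float" }
    }
    date {
        # assumption: "lun 16 feb 15" uses Italian day/month names
        match  => ["Data", "EEE dd MMM yy"]
        locale => "it"
    }
}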

Google Charts Numeric Date

Error: column 0 is not numeric
Column 0 is my date column
data.addColumn('date', 'Date');
...
// The variable date here is an epoch
var fullDate = new Date((parseInt(date)*1000));
r.push(new Date(fullDate.getYear(), fullDate.getMonth(), fullDate.getDay()));
// Logging r here yields: Mon Mar 06 113 00:00:00 GMT+1100 (EST)
...
var slider = new google.visualization.ControlWrapper({
    'controlType': 'NumberRangeFilter',
    'containerId': 'control1',
    'options': {
        'filterColumnLabel': 'Date',
        'ui': {'labelStacking': 'vertical'}
    }
});
I guess I need to find a way to make date numeric?
Another solution would be, instead of making the date numeric, to change the control type of the slider:
'controlType': 'ChartRangeFilter',
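A hedged sketch of that variant; ChartRangeFilter accepts the date column directly. As a side note, the "Mon Mar 06 113" log output suggests the rows are being built with getYear()/getDay(), so getFullYear()/getDate() are used below:
// date is the epoch (in seconds) from the question's code
var fullDate = new Date(parseInt(date, 10) * 1000);
r.push(new Date(fullDate.getFullYear(), fullDate.getMonth(), fullDate.getDate()));

// The range control, switched to ChartRangeFilter
var slider = new google.visualization.ControlWrapper({
    'controlType': 'ChartRangeFilter',
    'containerId': 'control1',
    'options': {
        'filterColumnLabel': 'Date',
        'ui': {'chartType': 'LineChart'}  // assumption: a line preview of the range
    }
});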

How to solve netCDF library: Attempt to convert between text & numbers

I'd like to hear your suggestions for solving a netCDF library problem. When I use a program, netcdf merge, to combine several small netCDF files, I get the following message:
The netCDF library has reported the following problem:
NetCDF: Attempt to convert between text & numbers
I'm stuck. Please let me know if you have any suggestions. Thanks a lot.
The following is my log file:
imerge: Merging 61 npptot*.nc files, using 1 buffers
Number of dimensions: 4
Dimension id: 0 Name: longitude Length: 4
Dimension id: 1 Name: latitude Length: 1
Dimension id: 2 Name: time Length: 60
Dimension id: 3 Name: lengthd Length: 10
Number of variables: 6
Number of global attributes: 5
Global attribute: 0 Attribute: title Type: NC_CHAR, string Length: 25 Value: monthly total NPP carbon
Global attribute: 1 Attribute: source Type: NC_CHAR, string Length: 14 Value: ibis wmonthly
Global attribute: 2 Attribute: history Type: NC_CHAR, string Length: 12 Value: 14-May-2012
Global attribute: 3 Attribute: calendar Type: NC_CHAR, string Length: 10 Value: gregorian
Global attribute: 4 Attribute: conventions Type: NC_CHAR, string Length: 9 Value: NCAR-CSM
Variable: longitude Attribute: long_name Type: NC_CHAR, string Length: 10 Value: longitude
Variable: longitude Attribute: units Type: NC_CHAR, string Length: 13 Value: degrees_east
Attention: NetCDF attribute 'missing_value' does not exist for the selected variable.
Variable: latitude Attribute: long_name Type: NC_CHAR, string Length: 9 Value: latitude
Variable: latitude Attribute: units Type: NC_CHAR, string Length: 14 Value: degrees_north
Attention: NetCDF attribute 'missing_value' does not exist for the selected variable.
Variable: time Attribute: long_name Type: NC_CHAR, string Length: 5 Value: time
Variable: time Attribute: units Type: NC_CHAR, string Length: 22 Value: days since 1500-12-31
Attention: NetCDF attribute 'missing_value' does not exist for the selected variable.
Variable: time_weights Attribute: long_name Type: NC_CHAR, string Length: 29 Value: number of days per time step
Variable: time_weights Attribute: units Type: NC_CHAR, string Length: 5 Value: days
Attention: NetCDF attribute 'missing_value' does not exist for the selected variable.
Variable: date Attribute: long_name Type: NC_CHAR, string Length: 25 Value: label for each time step
Variable: date Attribute: units Type: NC_CHAR, string Length: 1 Value:
Attention: NetCDF attribute 'missing_value' does not exist for the selected variable.
Variable: npptot Attribute: long_name Type: NC_CHAR, string Length: 16 Value: total NPP carbon
Variable: npptot Attribute: units Type: NC_CHAR, string Length: 14 Value: kg-C/m^2/month
Variable: npptot Attribute: missing_value Type: NC_FLOAT, 4 bytes Length: 1 Value: 9e+20
Value of netCDF attribute 'missing_value' is: 9e+20
In file npptot0.nc there is 1 nlat line for a running total of 1 latitude lines.
:
:
:
variable 0: longitude, rank 1, size 4
variable 1: latitude, rank 1, size 61
variable 2: time, rank 1, size 60
variable 3: time_weights, rank 1, size 60
variable 4: date, rank 2, size 600
