Gnuplot: How can I correct the axis range of a 3D graph?

I want to use gnuplot to make a 3D graph of the data attached below. I used the following commands:
set ticslevel 0
set dgrid3d 30,30
set style lines 100 lt 5 lw 0.5
set pm3d hidden3d 100
set grid
splot "./1.txt" using 2:3:4 with pm3d
But the problem is that the vertical axis does not show the maximum value present in the 4th column of my data (for example, column 4 contains the value 4). Could you please suggest how to correct the axis range?
Here is my data:
1 1.4915 -0.1 1 0.0542767
2 1.4915 1.9 0 0
3 1.4915 3.9 1 0.0542767
4 1.4915 5.9 1 0.0542767
5 1.4915 7.9 0 0
6 1.4915 9.9 1 0.0542767
7 1.4915 11.9 1 0.0542767
8 1.4915 13.9 2 0.108553
9 1.4915 15.9 0 0
10 1.4915 17.9 2 0.108553
11 1.4915 19.9 0 0
12 1.4915 21.9 0 0
13 1.4915 23.9 1 0.0542767
14 1.4915 25.9 1 0.0542767
15 1.4915 27.9 1 0.0542767
16 1.4915 29.9 1 0.0542767
17 1.4915 31.9 1 0.0542767
18 1.4915 33.9 0 0
19 1.4915 35.9 1 0.0542767
20 1.4915 37.9 0 0
21 1.4915 39.9 0 0
22 1.4915 41.9 0 0
23 2.4745 -0.1 0 0
24 2.4745 1.9 0 0
25 2.4745 3.9 1 0.0327152
26 2.4745 5.9 1 0.0327152
27 2.4745 7.9 0 0
28 2.4745 9.9 1 0.0327152
29 2.4745 11.9 0 0
30 2.4745 13.9 0 0
31 2.4745 15.9 2 0.0654303
32 2.4745 17.9 0 0
33 2.4745 19.9 2 0.0654303
34 2.4745 21.9 1 0.0327152
35 2.4745 23.9 3 0.0981455
36 2.4745 25.9 1 0.0327152
37 2.4745 27.9 0 0
38 2.4745 29.9 1 0.0327152
39 2.4745 31.9 1 0.0327152
40 2.4745 33.9 0 0
41 2.4745 35.9 3 0.0981455
42 2.4745 37.9 0 0
43 2.4745 39.9 0 0
44 2.4745 41.9 0 0
45 3.4575 -0.1 0 0
46 3.4575 1.9 2 0.0468279
47 3.4575 3.9 1 0.0234139
48 3.4575 5.9 0 0
49 3.4575 7.9 2 0.0468279
50 3.4575 9.9 1 0.0234139
51 3.4575 11.9 1 0.0234139
52 3.4575 13.9 0 0
53 3.4575 15.9 0 0
54 3.4575 17.9 0 0
55 3.4575 19.9 0 0
56 3.4575 21.9 0 0
57 3.4575 23.9 0 0
58 3.4575 25.9 1 0.0234139
59 3.4575 27.9 0 0
60 3.4575 29.9 1 0.0234139
61 3.4575 31.9 1 0.0234139
62 3.4575 33.9 0 0
63 3.4575 35.9 1 0.0234139
64 3.4575 37.9 0 0
65 3.4575 39.9 0 0
66 3.4575 41.9 2 0.0468279
67 4.4405 -0.1 0 0
68 4.4405 1.9 0 0
69 4.4405 3.9 1 0.0182308
70 4.4405 5.9 1 0.0182308
71 4.4405 7.9 2 0.0364615
72 4.4405 9.9 0 0
73 4.4405 11.9 1 0.0182308
74 4.4405 13.9 1 0.0182308
75 4.4405 15.9 2 0.0364615
76 4.4405 17.9 4 0.072923
77 4.4405 19.9 2 0.0364615
78 4.4405 21.9 0 0
79 4.4405 23.9 1 0.0182308
80 4.4405 25.9 1 0.0182308
81 4.4405 27.9 2 0.0364615
82 4.4405 29.9 2 0.0364615
83 4.4405 31.9 2 0.0364615
84 4.4405 33.9 1 0.0182308
85 4.4405 35.9 0 0
86 4.4405 37.9 1 0.0182308
87 4.4405 39.9 0 0
88 4.4405 41.9 1 0.0182308
89 5.4235 -0.1 0 0
90 5.4235 1.9 1 0.0149265
91 5.4235 3.9 1 0.0149265
92 5.4235 5.9 1 0.0149265
93 5.4235 7.9 0 0
94 5.4235 9.9 3 0.0447794
95 5.4235 11.9 0 0
96 5.4235 13.9 4 0.0597059
97 5.4235 15.9 0 0
98 5.4235 17.9 1 0.0149265
99 5.4235 19.9 1 0.0149265
100 5.4235 21.9 0 0
101 5.4235 23.9 2 0.0298529
102 5.4235 25.9 2 0.0298529
103 5.4235 27.9 0 0
104 5.4235 29.9 1 0.0149265
105 5.4235 31.9 1 0.0149265
106 5.4235 33.9 3 0.0447794
107 5.4235 35.9 1 0.0149265
108 5.4235 37.9 1 0.0149265
109 5.4235 39.9 0 0
110 5.4235 41.9 0 0
111 6.4065 -0.1 0 0
112 6.4065 1.9 2 0.0252724
113 6.4065 3.9 0 0
114 6.4065 5.9 0 0
115 6.4065 7.9 1 0.0126362
116 6.4065 9.9 0 0
117 6.4065 11.9 3 0.0379085
118 6.4065 13.9 1 0.0126362
119 6.4065 15.9 1 0.0126362
120 6.4065 17.9 1 0.0126362
121 6.4065 19.9 0 0
122 6.4065 21.9 1 0.0126362
123 6.4065 23.9 0 0
124 6.4065 25.9 2 0.0252724
125 6.4065 27.9 2 0.0252724
126 6.4065 29.9 1 0.0126362
127 6.4065 31.9 0 0
128 6.4065 33.9 2 0.0252724
129 6.4065 35.9 0 0
130 6.4065 37.9 1 0.0126362
131 6.4065 39.9 1 0.0126362
132 6.4065 41.9 0 0
133 7.3895 -0.1 0 0
134 7.3895 1.9 3 0.0328657
135 7.3895 3.9 1 0.0109552
136 7.3895 5.9 3 0.0328657
137 7.3895 7.9 2 0.0219105
138 7.3895 9.9 4 0.0438209
139 7.3895 11.9 0 0
140 7.3895 13.9 2 0.0219105
141 7.3895 15.9 3 0.0328657
142 7.3895 17.9 0 0
143 7.3895 19.9 6 0.0657314
144 7.3895 21.9 2 0.0219105
145 7.3895 23.9 3 0.0328657
146 7.3895 25.9 4 0.0438209
147 7.3895 27.9 2 0.0219105
148 7.3895 29.9 3 0.0328657
149 7.3895 31.9 1 0.0109552
150 7.3895 33.9 2 0.0219105
151 7.3895 35.9 4 0.0438209
152 7.3895 37.9 3 0.0328657

First of all, your data actually forms a regular grid (22 x 7 data points; two lines appear to be missing at the end).
So, depending on what exactly you are expecting, there might be no need for set dgrid3d, which is meant for non-gridded data.
Anyway, it looks to me as if you want some smoothing or averaging.
If you just set dgrid3d, the default is 10,10 qnorm 1. Check help dgrid3d:
The qnorm algorithm calculates a weighted average of the input data at each grid point. Each data point is weighted by the inverse of its distance from the grid point raised to some power. The power is specified as an optional integer parameter that defaults to 1. This algorithm is the default.
So my understanding is: if your highest data point happens not to lie exactly on a dgrid3d grid point, the plotted maximum can be a different (lower) value, which is what you observe.
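Just to make that concrete, here is a tiny inverse-distance-weighting illustration (written in R purely to mimic the idea; it is not gnuplot's actual code): a grid point that does not coincide with the data maximum averages that maximum together with its lower neighbours, so the interpolated peak ends up below the true one.
# toy qnorm-style weighting with power p = 1 (the default):
# value at a grid point = sum(w_i * z_i) / sum(w_i)  with  w_i = 1 / d_i^p
z <- c(6, 0, 2, 3)          # data values near one grid point; the real peak is 6
d <- c(0.8, 0.5, 0.6, 1.2)  # distances of those data points from the grid point
p <- 1
w <- 1 / d^p
sum(w * z) / sum(w)         # about 2.3 -- well below the data maximum of 6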
Two suggestions:
Since your points are on a regular grid, add an empty line to the data file wherever the value in column 2 changes; then you can plot the data without dgrid3d.
Or, alternatively, use dgrid3d splines, where the surface passes through your data points; there is no need to insert empty lines into your data.
Note: if you have an MxN grid you will get (M-1)*(N-1) surface elements, each defined by 4 corner points. You can define which corner determines each element's color; check help corners2color and, here, the very last graph. In your case, max would be suitable.
Data: SO74912379.dat
1 1.4915 -0.1 1 0.0542767
2 1.4915 1.9 0 0
3 1.4915 3.9 1 0.0542767
4 1.4915 5.9 1 0.0542767
5 1.4915 7.9 0 0
6 1.4915 9.9 1 0.0542767
7 1.4915 11.9 1 0.0542767
8 1.4915 13.9 2 0.108553
9 1.4915 15.9 0 0
10 1.4915 17.9 2 0.108553
11 1.4915 19.9 0 0
12 1.4915 21.9 0 0
13 1.4915 23.9 1 0.0542767
14 1.4915 25.9 1 0.0542767
15 1.4915 27.9 1 0.0542767
16 1.4915 29.9 1 0.0542767
17 1.4915 31.9 1 0.0542767
18 1.4915 33.9 0 0
19 1.4915 35.9 1 0.0542767
20 1.4915 37.9 0 0
21 1.4915 39.9 0 0
22 1.4915 41.9 0 0
23 2.4745 -0.1 0 0
24 2.4745 1.9 0 0
25 2.4745 3.9 1 0.0327152
26 2.4745 5.9 1 0.0327152
27 2.4745 7.9 0 0
28 2.4745 9.9 1 0.0327152
29 2.4745 11.9 0 0
30 2.4745 13.9 0 0
31 2.4745 15.9 2 0.0654303
32 2.4745 17.9 0 0
33 2.4745 19.9 2 0.0654303
34 2.4745 21.9 1 0.0327152
35 2.4745 23.9 3 0.0981455
36 2.4745 25.9 1 0.0327152
37 2.4745 27.9 0 0
38 2.4745 29.9 1 0.0327152
39 2.4745 31.9 1 0.0327152
40 2.4745 33.9 0 0
41 2.4745 35.9 3 0.0981455
42 2.4745 37.9 0 0
43 2.4745 39.9 0 0
44 2.4745 41.9 0 0
45 3.4575 -0.1 0 0
46 3.4575 1.9 2 0.0468279
47 3.4575 3.9 1 0.0234139
48 3.4575 5.9 0 0
49 3.4575 7.9 2 0.0468279
50 3.4575 9.9 1 0.0234139
51 3.4575 11.9 1 0.0234139
52 3.4575 13.9 0 0
53 3.4575 15.9 0 0
54 3.4575 17.9 0 0
55 3.4575 19.9 0 0
56 3.4575 21.9 0 0
57 3.4575 23.9 0 0
58 3.4575 25.9 1 0.0234139
59 3.4575 27.9 0 0
60 3.4575 29.9 1 0.0234139
61 3.4575 31.9 1 0.0234139
62 3.4575 33.9 0 0
63 3.4575 35.9 1 0.0234139
64 3.4575 37.9 0 0
65 3.4575 39.9 0 0
66 3.4575 41.9 2 0.0468279
67 4.4405 -0.1 0 0
68 4.4405 1.9 0 0
69 4.4405 3.9 1 0.0182308
70 4.4405 5.9 1 0.0182308
71 4.4405 7.9 2 0.0364615
72 4.4405 9.9 0 0
73 4.4405 11.9 1 0.0182308
74 4.4405 13.9 1 0.0182308
75 4.4405 15.9 2 0.0364615
76 4.4405 17.9 4 0.072923
77 4.4405 19.9 2 0.0364615
78 4.4405 21.9 0 0
79 4.4405 23.9 1 0.0182308
80 4.4405 25.9 1 0.0182308
81 4.4405 27.9 2 0.0364615
82 4.4405 29.9 2 0.0364615
83 4.4405 31.9 2 0.0364615
84 4.4405 33.9 1 0.0182308
85 4.4405 35.9 0 0
86 4.4405 37.9 1 0.0182308
87 4.4405 39.9 0 0
88 4.4405 41.9 1 0.0182308
89 5.4235 -0.1 0 0
90 5.4235 1.9 1 0.0149265
91 5.4235 3.9 1 0.0149265
92 5.4235 5.9 1 0.0149265
93 5.4235 7.9 0 0
94 5.4235 9.9 3 0.0447794
95 5.4235 11.9 0 0
96 5.4235 13.9 4 0.0597059
97 5.4235 15.9 0 0
98 5.4235 17.9 1 0.0149265
99 5.4235 19.9 1 0.0149265
100 5.4235 21.9 0 0
101 5.4235 23.9 2 0.0298529
102 5.4235 25.9 2 0.0298529
103 5.4235 27.9 0 0
104 5.4235 29.9 1 0.0149265
105 5.4235 31.9 1 0.0149265
106 5.4235 33.9 3 0.0447794
107 5.4235 35.9 1 0.0149265
108 5.4235 37.9 1 0.0149265
109 5.4235 39.9 0 0
110 5.4235 41.9 0 0
111 6.4065 -0.1 0 0
112 6.4065 1.9 2 0.0252724
113 6.4065 3.9 0 0
114 6.4065 5.9 0 0
115 6.4065 7.9 1 0.0126362
116 6.4065 9.9 0 0
117 6.4065 11.9 3 0.0379085
118 6.4065 13.9 1 0.0126362
119 6.4065 15.9 1 0.0126362
120 6.4065 17.9 1 0.0126362
121 6.4065 19.9 0 0
122 6.4065 21.9 1 0.0126362
123 6.4065 23.9 0 0
124 6.4065 25.9 2 0.0252724
125 6.4065 27.9 2 0.0252724
126 6.4065 29.9 1 0.0126362
127 6.4065 31.9 0 0
128 6.4065 33.9 2 0.0252724
129 6.4065 35.9 0 0
130 6.4065 37.9 1 0.0126362
131 6.4065 39.9 1 0.0126362
132 6.4065 41.9 0 0
133 7.3895 -0.1 0 0
134 7.3895 1.9 3 0.0328657
135 7.3895 3.9 1 0.0109552
136 7.3895 5.9 3 0.0328657
137 7.3895 7.9 2 0.0219105
138 7.3895 9.9 4 0.0438209
139 7.3895 11.9 0 0
140 7.3895 13.9 2 0.0219105
141 7.3895 15.9 3 0.0328657
142 7.3895 17.9 0 0
143 7.3895 19.9 6 0.0657314
144 7.3895 21.9 2 0.0219105
145 7.3895 23.9 3 0.0328657
146 7.3895 25.9 4 0.0438209
147 7.3895 27.9 2 0.0219105
148 7.3895 29.9 3 0.0328657
149 7.3895 31.9 1 0.0109552
150 7.3895 33.9 2 0.0219105
151 7.3895 35.9 4 0.0438209
152 7.3895 37.9 3 0.0328657
# 153 7.3895 39.9 ? ?
# 154 7.3895 41.9 ? ?
Script:
### plot with pm3d
reset session
FILE = "SO74912379.dat"
set xrange[1.4915:7.3895]
set yrange[-0.1:41.9]
set cbrange[0:6]
set ticslevel 0
set style lines 100 lt 5 lw 0.5
set pm3d hidden3d 100 border
set pm3d corners2color max
set grid
set multiplot layout 2,1
splot FILE using 2:3:4 with pm3d ti "insert empty lines and no dgrid3d"
set dgrid3d 30,30 splines
splot FILE using 2:3:4 with pm3d ti "dgrid3d 30,30 splines"
unset multiplot
### end of script
Result:

Related

How to add spatial interpolation in R

I have a question because I don't know how to interpolate my data on a map.
I'm trying to reproduce the map from http://aasa.ut.ee/Rspatial/05_session.html and I want to achieve something like this:
map from tutorial. But something is happening in the background that I don't fully understand.
What I already have:
a data frame with longitude, latitude and preassure columns.
The range of longitude and latitude is xlim=c(-10, 30), ylim=c(40,65).
> df
longitude latitude preassure
1 65.0 -10.0 999.6984
2 65.0 -7.5 999.7445
3 65.0 -5.0 999.5182
4 65.0 -2.5 999.0021
5 65.0 0.0 998.4595
6 65.0 2.5 998.0452
7 65.0 5.0 997.8119
8 65.0 7.5 997.4956
9 65.0 10.0 997.1532
10 65.0 12.5 997.1851
11 65.0 15.0 997.3216
12 65.0 17.5 997.0767
13 65.0 20.0 996.5215
14 65.0 22.5 996.0055
15 65.0 25.0 995.7271
16 65.0 27.5 995.6919
17 65.0 30.0 995.8885
18 65.0 32.5 996.1016
19 65.0 35.0 996.3514
20 62.5 -10.0 1001.9770
21 62.5 -7.5 1002.4431
22 62.5 -5.0 1002.6335
23 62.5 -2.5 1002.6905
24 62.5 0.0 1002.7607
25 62.5 2.5 1002.7090
26 62.5 5.0 1002.2872
27 62.5 7.5 1001.8760
28 62.5 10.0 1001.8066
29 62.5 12.5 1001.3829
30 62.5 15.0 1000.1226
31 62.5 17.5 998.7299
32 62.5 20.0 997.8351
33 62.5 22.5 997.3430
34 62.5 25.0 997.2307
35 62.5 27.5 997.3709
36 62.5 30.0 997.5797
37 62.5 32.5 997.7341
38 62.5 35.0 997.9541
39 60.0 -10.0 1006.7589
40 60.0 -7.5 1007.3146
41 60.0 -5.0 1007.5654
42 60.0 -2.5 1007.7268
43 60.0 0.0 1007.8035
44 60.0 2.5 1007.5192
45 60.0 5.0 1006.9113
46 60.0 7.5 1006.3931
47 60.0 10.0 1005.7369
48 60.0 12.5 1004.3599
49 60.0 15.0 1002.6407
50 60.0 17.5 1001.2952
51 60.0 20.0 1000.2639
52 60.0 22.5 999.5920
53 60.0 25.0 999.3969
54 60.0 27.5 999.3746
55 60.0 30.0 999.4167
56 60.0 32.5 999.6822
57 60.0 35.0 1000.2075
58 57.5 -10.0 1011.8321
59 57.5 -7.5 1012.5412
60 57.5 -5.0 1013.0607
61 57.5 -2.5 1013.2542
62 57.5 0.0 1013.0942
63 57.5 2.5 1012.6132
64 57.5 5.0 1011.7856
65 57.5 7.5 1010.6275
66 57.5 10.0 1009.2998
67 57.5 12.5 1007.9899
68 57.5 15.0 1006.7288
69 57.5 17.5 1005.4613
70 57.5 20.0 1004.2938
71 57.5 22.5 1003.5243
72 57.5 25.0 1003.2300
73 57.5 27.5 1003.0484
74 57.5 30.0 1002.9350
75 57.5 32.5 1003.1247
76 57.5 35.0 1003.6449
77 55.0 -10.0 1017.0270
78 55.0 -7.5 1018.0844
79 55.0 -5.0 1018.8001
80 55.0 -2.5 1018.8523
81 55.0 0.0 1018.3936
82 55.0 2.5 1017.8593
83 55.0 5.0 1017.3493
84 55.0 7.5 1016.5559
85 55.0 10.0 1015.3215
86 55.0 12.5 1013.9338
87 55.0 15.0 1012.6094
88 55.0 17.5 1011.2712
89 55.0 20.0 1009.9580
90 55.0 22.5 1008.9790
91 55.0 25.0 1008.4529
92 55.0 27.5 1008.0782
93 55.0 30.0 1007.7059
94 55.0 32.5 1007.5469
95 55.0 35.0 1007.5681
96 52.5 -10.0 1021.5118
97 52.5 -7.5 1022.4463
98 52.5 -5.0 1022.9901
99 52.5 -2.5 1023.1570
100 52.5 0.0 1023.0451
101 52.5 2.5 1022.8766
102 52.5 5.0 1022.7153
103 52.5 7.5 1022.2165
104 52.5 10.0 1021.0707
105 52.5 12.5 1019.6818
106 52.5 15.0 1018.5007
107 52.5 17.5 1017.3067
108 52.5 20.0 1015.8635
109 52.5 22.5 1014.5103
110 52.5 25.0 1013.4495
111 52.5 27.5 1012.4800
112 52.5 30.0 1011.7438
113 52.5 32.5 1011.5841
114 52.5 35.0 1011.6581
115 50.0 -10.0 1024.6059
116 50.0 -7.5 1025.4720
117 50.0 -5.0 1026.0310
118 50.0 -2.5 1026.4533
119 50.0 0.0 1026.6738
120 50.0 2.5 1026.6087
121 50.0 5.0 1026.4604
122 50.0 7.5 1026.2937
123 50.0 10.0 1025.5823
124 50.0 12.5 1024.3837
125 50.0 15.0 1023.3542
126 50.0 17.5 1022.3630
127 50.0 20.0 1021.0361
128 50.0 22.5 1019.7546
129 50.0 25.0 1018.5380
130 50.0 27.5 1016.9760
131 50.0 30.0 1015.5891
132 50.0 32.5 1015.2455
133 50.0 35.0 1015.3957
134 47.5 -10.0 1026.5579
135 47.5 -7.5 1027.6550
136 47.5 -5.0 1028.1215
137 47.5 -2.5 1028.2034
138 47.5 0.0 1028.3665
139 47.5 2.5 1028.4254
140 47.5 5.0 1028.6371
141 47.5 7.5 1029.1085
142 47.5 10.0 1028.8970
143 47.5 12.5 1027.6374
144 47.5 15.0 1026.1353
145 47.5 17.5 1024.8384
146 47.5 20.0 1023.6115
147 47.5 22.5 1022.7498
148 47.5 25.0 1022.1981
149 47.5 27.5 1020.5934
150 47.5 30.0 1018.6138
151 47.5 32.5 1017.7399
152 47.5 35.0 1017.6530
153 45.0 -10.0 1027.3521
154 45.0 -7.5 1028.1177
155 45.0 -5.0 1028.4873
156 45.0 -2.5 1028.4996
157 45.0 0.0 1028.7896
158 45.0 2.5 1028.8341
159 45.0 5.0 1027.7460
160 45.0 7.5 1026.4421
161 45.0 10.0 1025.6955
162 45.0 12.5 1025.1825
163 45.0 15.0 1025.0429
164 45.0 17.5 1025.2959
165 45.0 20.0 1025.0541
166 45.0 22.5 1024.3106
167 45.0 25.0 1022.9915
168 45.0 27.5 1021.3729
169 45.0 30.0 1020.4743
170 45.0 32.5 1019.7272
171 45.0 35.0 1019.3080
172 42.5 -10.0 1027.2090
173 42.5 -7.5 1027.8139
174 42.5 -5.0 1028.9445
175 42.5 -2.5 1029.2019
176 42.5 0.0 1028.4393
177 42.5 2.5 1026.8987
178 42.5 5.0 1025.0419
179 42.5 7.5 1023.9250
180 42.5 10.0 1024.0046
181 42.5 12.5 1024.0845
182 42.5 15.0 1023.5381
183 42.5 17.5 1023.5317
184 42.5 20.0 1024.7725
185 42.5 22.5 1025.0638
186 42.5 25.0 1022.9272
187 42.5 27.5 1022.0318
188 42.5 30.0 1021.7181
189 42.5 32.5 1020.8714
190 42.5 35.0 1020.5228
191 40.0 -10.0 1026.8144
192 40.0 -7.5 1027.3171
193 40.0 -5.0 1029.2556
194 40.0 -2.5 1028.9006
195 40.0 0.0 1026.9173
196 40.0 2.5 1026.1678
197 40.0 5.0 1025.5287
198 40.0 7.5 1024.6530
199 40.0 10.0 1024.0673
200 40.0 12.5 1023.5342
201 40.0 15.0 1022.7612
202 40.0 17.5 1022.1834
203 40.0 20.0 1022.2757
204 40.0 22.5 1022.6448
205 40.0 25.0 1021.9519
206 40.0 27.5 1021.6179
207 40.0 30.0 1022.3322
208 40.0 32.5 1023.9349
209 40.0 35.0 1023.3993
Here is my code:
library("rnaturalearth")
library(sf)
library(ggplot2)
library(viridis)
world <- ne_countries(scale = "medium", returnclass = "sf")
sf::sf_use_s2(FALSE)
ggplot(data = world) +
  geom_raster(data = df, aes(x = latitude, y = longitude, fill = preassure), interpolate = TRUE) +
  scale_fill_viridis(direction = 1) +
  geom_sf(fill = "NA", colour = "white") +
  coord_sf(xlim = c(-10, 30), ylim = c(40, 65), expand = F) +
  xlab("Długość geograficzna") + ylab("Szerokość geograficzna") +   # Polish: "Longitude" / "Latitude"
  labs(title = "SLP zima", fill = "Cisnienie") +                    # Polish: title "SLP winter", legend "Pressure"
  theme(plot.title = element_text(hjust = 0.5))
And the result: what I was able to do
And finally, I want something like this:
map with the pressure interpolation - demo from Paint
Thanks for all your help!
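One common way to get a smooth field like the tutorial map is to interpolate the pressure values onto a much finer grid before handing them to geom_raster() (interpolate = TRUE in geom_raster() only smooths the pixel rendering; it does not add data points). A minimal sketch, assuming the akima package is available (the newer interp package offers a similar interp()) and keeping the column names from the df above, where the column called latitude holds the values plotted on the x axis:
library(akima)
library(ggplot2)

# interpolate the pressure field onto a 200 x 200 grid (bilinear by default)
fld <- with(df, interp(x = latitude, y = longitude, z = preassure,
                       xo = seq(min(latitude),  max(latitude),  length = 200),
                       yo = seq(min(longitude), max(longitude), length = 200)))

# interp() returns a list with vectors x, y and a z matrix; reshape it into a long data frame
df_fine <- cbind(expand.grid(latitude = fld$x, longitude = fld$y),
                 preassure = as.vector(fld$z))

# plot the smooth field; the geom_sf()/coord_sf() layers from the question can be added on top
ggplot(df_fine, aes(x = latitude, y = longitude, fill = preassure)) +
  geom_raster() +
  scale_fill_viridis_c(direction = 1)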

Convert single column variable into multiple columns

How do I convert a variable such as this into a dataframe with multiple columns?
1 18.0 8 307.0 130.0 3504. 12.0 70 1 chevrolet chevelle malibu
2 15.0 8 350.0 165.0 3693. 11.5 70 1 buick skylark 320
3 18.0 8 318.0 150.0 3436. 11.0 70 1 plymouth satellite
4 16.0 8 304.0 150.0 3433. 12.0 70 1 amc rebel sst
5 17.0 8 302.0 140.0 3449. 10.5 70 1 ford torino
6 15.0 8 429.0 198.0 4341. 10.0 70 1 ford galaxie 500
The imported data ends up as a single column instead of multiple columns. I tried separate(), but it split the numbers at the decimal points.
auto.mpg %>% separate(V1,c("mpg","cylinders","displacement","power","weight","acceleration","year","origin"))
mpg cylinders displacement power weight acceleration year origin V2
1 18 0 8 307 0 130 0 3504 chevrolet chevelle malibu
2 15 0 8 350 0 165 0 3693 buick skylark 320
3 18 0 8 318 0 150 0 3436 plymouth satellite
4 16 0 8 304 0 150 0 3433 amc rebel sst
5 17 0 8 302 0 140 0 3449 ford torino
6 15 0 8 429 0 198 0 4341 ford galaxie 500
Current data set:
You could try this (separate()'s default separator matches any sequence of non-alphanumeric characters, which is why it also split at the decimal points):
auto.mpg <- as.data.frame(do.call(rbind, (strsplit(as.character(auto.mpg$V1), " {2,10}"))))
auto.mpg
# V1 V2 V3 V4 V5 V6 V7 V8 V9 V10
# 1 1 18.0 8 307.0 130.0 3504. 12.0 70 1 chevrolet chevelle malibu
# 2 2 15.0 8 350.0 165.0 3693. 11.5 70 1 buick skylark 320
# 3 3 18.0 8 318.0 150.0 3436. 11.0 70 1 plymouth satellite
# 4 4 16.0 8 304.0 150.0 3433. 12.0 70 1 amc rebel sst
# 5 5 17.0 8 302.0 140.0 3449. 10.5 70 1 ford torino
# 6 6 15.0 8 429.0 198.0 4341. 10.0 70 1 ford galaxie 500
and then
auto.mpg <- auto.mpg[,-1]
names(auto.mpg) <- c("mpg", "cylinders", "displacement", "power",
"weight", "acceleration", "year", "origin", "model")
auto.mpg
# mpg cylinders displacement power weight acceleration year origin model
# 1 18.0 8 307.0 130.0 3504. 12.0 70 1 chevrolet chevelle malibu
# 2 15.0 8 350.0 165.0 3693. 11.5 70 1 buick skylark 320
# 3 18.0 8 318.0 150.0 3436. 11.0 70 1 plymouth satellite
# 4 16.0 8 304.0 150.0 3433. 12.0 70 1 amc rebel sst
# 5 17.0 8 302.0 140.0 3449. 10.5 70 1 ford torino
# 6 15.0 8 429.0 198.0 4341. 10.0 70 1 ford galaxie 500
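The same idea can also be expressed with tidyr itself by giving separate() an explicit whitespace regex instead of its default separator; a sketch resting on the same assumption as the regex above, namely that the fields are separated by runs of at least two spaces:
library(tidyr)
library(dplyr)

auto.mpg %>%
  separate(V1,
           into = c("row", "mpg", "cylinders", "displacement", "power",
                    "weight", "acceleration", "year", "origin", "model"),
           sep = " {2,}",        # split only on runs of 2+ spaces, so "18.0" stays intact
           convert = TRUE) %>%   # turn the numeric columns from character into numbers
  select(-row)                   # drop the leading row index, as done above with auto.mpg[,-1]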

How to solve this error for pollution rose?

I have an issue in R where I am trying to make a pollution rose graph. I am almost sure that the data and the code are correct, but I keep getting an error message and I can't figure out what it means. The error message is:
Error in `[[<-.data.frame`(`*tmp*`, vars[i], value = numeric(0)) :
replacement has 0 rows, data has 37.
My code is:
pollutionRose(pollution_rose, pollutant = "PM10",header= TRUE, cols = c("darkblue","green4","yellow2","red","red4"), key.position = "bottom",max.freq = 50)
and here is my data:
HH:MM:SS WD1 WS1 PM10 PM2.5 PM1
10:10:00 AM 0 0 0 0 0
10:20:00 AM 0 0 0 0 0
10:29:00 AM 254 0.4 0 0 0
10:30:00 AM 109 0.5 0 0 0
10:40:00 AM 21 1.9 0 0 0
10:50:00 AM 148 1.2 0 0 0
10:54:00 AM 222 1.1 0 0 0
10:55:00 AM 61 1 0 0 0
11:00:00 AM 109 0.6 19 4.3 1.8
11:10:00 AM 354 0.7 20.4 4.1 1.7
11:20:00 AM 5 2.6 8.3 3.8 1.6
11:29:00 AM 60 2.6 7.9 3.8 1.5
11:30:00 AM 97 1.5 18.6 3.8 1.5
11:40:00 AM 42 0.8 15.6 3.8 1.5
11:50:00 AM 52 0 10.5 4.3 1.6
12:00:00 PM 60 0.9 11.7 3.9 1.5
12:10:00 PM 74 1 9.6 4.1 1.4
12:20:00 PM 338 1.7 0 0 0
12:30:00 PM 285 4.4 0 0 0
12:40:00 PM 296 3.6 0 0 0
12:50:00 PM 241 3.3 0 0 0
1:00:00 PM 274 1.2 0 0 0
1:10:00 PM 287 1.3 15.8 4.4 1.6
1:20:00 PM 317 3 13.1 4.6 1.7
1:30:00 PM 309 2.6 10.5 3.5 1.4
1:31:00 PM 244 3.5 14.8 4.2 1.5
1:40:00 PM 251 0.9 12.8 4.1 1.5
1:50:00 PM 282 1.1 12.9 4.8 1.8
2:00:00 PM 254 2.5 9.6 4.9 1.7
2:10:00 PM 245 2.3 10.9 4.6 1.6
2:20:00 PM 207 2.1 0 0 0
2:30:00 PM 30 0 0 0 0
2:37:00 PM 62 0.7 12.9 4.3 1.6
2:40:00 PM 80 1.8 10.1 3.6 1.5
2:40:00 PM 0 0 10.1 3.6 1.5
2:50:00 PM 0 0 10 4.3 1.5
3:00:00 PM 0 0 0 0 0
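This error usually means openair cannot find the columns it expects rather than that the data values are wrong. A hedged sketch of a likely fix, assuming the problem is the column names (openair looks for lower-case ws and wd for wind speed and direction) and noting that header = TRUE is a read.csv() argument, not a pollutionRose() one:
library(openair)

# rename the measured wind columns to the names openair expects
names(pollution_rose)[names(pollution_rose) == "WD1"] <- "wd"
names(pollution_rose)[names(pollution_rose) == "WS1"] <- "ws"

# some openair functions also want a POSIXct "date" column; if needed, one could build it
# from the HH:MM:SS text (the date below is a placeholder, assuming one day of readings):
# pollution_rose$date <- as.POSIXct(paste("2022-01-01", pollution_rose$`HH:MM:SS`),
#                                   format = "%Y-%m-%d %I:%M:%S %p")

pollutionRose(pollution_rose, pollutant = "PM10",
              cols = c("darkblue", "green4", "yellow2", "red", "red4"),
              key.position = "bottom", max.freq = 50)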

Apply rolling quartiles with skipping in R?

I have the following weather data for 10 years, where each calendar date has 10 values (10 years x 1 day). I am trying to calculate the 90th percentile (for Max and Min temperature) using a centered moving percentile over a 15-day calendar window. E.g., for May-1 the 90th percentile is calculated from April-24 to May-8; May-2 then uses data from April-25 to May-9. Thus, for my data I have to use a window of 150 values (10 years x 15 days), since I have 10 values per day.
head(df,50)
YEAR MONTH DAY MEAN MAX MIN Data2 MD
114 1985 4 24 20.0 25.9 13.8 1985-04-24 04-24
479 1986 4 24 22.0 28.6 16.0 1986-04-24 04-24
844 1987 4 24 23.9 30.0 18.5 1987-04-24 04-24
1210 1988 4 24 23.1 28.0 17.2 1988-04-24 04-24
1575 1989 4 24 22.8 28.0 16.7 1989-04-24 04-24
1940 1990 4 24 17.9 23.4 10.5 1990-04-24 04-24
2305 1991 4 24 24.4 32.0 17.8 1991-04-24 04-24
2671 1992 4 24 23.8 29.8 18.8 1992-04-24 04-24
3036 1993 4 24 23.5 28.4 19.2 1993-04-24 04-24
3401 1994 4 24 23.5 30.0 15.0 1994-04-24 04-24
115 1985 4 25 21.3 28.1 15.0 1985-04-25 04-25
480 1986 4 25 22.0 26.7 18.2 1986-04-25 04-25
845 1987 4 25 24.3 30.4 19.0 1987-04-25 04-25
1211 1988 4 25 21.7 26.4 17.4 1988-04-25 04-25
1576 1989 4 25 23.0 28.0 17.0 1989-04-25 04-25
1941 1990 4 25 15.3 22.7 11.2 1990-04-25 04-25
2306 1991 4 25 25.2 33.0 20.0 1991-04-25 04-25
2672 1992 4 25 23.2 28.4 19.8 1992-04-25 04-25
3037 1993 4 25 22.7 27.4 19.7 1993-04-25 04-25
3402 1994 4 25 22.8 30.0 15.0 1994-04-25 04-25
116 1985 4 26 21.4 28.0 16.5 1985-04-26 04-26
481 1986 4 26 21.3 26.4 18.1 1986-04-26 04-26
846 1987 4 26 24.7 31.0 20.6 1987-04-26 04-26
1212 1988 4 26 21.1 26.5 14.0 1988-04-26 04-26
1577 1989 4 26 21.5 28.0 16.5 1989-04-26 04-26
1942 1990 4 26 15.7 21.5 9.8 1990-04-26 04-26
2307 1991 4 26 25.7 32.8 20.0 1991-04-26 04-26
2673 1992 4 26 22.6 27.0 17.6 1992-04-26 04-26
3038 1993 4 26 22.0 26.0 19.0 1993-04-26 04-26
3403 1994 4 26 23.3 29.6 18.5 1994-04-26 04-26
117 1985 4 27 21.5 28.0 16.0 1985-04-27 04-27
482 1986 4 27 20.0 26.5 15.0 1986-04-27 04-27
847 1987 4 27 24.5 30.4 20.0 1987-04-27 04-27
1213 1988 4 27 23.4 28.3 14.0 1988-04-27 04-27
1578 1989 4 27 19.8 27.7 15.0 1989-04-27 04-27
1943 1990 4 27 12.3 16.4 10.0 1990-04-27 04-27
2308 1991 4 27 25.8 32.8 19.0 1991-04-27 04-27
2674 1992 4 27 22.0 28.4 15.6 1992-04-27 04-27
3039 1993 4 27 19.6 24.2 16.2 1993-04-27 04-27
3404 1994 4 27 25.0 31.6 19.0 1994-04-27 04-27
118 1985 4 28 21.4 31.1 16.0 1985-04-28 04-28
483 1986 4 28 19.7 25.9 15.0 1986-04-28 04-28
848 1987 4 28 24.6 32.8 18.0 1987-04-28 04-28
1214 1988 4 28 24.8 29.4 15.2 1988-04-28 04-28
1579 1989 4 28 23.1 28.4 14.0 1989-04-28 04-28
1944 1990 4 28 17.9 23.2 8.2 1990-04-28 04-28
2309 1991 4 28 26.2 32.0 19.0 1991-04-28 04-28
2675 1992 4 28 23.3 28.6 16.0 1992-04-28 04-28
3040 1993 4 28 19.1 24.2 14.0 1993-04-28 04-28
3405 1994 4 28 23.9 32.2 17.2 1994-04-28 04-28
I have tried this, but it did not include all 10 values of the centered day:
require(caTools)
k=150
df$MPerMAX<- runquantile(df$MAX, k, probs = .9, endrule="NA" )
df[71:83,]
YEAR MONTH DAY MEAN MAX MIN Data2 MD MPerMAX
121 1985 5 1 21.7 28.3 15.5 1985-05-01 05-01 NA
486 1986 5 1 24.0 32.0 16.8 1986-05-01 05-01 NA
851 1987 5 1 25.0 32.6 18.4 1987-05-01 05-01 NA
1217 1988 5 1 26.6 32.2 18.2 1988-05-01 05-01 NA
1582 1989 5 1 26.5 31.0 18.0 1989-05-01 05-01 32.8
1947 1990 5 1 24.3 29.4 19.2 1990-05-01 05-01 32.8
2312 1991 5 1 26.3 33.8 21.2 1991-05-01 05-01 32.8
2678 1992 5 1 23.1 26.2 20.4 1992-05-01 05-01 32.8
3043 1993 5 1 23.3 29.8 17.6 1993-05-01 05-01 32.8
3408 1994 5 1 25.0 30.8 20.0 1994-05-01 05-01 32.8
122 1985 5 2 22.3 29.1 18.0 1985-05-02 05-02 32.8
487 1986 5 2 24.3 30.7 18.8 1986-05-02 05-02 32.8
852 1987 5 2 25.1 32.0 19.4 1987-05-02 05-02 32.8
I also tried this, but it still did not work:
df$MPerMAX2=rollapply(df$MAX,width=150,FUN="quantile",p=.9, fill = NA)
To get around that I used a 160-value window (150 plus the 10 values of the center day). But for May-2 and the following dates the window only moves by 1 data point from each side; i.e., it spans from April-25 (using only one of its 10 values) to May-9 (using only one of its 10 values) when calculating the percentile, which is not correct.
So I am looking for a way to make the window move by 10 data points instead of one.
Any help please.
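One way to make the window advance by whole calendar days (10 rows at a time) rather than by single rows is to loop over the unique month-day values and pool all rows of the 15 surrounding days. A sketch, assuming df is ordered by MD and then YEAR as shown and that every day has exactly 10 rows; the window is simply truncated at the start and end of the record (replace that with NA if you prefer runquantile's endrule = "NA" behaviour):
days <- unique(df$MD)        # one entry per calendar day, in chronological order
n    <- length(days)

p90 <- sapply(seq_len(n), function(i) {
  idx  <- (i - 7):(i + 7)                 # 15-day window centred on day i
  idx  <- idx[idx >= 1 & idx <= n]        # truncate at the edges of the record
  vals <- df$MAX[df$MD %in% days[idx]]    # all 10 yearly values of every day in the window
  quantile(vals, probs = 0.9, na.rm = TRUE)
})

# attach the day-level 90th percentile back to every row of that calendar day
df$MPerMAX <- p90[match(df$MD, days)]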

Relate data of 2 split functions

I have a problem with the split command in R.
I have this:
>x1
$`1`
[1] 130 165 150 150 140 198 220 215 225 190 170 160 150 225 95 97 85 90 215 200
[21] 210 193 90 90 100 105 100 88 100 165 175 153 150 180 170 175 110 72 100 88
[41] 86 70 80 90 86 165 175 150 153 150 208 155 160 190 150 130 140 150 86 80
[61] 175 150 145 137 150 198 150 158 150 215 225 175 105 100 100 88 95 150 167 170
[81] 180 100 72 85 107 145 230 150 180 95 95 100 100 80 75 100 110 105 140 150
[101] 150 140 150 75 95 105 72 72 170 145 150 148 110 105 110 95 110 110 129 83
[121] 100 78 97 90 92 79 140 150 120 152 100 105 81 90 52 60 100 78 110 95
[141] 72 150 180 145 130 150 80 96 145 110 145 130 110 105 100 98 180 170 190 149
[161] 88 89 63 83 66 110 140 139 105 95 85 88 100 90 105 85 110 120 145 165
[181] 139 140 68 75 105 85 115 85 88 90 110 130 129 138 135 155 142 125 150 80
[201] 80 125 90 70 70 90 115 115 90 70 90 88 90 90 105 80 84 84 92 110
[221] 84 64 63 65 65 110 105 88 85 88 88 88 85 84 90 92 95 63 70 110
[241] 85 92 112 84 90 86 84 79 82
$`2`
[1] 46 87 90 95 113 90 70 76 60 54 112 76 87 69 46 90 49 75 91 112
[21] 110 83 67 78 75 75 67 71 70 95 88 98 115 86 81 83 70 71 102 88
[41] 120 58 78 78 110 48 103 125 115 133 71 71 77 71 69 76 78 48 48 67
[61] 67 110 62 88 74 75 80 76 74 52
$`3`
[1] 95 88 88 95 65 69 95 97 92 97 88 88 94 90 122 67 65 52 61 97
[21] 93 75 96 97 53 53 70 75 108 68 70 75 67 97 110 52 70 60 95 97
[41] 95 97 68 65 65 60 65 90 75 92 75 65 65 67 67 132 100 72 58 60
[61] 67 65 62 68 75 75 100 74 116 120 68 68 88 75 70 67 67 67 96
Then I keep just the first part, which is what interests me:
>y1=x1[1]
$`1`
[1] 130 165 150 150 140 198 220 215 225 190 170 160 150 225 95 97 85 90 215 200
[21] 210 193 90 90 100 105 100 88 100 165 175 153 150 180 170 175 110 72 100 88
[41] 86 70 80 90 86 165 175 150 153 150 208 155 160 190 150 130 140 150 86 80
[61] 175 150 145 137 150 198 150 158 150 215 225 175 105 100 100 88 95 150 167 170
[81] 180 100 72 85 107 145 230 150 180 95 95 100 100 80 75 100 110 105 140 150
[101] 150 140 150 75 95 105 72 72 170 145 150 148 110 105 110 95 110 110 129 83
[121] 100 78 97 90 92 79 140 150 120 152 100 105 81 90 52 60 100 78 110 95
[141] 72 150 180 145 130 150 80 96 145 110 145 130 110 105 100 98 180 170 190 149
[161] 88 89 63 83 66 110 140 139 105 95 85 88 100 90 105 85 110 120 145 165
[181] 139 140 68 75 105 85 115 85 88 90 110 130 129 138 135 155 142 125 150 80
[201] 80 125 90 70 70 90 115 115 90 70 90 88 90 90 105 80 84 84 92 110
[221] 84 64 63 65 65 110 105 88 85 88 88 88 85 84 90 92 95 63 70 110
[241] 85 92 112 84 90 86 84 79 82
I then do the same thing for the other variable:
> y2=x2[1]
$`1`
[1] 12.0 11.5 11.0 12.0 10.5 10.0 9.0 8.5 10.0 8.5 10.0 8.0 9.5 10.0 15.5 15.5
[17] 16.0 15.0 14.0 15.0 13.5 18.5 15.5 19.0 13.0 15.5 15.5 15.5 15.5 12.0 11.5 13.5
[33] 13.0 11.5 12.0 12.0 13.5 19.0 15.0 14.5 14.0 20.5 17.0 19.5 16.5 12.0 12.0 13.5
[49] 13.0 11.5 11.0 13.5 13.5 12.5 12.5 14.0 16.0 14.0 16.0 15.0 13.0 11.5 13.0 14.5
[65] 12.5 11.5 12.0 13.0 14.5 11.0 11.0 11.0 16.5 18.0 16.0 16.5 16.0 14.0 12.5 13.0
[81] 12.5 15.0 19.5 18.5 14.0 13.0 9.5 11.0 11.0 16.5 17.0 16.0 17.0 16.5 17.0 17.0
[97] 18.0 16.5 14.0 14.5 13.5 16.0 15.5 14.5 16.0 16.0 21.0 19.5 11.5 14.0 14.5 13.5
[113] 21.0 18.5 19.0 19.0 15.0 13.5 12.0 17.0 16.0 18.5 14.5 17.0 14.9 17.7 13.0 13.0
[129] 13.9 12.8 15.4 14.5 17.6 17.6 22.2 22.1 17.7 21.0 16.2 17.8 13.6 13.2 12.1 12.0
[145] 15.0 14.0 14.8 15.5 12.5 19.0 13.7 14.9 16.4 16.9 17.7 19.0 11.1 11.4 12.2 14.5
[161] 16.0 15.8 17.0 15.9 14.4 15.5 13.2 12.8 19.2 18.2 15.8 15.4 17.2 17.2 15.8 16.7
[177] 18.7 15.1 13.2 13.4 11.2 13.7 16.5 14.5 16.7 17.6 15.4 18.2 17.3 18.2 16.6 15.4
[193] 13.4 13.2 15.2 14.9 14.3 15.0 13.0 14.4 15.0 17.4 22.2 13.2 14.9 16.0 11.3 12.9
[209] 13.2 15.5 16.5 18.1 20.1 18.7 14.4 14.3 15.7 16.4 14.4 12.6 12.9 16.4 14.9 16.2
[225] 20.7 15.8 19.0 17.1 16.6 19.6 18.6 18.0 16.2 16.0 18.0 16.4 20.5 14.7 17.3 16.4
[241] 17.0 14.5 14.7 13.0 17.3 15.6 11.6 18.6 19.4
But now I want to relate the two variables, that is, apply split again to the two generated variables y1 and y2.
So I use the split command again (split(y2,y1)), but it gives me the following warning:
Warning message:
In split.default(y1, y2) : data length is not a multiple of split variable
Can you help me?
By the way, the data are properly aligned: the first value of y1 corresponds to the first value of y2, and so on for the rest.
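One likely culprit (a guess, since it depends on how x1 and x2 were built) is the single-bracket indexing: x1[1] returns a one-element list rather than the numeric vector inside it. A minimal sketch of relating the two groups once they are extracted as plain vectors, assuming x1 and x2 came from split() on the same grouping variable so their first groups line up element by element:
y1 <- x1[[1]]   # double brackets extract the numeric vector itself, not a length-1 list
y2 <- x2[[1]]

length(y1); length(y2)   # should both be 249 here, so the values stay paired

# with two plain vectors of equal length, split() works without the warning,
# e.g. grouping the y2 values by the distinct values of y1 ...
parts <- split(y2, y1)

# ... or, often more useful, by binning y1 into a few intervals
parts_binned <- split(y2, cut(y1, breaks = 4))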
