Turn a long data structure into a wide matrix structure in R

I have the following data structure...
ID value
1 1 1
2 1 63
3 1 2
4 1 58
5 2 3
6 2 4
7 3 34
8 3 25
Now I want to turn it into a kind of dyadic data structure: every ID with the same value should have a relationship.
I tried several options, and:
library(reshape2)   # dcast() comes from reshape2
df_wide <- dcast(df, ID ~ value)
... which got me a long way down the road:
ID 1 2 3 4 5 6 7 8 9 10 11 12 13 14 15 16 17 18 19 20 21 22 23 24 25 26 27 28 29 30 31 32 33 34 35 36 37 39 40
1 1001 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0
2 1006 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0
3 1007 1 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 1 1 0 0 0 0 0 0 0 0 0 0 0 2 0 0
4 1011 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 1 0 0 0 0 0 0 0 0 0 0 0 0 0 0
5 1018 1 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0
6 1020 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 1 0 0 0 0 0 0 0 0 0 0 0 0 0
7 1030 0 0 1 0 0 0 0 0 0 1 1 0 0 0 0 0 0 0 0 0 0 0 0 0 0 1 0 0 0 0 1 0 0 0 0 0 0 0 0
8 1036 0 0 0 0 0 0 0 0 0 0 0 0 1 0 0 0 0 0 1 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0
My main problem now is turning this into a proper matrix so that I can get an igraph object out of it.
df_wide_matrix <- data.matrix(df_wide)
df_aus_wide_g <- graph.edgelist(df_wide_matrix, directed = TRUE)
doesn't get me there...
I also tried to transform it into an adjacency matrix...
df_wide_matrix <- get.adjacency(graph.edgelist(as.matrix(df_wide), directed=FALSE))
... but it didn't work either

If you want to create an edge between all IDs with the same value, try something like this instead. First, merge the data frame onto itself by value. Then drop the value column and remove all undirected edges that are duplicates or self-loops. Finally, convert to a two-column matrix and create the edges.
library(igraph)
# Self-merge on value: every pair of rows sharing a value produces a row
res <- merge(df, df, by = 'value', all = FALSE)[, c('ID.x', 'ID.y')]
# Keep each undirected pair only once and drop self-loops
res <- res[res$ID.x < res$ID.y, ]
resg <- graph.edgelist(as.matrix(res))
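For a self-contained illustration, here is a minimal sketch of that approach on made-up toy data (the IDs and values below are hypothetical, and graph_from_edgelist() is the current igraph name for the older graph.edgelist()):
library(igraph)
# Hypothetical toy data: IDs 1 and 3 share value 10, IDs 2 and 4 share value 20
df <- data.frame(ID = c(1, 2, 3, 4), value = c(10, 20, 10, 20))
# Self-merge on value, keep each undirected pair once, drop self-loops
res <- merge(df, df, by = "value", all = FALSE)[, c("ID.x", "ID.y")]
res <- res[res$ID.x < res$ID.y, ]
# Two edges result: 1--3 and 2--4
resg <- graph_from_edgelist(as.matrix(res), directed = FALSE)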

Related

R: Simulating ERGM model in R then generate adjacency matrix of that model

I use library(ergm) and library(igraph) and generate an ERGM network, but I want the adjacency matrix of that network. I am unable to find any function that can produce it.
library(ergm)
library(igraph)
g.use <- network(16,density=0.1,directed=FALSE)
#
# Starting from this network let's draw 3 realizations
# of a edges and 2-star network
#
g.sim <- simulate(~edges+kstar(2), nsim=3, coef=c(-1.8,0.03),
basis=g.use, control=control.simulate(
MCMC.burnin=1000,
MCMC.interval=100))
#g.sim[[3]]
summary(g.sim)
Is it possible to get the adjacency matrix from g.sim, and how?
The ergm package uses the network package, not the igraph package. You should keep everything in network and not load igraph, as the two packages have conflicting functions with the same names.
In your case you simulate 3 graphs, so you should get 3 adjacency matrices. The code is below:
library(ergm)
g.use <- network(16,density=0.1,directed=FALSE)
g.sim <- simulate(~edges+kstar(2), nsim=3, coef=c(-1.8,0.03),
basis=g.use, control=control.simulate(
MCMC.burnin=1000,
MCMC.interval=100))
The code you want:
lapply(g.sim, as.matrix)
[[1]]
1 2 3 4 5 6 7 8 9 10 11 12 13 14 15 16
1 0 0 0 0 1 0 0 0 0 0 0 0 0 0 0 0
2 0 0 0 0 0 1 0 1 0 0 0 0 0 1 0 0
3 0 0 0 1 1 0 1 0 0 0 0 0 1 0 0 1
4 0 0 1 0 0 0 0 0 1 0 0 0 0 0 0 0
5 1 0 1 0 0 0 0 0 0 0 0 0 1 1 0 0
6 0 1 0 0 0 0 0 0 0 1 0 0 0 0 0 0
7 0 0 1 0 0 0 0 0 0 0 1 0 0 0 0 1
8 0 1 0 0 0 0 0 0 0 1 1 1 1 0 1 0
9 0 0 0 1 0 0 0 0 0 0 0 0 0 0 0 1
10 0 0 0 0 0 1 0 1 0 0 0 0 1 0 0 0
11 0 0 0 0 0 0 1 1 0 0 0 0 1 0 0 0
12 0 0 0 0 0 0 0 1 0 0 0 0 0 0 0 0
13 0 0 1 0 1 0 0 1 0 1 1 0 0 0 0 1
14 0 1 0 0 1 0 0 0 0 0 0 0 0 0 0 0
15 0 0 0 0 0 0 0 1 0 0 0 0 0 0 0 0
16 0 0 1 0 0 0 1 0 1 0 0 0 1 0 0 0
[[2]]
1 2 3 4 5 6 7 8 9 10 11 12 13 14 15 16
1 0 0 0 0 0 0 0 0 0 0 0 1 1 0 0 0
2 0 0 0 0 0 0 0 0 0 0 1 0 0 0 0 0
3 0 0 0 1 0 0 0 0 0 0 1 0 0 0 1 0
4 0 0 1 0 0 0 0 0 0 0 0 0 0 0 0 0
5 0 0 0 0 0 0 1 0 0 0 0 0 0 0 0 1
6 0 0 0 0 0 0 0 1 1 0 1 1 1 0 0 1
7 0 0 0 0 1 0 0 0 0 0 1 0 0 0 0 0
8 0 0 0 0 0 1 0 0 0 0 0 0 0 0 1 0
9 0 0 0 0 0 1 0 0 0 0 0 1 0 0 0 0
10 0 0 0 0 0 0 0 0 0 0 0 0 1 1 0 0
11 0 1 1 0 0 1 1 0 0 0 0 0 0 0 0 0
12 1 0 0 0 0 1 0 0 1 0 0 0 0 0 0 1
13 1 0 0 0 0 1 0 0 0 1 0 0 0 0 0 0
14 0 0 0 0 0 0 0 0 0 1 0 0 0 0 0 0
15 0 0 1 0 0 0 0 1 0 0 0 0 0 0 0 0
16 0 0 0 0 1 1 0 0 0 0 0 1 0 0 0 0
[[3]]
1 2 3 4 5 6 7 8 9 10 11 12 13 14 15 16
1 0 0 0 0 0 0 0 0 0 0 0 0 1 0 0 1
2 0 0 0 0 0 1 0 0 0 0 0 1 1 0 0 0
3 0 0 0 0 1 0 0 0 0 1 1 0 0 0 1 0
4 0 0 0 0 0 0 0 0 0 0 0 0 1 0 0 0
5 0 0 1 0 0 0 0 0 0 0 0 0 0 0 0 0
6 0 1 0 0 0 0 1 0 1 0 0 0 1 0 1 0
7 0 0 0 0 0 1 0 0 0 0 0 0 0 1 1 0
8 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0
9 0 0 0 0 0 1 0 0 0 0 0 0 0 0 0 0
10 0 0 1 0 0 0 0 0 0 0 0 0 0 0 1 1
11 0 0 1 0 0 0 0 0 0 0 0 0 0 0 0 1
12 0 1 0 0 0 0 0 0 0 0 0 0 1 1 1 0
13 1 1 0 1 0 1 0 0 0 0 0 1 0 0 0 0
14 0 0 0 0 0 0 1 0 0 0 0 1 0 0 0 0
15 0 0 1 0 0 1 1 0 0 1 0 1 0 0 0 1
16 1 0 0 0 0 0 0 0 0 1 1 0 0 0 1 0
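If an igraph object is needed afterwards, one option (a sketch, assuming a reasonably current igraph version) is to call igraph through its namespace, so the package is never attached and the name clashes mentioned above are avoided:
# Adjacency matrix of the first simulated network ...
adj <- as.matrix(g.sim[[1]])
# ... converted to an igraph graph without library(igraph)
ig <- igraph::graph_from_adjacency_matrix(adj, mode = "undirected")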

Estimation transition matrix with low observation count

I am building a Markov model with a relatively low count of observations for a given number of states.
Are there methods other than the cohort method to estimate the true transition probabilities? In particular, I want to ensure that the probabilities decrease with increasing distance from the current state. The pair (11,14) does not behave that way, and the underlying model would not support it.
2 3 4 5 6 7 8 9 10 11 12 13 14 15 16 17 18 19
2 4 1 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0
3 1 2 1 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0
4 0 1 2 1 0 0 0 0 0 0 0 0 0 0 0 0 0 0
5 0 0 2 10 8 0 0 0 0 0 0 0 0 0 0 0 0 0
6 0 0 0 9 53 13 2 0 0 0 0 0 0 0 0 0 0 0
7 0 0 0 0 17 42 17 0 0 0 0 0 0 0 0 0 0 0
8 0 0 0 0 0 21 71 21 0 0 0 0 0 0 0 0 0 0
9 0 0 0 0 0 0 23 102 21 3 0 0 0 0 0 0 0 0
10 0 0 0 0 0 0 0 23 57 33 0 0 0 0 0 0 0 0
11 0 0 0 0 0 0 0 1 34 142 28 1 3 0 0 0 0 0
12 0 0 0 0 0 0 0 0 1 28 127 27 0 0 0 0 0 0
13 0 0 0 0 0 0 0 0 0 0 28 134 27 0 0 0 0 0
14 0 0 0 0 0 0 0 0 0 0 0 27 93 20 2 0 0 0
15 0 0 0 0 0 0 0 0 0 0 0 0 23 133 19 0 0 0
16 0 0 0 0 0 0 0 0 0 0 0 0 0 22 114 20 0 0
17 0 0 0 0 0 0 0 0 0 0 0 0 0 0 21 192 19 0
18 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 21 263 21
19 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 24 827
Thanks
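For reference, the cohort method mentioned above is just row-wise normalization of the observed transition counts; a minimal sketch in R, assuming the count table above is stored in a matrix called counts (a hypothetical name):
# counts: the matrix of observed transition counts shown above
# The cohort (maximum-likelihood) estimate divides each row by its row sum
P_hat <- prop.table(counts, margin = 1)
rowSums(P_hat)   # each row of the estimated transition matrix sums to 1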

Garch(1,1) with Dummy Variable

I am trying, in R, to use GARCH(1,1) to estimate the influence of the day of the week (and, later, other parameters) on the log return (ln(Pt/Pt-1)) of product sales.
I have everything set up in a CSV file, with a dummy variable for each day (D1, ..., D7) taking the value 1 or 0.
I am building the following model in R
library(rugarch)
# Bind data
ext.reg.D1 <- mydata$D1
ext.reg.D2 <- mydata$D2
ext.reg.D3 <- mydata$D3
ext.reg.D4 <- mydata$D4
ext.reg.D5 <- mydata$D5
ext.reg.D6 <- mydata$D6
ext.reg.D7 <- mydata$D7
ext.reg <- cbind(ext.reg.D1, ext.reg.D2, ext.reg.D3, ext.reg.D4, ext.reg.D5, ext.reg.D6)
y <- mydata$log_return
fit.spec <- ugarchspec(variance.model = list(model = "sGARCH", garchOrder = c(1, 1),
                                             submodel = NULL, external.regressors = NULL,
                                             variance.targeting = FALSE),
                       mean.model = list(armaOrder = c(0, 0), external.regressors = ext.reg),
                       distribution.model = "norm",
                       start.pars = list(), fixed.pars = list())
fit <- ugarchfit(data = y, spec = fit.spec)
Error
In .sgarchfit(spec = spec, data = data, out.sample = out.sample, : ugarchfit-->warning: solver failer to converge.
Any ideas how to solve this?
Thanks
Sample data (first rows shown):
log_return D5 D6 D7 D1 D2 D3 D4
1 -0.02979189 1 0 0 0 0 0 0
2 17.43188265 0 1 0 0 0 0 0
3 -9.12727223 0 0 1 0 0 0 0
4 2.77744081 0 0 0 1 0 0 0
5 9.62597392 0 0 0 0 1 0 0
6 -0.11614358 0 0 0 0 0 1 0
7 10.81279075 0 0 0 0 0 0 1
8 -1.03825650 1 0 0 0 0 0 0
9 -5.49109661 0 1 0 0 0 0 0
10 -16.81177602 0 0 1 0 0 0 0
11 9.74292804 0 0 0 1 0 0 0
12 15.22583595 0 0 0 0 1 0 0
13 -1.79578436 0 0 0 0 0 1 0
14 0.40559431 0 0 0 0 0 0 1
15 -2.38281092 1 0 0 0 0 0 0
16 -4.88853323 0 1 0 0 0 0 0
17 -16.98493635 0 0 1 0 0 0 0
18 7.57998016 0 0 0 1 0 0 0
19 17.56008274 0 0 0 0 1 0 0
20 -0.46754932 0 0 0 0 0 1 0
21 -1.27007966 0 0 0 0 0 0 1
22 -1.79234966 1 0 0 0 0 0 0
23 -5.79461986 0 1 0 0 0 0 0
24 -17.82636881 0 0 1 0 0 0 0
25 9.48124679 0 0 0 1 0 0 0
26 17.64277207 0 0 0 0 1 0 0
27 -0.71191725 0 0 0 0 0 1 0
28 -1.14937870 0 0 0 0 0 0 1
29 -1.62331777 1 0 0 0 0 0 0
30 -5.52787401 0 1 0 0 0 0 0
31 -18.50034717 0 0 1 0 0 0 0
32 10.31502542 0 0 0 1 0 0 0
33 16.21997258 0 0 0 0 1 0 0
34 -1.09910695 0 0 0 0 0 1 0
35 -0.57416519 0 0 0 0 0 0 1
36 -1.83623328 1 0 0 0 0 0 0
37 -5.48021232 0 1 0 0 0 0 0
38 -20.02869823 0 0 1 0 0 0 0
39 11.48799875 0 0 0 1 0 0 0
40 17.55356524 0 0 0 0 1 0 0
41 -1.45430558 0 0 0 0 0 1 0
42 -2.15287757 0 0 0 0 0 0 1
43 -4.91058837 1 0 0 0 0 0 0
44 -4.35107354 0 1 0 0 0 0 0
45 -19.40533612 0 0 1 0 0 0 0
46 6.47785167 0 0 0 1 0 0 0
47 16.54500844 0 0 0 0 1 0 0
48 1.43266482 0 0 0 0 0 1 0
49 1.91234500 0 0 0 0 0 0 1
50 -1.44926252 1 0 0 0 0 0 0
51 -5.69296574 0 1 0 0 0 0 0
52 -14.21241905 0 0 1 0 0 0 0
53 9.85180551 0 0 0 1 0 0 0
54 16.72072000 0 0 0 0 1 0 0
55 -1.04381003 0 0 0 0 0 1 0
56 -1.49048390 0 0 0 0 0 0 1
57 -2.57835848 1 0 0 0 0 0 0
58 -2.93456505 0 1 0 0 0 0 0
59 -21.27981318 0 0 1 0 0 0 0
60 14.27747712 0 0 0 1 0 0 0
61 15.20376637 0 0 0 0 1 0 0
62 -2.36474181 0 0 0 0 0 1 0
63 -0.12825700 0 0 0 0 0 0 1
64 -2.17755007 1 0 0 0 0 0 0
65 -6.50236487 0 1 0 0 0 0 0
66 -20.40159745 0 0 1 0 0 0 0
67 10.12381534 0 0 0 1 0 0 0
68 19.34672964 0 0 0 0 1 0 0
69 -0.18663788 0 0 0 0 0 1 0
70 -1.26430704 0 0 0 0 0 0 1
71 -2.17712050 1 0 0 0 0 0 0
72 -5.20850527 0 1 0 0 0 0 0
73 -19.00303225 0 0 1 0 0 0 0
74 10.78960865 0 0 0 1 0 0 0
75 16.50911599 0 0 0 0 1 0 0
76 -1.20629718 0 0 0 0 0 1 0
77 -0.92077350 0 0 0 0 0 0 1
78 -2.13818901 1 0 0 0 0 0 0
79 -6.39795596 0 1 0 0 0 0 0
80 -16.89947946 0 0 1 0 0 0 0
81 11.84070286 0 0 0 1 0 0 0
82 16.76126417 0 0 0 0 1 0 0
83 -2.32992683 0 0 0 0 0 1 0
84 -0.04347497 0 0 0 0 0 0 1
85 -1.58421553 1 0 0 0 0 0 0
86 -5.11294741 0 1 0 0 0 0 0
87 -22.94382512 0 0 1 0 0 0 0
88 12.08906834 0 0 0 1 0 0 0
89 18.59588505 0 0 0 0 1 0 0
90 -0.66190281 0 0 0 0 0 1 0
91 -3.35891858 0 0 0 0 0 0 1
92 -5.56096067 1 0 0 0 0 0 0
93 -19.12946131 0 1 0 0 0 0 0
94 -2.45717082 0 0 1 0 0 0 0
95 -6.00314421 0 0 0 1 0 0 0
96 16.87403882 0 0 0 0 1 0 0
97 16.72700765 0 0 0 0 0 1 0
98 -1.80683941 0 0 0 0 0 0 1
99 -2.08228231 1 0 0 0 0 0 0
100 -5.98864409 0 1 0 0 0 0 0
101 -14.91991224 0 0 1 0 0 0 0
I think the problem is that the explanatory variables are all dummy variables. You should include another, non-dummy variable alongside D1...D7; your model does not make sense without such a variable.
You cannot explain y (a continuous variable) with only dummy regressors. Try, for example, adding the lagged value of y (y at t-1) to
ext.reg <- cbind(ext.reg.D1, ext.reg.D2, ext.reg.D3, ext.reg.D4, ext.reg.D5, ext.reg.D6)
Good luck.
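A minimal sketch of that suggestion, assuming mydata$log_return as above (the lag construction and dropping of the first observation are my own choices, not part of the answer):
# Hypothetical sketch: add the one-period lagged return as a non-dummy regressor
lag_ret <- c(NA, head(mydata$log_return, -1))          # return at t-1
keep <- !is.na(lag_ret)                                # drop the first row (no lag available)
y <- mydata$log_return[keep]
ext.reg <- cbind(ext.reg.D1, ext.reg.D2, ext.reg.D3,
                 ext.reg.D4, ext.reg.D5, ext.reg.D6)[keep, ]
ext.reg <- cbind(ext.reg, lag_ret = lag_ret[keep])
Alternatively, setting armaOrder = c(1, 0) in the mean model adds an AR(1) term directly, which serves a similar purpose.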
Change your ext.reg to this:
ext.reg <- cbind(ext.reg.D1, ext.reg.D2, ext.reg.D3, ext.reg.D4,
                 ext.reg.D5, ext.reg.D6, ext.reg.D7)
Then see: exercise solved.

Losing observations when I use reshape in R

I have this data set:
> head(pain_subset2, n= 50)
PatientID RSE SE SECODE
1 1001-01 0 0 0
2 1001-01 0 0 0
3 1001-02 0 0 0
4 1001-02 0 0 0
5 1002-01 0 0 0
6 1002-01 1 2a 1
7 1002-02 0 0 0
8 1002-02 0 0 0
9 1002-02 0 0 0
10 1002-03 0 0 0
11 1002-03 0 0 0
12 1002-03 1 1 1
> dim(pain_subset2)
[1] 817 4
> table(pain_subset2$RSE)
0 1
788 29
> table(pain_subset2$SE)
0 1 2a 2b 3 4 5
788 7 5 1 6 4 6
> table(pain_subset2$SECODE)
0 1
788 29
I want to create a matrix with n rows and 6 columns (n = number of PatientIDs, columns = the 6 levels of SE).
When I use reshape, I lose many observations:
> dim(p)
[1] 246 9
My code:
p <- reshape(pain_subset2, timevar = "SE", idvar = c("PatientID","RSE"),v.names = "SECODE", direction = "wide")
p[is.na(p)] <- 0
> table(p$RSE)
0 1
226 20
Compared with the table of RSE above, I have lost 9 patients with RSE = 1.
This is the output I get:
PatientID RSE SECODE.0 SECODE.2a SECODE.1 SECODE.5 SECODE.3 SECODE.2b SECODE.4
1 1001-01 0 0 0 0 0 0 0 0
3 1001-02 0 0 0 0 0 0 0 0
5 1002-01 0 0 0 0 0 0 0 0
6 1002-01 1 0 1 0 0 0 0 0
7 1002-02 0 0 0 0 0 0 0 0
10 1002-03 0 0 0 0 0 0 0 0
12 1002-03 1 0 0 1 0 0 0 0
13 1002-04 0 0 0 0 0 0 0 0
15 1003-01 0 0 0 0 0 0 0 0
18 1003-02 0 0 0 0 0 0 0 0
21 1003-03 0 0 0 0 0 0 0 0
24 1003-04 0 0 0 0 0 0 0 0
27 1003-05 0 0 0 0 0 0 0 0
30 1003-06 0 0 0 0 0 0 0 0
32 1003-07 0 0 0 0 0 0 0 0
35 1004-01 0 0 0 0 0 0 0 0
36 1004-01 1 0 0 0 1 0 0 0
40 1004-02a 0 0 0 0 0 0 0 0
Does anyone know what is happening here? I would really appreciate any help.
Thanks for your help, best.
Try:
library(dplyr)
library(tidyr)
pain_subset2 %>%
  spread(SE, SECODE)
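As a side note, spread() has since been superseded in tidyr by pivot_wider(); with dplyr and tidyr loaded as above, an equivalent call (my assumption, not tested on this exact data) would be:
# pivot_wider() replaces spread(); values_fill turns missing combinations into 0
pain_subset2 %>%
  pivot_wider(names_from = SE, values_from = SECODE, values_fill = 0)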

3D Array value assignment ruins the structure of array

Here is how to reproduce my problem. I want to create a 3D array
> g=array(0,dim=c(3,31,31))
> dim(g)
[1] 3 31 31
> dim(g[1,,])
[1] 31 31
This is x with dimension 31 by 31
> dim(x)
[1] 31 31
> x
1 2 3 4 5 6 7 8 9 10 11 12 13 14 15 16 17 18 19 20 21 22 23 24 25 26 27 28 29 30 31
1 NA 0 2 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 1 1 0 0 0 0 0 0 0 0
2 0 NA 1 0 0 0 0 0 0 1 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 1 0 0 0
3 2 1 NA 0 0 0 0 0 0 1 0 0 0 0 1 0 0 0 0 0 0 0 0 0 0 0 0 0 1 0 0
4 0 0 0 NA 0 0 0 0 0 0 0 0 0 0 1 0 0 0 0 0 0 0 0 0 1 0 0 0 0 0 0
5 0 0 0 0 NA 0 0 0 0 0 0 0 0 0 1 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0
6 0 0 0 0 0 NA 1 0 0 0 0 0 0 0 0 0 0 0 1 0 0 0 0 0 0 1 0 0 0 0 0
7 0 0 0 0 0 1 NA 0 0 1 0 1 0 0 0 0 1 0 0 0 1 0 0 0 0 0 0 0 0 0 0
8 0 0 0 0 0 0 0 NA 0 0 0 1 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0
9 0 0 0 0 0 0 0 0 NA 0 2 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0
10 0 1 1 0 0 0 1 0 0 NA 0 0 0 0 0 0 1 1 0 0 0 0 0 0 0 0 0 0 0 1 0
11 0 0 0 0 0 0 0 0 2 0 NA 0 0 0 0 0 0 0 0 0 0 0 1 0 0 0 0 0 0 0 0
12 0 0 0 0 0 0 1 1 0 0 0 NA 0 0 0 1 0 0 0 0 0 0 0 0 0 0 1 0 0 0 0
13 0 0 0 0 0 0 0 0 0 0 0 0 NA 0 0 0 0 1 0 0 0 0 0 0 0 0 0 0 0 0 0
14 0 0 0 0 0 0 0 0 0 0 0 0 0 NA 1 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0
15 0 0 1 1 1 0 0 0 0 0 0 0 0 1 NA 1 0 0 0 0 0 0 0 0 0 0 1 0 0 0 1
16 0 0 0 0 0 0 0 0 0 0 0 1 0 0 1 NA 0 0 0 0 0 0 0 0 0 0 0 1 0 0 0
17 0 0 0 0 0 0 1 0 0 1 0 0 0 0 0 0 NA 0 0 0 0 0 0 1 0 0 0 0 0 0 0
18 0 0 0 0 0 0 0 0 0 1 0 0 1 0 0 0 0 NA 0 1 1 0 0 0 0 0 0 0 0 0 0
19 0 0 0 0 0 1 0 0 0 0 0 0 0 0 0 0 0 0 NA 0 0 0 0 0 0 0 0 0 0 0 0
20 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 1 0 NA 0 0 0 0 0 0 0 0 0 0 0
21 0 0 0 0 0 0 1 0 0 0 0 0 0 0 0 0 0 1 0 0 NA 0 0 0 0 0 0 0 0 0 0
22 1 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 NA 1 0 0 0 0 0 0 0 0
23 1 0 0 0 0 0 0 0 0 0 1 0 0 0 0 0 0 0 0 0 0 1 NA 0 0 0 0 0 0 0 0
24 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 1 0 0 0 0 0 0 NA 0 0 1 0 0 0 0
25 0 0 0 1 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 NA 0 0 0 0 0 0
26 0 0 0 0 0 1 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 NA 0 0 0 0 0
27 0 0 0 0 0 0 0 0 0 0 0 1 0 0 1 0 0 0 0 0 0 0 0 1 0 0 NA 0 1 0 0
28 0 1 0 0 0 0 0 0 0 0 0 0 0 0 0 1 0 0 0 0 0 0 0 0 0 0 0 NA 0 0 0
29 0 0 1 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 1 0 NA 0 0
30 0 0 0 0 0 0 0 0 0 1 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 NA 0
31 0 0 0 0 0 0 0 0 0 0 0 0 0 0 1 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 NA
When I try to assign x to the first 'section' of g using
> g[1,,] = x
The array structure of g is totally changed, as now it becomes:
> dim(g)
NULL
> head(g)
[[1]]
[1] NA 0 2 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 1 1 0 0 0 0 0 0 0 0
[[2]]
[1] 0
[[3]]
[1] 0
[[4]]
[1] 0 NA 1 0 0 0 0 0 0 1 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 1 0 0 0
[[5]]
[1] 0
[[6]]
[1] 0
This is totally different from what I expected. I am just trying to assign a matrix to g[1,,], and dim(g) should still be 3 by 31 by 31. Am I wrong? Where did I go wrong?
Thanks to Pascal's comments below I've amended my answer, though I've left the dimensionality changed to 31x31x3 for perhaps easier understanding. The problem is the way the data are converted from your data.frame object before being stored in your array. I think that by converting to a matrix first you should get what you're looking for:
g <- array(0,dim=c(31,31,3))
m <- matrix(1, 31, 31)
x <- as.data.frame(m)
## Storing x as it is will result in g becoming a list...
#g[,,1] <- x
## Converting the data.frame to a matrix will result in
## g remaining an array:
g[,,1] <- as.matrix(x)
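Keeping the question's original 3 x 31 x 31 orientation, the same conversion should also work; a small sketch, assuming x is the 31 x 31 data.frame printed above:
g <- array(0, dim = c(3, 31, 31))
g[1, , ] <- as.matrix(x)   # coerce the data.frame to a matrix before assigning
dim(g)                     # still 3 31 31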
