How to include 3 interactions with a SINGLE predictor in lmer - r

I know that I could use lm(a ~ (b + c + d)^2) in order to get all possible two-way interactions in a model, but I need only the interactions with a single predictor. Let's say I want the possible interaction of B + C + D with predictor E.
I've tried:
lmer(MyVar ~ (1|ID) + (B + C + D)^E, data = data, REML = F)
Error in terms.formula(formula, data = data) :
invalid power in formula
I know that I could hard code each interaction with either * or :, but I suppose there's a simple way to do that all at once, isn't there? Thanks in advance.
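For reference, a quick base-R sketch (using the variable names from the question) showing that `(B + C + D) * E` expands to exactly the main effects plus the three two-way interactions with E, so no hard-coding is needed:

```r
# `*` crosses terms; `:` denotes a single interaction term.
f <- MyVar ~ (B + C + D) * E
attr(terms(f), "term.labels")
#> "B" "C" "D" "E" "B:E" "C:E" "D:E"

# The same model spelled out with `:` only gives the identical term set:
g <- MyVar ~ B + C + D + E + B:E + C:E + D:E
setequal(attr(terms(f), "term.labels"), attr(terms(g), "term.labels"))  # TRUE
```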

Peter already provided an answer in the comments, but just so there is a worked example here, I have used the carrots dataset from the lmerTest package to fit this kind of model.
#### Load Library ####
library(lmerTest)
#### Fit 3 Interactions with Predictor ####
fit <- lmer(Preference
~ (Work + Homesize + Age) * sens2
+ (1 + sens2 | Consumer),
data=carrots)
summary(fit)
This specific model has more than 12 parameters, so it gives a warning at the end that it can't show the entire correlation matrix:
Linear mixed model fit by REML. t-tests use Satterthwaite's method [
lmerModLmerTest]
Formula:
Preference ~ (Work + Homesize + Age) * sens2 + (1 + sens2 | Consumer)
Data: carrots
REML criterion at convergence: 3793
Scaled residuals:
Min 1Q Median 3Q Max
-3.5393 -0.5531 0.0221 0.6129 3.0304
Random effects:
Groups Name Variance Std.Dev. Corr
Consumer (Intercept) 0.194588 0.44112
sens2 0.002667 0.05164 0.30
Residual 1.070431 1.03462
Number of obs: 1233, groups: Consumer, 103
Fixed effects:
Estimate Std. Error df t value Pr(>|t|)
(Intercept) 4.599943 0.269675 92.130730 17.057 <2e-16 ***
Work2 0.252784 0.215224 92.355377 1.175 0.2432
Work3 0.049107 0.202453 92.620270 0.243 0.8089
Work4 0.350115 0.241920 92.357943 1.447 0.1512
Work5 -0.172296 0.251901 92.336511 -0.684 0.4957
Work6 0.142940 0.306935 92.245988 0.466 0.6425
Work7 0.284870 0.222300 92.466369 1.281 0.2032
Homesize3 -0.210541 0.117745 92.054098 -1.788 0.0770 .
Age2 0.147557 0.258083 91.931134 0.572 0.5689
Age3 0.175345 0.244237 91.940161 0.718 0.4746
Age4 0.143185 0.286984 91.891878 0.499 0.6190
sens2 -0.005156 0.048716 92.036870 -0.106 0.9159
Work2:sens2 -0.026848 0.038861 92.096571 -0.691 0.4914
Work3:sens2 0.025743 0.036536 92.167106 0.705 0.4828
Work4:sens2 0.020395 0.043681 92.097263 0.467 0.6417
Work5:sens2 0.041402 0.045486 92.091579 0.910 0.3651
Work6:sens2 0.041545 0.055435 92.076468 0.749 0.4555
Work7:sens2 -0.026257 0.040130 92.126134 -0.654 0.5145
Homesize3:sens2 0.034216 0.021273 92.017206 1.608 0.1112
Age2:sens2 0.050271 0.046641 91.984618 1.078 0.2839
Age3:sens2 0.049982 0.044137 91.986480 1.132 0.2604
Age4:sens2 0.098257 0.051868 91.973468 1.894 0.0613 .
---
Signif. codes: 0 ‘***’ 0.001 ‘**’ 0.01 ‘*’ 0.05 ‘.’ 0.1 ‘ ’ 1
Correlation matrix not shown by default, as p = 22 > 12.
Use print(x, correlation=TRUE) or
vcov(x) if you need it
Since this may be common for models with many interaction terms, you can simply follow the advice in the warning and run vcov(fit) to see the rest:
22 x 22 Matrix of class "dpoMatrix"
(Intercept) Work2 Work3 Work4
(Intercept) 0.0727247177 -2.460271e-02 -2.159223e-02 -2.485573e-02
Work2 -0.0246027147 4.632157e-02 2.731653e-02 2.605219e-02
Work3 -0.0215922283 2.731653e-02 4.098720e-02 2.751607e-02
Work4 -0.0248557275 2.605219e-02 2.751607e-02 5.852536e-02
Work5 -0.0188462642 2.535069e-02 2.746376e-02 2.638325e-02
Work6 -0.0590599021 2.577976e-02 2.074795e-02 2.352827e-02
Work7 -0.0196165301 2.525230e-02 2.745152e-02 2.594625e-02
Homesize3 -0.0109607544 -3.400679e-04 3.870252e-03 2.706524e-03
Age2 -0.0407749626 -6.515489e-03 -1.335698e-02 -4.201862e-03
Age3 -0.0494575617 -7.385083e-04 -7.919796e-03 -3.694373e-03
Age4 -0.0511840024 -5.047503e-04 -6.213891e-03 -8.488546e-04
sens2 0.0017103436 -5.694673e-04 -4.960309e-04 -5.737888e-04
Work2:sens2 -0.0005694673 1.082713e-03 6.301142e-04 6.007327e-04
Work3:sens2 -0.0004960308 6.301141e-04 9.510769e-04 6.326784e-04
Work4:sens2 -0.0005737887 6.007327e-04 6.326785e-04 1.367865e-03
Work5:sens2 -0.0004311883 5.838187e-04 6.310805e-04 6.061557e-04
Work6:sens2 -0.0013903112 5.995643e-04 4.776545e-04 5.446369e-04
Work7:sens2 -0.0004495020 5.815462e-04 6.308496e-04 5.958678e-04
Homesize3:sens2 -0.0002555305 -1.271474e-05 8.614862e-05 5.884019e-05
Age2:sens2 -0.0009699655 -1.494199e-04 -3.104255e-04 -9.394093e-05
Age3:sens2 -0.0011767009 -1.158708e-05 -1.805710e-04 -8.082443e-05
Age4:sens2 -0.0012155754 -8.270833e-06 -1.427594e-04 -1.592515e-05
Work5 Work6 Work7 Homesize3
(Intercept) -1.884626e-02 -5.905990e-02 -1.961653e-02 -1.096075e-02
Work2 2.535069e-02 2.577976e-02 2.525230e-02 -3.400679e-04
Work3 2.746376e-02 2.074795e-02 2.745152e-02 3.870252e-03
Work4 2.638325e-02 2.352827e-02 2.594625e-02 2.706524e-03
Work5 6.345422e-02 1.792640e-02 3.446007e-02 2.839939e-03
Work6 1.792640e-02 9.420917e-02 1.785281e-02 1.291156e-03
Work7 3.446007e-02 1.785281e-02 4.941720e-02 4.034687e-03
Homesize3 2.839939e-03 1.291156e-03 4.034687e-03 1.386395e-02
Age2 -7.707013e-03 3.230279e-02 -7.764027e-03 -1.515922e-03
Age3 -1.021223e-02 4.033104e-02 -9.334598e-03 2.906882e-03
Age4 -1.878498e-02 4.116822e-02 -2.475574e-02 4.240265e-03
sens2 -4.311884e-04 -1.390312e-03 -4.495020e-04 -2.555306e-04
Work2:sens2 5.838188e-04 5.995648e-04 5.815462e-04 -1.271468e-05
Work3:sens2 6.310805e-04 4.776551e-04 6.308496e-04 8.614867e-05
Work4:sens2 6.061557e-04 5.446375e-04 5.958678e-04 5.884025e-05
Work5:sens2 1.483935e-03 4.114916e-04 7.971930e-04 6.188087e-05
Work6:sens2 4.114910e-04 2.207829e-03 4.096409e-04 2.535023e-05
Work7:sens2 7.971929e-04 4.096415e-04 1.151554e-03 9.018370e-05
Homesize3:sens2 6.188081e-05 2.535036e-05 9.018363e-05 3.267278e-04
Age2:sens2 -1.768088e-04 7.730782e-04 -1.781380e-04 -3.387819e-05
Age3:sens2 -2.350601e-04 9.623697e-04 -2.142697e-04 7.102571e-05
Age4:sens2 -4.406804e-04 9.803703e-04 -5.821051e-04 1.018839e-04
Age2 Age3 Age4 sens2
(Intercept) -4.077496e-02 -4.945756e-02 -5.118400e-02 0.0017103436
Work2 -6.515489e-03 -7.385083e-04 -5.047503e-04 -0.0005694673
Work3 -1.335698e-02 -7.919796e-03 -6.213891e-03 -0.0004960309
Work4 -4.201862e-03 -3.694373e-03 -8.488546e-04 -0.0005737888
Work5 -7.707013e-03 -1.021223e-02 -1.878498e-02 -0.0004311884
Work6 3.230279e-02 4.033104e-02 4.116822e-02 -0.0013903117
Work7 -7.764027e-03 -9.334598e-03 -2.475574e-02 -0.0004495020
Homesize3 -1.515922e-03 2.906882e-03 4.240265e-03 -0.0002555306
Age2 6.660707e-02 4.917829e-02 4.874852e-02 -0.0009699654
Age3 4.917829e-02 5.965166e-02 5.751230e-02 -0.0011767008
Age4 4.874852e-02 5.751230e-02 8.235963e-02 -0.0012155754
sens2 -9.699654e-04 -1.176701e-03 -1.215575e-03 0.0023732002
Work2:sens2 -1.494201e-04 -1.158717e-05 -8.270915e-06 -0.0008009534
Work3:sens2 -3.104256e-04 -1.805711e-04 -1.427594e-04 -0.0007021649
Work4:sens2 -9.394109e-05 -8.082452e-05 -1.592525e-05 -0.0008088714
Work5:sens2 -1.768090e-04 -2.350602e-04 -4.406805e-04 -0.0006125030
Work6:sens2 7.730783e-04 9.623699e-04 9.803704e-04 -0.0019275339
Work7:sens2 -1.781381e-04 -2.142698e-04 -5.821052e-04 -0.0006376798
Homesize3:sens2 -3.387822e-05 7.102570e-05 1.018839e-04 -0.0003572095
Age2:sens2 1.574981e-03 1.162241e-03 1.152849e-03 -0.0013328873
Age3:sens2 1.162241e-03 1.410198e-03 1.360446e-03 -0.0016167476
Age4:sens2 1.152849e-03 1.360446e-03 1.949606e-03 -0.0016727270
Work2:sens2 Work3:sens2 Work4:sens2 Work5:sens2
(Intercept) -5.694673e-04 -4.960308e-04 -5.737887e-04 -4.311883e-04
Work2 1.082713e-03 6.301141e-04 6.007327e-04 5.838187e-04
Work3 6.301142e-04 9.510769e-04 6.326785e-04 6.310805e-04
Work4 6.007327e-04 6.326784e-04 1.367865e-03 6.061557e-04
Work5 5.838188e-04 6.310805e-04 6.061557e-04 1.483935e-03
Work6 5.995648e-04 4.776551e-04 5.446375e-04 4.114916e-04
Work7 5.815462e-04 6.308496e-04 5.958678e-04 7.971930e-04
Homesize3 -1.271468e-05 8.614867e-05 5.884025e-05 6.188087e-05
Age2 -1.494201e-04 -3.104256e-04 -9.394109e-05 -1.768090e-04
Age3 -1.158717e-05 -1.805711e-04 -8.082452e-05 -2.350602e-04
Age4 -8.270915e-06 -1.427594e-04 -1.592525e-05 -4.406805e-04
sens2 -8.009534e-04 -7.021649e-04 -8.088714e-04 -6.125030e-04
Work2:sens2 1.510208e-03 8.888542e-04 8.476685e-04 8.246902e-04
Work3:sens2 8.888542e-04 1.334852e-03 8.949241e-04 8.931412e-04
Work4:sens2 8.476685e-04 8.949241e-04 1.908065e-03 8.579816e-04
Work5:sens2 8.246902e-04 8.931412e-04 8.579816e-04 2.068938e-03
Work6:sens2 8.398725e-04 6.749491e-04 7.659950e-04 5.829051e-04
Work7:sens2 8.214882e-04 8.927538e-04 8.437191e-04 1.121775e-03
Homesize3:sens2 -1.207554e-05 1.252882e-04 8.732358e-05 9.165766e-05
Age2:sens2 -2.118283e-04 -4.351115e-04 -1.361084e-04 -2.505820e-04
Age3:sens2 -2.290075e-05 -2.572678e-04 -1.193013e-04 -3.321938e-04
Age4:sens2 -1.572527e-05 -2.020783e-04 -2.686406e-05 -6.127763e-04
Work6:sens2 Work7:sens2 Homesize3:sens2
(Intercept) -1.390311e-03 -0.0004495020 -2.555305e-04
Work2 5.995643e-04 0.0005815462 -1.271474e-05
Work3 4.776545e-04 0.0006308496 8.614862e-05
Work4 5.446369e-04 0.0005958678 5.884019e-05
Work5 4.114910e-04 0.0007971929 6.188081e-05
Work6 2.207829e-03 0.0004096415 2.535036e-05
Work7 4.096409e-04 0.0011515535 9.018363e-05
Homesize3 2.535023e-05 0.0000901837 3.267278e-04
Age2 7.730783e-04 -0.0001781381 -3.387822e-05
Age3 9.623699e-04 -0.0002142698 7.102570e-05
Age4 9.803704e-04 -0.0005821052 1.018839e-04
sens2 -1.927534e-03 -0.0006376798 -3.572095e-04
Work2:sens2 8.398725e-04 0.0008214882 -1.207554e-05
Work3:sens2 6.749491e-04 0.0008927538 1.252882e-04
Work4:sens2 7.659950e-04 0.0008437191 8.732358e-05
Work5:sens2 5.829051e-04 0.0011217754 9.165766e-05
Work6:sens2 3.072991e-03 0.0005804819 4.112908e-05
Work7:sens2 5.804819e-04 0.0016104051 1.306882e-04
Homesize3:sens2 4.112908e-05 0.0001306882 4.525614e-04
Age2:sens2 1.056816e-03 -0.0002524409 -4.911036e-05
Age3:sens2 1.318931e-03 -0.0003035236 9.540544e-05
Age4:sens2 1.345904e-03 -0.0008078267 1.388138e-04
Age2:sens2 Age3:sens2 Age4:sens2
(Intercept) -9.699655e-04 -1.176701e-03 -1.215575e-03
Work2 -1.494199e-04 -1.158708e-05 -8.270833e-06
Work3 -3.104255e-04 -1.805710e-04 -1.427594e-04
Work4 -9.394093e-05 -8.082443e-05 -1.592515e-05
Work5 -1.768088e-04 -2.350601e-04 -4.406804e-04
Work6 7.730782e-04 9.623697e-04 9.803703e-04
Work7 -1.781380e-04 -2.142697e-04 -5.821051e-04
Homesize3 -3.387819e-05 7.102571e-05 1.018839e-04
Age2 1.574981e-03 1.162241e-03 1.152849e-03
Age3 1.162241e-03 1.410198e-03 1.360446e-03
Age4 1.152849e-03 1.360446e-03 1.949606e-03
sens2 -1.332887e-03 -1.616748e-03 -1.672727e-03
Work2:sens2 -2.118283e-04 -2.290075e-05 -1.572527e-05
Work3:sens2 -4.351115e-04 -2.572678e-04 -2.020783e-04
Work4:sens2 -1.361084e-04 -1.193013e-04 -2.686406e-05
Work5:sens2 -2.505820e-04 -3.321938e-04 -6.127763e-04
Work6:sens2 1.056816e-03 1.318931e-03 1.345904e-03
Work7:sens2 -2.524409e-04 -3.035236e-04 -8.078267e-04
Homesize3:sens2 -4.911036e-05 9.540544e-05 1.388138e-04
Age2:sens2 2.175354e-03 1.606007e-03 1.592130e-03
Age3:sens2 1.606007e-03 1.948116e-03 1.878419e-03
Age4:sens2 1.592130e-03 1.878419e-03 2.690248e-03

Related

How to plot r squared for regression model for multiple group in r

o <- read.csv("old.csv")
Callosal_FA CST_FA SLF_FA Area_SMA_A
0.556566554 0.539971971 0.482016736 -0.007984
0.586793895 0.554954237 0.487595985 0.05567
0.613107046 0.597039029 0.467378312 0.136
0.59241945 0.58101919 0.460784717 0.03253
0.586344082 0.555524562 0.479480255 -0.01629
0.607088378 0.56048251 0.478998182 0.07981
0.595145661 0.571902322 0.461452732 0.07882
0.591501695 0.581156582 0.51408736 0.1143
0.587255765 0.566562088 0.462376015 0.1717
0.583943048 0.571209263 0.46400787 -0.01861
0.603512157 0.587332337 0.477376739 0.05672
0.582126533 0.565946603 0.459743433 0.002831
0.570966197 0.556258709 0.470341615 -0.003823
0.570307147 0.542675924 0.504833121 0.01764
0.579498276 0.569837284 0.475364742 -0.000387
0.570543729 0.542095809 0.468923119 0.117
0.613672747 0.572339549 0.486481493 0.1264
0.570649037 0.554125163 0.522845609 0.04696
0.580601176 0.558799894 0.504017998 0.1056
0.576166024 0.542110191 0.476548484 0.05783
0.579598762 0.546776236 0.491528835 0.08022
0.604775228 0.576144869 0.506060596 0.1515
0.555354582 0.556518053 0.492985322 -0.01114
0.580857907 0.556575944 0.484096309 0.03578
I wanted to plot the regression model results (R squared) for multiple groups.
I have tried the following and managed to do it for one group:
y <- lm(o$Area_SMA_A ~ o$CST_FA + o$SLF_FA + o$Callosal_FA)
then by using the following code:
library(ggplot2)
ggplot(y$model, aes_string(x = names(y$model)[2], y = names(y$model)[1])) +
geom_point() +
stat_smooth(method = "lm", col = "lightblue") +
labs(title = paste("Adj R2 = ",signif(summary(y)$adj.r.squared, 5),
"Intercept =",signif(y$coef[[1]],5 ),
" Slope =",signif(y$coef[[2]], 5),
" P =",signif(summary(y)$coef[2,4], 5)))
I produced this image, which is what I wanted.
Since I have three groups, I want to add the regression models for the other two groups to this plot. Is there a way to do it, please? Thank you.
m <- read.csv("middle.csv")
Callosal_FA CST_FA SLF_FA Area_SMA_A
0.599350895 0.59082334 0.518316923 0.04286
0.591540991 0.585592011 0.517415822 0.1291
0.62120411 0.613751115 0.456966929 0.05915
0.59344635 0.571179365 0.500941682 0.01122
0.621645795 0.599144316 0.487736421 0.0471
0.611521291 0.596407776 0.508636999 -0.08177
0.561589532 0.549150165 0.509993364 -0.002053
0.608089072 0.581477369 0.496346462 0.1157
0.583942196 0.576979247 0.505747697 0.01913
0.614675486 0.584447311 0.513085904 0.006673
0.599312499 0.585156336 0.475447955 0.05582
0.591977354 0.578031977 0.505042846 0.08293
0.602347244 0.582916321 0.504538196 -0.07645
0.628674145 0.595462642 0.469785878 0.04787
0.595963981 0.547983665 0.497874226 0.1132
0.604934306 0.586583356 0.502788492 0.08803
0.599656344 0.580235613 0.471793292 0.0118
0.587288357 0.559298093 0.535857414 0.06225
0.586031623 0.582565008 0.475876222 0.282
0.58277546 0.555852007 0.497386116 0.05266
y <- read.csv("young.csv")
Callosal_FA CST_FA SLF_FA Area_SMA_A
0.641939581 0.610050256 0.497039292 -0.05461
0.600969207 0.581011925 0.486918544 0.03801
0.597728695 0.569094851 0.522076721 0.08515
0.605851215 0.575788238 0.522207993 0.001711
0.615141198 0.586422768 0.49536629 0.08908
0.664600517 0.636086957 0.50723616 0.04712
0.617076761 0.577625164 0.50950881 0.02169
0.612482041 0.569112478 0.512551218 0.04043
0.627284885 0.597122461 0.541768958 0.003275
0.627408656 0.607896037 0.505038914 0.06681
0.609205487 0.577178474 0.508818934 -0.04759
0.606824376 0.593485569 0.530833127 0.05503
0.608929339 0.583816742 0.506553103 0.08804
0.623125338 0.599054187 0.518118823 0.04499
0.606161965 0.578010045 0.491883074 0.1487
0.605391626 0.585302201 0.488368677 0.1316
0.640007128 0.599344654 0.503622583 0.1909
0.598483618 0.588507596 0.508622188 0.2013
0.625079582 0.597286968 0.510829857 0.09116
0.620938861 0.577980188 0.52410613 0.02284
0.615765316 0.577922653 0.542867003 0.08179
0.606476852 0.571277288 0.486362068 0.2072
0.607761045 0.585516175 0.509739355 0.075
0.633673687 0.615854958 0.470963903 0.02209
0.641553411 0.621000635 0.492999164 0.101
0.588310547 0.57312727 0.490874808 0.07214
0.588535558 0.571499503 -0.08068 -0.03153
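One way to get all three groups onto a single plot (a sketch, not from the original thread): row-bind the three data frames with a group label and map that label to the colour aesthetic, so ggplot2 fits one regression line per group. The toy data below stands in for the three CSV files; the real code would build `all_groups` from `read.csv("old.csv")`, `read.csv("middle.csv")`, and `read.csv("young.csv")` instead.

```r
library(ggplot2)

# Toy stand-in for the three files, with the question's column names
set.seed(1)
make_grp <- function(n, g) data.frame(
  Callosal_FA = runif(n, 0.55, 0.65),
  CST_FA      = runif(n, 0.54, 0.62),
  SLF_FA      = runif(n, 0.45, 0.54),
  Area_SMA_A  = rnorm(n, 0.05, 0.07),
  Group       = g)
all_groups <- rbind(make_grp(24, "Old"), make_grp(20, "Middle"), make_grp(27, "Young"))

# One adjusted R^2 per group from the full multiple regression, for the title
r2 <- sapply(split(all_groups, all_groups$Group), function(d)
  summary(lm(Area_SMA_A ~ CST_FA + SLF_FA + Callosal_FA, data = d))$adj.r.squared)

# Mapping Group to colour makes stat_smooth fit a separate line per group
p <- ggplot(all_groups, aes(x = CST_FA, y = Area_SMA_A, colour = Group)) +
  geom_point() +
  stat_smooth(method = "lm") +
  labs(title = paste("Adj R2:", paste(names(r2), signif(r2, 3), collapse = ", ")))
p
```

Note that, like the single-group plot in the question, the fitted lines shown are simple regressions on the x variable being plotted, while the R^2 in the title comes from the full three-predictor model.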

Unexpected error using Jump with Julia

I am trying to solve an optimization problem, and I am getting the following error:
"ERROR: Expected m to be a JuMP model, but it has type Int64
in validmodel(::Int64, ::Symbol) at C:\Users\Ting.julia\v0.5\JuMP\src\macros.jl:247
in macro expansion; at C:\Users\Ting.julia\v0.5\JuMP\src\macros.jl:252 [inlined]
in macro expansion; at .\REPL[608]:3 [inlined]
in anonymous at .\:?"
Please see the following code (the error is in constraint 2). Please don't mind the way I have defined the arrays; any help is appreciated. Thank you.
using JuMP
using Gurobi
m = Model(solver = GurobiSolver()) #if GurobiSolver is to be used .
## insert all matrices here
#this is the cost for plant to warehouse
plant=4 #last index for {1,2,3}
product=5 #last index for {2,3,4}
customer=50
warehouse=4
@variable(m, x[i=1:product, k=1:plant, l=1:warehouse] >= 0) #plant to warehouse
@variable(m, y[i=1:product, k=1:warehouse, l=1:customer] >= 0) #warehouse to customer
@variable(m, z[i=1:product, k=1:plant, l=1:customer] >= 0) #plant to customer
@variable(m, p[i=1:product, k=1:plant] >= 0) #any product i produced at plant k
#THIS GIVES COST OF PRODUCING AT ANY PRODUCT I AT PLANT K
PC=[500 500 500 500;
400 400 400 400;
300 300 300 300;
200 200 200 200;
100 100 100 100]
#DEMAND OF I AT ANY CUSTOMER M, SHOULD BE A MATRIX OF (5*50)
D=[4650.28 10882.70 7920.68 2099.06 4920.32 5077.80 2259.10 9289.30 9782.28 4671.85 6625.68 6956.80 5288.12 4144.78 11121.56 9152.47 10206.88 4601.63 2718.91 1439.39 2984.38 3631.17 3934.48 12314.28 4188.04 8437.43 6302.34 1248.62 6286.56 7333.46 11027.86 6233.33 7240.82 5652.13 10276.03 1197.22 11160.13 4510.31 8850.49 8291.09 1081.47 7652.23 3936.85 2640.47 7726.72 1422.96 1644.78 1060.39 6858.66 6554.45;
528.11 4183.80 352.45 366.34 1961.78 3419.11 337.44 708.15 3556.56 1649.95 583.25 1525.97 1569.92 349.93 1904.59 2221.80 2139.63 1822.87 546.11 784.93 948.33 1424.26 1910.64 2275.11 1527.57 2477.49 1592.14 90.86 2635.48 131.02 2402.35 2669.67 105.34 1350.60 4233.60 411.54 687.88 89.09 213.23 2817.29 8.08 1586.51 577.07 1529.34 2919.06 393.97 85.45 214.93 3193.94 1565.64;
480.26 622.67 131.04 14.45 1299.71 599.27 83.08 197.37 1986.77 409.08 371.12 1249.92 216.21 62.43 34.96 1752.75 227.06 184.26 219.92 577.37 138.71 36.23 1659.02 1323.50 236.64 2557.64 76.74 74.08 363.64 52.96 456.67 1589.86 81.89 617.11 509.86 145.52 14.13 83.22 215.03 2749.34 7.12 490.00 120.42 456.03 430.22 165.02 66.16 150.70 2806.58 1403.70;
307.36 474.39 7.56 11.76 882.03 222.62 27.29 158.13 55.94 332.98 171.36 492.81 44.12 24.08 15.57 739.97 11.09 199.51 136.46 194.40 63.72 2.42 355.99 1005.42 66.33 1647.51 47.22 21.32 218.06 11.54 305.81 387.71 8.50 248.38 9.20 76.05 13.12 39.83 146.52 379.44 2.75 239.53 94.06 136.96 290.16 237.75 9.04 110.64 842.58 395.08;
76.52 280.62 5.06 6.75 281.41 215.58 5.78 54.69 20.79 22.08 78.50 322.13 34.13 6.37 11.66 178.33 3.40 142.11 60.70 46.17 6.96 1.15 227.70 669.39 3.21 526.85 45.91 17.00 131.43 11.19 189.00 43.93 3.36 110.66 1.75 41.34 0 38.63 50.78 241.19 0 176.32 94.25 99.59 153.50 123.02 3.76 122.52 853.48 99.62]
a = Array{Float64}(5,4,4)
a[1,1,1]=a[2,1,1]=a[3,1,1]=a[4,1,1]=a[5,1,1]=0.2*528.42
a[1,2,1]=a[2,2,1]=a[3,2,1]=a[4,2,1]=a[5,2,1]=0.2*1366.16
a[1,3,1]=a[2,3,1]=a[3,3,1]=a[4,3,1]=a[5,3,1]=0.2*1525.41
a[1,4,1]=a[2,4,1]=a[3,4,1]=a[4,4,1]=a[5,4,1]=0.2*878.11
a[1,1,2]=a[2,1,2]=a[3,1,2]=a[4,1,2]=a[5,1,2]=0.2*1692.25
a[1,2,2]=a[2,2,2]=a[3,2,2]=a[4,2,2]=a[5,2,2]=0.2*1553.06
a[1,3,2]=a[2,3,2]=a[3,3,2]=a[4,3,2]=a[5,3,2]=0.2*817.18
a[1,4,2]=a[2,4,2]=a[3,4,2]=a[4,4,2]=a[5,4,2]=0.2*2164.69
a[1,1,3]=a[2,1,3]=a[3,1,3]=a[4,1,3]=a[5,1,3]=0.2*2006.5
a[1,2,3]=a[2,2,3]=a[3,2,3]=a[4,2,3]=a[5,2,3]=0.2*1385.04
a[1,3,3]=a[2,3,3]=a[3,3,3]=a[4,3,3]=a[5,3,3]=0.2*998.58
a[1,4,3]=a[2,4,3]=a[3,4,3]=a[4,4,3]=a[5,4,3]=0.2*2148.45
a[1,1,4]=a[2,1,4]=a[3,1,4]=a[4,1,4]=a[5,1,4]=0.2*1073.07
a[1,2,4]=a[2,2,4]=a[3,2,4]=a[4,2,4]=a[5,2,4]=0.2*368.35
a[1,3,4]=a[2,3,4]=a[3,3,4]=a[4,3,4]=a[5,3,4]=0.2*450.12
a[1,4,4]=a[2,4,4]=a[3,4,4]=a[4,4,4]=a[5,4,4]=0.2*1129.27
@objective(m, Min, sum(a[i,k,l]*x[i,k,l] for i=1:product for k=1:plant for l=1:warehouse) + sum(c_dash[i,l,m]*y[i,l,m] for i=1:product for l=1:warehouse for m=1:plant) + sum(c_dash_dash[i,k,m]*z[i,k,m] for i=1:product for k=1:plant for m=1:customer) + sum(PC[i,k]*p[i,k] for i=1:product for k=1:plant)) #to be changed
@constraint(m, p[1,2]==0)
@constraint(m, p[1,3]==0)
@constraint(m, p[1,4]==0)
@constraint(m, p[2,1]==0)
@constraint(m, p[2,3]==0)
@constraint(m, p[2,4]==0)
@constraint(m, p[3,1]==0)
@constraint(m, p[3,2]==0)
@constraint(m, p[3,4]==0)
@constraint(m, p[4,1]==0)
@constraint(m, p[4,2]==0)
@constraint(m, p[4,3]==0)
@constraint(m, p[5,1]==0)
@constraint(m, p[5,2]==0)
@constraint(m, p[5,3]==0)
@constraint(m, p[1,1]<=450000)
@constraint(m, p[2,2]<=108000)
@constraint(m, p[3,3]<=45000)
@constraint(m, p[4,4]<=18000)
@constraint(m, p[5,4]<=9000)
#constraint 1
@constraint(m, 415728.69 - 0.8*sum(y[i,l,m] for i=1:product for l=1:warehouse for m=1:customer) <= 0)
#constraint 2
for m=1:customer
for i=1:product
@constraint(m, D[i,m] - sum(z[i,k,m] for k=1:plant) - sum(y[i,l,m] for l=1:warehouse) <= 0) #can't get this to work
end
end
The error explains the problem well. Your outer loop variable here is m, so every use of m inside the loop refers to the loop variable, not to your model; m is also the name that holds the model in the outer scope. Rename either the loop variable or the model variable and the problem is fixed.
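A minimal sketch of the fix, keeping the question's v0.5-era JuMP syntax: the only change is the loop index name (`cust` is an arbitrary new name), so `m` inside `@constraint` once again refers to the JuMP model.

```julia
# Sketch: loop index renamed from `m` to `cust`; assumes the model `m`,
# data `D`, variables `y`/`z`, and the sizes are defined as in the question.
for cust = 1:customer
    for i = 1:product
        # `m` is now unambiguously the model built earlier
        @constraint(m, D[i, cust] - sum(z[i, k, cust] for k = 1:plant)
                       - sum(y[i, l, cust] for l = 1:warehouse) <= 0)
    end
end
```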

Discontinuous Quaternion Signals

I'm using a BNO055 IMU, and I sometimes see "jumps" in the quaternion signals during movement. Is this normal? Here's a sample plot.
The first plot is the scalar component and the rest are the three vector components.
I assumed this happens with Euler angles but not with quaternions. Is there something I'm missing?
0.4978 0.37885 0.65814 -0.41888
0.49774 0.37854 0.65778 -0.41986
0.49762 0.37842 0.65759 -0.42035
0.49878 0.37616 0.6582 -0.42017
0.49878 0.37561 0.65784 -0.42114
0.49872 0.37537 0.65765 -0.42175
0.49933 0.3736 0.65802 -0.42206
0.49902 0.37347 0.65753 -0.42328
0.49896 0.37335 0.65735 -0.42371
0.49921 0.37189 0.6579 -0.42383
0.49872 0.37164 0.65771 -0.42499
0.49841 0.37158 0.65784 -0.42517
0.49854 0.37042 0.65881 -0.42444
0.49792 0.37042 0.65936 -0.42444
0.4975 0.37048 0.65961 -0.42456
0.49768 0.36932 0.66034 -0.42413
0.49701 0.36957 0.66034 -0.42468
0.49664 0.36975 0.66022 -0.42511
0.49719 0.36823 0.66083 -0.42487
0.49658 0.36877 0.66028 -0.42596
0.49622 0.36908 0.65991 -0.42664
0.49683 0.3678 0.66034 -0.42645
0.49609 0.36841 0.65973 -0.42767
0.49573 0.36871 0.65948 -0.42822
0.49658 0.36713 0.66003 -0.42773
0.49591 0.36768 0.65948 -0.42889
0.49554 0.36798 0.65918 -0.4295
0.49658 0.36572 0.66016 -0.42877
0.49554 0.36639 0.65942 -0.43048
0.49493 0.36676 0.659 -0.43158
0.49426 0.36682 0.65857 -0.43292
0.49316 0.36774 0.65765 -0.43475
0.49274 0.36816 0.65723 -0.43555
0.49261 0.36768 0.65717 -0.43616
0.49188 0.3681 0.65662 -0.43744
0.49152 0.36829 0.65643 -0.43805
0.4917 0.36719 0.65662 -0.43842
0.49091 0.36749 0.65619 -0.43976
0.49048 0.36768 0.65594 -0.44043
0.4903 0.36707 0.65607 -0.44098
0.48956 0.36713 0.65582 -0.44208
0.48926 0.36707 0.65576 -0.44257
0.48944 0.36591 0.65619 -0.44263
0.48926 0.36542 0.65631 -0.44312
0.48914 0.36511 0.65649 -0.44324
0.48962 0.36328 0.65741 -0.44281
0.48956 0.3623 0.6579 -0.44293
0.48956 0.36176 0.6582 -0.44299
0.4903 0.35937 0.65936 -0.44238
0.48993 0.35846 0.65967 -0.44305
0.48969 0.35815 0.65979 -0.44348
0.4903 0.35614 0.66077 -0.44287
0.48975 0.35596 0.66095 -0.44336
0.48956 0.35583 0.66101 -0.44354
0.48999 0.35449 0.66168 -0.44312
0.48969 0.35437 0.66174 -0.4436
0.48944 0.35431 0.66174 -0.44385
0.48932 0.35382 0.66193 -0.44409
0.48907 0.35364 0.66199 -0.44446
0.48895 0.35358 0.66199 -0.44464
0.48914 0.35272 0.66211 -0.44482
0.48883 0.35284 0.66193 -0.44543
0.48865 0.35297 0.6618 -0.4458
0.48938 0.35162 0.66229 -0.44519
0.48914 0.35193 0.66205 -0.44562
0.48901 0.35205 0.66199 -0.44574
0.48993 0.35059 0.66278 -0.44476
0.48999 0.35028 0.66309 -0.44446
0.49011 0.3501 0.66321 -0.44427
0.49097 0.34845 0.66425 -0.44305
0.49103 0.34814 0.66467 -0.44263
0.49103 0.34802 0.66486 -0.44238
0.49133 0.34747 0.66553 -0.44153
0.49152 0.34711 0.66608 -0.4408
0.49164 0.34686 0.66638 -0.44037
0.49274 0.34515 0.66766 -0.43854
0.49286 0.34473 0.66827 -0.43781
0.49298 0.34448 0.66858 -0.43744
0.49408 0.34259 0.66998 -0.43549
0.49445 0.34204 0.67065 -0.43451
0.49457 0.34174 0.67102 -0.43402
0.49518 0.34064 0.67194 -0.43274
0.49536 0.34015 0.67249 -0.43213
0.49536 0.3399 0.67273 -0.43188
0.49652 0.33789 0.67395 -0.43024
0.49652 0.33771 0.67426 -0.42987
0.49652 0.33765 0.67438 -0.42969
0.49738 0.33612 0.67535 -0.42834
0.49738 0.336 0.67566 -0.42804
0.49744 0.33588 0.67572 -0.42792
0.49835 0.33429 0.6767 -0.42664
0.49829 0.33423 0.67688 -0.42645
0.49829 0.33423 0.67694 -0.42633
0.49902 0.33276 0.67767 -0.42542
0.4989 0.3327 0.67767 -0.4256
0.49878 0.3327 0.67767 -0.42572
0.49908 0.33197 0.6781 -0.42535
0.49884 0.33209 0.67798 -0.42572
0.49872 0.33221 0.67786 -0.4259
0.49847 0.33228 0.67786 -0.42615
0.49835 0.3324 0.67786 -0.42627
0.49817 0.33252 0.67773 -0.42664
0.4986 0.33179 0.67798 -0.42627
0.49847 0.33185 0.67786 -0.42651
0.49823 0.33191 0.67773 -0.42694
0.4986 0.33105 0.67804 -0.4267
0.49854 0.33112 0.67798 -0.42682
0.49835 0.3313 0.67773 -0.42725
0.49829 0.33112 0.67767 -0.42761
0.49811 0.33118 0.67755 -0.42792
0.49786 0.33118 0.67743 -0.42841
0.49872 0.32953 0.6781 -0.42761
0.49866 0.32947 0.67804 -0.4278
0.49847 0.32947 0.67804 -0.42804
0.49847 0.32922 0.6781 -0.4281
0.49847 0.32922 0.67816 -0.42816
0.49841 0.32898 0.67822 -0.4281
0.49957 0.32697 0.67914 -0.42694
0.49957 0.32684 0.6792 -0.42694
0.49957 0.3266 0.67938 -0.42682
0.50031 0.32544 0.67993 -0.4259
0.50037 0.32532 0.68005 -0.42578
0.50049 0.32495 0.6803 -0.4256
0.50159 0.32306 0.68121 -0.42426
0.50165 0.32288 0.68127 -0.42413
0.50183 0.32251 0.68146 -0.42401
0.50256 0.32129 0.68207 -0.4231
0.50262 0.32117 0.68213 -0.42303
0.50269 0.32092 0.68219 -0.42291
0.50281 0.3208 0.68213 -0.42297
0.50287 0.32074 0.68213 -0.42303
0.50293 0.32068 0.68201 -0.42322
0.50323 0.32019 0.68207 -0.4231
0.50323 0.32019 0.68195 -0.42328
0.50323 0.32025 0.68182 -0.42346
0.50385 0.31934 0.68213 -0.42291
0.50385 0.31934 0.68207 -0.42303
0.50391 0.31934 0.68188 -0.42322
0.50513 0.31738 0.68268 -0.42194
0.50519 0.31738 0.68262 -0.42206
0.50513 0.31744 0.6825 -0.42218
0.50537 0.31708 0.68262 -0.42206
0.50537 0.31714 0.68256 -0.42212
0.50543 0.31708 0.6825 -0.42212
0.50629 0.31567 0.68311 -0.4212
0.50629 0.31567 0.68311 -0.4212
0.50629 0.31567 0.68304 -0.4212
0.50684 0.31482 0.68347 -0.42059
0.50696 0.31458 0.68359 -0.42035
0.50702 0.31451 0.68365 -0.42029
0.50757 0.3136 0.68402 -0.41974
0.50757 0.3136 0.68396 -0.41974
0.50763 0.3136 0.68396 -0.41974
0.50806 0.31287 0.68427 -0.41925
0.50806 0.31281 0.68427 -0.41925
0.50812 0.31281 0.68427 -0.41931
0.50867 0.31183 0.68469 -0.41858
0.50873 0.31183 0.68469 -0.41858
0.50873 0.31183 0.68469 -0.41858
0.50916 0.31104 0.685 -0.41803
0.50922 0.31104 0.685 -0.41803
0.50922 0.31104 0.685 -0.41803
0.51013 0.30957 0.68561 -0.41699
0.51013 0.30957 0.68561 -0.41699
0.51013 0.30957 0.68561 -0.41699
0.51044 0.30902 0.68579 -0.41663
0.51044 0.30902 0.68579 -0.41669
0.51044 0.30902 0.68579 -0.41669
0.51111 0.30792 0.68628 -0.41589
0.51117 0.30792 0.68622 -0.41589
0.51135 0.30768 0.68628 -0.41577
0.51135 0.30768 0.68628 -0.41577
0.51135 0.30762 0.68622 -0.41583
0.51208 0.3064 0.68677 -0.41498
0.51208 0.3064 0.68677 -0.41498
0.51215 0.30627 0.68683 -0.41492
0.51312 0.30463 0.68756 -0.4137
0.51324 0.3045 0.68756 -0.41364
0.51324 0.3045 0.68756 -0.41364
0.51385 0.30341 0.68799 -0.4129
0.51392 0.30341 0.68799 -0.41296
0.51392 0.30334 0.68799 -0.41296
0.51428 0.3028 0.68811 -0.41266
0.51434 0.30273 0.68805 -0.41272
0.51483 0.302 0.68835 -0.41223
0.51483 0.302 0.68835 -0.41223
0.51483 0.30194 0.68835 -0.41229
0.51495 0.30176 0.68842 -0.41217
0.51501 0.3017 0.68835 -0.41217
0.51501 0.3017 0.68835 -0.41223
0.51526 0.30139 0.68842 -0.41211
0.51526 0.30133 0.68842 -0.41211
0.51526 0.30133 0.68835 -0.41211
0.51593 0.30023 0.68872 -0.41144
0.51593 0.30023 0.68872 -0.41144
0.51611 0.29999 0.68878 -0.41132
0.51617 0.29987 0.68872 -0.41144
0.51617 0.29993 0.6886 -0.41162
0.51691 0.29865 0.68903 -0.41095
0.51691 0.29858 0.68896 -0.41101
0.51691 0.29846 0.68896 -0.41113
0.51703 0.29834 0.68884 -0.41125
0.51715 0.29822 0.68872 -0.41138
0.51721 0.2981 0.68872 -0.41144
0.51746 0.29767 0.68878 -0.41138
0.51752 0.29742 0.68878 -0.41144
0.51746 0.29761 0.6886 -0.41174
0.51752 0.29761 0.68854 -0.4118
0.51752 0.29761 0.68854 -0.4118
0.51776 0.2973 0.68854 -0.41162
0.51776 0.2973 0.68854 -0.41162
0.51776 0.2973 0.68854 -0.41162
0.5177 0.29736 0.68848 -0.41174
0.5177 0.2973 0.68848 -0.4118
0.51776 0.29724 0.68842 -0.41187
0.51831 0.29633 0.68872 -0.41138
0.51837 0.29639 0.68866 -0.41144
0.51831 0.29639 0.68854 -0.41156
0.51874 0.29559 0.68884 -0.41119
0.51874 0.29547 0.68884 -0.41125
0.51892 0.29517 0.68878 -0.41125
0.51898 0.29504 0.68878 -0.41132
0.51923 0.2948 0.68854 -0.41156
0.51959 0.29443 0.68829 -0.4118
0.52008 0.2937 0.68835 -0.41156
0.52032 0.29352 0.68799 -0.41205
0.52032 0.2937 0.68726 -0.41315
0.52045 0.2937 0.68695 -0.41351
0.52051 0.29388 0.68616 -0.41455
0.52063 0.2937 0.68585 -0.41504
0.52112 0.29291 0.68518 -0.41608
0.52179 0.29218 0.68427 -0.41736
0.52252 0.29144 0.68317 -0.4187
0.52313 0.29077 0.68274 -0.41913
0.5238 0.29041 0.68152 -0.42053
0.52417 0.29016 0.68103 -0.42102
0.52472 0.28906 0.68085 -0.42133
0.52545 0.28772 0.68066 -0.42163
0.52637 0.28619 0.68048 -0.42187
0.52692 0.2854 0.68036 -0.42194
0.52808 0.28381 0.67999 -0.42212
0.5293 0.28247 0.67957 -0.42224
0.52997 0.2818 0.67926 -0.42224
0.53107 0.28058 0.67908 -0.422
0.53223 0.27924 0.67908 -0.42151
0.53296 0.27838 0.67902 -0.42126
0.53479 0.27637 0.67871 -0.42065
0.53583 0.27521 0.67853 -0.42035
0.53796 0.27264 0.67828 -0.41974
0.53998 0.27008 0.67816 -0.41907
0.54102 0.26874 0.67816 -0.41852
0.54321 0.26575 0.67834 -0.4173
0.54559 0.26257 0.67853 -0.41589
0.54687 0.26099 0.67853 -0.41522
0.54944 0.25824 0.6781 -0.41425
0.55151 0.25665 0.67694 -0.41437
0.55225 0.25641 0.67615 -0.41479
0.55341 0.25647 0.67456 -0.41583
0.55475 0.25623 0.67291 -0.41687
0.55554 0.25574 0.67224 -0.41718
0.55768 0.25403 0.67126 -0.41699
0.56042 0.25128 0.67096 -0.41547
0.56183 0.24976 0.67078 -0.41473
0.56439 0.24719 0.67017 -0.41382
0.56616 0.24554 0.66949 -0.41351
0.56683 0.24518 0.66901 -0.41357
0.56769 0.24554 0.6676 -0.41443
0.56818 0.24713 0.66547 -0.41626
0.56836 0.24817 0.66418 -0.41742
0.56854 0.25073 0.66156 -0.41974
0.56873 0.25354 0.65955 -0.42108
0.56891 0.25476 0.65875 -0.42126
0.5697 0.25684 0.65747 -0.42096
(several hundred logged rows of four-component quaternion samples omitted here for brevity; each row looks like `0.57098 0.25903 0.65570 -0.42065`, with the components staying roughly between -1 and 1 apart from the occasional jumps the question asks about)
Thanks
As @minorlogic also wrote in their comment, these jumps occur near the circular limits of (0, 2pi] or (-pi, pi] -- or, in terms of unit quaternions (being constructed as [cos(phi), sin(phi) * rotAxis]), more appropriately in the range (-1, 1] for each component.
Most IMUs, however, don't report their orientation readings in normalised form; the normalisation factor(s) should be stated in the data sheet (as @daniel-wisehart mentioned in their comment). After applying those factors, you should obtain quaternion components between -1 and 1. The jumps will still be present, though: they are essentially a consequence of the normalisation requirement (i.e. of keeping unit quaternions unit length), since q and -q encode the same rotation.
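A minimal sketch of the post-processing described above, assuming the raw readings are collected in a hypothetical n x 4 matrix `q_raw` (names and scaling are assumptions, not taken from any specific IMU data sheet):

```r
fix_quaternions <- function(q_raw) {
  # Normalise each row (sample) to unit length
  q <- q_raw / sqrt(rowSums(q_raw^2))
  # Enforce sign continuity: q and -q represent the same rotation,
  # so flip any sample whose dot product with the previous one is negative
  for (i in 2:nrow(q)) {
    if (sum(q[i, ] * q[i - 1, ]) < 0) q[i, ] <- -q[i, ]
  }
  q
}
```

This removes the sign-flip jumps while leaving the represented orientations unchanged.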

Not a Number (NaN) for the standard errors in summary.lmList

I'm using Pixel data from nlme package to fit a model with lmList function:
dat <- lmList(pixel ~ day+I(day^2)|Dog/Side, data=Pixel[Pixel$Dog != 9,], level=2)
I'm curious why I get NaN for Dog==10 when I print the fitted object using summary:
summary(dat)
Call:
Model: pixel ~ day + I(day^2) | Dog/Side
Level: 2
Data: Pixel[Pixel$Dog != 9, ]
Coefficients:
(Intercept)
Estimate Std. Error t value Pr(>|t|)
1/R 1045.349 6.436476 162.41015 0
2/R 1042.166 6.436476 161.91569 0
3/R 1046.265 7.853767 133.21825 0
4/R 1045.602 7.853767 133.13382 0
5/R 1110.309 27.576874 40.26231 0
6/R 1093.556 27.576874 39.65482 0
7/R 1156.478 30.223890 38.26369 0
8/R 1030.754 30.223890 34.10393 0
10/R 1056.600 NaN NaN NaN
1/L 1046.538 6.436476 162.59486 0
2/L 1050.367 6.436476 163.18985 0
3/L 1047.438 7.853767 133.36754 0
4/L 1050.915 7.853767 133.81027 0
5/L 1068.412 27.576874 38.74306 0
6/L 1089.184 27.576874 39.49630 0
7/L 1139.851 30.223890 37.71356 0
8/L 1086.129 30.223890 35.93611 0
10/L 1041.100 NaN NaN NaN
day
Estimate Std. Error t value Pr(>|t|)
1/R 0.21534820 2.600975 0.08279519 9.343899e-01
2/R 3.82436362 2.600975 1.47035789 1.485802e-01
3/R 8.59752235 1.698113 5.06298479 7.828854e-06
4/R 12.18801561 1.698113 7.17738612 6.287493e-09
5/R 4.91365979 6.709441 0.73235013 4.678382e-01
6/R -0.01159794 6.709441 -0.00172860 9.986286e-01
7/R 0.27908291 7.755457 0.03598536 9.714568e-01
8/R 14.20961055 7.755457 1.83220800 7.369405e-02
10/R 16.10000000 NaN NaN NaN
1/L 2.22308391 2.600975 0.85471187 3.973407e-01
2/L 3.31617525 2.600975 1.27497407 2.090100e-01
3/L 6.03985508 1.698113 3.55680313 9.127977e-04
4/L 12.48222079 1.698113 7.35064026 3.512296e-09
5/L 14.13427835 6.709441 2.10662542 4.088737e-02
6/L 7.22757732 6.709441 1.07722501 2.872506e-01
7/L -0.77719849 7.755457 -0.10021311 9.206304e-01
8/L 3.97248744 7.755457 0.51221835 6.110599e-01
10/L 30.60000000 NaN NaN NaN
I(day^2)
Estimate Std. Error t value Pr(>|t|)
1/R -0.0507392 0.1819114 -0.2789227 7.816110e-01
2/R -0.2228509 0.1819114 -1.2250523 2.270733e-01
3/R -0.3556849 0.0755204 -4.7097854 2.498505e-05
4/R -0.4708779 0.0755204 -6.2351082 1.522147e-07
5/R -0.3510125 0.3639863 -0.9643565 3.401377e-01
6/R -0.0880891 0.3639863 -0.2420122 8.098952e-01
7/R -0.1462626 0.4245106 -0.3445440 7.320786e-01
8/R -0.7429334 0.4245106 -1.7500941 8.707333e-02
10/R -1.6250000 NaN NaN NaN
1/L -0.1649267 0.1819114 -0.9066324 3.695397e-01
2/L -0.2135152 0.1819114 -1.1737319 2.468167e-01
3/L -0.2764050 0.0755204 -3.6600044 6.720231e-04
4/L -0.5425352 0.0755204 -7.1839551 6.150012e-09
5/L -0.8313144 0.3639863 -2.2839170 2.725859e-02
6/L -0.5060199 0.3639863 -1.3902170 1.714560e-01
7/L -0.1847048 0.4245106 -0.4351005 6.656163e-01
8/L -0.1878769 0.4245106 -0.4425729 6.602428e-01
10/L -1.9500000 NaN NaN NaN
Residual standard error: 8.820516 on 44 degrees of freedom
For Dog==10 the model goes exactly through every data point, so the residual sum of squares is zero and there are no residual degrees of freedom left, which results in NaN for the Std. Error (and hence for the t value and p-value).

Comparing regression models with R

Is there a tool available in R to produce publication-ready regression tables? I am working on a course paper in which I need to compare several regression models, and I would be very glad if I could nest them within a single table like this one, from the estout Stata package.
I have checked xtable, but could not achieve the same results. Any tips would be appreciated.
Here's what I have in mind:
You probably want the mtable function in the 'memisc' package. It has associated LaTeX output arguments:
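A sketch of how a table like the one below can be built (the model formulas are taken from the "Calls" comment in the LaTeX output further down; the particular summary.stats selection is an assumption chosen to match the displayed rows):

```r
library(memisc)

# The three models from the LifeCycleSavings example
lm1 <- lm(sr ~ pop15 + pop75, data = LifeCycleSavings)
lm2 <- lm(sr ~ dpi + ddpi, data = LifeCycleSavings)
lm3 <- lm(sr ~ pop15 + pop75 + dpi + ddpi, data = LifeCycleSavings)

# Combine them side by side into one table
mtable123 <- mtable("Model 1" = lm1, "Model 2" = lm2, "Model 3" = lm3,
                    summary.stats = c("sigma", "R-squared", "F", "p", "N"))
mtable123
```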
==========================================================================
Model 1 Model 2 Model 3
--------------------------------------------------------------------------
Constant 30.628*** 6.360*** 28.566***
(7.409) (1.252) (7.355)
Percentage of population under 15 -0.471** -0.461**
(0.147) (0.145)
Percentage of population over 75 -1.934 -1.691
(1.041) (1.084)
Real per-capita disposable income 0.001 -0.000
(0.001) (0.001)
Growth rate of real per-capita disp. income 0.529* 0.410*
(0.210) (0.196)
--------------------------------------------------------------------------
sigma 3.931 4.189 3.803
R-squared 0.262 0.162 0.338
F 8.332 4.528 5.756
p 0.001 0.016 0.001
N 50 50 50
==========================================================================
This is the LaTeX code you get:
texfile123 <- "mtable123.tex"
write.mtable(mtable123,forLaTeX=TRUE,file=texfile123)
file.show(texfile123)
#------------------------
%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%
%
% Calls:
% Model 1: lm(formula = sr ~ pop15 + pop75, data = LifeCycleSavings)
% Model 2: lm(formula = sr ~ dpi + ddpi, data = LifeCycleSavings)
% Model 3: lm(formula = sr ~ pop15 + pop75 + dpi + ddpi, data = LifeCycleSavings)
%
%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%
\begin{tabular}{lcD{.}{.}{7}cD{.}{.}{7}cD{.}{.}{7}}
\toprule
&&\multicolumn{1}{c}{Model 1} && \multicolumn{1}{c}{Model 2} && \multicolumn{1}{c}{Model 3}\\
\midrule
Constant & & 30.628^{***} && 6.360^{***} && 28.566^{***}\\
& & (7.409) && (1.252) && (7.355) \\
Percentage of population under 15 & & -0.471^{**} && && -0.461^{**} \\
& & (0.147) && && (0.145) \\
Percentage of population over 75 & & -1.934 && && -1.691 \\
& & (1.041) && && (1.084) \\
Real per-capita disposable income & & && 0.001 && -0.000 \\
& & && (0.001) && (0.001) \\
Growth rate of real per-capita disp. income & & && 0.529^{*} && 0.410^{*} \\
& & && (0.210) && (0.196) \\
\midrule
sigma & & 3.931 && 4.189 && 3.803 \\
R-squared & & 0.262 && 0.162 && 0.338 \\
F & & 8.332 && 4.528 && 5.756 \\
p & & 0.001 && 0.016 && 0.001 \\
N & & 50 && 50 && 50 \\
\bottomrule
\end{tabular}
The R wikibook has some good sources on production quality output in R.
I think this function from Paul Johnson that is listed in the wikibook is exactly what you're looking for:
http://pj.freefaculty.org/R/WorkingExamples/outreg-worked.R
I edited the function for my own use to work with the booktabs format and allow for models that have extra attributes:
http://chandlerlutz.com/R/outregBkTabs.r
xtable can do this, but it's somewhat of a hack.
Take two linear models, named lm.x and lm.y.
If you use the following code:
myregtables <- rbind(xtable(summary(lm.x)), xtable(summary(lm.y)))
xtable will then produce a table with both regression models. If you add a \hline (or perhaps two) in LaTeX, it should look OK. You'll still only have one label and caption for the two models. As I said, it's somewhat of a hacky solution.
