How to create a loop to run 2-variable generalized regression models? - r

I have 19 variables and I want to run 19 different regressions that consist of 2 independent variables from my dataset.
Update: this is my dataset's structure:
$ Failure_Response_Var_Yr: num 0 0 0 0 0 0 0 0 0 0 ...
$ exp_var_nocorr_2 : num 4.61 5.99 6.13 3.17 4.4 ...
$ exp_var_nocorr_3 : num 4.16 5.46 5.24 2.86 3.72 ...
$ exp_var_nocorr_4 : num 0.00191 2.23004 0.5613 1.07986 0.99836 ...
$ exp_var_nocorr_5 : num 0.709 2.79 6.846 15.478 11.418 ...
$ exp_var_nocorr_6 : num 0.724 0.497 1.782 0.156 2.525 ...
$ exp_var_nocorr_7 : num 0 168.17 92.041 0.584 265.338 ...
$ exp_var_nocorr_8 : num -38.64 4.89 1.5 24.8 16.56 ...
$ exp_var_nocorr_9 : num 116 88.3 56.4 60.6 57.6 ...
$ exp_var_nocorr_10 : num 0 10.3 0 93.7 0 ...
$ exp_var_nocorr_11 : num 1.02 1.23 1.31 2.06 1.33 ...
$ exp_var_nocorr_12 : num 60 140 124 275 203 ...
$ exp_var_nocorr_13 : num 10.835 5.175 1.838 0.347 0.783 ...
$ exp_var_nocorr_14 : num 59 60.2 87.2 42.2 84.2 ...
$ exp_var_nocorr_15 : num 61.9 68.3 99 50.2 103.9 ...
$ exp_var_nocorr_16 : num 4.4 11.24 8.23 6.9 8.84 ...
$ exp_var_nocorr_17 : num 6.43 18.62 10.72 15.62 10.35 ...
I wrote this code:
col17 <- names(my.sample)[-c(1:9,26:29)]
so that dput(col17) now gives:
c("exp_var_nocorr_2", "exp_var_nocorr_3", "exp_var_nocorr_4", "exp_var_nocorr_5", "exp_var_nocorr_6", "exp_var_nocorr_7", "exp_var_nocorr_8", "exp_var_nocorr_9", "exp_var_nocorr_10", "exp_var_nocorr_11", "exp_var_nocorr_12", "exp_var_nocorr_13", "exp_var_nocorr_14", "exp_var_nocorr_15", "exp_var_nocorr_16", "exp_var_nocorr_17")
logit.test2 <- vector("list", length(col17))
# start of loop
for(i in seq_along(col17)){
  for(k in seq_along(col17)){
    logit.test2[i] <- glm(reformulate(col17[i]+col17[k], "Failure_Response_Var_Yr"),
                          family=binomial(link='logit'), data=my.sample)
  }
}
# end of loop
but it produced this error:
"Error in col17[i] + col17[k] : non-numeric argument to binary operator"
Can anybody suggest code that fixes this problem?
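For reference, a likely fix (a sketch, assuming my.sample and col17 as shown above): reformulate() takes a character vector of term labels, so the two predictors should be combined with c() rather than +, and list elements should be assigned with [[ ]]. Iterating over combn() also avoids fitting each pair twice, which the nested loop above would do:

```r
# one logistic model per unordered pair of predictors
pairs <- combn(col17, 2, simplify = FALSE)
logit.test2 <- lapply(pairs, function(p)
  glm(reformulate(p, response = "Failure_Response_Var_Yr"),
      family = binomial(link = "logit"), data = my.sample))
names(logit.test2) <- vapply(pairs, paste, character(1), collapse = " + ")
```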

Related

Error when applying comb_CSR function in R

My aim is to estimate a complete subset regression using the function comb_CSR() in R.
My data set is the following:
str(df_na)
Classes ‘fredmd’ and 'data.frame': 360 obs. of 128 variables:
$ date : Date, format: "1992-03-01" "1992-04-01" "1992-05-01" "1992-06-01" ...
$ RPI : num 0.001653 0.00373 0.005329 0.004173 -0.000796 ...
$ W875RX1 : num 0.000812 0.002751 0.005493 0.004447 -0.001346 ...
$ DPCERA3M086SBEA: num 0.001824 0.000839 0.005146 0.002696 0.003342 ...
$ CMRMTSPLx : num 0.00402 0.00664 -0.00874 0.01049 0.0133 ...
$ RETAILx : num -0.003 0.00602 0.00547 0.0028 0.00708 ...
$ INDPRO : num 0.008279 0.007593 0.003221 0.000539 0.008911 ...
$ IPFPNSS : num 0.00851 0.00743 0.0055 -0.00244 0.00998 ...
$ IPFINAL : num 0.00899 0.0076 0.0058 -0.00309 0.01129 ...
$ IPCONGD : num 0.00911 0.00934 0.00648 -0.0049 0.01298 ...
$ IPDCONGD : num 0.0204 0.0185 0.0308 -0.0138 0.0257 ...
$ IPNCONGD : num 0.00518 0.00612 -0.00219 -0.00172 0.00843 ...
$ IPBUSEQ : num 0.01174 0.00958 0.00792 0.00247 0.01016 ...
$ IPMAT : num 0.007989 0.007794 0.000352 0.004296 0.007562 ...
$ IPDMAT : num 0.0113 0.00652 0.01044 0.00211 0.01118 ...
$ IPNMAT : num 0.014042 0.001707 -0.004866 0.010879 0.000204 ...
$ IPMANSICS : num 0.01014 0.00538 0.00579 0.00327 0.0089 ...
$ IPB51222S : num -0.00883 0.04244 -0.02427 -0.04027 0.00958 ...
$ IPFUELS : num 0.0048 0.00603 -0.00854 -0.00383 0.00329 ...
$ CUMFNS : num 0.6213 0.2372 0.2628 0.0569 0.5077 ...
$ HWI : num 140 -104 94 -36 -20 68 -91 55 98 3 ...
$ HWIURATIO : num 0.014611 -0.009559 -0.000538 -0.012463 0.003537 ...
$ CLF16OV : num 0.003116 0.001856 0.002172 0.00265 0.000809 ...
$ CE16OV : num 0.003315 0.002384 -0.000431 0.000372 0.00248 ...
$ UNRATE : num 0 0 0.2 0.2 -0.1 ...
$ UEMPMEAN : num 0.4 0.3 0.4 0.4 -0.1 ...
$ UEMPLT5 : num 0.0404 -0.0346 0.0361 0.0291 -0.023 ...
$ UEMP5TO14 : num -0.05285 0.00711 -0.01177 0.0075 0.00815 ...
$ UEMP15OV : num 0.00439 -0.02087 0.0956 0.08725 -0.03907 ...
$ UEMP15T26 : num -0.0365 -0.0321 0.0564 0.0966 -0.0762 ...
$ UEMP27OV : num 0.0386 -0.0119 0.1255 0.0804 -0.0122 ...
$ CLAIMSx : num -0.02914 -0.02654 -0.00203 0.00323 0.05573 ...
$ PAYEMS : num 0.000498 0.00142 0.001197 0.000607 0.000717 ...
$ USGOOD : num -0.000678 0.000226 0.000136 -0.001718 -0.001041 ...
$ CES1021000001 : num -0.00225 -0.00628 -0.00486 -0.01144 -0.00296 ...
$ USCONS : num 0.00195 -0.003903 0.000434 -0.004571 -0.003059 ...
$ MANEMP : num -0.001427 0.001546 0.000238 -0.000535 -0.000416 ...
$ DMANEMP : num -0.002104 0.000802 0.0002 -0.001304 -0.002009 ...
$ NDMANEMP : num -0.000439 0.00263 0.000292 0.000583 0.001893 ...
$ SRVPRD : num 0.0008 0.00173 0.00147 0.0012 0.00117 ...
$ USTPU : num 4.52e-05 4.97e-04 -6.78e-04 1.36e-04 -1.95e-03 ...
$ USWTRADE : num -0.000509 -0.002313 -0.002043 -0.001476 -0.003649 ...
$ USTRADE : num 1.56e-05 1.47e-03 -1.17e-04 5.22e-04 -1.68e-03 ...
$ USFIRE : num 0.000153 0.001379 0.001835 0.001222 -0.000153 ...
$ USGOVT : num 0.00139 0.001282 0.000747 0.00048 0.002927 ...
$ CES0600000007 : num 40.3 40.5 40.4 40.3 40.3 40.3 40.2 40.3 40.3 40.3 ...
$ AWOTMAN : num 0.1 0 0.2 -0.1 0 ...
$ AWHMAN : num 40.7 40.8 40.9 40.8 40.8 40.8 40.7 40.8 40.9 40.9 ...
$ HOUST : num 7.17 7 7.1 7.04 7.04 ...
$ HOUSTNE : num 4.92 4.81 4.82 4.74 4.81 ...
$ HOUSTMW : num 5.79 5.47 5.7 5.62 5.6 ...
$ HOUSTS : num 6.27 6.17 6.22 6.14 6.16 ...
$ HOUSTW : num 5.72 5.57 5.67 5.68 5.61 ...
$ PERMIT : num 6.99 6.96 6.96 6.96 6.99 ...
$ PERMITNE : num 4.81 4.77 4.83 4.8 4.86 ...
$ PERMITMW : num 5.58 5.49 5.55 5.49 5.53 ...
$ PERMITS : num 6.1 6.05 6.04 6.08 6.09 ...
$ PERMITW : num 5.52 5.59 5.54 5.55 5.59 ...
$ ACOGNO : num 0.04458 0.00165 0.02271 0.01092 -0.00382 ...
$ AMDMNOx : num 0.04682 0.03636 0.0108 -0.02403 -0.00199 ...
$ ANDENOx : num 0.0931 0.0104 0.0242 -0.0371 -0.0105 ...
$ AMDMUOx : num -0.00481 0.00194 0.00191 -0.00509 -0.00927 ...
$ BUSINVx : num 0.003853 0.003351 -0.000536 0.006642 0.005653 ...
$ ISRATIOx : num -0.03 0 -0.01 0 -0.01 ...
$ M1SL : num -0.003773 -0.004802 -0.000372 -0.003294 0.005502 ...
$ M2SL : num -0.004398 -0.002381 0.000911 -0.001208 0.001679 ...
$ M2REAL : num -0.00245 -0.0034 -0.00246 -0.00441 -0.00269 ...
$ BOGMBASE : num 0.01892 -0.02067 0.01167 0.00355 0.002 ...
$ TOTRESNS : num 0.0305 -0.1304 0.0784 0.0465 -0.0082 ...
$ NONBORRES : num 0.02896 -0.12317 0.06947 0.04605 -0.00826 ...
$ BUSLOANS : num 0.00237 -0.00104 0.00132 0.00173 -0.00106 ...
$ REALLN : num -0.00132 0.0058 -0.00663 -0.00338 0.00177 ...
$ NONREVSL : num -6.43e-05 -4.49e-03 3.65e-03 -2.72e-03 5.74e-03 ...
$ CONSPI : num -0.000498 -0.001173 -0.000825 -0.001014 -0.000115 ...
$ S&P 500 : num -0.012684 0.000123 0.018001 -0.015892 0.01647 ...
$ S&P: indust : num -0.01236 -0.000681 0.012694 -0.018013 0.010731 ...
$ S&P div yield : num 0.047815 -0.000371 -0.053946 0.047576 -0.04368 ...
$ S&P PE ratio : num 0.00689 0.02343 0.0377 -0.00193 0.02857 ...
$ FEDFUNDS : num -0.08 -0.25 0.09 -0.06 -0.51 ...
$ CP3Mx : num 0.19 -0.26 -0.16 0.04 -0.48 ...
$ TB3MS : num 0.2 -0.29 -0.12 0.03 -0.45 ...
$ TB6MS : num 0.25 -0.31 -0.12 0.02 -0.49 ...
$ GS1 : num 0.34 -0.33 -0.11 -0.02 -0.57 ...
$ GS5 : num 0.37 -0.17 -0.09 -0.21 -0.64 ...
$ GS10 : num 0.2 -0.06 -0.09 -0.13 -0.42 ...
$ AAA : num 0.06 -0.02 -0.05 -0.06 -0.15 ...
$ BAA : num 0.02 -0.04 -0.08 -0.08 -0.21 ...
$ COMPAPFFx : num 0.32 0.31 0.06 0.16 0.19 0.08 0.02 0.23 0.57 0.75 ...
$ TB3SMFFM : num 0.06 0.02 -0.19 -0.1 -0.04 -0.17 -0.31 -0.24 0.04 0.3 ...
$ TB6SMFFM : num 0.2 0.14 -0.07 0.01 0.03 -0.09 -0.26 -0.06 0.25 0.44 ...
$ T1YFFM : num 0.65 0.57 0.37 0.41 0.35 0.17 -0.04 0.2 0.59 0.79 ...
$ T5YFFM : num 2.97 3.05 2.87 2.72 2.59 2.3 2.16 2.5 2.95 3.16 ...
$ T10YFFM : num 3.56 3.75 3.57 3.5 3.59 3.29 3.2 3.49 3.78 3.85 ...
$ AAAFFM : num 4.37 4.6 4.46 4.46 4.82 4.65 4.7 4.89 5.01 5.06 ...
$ BAAFFM : num 5.27 5.48 5.31 5.29 5.59 5.35 5.4 5.74 5.87 5.89 ...
$ TWEXAFEGSMTHx : num 0.02529 -0.00399 -0.01238 -0.02224 -0.02363 ...
$ EXSZUSx : num 0.036 0.0066 -0.0191 -0.0451 -0.0655 ...
$ EXJPUSx : num 0.03964 0.00508 -0.02095 -0.03056 -0.00755 ...
$ EXUSUKx : num -0.0308 0.0188 0.0297 0.0249 0.0332 ...
[list output truncated]
- attr(*, "na.action")= 'omit' Named int [1:402] 1 2 3 4 5 6 7 8 9 10 ...
..- attr(*, "names")= chr [1:402] "2" "3" "4" "5" ...
The code is the following so far:
#define response variable
y <- df_na$PAYEMS
#define matrix of predictor variables
x <- data.matrix(df_na[, !names(df_na) %in% c("PAYEMS", "date")])
# break data into in-sample and out-of-sample
y.in = y[1:190]; y.out = y[-c(1:190)]
x.in = x[1:190, ]; x.out = x[-c(1:190), ]
trial <- foreccomb(y.in, x.in, y.out, x.out)
result <- comb_CSR(trial)
However, as soon as I run the last line, I get the following error:
> result <- comb_CSR(trial)
Error in matrix(0, ndiff_models, 4) :
invalid 'nrow' value (too large or NA)
In addition: Warning message:
In matrix(0, ndiff_models, 4) : NAs introduced by coercion to integer range
The data set does not have any NA values, as I remove them beforehand. Unfortunately, I do not understand where the error comes from. Does anyone have an idea?
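One plausible cause (an assumption, not verified against the ForecastComb source): comb_CSR() enumerates complete subsets of the candidate columns, and with roughly 126 of them the subset counts exceed R's integer range, which would produce exactly the NA nrow seen in matrix(0, ndiff_models, 4):

```r
# with p candidates, complete subset regression considers choose(p, k)
# combinations per subset size k; for p = 126 the middle sizes are astronomical
choose(126, 63)              # far beyond .Machine$integer.max (2147483647)
as.integer(choose(126, 63))  # NA, with a coercion warning
```

Rerunning with only a handful of predictor columns is a quick way to test this hypothesis.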

"'groups' must be a factor" on Shapiro-Wilk test on Rcmdr

I am trying to run a Shapiro-Wilk normality test in R (Rcmdr, to be more accurate) by going to "Statistics => Summary => Descriptive statistics", selecting one of my dependent variables, and choosing "summary by group".
Rcmdr automatically generates the following code:
normalityTest(Algometre.J0 ~ Modalite, test="shapiro.test",
data=Dataset)
And I am getting the following error message:
'groups' must be a factor.
I have already categorized my independent variable as a factor (I swear, I did!)
Any idea what's wrong?
Thanks in advance.
Here is what str(Dataset) shows :
'data.frame': 76 obs. of 11 variables:
$ Modalite : chr "C" "C" "C" "C" ...
$ Angle.J0 : num 20.1 20.5 21 22.5 19.1 ...
$ Angle.J1 : num 21.7 22.6 22.8 23.3 20.5 ...
$ Angle.J2 : num 22.3 23 23.9 24.2 21 ...
$ Epaisseur.J0: num 1.97 1.54 1.76 1.89 1.53 1.87 1.54 2 1.79 1.41 ...
$ Epaisseur.J1: num 2.07 1.49 1.87 1.91 1.54 1.9 1.51 2.03 1.71 1.48 ...
$ Epaisseur.J2: num 2.08 1.69 1.77 2 1.61 1.99 1.38 2.06 1.86 1.53 ...
$ Algometre.J0: num 45 40 105 165 66.3 ...
$ Algometre.J1: num 32.7 39.7 91.7 124 63.7 ...
$ Algometre.J2: num 51.3 58.7 101 138 60.3 ...
$ ObsNumber : int 1 2 3 4 5 6 7 8 9 10 ...
What does that mean?
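A likely explanation, given the str() output above: Modalite is still stored as character ("chr"), not as a factor, so an earlier conversion may never have been assigned back into Dataset. Converting it in place should satisfy normalityTest (a sketch against the code Rcmdr generated):

```r
# str(Dataset) shows Modalite is chr; make it a factor before grouping
Dataset$Modalite <- factor(Dataset$Modalite)
normalityTest(Algometre.J0 ~ Modalite, test = "shapiro.test", data = Dataset)
```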

How to perform sensitivity analysis using Lek's profile in R?

I am trying to do a sensitivity analysis using R. My data set has a few continuous explanatory variables and a categorical response variable (7 categories).
I tried to run the code below.
model = train(factor(mode) ~ Time + Cost + Age + Income,
              method = "nnet",
              preProcess = c("center", "scale"),
              data = train,
              verbose = FALSE,
              trControl = trainControl(method = 'cv', verboseIter = FALSE),
              tuneGrid = expand.grid(.size = c(1:20), .decay = c(0, 0.001, 0.01, 0.1)))
After getting the output from this code, I tried to build a Lek's profile with:
lekprofile(model)
However, I got an error stating "Error in xvars[, x_names] : subscript out of bounds"
Please help me to resolve the error.
lekprofile doesn't work for a classification model. For example, if we use a regression model:
library(caret)
library(NeuralNetTools)
library(mlbench)
data(BostonHousing)
str(BostonHousing)
'data.frame': 506 obs. of 14 variables:
$ crim : num 0.00632 0.02731 0.02729 0.03237 0.06905 ...
$ zn : num 18 0 0 0 0 0 12.5 12.5 12.5 12.5 ...
$ indus : num 2.31 7.07 7.07 2.18 2.18 2.18 7.87 7.87 7.87 7.87 ...
$ chas : Factor w/ 2 levels "0","1": 1 1 1 1 1 1 1 1 1 1 ...
$ nox : num 0.538 0.469 0.469 0.458 0.458 0.458 0.524 0.524 0.524 0.524 ...
$ rm : num 6.58 6.42 7.18 7 7.15 ...
$ age : num 65.2 78.9 61.1 45.8 54.2 58.7 66.6 96.1 100 85.9 ...
$ dis : num 4.09 4.97 4.97 6.06 6.06 ...
$ rad : num 1 2 2 3 3 3 5 5 5 5 ...
$ tax : num 296 242 242 222 222 222 311 311 311 311 ...
$ ptratio: num 15.3 17.8 17.8 18.7 18.7 18.7 15.2 15.2 15.2 15.2 ...
$ b : num 397 397 393 395 397 ...
$ lstat : num 4.98 9.14 4.03 2.94 5.33 ...
$ medv : num 24 21.6 34.7 33.4 36.2 28.7 22.9 27.1 16.5 18.9 ...
We train the model, excluding the categorical chas:
model = train(medv ~ .,data=BostonHousing[,-4],method="nnet",
trControl=trainControl(method="cv",number=10),
tuneGrid=data.frame(size=c(5,10,20),decay=0.1))
lekprofile(model)
You can see the y-axis is meant to be continuous. If we instead discretize the response variable medv, you can see it crashes:
BostonHousing$medv = cut(BostonHousing$medv,4)
model = train(medv ~ .,data=BostonHousing[,-4],method="nnet",
trControl=trainControl(method="cv",number=10),
tuneGrid=data.frame(size=c(5,10,20),decay=0.1))
lekprofile(model)
Error in `[.data.frame`(preds, , ysel, drop = FALSE) :
undefined columns selected

How to normalize all variables in an R dataframe (except for the one variable that's a factor)

I'm having difficulty applying a max-min normalize function to the predictor variables (30 of them) in my data frame without excluding the diagnosis variable (which is a factor and not subject to the function) from the data frame.
```{r}
cancer_data <- as.data.frame(lapply(cancer_data, normalize))
```
This won't run because it raises an error referencing the factor column, but I don't want the new data frame to be created without that column. I would just like to apply the normalize function I created to the 30 predictor variables.
Here is the structure of my data frame if it provides helpful context at all:
str(cancer_data)
## 'data.frame': 569 obs. of 31 variables:
## $ diagnosis : Factor w/ 2 levels "Benign","Malignant": 1 1 1 1 1 1 1 2 1 1 ...
## $ radius_mean : num 12.3 10.6 11 11.3 15.2 ...
## $ texture_mean : num 12.4 18.9 16.8 13.4 13.2 ...
## $ perimeter_mean : num 78.8 69.3 70.9 73 97.7 ...
## $ area_mean : num 464 346 373 385 712 ...
## $ smoothness_mean : num 0.1028 0.0969 0.1077 0.1164 0.0796 ...
## $ compactness_mean : num 0.0698 0.1147 0.078 0.1136 0.0693 ...
## $ concavity_mean : num 0.0399 0.0639 0.0305 0.0464 0.0339 ...
## $ points_mean : num 0.037 0.0264 0.0248 0.048 0.0266 ...
## $ symmetry_mean : num 0.196 0.192 0.171 0.177 0.172 ...
## $ dimension_mean : num 0.0595 0.0649 0.0634 0.0607 0.0554 ...
## $ radius_se : num 0.236 0.451 0.197 0.338 0.178 ...
## $ texture_se : num 0.666 1.197 1.387 1.343 0.412 ...
## $ perimeter_se : num 1.67 3.43 1.34 1.85 1.34 ...
## $ area_se : num 17.4 27.1 13.5 26.3 17.7 ...
## $ smoothness_se : num 0.00805 0.00747 0.00516 0.01127 0.00501 ...
## $ compactness_se : num 0.0118 0.03581 0.00936 0.03498 0.01485 ...
## $ concavity_se : num 0.0168 0.0335 0.0106 0.0219 0.0155 ...
## $ points_se : num 0.01241 0.01365 0.00748 0.01965 0.00915 ...
## $ symmetry_se : num 0.0192 0.035 0.0172 0.0158 0.0165 ...
## $ dimension_se : num 0.00225 0.00332 0.0022 0.00344 0.00177 ...
## $ radius_worst : num 13.5 11.9 12.4 11.9 16.2 ...
## $ texture_worst : num 15.6 22.9 26.4 15.8 15.7 ...
## $ perimeter_worst : num 87 78.3 79.9 76.5 104.5 ...
## $ area_worst : num 549 425 471 434 819 ...
## $ smoothness_worst : num 0.139 0.121 0.137 0.137 0.113 ...
## $ compactness_worst: num 0.127 0.252 0.148 0.182 0.174 ...
## $ concavity_worst : num 0.1242 0.1916 0.1067 0.0867 0.1362 ...
## $ points_worst : num 0.0939 0.0793 0.0743 0.0861 0.0818 ...
## $ symmetry_worst : num 0.283 0.294 0.3 0.21 0.249 ...
## $ dimension_worst : num 0.0677 0.0759 0.0788 0.0678 0.0677 ...
Assuming you already have a normalize function in your environment, you can find the numeric variables in your data and apply the function to the selected columns using lapply.
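If normalize is not yet defined, a common max-min definition rescaling to [0, 1] (an assumed implementation; the question's own function may differ) is:

```r
# max-min scaling: maps the smallest value to 0 and the largest to 1
normalize <- function(x) (x - min(x)) / (max(x) - min(x))
```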
cols <- sapply(cancer_data, is.numeric)
cancer_data[cols] <- lapply(cancer_data[cols], normalize)
Or, without creating cols:
cancer_data[] <- lapply(cancer_data, function(x)
if(is.numeric(x)) normalize(x) else x)
If you want to exclude only the 1st column, you can also use:
cancer_data[-1] <- lapply(cancer_data[-1], normalize)
This should work, but do look into tidymodels.
Thanks to akrun for the new, shorter answer.
library(tidyverse)
cancer_data <- cancer_data %>% mutate_if(negate(is.factor), normalize)

Error using IRMI imputation

I've been trying to use irmi from the VIM package to replace NAs.
My data looks something like this:
> str(sub_mex)
'data.frame': 21 obs. of 83 variables:
$ pH : num 7.2 7.4 7.4 7.36 7.2 7.82 7.67 7.73 7.79 7.7 ...
$ Cond : num 1152 1078 1076 1076 1018 ...
$ CO3 : num NA NA NA NA NA ...
$ Mg : num 25.8 24.9 24.3 24.8 23.4 ...
$ NO3 : num 49.7 25.6 27.1 39.6 52.8 ...
$ Cd : num 0.0088 0.0104 0.0085 0.0092 0.0086 ...
$ As_H : num 0.006 0.0059 0.0056 0.0068 0.0073 ...
$ As_F : num 0.0056 0.0058 0.0057 0.0066 0.0065 0.004 0.004 0.004 0.0048 0.0078 ...
$ As_FC : num NA NA NA NA NA NA NA NA NA 0.0028 ...
$ Pb : num 0.0097 0.0096 0.0092 0.01 0.0093 0.0275 0.024 0.0255 0.031 0.024 ...
$ Fe : num 0.39 0.26 0.27 0.28 0.32 0.135 0.08 NA 0.13 NA ...
$ No_EPT : int 0 0 0 0 0 0 0 0 0 0 ...
I've subset my sub_mex dataset to analyze some observations separately, so I have a sub_t dataset, which looks something like this:
> str(sub_t)
'data.frame': 5 obs. of 83 variables:
$ pH : num 7.82 7.67 7.73 7.79 7.7
$ CO3 : num 45 NA 37.2 41.9 40.3
$ Mg : num 41.3 51.4 47.7 51.8 53
$ NO3 : num 47.1 40.7 39.9 42.1 37.6
$ Cd : num 0.0173 0.0145 0.016 0.016 0.0154
$ As_H : num 0.00949 0.01009 0.00907 0.00972 0.00954
$ As_F : num 0.004 0.004 0.004 0.0048 0.0078
$ As_FC : num NA NA NA NA 0.0028
$ Pb : num 0.0275 0.024 0.0255 0.031 0.024
$ Fe : num 0.135 0.08 NA 0.13 NA
$ No_EPT : int 0 0 0 0 0
I impute NAs in the sub_mex dataset using:
imp_mexi <- irmi(sub_mex)
which works fine. However, when I try to impute the subset sub_t, I get the following error message:
> imp_t <- irmi(sub_t)
Error in indexNA2s[, variable[j]] : subscript out of bounds
Does anyone have an idea of how to solve this? I want to impute my sub_t data, and I don't want to use a subset of the imp_mexi imputed dataset.
Any help will be deeply appreciated.
I had a similar issue and discovered that one of the columns in my data frame was entirely missing, hence the out-of-bounds error.
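Building on that diagnosis, a quick sketch (assuming sub_t as above) to detect and drop all-NA columns before calling irmi:

```r
# columns of sub_t that contain nothing but NA
all_na <- vapply(sub_t, function(x) all(is.na(x)), logical(1))
names(sub_t)[all_na]              # inspect which columns are empty
imp_t <- irmi(sub_t[, !all_na])   # impute without the empty columns
```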
