I'm trying to view a SAS transport file provided with the CMS HCC risk adjustment model software files (2020 Model Software (zip)).
I already tried opening it with SAS Universal Viewer, with no luck.
I also tried reading the file with library(haven) as follows.
sessionInfo()
R version 4.0.3 (2020-10-10)
Platform: x86_64-apple-darwin17.0 (64-bit)
Running under: macOS Catalina 10.15.7
setwd('~/Documents/MSSP/CMS-HCC software V2421.86.P1')
library(haven)
sasdata <- read_xpt("C2419P1M")
This gave me the following error.
Error: Failed to parse Documents/MSSP/CMS-HCC software V2421.86.P1/C2419P1M: Invalid file, or file has unsupported features.
According to its documentation, read_xpt() from haven should be able to read SAS transport files.
Hoping someone has a troubleshooting idea that I haven't come across.
The files are not SAS transport (XPORT) files. They are SAS CPORT files, created by PROC CPORT, so haven cannot read them.
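You can verify this in R before reaching for SAS: a genuine SAS V5 transport file starts with a fixed library header record in its first bytes, while a CPORT file does not. A minimal sketch (the file name comes from the question above):

```r
# A real V5 XPORT file begins with this exact signature; its absence
# explains why haven::read_xpt() rejects the file
sig   <- charToRaw("HEADER RECORD*******LIBRARY HEADER RECORD")
first <- readBin("C2419P1M", what = "raw", n = length(sig))
identical(first, sig)  # TRUE only for a genuine V5 transport file
```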
You seem to be asking about one data file that contains just one observation with 1,039 variables. Here is SAS code to convert it to a text file with 1,039 lines, each containing a variable name and its value.
%let path=C:\downloads\CMS-HCC;
libname out "&path";
/* Restore the CPORT file into a SAS data set */
proc cimport file="&path\C2419P1M" lib=out;
run;
/* Transpose the single wide observation into one row per variable */
proc transpose data=out.C2419P1M out=C2419P1M(rename=(col1=VALUE)) name=NAME;
run;
/* Write the NAME/VALUE pairs to a text file */
filename text temp;
data _null_;
  set C2419P1M;
  file text;
  put name value;
run;
Results:
CFA_F65_69 0.441
CFA_F70_74 0.519
CFA_F75_79 0.593
CFA_F80_84 0.716
CFA_F85_89 0.865
CFA_F90_94 0.987
CFA_F95_GT 1.041
CFA_M65_69 0.494
CFA_M70_74 0.6
CFA_M75_79 0.71
CFA_M80_84 0.803
CFA_M85_89 1
CFA_M90_94 1.142
CFA_M95_GT 1.267
CFA_OriginallyDisabled_Female 0.173
CFA_OriginallyDisabled_Male 0.182
CFA_HCC1 0.595
CFA_HCC2 0.453
CFA_HCC6 0.572
CFA_HCC8 2.566
CFA_HCC9 1.01
CFA_HCC10 0.717
CFA_HCC11 0.317
CFA_HCC12 0.158
CFA_HCC17 0.34
CFA_HCC18 0.34
CFA_HCC19 0.107
CFA_HCC21 0.693
CFA_HCC22 0.383
CFA_HCC23 0.211
CFA_HCC27 1.111
CFA_HCC28 0.411
CFA_HCC29 0.042
CFA_HCC33 0.258
CFA_HCC34 0.349
CFA_HCC35 0.275
CFA_HCC39 0.558
CFA_HCC40 0.371
CFA_HCC46 1.214
CFA_HCC47 0.452
CFA_HCC48 0.221
CFA_HCC54 0.538
CFA_HCC55 0.538
CFA_HCC56 0.538
CFA_HCC57 0.57
CFA_HCC58 0.57
CFA_HCC59 0.299
CFA_HCC60 0.299
CFA_HCC70 1.038
CFA_HCC71 0.921
CFA_HCC72 0.532
CFA_HCC73 1.101
CFA_HCC74 0
CFA_HCC75 0.407
CFA_HCC76 0.413
CFA_HCC77 0.742
CFA_HCC78 0.601
CFA_HCC79 0.237
CFA_HCC80 0.511
CFA_HCC82 2.183
CFA_HCC83 0.902
CFA_HCC84 0.492
CFA_HCC85 0.371
CFA_HCC86 0.377
CFA_HCC87 0.302
CFA_HCC88 0.034
CFA_HCC96 0.384
CFA_HCC99 0.38
CFA_HCC100 0.38
CFA_HCC103 0.487
CFA_HCC104 0.345
CFA_HCC106 1.724
CFA_HCC107 0.565
CFA_HCC108 0.294
CFA_HCC110 0.509
CFA_HCC111 0.43
CFA_HCC112 0.161
CFA_HCC114 0.641
CFA_HCC115 0.258
CFA_HCC122 0.271
CFA_HCC124 0.298
CFA_HCC134 0.683
CFA_HCC135 0.683
CFA_HCC136 0.26
CFA_HCC137 0.26
CFA_HCC138 0.017
CFA_HCC157 2.463
CFA_HCC158 1.471
CFA_HCC161 0.727
CFA_HCC162 0.162
CFA_HCC166 0.511
CFA_HCC167 0.144
CFA_HCC169 0.532
CFA_HCC170 0.409
CFA_HCC173 0.221
CFA_HCC176 0.68
CFA_HCC186 0.728
CFA_HCC188 0.742
CFA_HCC189 0.795
CFA_HCC51 0.453
CFA_HCC52 0.453
CFA_HCC159 0.863
CFA_HCC47_gCancer 0.853
CFA_DIABETES_CHF 0.192
CFA_CHF_gCopdCF 0.23
CFA_HCC85_gRenal_V24 0.187
CFA_gCopdCF_CARD_RESP_FAIL 0.528
CFA_HCC85_HCC96 0.138
CFA_D6 0.04
CFA_D7 0.057
CFA_D8 0.095
CFA_D9 0.156
CFA_D10P 0.373
CFD_F0_34 0.349
CFD_F35_44 0.349
CFD_F45_54 0.374
CFD_F55_59 0.434
CFD_F60_64 0.49
CFD_M0_34 0.24
CFD_M35_44 0.235
CFD_M45_54 0.307
CFD_M55_59 0.402
CFD_M60_64 0.526
CFD_HCC1 0.396
CFD_HCC2 0.53
CFD_HCC6 0.803
CFD_HCC8 2.801
CFD_HCC9 1.001
CFD_HCC10 0.756
CFD_HCC11 0.355
CFD_HCC12 0.212
CFD_HCC17 0.423
CFD_HCC18 0.423
CFD_HCC19 0.145
CFD_HCC21 0.723
CFD_HCC22 0.297
CFD_HCC23 0.299
CFD_HCC27 1.101
CFD_HCC28 0.365
CFD_HCC29 0.292
CFD_HCC33 0.538
CFD_HCC34 0.762
CFD_HCC35 0.551
CFD_HCC39 0.682
CFD_HCC40 0.328
CFD_HCC46 4.309
CFD_HCC47 0.691
CFD_HCC48 0.298
CFD_HCC54 0.896
CFD_HCC55 0.356
CFD_HCC56 0.348
CFD_HCC57 0.381
CFD_HCC58 0.231
CFD_HCC59 0.127
CFD_HCC60 0.1
CFD_HCC70 1
CFD_HCC71 0.957
CFD_HCC72 0.377
CFD_HCC73 1.245
CFD_HCC74 0
CFD_HCC75 0.404
CFD_HCC76 0.597
CFD_HCC77 0.789
CFD_HCC78 0.443
CFD_HCC79 0.139
CFD_HCC80 0.105
CFD_HCC82 1.465
CFD_HCC83 0.531
CFD_HCC84 0.531
CFD_HCC85 0.486
CFD_HCC86 0.425
CFD_HCC87 0.425
CFD_HCC88 0.152
CFD_HCC96 0.308
CFD_HCC99 0.486
CFD_HCC100 0.324
CFD_HCC103 0.296
CFD_HCC104 0.258
CFD_HCC106 1.748
CFD_HCC107 0.653
CFD_HCC108 0.267
CFD_HCC110 3.516
CFD_HCC111 0.331
CFD_HCC112 0.275
CFD_HCC114 0.375
CFD_HCC115 0
CFD_HCC122 0.269
CFD_HCC124 0.145
CFD_HCC134 0.594
CFD_HCC135 0.594
CFD_HCC136 0.323
CFD_HCC137 0.138
CFD_HCC138 0
CFD_HCC157 2.582
CFD_HCC158 1.38
CFD_HCC161 0.583
CFD_HCC162 0.308
CFD_HCC166 0.105
CFD_HCC167 0.025
CFD_HCC169 0.377
CFD_HCC170 0.469
CFD_HCC173 0.525
CFD_HCC176 0.982
CFD_HCC186 0.865
CFD_HCC188 0.77
CFD_HCC189 0.934
CFD_HCC51 0.256
CFD_HCC52 0.256
CFD_HCC159 0.467
CFD_HCC47_gCancer 0.679
CFD_DIABETES_CHF 0.043
CFD_CHF_gCopdCF 0.154
CFD_HCC85_gRenal_V24 0.461
CFD_gCopdCF_CARD_RESP_FAIL 0.455
CFD_HCC85_HCC96 0.361
CFD_gSubstanceUseDisorder_gPsych 0.191
CFD_D5 0.055
CFD_D6 0.167
CFD_D7 0.269
CFD_D8 0.424
CFD_D9 0.549
CFD_D10P 1.056
CPA_F65_69 0.359
CPA_F70_74 0.406
CPA_F75_79 0.476
CPA_F80_84 0.55
CPA_F85_89 0.653
CPA_F90_94 0.783
CPA_F95_GT 0.873
CPA_M65_69 0.37
CPA_M70_74 0.427
CPA_M75_79 0.5
CPA_M80_84 0.544
CPA_M85_89 0.659
CPA_M90_94 0.834
CPA_M95_GT 1.047
CPA_OriginallyDisabled_Female 0.136
CPA_OriginallyDisabled_Male 0.083
CPA_HCC1 0.482
CPA_HCC2 0.316
CPA_HCC6 0.318
CPA_HCC8 2.455
CPA_HCC9 1.001
CPA_HCC10 0.648
CPA_HCC11 0.33
CPA_HCC12 0.154
CPA_HCC17 0.326
CPA_HCC18 0.326
CPA_HCC19 0.087
CPA_HCC21 0.457
CPA_HCC22 0.233
CPA_HCC23 0.174
CPA_HCC27 0.729
CPA_HCC28 0.403
CPA_HCC29 0.181
CPA_HCC33 0.232
CPA_HCC34 0.371
CPA_HCC35 0.275
CPA_HCC39 0.443
CPA_HCC40 0.347
CPA_HCC46 1.234
CPA_HCC47 0.674
CPA_HCC48 0.186
CPA_HCC54 0.372
CPA_HCC55 0.372
CPA_HCC56 0.372
CPA_HCC57 0.495
CPA_HCC58 0.449
CPA_HCC59 0.306
CPA_HCC60 0.255
CPA_HCC70 1
CPA_HCC71 1
CPA_HCC72 0.512
CPA_HCC73 0.687
CPA_HCC74 0.114
CPA_HCC75 0.287
CPA_HCC76 0
CPA_HCC77 0.276
CPA_HCC78 0.536
CPA_HCC79 0.257
CPA_HCC80 0.729
CPA_HCC82 0.836
CPA_HCC83 0.361
CPA_HCC84 0.361
CPA_HCC85 0.336
CPA_HCC86 0.293
CPA_HCC87 0.276
CPA_HCC88 0.149
CPA_HCC96 0.264
CPA_HCC99 0.23
CPA_HCC100 0.23
CPA_HCC103 0.438
CPA_HCC104 0.3
CPA_HCC106 1.504
CPA_HCC107 0.463
CPA_HCC108 0.297
CPA_HCC110 0.392
CPA_HCC111 0.358
CPA_HCC112 0.2
CPA_HCC114 0.514
CPA_HCC115 0.093
CPA_HCC122 0.182
CPA_HCC124 0.393
CPA_HCC134 0.446
CPA_HCC135 0.446
CPA_HCC136 0.28
CPA_HCC137 0.28
CPA_HCC138 0.043
CPA_HCC157 2.028
CPA_HCC158 1.162
CPA_HCC161 0.541
CPA_HCC162 0
CPA_HCC166 0.729
CPA_HCC167 0.034
CPA_HCC169 0.512
CPA_HCC170 0.354
CPA_HCC173 0.176
CPA_HCC176 0.52
CPA_HCC186 0.438
CPA_HCC188 0.52
CPA_HCC189 0.697
CPA_HCC51 0.42
CPA_HCC52 0.42
CPA_HCC159 0.649
CPA_HCC47_gCancer 0.656
CPA_DIABETES_CHF 0.113
CPA_CHF_gCopdCF 0.158
CPA_HCC85_gRenal_V24 0.186
CPA_gCopdCF_CARD_RESP_FAIL 0.392
CPA_HCC85_HCC96 0.101
CPA_D5 0.037
CPA_D6 0.071
CPA_D7 0.08
CPA_D8 0.125
CPA_D9 0.402
CPA_D10P 0.548
CPD_F0_34 0.383
CPD_F35_44 0.414
CPD_F45_54 0.418
CPD_F55_59 0.414
CPD_F60_64 0.412
CPD_M0_34 0.389
CPD_M35_44 0.282
CPD_M45_54 0.313
CPD_M55_59 0.34
CPD_M60_64 0.373
CPD_HCC1 0.2
CPD_HCC2 0.297
CPD_HCC6 0.658
CPD_HCC8 2.659
CPD_HCC9 0.88
CPD_HCC10 0.667
CPD_HCC11 0.351
CPD_HCC12 0.181
CPD_HCC17 0.373
CPD_HCC18 0.373
CPD_HCC19 0.122
CPD_HCC21 0.679
CPD_HCC22 0.204
CPD_HCC23 0.319
CPD_HCC27 0.887
CPD_HCC28 0.341
CPD_HCC29 0.238
CPD_HCC33 0.552
CPD_HCC34 0.597
CPD_HCC35 0.543
CPD_HCC39 0.435
CPD_HCC40 0.264
CPD_HCC46 4.138
CPD_HCC47 0.594
CPD_HCC48 0.33
CPD_HCC54 0.679
CPD_HCC55 0.275
CPD_HCC56 0.275
CPD_HCC57 0.309
CPD_HCC58 0.239
CPD_HCC59 0.109
CPD_HCC60 0.065
CPD_HCC70 1.134
CPD_HCC71 0.933
CPD_HCC72 0.336
CPD_HCC73 0.933
CPD_HCC74 0
CPD_HCC75 0.314
CPD_HCC76 0.286
CPD_HCC77 0.46
CPD_HCC78 0.43
CPD_HCC79 0.169
CPD_HCC80 0.134
CPD_HCC82 0.769
CPD_HCC83 0.769
CPD_HCC84 0.343
CPD_HCC85 0.422
CPD_HCC86 0.379
CPD_HCC87 0.379
CPD_HCC88 0.149
CPD_HCC96 0.281
CPD_HCC99 0.163
CPD_HCC100 0.163
CPD_HCC103 0.31
CPD_HCC104 0.164
CPD_HCC106 1.525
CPD_HCC107 0.45
CPD_HCC108 0.314
CPD_HCC110 3.051
CPD_HCC111 0.267
CPD_HCC112 0.229
CPD_HCC114 0.198
CPD_HCC115 0.082
CPD_HCC122 0.201
CPD_HCC124 0.158
CPD_HCC134 0.48
CPD_HCC135 0.48
CPD_HCC136 0.261
CPD_HCC137 0.039
CPD_HCC138 0
CPD_HCC157 2.512
CPD_HCC158 0.925
CPD_HCC161 0.542
CPD_HCC162 0.324
CPD_HCC166 0.134
CPD_HCC167 0.019
CPD_HCC169 0.336
CPD_HCC170 0.333
CPD_HCC173 0.18
CPD_HCC176 0.832
CPD_HCC186 0.613
CPD_HCC188 0.732
CPD_HCC189 0.626
CPD_HCC51 0.257
CPD_HCC52 0.257
CPD_HCC159 0.824
CPD_HCC47_gCancer 0.601
CPD_DIABETES_CHF 0
CPD_CHF_gCopdCF 0.141
CPD_HCC85_gRenal_V24 0.382
CPD_gCopdCF_CARD_RESP_FAIL 0.479
CPD_HCC85_HCC96 0.303
CPD_gSubstanceUseDisorder_gPsych 0.201
CPD_D5 0.083
CPD_D6 0.117
CPD_D7 0.291
CPD_D8 0.452
CPD_D9 0.499
CPD_D10P 0.893
CND_F0_34 0.241
CND_F35_44 0.315
CND_F45_54 0.348
CND_F55_59 0.379
CND_F60_64 0.428
CND_M0_34 0.156
CND_M35_44 0.199
CND_M45_54 0.241
CND_M55_59 0.287
CND_M60_64 0.33
CND_HCC1 0.287
CND_HCC2 0.414
CND_HCC6 0.74
CND_HCC8 2.714
CND_HCC9 0.91
CND_HCC10 0.663
CND_HCC11 0.345
CND_HCC12 0.212
CND_HCC17 0.351
CND_HCC18 0.351
CND_HCC19 0.124
CND_HCC21 0.674
CND_HCC22 0.183
CND_HCC23 0.378
CND_HCC27 1.065
CND_HCC28 0.334
CND_HCC29 0.314
CND_HCC33 0.503
CND_HCC34 0.58
CND_HCC35 0.523
CND_HCC39 0.378
CND_HCC40 0.367
CND_HCC46 3.566
CND_HCC47 0.86
CND_HCC48 0.312
CND_HCC54 0.543
CND_HCC55 0.279
CND_HCC56 0.247
CND_HCC57 0.352
CND_HCC58 0.352
CND_HCC59 0.164
CND_HCC60 0.108
CND_HCC70 1.001
CND_HCC71 0.739
CND_HCC72 0.369
CND_HCC73 1.132
CND_HCC74 0.098
CND_HCC75 0.481
CND_HCC76 0.621
CND_HCC77 0.566
CND_HCC78 0.501
CND_HCC79 0.196
CND_HCC80 0.274
CND_HCC82 0.781
CND_HCC83 0.4
CND_HCC84 0.385
CND_HCC85 0.447
CND_HCC86 0.264
CND_HCC87 0.264
CND_HCC88 0.111
CND_HCC96 0.262
CND_HCC99 0.17
CND_HCC100 0.146
CND_HCC103 0.281
CND_HCC104 0.27
CND_HCC106 1.521
CND_HCC107 0.464
CND_HCC108 0.301
CND_HCC110 2.676
CND_HCC111 0.246
CND_HCC112 0.237
CND_HCC114 0.236
CND_HCC115 0
CND_HCC122 0.231
CND_HCC124 0.314
CND_HCC134 0.406
CND_HCC135 0.406
CND_HCC136 0.231
CND_HCC137 0.105
CND_HCC138 0.021
CND_HCC157 2.097
CND_HCC158 1.212
CND_HCC161 0.592
CND_HCC162 0.506
CND_HCC166 0.274
CND_HCC167 0
CND_HCC169 0.369
CND_HCC170 0.394
CND_HCC173 0.172
CND_HCC176 0.911
CND_HCC186 0.445
CND_HCC188 0.755
CND_HCC189 0.437
CND_HCC51 0.224
CND_HCC52 0.224
CND_HCC159 0.628
CND_HCC47_gCancer 0.46
CND_DIABETES_CHF 0.024
CND_CHF_gCopdCF 0.121
CND_HCC85_gRenal_V24 0.411
CND_gCopdCF_CARD_RESP_FAIL 0.379
CND_HCC85_HCC96 0.282
CND_gSubstanceUseDisorder_gPsych 0.138
CND_D5 0.043
CND_D6 0.131
CND_D7 0.201
CND_D8 0.441
CND_D9 0.441
CND_D10P 0.897
INS_F0_34 0.902
INS_F35_44 1.105
INS_F45_54 1.043
INS_F55_59 1.065
INS_F60_64 1.067
INS_F65_69 1.245
INS_F70_74 1.15
INS_F75_79 1.014
INS_F80_84 0.882
INS_F85_89 0.798
INS_F90_94 0.668
INS_F95_GT 0.501
INS_M0_34 1.101
INS_M35_44 1.002
INS_M45_54 0.965
INS_M55_59 1.017
INS_M60_64 1.061
INS_M65_69 1.288
INS_M70_74 1.329
INS_M75_79 1.317
INS_M80_84 1.207
INS_M85_89 1.122
INS_M90_94 0.989
INS_M95_GT 0.821
INS_LTIMCAID 0.061
INS_ORIGDS 0
INS_DISABLED_HCC85 0.279
INS_DISABLED_PRESSURE_ULCER 0.544
INS_DISABLED_HCC161 0.473
INS_DISABLED_HCC39 0.456
INS_DISABLED_HCC77 0.496
INS_DISABLED_HCC6 0.405
INS_CHF_gCopdCF 0.191
INS_gCopdCF_CARD_RESP_FAIL 0.414
INS_SEPSIS_PRESSURE_ULCER 0.155
INS_SEPSIS_ARTIF_OPENINGS 0.474
INS_ART_OPENINGS_PRESS_ULCER 0.359
INS_DIABETES_CHF 0.169
INS_gCopdCF_ASP_SPEC_B_PNEUM 0.216
INS_ASP_SPEC_B_PNEUM_PRES_ULC 0.472
INS_SEPSIS_ASP_SPEC_BACT_PNEUM 0.346
INS_SCHIZOPHRENIA_gCopdCF 0.417
INS_SCHIZOPHRENIA_CHF 0.127
INS_SCHIZOPHRENIA_SEIZURES 0.573
INS_HCC1 1.722
INS_HCC2 0.324
INS_HCC6 0.534
INS_HCC8 1.303
INS_HCC9 0.623
INS_HCC10 0.461
INS_HCC11 0.294
INS_HCC12 0.21
INS_HCC17 0.44
INS_HCC18 0.44
INS_HCC19 0.178
INS_HCC21 0.267
INS_HCC22 0.455
INS_HCC23 0.379
INS_HCC27 0.874
INS_HCC28 0.485
INS_HCC29 0.485
INS_HCC33 0.352
INS_HCC34 0.422
INS_HCC35 0.355
INS_HCC39 0.401
INS_HCC40 0.292
INS_HCC46 0.799
INS_HCC47 0.576
INS_HCC48 0.19
INS_HCC54 0.178
INS_HCC55 0.178
INS_HCC56 0.178
INS_HCC57 0.187
INS_HCC58 0.187
INS_HCC59 0.187
INS_HCC60 0
INS_HCC70 0.549
INS_HCC71 0.492
INS_HCC72 0.289
INS_HCC73 0.476
INS_HCC74 0
INS_HCC75 0.332
INS_HCC76 0.356
INS_HCC77 0
INS_HCC78 0.159
INS_HCC79 0.065
INS_HCC80 0
INS_HCC82 1.622
INS_HCC83 0.511
INS_HCC84 0.313
INS_HCC85 0.203
INS_HCC86 0.366
INS_HCC87 0.366
INS_HCC88 0.366
INS_HCC96 0.252
INS_HCC99 0.111
INS_HCC100 0.111
INS_HCC103 0
INS_HCC104 0
INS_HCC106 0.867
INS_HCC107 0.299
INS_HCC108 0.093
INS_HCC110 0.593
INS_HCC111 0.311
INS_HCC112 0.11
INS_HCC114 0.156
INS_HCC115 0.156
INS_HCC122 0.394
INS_HCC124 0.217
INS_HCC134 0.468
INS_HCC135 0.468
INS_HCC136 0.245
INS_HCC137 0.201
INS_HCC138 0.092
INS_HCC157 0.854
INS_HCC158 0.322
INS_HCC161 0.294
INS_HCC162 0
INS_HCC166 0
INS_HCC167 0
INS_HCC169 0.25
INS_HCC170 0
INS_HCC173 0.092
INS_HCC176 0.469
INS_HCC186 1.046
INS_HCC188 0.514
INS_HCC189 0.357
INS_HCC51 0
INS_HCC52 0
INS_HCC159 0.322
NE_NMCAID_NORIGDIS_NEF0_34 0.804
NE_NMCAID_NORIGDIS_NEF35_44 0.947
NE_NMCAID_NORIGDIS_NEF45_54 1.016
NE_NMCAID_NORIGDIS_NEF55_59 1.017
NE_NMCAID_NORIGDIS_NEF60_64 1.122
NE_NMCAID_NORIGDIS_NEF65 0.52
NE_NMCAID_NORIGDIS_NEF66 0.515
NE_NMCAID_NORIGDIS_NEF67 0.544
NE_NMCAID_NORIGDIS_NEF68 0.598
NE_NMCAID_NORIGDIS_NEF69 0.6
NE_NMCAID_NORIGDIS_NEF70_74 0.69
NE_NMCAID_NORIGDIS_NEF75_79 0.86
NE_NMCAID_NORIGDIS_NEF80_84 1.014
NE_NMCAID_NORIGDIS_NEF85_89 1.293
NE_NMCAID_NORIGDIS_NEF90_94 1.293
NE_NMCAID_NORIGDIS_NEF95_GT 1.293
NE_NMCAID_NORIGDIS_NEM0_34 0.442
NE_NMCAID_NORIGDIS_NEM35_44 0.657
NE_NMCAID_NORIGDIS_NEM45_54 0.864
NE_NMCAID_NORIGDIS_NEM55_59 0.904
NE_NMCAID_NORIGDIS_NEM60_64 0.921
NE_NMCAID_NORIGDIS_NEM65 0.518
NE_NMCAID_NORIGDIS_NEM66 0.533
NE_NMCAID_NORIGDIS_NEM67 0.582
NE_NMCAID_NORIGDIS_NEM68 0.626
NE_NMCAID_NORIGDIS_NEM69 0.69
NE_NMCAID_NORIGDIS_NEM70_74 0.786
NE_NMCAID_NORIGDIS_NEM75_79 1.06
NE_NMCAID_NORIGDIS_NEM80_84 1.247
NE_NMCAID_NORIGDIS_NEM85_89 1.498
NE_NMCAID_NORIGDIS_NEM90_94 1.498
NE_NMCAID_NORIGDIS_NEM95_GT 1.498
NE_MCAID_NORIGDIS_NEF0_34 0.969
NE_MCAID_NORIGDIS_NEF35_44 1.202
NE_MCAID_NORIGDIS_NEF45_54 1.306
NE_MCAID_NORIGDIS_NEF55_59 1.307
NE_MCAID_NORIGDIS_NEF60_64 1.408
NE_MCAID_NORIGDIS_NEF65 0.993
NE_MCAID_NORIGDIS_NEF66 0.897
NE_MCAID_NORIGDIS_NEF67 0.92
NE_MCAID_NORIGDIS_NEF68 0.951
NE_MCAID_NORIGDIS_NEF69 0.951
NE_MCAID_NORIGDIS_NEF70_74 0.985
NE_MCAID_NORIGDIS_NEF75_79 1.134
NE_MCAID_NORIGDIS_NEF80_84 1.353
NE_MCAID_NORIGDIS_NEF85_89 1.536
NE_MCAID_NORIGDIS_NEF90_94 1.701
NE_MCAID_NORIGDIS_NEF95_GT 1.701
NE_MCAID_NORIGDIS_NEM0_34 0.734
NE_MCAID_NORIGDIS_NEM35_44 1.059
NE_MCAID_NORIGDIS_NEM45_54 1.353
NE_MCAID_NORIGDIS_NEM55_59 1.418
NE_MCAID_NORIGDIS_NEM60_64 1.551
NE_MCAID_NORIGDIS_NEM65 1.144
NE_MCAID_NORIGDIS_NEM66 1.094
NE_MCAID_NORIGDIS_NEM67 1.151
NE_MCAID_NORIGDIS_NEM68 1.202
NE_MCAID_NORIGDIS_NEM69 1.202
NE_MCAID_NORIGDIS_NEM70_74 1.298
NE_MCAID_NORIGDIS_NEM75_79 1.407
NE_MCAID_NORIGDIS_NEM80_84 1.555
NE_MCAID_NORIGDIS_NEM85_89 1.777
NE_MCAID_NORIGDIS_NEM90_94 1.777
NE_MCAID_NORIGDIS_NEM95_GT 1.777
NE_NMCAID_ORIGDIS_NEF65 1.122
NE_NMCAID_ORIGDIS_NEF66 1.174
NE_NMCAID_ORIGDIS_NEF67 1.174
NE_NMCAID_ORIGDIS_NEF68 1.174
NE_NMCAID_ORIGDIS_NEF69 1.174
NE_NMCAID_ORIGDIS_NEF70_74 1.174
NE_NMCAID_ORIGDIS_NEF75_79 1.174
NE_NMCAID_ORIGDIS_NEF80_84 1.174
NE_NMCAID_ORIGDIS_NEF85_89 1.293
NE_NMCAID_ORIGDIS_NEF90_94 1.293
NE_NMCAID_ORIGDIS_NEF95_GT 1.293
NE_NMCAID_ORIGDIS_NEM65 0.921
NE_NMCAID_ORIGDIS_NEM66 1.071
NE_NMCAID_ORIGDIS_NEM67 1.123
NE_NMCAID_ORIGDIS_NEM68 1.123
NE_NMCAID_ORIGDIS_NEM69 1.32
NE_NMCAID_ORIGDIS_NEM70_74 1.408
NE_NMCAID_ORIGDIS_NEM75_79 1.408
NE_NMCAID_ORIGDIS_NEM80_84 1.408
NE_NMCAID_ORIGDIS_NEM85_89 1.498
NE_NMCAID_ORIGDIS_NEM90_94 1.498
NE_NMCAID_ORIGDIS_NEM95_GT 1.498
NE_MCAID_ORIGDIS_NEF65 1.462
NE_MCAID_ORIGDIS_NEF66 1.887
NE_MCAID_ORIGDIS_NEF67 1.887
NE_MCAID_ORIGDIS_NEF68 1.887
NE_MCAID_ORIGDIS_NEF69 1.887
NE_MCAID_ORIGDIS_NEF70_74 1.887
NE_MCAID_ORIGDIS_NEF75_79 1.887
NE_MCAID_ORIGDIS_NEF80_84 1.887
NE_MCAID_ORIGDIS_NEF85_89 1.887
NE_MCAID_ORIGDIS_NEF90_94 1.887
NE_MCAID_ORIGDIS_NEF95_GT 1.887
NE_MCAID_ORIGDIS_NEM65 1.811
NE_MCAID_ORIGDIS_NEM66 2.199
NE_MCAID_ORIGDIS_NEM67 2.199
NE_MCAID_ORIGDIS_NEM68 2.199
NE_MCAID_ORIGDIS_NEM69 2.199
NE_MCAID_ORIGDIS_NEM70_74 2.199
NE_MCAID_ORIGDIS_NEM75_79 2.199
NE_MCAID_ORIGDIS_NEM80_84 2.199
NE_MCAID_ORIGDIS_NEM85_89 2.199
NE_MCAID_ORIGDIS_NEM90_94 2.199
NE_MCAID_ORIGDIS_NEM95_GT 2.199
CFA_D1 0
CFA_D2 0
CFA_D3 0
CFA_D4 0
CFA_D5 0
CPA_D1 0
CPA_D2 0
CPA_D3 0
CPA_D4 0
CFD_D1 0
CFD_D2 0
CFD_D3 0
CFD_D4 0
CND_D1 0
CND_D2 0
CND_D3 0
CND_D4 0
CPD_D1 0
CPD_D2 0
CPD_D3 0
CPD_D4 0
CNA_F65_69 0.323
CNA_F70_74 0.386
CNA_F75_79 0.451
CNA_F80_84 0.528
CNA_F85_89 0.641
CNA_F90_94 0.783
CNA_F95_GT 0.787
CNA_M65_69 0.308
CNA_M70_74 0.394
CNA_M75_79 0.473
CNA_M80_84 0.556
CNA_M85_89 0.686
CNA_M90_94 0.841
CNA_M95_GT 0.986
CNA_OriginallyDisabled_Female 0.25
CNA_OriginallyDisabled_Male 0.147
CNA_HCC1 0.335
CNA_HCC2 0.352
CNA_HCC6 0.424
CNA_HCC8 2.659
CNA_HCC9 1.024
CNA_HCC10 0.675
CNA_HCC11 0.307
CNA_HCC12 0.15
CNA_HCC17 0.302
CNA_HCC18 0.302
CNA_HCC19 0.105
CNA_HCC21 0.455
CNA_HCC22 0.25
CNA_HCC23 0.194
CNA_HCC27 0.882
CNA_HCC28 0.363
CNA_HCC29 0.147
CNA_HCC33 0.219
CNA_HCC34 0.287
CNA_HCC35 0.308
CNA_HCC39 0.401
CNA_HCC40 0.421
CNA_HCC46 1.372
CNA_HCC47 0.665
CNA_HCC48 0.192
CNA_HCC54 0.329
CNA_HCC55 0.329
CNA_HCC56 0.329
CNA_HCC57 0.524
CNA_HCC58 0.393
CNA_HCC59 0.309
CNA_HCC60 0.309
CNA_HCC70 1.242
CNA_HCC71 1.068
CNA_HCC72 0.481
CNA_HCC73 0.999
CNA_HCC74 0.339
CNA_HCC75 0.472
CNA_HCC76 0.518
CNA_HCC77 0.423
CNA_HCC78 0.606
CNA_HCC79 0.22
CNA_HCC80 0.486
CNA_HCC82 1
CNA_HCC83 0.354
CNA_HCC84 0.282
CNA_HCC85 0.331
CNA_HCC86 0.195
CNA_HCC87 0.195
CNA_HCC88 0.135
CNA_HCC96 0.268
CNA_HCC99 0.23
CNA_HCC100 0.23
CNA_HCC103 0.437
CNA_HCC104 0.331
CNA_HCC106 1.488
CNA_HCC107 0.383
CNA_HCC108 0.288
CNA_HCC110 0.51
CNA_HCC111 0.335
CNA_HCC112 0.219
CNA_HCC114 0.517
CNA_HCC115 0.13
CNA_HCC122 0.222
CNA_HCC124 0.521
CNA_HCC134 0.435
CNA_HCC135 0.435
CNA_HCC136 0.289
CNA_HCC137 0.289
CNA_HCC138 0.069
CNA_HCC157 2.028
CNA_HCC158 1.069
CNA_HCC161 0.515
CNA_HCC162 0.224
CNA_HCC166 0.486
CNA_HCC167 0.077
CNA_HCC169 0.476
CNA_HCC170 0.35
CNA_HCC173 0.208
CNA_HCC176 0.582
CNA_HCC186 0.832
CNA_HCC188 0.534
CNA_HCC189 0.519
CNA_HCC51 0.346
CNA_HCC52 0.346
CNA_HCC159 0.656
CNA_HCC47_gCancer 0.838
CNA_DIABETES_CHF 0.121
CNA_CHF_gCopdCF 0.155
CNA_HCC85_gRenal_V24 0.156
CNA_gCopdCF_CARD_RESP_FAIL 0.363
CNA_HCC85_HCC96 0.085
CNA_D4 0.006
CNA_D5 0.042
CNA_D6 0.077
CNA_D7 0.126
CNA_D8 0.214
CNA_D9 0.258
CNA_D10P 0.505
CNA_D1 0
CNA_D2 0
CNA_D3 0
SNPNE_NMCAID_NORIGDIS_NEF0_34 1.513
SNPNE_NMCAID_NORIGDIS_NEF35_44 1.513
SNPNE_NMCAID_NORIGDIS_NEF45_54 1.513
SNPNE_NMCAID_NORIGDIS_NEF55_59 1.619
SNPNE_NMCAID_NORIGDIS_NEF60_64 1.686
SNPNE_NMCAID_NORIGDIS_NEF65 0.999
SNPNE_NMCAID_NORIGDIS_NEF66 0.999
SNPNE_NMCAID_NORIGDIS_NEF67 1.07
SNPNE_NMCAID_NORIGDIS_NEF68 1.108
SNPNE_NMCAID_NORIGDIS_NEF69 1.164
SNPNE_NMCAID_NORIGDIS_NEF70_74 1.31
SNPNE_NMCAID_NORIGDIS_NEF75_79 1.516
SNPNE_NMCAID_NORIGDIS_NEF80_84 1.746
SNPNE_NMCAID_NORIGDIS_NEF85_89 1.971
SNPNE_NMCAID_NORIGDIS_NEF90_94 2.161
SNPNE_NMCAID_NORIGDIS_NEF95_GT 2.161
SNPNE_NMCAID_NORIGDIS_NEM0_34 1.276
SNPNE_NMCAID_NORIGDIS_NEM35_44 1.276
SNPNE_NMCAID_NORIGDIS_NEM45_54 1.498
SNPNE_NMCAID_NORIGDIS_NEM55_59 1.63
SNPNE_NMCAID_NORIGDIS_NEM60_64 1.673
SNPNE_NMCAID_NORIGDIS_NEM65 0.98
SNPNE_NMCAID_NORIGDIS_NEM66 0.98
SNPNE_NMCAID_NORIGDIS_NEM67 1.02
SNPNE_NMCAID_NORIGDIS_NEM68 1.082
SNPNE_NMCAID_NORIGDIS_NEM69 1.14
SNPNE_NMCAID_NORIGDIS_NEM70_74 1.345
SNPNE_NMCAID_NORIGDIS_NEM75_79 1.581
SNPNE_NMCAID_NORIGDIS_NEM80_84 1.832
SNPNE_NMCAID_NORIGDIS_NEM85_89 2.095
SNPNE_NMCAID_NORIGDIS_NEM90_94 2.351
SNPNE_NMCAID_NORIGDIS_NEM95_GT 2.351
SNPNE_MCAID_NORIGDIS_NEF0_34 1.776
SNPNE_MCAID_NORIGDIS_NEF35_44 1.776
SNPNE_MCAID_NORIGDIS_NEF45_54 2.01
SNPNE_MCAID_NORIGDIS_NEF55_59 2.095
SNPNE_MCAID_NORIGDIS_NEF60_64 2.126
SNPNE_MCAID_NORIGDIS_NEF65 1.375
SNPNE_MCAID_NORIGDIS_NEF66 1.375
SNPNE_MCAID_NORIGDIS_NEF67 1.483
SNPNE_MCAID_NORIGDIS_NEF68 1.559
SNPNE_MCAID_NORIGDIS_NEF69 1.576
SNPNE_MCAID_NORIGDIS_NEF70_74 1.789
SNPNE_MCAID_NORIGDIS_NEF75_79 1.98
SNPNE_MCAID_NORIGDIS_NEF80_84 2.194
SNPNE_MCAID_NORIGDIS_NEF85_89 2.49
SNPNE_MCAID_NORIGDIS_NEF90_94 2.68
SNPNE_MCAID_NORIGDIS_NEF95_GT 2.68
SNPNE_MCAID_NORIGDIS_NEM0_34 1.533
SNPNE_MCAID_NORIGDIS_NEM35_44 1.533
SNPNE_MCAID_NORIGDIS_NEM45_54 1.854
SNPNE_MCAID_NORIGDIS_NEM55_59 2.041
SNPNE_MCAID_NORIGDIS_NEM60_64 2.167
SNPNE_MCAID_NORIGDIS_NEM65 1.525
SNPNE_MCAID_NORIGDIS_NEM66 1.525
SNPNE_MCAID_NORIGDIS_NEM67 1.646
SNPNE_MCAID_NORIGDIS_NEM68 1.646
SNPNE_MCAID_NORIGDIS_NEM69 1.646
SNPNE_MCAID_NORIGDIS_NEM70_74 1.967
SNPNE_MCAID_NORIGDIS_NEM75_79 2.14
SNPNE_MCAID_NORIGDIS_NEM80_84 2.272
SNPNE_MCAID_NORIGDIS_NEM85_89 2.63
SNPNE_MCAID_NORIGDIS_NEM90_94 2.63
SNPNE_MCAID_NORIGDIS_NEM95_GT 2.63
SNPNE_NMCAID_ORIGDIS_NEF65 1.81
SNPNE_NMCAID_ORIGDIS_NEF66 1.81
SNPNE_NMCAID_ORIGDIS_NEF67 1.834
SNPNE_NMCAID_ORIGDIS_NEF68 1.834
SNPNE_NMCAID_ORIGDIS_NEF69 1.834
SNPNE_NMCAID_ORIGDIS_NEF70_74 2.006
SNPNE_NMCAID_ORIGDIS_NEF75_79 2.112
SNPNE_NMCAID_ORIGDIS_NEF80_84 2.476
SNPNE_NMCAID_ORIGDIS_NEF85_89 2.476
SNPNE_NMCAID_ORIGDIS_NEF90_94 2.476
SNPNE_NMCAID_ORIGDIS_NEF95_GT 2.476
SNPNE_NMCAID_ORIGDIS_NEM65 1.664
SNPNE_NMCAID_ORIGDIS_NEM66 1.667
SNPNE_NMCAID_ORIGDIS_NEM67 1.725
SNPNE_NMCAID_ORIGDIS_NEM68 1.74
SNPNE_NMCAID_ORIGDIS_NEM69 1.797
SNPNE_NMCAID_ORIGDIS_NEM70_74 1.935
SNPNE_NMCAID_ORIGDIS_NEM75_79 2.073
SNPNE_NMCAID_ORIGDIS_NEM80_84 2.349
SNPNE_NMCAID_ORIGDIS_NEM85_89 2.349
SNPNE_NMCAID_ORIGDIS_NEM90_94 2.349
SNPNE_NMCAID_ORIGDIS_NEM95_GT 2.349
SNPNE_MCAID_ORIGDIS_NEF65 2.183
SNPNE_MCAID_ORIGDIS_NEF66 2.209
SNPNE_MCAID_ORIGDIS_NEF67 2.213
SNPNE_MCAID_ORIGDIS_NEF68 2.248
SNPNE_MCAID_ORIGDIS_NEF69 2.336
SNPNE_MCAID_ORIGDIS_NEF70_74 2.424
SNPNE_MCAID_ORIGDIS_NEF75_79 2.562
SNPNE_MCAID_ORIGDIS_NEF80_84 2.772
SNPNE_MCAID_ORIGDIS_NEF85_89 2.772
SNPNE_MCAID_ORIGDIS_NEF90_94 2.772
SNPNE_MCAID_ORIGDIS_NEF95_GT 2.772
SNPNE_MCAID_ORIGDIS_NEM65 2.173
SNPNE_MCAID_ORIGDIS_NEM66 2.173
SNPNE_MCAID_ORIGDIS_NEM67 2.179
SNPNE_MCAID_ORIGDIS_NEM68 2.179
SNPNE_MCAID_ORIGDIS_NEM69 2.179
SNPNE_MCAID_ORIGDIS_NEM70_74 2.419
SNPNE_MCAID_ORIGDIS_NEM75_79 2.509
SNPNE_MCAID_ORIGDIS_NEM80_84 2.805
SNPNE_MCAID_ORIGDIS_NEM85_89 2.805
SNPNE_MCAID_ORIGDIS_NEM90_94 2.805
SNPNE_MCAID_ORIGDIS_NEM95_GT 2.805
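To bring these NAME/VALUE pairs back into R, the text file written by the SAS step can be read with base R. A sketch, assuming the FILE= statement was pointed at a permanent path (here the hypothetical C2419P1M.txt) rather than the temporary fileref:

```r
# Read the two-column NAME/VALUE listing produced by the SAS data step
coefs <- read.table("C2419P1M.txt",
                    col.names = c("NAME", "VALUE"),
                    stringsAsFactors = FALSE)
# e.g. look up a single coefficient by name
coefs$VALUE[coefs$NAME == "CFA_F65_69"]
```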
Related
I am trying to create a structural equation model that tests the structure of the latent variables underlying a Big Five dataset found on Kaggle. More specifically, I would like to replicate a finding suggesting that common method variance (e.g., response biases) inflates the often-observed high intercorrelations between the manifest variables/items of the Big Five (Chang, Connelly, & Geeza, 2012).
big5_CFAmodel_cmv <-'EXTRA =~ EXT1 + EXT2 + EXT3 + EXT4 + EXT5 + EXT7 + EXT8 + EXT9 + EXT10
AGREE =~ AGR1 + AGR2 + AGR4 + AGR5 + AGR6 + AGR7 + AGR8 + AGR9 + AGR10
EMO =~ EST1 + EST2 + EST3 + EST5 + EST6 + EST7 + EST8 + EST9 + EST10
OPEN =~ OPN1 + OPN2 + OPN3 + OPN5 + OPN6 + OPN7 + OPN8 + OPN9 + OPN10
CON =~ CSN1 + CSN2 + CSN3 + CSN4 + CSN5 + CSN6 + CSN7 + CSN8 + CSN9
CMV =~ EXT1 + EXT2 + EXT3 + EXT4 + EXT5 + EXT7 + EXT8 + EXT9 + EXT10 + AGR1 + AGR2 + AGR4 + AGR5 + AGR6 + AGR7 + AGR8 + AGR9 + AGR10 + CSN1 + CSN2 + CSN3 + CSN4 + CSN5 + CSN6 + CSN7 + CSN8 + CSN9 + EST1 + EST2 + EST3 + EST5 + EST6 + EST7 + EST8 + EST9 + EST10 + OPN1 + OPN2 + OPN3 + OPN5 + OPN6 + OPN7 + OPN8 + OPN9 + OPN10 '
big5_CFA_cmv <- cfa(model = big5_CFAmodel_cmv,
data = big5, estimator = "MLR")
Here is my full code on Github. Now I get a warning from lavaan:
lavaan WARNING:
The variance-covariance matrix of the estimated parameters (vcov)
does not appear to be positive definite! The smallest eigenvalue
(= -4.921738e-07) is smaller than zero. This may be a symptom that
the model is not identified.
But when I run summary(big5_CFA_cmv, fit.measures = TRUE, standardized = TRUE, rsquare = TRUE), lavaan appeared to end normally and produced good fit statistics.
lavaan 0.6-8 ended normally after 77 iterations
Estimator ML
Optimization method NLMINB
Number of model parameters 150
Used Total
Number of observations 498 500
Model Test User Model:
Standard Robust
Test Statistic 2459.635 2262.490
Degrees of freedom 885 885
P-value (Chi-square) 0.000 0.000
Scaling correction factor 1.087
Yuan-Bentler correction (Mplus variant)
Model Test Baseline Model:
Test statistic 9934.617 8875.238
Degrees of freedom 990 990
P-value 0.000 0.000
Scaling correction factor 1.119
User Model versus Baseline Model:
Comparative Fit Index (CFI) 0.824 0.825
Tucker-Lewis Index (TLI) 0.803 0.805
Robust Comparative Fit Index (CFI) 0.830
Robust Tucker-Lewis Index (TLI) 0.810
Loglikelihood and Information Criteria:
Loglikelihood user model (H0) -31449.932 -31449.932
Scaling correction factor 1.208
for the MLR correction
Loglikelihood unrestricted model (H1) -30220.114 -30220.114
Scaling correction factor 1.105
for the MLR correction
Akaike (AIC) 63199.863 63199.863
Bayesian (BIC) 63831.453 63831.453
Sample-size adjusted Bayesian (BIC) 63355.347 63355.347
Root Mean Square Error of Approximation:
RMSEA 0.060 0.056
90 Percent confidence interval - lower 0.057 0.053
90 Percent confidence interval - upper 0.063 0.059
P-value RMSEA <= 0.05 0.000 0.000
Robust RMSEA 0.058
90 Percent confidence interval - lower 0.055
90 Percent confidence interval - upper 0.061
Standardized Root Mean Square Residual:
SRMR 0.061 0.061
Parameter Estimates:
Standard errors Sandwich
Information bread Observed
Observed information based on Hessian
Latent Variables:
Estimate Std.Err z-value P(>|z|) Std.lv Std.all
EXTRA =~
EXT1 1.000 0.455 0.372
EXT2 1.010 0.323 3.129 0.002 0.459 0.358
EXT3 0.131 0.301 0.434 0.664 0.059 0.049
EXT4 1.393 0.430 3.240 0.001 0.633 0.532
EXT5 0.706 0.168 4.188 0.000 0.321 0.263
EXT7 1.001 0.183 5.477 0.000 0.455 0.323
EXT8 1.400 0.545 2.570 0.010 0.637 0.513
EXT9 1.468 0.426 3.446 0.001 0.667 0.505
EXT10 1.092 0.335 3.258 0.001 0.497 0.387
AGREE =~
AGR1 1.000 0.616 0.486
AGR2 0.721 0.166 4.349 0.000 0.444 0.374
AGR4 1.531 0.205 7.479 0.000 0.944 0.848
AGR5 0.999 0.141 7.085 0.000 0.615 0.568
AGR6 1.220 0.189 6.464 0.000 0.752 0.661
AGR7 0.743 0.155 4.795 0.000 0.458 0.406
AGR8 0.836 0.126 6.614 0.000 0.515 0.502
AGR9 1.292 0.209 6.176 0.000 0.796 0.741
AGR10 0.423 0.124 3.409 0.001 0.261 0.258
EMO =~
EST1 1.000 0.856 0.669
EST2 0.674 0.063 10.626 0.000 0.577 0.485
EST3 0.761 0.059 12.831 0.000 0.651 0.580
EST5 0.646 0.081 7.970 0.000 0.552 0.444
EST6 0.936 0.069 13.542 0.000 0.801 0.661
EST7 1.256 0.128 9.805 0.000 1.075 0.880
EST8 1.298 0.131 9.888 0.000 1.111 0.883
EST9 0.856 0.071 11.997 0.000 0.733 0.602
EST10 0.831 0.085 9.744 0.000 0.711 0.545
OPEN =~
OPN1 1.000 0.593 0.518
OPN2 0.853 0.106 8.065 0.000 0.506 0.492
OPN3 1.064 0.205 5.186 0.000 0.631 0.615
OPN5 1.012 0.124 8.161 0.000 0.600 0.654
OPN6 1.039 0.204 5.085 0.000 0.616 0.553
OPN7 0.721 0.089 8.115 0.000 0.428 0.481
OPN8 0.981 0.077 12.785 0.000 0.582 0.474
OPN9 0.550 0.106 5.187 0.000 0.326 0.332
OPN10 1.269 0.200 6.332 0.000 0.753 0.772
CON =~
CSN1 1.000 0.779 0.671
CSN2 1.151 0.128 8.997 0.000 0.897 0.665
CSN3 0.567 0.068 8.336 0.000 0.442 0.437
CSN4 1.054 0.107 9.867 0.000 0.821 0.669
CSN5 0.976 0.083 11.749 0.000 0.760 0.593
CSN6 1.393 0.133 10.464 0.000 1.085 0.779
CSN7 0.832 0.082 10.175 0.000 0.648 0.583
CSN8 0.684 0.077 8.910 0.000 0.532 0.500
CSN9 0.938 0.075 12.535 0.000 0.730 0.574
CMV =~
EXT1 1.000 0.815 0.666
EXT2 1.074 0.091 11.863 0.000 0.875 0.683
EXT3 1.112 0.159 7.001 0.000 0.907 0.749
EXT4 0.992 0.090 11.067 0.000 0.809 0.679
EXT5 1.194 0.108 11.064 0.000 0.974 0.798
EXT7 1.253 0.069 18.133 0.000 1.021 0.725
EXT8 0.733 0.109 6.706 0.000 0.597 0.481
EXT9 0.857 0.105 8.136 0.000 0.698 0.529
EXT10 1.010 0.088 11.446 0.000 0.824 0.641
AGR1 0.047 0.142 0.328 0.743 0.038 0.030
AGR2 0.579 0.173 3.336 0.001 0.472 0.397
AGR4 -0.144 0.167 -0.859 0.390 -0.117 -0.105
AGR5 0.154 0.143 1.075 0.282 0.125 0.116
AGR6 -0.156 0.161 -0.971 0.332 -0.127 -0.112
AGR7 0.581 0.178 3.270 0.001 0.474 0.421
AGR8 0.224 0.123 1.820 0.069 0.183 0.178
AGR9 -0.043 0.145 -0.299 0.765 -0.035 -0.033
AGR10 0.540 0.137 3.935 0.000 0.440 0.436
CSN1 -0.109 0.143 -0.761 0.446 -0.089 -0.077
CSN2 -0.289 0.150 -1.931 0.054 -0.235 -0.175
CSN3 -0.064 0.114 -0.561 0.575 -0.052 -0.052
CSN4 0.041 0.166 0.246 0.806 0.033 0.027
CSN5 0.009 0.132 0.065 0.948 0.007 0.005
CSN6 -0.307 0.181 -1.694 0.090 -0.251 -0.180
CSN7 -0.206 0.132 -1.555 0.120 -0.168 -0.151
CSN8 0.102 0.137 0.741 0.459 0.083 0.078
CSN9 0.016 0.151 0.107 0.915 0.013 0.010
EST1 -0.063 0.167 -0.375 0.708 -0.051 -0.040
EST2 0.136 0.109 1.248 0.212 0.110 0.093
EST3 -0.103 0.165 -0.625 0.532 -0.084 -0.075
EST5 0.117 0.125 0.932 0.351 0.095 0.076
EST6 0.002 0.158 0.010 0.992 0.001 0.001
EST7 -0.253 0.239 -1.058 0.290 -0.206 -0.169
EST8 -0.216 0.243 -0.888 0.375 -0.176 -0.140
EST9 0.159 0.136 1.168 0.243 0.129 0.106
EST10 0.331 0.135 2.462 0.014 0.270 0.207
OPN1 -0.025 0.150 -0.169 0.866 -0.021 -0.018
OPN2 0.042 0.127 0.332 0.740 0.034 0.033
OPN3 -0.088 0.110 -0.799 0.424 -0.072 -0.070
OPN5 0.208 0.139 1.499 0.134 0.170 0.185
OPN6 -0.012 0.116 -0.102 0.919 -0.010 -0.009
OPN7 0.146 0.126 1.156 0.248 0.119 0.133
OPN8 -0.140 0.135 -1.036 0.300 -0.114 -0.093
OPN9 -0.074 0.103 -0.723 0.470 -0.060 -0.062
OPN10 0.035 0.138 0.250 0.802 0.028 0.029
Covariances:
Estimate Std.Err z-value P(>|z|) Std.lv Std.all
EXTRA ~~
AGREE -0.096 0.036 -2.692 0.007 -0.342 -0.342
EMO -0.089 0.050 -1.782 0.075 -0.228 -0.228
OPEN -0.013 0.025 -0.534 0.594 -0.048 -0.048
CON -0.060 0.042 -1.440 0.150 -0.170 -0.170
CMV -0.063 0.081 -0.783 0.434 -0.171 -0.171
AGREE ~~
EMO -0.003 0.057 -0.059 0.953 -0.006 -0.006
OPEN 0.068 0.040 1.712 0.087 0.186 0.186
CON 0.085 0.047 1.818 0.069 0.177 0.177
CMV 0.239 0.046 5.185 0.000 0.476 0.476
EMO ~~
OPEN 0.040 0.042 0.957 0.338 0.079 0.079
CON 0.229 0.050 4.542 0.000 0.343 0.343
CMV 0.250 0.066 3.810 0.000 0.358 0.358
OPEN ~~
CON 0.058 0.044 1.308 0.191 0.125 0.125
CMV 0.098 0.069 1.412 0.158 0.202 0.202
CON ~~
CMV 0.185 0.072 2.576 0.010 0.291 0.291
Variances:
Estimate Std.Err z-value P(>|z|) Std.lv Std.all
.EXT1 0.754 0.059 12.680 0.000 0.754 0.503
.EXT2 0.804 0.065 12.443 0.000 0.804 0.489
.EXT3 0.658 0.084 7.843 0.000 0.658 0.449
.EXT4 0.537 0.059 9.162 0.000 0.537 0.379
.EXT5 0.545 0.049 11.184 0.000 0.545 0.366
.EXT7 0.892 0.080 11.107 0.000 0.892 0.450
.EXT8 0.907 0.117 7.740 0.000 0.907 0.589
.EXT9 0.971 0.099 9.763 0.000 0.971 0.556
.EXT10 0.867 0.081 10.666 0.000 0.867 0.525
.AGR1 1.207 0.109 11.087 0.000 1.207 0.750
.AGR2 0.790 0.085 9.293 0.000 0.790 0.561
.AGR4 0.439 0.079 5.592 0.000 0.439 0.355
.AGR5 0.708 0.066 10.721 0.000 0.708 0.602
.AGR6 0.803 0.075 10.670 0.000 0.803 0.621
.AGR7 0.628 0.056 11.266 0.000 0.628 0.495
.AGR8 0.664 0.059 11.168 0.000 0.664 0.631
.AGR9 0.548 0.056 9.726 0.000 0.548 0.474
.AGR10 0.647 0.059 10.934 0.000 0.647 0.636
.EST1 0.935 0.080 11.644 0.000 0.935 0.571
.EST2 1.026 0.077 13.359 0.000 1.026 0.724
.EST3 0.869 0.070 12.409 0.000 0.869 0.689
.EST5 1.196 0.075 15.912 0.000 1.196 0.773
.EST6 0.826 0.067 12.380 0.000 0.826 0.562
.EST7 0.453 0.059 7.653 0.000 0.453 0.304
.EST8 0.457 0.065 7.044 0.000 0.457 0.289
.EST9 0.862 0.067 12.860 0.000 0.862 0.581
.EST10 0.986 0.074 13.395 0.000 0.986 0.579
.OPN1 0.964 0.098 9.828 0.000 0.964 0.735
.OPN2 0.792 0.070 11.309 0.000 0.792 0.750
.OPN3 0.670 0.085 7.903 0.000 0.670 0.635
.OPN5 0.413 0.039 10.466 0.000 0.413 0.490
.OPN6 0.866 0.099 8.780 0.000 0.866 0.696
.OPN7 0.574 0.048 11.944 0.000 0.574 0.725
.OPN8 1.181 0.094 12.627 0.000 1.181 0.784
.OPN9 0.863 0.083 10.424 0.000 0.863 0.894
.OPN10 0.376 0.051 7.358 0.000 0.376 0.395
.CSN1 0.774 0.079 9.836 0.000 0.774 0.574
.CSN2 1.082 0.099 10.961 0.000 1.082 0.595
.CSN3 0.837 0.072 11.594 0.000 0.837 0.820
.CSN4 0.817 0.067 12.117 0.000 0.817 0.542
.CSN5 1.063 0.077 13.728 0.000 1.063 0.646
.CSN6 0.856 0.089 9.613 0.000 0.856 0.442
.CSN7 0.850 0.065 13.025 0.000 0.850 0.688
.CSN8 0.817 0.057 14.298 0.000 0.817 0.721
.CSN9 1.079 0.077 13.982 0.000 1.079 0.667
EXTRA 0.207 0.141 1.467 0.142 1.000 1.000
AGREE 0.380 0.101 3.744 0.000 1.000 1.000
EMO 0.732 0.104 7.075 0.000 1.000 1.000
OPEN 0.352 0.098 3.603 0.000 1.000 1.000
CON 0.606 0.089 6.792 0.000 1.000 1.000
CMV 0.665 0.203 3.269 0.001 1.000 1.000
R-Square:
Estimate
EXT1 0.497
EXT2 0.511
EXT3 0.551
EXT4 0.621
EXT5 0.634
EXT7 0.550
EXT8 0.411
EXT9 0.444
EXT10 0.475
AGR1 0.250
AGR2 0.439
AGR4 0.645
AGR5 0.398
AGR6 0.379
AGR7 0.505
AGR8 0.369
AGR9 0.526
AGR10 0.364
EST1 0.429
EST2 0.276
EST3 0.311
EST5 0.227
EST6 0.438
EST7 0.696
EST8 0.711
EST9 0.419
EST10 0.421
OPN1 0.265
OPN2 0.250
OPN3 0.365
OPN5 0.510
OPN6 0.304
OPN7 0.275
OPN8 0.216
OPN9 0.106
OPN10 0.605
CSN1 0.426
CSN2 0.405
CSN3 0.180
CSN4 0.458
CSN5 0.354
CSN6 0.558
CSN7 0.312
CSN8 0.279
CSN9 0.333
However, there are some negative factor loadings on the common method variance factor. Additionally, extraversion seems to correlate negatively with cmv.
What does this mean? And can I trust the fit statistics or is my model misspecified?
First, let me clear up your misinterpretation of the warning message. It refers to the covariance matrix of the estimated parameters (i.e., vcov(big5_CFA_cmv), from which SEs are calculated as the square roots of the variances on the diagonal), not to the estimates themselves. Redundancy among estimates can indicate a lack of identification, which you can check empirically by saving the model-implied covariance matrix and fitting the same model to it:
MI_COV <- lavInspect(big5_CFA_cmv, "cov.ov")
summary(cfa(model = big5_CFAmodel_cmv,
            sample.cov = MI_COV,
            sample.nobs = nobs(big5_CFA_cmv)))
If your estimates change, that is evidence that your model is not identified. If the estimates remain the same, the empirical check is inconclusive (i.e., it might still not be identified, but the optimizer just found the same local solution that seemed stable enough to stop searching the parameter space; criteria for inferring convergence are not perfect).
Regarding your model specification, I doubt it is identified, because your CMV factor (on which all indicators load) is allowed to correlate with the trait factors (which are also allowed to correlate with each other). That contradicts the definition of a "method factor": something about the way the data were measured that has nothing to do with what one is attempting to measure. Even when traits are orthogonal to methods, empirical identification becomes tenuous when traits and/or methods are allowed to correlate among themselves. Multitrait-multimethod (MTMM) models are notorious for such problems, as are many bifactor models (which are typically one trait and many methods; your model resembles a bifactor model in reverse: one method and many traits).
What does this mean?
Your negative (and most of your positive) CMV loadings are not significant. Varying around 0 (in both directions) is consistent with the null hypothesis that they are zero. More noteworthy (and related to my concern above) is that the CMV loadings are significant for all EXT indicators, but only a few others (3 AGR and an EST indicator). The correlations between CMV and the traits really complicate the interpretation, as does using reference indicators. Before you interpret anything, I would recommend fixing all factor variances to 1 using std.lv=TRUE and making CMV orthogonal: EXTRA + AGREE + EMO + OPEN + CON ~~ 0*CMV.
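In lavaan model syntax, that re-specification might look like the following sketch (factor names are taken from your output; the indicator lists are abbreviated with "...", so this is illustrative rather than runnable):

# Sketch only -- fill in the full indicator lists from your original model
big5_CFAmodel_cmv_orth <- '
  EXTRA =~ EXT1 + EXT2 + ... + EXT10      # trait factors as before
  AGREE =~ AGR1 + AGR2 + ... + AGR10
  # (EMO, OPEN, CON defined likewise)
  CMV   =~ EXT1 + EXT2 + ... + CSN9       # method factor: all indicators load
  EXTRA + AGREE + EMO + OPEN + CON ~~ 0*CMV   # CMV orthogonal to all traits
'
# fit <- cfa(big5_CFAmodel_cmv_orth, data = yourData, std.lv = TRUE)

With std.lv=TRUE, all loadings are freely estimated and the latent variances are fixed to 1, so no reference indicators are needed.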
However, I still anticipate problems due to estimating so many model parameters with a relatively small sample of 500 (498 after listwise deletion). That is not nearly a large enough sample to expect 50*51/2 = 1275 (co)variances to be reliably estimated.
I keep getting the "class and train have different lengths" error when trying to use the knn model on my dataset.
newDF<- newDF[c(14, 1:13)]
newDF
str(newDF)
newDF1 <- newDF[c(2:11, 14)]
newDF1
df_train = newDF1[1:47385,]
dim(df_train)
df_test = newDF1[47386:59231,]
dim(df_test)
train_lbl <- newDF[1:47385,1]
test_lbl <- newDF[47386:59231,1]
dim(train_lbl)
install.packages("class")
library(class)
newDF_pred <- knn(train = df_train, test = df_test, cl = train_lbl, k = 245)
library(gmodels)  # provides CrossTable()
CrossTable(x = test_lbl, y = newDF_pred, prop.chisq = FALSE)
newDF is my entire dataset, while newDF1 contains only the columns of datatype "num".
Where is the issue and how can I fix it?
This is the data:
-10lgP Mass Length ppm m/z RT start end Intensity Sample 9 Precursor Id range
1 0.543 0.234 0.245 0.348 0.0310 0.543 0.234 0.245 0.348 0.0310 0.0254
2 0.198 0.476 0.499 0.348 0.588 0.198 0.476 0.499 0.348 0.588 0.0256
3 0.234 0.245 0.348 0.0310 0.543 0.234 0.245 0.348 0.0310 0.543 0.0255
4 0.476 0.499 0.348 0.588 0.198 0.476 0.499 0.348 0.588 0.198 0.0254
5 0.245 0.348 0.0310 0.543 0.234 0.245 0.348 0.0310 0.543 0.234 0.0254
6 0.499 0.348 0.588 0.198 0.476 0.499 0.348 0.588 0.198 0.476 0.0256
7 0.348 0.0310 0.543 0.234 0.245 0.348 0.0310 0.543 0.234 0.245 0.0255
8 0.348 0.588 0.198 0.476 0.499 0.348 0.588 0.198 0.476 0.499 0.0254
9 0.0310 0.543 0.234 0.245 0.348 0.0310 0.543 0.234 0.245 0.348 0.0254
10 0.588 0.198 0.476 0.499 0.348 0.588 0.198 0.476 0.499 0.348 0.0256
... with 59,221 more rows
The size for the class and train are as follows:
dim(train_lbl)
[1] 47385 1
dim(df_train)
[1] 47385 11
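That dim() output is the key clue: train_lbl has dimensions at all (47385 x 1), meaning it is a one-column data frame rather than the factor vector that knn() expects for cl (newDF is presumably a tibble, whose single-bracket indexing does not drop to a vector), hence the length mismatch. A minimal sketch of the likely fix on toy data; the column names here are hypothetical stand-ins for your newDF:

```r
library(class)  # ships with R; provides knn()

# Toy stand-in for newDF: one label column plus numeric predictors
set.seed(1)
dat <- data.frame(label = factor(sample(c("yes", "no"), 100, replace = TRUE)),
                  x1 = rnorm(100), x2 = rnorm(100))

df_train <- dat[1:80, c("x1", "x2")]
df_test  <- dat[81:100, c("x1", "x2")]

# The fix: extract the labels as a vector, not a one-column data frame.
# With a tibble, use drop = TRUE (or dat$label[1:80] / dat[["label"]][1:80]).
train_lbl <- dat[1:80, "label", drop = TRUE]

pred <- knn(train = df_train, test = df_test, cl = train_lbl, k = 5)
length(pred)  # one prediction per test row: 20
```

In your code that would be train_lbl <- newDF[1:47385, 1, drop = TRUE] (and likewise for test_lbl); afterwards dim(train_lbl) is NULL and length(train_lbl) is 47385, matching nrow(df_train).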
I am trying to make a 3D plot showing the relationship between x, y, and z, based on a sample program I found. However, as you can see in the picture, all the x and y points are clumped together. I need the x and y limits to be from about .35 to .5 and from .275 to .4, respectively. I think the problem may come from this step:
Cov <- matrix(c(3,3.5,0,3.5,10,0,0,0,1), 3,3)
I am not exactly sure what this line does.
Any help fixing this issue would be much appreciated.
data <- read.table(textConnection(
'x y z
0.461 0.348 5.42
0.429 0.343 4.99
0.457 0.336 5.22
0.402 0.332 4.5
0.438 0.331 4.66
0.426 0.33 4.69
0.43 0.329 4.83
0.398 0.329 4.41
0.43 0.326 4.74
0.426 0.326 4.71
0.442 0.325 4.81
0.433 0.322 4.72
0.407 0.322 4.14
0.405 0.322 4.43
0.394 0.322 4.07
0.384 0.321 4.03
0.432 0.32 4.64
0.417 0.319 4.47
0.409 0.319 4.48
0.443 0.317 4.59
0.41 0.317 4.23
0.417 0.316 4.14
0.421 0.316 4.46
0.408 0.316 4.42
0.405 0.315 4.2
0.4 0.312 4.17
0.426 0.307 4.15
0.395 0.304 4.03
0.384 0.301 3.77
0.39 0.299 4.23
0.414 0.324 4.44
0.359 0.314 3.54
0.421 0.307 4.4
0.415 0.325 4.62
0.398 0.321 4.25
0.38 0.306 3.84
0.394 0.312 3.95
0.401 0.325 4.16
0.432 0.315 4.55
0.42 0.328 4.28
0.437 0.315 4.5
0.412 0.322 4.47
0.396 0.307 4.08
0.413 0.326 4.12
0.384 0.31 3.78
0.393 0.307 4.04
0.399 0.305 4.3
0.4 0.312 4.22
0.421 0.323 4.72
0.395 0.312 4.28
0.382 0.303 3.86
0.396 0.323 4.3
0.385 0.3 4.01
0.411 0.311 4.05
0.406 0.326 4.3
0.394 0.321 3.99
0.406 0.314 3.98
0.413 0.325 4.64
0.457 0.34 5.5
0.403 0.321 4.34
0.376 0.302 3.8
0.36 0.305 3.54
0.422 0.311 4.35
0.369 0.316 3.91
0.385 0.3 3.79
0.398 0.31 4.07
0.365 0.296 3.67
0.389 0.317 4.13
0.445 0.327 4.66
0.426 0.331 4.67
0.383 0.309 3.88
0.376 0.314 4.02
0.406 0.322 4.77
0.406 0.333 4.43
0.378 0.317 3.98
0.397 0.311 4.01
0.389 0.324 4.41
0.364 0.308 3.88
0.38 0.307 3.91
0.381 0.32 4.5
0.363 0.302 3.82
0.404 0.33 4.21
0.342 0.292 3.3
0.376 0.3 3.91
0.388 0.311 4.1
0.369 0.32 3.82
0.367 0.317 3.78
0.375 0.314 3.93
0.414 0.323 4.46
0.393 0.321 4.23
0.391 0.323 4.23
0.402 0.321 4.25
0.431 0.313 4.6
0.446 0.349 5.27
0.392 0.3 3.72
0.378 0.302 3.69
0.391 0.327 4.31
0.41 0.327 4.6
0.418 0.323 4.36
0.434 0.346 4.91
0.375 0.299 3.77
0.379 0.315 4
0.414 0.329 4.52
0.396 0.326 4.01
0.335 0.293 3.17
0.398 0.311 3.95
0.38 0.312 3.79
0.366 0.306 3.82
0.376 0.307 4.01
0.419 0.327 4.73
0.384 0.306 3.77
0.396 0.313 3.91
0.378 0.308 3.81
0.39 0.306 3.85
0.381 0.32 3.88
0.401 0.332 4.83
0.408 0.329 4.29
0.412 0.323 4.48
0.411 0.318 4.4
0.398 0.313 4.05
0.418 0.328 4.53
0.389 0.32 4.32
0.417 0.311 4.4
0.415 0.315 4.53
0.378 0.302 3.78
0.422 0.318 4.62
0.411 0.315 4.13
0.381 0.324 4.12
0.436 0.33 4.68
0.422 0.335 4.48
0.371 0.302 3.6
0.4 0.317 4.17
0.433 0.332 4.73
0.374 0.317 3.93
0.382 0.308 3.76
0.437 0.325 4.79
0.39 0.325 4.33
0.386 0.316 4.01
0.453 0.337 4.96
0.404 0.31 4.4
0.4 0.317 4.22
0.395 0.304 4.02
0.38 0.319 4.02
0.369 0.296 3.82
0.397 0.327 4.43
0.421 0.338 4.72
0.394 0.317 4.3
0.446 0.334 4.99
0.407 0.309 4.42
0.428 0.322 4.51
0.413 0.322 4.51
0.387 0.308 3.96
0.413 0.316 4.37
0.461 0.349 5.4
0.401 0.314 4.04
0.388 0.319 4.04
0.408 0.326 4.54
0.396 0.317 4.35
0.41 0.329 4.54
0.434 0.34 4.86
0.388 0.318 3.86
0.374 0.311 3.8
0.415 0.329 4.51
0.402 0.313 4.12
0.375 0.322 4
0.425 0.325 4.45
0.36 0.306 3.82
0.391 0.335 4.43
0.444 0.343 5.35
0.369 0.311 3.98
0.395 0.323 4.4
0.368 0.309 3.77
0.349 0.305 3.66
0.348 0.292 3.43
0.368 0.303 3.52
0.425 0.341 4.7
0.402 0.322 4.36
0.46 0.34 5.28
0.413 0.317 4.59
0.383 0.309 3.88
0.416 0.325 4.4
0.401 0.339 4.56
0.386 0.316 3.78
0.451 0.339 5.05
0.401 0.32 4.23
0.42 0.332 4.64
0.436 0.338 4.88
0.378 0.322 3.99
0.425 0.336 4.75
0.415 0.335 4.64
0.403 0.321 4.44
0.362 0.303 3.77
0.399 0.331 4.17
0.39 0.311 4.2
0.379 0.322 4.12
0.424 0.335 4.63
0.422 0.341 4.82
0.383 0.314 4.05
0.436 0.35 5.3
0.378 0.324 4.09
0.413 0.332 4.77
0.373 0.304 3.62
0.371 0.317 4.1
0.339 0.298 3.17
0.408 0.321 4.3
0.402 0.332 4.54
0.403 0.333 4.95
0.419 0.338 4.86
0.454 0.312 4.66
0.39 0.318 4.04'), header = TRUE)
Mean <- c(0.402,0.319, 4.279)
Mean
Cov <- matrix(c(3,3.5,0,3.5,10,0,0,0,1), 3,3)
Cov
round(var(data),2)
round(Cov - var(data),2)
options(rgl.printRglwidget = TRUE)
open3d()
plot3d(data, box=TRUE,
xlab="x", ylab="y", zlab="z")
aspect3d("iso")
dataMean <- colMeans(data)
dataCov <- var(data)
plot3d( ellipse3d(Cov,centre=Mean, level=.9),
col="cyan", alpha=0.5, add = TRUE)][1]][1]
The Cov <- line is defining a covariance matrix. I'm assuming these values are leftover from the script you found? I notice that the Mean vector matches the column means of your dataset (colMeans(data)), so I'm hoping that all that needs to happen is to update the Cov matrix.
The following appeared to work for me (updated to dataCov and took out the aspect3d("iso") line) - hope it works on your machine too. BTW - there was a typo in one of your square brackets.
library(rgl)  # provides open3d(), plot3d(), and ellipse3d()
options(rgl.printRglwidget = TRUE)
open3d()
plot3d(data, box=TRUE,
xlab="x", ylab="y", zlab="z")
# aspect3d("iso") ## this makes it long and skinny and ugly
dataMean <- colMeans(data)
dataCov <- var(data)
plot3d(ellipse3d(dataCov,centre=Mean, level=.9),
col="cyan", alpha=0.5, add = TRUE)[[1]][1]
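As for what the Cov line does: matrix() fills its values column-major, so those nine numbers become a 3 x 3 covariance matrix with variances 3, 10, and 1 on the diagonal. Those values are left over from the sample program and are far larger than the spread of your x and y data, which is why the hard-coded ellipsoid dwarfed your point cloud:

```r
# The nine values are filled column by column into a 3 x 3 matrix
Cov <- matrix(c(3, 3.5, 0,
                3.5, 10, 0,
                0,   0,  1), nrow = 3, ncol = 3)
Cov
#      [,1] [,2] [,3]
# [1,]  3.0  3.5    0
# [2,]  3.5 10.0    0
# [3,]  0.0  0.0    1
diag(Cov)   # variances of x, y, z: 3 10 1
Cov[1, 2]   # covariance of x and y: 3.5
```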
I have the following correlation matrix.
AUS AUT CAN CHE DEU EU15 FRA GBR ITA JPN USA
AUS 1.000 0.058 0.476 0.313 0.111 0.277 0.184 0.296 0.202 0.192 0.267
AUT 0.058 1.000 0.254 0.658 0.749 0.761 0.626 0.387 0.460 0.410 0.278
CAN 0.476 0.254 1.000 0.390 0.321 0.534 0.377 0.538 0.391 0.231 0.746
CHE 0.313 0.658 0.390 1.000 0.604 0.706 0.610 0.310 0.565 0.437 0.305
DEU 0.111 0.749 0.321 0.604 1.000 0.859 0.620 0.387 0.472 0.520 0.369
EU15 0.277 0.761 0.534 0.706 0.859 1.000 0.808 0.682 0.713 0.601 0.531
FRA 0.184 0.626 0.377 0.610 0.620 0.808 1.000 0.467 0.553 0.444 0.357
GBR 0.296 0.387 0.538 0.310 0.387 0.682 0.467 1.000 0.324 0.407 0.591
ITA 0.202 0.460 0.391 0.565 0.472 0.713 0.553 0.324 1.000 0.492 0.315
JPN 0.192 0.410 0.231 0.437 0.520 0.601 0.444 0.407 0.492 1.000 0.321
USA 0.267 0.278 0.746 0.305 0.369 0.531 0.357 0.591 0.315 0.321 1.000
My goal: plot all the correlations in ascending order, assigning a different colour to the correlations involving USA.
I have managed to plot all the correlations like this:
x <- gdpcor
x[lower.tri(x)] <- NA
diag(x) <- NA
x <- as.vector(x)
x <- x[!is.na(x)]
x <- x[order(x)]
plot(x)
But I can't figure out how to assign a different colour for USA's correlations. Any ideas?
How about starting with something like:
x <- as.matrix(read.table(text="AUS AUT CAN CHE DEU EU15 FRA GBR ITA JPN USA
AUS 1.000 0.058 0.476 0.313 0.111 0.277 0.184 0.296 0.202 0.192 0.267
AUT 0.058 1.000 0.254 0.658 0.749 0.761 0.626 0.387 0.460 0.410 0.278
CAN 0.476 0.254 1.000 0.390 0.321 0.534 0.377 0.538 0.391 0.231 0.746
CHE 0.313 0.658 0.390 1.000 0.604 0.706 0.610 0.310 0.565 0.437 0.305
DEU 0.111 0.749 0.321 0.604 1.000 0.859 0.620 0.387 0.472 0.520 0.369
EU15 0.277 0.761 0.534 0.706 0.859 1.000 0.808 0.682 0.713 0.601 0.531
FRA 0.184 0.626 0.377 0.610 0.620 0.808 1.000 0.467 0.553 0.444 0.357
GBR 0.296 0.387 0.538 0.310 0.387 0.682 0.467 1.000 0.324 0.407 0.591
ITA 0.202 0.460 0.391 0.565 0.472 0.713 0.553 0.324 1.000 0.492 0.315
JPN 0.192 0.410 0.231 0.437 0.520 0.601 0.444 0.407 0.492 1.000 0.321
USA 0.267 0.278 0.746 0.305 0.369 0.531 0.357 0.591 0.315 0.321 1.000"))
x[lower.tri(x)] <- NA
diag(x) <- NA
df <- subset(as.data.frame(as.table(x), responseName = 'Corr'),!is.na(Corr))
df <- df[order(df$Corr),]
library(ggplot2)
ggplot(df, aes(x = 1:nrow(df), y = Corr, col = Var2 == 'USA')) + geom_point()
side note: In case you haven't tried it, check out library(corrplot) as a good way to visualize correlations. For example:
library(corrplot)
corrplot(x, is.corr = FALSE, method = 'square', diag = FALSE)
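If you would rather stay with base plot() as in your question, the same idea works by recording which entries involve the target country before flattening. A toy sketch, where a random 4 x 4 matrix stands in for gdpcor and country "D" for USA:

```r
set.seed(42)
m <- matrix(runif(16), 4, 4, dimnames = list(LETTERS[1:4], LETTERS[1:4]))
m[lower.tri(m, diag = TRUE)] <- NA           # keep each pair once

idx  <- which(!is.na(m), arr.ind = TRUE)     # row/col indices of kept entries
vals <- m[idx]
is_target <- rownames(m)[idx[, "row"]] == "D" |
             colnames(m)[idx[, "col"]] == "D"

o <- order(vals)
plot(vals[o], col = ifelse(is_target[o], "red", "black"), pch = 19,
     xlab = "rank", ylab = "correlation")
```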
I am learning time series from the book by Cowpertwait and Metcalfe, "Introductory Time Series with R". Here is the ebook link: http://unalmed.edu.co/~ndgirald/Archivos%20Lectura/Archivos%20curso%20Series%20EIO/notas%20series%20en%20r%20couperwait.pdf
I followed the book's code on page 20 to convert a data frame of 12 columns into a time series, aggregate the monthly global temperatures into yearly means, and then plot them, but it's not working.
Here is the error I got: Error in plotts(x = x, y = y, plot.type = plot.type, xy.labels = xy.labels, : cannot plot more than 10 series as "multiple"
Here is the code from the book, I have also included the dataset from their website
global<-read.table("Chapter01global.txt",header=F)
global.ts = ts(global, st=c(1856,1), end=c(2005,12), fr=12)
global.annual = aggregate(global.ts, FUN=mean)
plot(global.ts); plot(global.annual)
Here is what I don't understand:
dim(global) is 150 12, but
dim(global.ts) is 1800 12. Shouldn't they be the same dimension? Then
dim(global.annual) is 150 12. Shouldn't it be 150 1, since the mean is taken over all the months?
Also, I am assuming each column holds global temperatures for one month, since there are 150 rows, which corresponds to the number of years from 1856 to 2005.
Does anyone know how to fix the code?
Thanks
Here is the dataset:
-0.384 -0.457 -0.673 -0.344 -0.311 -0.071 -0.246 -0.235 -0.380 -0.418 -0.670 -0.386
-0.437 -0.150 -0.528 -0.692 -0.629 -0.363 -0.375 -0.328 -0.495 -0.646 -0.754 -0.137
-0.452 -1.031 -0.643 -0.328 -0.311 -0.263 -0.248 -0.274 -0.203 -0.121 -0.913 -0.197
-0.249 -0.041 -0.082 -0.172 -0.085 -0.278 -0.220 -0.132 -0.436 -0.234 -0.288 -0.486
-0.070 -0.526 -0.599 -0.420 -0.273 -0.063 -0.182 -0.256 -0.213 -0.326 -0.696 -0.813
-0.858 -0.415 -0.431 -0.443 -0.735 -0.169 -0.227 -0.131 -0.377 -0.375 -0.434 -0.209
-0.711 -0.817 -0.435 -0.232 -0.194 -0.322 -0.466 -0.623 -0.345 -0.382 -0.932 -0.768
0.263 -0.063 -0.379 -0.187 -0.320 -0.365 -0.510 -0.359 -0.291 -0.431 -0.362 -0.276
-0.834 -0.604 -0.516 -0.482 -0.399 -0.200 -0.138 -0.332 -0.394 -0.711 -0.507 -0.587
-0.125 -0.615 -0.597 -0.135 -0.152 -0.227 -0.153 -0.267 0.002 -0.358 -0.200 -0.324
0.095 -0.173 -0.396 -0.119 -0.533 0.078 -0.089 -0.282 -0.224 -0.364 -0.294 -0.368
-0.306 0.087 -0.665 -0.235 -0.452 -0.316 -0.267 -0.218 -0.073 -0.177 -0.317 -0.540
-0.718 -0.382 0.004 -0.281 -0.029 -0.113 0.067 -0.135 -0.234 -0.247 -0.517 -0.098
-0.159 0.241 -0.602 -0.232 -0.210 -0.372 -0.295 -0.228 -0.252 -0.581 -0.489 -0.487
-0.072 -0.462 -0.426 -0.246 -0.260 -0.242 -0.110 -0.216 -0.248 -0.352 -0.133 -0.729
-0.349 -0.528 -0.058 -0.157 -0.215 -0.216 -0.129 -0.218 -0.437 -0.505 -0.613 -0.643
-0.378 -0.385 -0.469 -0.163 -0.109 -0.117 -0.206 -0.146 -0.171 -0.269 -0.359 -0.385
-0.090 -0.397 -0.365 -0.414 -0.433 -0.187 -0.138 -0.183 -0.302 -0.541 -0.508 -0.397
-0.033 -0.407 -0.569 -0.573 -0.470 -0.300 -0.111 -0.342 -0.248 -0.476 -0.564 -0.425
-0.665 -0.673 -0.561 -0.457 -0.099 -0.162 -0.330 -0.233 -0.390 -0.463 -0.533 -0.493
-0.320 -0.361 -0.451 -0.407 -0.548 -0.387 -0.272 -0.239 -0.456 -0.465 -0.739 -0.812
-0.401 -0.085 -0.310 -0.442 -0.530 -0.155 -0.136 -0.165 -0.121 -0.157 -0.186 0.142
0.009 0.204 0.341 0.159 -0.125 -0.066 -0.134 -0.116 -0.113 -0.169 -0.274 -0.430
-0.244 -0.254 -0.145 -0.330 -0.279 -0.222 -0.166 -0.298 -0.220 -0.217 -0.537 -0.555
-0.108 -0.264 -0.300 -0.217 -0.334 -0.382 -0.280 -0.112 -0.274 -0.434 -0.557 -0.280
-0.506 -0.354 -0.231 -0.174 -0.020 -0.211 -0.029 -0.104 -0.221 -0.351 -0.497 -0.246
-0.070 -0.123 -0.070 -0.276 -0.266 -0.321 -0.221 -0.166 -0.252 -0.365 -0.489 -0.530
-0.480 -0.382 -0.437 -0.327 -0.317 -0.043 -0.146 -0.241 -0.335 -0.435 -0.348 -0.315
-0.402 -0.300 -0.362 -0.444 -0.345 -0.344 -0.243 -0.198 -0.290 -0.300 -0.559 -0.384
-0.551 -0.481 -0.343 -0.426 -0.386 -0.432 -0.268 -0.301 -0.241 -0.309 -0.311 -0.150
-0.340 -0.490 -0.339 -0.140 -0.061 -0.236 -0.138 -0.134 -0.202 -0.341 -0.362 -0.290
-0.439 -0.483 -0.363 -0.426 -0.262 -0.302 -0.151 -0.278 -0.259 -0.465 -0.365 -0.357
-0.629 -0.512 -0.556 -0.281 -0.301 -0.234 -0.265 -0.260 -0.139 -0.114 -0.218 -0.234
-0.195 -0.144 -0.142 -0.120 -0.036 -0.088 -0.200 -0.266 -0.270 -0.351 -0.436 -0.185
-0.347 -0.308 -0.334 -0.315 -0.454 -0.398 -0.406 -0.406 -0.466 -0.472 -0.596 -0.423
-0.583 -0.555 -0.422 -0.364 -0.256 -0.285 -0.312 -0.258 -0.167 -0.319 -0.537 -0.163
-0.477 -0.099 -0.453 -0.481 -0.407 -0.250 -0.397 -0.351 -0.232 -0.439 -0.571 -0.746
-1.032 -0.803 -0.386 -0.538 -0.518 -0.342 -0.185 -0.314 -0.300 -0.259 -0.368 -0.329
-0.422 -0.333 -0.370 -0.407 -0.435 -0.467 -0.342 -0.358 -0.500 -0.417 -0.474 -0.408
-0.567 -0.688 -0.465 -0.356 -0.353 -0.291 -0.384 -0.257 -0.205 -0.245 -0.274 -0.267
-0.241 -0.200 -0.412 -0.358 -0.096 -0.133 -0.145 -0.148 -0.155 -0.099 -0.311 -0.091
-0.290 -0.168 -0.323 -0.095 -0.011 -0.076 -0.111 -0.119 -0.105 -0.149 -0.414 -0.350
-0.053 -0.297 -0.770 -0.396 -0.398 -0.205 -0.269 -0.255 -0.258 -0.446 -0.461 -0.265
-0.247 -0.505 -0.518 -0.294 -0.250 -0.291 -0.173 -0.151 -0.115 -0.135 0.054 -0.404
-0.302 -0.269 -0.339 -0.250 -0.159 -0.067 -0.153 -0.173 -0.140 -0.038 -0.312 -0.089
-0.134 -0.255 -0.256 -0.144 -0.178 -0.149 -0.236 -0.238 -0.314 -0.325 -0.426 -0.436
-0.083 -0.149 -0.321 -0.423 -0.344 -0.329 -0.298 -0.353 -0.368 -0.449 -0.522 -0.544
-0.321 -0.136 -0.421 -0.445 -0.441 -0.473 -0.459 -0.492 -0.524 -0.532 -0.533 -0.575
-0.626 -0.546 -0.649 -0.561 -0.452 -0.403 -0.416 -0.373 -0.358 -0.350 -0.266 -0.330
-0.470 -0.679 -0.503 -0.549 -0.342 -0.288 -0.280 -0.265 -0.251 -0.342 -0.238 -0.233
-0.171 -0.319 -0.375 -0.182 -0.251 -0.226 -0.308 -0.238 -0.318 -0.300 -0.503 -0.312
-0.531 -0.531 -0.426 -0.604 -0.661 -0.556 -0.478 -0.459 -0.382 -0.363 -0.595 -0.483
-0.432 -0.465 -0.635 -0.515 -0.523 -0.379 -0.387 -0.446 -0.386 -0.516 -0.571 -0.490
-0.538 -0.514 -0.635 -0.572 -0.529 -0.444 -0.493 -0.285 -0.276 -0.291 -0.287 -0.516
-0.317 -0.435 -0.441 -0.404 -0.447 -0.460 -0.331 -0.386 -0.372 -0.468 -0.585 -0.640
-0.520 -0.612 -0.643 -0.666 -0.500 -0.466 -0.383 -0.372 -0.346 -0.409 -0.352 -0.273
-0.354 -0.260 -0.457 -0.347 -0.354 -0.251 -0.414 -0.507 -0.494 -0.573 -0.461 -0.399
-0.379 -0.464 -0.602 -0.471 -0.520 -0.516 -0.370 -0.302 -0.346 -0.394 -0.230 -0.133
-0.063 -0.194 -0.330 -0.368 -0.259 -0.198 -0.183 -0.270 -0.299 -0.140 -0.328 -0.360
-0.190 -0.004 -0.408 -0.050 -0.218 -0.104 -0.032 -0.159 -0.085 -0.265 -0.169 -0.248
-0.242 -0.237 -0.529 -0.339 -0.374 -0.433 -0.300 -0.250 -0.275 -0.310 -0.510 -0.658
-0.514 -0.785 -0.788 -0.503 -0.748 -0.374 -0.252 -0.319 -0.150 -0.411 -0.353 -0.787
-0.540 -0.557 -0.537 -0.541 -0.598 -0.324 -0.393 -0.417 -0.344 -0.137 -0.231 -0.269
-0.198 -0.157 -0.355 -0.104 -0.295 -0.232 -0.254 -0.289 -0.171 -0.305 -0.634 -0.484
-0.246 -0.414 -0.196 -0.275 -0.176 -0.265 -0.333 -0.238 -0.175 -0.304 -0.412 -0.443
-0.105 -0.242 -0.221 -0.180 -0.131 -0.149 -0.116 -0.304 -0.265 -0.197 -0.438 -0.238
-0.370 -0.350 -0.336 -0.321 -0.332 -0.311 -0.285 -0.285 -0.310 -0.332 -0.299 -0.325
-0.195 -0.463 -0.354 -0.452 -0.354 -0.262 -0.382 -0.433 -0.303 -0.224 -0.061 -0.058
-0.320 -0.308 -0.320 -0.406 -0.322 -0.265 -0.274 -0.287 -0.319 -0.330 -0.391 -0.565
-0.402 -0.291 -0.256 -0.263 -0.293 -0.282 -0.265 -0.172 -0.244 -0.328 -0.151 0.015
0.085 0.055 -0.011 -0.203 -0.235 -0.142 -0.195 -0.054 -0.122 -0.082 -0.215 -0.242
-0.256 -0.108 -0.347 -0.275 -0.278 -0.190 -0.135 -0.153 -0.114 -0.075 -0.205 -0.473
-0.128 -0.214 -0.400 -0.324 -0.259 -0.280 -0.187 -0.214 -0.170 -0.172 -0.133 -0.194
-0.430 -0.687 -0.388 -0.397 -0.389 -0.344 -0.349 -0.257 -0.260 -0.143 -0.093 -0.516
-0.354 -0.207 -0.188 -0.211 -0.233 -0.127 -0.105 -0.079 -0.130 -0.119 0.026 -0.095
0.007 -0.240 -0.122 -0.161 -0.163 -0.016 0.019 -0.061 -0.097 -0.061 -0.179 -0.084
0.126 -0.234 -0.307 -0.110 -0.207 -0.139 -0.110 -0.136 -0.030 -0.083 -0.232 -0.188
-0.276 -0.320 -0.322 -0.218 -0.195 -0.159 -0.132 -0.101 -0.188 -0.147 -0.313 -0.489
-0.221 -0.151 -0.371 -0.267 -0.090 -0.034 -0.060 -0.046 -0.141 -0.073 -0.056 -0.118
-0.286 0.171 -0.204 -0.302 -0.285 -0.184 -0.123 -0.122 -0.151 -0.036 -0.311 -0.216
-0.232 -0.366 -0.239 -0.236 -0.127 -0.086 0.019 -0.014 -0.068 -0.032 -0.046 -0.006
-0.154 0.080 -0.247 -0.167 -0.056 0.019 0.030 0.108 0.130 0.088 0.013 -0.128
0.032 0.062 0.147 0.150 0.053 0.032 0.124 0.053 0.124 0.197 0.096 -0.187
-0.042 -0.030 -0.218 -0.116 -0.035 0.094 0.028 0.044 -0.113 -0.207 -0.068 0.160
-0.402 -0.183 -0.184 -0.100 -0.122 0.016 0.134 -0.070 0.033 -0.077 -0.103 0.084
-0.040 0.037 -0.114 0.089 -0.050 0.054 0.066 0.028 -0.117 0.211 0.061 0.102
0.143 -0.104 -0.039 0.006 0.007 0.044 -0.084 -0.019 0.042 -0.018 -0.060 -0.124
-0.249 0.036 -0.221 -0.020 0.071 -0.063 0.082 -0.037 0.024 0.188 0.005 0.162
0.282 0.133 0.101 0.024 0.107 0.187 0.210 0.213 0.294 0.247 0.062 -0.011
0.036 0.037 0.003 0.161 -0.100 0.004 -0.018 0.301 0.107 0.134 -0.013 -0.189
0.084 0.094 -0.196 -0.002 -0.190 -0.241 -0.105 -0.197 -0.067 -0.041 -0.127 -0.383
-0.148 -0.186 -0.097 -0.071 -0.129 -0.064 -0.094 -0.086 -0.126 0.043 -0.014 -0.236
0.018 -0.147 -0.212 -0.091 0.005 -0.019 -0.140 -0.092 -0.093 -0.033 -0.096 -0.205
0.107 -0.201 -0.187 -0.043 -0.113 -0.150 -0.110 -0.043 -0.058 -0.060 -0.084 -0.230
-0.329 -0.274 -0.157 -0.209 -0.148 -0.119 -0.139 -0.165 -0.135 -0.170 -0.403 -0.237
-0.364 -0.462 -0.270 -0.157 -0.089 -0.053 -0.018 0.044 0.091 0.092 -0.036 0.106
0.106 0.069 -0.121 0.021 -0.007 0.008 0.003 0.018 0.018 -0.056 -0.218 -0.086
0.051 0.122 0.107 0.112 0.021 0.030 -0.039 0.015 0.047 0.056 -0.071 0.068
-0.221 -0.103 -0.186 -0.212 -0.254 -0.158 -0.254 -0.146 -0.107 -0.112 -0.021 -0.247
0.053 -0.169 -0.389 -0.290 -0.219 -0.188 -0.192 -0.075 -0.103 -0.115 -0.268 -0.320
-0.239 -0.379 -0.323 -0.326 -0.323 -0.261 -0.203 -0.242 -0.268 -0.195 -0.295 -0.248
-0.154 -0.103 -0.115 -0.066 0.013 0.050 -0.019 0.061 0.027 -0.002 0.083 0.168
0.275 0.174 0.065 0.046 0.045 0.001 0.049 0.017 -0.034 0.013 0.041 0.078
0.110 0.056 0.088 0.062 -0.010 0.053 0.033 -0.011 0.032 -0.061 -0.138 -0.055
-0.020 0.158 -0.296 -0.128 -0.133 0.009 0.006 0.009 0.042 -0.006 -0.118 0.136
0.028 0.134 0.055 0.069 0.077 0.066 -0.023 0.007 -0.023 -0.079 -0.049 -0.092
0.031 0.087 0.016 -0.020 -0.067 -0.062 -0.022 -0.017 -0.004 0.033 0.024 0.027
-0.042 0.166 -0.123 -0.053 -0.040 -0.021 0.061 0.100 0.085 0.155 0.110 0.016
-0.023 -0.158 -0.277 -0.246 -0.175 -0.185 -0.172 -0.255 -0.283 -0.342 -0.297 -0.414
-0.195 -0.290 -0.244 -0.273 -0.155 -0.129 -0.217 -0.127 -0.100 -0.055 -0.148 -0.078
-0.097 -0.080 -0.066 -0.132 -0.120 0.017 0.005 -0.057 -0.018 -0.119 -0.108 -0.218
-0.172 -0.269 -0.110 -0.094 0.019 -0.125 -0.112 -0.052 -0.096 0.071 -0.107 -0.134
-0.272 -0.227 0.015 -0.192 -0.209 -0.095 -0.068 -0.055 -0.053 -0.013 -0.050 -0.093
-0.178 -0.106 0.028 0.104 0.079 0.025 0.076 0.051 0.027 0.039 0.130 0.169
0.068 0.172 -0.040 0.043 -0.042 -0.013 -0.067 -0.079 -0.051 -0.096 -0.070 -0.215
-0.078 -0.309 -0.286 -0.236 -0.232 -0.246 -0.152 -0.129 -0.126 -0.148 -0.126 -0.206
-0.350 -0.286 -0.154 -0.065 -0.038 0.022 0.013 0.069 -0.002 0.052 0.016 0.173
0.187 0.263 0.237 0.137 0.129 0.128 0.074 0.028 -0.023 0.011 -0.071 -0.070
-0.301 -0.314 -0.194 -0.169 -0.146 -0.111 -0.065 -0.073 -0.140 -0.172 -0.174 -0.218
-0.077 -0.111 -0.092 -0.091 -0.038 -0.082 -0.082 -0.120 -0.066 -0.220 -0.264 -0.281
-0.197 -0.278 -0.391 -0.153 -0.258 -0.148 -0.156 -0.165 -0.129 -0.294 -0.166 -0.092
-0.112 0.088 0.140 0.122 0.060 0.123 0.051 0.019 0.067 0.017 0.140 -0.014
0.065 0.027 0.027 -0.030 -0.114 -0.130 -0.031 -0.120 -0.045 -0.092 0.011 -0.041
0.014 -0.121 0.006 -0.042 -0.023 0.078 0.036 0.114 0.128 0.159 0.138 0.334
0.129 0.148 0.047 0.138 0.203 0.126 0.067 0.056 0.040 0.032 0.162 0.056
0.304 0.202 0.227 0.159 0.067 0.077 0.042 0.090 0.072 0.035 0.074 0.254
-0.007 -0.025 -0.104 0.030 0.051 -0.021 -0.002 -0.011 0.077 0.039 -0.012 0.209
0.373 0.334 0.247 0.188 0.173 0.198 0.207 0.235 0.211 0.114 0.276 0.113
0.130 0.047 0.106 0.037 0.115 0.018 0.061 0.095 0.036 0.012 -0.078 -0.211
0.086 -0.093 0.010 -0.015 0.052 -0.033 -0.018 0.047 0.005 0.029 -0.016 0.086
0.198 0.185 0.120 0.109 0.102 0.116 0.055 0.053 0.076 0.077 -0.029 0.084
0.189 0.339 0.070 0.182 0.199 0.219 0.330 0.270 0.330 0.264 0.252 0.418
0.427 0.297 0.373 0.316 0.248 0.251 0.200 0.220 0.166 0.185 0.038 0.143
0.061 0.159 0.155 0.131 0.107 0.148 0.208 0.206 0.198 0.225 0.132 0.243
0.237 0.283 0.534 0.355 0.282 0.271 0.253 0.291 0.218 0.351 0.360 0.260
0.301 0.385 0.220 0.376 0.296 0.341 0.289 0.222 0.210 0.173 0.107 0.095
0.349 0.306 0.245 0.143 0.144 0.130 0.020 0.035 0.004 -0.021 -0.064 0.097
0.321 0.296 0.262 0.227 0.223 0.169 0.136 0.111 0.051 0.137 0.017 0.194
0.222 -0.032 0.223 0.243 0.244 0.224 0.185 0.224 0.239 0.327 0.361 0.331
0.462 0.558 0.373 0.319 0.281 0.369 0.387 0.415 0.323 0.350 0.361 0.280
0.163 0.343 0.217 0.164 0.256 0.274 0.281 0.228 0.189 0.165 0.166 0.278
0.254 0.348 0.324 0.295 0.299 0.431 0.422 0.444 0.517 0.538 0.488 0.572
0.512 0.824 0.593 0.660 0.613 0.639 0.704 0.670 0.475 0.452 0.345 0.469
0.404 0.607 0.283 0.357 0.296 0.328 0.340 0.287 0.324 0.280 0.197 0.383
0.203 0.429 0.388 0.469 0.304 0.267 0.250 0.365 0.304 0.219 0.123 0.172
0.343 0.307 0.505 0.432 0.456 0.415 0.447 0.502 0.407 0.390 0.523 0.348
0.641 0.680 0.620 0.445 0.429 0.449 0.488 0.395 0.439 0.395 0.423 0.301
0.545 0.430 0.393 0.397 0.450 0.440 0.454 0.518 0.521 0.573 0.429 0.573
0.508 0.619 0.527 0.469 0.295 0.358 0.364 0.436 0.452 0.494 0.586 0.385
0.502 0.355 0.512 0.553 0.494 0.516 0.537 0.510 0.526 0.514 0.493 0.305
Your problem is here: you apply ts() to the data.frame returned by read.table() rather than to a vector of values. Because global is a 150 x 12 data frame, ts() treats each column as a separate series and recycles the 150 rows to cover the 1800 monthly time points implied by start and end, which is why dim(global.ts) is 1800 12, and why plot() refuses to draw more than 10 series as "multiple":
global <- read.table("Chapter01global.txt",header=F)
global.ts = ts(global, st=c(1856,1), end=c(2005,12), fr=12)
Compare their code:
www = "http://web.address.that.doesnt.work.anymore.com"
global = scan(www)
#Read 1800 items
global.ts = ts(global, st=c(1856,1), end=c(2005,12), fr=12)
Your code results in:
str(global)
#'data.frame': 150 obs. of 12 variables: ...
Their code would result in a vector which can then be turned into a proper ts object:
str(global)
#num [1:1800] -0.384 -0.457 -0.673 -0.344 -...
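So the fix is to flatten the data frame row-wise (each row a year, columns the months in order) into one vector of 1800 values before calling ts(). A sketch with a toy 2 x 12 stand-in so it runs without the data file; with your data, replace the toy data frame with read.table("Chapter01global.txt", header = FALSE):

```r
# Toy stand-in for the 150 x 12 data frame: 2 "years" of made-up monthlies
global <- as.data.frame(matrix(seq(0.01, 0.24, by = 0.01),
                               nrow = 2, byrow = TRUE))

# t() + as.vector() walks the data row by row: Jan-Dec of year 1, then year 2
global.vec <- as.vector(t(as.matrix(global)))
global.ts  <- ts(global.vec, start = c(1856, 1), frequency = 12)

# aggregate() on a ts collapses each year; its default FUN is sum, so pass mean
global.annual <- aggregate(global.ts, FUN = mean)

length(global.ts)      # 24 monthly values
length(global.annual)  # 2 yearly means
plot(global.ts); plot(global.annual)
```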