
tssem2 error

Open Hi3751 opened this issue 1 year ago • 6 comments

Hello Mike, I am performing a meta-analysis (with 13 variables) using the metaSEM package. I don't know whether it is because the model is complex or because many correlations are missing, but when I run tssem2 I get a number of warnings and error messages (tssem1 takes a long time, about an hour, but produces results normally). The messages are:

1: In sqrt(diag(vcov.wls(object))) : NaNs produced
2: In print.summary.wls(x) : OpenMx status1 is neither 0 or 1. You are advised to 'rerun' it again.

and sometimes:

Polite note from mxTryHard: Hessian not checked as model contains mxConstraints

(screenshot: error2)

I keep rerunning the model, but I keep getting NaNs (especially when requesting likelihood-based confidence intervals), and some of the estimates are implausible (e.g. -2.113, or a negative value for a parameter that should not be negative at all). (screenshot: error1)

(screenshot: error3)

Could you please take a look at my attached data and code and see what the problem is? Thank you.

Kim data.txt code.txt

Hi3751 avatar Sep 23 '24 04:09 Hi3751
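
For readers following along, here is a minimal sketch of the two-stage workflow being described. The actual data and model are in the attached files; my.cor, my.n, A, and S are hypothetical stand-ins for the list of correlation matrices, the vector of sample sizes, and the RAM matrices.

library(metaSEM)

## Stage 1: pool the correlation matrices with a random-effects model
stage1random <- tssem1(my.cor, my.n, method = "REM", RE.type = "Diag")
summary(stage1random)

## Stage 2: fit the structural model to the pooled correlation matrix,
## requesting likelihood-based confidence intervals as described above
stage2 <- tssem2(stage1random, Amatrix = A, Smatrix = S, intervals.type = "LB")
summary(stage2)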

Could you please include the R code to read the data? It is hard to check without reproducing the errors.

mikewlcheung avatar Sep 24 '24 02:09 mikewlcheung

I've attached a text file of the entire process, including the R code. I would really appreciate it if you could take a look and let me know what's causing the problem. Thank you. lastsave.txt

Hi3751 avatar Sep 24 '24 05:09 Hi3751

It works better if you use diag.constraints=FALSE in this model. The following code works for me.

stage2 = tssem2(stage1random, Amatrix = A, Smatrix = S, diag.constraints = FALSE, intervals.type = "z")
summary(stage2)

> summary(stage2)

Call:
wls(Cov = pooledS, aCov = aCov, n = tssem1.obj$total.n, RAM = RAM, 
    Amatrix = Amatrix, Smatrix = Smatrix, Fmatrix = Fmatrix, 
    diag.constraints = diag.constraints, cor.analysis = cor.analysis, 
    intervals.type = intervals.type, mx.algebras = mx.algebras, 
    mxModel.Args = mxModel.Args, subset.variables = subset.variables, 
    model.name = model.name, suppressWarnings = suppressWarnings, 
    silent = silent, run = run)

95% confidence intervals: z statistic approximation
Coefficients:
       Estimate Std.Error    lbound    ubound z value  Pr(>|z|)    
b106   0.620805  0.036968  0.548349  0.693260 16.7932 < 2.2e-16 ***
b109   0.279495  0.040409  0.200296  0.358695  6.9167 4.623e-12 ***
b111   0.207178  0.064638  0.080489  0.333866  3.2052 0.0013497 ** 
b112   0.091933  0.056985 -0.019756  0.203622  1.6133 0.1066846    
b113   0.021483  0.060856 -0.097794  0.140759  0.3530 0.7240857    
b116   0.232994  0.051435  0.132183  0.333805  4.5299 5.902e-06 ***
b119   0.381586  0.038774  0.305591  0.457581  9.8414 < 2.2e-16 ***
b1210  0.361437  0.051973  0.259571  0.463303  6.9543 3.544e-12 ***
b1211  0.359614  0.046320  0.268828  0.450399  7.7637 8.216e-15 ***
b129   0.216031  0.046035  0.125805  0.306257  4.6928 2.695e-06 ***
b1310  0.236028  0.047946  0.142055  0.330001  4.9228 8.533e-07 ***
b1311  0.269357  0.043248  0.184592  0.354121  6.2282 4.719e-10 ***
b1312  0.431349  0.062691  0.308476  0.554222  6.8805 5.964e-12 ***
b94    0.092657  0.112126 -0.127107  0.312421  0.8264 0.4085992    
b95    0.685681  0.186260  0.320617  1.050745  3.6813 0.0002320 ***
b96   -1.201594  0.343349 -1.874547 -0.528642 -3.4996 0.0004659 ***
b97    1.111854  0.251584  0.618758  1.604951  4.4194 9.897e-06 ***
b98    0.138821  0.143336 -0.142112  0.419754  0.9685 0.3327938    
p62    0.536688  0.036330  0.465482  0.607894 14.7725 < 2.2e-16 ***
p72    0.609550  0.030796  0.549191  0.669908 19.7934 < 2.2e-16 ***
p42    0.504761  0.022788  0.460098  0.549424 22.1507 < 2.2e-16 ***
p82    0.492029  0.050320  0.393403  0.590654  9.7780 < 2.2e-16 ***
p32    0.559028  0.045737  0.469385  0.648671 12.2226 < 2.2e-16 ***
p52    0.616778  0.036728  0.544792  0.688763 16.7931 < 2.2e-16 ***
p76    0.817995  0.040601  0.738418  0.897572 20.1471 < 2.2e-16 ***
p86    0.600213  0.035588  0.530463  0.669964 16.8657 < 2.2e-16 ***
p87    0.586971  0.049288  0.490368  0.683574 11.9089 < 2.2e-16 ***
p64    0.604648  0.035375  0.535314  0.673981 17.0926 < 2.2e-16 ***
p74    0.608988  0.020168  0.569460  0.648516 30.1961 < 2.2e-16 ***
p84    0.540567  0.048510  0.445488  0.635645 11.1434 < 2.2e-16 ***
p54    0.612588  0.019861  0.573660  0.651516 30.8430 < 2.2e-16 ***
p21    0.561069  0.033812  0.494799  0.627338 16.5940 < 2.2e-16 ***
p61    0.576889  0.035386  0.507534  0.646244 16.3028 < 2.2e-16 ***
p71    0.651274  0.031139  0.590242  0.712306 20.9148 < 2.2e-16 ***
p41    0.522513  0.030269  0.463187  0.581840 17.2622 < 2.2e-16 ***
p81    0.526480  0.033625  0.460577  0.592384 15.6575 < 2.2e-16 ***
p31    0.647821  0.047624  0.554480  0.741161 13.6029 < 2.2e-16 ***
p51    0.592772  0.042540  0.509395  0.676149 13.9344 < 2.2e-16 ***
p63    0.579444  0.033002  0.514761  0.644127 17.5578 < 2.2e-16 ***
p73    0.634354  0.034057  0.567604  0.701104 18.6264 < 2.2e-16 ***
p43    0.532117  0.053182  0.427882  0.636352 10.0055 < 2.2e-16 ***
p83    0.495409  0.037836  0.421251  0.569567 13.0934 < 2.2e-16 ***
p53    0.544873  0.037867  0.470654  0.619091 14.3890 < 2.2e-16 ***
p65    0.730608  0.044916  0.642575  0.818642 16.2662 < 2.2e-16 ***
p75    0.620114  0.040664  0.540414  0.699815 15.2496 < 2.2e-16 ***
p85    0.555708  0.033339  0.490364  0.621052 16.6682 < 2.2e-16 ***
---
Signif. codes:  0 ‘***’ 0.001 ‘**’ 0.01 ‘*’ 0.05 ‘.’ 0.1 ‘ ’ 1

Goodness-of-fit indices:
                                                Value
Sample size                                78047.0000
Chi-square of target model                   145.6214
DF of target model                            32.0000
p value of target model                        0.0000
Number of constraints imposed on "Smatrix"     0.0000
DF manually adjusted                           0.0000
Chi-square of independence model           24163.1397
DF of independence model                      78.0000
RMSEA                                          0.0067
RMSEA lower 95% CI                             0.0057
RMSEA upper 95% CI                             0.0079
SRMR                                           0.0476
TLI                                            0.9885
CFI                                            0.9953
AIC                                           81.6214
BIC                                         -214.8607
OpenMx status1: 0 ("0" or "1": The optimization is considered fine.
Other values indicate problems.)

mikewlcheung avatar Sep 24 '24 06:09 mikewlcheung
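
Not part of the thread, but for completeness: if OpenMx status1 had come back as something other than 0 or 1, one option (a sketch, assuming the stage2 object above) is to pass the fitted model to metaSEM's rerun(), which retries the optimization from different starting values via mxTryHard().

## Retry the optimization when status1 indicates a problem (sketch)
stage2 <- rerun(stage2)
summary(stage2)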

Thank you for your reply.

I also tried diag.constraints=FALSE before, but it yielded strange estimates with absolute values of 1 or more, such as b96 (-1.202) and b97 (1.112) in the results you provided. Is this possible? For b96 (-1.202), all of the correlations involved are positive, and when I simplify the model, for example by dropping variables, the estimate becomes positive.

The problems I ran into were (1) it generates NaNs, and (2) even when it doesn't generate NaNs, some of the estimates seem unreliable.

Hi3751 avatar Sep 24 '24 06:09 Hi3751

A standardized coefficient can be larger than 1. See, for example, the following discussion: https://stats.stackexchange.com/questions/120201/magnitude-of-standardized-coefficients-beta-in-multiple-linear-regression
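
As a quick illustration of this point (not from the thread): a tiny simulation in which the predictors are highly correlated, so one standardized regression coefficient legitimately exceeds 1.

set.seed(1)
x1 <- rnorm(1000)
x2 <- 0.95 * x1 + sqrt(1 - 0.95^2) * rnorm(1000)   # r(x1, x2) is about .95
y  <- 1.5 * x1 - 1.0 * x2 + rnorm(1000)
dat <- data.frame(scale(cbind(y, x1, x2)))          # standardize all variables
round(coef(lm(y ~ x1 + x2, data = dat)), 3)         # the coefficient on x1 exceeds 1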

There are 13 variables, and only a few studies contribute to some of the cells. Thus, there may not be sufficient information to estimate all of the parameters.

> pattern.na(cormat, show.na = FALSE)
    rlb acc scr psn ufd ast cmp rsp peu enj  pu att  bi
rlb  45  23  15  12  15   8   6  19  31  19  23   9  30
acc  23  59  10  15  11   4  15  17  46  24  43  16  48
scr  15  10  60  11  16   7   8  18  46  24  50  20  41
psn  12  15  11  32   9   7   4  17  20  13  18   8  18
ufd  15  11  16   9  39   7   4  17  27  12  22   8  28
ast   8   4   7   7   7  24   4   8  17  11  14  10  17
cmp   6  15   8   4   4   4  37   8  34   6  33  13  34
rsp  19  17  18  17  17   8   8  47  31  13  23  12  31
peu  31  46  46  20  27  17  34  31 116  48 102  46  94
enj  19  24  24  13  12  11   6  13  48  63  41  21  45
pu   23  43  50  18  22  14  33  23 102  41 113  43  94
att   9  16  20   8   8  10  13  12  46  21  43  51  43
bi   30  48  41  18  28  17  34  31  94  45  94  43 124

mikewlcheung avatar Sep 26 '24 10:09 mikewlcheung
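
A related check, not shown in the thread and assuming the same cormat list together with a vector n of study sample sizes: pattern.n() from metaSEM reports the accumulated sample size behind each pairwise correlation, which helps flag cells with too little information.

## Accumulated sample size per cell (sketch; `n` is the vector of study sample sizes)
pattern.n(cormat, n)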

Thank you for your response! I'll gather more data and try to analyze it again.

Hi3751 avatar Sep 28 '24 12:09 Hi3751