Hi there,
First off, thanks for developing the mmrm package - I think it's a real game changer for anyone wanting to perform repeated measures analysis in R. Keep up the good work!
I have a potential bug that may just be "one of those things". I've been struggling to get the overall tests of effects to align between SAS's Proc Mixed and mmrm. I wonder whether these differences come down to how SAS and R handle mixed models; however, in the past I have been able to get the bottom line of the overall effects table to align between the two (admittedly using nlme, without Kenward-Roger adjustment).
This is the dataset I've been using:
RMData.csv
Here is the SAS code I'm using:
proc mixed data=RMData;
  class Treatment1 Day Subject;
  model Response1 = Treatment1|Day / ddfm=KR2 htype=3;
  repeated Day / subject=Subject type=cs;
run;
and this is the R code:
library(mmrm)
library(car)  # provides the Anova() generic; mmrm supplies the method

model <- mmrm(
  formula = Response1 ~ Treatment1 + Day + Treatment1 * Day + cs(Day | Subject),
  data = RMData,
  method = "Kenward-Roger"
)
Anova(model, type = "III")
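One way to narrow down where small numerical differences come from is to compare the underlying REML fits first, since the Kenward-Roger F statistics are built on the estimated covariance parameters. A minimal sketch (assumes the `model` object fitted above; `logLik` and `VarCorr` both have methods for mmrm objects):

```r
# If the REML fits agree, differences must come from the KR adjustment
# itself (e.g. SAS's ddfm=KR2 second-order variant vs mmrm's method);
# if they disagree, the optimizers converged to slightly different points.
logLik(model)   # -2 * this should match SAS's "-2 Res Log Likelihood"
VarCorr(model)  # compare with SAS's "Covariance Parameter Estimates"
```

If the log-likelihoods and covariance estimates match to several decimal places but the F statistics still differ slightly, the remaining gap is likely in the degrees-of-freedom/adjustment method rather than the model fit.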
The SAS Proc Mixed overall effects test results are:
Type 3 Tests of Fixed Effects
Effect Num DF Den DF F Value Pr > F
Treatment1 2 69 1.44 0.2434
Day 3 207 3.05 0.0295
Treatment1*Day 6 207 1.16 0.3275
and the mmrm results are:
Analysis of Fixed Effect Table (Type III F tests)
Num Df Denom Df F Statistic Pr(>=F)
Treatment1 2 69 1.4463 0.24248
Day 3 207 3.0665 0.02899 *
Treatment1:Day 6 207 1.1675 0.32501
As you can see, the results are similar but not numerically identical. In your experience, are these numbers as close as you'd expect, or are they indicative of a bigger issue?
Thanks in advance for any thoughts/advice you have!
Simon