
Overall tests of effects: mmrm vs Proc Mixed #497

Open
SimonTBate opened this issue Mar 2, 2025 · 1 comment
Labels
bug Something isn't working

Comments

@SimonTBate

Hi there,
First off, thanks for developing the mmrm package - I think it's a real game changer for anyone wanting to perform repeated measures analysis in R. Keep up the good work!

I have a potential bug that may just be "one of those things". I've been struggling to get alignment between SAS's Proc Mixed and mmrm for the overall tests of effects. I wonder if these differences are down to the way SAS and R handle mixed models; however, in the past I have been able to get the lowest line in the overall effects table to align between the two (admittedly when using nlme without Kenward-Roger).

This is the dataset I've been using:
RMData.csv

Here is the SAS code I'm using:
proc mixed data=RMData;
class Treatment1 Day Subject;
model Response1 = Treatment1|Day / ddfm=KR2 htype=3;
repeated Day / subject = subject type=cs;
run;

and this is the R code:
library(mmrm)
library(car)  # provides the Anova() method for mmrm fits

model <- mmrm(
  formula = Response1 ~ Treatment1 + Day + Treatment1 * Day + cs(Day | Subject),
  data = RMData,
  method = "Kenward-Roger"
)
Anova(model, type = "III")
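One thing I have not yet verified on this dataset (so please treat the exact argument name and value below as an assumption taken from the mmrm documentation): mmrm also exposes a linear Kenward-Roger covariance adjustment via the `vcov` option, which might be closer to SAS's first-order `ddfm=KR` and could be worth comparing:

```r
library(mmrm)
library(car)

# Sketch of a comparison run: same model, but requesting the linear
# Kenward-Roger covariance adjustment rather than the default one.
# "Kenward-Roger-Linear" is taken from the mmrm docs - worth double-checking.
model_lin <- mmrm(
  formula = Response1 ~ Treatment1 + Day + Treatment1 * Day + cs(Day | Subject),
  data = RMData,
  method = "Kenward-Roger",
  vcov = "Kenward-Roger-Linear"
)
Anova(model_lin, type = "III")
```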

The SAS Proc Mixed overall effects test results are:
Type 3 Tests of Fixed Effects
Effect Num DF Den DF F Value Pr > F
Treatment1 2 69 1.44 0.2434
Day 3 207 3.05 0.0295
Treatment1*Day 6 207 1.16 0.3275

and the mmrm results are:
Analysis of Fixed Effect Table (Type III F tests)
Num Df Denom Df F Statistic Pr(>=F)
Treatment1 2 69 1.4463 0.24248
Day 3 207 3.0665 0.02899 *
Treatment1:Day 6 207 1.1675 0.32501

As you can see, the results are similar but not numerically identical. In your experience, are these numbers as close as you'd expect, or is this indicative of a bigger issue?

Thanks in advance for any thoughts/advice you have!
Simon

@SimonTBate added the bug label on Mar 2, 2025
@danielinteractive
Collaborator

Thank you @SimonTBate for posting this, that is very helpful.
@clarkliming do you think you could have a look at it, since it is about the Type III tests?
