
r - Convergence error for development version of lme4

I am attempting to do a power analysis for a mixed-effects model, using the development version of lme4 and this tutorial. I notice that in the tutorial lme4 throws a convergence warning:

## Warning: Model failed to converge with max|grad| = 0.00187101 (tol =
## 0.001)

The same warning comes up when I run the code for my dataset, with:

## Warning message: In checkConv(attr(opt, "derivs"), opt$par, checkCtrl =
control$checkConv,  : 
Model failed to converge with max|grad| = 0.774131 (tol = 0.001)

The estimates from a regular glmer call with this development version are also slightly different from those I got with the current CRAN version (which gave no warnings). Any idea why this might be happening?

EDIT

The model I tried to specify was:

glmer(resp ~ months.c * similarity * percSem + (similarity | subj), family = binomial, data = myData)

The dataset has one between-subjects variable (age, centered) and two within-subjects variables (similarity: 2 levels; percSem: 3 levels) predicting a binary outcome (false memory/guess). Each within-subjects cell also has 3 repeated measures, so there are 2 x 3 x 3 = 18 binary responses per individual, with 38 participants in total.
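A quick way to confirm this design before modelling is to tabulate the cells (a sketch; it assumes myData has the columns named in the model formula above):

## 2 x 3 within-subject cells, 3 repeated measures each = 18 rows per subject
xtabs(~ subj, data = myData)                 ## should be 18 for every subject
with(myData, table(similarity, percSem))     ## each cell = 3 x 38 = 114 rows

The (truncated) dput output for the data is: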

structure(list(subj = structure(c(1L, 1L, 1L, 1L, 1L, 1L, 1L, 1L, 1L, 1L, 1L, 1L, 1L, 1L, 1L, 1L, 1L, 1L, 2L, 2L, 2L, 2L, 2L, 2L, 2L, 2L, 2L, 2L, 2L, 2L, 2L, 2L, 2L, 2L, 2L, 2L, 3L, 3L, 3L, 3L, 3L, 3L, 3L, 3L, 3L, 3L, 3L, 3L, 3L, 3L, 3L, 3L, 3L, 3L, 4L, 4L, 4L, 4L, 4L, 4L, 4L, 4L, 4L, 4L, 4L, 4L, 4L, 4L, 4L, 4L, 4L, 4L, 5L, 5L, 5L, 5L, 5L, 5L, 5L, 5L, 5L, 5L, 5L, 5L, 5L, 5L, 5L, 5L, 5L, 5L, 6L, 6L, 6L, 6L, 6L, 6L, 6L, 6L, 6L, 6L, 6L, 6L, 6L, 6L, 6L, 6L, 6L, 6L, 7L, 7L, 7L, 7L, 7L, 7L, 7L, 7L, 7L, 7L, 7L, 7L, 7L, 7L, 7L, 7L, 7L, 7L, 8L, 8L, 8L, 8L, 8L, 8L, 8L, 8L, 8L, 8L, 8L, 8L, 8L, 8L, 8L, 8L, 8L, 8L, 9L, 9L, 9L, 9L, 9L, 9L, 9L, 9L, 9L, 9L, 9L, 9L, 9L, 9L, 9L, 9L, 9L, 9L, 10L, 10L, 10L, 10L, 10L, 10L, 10L, 10L, 10L, 10L, 10L, 10L, 10L, 10L, 10L, 10L, 10L, 10L, 11L, 11L, 11L, 11L, 11L, 11L, 11L, 11L, 11L, 11L, 11L, 11L, 11L, 11L, 11L, 11L, 11L, 11L, 12L, 12L, 12L, 12L, 12L, 12L, 12L, 12L, 12L, 12L, 12L, 12L, 12L, 12L, 12L, 12L, 12L, 12L, 13L, 13L, 13L, 13L, 13L, 13L, 13L, 13L, 13L, 13L, 13L, 13L, 13L, 13L, 13L, 13L, 13L, 13L, 14L, 14L, 14L, 14L, 14L, 14L, 14L, 14L, 14L, 14L, 14L, 14L, 14L, 14L, 14L, 14L, 14L, 14L, 15L, 15L, 15L, 15L, 15L, 15L, 15L, 15L, 15L, 15L, 15L, 15L, 15L, 15L, 15L, 15L, 15L, 15L, 16L, 16L, 16L, 16L, 16L, 16L, 16L, 16L, 16L, 16L, 16L, 16L, 16L, 16L, 16L, 16L, 16L, 16L, 17L, 17L, 17L, 17L, 17L, 17L, 17L, 17L, 17L, 17L, 17L, 17L, 17L, 17L, 17L, 17L, 17L, 17L, 18L, 18L, 18L, 18L, 18L, 18L, 18L, 18L, 18L, 18L, 18L, 18L, 18L, 18L, 18L, 18L, 18L, 18L, 19L, 19L, 19L, 19L, 19L, 19L, 19L, 19L, 19L, 19L, 19L, 19L, 19L, 19L, 19L, 19L, 19L, 19L, 20L, 20L, 20L, 20L, 20L, 20L, 20L, 20L, 20L, 20L, 20L, 20L, 20L, 20L, 20L, 20L, 20L, 20L, 21L, 21L, 21L, 21L, 21L, 21L, 21L, 21L, 21L, 21L, 21L, 21L, 21L, 21L, 21L, 21L, 21L, 21L, 22L, 22L, 22L, 22L, 22L, 22L, 22L, 22L, 22L, 22L, 22L, 22L, 22L, 22L, 22L, 22L, 22L, 22L, 23L, 23L, 23L, 23L, 23L, 23L, 23L, 23L, 23L, 23L, 23L, 23L, 23L, 23L, 23L, 23L, 23L, 23L, 24L, 24L, 24L, 24L, 24L, 24L, 24L, 24L, 24L, 24L, 24L, 24L, 24L, 24L, 24L, 24L, 24L, 24L, 25L, 25L, 25L, 25L, 25L, 25L, 25L, 25L, 25L, 25L, 25L, 25L, 25L, 25L, 25L, 25L, 25L, 25L, 26L, 26L, 26L, 26L, 26L, 26L, 26L, 26L, 26L, 26L, 26L, 26L, 26L, 26L, 26L, 26L, 26L, 26L, 27L, 27L, 27L, 27L, 27L, 27L, 27L, 27L, 27L, 27L, 27L, 27L, 27L, 27L, 27L, 27L, 27L, 27L, 28L, 28L, 28L, 28L, 28L, 28L, 28L, 28L, 28L, 28L, 28L, 28L, 28L, 28L, 28L, 28L, 28L, 28L, 29L, 29L, 29L, 29L, 29L, 29L, 29L, 29L, 29L, 29L, 29L, 29L, 29L, 29L, 29L, 29L, 29L, 29L, 30L, 30L, 30L, 30L, 30L, 30L, 30L, 30L, 30L, 30L, 30L, 30L, 30L, 30L, 30L, 30L, 30L, 30L, 31L, 31L, 31L, 31L, 31L, 31L, 31L, 31L, 31L, 31L, 31L, 31L, 31L, 31L, 31L, 31L, 31L, 31L, 32L, 32L, 32L, 32L, 32L, 32L, 32L, 32L, 32L, 32L, 32L, 32L, 32L, 32L, 32L, 32L, 32L, 32L, 33L, 33L, 33L, 33L, 33L, 33L, 33L, 33L, 33L, 33L, 33L, 33L, 33L, 33L, 33L, 33L, 33L, 33L, 34L, 34L, 34L, 34L, 34L, 34L, 34L, 34L, 34L, 34L, 34L, 34L, 34L, 34L, 34L, 34L, 34L, 34L, 35L, 35L, 35L, 35L, 35L, 35L, 35L, 35L, 35L, 35L, 35L, 35L, 35L, 35L, 35L, 35L, 35L, 35L, 36L, 36L, 36L, 36L, 36L, 36L, 36L, 36L, 36L, 36L, 36L, 36L, 36L, 36L, 36L, 36L, 36L, 36L, 37L, 37L, 37L, 37L, 37L, 37L, 37L, 37L, 37L, 37L, 37L, 37L, 37L, 37L, 37L, 37L, 37L, 37L, 38L, 38L, 38L, 38L, 38L, 38L, 38L, 38L, 38L, 38L, 38L, 38L, 38L, 38L, 38L, 38L, 38L, 38L), .Label = c("09A", "10", "11", "12", "12A", "13", "14", "14A", "15", "15A", "16", "17", "18", "19", "1A", "2", "20", "21", "22", "22A", "23", "24", "25", "26", "27", "28", "29", "3", "30", "31", "32A", "32B", "33", "4B", "5", "6", "7", "8"), class = "factor"), 
months.c = structure(c(-9.81578947368421, -9.81578947368421, -9.81578947368421, -9.81578947368421, -9.81578947368421, -9.81578947368421, -9.81578947368421, -9.81578947368421, -9.81578947368421, -9.81578947368421, -9.81578947368421, -9.81578947368421, -9.81578947368421, -9.81578947368421, -9.81578947368421, -9.81578947368421, -9.81578947368421, -9.81578947368421, 2.18421052631579, 2.18421052631579, 2.18421052631579, 2.18421052631579, 2.18421052631579, 2.18421052631579, 2.18421052631579, 2.18421052631579, 2.18421052631579, 2.18421052631579, 2.18421052631579, 2.18421052631579, 2.18421052631579, 2.18421052631579, 2.18421052631579, 2.18421052631579, 2.18421052631579, 2.18421052631579, 11.1842105263158, 11.1842105263158, 11.1842105263158, 11.1842105263158, 11.1842105263158, 11.1842105263158, 11.1842105263158, 11.1842105263158, 11.1842105263158, 11.1842105263158, 11.1842105263158, 11.1842105263158, 11.1842105263158, 11.1842105263158, 11.1842105263158, 11.1842105263158, 11.1842105263158, 11.1842105263158, -3.81578947368421, -3.81578947368421, -3.81578947368421, -3.81578947368421, -3.81578947368421, -3.81578947368421, -3.81578947368421, -3.81578947368421, -3.81578947368421, -3.81578947368421, -3.81578947368421, -3.81578947368421, -3.81578947368421, -3.81578947368421, -3.81578947368421, -3.81578947368421, -3.81578947368421, -3.81578947368421, -7.81578947368421, -7.81578947368421, -7.81578947368421, -7.81578947368421, -7.81578947368421, -7.81578947368421, -7.81578947368421, -7.81578947368421, -7.81578947368421, -7.81578947368421, -7.81578947368421, -7.81578947368421, -7.81578947368421, -7.81578947368421, -7.81578947368421, -7.81578947368421, -7.81578947368421, -7.81578947368421, 1.18421052631579, 1.18421052631579, 1.18421052631579, 1.18421052631579, 1.18421052631579, 1.18421052631579, 1.18421052631579, 1.18421052631579, 1.18421052631579, 1.18421052631579, 1.18421052631579, 1.18421052631579, 1.18421052631579, 1.18421052631579, 1.18421052631579, 1.18421052631579, 1.18421052631579, 1.18421052631579, 5.18421052631579, 5.18421052631579, 5.18421052631579, 5.18421052631579, 5.18421052631579, 5.18421052631579, 5.18421052631579, 5.18421052631579, 5.18421052631579, 5.18421052631579, 5.18421052631579, 5.18421052631579, 5.18421052631579, 5.18421052631579, 5.18421052631579, 5.18421052631579, 5.18421052631579, 5.18421052631579, 6.18421052631579, 6.18421052631579, 6.18421052631579, 6.18421052631579, 6.18421052631579, 6.18421052631579, 6.18421052631579, 6.18421052631579, 6.18421052631579, 6.18421052631579, 6.18421052631579, 6.18421052631579, 6.18421052631579, 6.18421052631579, 6.18421052631579, 6.18421052631579, 6.18421052631579, 6.18421052631579, -10.8157894736842, -10.8157894736842, -10.8157894736842, -10.8157894736842, -10.8157894736842, -10.8157894736842, -10.8157894736842, -10.8157894736842, -10.8157894736842, -10.8157894736842, -10.8157894736842, -10.8157894736842, -10.8157894736842, -10.8157894736842, -10.8157894736842, -10.8157894736842, -10.8157894736842, -10.8157894736842, -7.81578947368421, -7.81578947368421, -7.81578947368421, -7.81578947368421, -7.81578947368421, -7.81578947368421, -7.81578947368421, -7.81578947368421, -7.81578947368421, -7.81578947368421, -7.81578947368421, -7.81578947368421, -7.81578947368421, -7.81578947368421, -7.81578947368421, -7.81578947368421, -7.81578947368421, -7.81578947368421, 9.18421052631579, 9.18421052631579, 9.18421052631579, 9.18421052631579, 9.18421052631579, 9.18421052631579, 9.18421052631579, 9.18421052631579, 9.18421052631579, 9.18421052631579, 9.18421052631579, 
9.18421052631579, 9.18421052631579, 9.18421052631579, 9.18421052631579, 9.18421052631579, 9.18421052631579, 9.18421052631579, 5.18421052631579, 5.18421052631579, 5.18421052631579, 5.18421052631579, 5.18421052631579, 5.18421052631579, 5.18421052631579, 5.18421052631579, 5.18421052631579, 5.18421052631579, 5.18421052631579, 5.18421052631579, 5.18421052631579, 5.18421052631579, 5.18421052631579, 5.18421052631579, 5.18421052631579, 5.18421052631579, 6.18421052631579, 6.18421052631579, 6.18421052631579, 6.18421052631579, 6.18421052631579, 6.18421052631579, 6.18421052631579, 6.18421052631579, 6.18421052631579, 6.18421052631579, 6.18421052631579, 6.18421052631579, 6.18421052631579, 6.18421052631579, 6.18421052631579, 6.18421052631579, 6.18421052631579, 6.18421052631579, -2.81578947368421, -2.81578947368421, -2.81578947368421, -2.81578947368421, -2.81578947368421, -2.81578947368421, -2.81578947368421, -2.81578947368421, -2.81578947368421, -2.81578947368421, -2.81578947368421, -2.81578947368421, -2.81578947368421, -2.81578947368421, -2.81578947368421, -2.81578947368421, -2.81578947368421, -2.81578947368421, -0.815789473684205, -0.815789473684205, -0.815789473684205, -0.815789473684205, -0.815789473684205, -0.815789473684205, -0.815789473684205, -0.815789473684205, -0.815789473684205, -0.815789473684205, -0.815789473684205, -0.815789473684205, -0.815789473684205, -0.815789473684205, -0.815789473684205, -0.815789473684205, -0.815789473684205, -0.815789473684205, 8.18421052631579, 8.18421052631579, 8.18421052631579, 8.18421052631579, 8.18421052631579, 8.18421052631579, 8.18421052631579, 8.18421052631579, 8.18421052631579, 8.18421052631579, 8.18421052631579, 8.18421052631579, 8.18421052631579, 8.18421052631579, 8.18421052631579, 8.18421052631579, 8.18421052631579, 8.18421052631579, -9.81578947368421, -9.81578947368421, -9.81578947368421, -9.81578947368421, -9.81578947368421, -9.81578947368421, -9.81578947368421, -9.81578947368421, -9.81578947368421, -9.81578947368421, -9.81578947368421, -9.81578947368421, -9.81578947368421, -9.81578947368421, -9.81578947368421, -9.81578947368421, -9.81578947368421, -9.81578947368421, 8.18421052631579, 8.18421052631579, 8.18421052631579, 8.18421052631579, 8.18421052631579, 8.18421052631579, 8.18421052631579, 8.18421052631579, 8.18421052631579, 8.18421052631579, 8.18421052631579, 8.18421052631579, 8.18421052631579, 8.18421052631579, 8.18421052631579, 8.18421052631579, 8.18421052631579, 8.18421052631579, 3.18421052631579, 3.18421052631579, 3.18421052631579, 3.18421052631579, 3.18421052631579, 3.18421052631579, 3.18421052631579, 3.18421052631579, 3.18421052631579, 3.18421052631579, 3.18421052631579, 3.18421052631579, 3.18421052631579, 3.18421052631579, 3.18421052631579, 3.18421052631579, 3.18421052631579, 3.18421052631579, 5.18421052631579, 5.18421052631579, 5.18421052631579, 5.18421052631579, 5.18421052631579, 5.18421052631579, 5.18421052631579, 5.18421052631579, 5.18421052631579, 5.18421052631579, 5.18421052631579, 5.18421052631579, 5.18421052631579, 5.18421052631579, 5.18421052631579, 5.18421052631579, 5.18421052631579, 5.18421052631579, -1.81578947368421, -1.81578947368421, -1.81578947368421, -1.81578947368421, -1.81578947368421, -1.81578947368421, -1.81578947368421, -1.81578947368421, -1.81578947368421, -1.81578947368421, -1.81578947368421, -1.81578947368421, -1.81578947368421, -1.81578947368421, -1.81578947368421, -1.81578947368421, -1.8157894736


1 Reply


tl;dr: this looks like a false positive. I don't see any particularly important differences among the fits from a variety of different optimizers, although the outliers are the built-in Nelder-Mead optimizer and nlminb; the built-in bobyqa, and the bobyqa and Nelder-Mead implementations from the nloptr package, give extremely close answers and no warnings.

My general advice in these cases would be to try re-fitting with control=glmerControl(optimizer="bobyqa"); we are considering switching to using bobyqa as the default (this question increases the weight of evidence in its favour).
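Applied to the model from the question, that would look something like this (a sketch; g0 here is a hypothetical name for the original fit that produced the warning):

## hypothetical: g0 is the original glmer() fit from the question
g0.refit <- update(g0, control = glmerControl(optimizer = "bobyqa"))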

I put the dput output in a separate file:

source("convdat.R")

Run the whole gamut of possible optimizers: built-in N-M and bobyqa; nlminb and L-BFGS-B from base R, via the optimx package; and the nloptr versions of N-M and bobyqa.

library(lme4)
g0.bobyqa <- glmer(resp ~ months.c * similarity * percSem +
                       (similarity | subj),
                   family = binomial, data = myData,
                   control = glmerControl(optimizer = "bobyqa"))
g0.NM <- update(g0.bobyqa,control=glmerControl(optimizer="Nelder_Mead"))
library(optimx)
g0.nlminb <- update(g0.bobyqa,control=glmerControl(optimizer="optimx",
                              optCtrl=list(method="nlminb")))
g0.LBFGSB <- update(g0.bobyqa,control=glmerControl(optimizer="optimx",
                              optCtrl=list(method="L-BFGS-B")))

library(nloptr)
## from https://github.com/lme4/lme4/issues/98:
## default nloptr settings: BOBYQA with a tight relative tolerance
defaultControl <- list(algorithm="NLOPT_LN_BOBYQA",xtol_rel=1e-6,maxeval=1e5)
## wrapper that lets nloptr::nloptr() be used as a glmerControl() optimizer:
## fill in defaults, run nloptr, and return the list (par/fval/feval/conv/message)
## that lme4 expects back from an optimizer
nloptwrap2 <- function(fn,par,lower,upper,control=list(),...) {
    for (n in names(defaultControl))
      if (is.null(control[[n]])) control[[n]] <- defaultControl[[n]]
    res <- nloptr(x0=par,eval_f=fn,lb=lower,ub=upper,opts=control,...)
    with(res,list(par=solution,
                  fval=objective,
                  feval=iterations,
                  conv=if (status>0) 0 else status,  ## nloptr: status > 0 means success
                  message=message))
}
g0.bobyqa2 <- update(g0.bobyqa,control=glmerControl(optimizer=nloptwrap2))
g0.NM2 <- update(g0.bobyqa,control=glmerControl(optimizer=nloptwrap2,
                           optCtrl=list(algorithm="NLOPT_LN_NELDERMEAD")))

Summarize the results. We get warnings from nlminb, L-BFGS-B, and Nelder-Mead (the maximum absolute gradient is largest for Nelder-Mead):

getpar <- function(x) c(getME(x,c("theta")),fixef(x))
modList <- list(bobyqa=g0.bobyqa,NM=g0.NM,nlminb=g0.nlminb,
                bobyqa2=g0.bobyqa2,NM2=g0.NM2,LBFGSB=g0.LBFGSB)
ctab <- sapply(modList,getpar)
library(reshape2)
mtab <- melt(ctab)
library(ggplot2)
theme_set(theme_bw())
ggplot(mtab,aes(x=Var2,y=value,colour=Var2))+
    geom_point()+facet_wrap(~Var1,scales="free")

Just the 'good' fits:

ggplot(subset(mtab,Var2 %in% c("NM2","bobyqa","bobyqa2")),
       aes(x=Var2,y=value,colour=Var2))+
    geom_point()+facet_wrap(~Var1,scales="free")

Coefficient of variation of estimates among optimizers:

summary(cvvec <- apply(ctab,1,function(x) sd(x)/mean(x)))

The highest CV is for months.c, which is still only about 4% ...

The log-likelihoods don't differ very much. NM2 gives the maximum log-likelihood, and all the 'good' fits are very close (even the 'bad' ones are at most 1% different):

likList <- sapply(modList,logLik)
round(log10(max(likList)-likList),1)
##  bobyqa      NM  nlminb bobyqa2     NM2  LBFGSB 
##    -8.5    -2.9    -2.0   -11.4    -Inf    -5.0 
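Finally, the convergence diagnostics stored in each fit can be pulled out to confirm which optimizers actually exceeded the tolerance (a sketch; it assumes the fits keep the post-fit derivative estimates in the @optinfo slot, as current lme4 versions do):

## largest absolute gradient component reported for each fit (tol = 0.001)
sapply(modList, function(m) max(abs(m@optinfo$derivs$gradient)))

(More recent lme4 releases also export an allFit() helper that automates this kind of multi-optimizer comparison.)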
