Principal Investigators’ New Websites

May 6, 2020:

Stephanie Lanza, Bethany Bray, Linda Collins, Susan Murphy, Runze Li

As previously announced, substantial changes lie ahead for The Methodology Center. Over the coming months, we will stop updating this website. Resources will remain available for at least a year, but in order to provide the latest information, each of our investigators will maintain her or his own website. These sites will include content developed at The Methodology Center and new resources related to the researcher’s future work.

Stephanie Lanza and Ashley Linden-Carmichael built a website for the Addictions and Innovative Methods (AIM) lab. Their great new site describes their research and includes the content about time-varying effect modeling (TVEM) from The Methodology Center’s website.

Susan Murphy has incorporated The Methodology Center’s content on just-in-time adaptive interventions into her website. The site also includes workshop materials and other resources.

Bethany Bray’s new site will include The Methodology Center’s resources for latent class analysis (LCA) and latent transition analysis (LTA). Bethany has concrete plans for new LCA and LTA resources, so stay tuned.

Runze Li will update his page to incorporate Methodology Center resources on variable screening and variable selection for high-dimensional data analysis.

Linda Collins will build a new website to house The Methodology Center’s content on the multiphase optimization strategy (MOST) for optimizing interventions after she moves to New York University. In the meantime, follow Linda on Twitter, @collins_most.

Daniel Almirall and Inbal “Billie” Nahum-Shani’s informative website will soon incorporate The Methodology Center’s resources for the sequential, multiple assignment, randomized trial (SMART).

More information will follow in June or July. Thank you for staying connected to our research! We are all proud of our time at The Methodology Center and very excited about the future.

Building Better Adaptive Interventions by Expanding SMART

June 27, 2019:

John Dziak

Behavioral interventions for prevention and treatment are an important part of the fight against drug abuse and HIV/AIDS. Among the challenges faced by scientists is how and when to alter the course of treatment for participants in the intervention. Adaptive interventions change based on evidence about what is best for the participant at a given time.

For over a decade, Methodology Center researchers have developed and applied sequential, multiple assignment, randomized trials (SMARTs), which are experimental designs that can be used to build adaptive interventions that address a variety of health and behavioral challenges, such as abstinence from substance use, weight loss, ADHD management, and language acquisition. Recently, researchers have begun developing methods to evaluate SMARTs by using multiple measures of the outcome over time rather than only considering the outcome at the end of the study. For example, a researcher who is developing an adaptive intervention to promote abstinence from alcohol may want to consider alcohol usage rates every month for six months to decide how to construct the intervention. In a recent article in Multivariate Behavioral Research by Methodology Center Investigator John Dziak, Methodology Center Affiliates Daniel Almirall and Inbal “Billie” Nahum-Shani, and others, the authors develop and demonstrate a new method for evaluating a SMART using repeated measures of a binary outcome (such as substance use versus nonuse).

The authors apply their method to the ENGAGE SMART study, which was conducted to help develop an adaptive intervention for promoting treatment engagement among cocaine- and alcohol-dependent individuals. The authors found that certain designs were associated with higher abstinence rates during the first two months, but with abstinence rates that were equivalent to those of other designs by the end of the study. Had the investigators measured abstinence solely at six months, they would not have observed the differences during the early months, which may have practical or clinical significance. The authors go on to provide guidelines for using multiple binary measurements of the outcome while analyzing data from a SMART.
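The value of repeated binary measurements can be illustrated with a small simulation. This is a minimal sketch, not the authors’ actual model: the monthly abstinence probabilities below are hypothetical numbers chosen so that two interventions differ early but converge by month six, which is exactly the pattern an end-of-study-only analysis would miss.

```python
import numpy as np

rng = np.random.default_rng(0)
n, months = 200, 6

# Hypothetical monthly abstinence probabilities for two adaptive
# interventions: A shows an early advantage that fades by month 6.
p_a = np.array([0.70, 0.65, 0.60, 0.55, 0.52, 0.50])
p_b = np.array([0.50, 0.48, 0.48, 0.49, 0.50, 0.50])

# Repeated binary outcomes (abstinent vs. not) for each participant.
y_a = rng.random((n, months)) < p_a
y_b = rng.random((n, months)) < p_b

# Observed abstinence rate at each monthly assessment.
rate_a = y_a.mean(axis=0)
rate_b = y_b.mean(axis=0)

# Comparing only the final month hides the early difference.
print("month 1 gap:", rate_a[0] - rate_b[0])
print("month 6 gap:", rate_a[5] - rate_b[5])
```

Analyzing all six assessments reveals a sizable gap in month one that has essentially vanished by month six, mirroring the ENGAGE finding described above.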

Lead author John Dziak discussed the importance of the study. “SMART is a valuable method because conditions such as addiction, like many other health problems, are chronic and often need treatment over time. In many cases, the appropriate treatment could change depending on the individual’s experiences. SMART trials can help scientists decide which set of adaptive treatment rules will work the best. In a lot of the past SMART literature, ‘work the best’ just meant having the best expected outcome at the end of the study. But considering short-term and long-term effects together might help clinicians make better decisions to fit an individual’s goals. Also, it allows scientists to study delayed effects, where an early treatment choice affects how well later treatments work, and that could yield theoretical insight into the treatments.”


Dziak, J. J., Yap, J. R., Almirall, D., McKay, J. R., Lynch, K. G., & Nahum-Shani, I. (2019). A data analysis method for using longitudinal binary outcome data from a SMART to compare adaptive interventions. Multivariate Behavioral Research, 1-24.

Featured Article: Analyzing Data from a SMART to Prevent Alcohol Abuse Relapse

April 6, 2017:

Response to substance abuse treatment can look very different between individuals and even within individuals at different points in time. Sequential, multiple assignment, randomized trials (SMARTs) are being used to develop interventions that adapt based on individual needs and circumstances. New methods for data analysis show promise for improving intervention developers’ ability to tailor an intervention even more specifically to an individual’s needs for a broad range of health issues, including substance use. In a recent article in the journal Addiction, Methodology Center researchers Inbal (Billie) Nahum-Shani, Daniel Almirall, and their collaborators demonstrate the utility of Q-learning, a method developed in computer science, for the analysis of data from a SMART to prevent relapse among individuals with alcohol use disorders. Q-learning helped the authors identify a subset of individuals who appeared to be responding to treatment, but who needed additional treatment to maintain progress.

The authors analyzed data from 250 participants in the Extending Treatment Effectiveness of Naltrexone (ExTEND) trial (D. Oslin, P.I.; NIAAA; R01 AA014851). ExTEND was a 24-week SMART that examined how to build an adaptive intervention to prevent relapse among people with alcohol use disorders using the drug naltrexone. Naltrexone is promising for treating alcohol dependence, but there is a broad range of responses to the drug. The researchers used data from the ExTEND SMART to construct an adaptive intervention that is tailored even further based on an individual’s response to the initial treatment.

The resulting adaptive intervention recommends additional treatment for a subset of participants who, though initially classified as responders to naltrexone, are likely to benefit from a more intense maintenance intervention. Q-learning is similar to moderated regression analysis, but it is suitable for examining whether and how certain covariates are useful in developing or improving an adaptive intervention. Lead author Inbal “Billie” Nahum-Shani said, “Q-learning can help us identify new ways to tailor treatments beyond the tailoring variables we typically include in a SMART by design. The goal of this paper is to provide an accessible overview of this method to investigators in the area of substance use disorders and to demonstrate how it can help advance the science of adaptive interventions in this important field.”
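The backward-induction logic behind Q-learning can be sketched in a few lines. This is a simplified illustration under assumed conditions, not the authors’ analysis of the ExTEND data: the two-stage structure, the coding of treatments as -1/+1, and the simulated outcome model (where stage-2 treatment helps only non-responders) are all hypothetical.

```python
import numpy as np

rng = np.random.default_rng(1)
n = 500

# Hypothetical two-stage trial: a1, a2 are -1/+1 treatment options and
# s is an intermediate response status (e.g., early response to treatment).
a1 = rng.choice([-1.0, 1.0], n)
s = (rng.random(n) < 0.5).astype(float)
a2 = rng.choice([-1.0, 1.0], n)

# Simulated final outcome: stage-2 treatment helps only non-responders (s=0).
y = 1.0 + 0.5 * a1 + (1.0 - s) * a2 + rng.normal(0, 1, n)

# Stage-2 Q-function: regress y on history, treatment, and their interaction.
X2 = np.column_stack([np.ones(n), a1, s, a2, s * a2])
beta2, *_ = np.linalg.lstsq(X2, y, rcond=None)

def q2(a1v, sv, a2v):
    """Predicted stage-2 outcome for a given history and treatment."""
    return beta2 @ np.array([1.0, a1v, sv, a2v, sv * a2v])

# Optimal stage-2 value for each person: maximize over the a2 options.
v2 = np.array([max(q2(a, s_, -1.0), q2(a, s_, 1.0)) for a, s_ in zip(a1, s)])

# Stage-1 Q-function: regress the optimal future value on stage-1 treatment.
X1 = np.column_stack([np.ones(n), a1])
beta1, *_ = np.linalg.lstsq(X1, v2, rcond=None)

print("stage-2 a2 effect for non-responders:", beta2[3])
print("stage-1 a1 effect:", beta1[1])
```

The stage-2 regression plays the role of the moderated regression mentioned above: the interaction coefficient reveals that the benefit of the stage-2 treatment depends on response status, which is how the method can flag apparent responders who still need additional treatment.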

By using Q-learning with data from a SMART, researchers can build empirically validated adaptive interventions for a broad array of health problems.

Open the article. (Journal access required.)


Nahum‐Shani, I., Ertefaie, A., Lu, X. L., Lynch, K. G., McKay, J. R., Oslin, D. W., & Almirall, D. (2017). A SMART data analysis method for constructing adaptive treatment strategies for substance use disorders. Addiction.

Grant: Expanding the Methodological Toolbox for Sequential, Multiple Assignment, Randomized Trials (SMARTs)

October 31, 2016:

Over the course of treatment, a clinician often alters treatment based on patient characteristics or response to earlier treatment. Sequential, multiple assignment, randomized trial (SMART) designs provide the data needed to construct high-quality adaptive interventions. Interventions that adapt at the right times (e.g., intensifying for people who do not respond to the initial treatment) can improve participant outcomes while decreasing the cost and burden of the intervention (e.g., stepping down treatment for responsive participants). SMART designs are currently being used around the world in dozens of trials to build adaptive interventions for drug use, HIV, ADHD, autism, obesity, and more.

Last year, a team of Methodology Center researchers was awarded a grant from the National Institute on Drug Abuse (R01 DA039901) to expand the methodological toolbox available for intervention designers seeking to analyze data and plan future SMART studies.

This research will develop multilevel models to allow intervention scientists to answer new questions using longitudinal data from a SMART—for example, to compare the effect of two adaptive interventions on changes in craving or substance use over time. The research will also develop sample size calculators to facilitate the planning of SMART studies with longitudinal outcomes. Co-principal investigators Inbal (Billie) Nahum-Shani and Daniel Almirall are excited about the potential of this research to further expand the usefulness of SMART designs. Billie said, “Right now, the methods that we have for analyzing and planning sample size for SMART studies are relatively limited. For example, they allow us to compare adaptive interventions only in terms of end-of-study outcome. However, many scientists are interested in taking advantage of the longitudinal data they often collect in the course of a SMART study and using it to compare adaptive interventions in terms of trajectories of change. For example, scientists may want to study change in HIV-risk behavior during the course of the intervention program. Modeling change, rather than end-of-study outcome, provides greater statistical power and a more nuanced picture of how the adaptive intervention works. In this project we will develop the tools to allow intervention designers to conduct these analyses and plan future SMARTs with longitudinal outcomes.”
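The idea of comparing adaptive interventions in terms of trajectories of change, rather than a single end-of-study outcome, can be illustrated with a small simulation. This is a hedged sketch, not the project’s multilevel model: the group labels, assessment schedule, slopes, and noise level below are all assumed for illustration.

```python
import numpy as np

rng = np.random.default_rng(2)
n = 100
times = np.arange(6)  # e.g., six monthly assessments

# Hypothetical trajectories: both interventions start at the same level,
# but intervention A reduces risk behavior faster over time.
slope_a, slope_b = -0.6, -0.2
y_a = 5.0 + slope_a * times + rng.normal(0, 1, (n, len(times)))
y_b = 5.0 + slope_b * times + rng.normal(0, 1, (n, len(times)))

# Estimate each group's mean trajectory slope by least squares.
fit_a = np.polyfit(times, y_a.mean(axis=0), 1)
fit_b = np.polyfit(times, y_b.mean(axis=0), 1)

print("estimated slope, intervention A:", fit_a[0])
print("estimated slope, intervention B:", fit_b[0])
```

Because every assessment contributes to the slope estimate, a trajectory comparison pools information across all six time points, which is one intuition behind the greater statistical power of modeling change mentioned in the quote above.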

Other researchers on the team include Linda Collins, John Dziak, and Susan Murphy. Billie, Danny, and Susan are based at the University of Michigan, and Linda and John are at Penn State. This grant will add five more years of methodological research to the development of SMART, allowing this valuable method to be applied even more broadly.

Read more about SMART.

Eric Laber Receives Alumni Award

September 23, 2016:

Congratulations to Eric Laber, associate professor of statistics at North Carolina State University, recipient of The Methodology Center’s 2016 Distinguished Alumni Award. Eric develops methods for data-driven decision making. He applies his work in areas including precision medicine, artificial intelligence, adaptive conservation, and the management of infectious diseases.

Eric was a research assistant at The University of Michigan’s Institute for Social Research from 2008 to 2011, where he worked with Susan Murphy and was a Methodology Center trainee. During his time at Michigan, he worked on developing the sequential, multiple assignment, randomized trial (SMART) and assisted in the development of PROC Qlearn. Eric has published 34 peer-reviewed articles, nine of them first-authored. Eric is also the recipient of North Carolina State University’s 2015-16 Cavell Brownie Mentoring Award for his creative mentoring of graduate and undergraduate statistics students.

Visit Eric’s website.

Special Issue: Adaptive Interventions for Children’s Mental Health

September 8, 2016:

There are vast individual differences in youth presenting for mental health treatment. Youth vary in their initial clinical presentation; their contextual risk and protective factors; and their engagement, adherence, and response to evidence-based treatments. For this reason, adaptive interventions, which are individually tailored to each person, are valuable tools in the treatment and prevention of child and adolescent mental health (CAMH) disorders. Methodology Center Investigator Daniel Almirall co-edited a recent special issue of the Journal of Clinical Child & Adolescent Psychology that showcases recent applications and innovations of adaptive interventions for addressing CAMH disorders.

To introduce the issue, Daniel and his co-editor Andrea Chronis-Tuscano wrote an article that introduces adaptive interventions and the use of the sequential, multiple assignment, randomized trial (SMART) for the development of evidence-based adaptive interventions. The article also gives an overview of research using adaptive interventions for CAMH disorders and describes future directions for this research.

The special issue includes articles on using adaptive interventions to treat ADHD, autism spectrum disorder, depression, conduct problems and more.

Read the article.

Open the special issue.



Almirall, D., & Chronis-Tuscano, A. (2016). Adaptive interventions in child and adolescent mental health. Journal of Clinical Child & Adolescent Psychology, 45(4), 383-395.

Susan Murphy Elected to Academy of Sciences


Susan Murphy, 2013 MacArthur Fellow

May 13, 2016:

We are pleased to announce that Methodology Center Principal Investigator Susan Murphy has been elected to the National Academy of Sciences. Susan’s innovative research, particularly her development of the sequential, multiple assignment, randomized trial (SMART), and her recent work on just-in-time adaptive interventions (JITAIs) has garnered many honors and accolades, including membership in the National Academy of Medicine in 2014 and a MacArthur Foundation “genius” award in 2013.

The National Academy of Sciences was established by Congress in 1863 to provide independent advice to the government about science and technology. Susan was elected to another Academy, the National Academy of Medicine (previously known as the Institute of Medicine), in 2014. In both Academies, current members elect new members who have made distinguished research contributions.

Susan’s research focuses on developing innovative research approaches to improve the personalization of treatment. She developed SMART, an experimental design tool that allows scientists to build empirically based interventions that adapt to patient characteristics and response to treatment. Recently, Susan has begun investigating the construction of JITAIs, which use real-time data from mobile technologies to deliver personalized behavioral interventions exactly when they are needed.

Susan’s work has impacted the research of countless scientists and interventionists. SMARTs are being used to address a broad range of topics including cocaine abuse, depression, problem drinking, obesity, ADHD, and autism. Dozens of SMARTs have been funded by the National Institutes of Health, and multiple NIH program announcements specifically request SMART designs.

Susan is Herbert E. Robbins Distinguished University Professor of statistics, research professor at the Institute for Social Research, and professor of psychiatry at the University of Michigan. She is a fellow of the College of Problems of Drug Dependence, the MacArthur Foundation, the Center for Advanced Study in the Behavioral Sciences at Stanford University, the American Statistical Association, and the Institute of Mathematical Statistics. She is an elected member of the International Statistical Institute and from 2007-2009 served as co-editor of The Annals of Statistics. She has been a principal investigator in The Methodology Center since 1995.

Read more about Susan’s research.