
Principal Investigators’ New Websites

May 6, 2020:

As previously announced, substantial changes lie ahead for The Methodology Center. Over the coming months, we will stop updating this website. Resources will remain available for at least a year, but in order to provide the latest information, each of our investigators will maintain her or his own website. These sites will include content developed at The Methodology Center and new resources related to the researcher’s future work.

Stephanie Lanza and Ashley Linden-Carmichael built a website for the Addictions and Innovative Methods (AIM) lab. Their great new site describes their research and includes the content about time-varying effect modeling (TVEM) from The Methodology Center’s website.

Susan Murphy has incorporated The Methodology Center’s content on just-in-time adaptive interventions into her website. The site also includes workshop materials and other resources.

Bethany Bray’s new site will include The Methodology Center’s resources for latent class analysis (LCA) and latent transition analysis (LTA). Bethany has concrete plans for new LCA and LTA resources, so stay tuned.

Runze Li will update his page to incorporate Methodology Center resources on variable screening and variable selection for high-dimensional data analysis.

Linda Collins will build a new website to house The Methodology Center’s content on the multiphase optimization strategy (MOST) for optimizing interventions after she moves to New York University. In the meantime, follow Linda on Twitter, @collins_most.

Daniel Almirall and Inbal “Billie” Nahum-Shani’s informative website will soon incorporate The Methodology Center’s resources for the sequential, multiple assignment, randomized trial (SMART).

More information will follow in June or July. Thank you for staying connected to our research! We are all proud of our time at The Methodology Center and very excited about the future.

Apply Now: Summer Institute on Just-in-Time Adaptive Interventions

January 23, 2020:

Apply now to attend this year’s Summer Institute on Innovative Methods, “Building effective just-in-time adaptive interventions using micro-randomized trial designs.” Susan Murphy, professor of statistics and computer science and Radcliffe Alumnae Professor at Harvard University, and Daniel Almirall, research associate professor at the University of Michigan’s Survey Research Center, will introduce the just-in-time adaptive intervention (JITAI) and the micro-randomized trial (MRT) for the development of adaptive mobile health interventions. The Institute will be held July 23–24 in Bethesda, Maryland.

JITAIs are a special type of adaptive intervention in which—thanks to mobile technology like activity sensors and smartphones—an intervention can be delivered when and where it is needed. MRTs are a new trial design for addressing scientific questions concerning the construction of highly effective JITAIs. In this workshop, we will introduce JITAIs and provide examples of key scientific questions that can be answered using MRTs. Useful primary aim data analysis methods for MRTs will also be discussed.

Day 1 and part of Day 2 of this workshop will focus on JITAI and MRT design considerations and applications. Much of Day 2 will be allotted to understanding primary aims in an MRT and conducting associated primary aim analyses.

The 2020 Summer Institute on Innovative Methods is hosted as a partnership between The Methodology Center at Penn State and the Center for Dissemination and Implementation Science at the University of Illinois at Chicago.

Read more or apply to attend.

POSTPONED: 2020 Summer Institute on Just-in-Time Adaptive Interventions

Topic: Building Effective Just-in-Time Adaptive Interventions Using Micro-Randomized Trial Designs

Presenters: Susan Murphy and Daniel Almirall

Date: RESCHEDULED to June 28-29, 2021

Venue: Hyatt Regency Bethesda in Bethesda, MD


For updated application information, please visit UIC’s website about the training.


Workshop information

A just-in-time adaptive intervention (JITAI) is an emerging mobile health intervention design aiming to provide support “just-in-time”, namely, whenever and wherever support is needed. A JITAI does this via adaptation. The JITAI employs wearable sensors and other approaches to data collection to monitor ongoing information on the dynamics of an individual’s emotional, social, physical and contextual states. The adaptation occurs when this information is used to individualize the type and delivery timing of support. The adaptation in a JITAI is intended to ensure that the right type of support is provided whenever the person is (a) vulnerable and/or is in a state of opportunity, and (b) receptive, namely, able and willing to receive, process and utilize the support provided.
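The vulnerability-and-receptivity logic described above can be written out as a short decision rule. This is only an illustrative sketch, not code from the workshop: the sensor variables, the stress threshold, and the prompt text are all invented for the example.

```python
# Sketch of a JITAI decision rule. At each decision point, sensor data
# determine whether the person is vulnerable and receptive; support is
# delivered only when both conditions hold.
from dataclasses import dataclass

@dataclass
class SensorState:
    stress_level: float   # e.g., from a wearable sensor, scaled to 0-1
    is_driving: bool      # inferred from phone GPS/accelerometer
    screen_active: bool   # crude proxy for availability

def is_vulnerable(state: SensorState, threshold: float = 0.7) -> bool:
    """Vulnerability: momentary stress above a prespecified threshold."""
    return state.stress_level >= threshold

def is_receptive(state: SensorState) -> bool:
    """Receptivity: able and willing to receive and process support now."""
    return state.screen_active and not state.is_driving

def jitai_decision(state: SensorState) -> str:
    """Deliver support only when the person is vulnerable AND receptive."""
    if is_vulnerable(state) and is_receptive(state):
        return "send stress-management prompt"
    return "no intervention"

print(jitai_decision(SensorState(stress_level=0.9, is_driving=False,
                                 screen_active=True)))
# "send stress-management prompt"
```

Note that the rule withholds support from a stressed person who is driving: high vulnerability alone is not enough if the person cannot act on the prompt.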

We will introduce the micro-randomized trial (MRT), a new trial design useful for addressing scientific questions concerning the construction of highly effective JITAIs. We will provide an introduction to JITAIs, as well as examples of key scientific questions that need to be addressed in the development of these interventions. We will discuss MRTs and how they can be used to answer these scientific questions. Useful primary aim data analysis methods for MRTs will also be discussed.
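The defining feature of an MRT, repeated randomization at every decision point, can be shown with a small simulation. This is a sketch on invented numbers, not a workshop exercise: the sample sizes, randomization probability, and effect size are all hypothetical, and the "primary aim" estimator is deliberately crude (a raw mean difference rather than the weighted regression methods used in practice).

```python
# Minimal micro-randomized trial (MRT) sketch on simulated data: each
# participant is independently re-randomized at every decision point, and a
# crude contrast compares proximal outcomes at prompted vs. unprompted times.
import random

def simulate_mrt(n_participants=50, n_points=30, p_treat=0.5,
                 effect=0.3, seed=1):
    rng = random.Random(seed)
    records = []  # (participant, decision_point, treated, proximal_outcome)
    for pid in range(n_participants):
        for t in range(n_points):
            treated = int(rng.random() < p_treat)   # micro-randomization
            outcome = effect * treated + rng.gauss(0, 1)
            records.append((pid, t, treated, outcome))
    return records

def marginal_effect(records):
    """Crude contrast: mean outcome at treated vs. untreated occasions."""
    treated = [y for _, _, a, y in records if a == 1]
    control = [y for _, _, a, y in records if a == 0]
    return sum(treated) / len(treated) - sum(control) / len(control)

print(round(marginal_effect(simulate_mrt()), 2))  # close to the true 0.3
```

With 50 participants and 30 decision points each, the trial yields 1,500 randomized occasions, which is why MRTs can detect proximal effects with modest numbers of participants.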

The emphasis of this workshop is on JITAI and MRT design considerations and applications. Most of Day 1 and some of Day 2 focus on this. On Day 2, much of the time will be allotted to understanding primary aims in an MRT and conducting associated primary aim analyses.


The prerequisites for this workshop are (1) familiarity with the basic principles of experimental (e.g., randomized trial) design, and (2) graduate-level statistics training for the behavioral, management, social or health sciences up through linear regression (usually two semesters of course work).  Basic familiarity with the R programming language is necessary for participation in the computer exercises.


Participants will be provided with a hard copy of all lecture notes, select computer exercises, and output.  Three different formats will be used. First, all materials will be presented following the standard didactic format with a slideshow.  Second, there will be practice exercises (i.e., practicums) designed to help participants connect the material with their own research area.  These practicums are aimed at helping investigators learn how to implement an MRT and helping to prepare participants to write a grant proposal that uses an MRT design to build a JITAI.  Third, there will be computer exercises using R on Day 2 of the workshop. Computer code and simulated data examples will be supplied by the instructors. The computer exercises will help investigators learn how to conduct and interpret the results of typical primary and secondary analyses. Throughout the workshop, ample time will be set aside for Q&A and discussion about how the concepts learned in class can be applied in participants’ research.

Computer requirements

Participants are strongly encouraged to bring a laptop so that they can participate in the computer exercises. To conduct analyses at the workshop, you must install the latest version of R on your laptop prior to arrival.

We cannot provide IT support (e.g., R installation, troubleshooting errors running R) at the workshop. However, we expect that even if you do experience some difficulty with R (or other software trouble with your laptop), you will still be able to appreciate and learn from the computer portions of the workshop.

Topics covered

  • Just-in-time adaptive intervention (JITAI)
  • Micro-randomized trial (MRT) design principles
  • Primary and secondary scientific aims in an MRT
  • Primary aim analysis of MRTs
  • MRT case studies
  • Related optimization trial designs

Return to top

How to attend

Enrollment is limited to 40 participants to maintain an informal atmosphere and to encourage interaction among the presenters and participants. We will give priority to individuals who are involved in drug abuse prevention and treatment research or HIV research, who have the appropriate statistical background to get the most out of the Institute, and for whom the topic is directly and immediately relevant to their current work. We also aim to maximize geographic and minority representation.

Applications to the 2020 Summer Institute were due by 5 p.m. Eastern Time, Monday, March 2, 2020. Applicants will be notified about decisions by Friday, April 3, 2020.

Once accepted, participants will be emailed instructions about how to register. The registration fee of $395 for the two-day Institute will cover all instruction, program materials, and a reception the first evening of the Institute. A block of rooms at the Hyatt Regency Bethesda will be available for lodging.

Participants are encouraged to bring their own laptop computers for conducting exercises.


Review our refund, access, and cancellation policy.

Return to top


Susan Murphy, Ph.D.

Susan Murphy is Professor of Statistics and Computer Science and Radcliffe Alumnae Professor at Harvard University.

Dr. Murphy’s lab develops data analysis methods and experimental designs to improve real-time sequential decision-making in mobile health. In particular, her lab develops algorithms, deployed on wearable devices, to deliver and continually optimize individually tailored treatments. She developed the micro-randomized trial for use in constructing mobile health interventions; this trial design is in use across a broad range of health-related areas. In these trials, each participant can be randomized or re-randomized hundreds of times. Browse a list of micro-randomized trials that are completed or are in the field.

Dr. Murphy is a member of the National Academy of Sciences and of the National Academy of Medicine, both of the U.S. National Academies.  In 2013 she was awarded a MacArthur Fellowship for her work on experimental designs to inform sequential decision making.

Daniel Almirall, Ph.D.

Daniel Almirall is Research Associate Professor at the University of Michigan’s Survey Research Center.

Dr. Almirall is a statistician who develops methods to form evidence-based adaptive interventions. Adaptive interventions can be used to inform individualized intervention guidelines for the ongoing management of chronic illnesses or disorders such as drug abuse, depression, anxiety, autism, obesity, or HIV/AIDS. More recently, Dr. Almirall has been interested in methods to form related adaptive implementation interventions and just-in-time adaptive interventions in mobile health. His work includes the development of approaches related to the design, execution, and analysis of sequential multiple assignment randomized trials (SMARTs) and micro-randomized trials (MRTs). He is particularly interested in applications in child and adolescent mental health research.

Return to top


Hyatt Regency Bethesda in Bethesda, MD

Return to top


Funding for this conference was made possible by award number R13 DA020334 from the National Institute on Drug Abuse. The views expressed in written conference materials or publications and by speakers and moderators do not necessarily reflect the official views and/or policies of the Department of Health and Human Services; nor does mention of trade names, commercial practices, or organizations imply endorsement by the U.S. Government.

Return to top


  • 2019 – Variability in Intensive Longitudinal Data: Mixed-Effects Location Scale Modeling by Donald Hedeker
  • 2018 – Analysis of Ecological Momentary Assessment Data by Stephanie T. Lanza and Michael Russell
  • 2017 – Statistical Power Analysis for Intensive Longitudinal Studies by Jean-Philippe Laurenceau and Niall Bolger
  • 2016 – Ecological Momentary Assessment (EMA): Investigating Biopsychosocial Processes in Context by Joshua Smyth, Kristin Heron, and Michael Russell
  • 2015 – An Introduction to Time-Varying Effect Modeling by Stephanie T. Lanza and Sara Vasilenko
  • 2014 – Experimental Design and Analysis Methods for Developing Adaptive Interventions: Getting SMART by Daniel Almirall and Inbal Nahum-Shani
  • 2013 – Introduction to Latent Class Analysis by Stephanie Lanza and Bethany Bray
  • 2012 – Causal Inference by Donna Coffman
  • 2011 – The Multiphase Optimization Strategy (MOST) by Linda Collins
  • 2010 – Analysis of Longitudinal Dyadic Data by Niall Bolger and Jean-Philippe Laurenceau
  • 2009 – Latent Class and Latent Transition Analysis by Linda Collins and Stephanie Lanza
  • 2008 – Statistical Mediation Analysis by David MacKinnon
  • 2007 – Mixed Models and Practical Tools for Causal Inference by Donald Hedeker and Joseph Schafer
  • 2006 – Causal Inference by Christopher Winship and Felix Elwert
  • 2005 – Survival Analysis by Paul Allison
  • 2004 – Analyzing Developmental Trajectories by Daniel Nagin
  • 2003 – Modeling Change and Event Occurrence by Judith Singer and John Willett
  • 2002 – Missing Data by Joseph Schafer
  • 2001 – Longitudinal Modeling with MPlus by Bengt Muthén and Linda Muthén
  • 2000 – Integrating Design and Analysis and Mixed-Effect Models by Richard Campbell, Paras Mehta, and Donald Hedeker
  • 1999 – Structural Equation Modeling by John McArdle
  • 1998 – Categorical Data Analysis by David Rindskopf and Linda Collins
  • 1997 – Hierarchical Linear Models and Missing Data Analysis by Stephen Raudenbush and Joseph Schafer
  • 1996 – Analysis of Stage Sequential Development by Linda Collins, Peter Molenaar, and Han van der Maas

Are Adaptive Interventions Bayesian?

I love the idea of adaptive behavioral interventions. But, I keep hearing about adaptive designs and how they are Bayesian. How can an adaptive behavioral intervention be Bayesian? — Signed, Adaptively Confused, Determined to Continue


Dear AC, DC:

Actually, adaptive behavioral interventions are not Bayesian, but your confusion is understandable, because the same words have been used to refer to different concepts.

Let’s start with the word “adaptive.” This word, which is used in many fields, refers broadly to anything that changes in a principled, systematic way in response to specific circumstances in order to produce a desired outcome. An adaptive intervention is a set of tailoring variables and decision rules whereby an intervention is changed in response to characteristics of the individual program participant or the environment in order to produce a better outcome for each participant (Collins et al., 2004). For example, in an adaptive intervention for treating alcoholism, an individual participant’s alcohol intake may be monitored and reviewed periodically. Here alcohol intake is the tailoring variable. The decision rule might be as follows: if the individual has no more than two drinks per week for six weeks, frequency of clinic visits will be reduced. On the other hand, if the individual has more than two drinks per week, additional clinic visits will be required, plus a pharmaceutical will be added to the treatment regimen.
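The alcohol-intake decision rule in the paragraph above is simple enough to express directly in code. This sketch uses the thresholds from the text (no more than two drinks per week, monitored over six weeks); the exact return strings are invented labels.

```python
# The tailoring-variable decision rule from the text, written as a function.
# Input: the participant's drinks per week over the six-week monitoring period.
def adaptive_decision(weekly_drinks):
    """Step down for responders; step up treatment for non-responders."""
    if all(drinks <= 2 for drinks in weekly_drinks):
        return "reduce frequency of clinic visits"
    return "add clinic visits and pharmacotherapy"

print(adaptive_decision([0, 1, 2, 1, 0, 2]))  # responder: step down
print(adaptive_decision([0, 4, 1, 0, 3, 2]))  # non-responder: step up
```

Writing the rule this way makes the point that an adaptive intervention is deterministic given the tailoring variable: the same monitoring data always produce the same treatment decision, with no Bayesian updating involved.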

Now let’s discuss the word “design.” In intervention science, the term intervention design refers to the specifics of a behavioral intervention: the factors included in the prevention or treatment approach, such as whether the intervention is delivered in a group or individual setting, or the intensity level of the intervention. In contrast, the term research design refers to how an empirical study is set up. For example, in an experimental research design there is random assignment to conditions; in a longitudinal research design, measures are taken over several different time points.

Confusion may arise when the words “adaptive” and “design” are paired without specifying which sense of the word “design” is meant. In intervention science, an adaptive intervention design is the approach used in a particular adaptive intervention. In methodology, an adaptive research design (Berry et al., 2011) is a Bayesian approach to randomized clinical trials (RCTs), in which the research design may be altered during the course of a clinical trial based on information gathered during the trial. For example, an adaptive clinical trial may be halted before the entire planned sample has been collected if the results are judged to be so clear that additional information would be unlikely to affect decision making. Adaptive intervention designs are not experimental designs, and therefore cannot be Bayesian.
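To make the contrast concrete, here is a toy version of the Bayesian interim-stopping idea for a two-arm trial with binary outcomes. Everything here is hypothetical: the Beta(1, 1) priors, the interim success counts, and the 0.99 efficacy cutoff are invented for illustration, and real adaptive trials use far more careful designs (Berry et al., 2011).

```python
# Toy Bayesian interim analysis: with Beta(1, 1) priors on each arm's success
# probability, estimate the posterior probability that treatment beats control
# by Monte Carlo, and stop early if it exceeds a prespecified cutoff.
import random

def prob_treatment_better(s_t, n_t, s_c, n_c, draws=20000, seed=0):
    """s_* = successes, n_* = patients so far on treatment (t) / control (c)."""
    rng = random.Random(seed)
    wins = 0
    for _ in range(draws):
        p_t = rng.betavariate(1 + s_t, 1 + n_t - s_t)  # treatment posterior
        p_c = rng.betavariate(1 + s_c, 1 + n_c - s_c)  # control posterior
        wins += p_t > p_c
    return wins / draws

# Hypothetical interim look: 45/60 successes on treatment vs. 25/60 on control.
p = prob_treatment_better(45, 60, 25, 60)
print(p > 0.99)  # True: under this rule the trial could stop early for efficacy
```

The key point for AC, DC is that the adaptation here operates on the *research design* (whether to keep enrolling), not on any individual participant's treatment, which is exactly the distinction drawn above.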

You might think that building or evaluating an adaptive behavioral intervention would require the use of an adaptive experimental design, but this is generally not the case. Investigators who want to build an adaptive intervention—that is, conduct an experiment to decide on the best tailoring variables and/or decision rules—probably want to consider a sequential multiple assignment randomized trial (SMART; Murphy et al., 2007). Note that the SMART was developed especially for building adaptive behavioral interventions, but it is not an adaptive experimental design.
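The sequence of randomizations in a simple two-stage SMART can be sketched as follows. The treatment labels, the 40% response rate, and the rule that only non-responders are re-randomized are all hypothetical choices for the example; actual SMARTs vary in which groups are re-randomized.

```python
# Sketch of one participant's path through a hypothetical two-stage SMART:
# an initial randomization, a response check, and a second randomization
# for non-responders only.
import random

def smart_participant(rng):
    stage1 = rng.choice(["medication", "behavioral therapy"])  # 1st randomization
    responder = rng.random() < 0.4                             # hypothetical rate
    if responder:
        stage2 = "continue " + stage1       # responders are not re-randomized
    else:
        stage2 = rng.choice(["intensify " + stage1,            # 2nd randomization
                             "augment " + stage1])
    return stage1, responder, stage2

rng = random.Random(7)
for _ in range(3):
    print(smart_participant(rng))
```

Note that nothing in this design is altered by accumulating trial data: the randomization probabilities are fixed in advance, which is why a SMART is not an adaptive experimental design even though it is built for developing adaptive interventions.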

Investigators who have already selected the tailoring variables and decision rules and want to evaluate the adaptive intervention probably want to conduct an RCT. The RCT could potentially involve an adaptive experimental design, but it could also be a standard RCT.

I hope this helps to clear things up! Reading the references listed below may help. And, AC, DC: The next time you hear the term “design,” if you are not sure whether the speaker is talking about intervention design or experimental design, be sure to ask for clarification.


Berry, S. M., Carlin, B. P., Lee, J. J., & Muller, P. (2011). Bayesian adaptive methods for clinical trials. Boca Raton, FL: CRC Press.

Collins, L. M., Murphy, S. A., & Bierman, K. (2004). A conceptual framework for adaptive preventive interventions. Prevention Science, 5(3), 185-196.

Murphy, S. A., Lynch, K. G., McKay, J. R., Oslin, D., & Ten Have, T. (2007). Developing adaptive treatment strategies in substance abuse research. Drug and Alcohol Dependence, 88(2), S24-S30.

Analyzing EMA Data

I designed a study to assess 50 college students’ motivations to use alcohol, and the correlates of those motivations, during their first semester. The most innovative part of this study was that I collected data with smartphones that beeped at several random times every Thursday, Friday, and Saturday throughout the semester. Now that I’ve collected the data, I’m overwhelmed by how rich the data are and don’t know where to start! My first thought is to collapse the data to weekly summary scores and model those using growth curve analysis. Is there anything more I can do with the data? — Signed, Swimming in Data


Dear Swimming:

You did indeed collect an amazing dataset! With technological advances, the collection of intensive longitudinal data, such as ecological momentary assessments (EMA), is becoming popular among researchers hoping to better understand dynamic processes related to mood, cigarette or alcohol use, physical activity, and many other states or behaviors. The most compelling research questions in these studies often have to do with the effects of time-varying predictors.

One familiar way to approach the analysis of EMA data is to reduce the data, summarizing within-day or within-week assessments to a single measure, so that growth curve models may be fit to estimate an average trend and predictors of the intercept and slope. However, this approach disregards the richness of the data that were so carefully collected. Further, EMA studies are typically designed in order to capture something more dynamic than what could be captured as a linear function of time.

A more common approach to the analysis of EMA data is to fit multilevel models, in which within- and between-individual variability can be separated. This approach is helpful for understanding, for example, the degree of stability of processes. However, these methods typically impose important constraints, such as the assumption that the effects of covariates on an outcome are stable over time.

New methods for the analysis of intensive longitudinal data have been proposed in the statistical literature and hold immense promise for addressing important questions about dynamic processes, such as the factors driving alcohol use during the freshman year of college. For example, the time-varying effect model (TVEM) is a flexible approach that allows the effects of covariates to vary with time. A detailed introduction to time-varying effect models for audiences in psychological science appears in Tan, Shiyko, Li, Li, and Dierker (2012).
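The core idea of a time-varying effect can be illustrated with a small simulation. This is only a sketch of the concept, not the %TVEM macro (which estimates a smooth coefficient function via splines rather than the crude time-binning used here), and all of the numbers are invented.

```python
# Sketch of the TVEM idea on simulated EMA-style data: the effect of a
# momentary covariate x on outcome y changes over time, and we recover it
# crudely by fitting a separate least-squares slope within each time bin.
import random

rng = random.Random(0)
data = []  # (time, covariate x, outcome y)
for _ in range(6000):
    t = rng.random()                 # assessment time, scaled to [0, 1]
    x = rng.gauss(0, 1)              # momentary covariate (e.g., stress)
    true_effect = 1.0 - 2.0 * t      # the effect declines across the semester
    y = 0.5 + true_effect * x + rng.gauss(0, 0.5)
    data.append((t, x, y))

def effect_in_bin(data, lo, hi):
    """Least-squares slope of y on x among observations with lo <= t < hi."""
    pts = [(x, y) for t, x, y in data if lo <= t < hi]
    mean_x = sum(x for x, _ in pts) / len(pts)
    mean_y = sum(y for _, y in pts) / len(pts)
    cov = sum((x - mean_x) * (y - mean_y) for x, y in pts)
    var = sum((x - mean_x) ** 2 for x, _ in pts)
    return cov / var

for lo in (0.0, 0.25, 0.5, 0.75):
    est = effect_in_bin(data, lo, lo + 0.25)
    print(f"t in [{lo:.2f}, {lo + 0.25:.2f}): estimated effect = {est:+.2f}")
```

The estimated slope swings from strongly positive early in the semester to strongly negative at the end, the kind of dynamic that a single growth-curve slope or a constant-coefficient multilevel model would average away.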

A demonstration of this approach appears in an article by Methodology Center researchers and colleagues (Shiyko et al., 2012). The authors analyzed data collected as part of a smoking-cessation trial and found that individuals with a successful quit attempt experienced a rapid decrease in craving within the first few days of quitting, whereas those who eventually relapsed did not. Among eventual relapsers, low confidence in the ability to abstain was strongly associated with cravings early in the quit attempt; among successful quitters, this association was significantly weaker.

Any researcher with access to EMA data can fit a TVEM using the %TVEM SAS macro, which is freely available at /downloads/tvem. Give it a try so that you can explore the time-varying effects of individual factors, such as residing in a dorm, and contextual factors, such as excitement about an upcoming sporting event, on motivations to use alcohol.


Shiyko, M. P., Lanza, S. T., Tan, X., Li, R., & Shiffman, S. (2012). Using the time-varying effect model (TVEM) to examine dynamic associations between negative affect and self-confidence on smoking urges: Differences between successful quitters and relapsers. Prevention Science. PMCID: PMC3171604

Tan, X., Shiyko, M. P., Li, R., Li, Y., & Dierker, L. (2012). A time-varying effect model for intensive longitudinal data. Psychological Methods, 17(1), 61-77. PMCID: PMC3288551

Adolescent Substance Abuse: Progressive Treatment for Adolescents Who Use Drugs

Because the history of adolescent substance abuse interventions shows that individuals respond differently to treatment, this study uses a pair of SMART designs to examine when and how to treat adolescent drug users.
  • PI: Holly Barrett Waldron
  • Location: Oregon Research Institute
  • Funding: NIDA-funded, completed project

Improving Mental Health Outcomes: Building an Adaptive Implementation Strategy

This SMART is cluster-randomized. Randomization occurs at the clinic level. The aim of the study is to develop an adaptive quality improvement strategy designed to enhance the implementation of an evidence-based mental health intervention. Outcomes are measured at the patient level.
  • PI: Amy Kilbourne
  • Location: University of Michigan
  • Funding: NIMH Project R01MH099898

Adaptive Treatment for Smoking Among People With HIV

Between 50% and 70% of people living with HIV are nicotine dependent. This SMART examines how and when to apply contingency management and standard treatment to promote smoking cessation in this population.
  • PI: David Ledgerwood
  • Location: Wayne State University
  • Funding: NIDA Project R01DA034537

Characterizing Cognition in Nonverbal Individuals With Autism

In order to develop communication skills among school-aged children who are nonverbal, this project employs a SMART design to test a novel intervention. The intervention includes components that focus on spoken language and the use of a speech-generating device (e.g., iPad). The SMART design provides the data needed to define response and nonresponse to the intervention and identify the best treatment sequence.


  • PI: Connie Kasari
  • Location: Center for Autism Research and Treatment, University of California, Los Angeles
  • Funding: Funded by Autism Speaks

Adaptive Treatment for Bipolar Disorder

Patients suffering from bipolar disorder are assigned to one of two mood stabilizers. A SMART design is used to determine the appropriate treatment for patients who develop depression.
  • PIs: Charles Lee Bowden, Joseph Calabrese
  • Locations: University of Texas Health Science Center at San Antonio; Case Western Reserve University Medical Center
  • Funding: NIMH Project P30MH086045

Adaptive Treatment for Persistent Insomnia

This project aims to develop an adaptive intervention for persistent insomnia. Researchers are using a SMART to determine the best sequencing of cognitive behavioral therapy and medication for persistent insomnia.
  • PI: Charles Morin
  • Location: Laval University
  • Funding: NIMH Project R01MH091053

Adaptive Treatment Strategies for Children and Adolescents With Obsessive-Compulsive Disorder (OCD)

For youth with OCD, the most common treatments are cognitive-behavioral therapy (CBT), pharmacological treatment, or both. Up to 30% of patients may not benefit from their initial treatments. Researchers will employ a SMART to determine the optimal treatment sequence for participants, depending on whether they respond to their initial treatment.
  • PI: Roseli Shavitt
  • Location: University of Sao Paulo

Adaptive Treatment for Adolescent Obesity

This project targets African American adolescents with obesity and their parents. SMARTs are used to develop an adaptive intervention that increases skills in changing dietary, exercise, and sedentary behaviors.
  • PI: Sylvie Naar-King
  • Location: Wayne State University

Adaptive Approach to Naltrexone Treatment for Alcoholism

Naltrexone (NTX) is an opioid receptor antagonist used to prevent alcoholism relapse. This trial examines how to define “non-response” to treatment with NTX and what treatments are most effective for those who do or do not respond to the initial treatment. More details about this study can be found on the SMART Example page.

  • PI: David Oslin
  • Location: University of Pennsylvania
  • Funding: NIAAA Project R01AA017164

Adaptive Treatment for Cocaine Dependence

A SMART design is being implemented to develop an adaptive intervention to increase treatment engagement and decrease cocaine use for patients who are cocaine dependent. The study also examines whether patient choice of care affects patient outcomes.

  • PI: James R. McKay
  • Location: University of Pennsylvania
  • Funding: NIDA Project P01AA016821

Adaptive Interventions for Children with ADHD

The aim of this SMART is to understand whether to begin with medication or behavioral therapy for children with ADHD, and whether to intensify or augment initial treatment for children who do not respond to treatment.

  • PI: William Pelham
  • Location: Florida International University
  • Funding: U.S. Department of Education-funded, completed project