The relative efficiency of time-to-threshold and rate of change in longitudinal data

M. C. Donohue, A. C. Gamst, R. G. Thomas, R. Xu, L. Beckett, R. C. Petersen, M. W. Weiner, P. Aisen

Research output: Contribution to journal › Article › peer-review

19 Scopus citations

Abstract

Randomized, placebo-controlled trials often use time-to-event as the primary endpoint, even when a continuous measure of disease severity is available. We compare the power to detect a treatment effect using either rate of change, as estimated by linear models of longitudinal continuous data, or time-to-event estimated by Cox proportional hazards models. We propose an analytic inflation factor for comparing the two types of analyses assuming that the time-to-event can be expressed as a time-to-threshold of the continuous measure. We conduct simulations based on a publicly available Alzheimer's disease data set in which the time-to-event is algorithmically defined based on a battery of assessments. A Cox proportional hazards model of the time-to-event endpoint is compared to a linear model of a single assessment from the battery. The simulations also explore the impact of baseline covariates in either analysis.
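The comparison described in the abstract can be illustrated with a minimal Monte Carlo sketch. Note the simplifications: the paper uses linear mixed/marginal models and Cox proportional hazards, whereas this sketch substitutes a two-sample t-test on per-subject OLS slopes for the rate-of-change analysis, and a Mann-Whitney test on threshold-crossing times for the time-to-event analysis. The visit schedule, threshold, slope distributions, and effect size below are all illustrative assumptions, not values from the paper.

```python
# Hedged sketch: power of a slope-based (rate-of-change) analysis vs. a
# time-to-threshold analysis on simulated longitudinal data. All numeric
# parameters are assumptions chosen for illustration only.
import numpy as np
from scipy import stats

rng = np.random.default_rng(0)
times = np.array([0.0, 0.5, 1.0, 1.5, 2.0])  # visit schedule in years (assumed)
threshold = 1.5                               # decline defining the "event" (assumed)

def simulate_arm(n, mean_slope, sd_slope=0.5, sd_noise=0.5):
    """Simulate n subjects with y_ij = b_i * t_j + noise, random slopes b_i.

    Returns per-subject OLS slope estimates and the first visit time at
    which the continuous measure crosses the threshold (censored at the
    last visit, recorded as the last visit time for simplicity).
    """
    slopes = rng.normal(mean_slope, sd_slope, size=n)
    y = slopes[:, None] * times[None, :] + rng.normal(0, sd_noise, (n, len(times)))
    tc = times - times.mean()
    est = (y * tc).sum(axis=1) / (tc ** 2).sum()   # per-subject OLS slope
    crossed = y >= threshold
    t_event = np.where(crossed.any(axis=1),
                       times[np.argmax(crossed, axis=1)],  # first crossing visit
                       times[-1])                          # censored at last visit
    return est, t_event

def power(n=100, effect=0.5, reps=500, alpha=0.05):
    """Monte Carlo power of the two analyses for a slope-reducing treatment."""
    hits_slope = hits_time = 0
    for _ in range(reps):
        s0, t0 = simulate_arm(n, mean_slope=1.0)           # placebo: faster decline
        s1, t1 = simulate_arm(n, mean_slope=1.0 - effect)  # treated: slower decline
        if stats.ttest_ind(s0, s1).pvalue < alpha:
            hits_slope += 1
        if stats.mannwhitneyu(t0, t1).pvalue < alpha:
            hits_time += 1
    return hits_slope / reps, hits_time / reps
```

Comparing the two power estimates, e.g. `power(n=80, effect=0.5)`, gives a rough empirical sense of the relative efficiency the paper quantifies with its analytic inflation factor: the time-to-threshold analysis discards information in the continuous measure (here, everything except the crossing visit), which typically costs power.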

Original language: English (US)
Pages (from-to): 685-693
Number of pages: 9
Journal: Contemporary Clinical Trials
Volume: 32
Issue number: 5
DOIs
State: Published - Sep 2011

Keywords

  • Linear mixed models
  • Longitudinal data
  • Marginal linear models
  • Power
  • Survival analysis

ASJC Scopus subject areas

  • Pharmacology (medical)
