Acute lymphoblastic leukemia (ALL) is the most common pediatric cancer, and advances in its clinical and laboratory biology have grown exponentially over the last few decades. Treatment outcomes have improved steadily, with over 90% of patients surviving 5 years from initial diagnosis. This success can be attributed in part to the development of a risk-stratification approach to identify those subsets of patients with an outstanding outcome who might qualify for a reduction in therapy, with fewer short- and long-term side effects. Likewise, recognition of patients with an inferior prognosis allows for augmentation of
therapy, which has been shown to improve outcome. Among the clinical and biological variables known to impact prognosis, the kinetics of the reduction in tumor burden during initial therapy has emerged as the most important prognostic factor. Specifically, various methods have been used to detect
minimal residual disease (MRD), with flow cytometric analysis and molecular detection of antigen receptor gene rearrangements being the most common. However, many questions remain as to the optimal timing of these assays, their sensitivity, their integration with other variables, and their role in treatment allocation across ALL subgroups. Importantly, the emergence of next-generation sequencing assays is likely to broaden their use in tracking disease evolution. This review will discuss the biological basis for using MRD in risk assessment, the technical approaches to and limitations of MRD detection, and its emerging applications.