Department of Biostatistics

THESIS DEFENSE ABSTRACT

Complex Distributions, Hmmmm... Hierarchical Mixtures of Marginalized Multilevel Models

 Michael Griswold, PhD Candidate, Johns Hopkins Department of Biostatistics

"Complex Distributions" exhibit characteristics such as skewness, multiple modes, and point masses. Conventional models that do not account for these complexities fail to describe the data well, possibly leading to biased parameter estimates and/or erroneous inferences. Health care expenditure data is a common example where complex distributions can complicate analyses. The primary approach we take is to decompose a complex distribution into simpler, sub-component distributions, a technique known as mixture modelling.

Clustered observations exhibiting complex distributional characteristics offer additional challenges for understanding the systems generating these features. We develop methods to visualize clustered responses arising from mixtures of multiple underlying components and apply our methods to a recently proposed joint model that ties clustered categorical and continuous responses together via random effects.
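
As a loose illustration of the structure such a joint model induces (the specific model referenced in the thesis is not reproduced here), the following sketch simulates clustered binary and continuous responses linked by a shared random intercept; the cross-response association visible in cluster-level summaries is the kind of feature the visualization methods aim to expose. Names and parameter values are assumptions.

    # Minimal sketch (assumed, generic form; not the specific joint model in the
    # thesis): clustered binary and continuous responses tied together by a shared
    # cluster-level random intercept b_i.
    import numpy as np

    rng = np.random.default_rng(1)
    n_clusters, n_per = 200, 10

    b = rng.normal(0.0, 1.0, n_clusters)               # cluster random effects
    cluster = np.repeat(np.arange(n_clusters), n_per)
    x = rng.normal(size=n_clusters * n_per)             # a covariate

    # Continuous response: linear mixed model sharing b_i
    y_cont = 1.0 + 0.5 * x + b[cluster] + rng.normal(0, 1, x.size)
    # Binary response: logistic model sharing the same b_i (scaled by lam)
    lam = 0.8
    p = 1 / (1 + np.exp(-(-0.5 + 0.3 * x + lam * b[cluster])))
    y_bin = rng.binomial(1, p)

    # Cluster-level summaries one might plot: mean continuous response vs.
    # proportion of binary "successes"; they co-vary through the shared b_i.
    cont_mean = np.bincount(cluster, weights=y_cont) / n_per
    bin_prop = np.bincount(cluster, weights=y_bin) / n_per
    print(np.corrcoef(cont_mean, bin_prop)[0, 1])       # induced cross-response association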

Any clustered data analysis is characterized by the need to describe systematic variation in a mean model and cluster-dependent random variation in an association model. Marginalized multilevel models embrace the robustness and interpretations of a marginal mean model, while retaining the likelihood inference capabilities of a conditional association model. Practical application of these models has been limited by a lack of readily available estimation procedures. We show that marginalized models may be formulated through conditional specifications to facilitate estimation with mixed model computational solutions already in place.
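
For orientation, one standard binary-response formulation of a marginalized multilevel model takes the following form (notation assumed here for illustration and may differ from the thesis):

    \begin{align*}
      \text{marginal mean model:}\quad
        & \operatorname{logit}\,\Pr(Y_{ij}=1 \mid X_{ij}) = x_{ij}^{\top}\beta \\
      \text{conditional association model:}\quad
        & \operatorname{logit}\,\Pr(Y_{ij}=1 \mid X_{ij}, b_i) = \Delta_{ij}(x_{ij}) + z_{ij}^{\top} b_i,
          \qquad b_i \sim N(0, \Sigma) \\
      \text{connection:}\quad
        & \Pr(Y_{ij}=1 \mid X_{ij})
          = \int \operatorname{expit}\{\Delta_{ij}(x_{ij}) + z_{ij}^{\top} b\}\,\phi(b; 0, \Sigma)\, db
    \end{align*}

Here the regression coefficients carry the population-averaged interpretation, the random effects carry the within-cluster association, and the offset term is determined implicitly by the integral; expressing the model through the conditional specification is what allows existing mixed-model machinery to be used for estimation.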

Hierarchical mixtures-of-experts models (HMEM) are a flexible class of mixture models in which each mixture component within the hierarchical structure can itself be a mixture model. We develop a clustered data extension of hierarchical mixtures-of-experts models, using random effects to account for associations within clusters; we term these models hierarchical mixtures of random-effects models (HMREM). We also provide a marginalized version of the HMREM, which we term hierarchical mixtures of marginalized multilevel models (HMMMM). When subject-specific inferences are of direct interest in longitudinal data with complex distributions, HMREMs may be used to obtain parameters that depend on specified values of the conjectured latent effects. Alternatively, when population-averaged inferences are of interest, HMMMMs may be used to obtain parameters that describe directly observable group contrasts. When appropriate, both marginal and conditional parameters may be presented, allowing inferences to be drawn on the aspect of central scientific interest.
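
A generic sketch of the nested mixture structure these acronyms describe might look as follows (assumed notation, for illustration only; the thesis's exact specification may differ):

    \begin{align*}
      f(y_{ij} \mid x_{ij}, b_i)
        &= \sum_{k=1}^{K} \pi_{k}(x_{ij})
           \sum_{m=1}^{M_k} \pi_{m \mid k}(x_{ij})\, f_{km}(y_{ij} \mid x_{ij}, b_i),
        \qquad b_i \sim N(0, \Sigma)
    \end{align*}

where the gating probabilities $\pi_{k}$ and $\pi_{m \mid k}$ partition the covariate space and each expert density $f_{km}$ is a regression component. Specifying the experts conditionally on the random effects corresponds to the subject-specific (HMREM) reading, while replacing each expert's conditional mean model with a marginalized mean model, as sketched above, corresponds to the population-averaged (HMMMM) reading.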
 


 