Selected article for: "high score and large number"

Author: Jacob, Daniel
Title: Variable Selection for Causal Inference via Outcome-Adaptive Random Forest
  • Cord-id: drxp5h96
  • Document date: 2021-09-09
    Document: Estimating a causal effect from observational data can be biased if we do not control for self-selection. This selection is based on confounding variables that affect both the treatment assignment and the outcome. Propensity score methods aim to correct for confounding. However, not all covariates are confounders. We propose the outcome-adaptive random forest (OARF), which includes only desirable variables for estimating the propensity score, to decrease bias and variance. Our approach works in high-dimensional datasets and when the outcome and propensity score models are non-linear and potentially complicated. The OARF excludes covariates that are not associated with the outcome, even in the presence of a large number of spurious variables. Simulation results suggest that the OARF produces unbiased estimates, has a smaller variance, and is superior in variable selection compared to other approaches. The results from two empirical examples, the effect of right heart catheterization on mortality and the effect of maternal smoking during pregnancy on birth weight, show treatment effects comparable to previous findings, but with tighter confidence intervals and a more plausible set of selected variables.

    Related documents (co-phrase search):
    • absolute standardized and logistic regression: 3 documents
    • additional approach and logistic regression: 1 document
    • additional approach and machine learning: 5 documents
    • logistic regression and low birth weight: 22 documents
    • logistic regression and machine learning: 25 documents
    • logit model and machine learning: 2 documents