# lmtp

## An Illustrated Guide to TMLE, Part III: Properties, Theory, and Learning More

This is the third and final post in a three-part series to help beginners and/or visual learners understand Targeted Maximum Likelihood Estimation (TMLE). In this section, I discuss more statistical properties of TMLE, offer a brief explanation of the theory behind TMLE, and provide resources for learning more.

Properties of TMLE 📈

To reiterate a point from Parts I and II, a main motivation for TMLE is that it allows the use of machine learning algorithms while still yielding valid asymptotic properties for inference.

## An Illustrated Guide to TMLE, Part II: The Algorithm

This is the second post in a three-part series to help beginners and/or visual learners understand Targeted Maximum Likelihood Estimation (TMLE). This section walks through the TMLE algorithm for the mean difference in outcomes for a binary treatment and binary outcome. This post is an expansion of a printable “visual guide” available on my GitHub. I hope it helps analysts who feel out-of-practice reading mathematical notation follow along with the TMLE algorithm.

## A Condensed Key for A Visual Guide to Targeted Maximum Likelihood Estimation (TMLE)

A condensed key for my corresponding TMLE tutorial blog post.

Initial set up

Estimand of interest:

$ATE = \Psi = E_W[\mathrm{E}[Y|A=1,\mathbf{W}] - \mathrm{E}[Y|A=0,\mathbf{W}]]$

Step 1: Estimate the Outcome

First, estimate the expected value of the outcome using treatment and confounders as predictors:

$Q(A,\mathbf{W}) = \mathrm{E}[Y|A,\mathbf{W}]$

Then use that fit to obtain estimates of the expected outcome under three different treatment conditions:
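Step 1 above can be sketched in code. This is a minimal illustration, not the tutorial's own implementation: it simulates a toy dataset (all variable names and coefficients here are made up for the example), fits $Q(A,\mathbf{W})$ with a plain logistic regression fitted by Newton-Raphson (the tutorial uses a Super Learner ensemble for this step), and then predicts the outcome under the observed treatment, under $A=1$, and under $A=0$.

```python
import numpy as np

def expit(z):
    """Inverse-logit link."""
    return 1.0 / (1.0 + np.exp(-z))

def fit_logistic(X, y, n_iter=25):
    """Fit a logistic regression (with intercept) by Newton-Raphson."""
    Xd = np.column_stack([np.ones(len(X)), X])
    beta = np.zeros(Xd.shape[1])
    for _ in range(n_iter):
        p = expit(Xd @ beta)
        grad = Xd.T @ (y - p)                      # score vector
        hess = Xd.T @ (Xd * (p * (1 - p))[:, None])  # observed information
        beta += np.linalg.solve(hess, grad)
    return beta

def predict(beta, X):
    """Predicted P(Y=1 | X) from a fitted coefficient vector."""
    return expit(np.column_stack([np.ones(len(X)), X]) @ beta)

# --- Toy data: binary treatment A, binary outcome Y, confounders W ---
rng = np.random.default_rng(0)
n = 5000
W = rng.normal(size=(n, 2))
A = rng.binomial(1, expit(W[:, 0]))          # treatment depends on W
Y = rng.binomial(1, expit(0.5 * A + W[:, 1]))  # outcome depends on A and W

# Fit Q(A, W) = E[Y | A, W] using treatment and confounders as predictors
X = np.column_stack([A, W])
beta_Q = fit_logistic(X, Y)

# Predictions under the three treatment conditions
Q_AW = predict(beta_Q, X)                                  # observed A
Q_1W = predict(beta_Q, np.column_stack([np.ones(n), W]))   # everyone treated
Q_0W = predict(beta_Q, np.column_stack([np.zeros(n), W]))  # no one treated

# Initial (pre-targeting) plug-in estimate of the ATE
ate_initial = np.mean(Q_1W - Q_0W)
```

These three prediction vectors (`Q_AW`, `Q_1W`, `Q_0W`) are the inputs the later TMLE steps update; `ate_initial` is only the untargeted g-computation estimate.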