diff --git a/doc/api/datasets.rst b/doc/api/datasets.rst index a1ead85a..29b58c96 100644 --- a/doc/api/datasets.rst +++ b/doc/api/datasets.rst @@ -32,6 +32,7 @@ Dataset Generators plm.datasets.make_plr_CCDDHNR2018 plm.datasets.make_plr_turrell2018 plm.datasets.make_lplr_LZZ2020 + plm.datasets.make_plpr_CP2025 plm.datasets.make_pliv_CHS2015 plm.datasets.make_pliv_multiway_cluster_CKMS2021 plm.datasets.make_confounded_plr_data diff --git a/doc/api/dml_models.rst b/doc/api/dml_models.rst index 77e1104a..450eded7 100644 --- a/doc/api/dml_models.rst +++ b/doc/api/dml_models.rst @@ -17,6 +17,7 @@ doubleml.plm DoubleMLPLR DoubleMLLPLR + DoubleMLPLPR DoubleMLPLIV diff --git a/doc/conf.py b/doc/conf.py index b53df08d..bde22ea4 100644 --- a/doc/conf.py +++ b/doc/conf.py @@ -275,7 +275,9 @@ # Valid DOI; Causes 403 Client Error: Forbidden for url:... "https://doi.org/10.3982/ECTA15732", # Valid DOI; Causes 403 Client Error: Forbidden for url:... - "https://doi.org/10.1093/ectj/utab019" + "https://doi.org/10.1093/ectj/utab019", + # Valid DOI; Causes 403 Client Error: Forbidden for url:... + "https://doi.org/10.1093/ectj/utaf011" ] # To execute R code via jupyter-execute one needs to install the R kernel for jupyter diff --git a/doc/examples/index.rst b/doc/examples/index.rst index 03f4e713..394134d3 100644 --- a/doc/examples/index.rst +++ b/doc/examples/index.rst @@ -23,6 +23,7 @@ General Examples py_double_ml_apo.ipynb py_double_ml_irm_vs_apo.ipynb py_double_ml_lplr.ipynb + py_double_ml_plpr.ipynb py_double_ml_ssm.ipynb learners/py_optuna.ipynb learners/py_learner.ipynb diff --git a/doc/examples/py_double_ml_plpr.ipynb b/doc/examples/py_double_ml_plpr.ipynb new file mode 100644 index 00000000..9db17da7 --- /dev/null +++ b/doc/examples/py_double_ml_plpr.ipynb @@ -0,0 +1,3969 @@ +{ + "cells": [ + { + "attachments": {}, + "cell_type": "markdown", + "metadata": {}, + "source": [ + "# Python: Static Panel Models with Fixed Effects" + ] + }, + { + "cell_type": "markdown", + "metadata": {}, + "source": [ + "In this example, we illustrate how the [DoubleML](https://docs.doubleml.org/stable/index.html) package can be used to estimate treatment effects for static panel models with fixed effects in a partially linear panel regression [DoubleMLPLPR](http://docs.doubleml.org/stable/guide/models.html#partially-linear-panel-regression-model-plpr) model. The model is based on [Clarke and Polselli (2025)](https://doi.org/10.1093/ectj/utaf011)." 
+ ] + }, + { + "cell_type": "code", + "execution_count": 2, + "metadata": {}, + "outputs": [], + "source": [ + "import optuna\n", + "import numpy as np\n", + "import pandas as pd\n", + "import matplotlib.pyplot as plt\n", + "import seaborn as sns\n", + "\n", + "from sklearn.base import clone\n", + "from sklearn.preprocessing import StandardScaler\n", + "from sklearn.preprocessing import PolynomialFeatures\n", + "from sklearn.compose import ColumnTransformer\n", + "from sklearn.pipeline import make_pipeline\n", + "from sklearn.base import BaseEstimator, TransformerMixin\n", + "from sklearn.linear_model import LassoCV\n", + "from lightgbm import LGBMRegressor\n", + "\n", + "from doubleml.data import DoubleMLPanelData\n", + "from doubleml.plm.datasets import make_plpr_CP2025\n", + "from doubleml import DoubleMLPLPR\n", + "\n", + "import warnings\n", + "warnings.filterwarnings(\"ignore\")" + ] + }, + { + "cell_type": "markdown", + "metadata": {}, + "source": [ + "## Data\n", + "\n", + "We will use the implemented data generating process [make_plpr_CP2025](https://docs.doubleml.org/stable/api/generated/doubleml.plm.datasets.make_plpr_CP2025.html) to generate data similar to the simulation in [Clarke and Polselli (2025)](https://doi.org/10.1093/ectj/utaf011). For exposition, we use the simple linear `dgp_type=\"dgp1\"`, with 150 units, 10 time periods per unit, and a true treatment effect of `theta=0.5`.\n", + "\n", + "We set `time_type=\"int\"` such that the time variable values will be integers. It's also possible to use `\"float\"` or `\"datetime\"` time variables with [DoubleMLPLPR](http://docs.doubleml.org/stable/guide/models.html#partially-linear-panel-regression-model-plpr)." + ] + }, + { + "cell_type": "code", + "execution_count": null, + "metadata": {}, + "outputs": [ + { + "data": { + "text/html": [ + "
\n", + "\n", + "\n", + " \n", + " \n", + " \n", + " \n", + " \n", + " \n", + " \n", + " \n", + " \n", + " \n", + " \n", + " \n", + " \n", + " \n", + " \n", + " \n", + " \n", + " \n", + " \n", + " \n", + " \n", + " \n", + " \n", + " \n", + " \n", + " \n", + " \n", + " \n", + " \n", + " \n", + " \n", + " \n", + " \n", + " \n", + " \n", + " \n", + " \n", + " \n", + " \n", + " \n", + " \n", + " \n", + " \n", + " \n", + " \n", + " \n", + " \n", + " \n", + " \n", + " \n", + " \n", + " \n", + " \n", + " \n", + " \n", + " \n", + " \n", + " \n", + " \n", + " \n", + " \n", + " \n", + " \n", + " \n", + " \n", + " \n", + " \n", + " \n", + " \n", + " \n", + " \n", + " \n", + " \n", + " \n", + " \n", + " \n", + " \n", + " \n", + " \n", + " \n", + " \n", + " \n", + " \n", + " \n", + " \n", + " \n", + " \n", + " \n", + " \n", + " \n", + " \n", + " \n", + " \n", + " \n", + " \n", + " \n", + " \n", + " \n", + " \n", + " \n", + " \n", + " \n", + " \n", + " \n", + " \n", + " \n", + " \n", + " \n", + " \n", + " \n", + " \n", + " \n", + " \n", + " \n", + " \n", + " \n", + " \n", + " \n", + " \n", + " \n", + " \n", + " \n", + " \n", + " \n", + " \n", + " \n", + " \n", + " \n", + " \n", + " \n", + " \n", + " \n", + " \n", + " \n", + " \n", + " \n", + " \n", + " \n", + " \n", + " \n", + " \n", + " \n", + " \n", + " \n", + " \n", + " \n", + " \n", + " \n", + "
idtimeydx1x2x3x4x5x6...x21x22x23x24x25x26x27x28x29x30
011-1.2904790.9083071.710715-1.853675-1.4739071.366514-0.3220242.944020...-1.828362-3.010547-0.840202-3.0851591.169952-0.954107-3.925198-0.779510-0.4307001.004298
112-2.850646-1.316777-0.3250434.178599-1.159857-0.139527-0.230115-0.631976...-0.724172-0.421045-2.012480-2.081784-2.734123-0.879470-2.1412184.598401-4.222797-2.523024
213-4.338502-1.756120-0.8975901.505972-0.9251891.511500-2.2065610.132579...1.766109-2.252858-2.919826-1.974066-0.7738810.244633-1.7275501.6654670.562291-1.553616
314-2.7132360.9348661.9878492.596228-0.220666-0.480717-3.966273-0.911226...0.8561240.727759-0.5015791.0775042.268052-3.8214221.629055-0.220834-1.185091-5.462884
415-5.782997-4.357881-3.0865593.796975-1.539641-2.425617-1.020599-1.666200...2.617215-1.231835-0.8913500.2469812.4896420.319735-2.8103660.5858263.6437490.147147
\n", + "

5 rows × 34 columns

\n", + "
" + ], + "text/plain": [ + " id time y d x1 x2 x3 x4 \\\n", + "0 1 1 -1.290479 0.908307 1.710715 -1.853675 -1.473907 1.366514 \n", + "1 1 2 -2.850646 -1.316777 -0.325043 4.178599 -1.159857 -0.139527 \n", + "2 1 3 -4.338502 -1.756120 -0.897590 1.505972 -0.925189 1.511500 \n", + "3 1 4 -2.713236 0.934866 1.987849 2.596228 -0.220666 -0.480717 \n", + "4 1 5 -5.782997 -4.357881 -3.086559 3.796975 -1.539641 -2.425617 \n", + "\n", + " x5 x6 ... x21 x22 x23 x24 x25 \\\n", + "0 -0.322024 2.944020 ... -1.828362 -3.010547 -0.840202 -3.085159 1.169952 \n", + "1 -0.230115 -0.631976 ... -0.724172 -0.421045 -2.012480 -2.081784 -2.734123 \n", + "2 -2.206561 0.132579 ... 1.766109 -2.252858 -2.919826 -1.974066 -0.773881 \n", + "3 -3.966273 -0.911226 ... 0.856124 0.727759 -0.501579 1.077504 2.268052 \n", + "4 -1.020599 -1.666200 ... 2.617215 -1.231835 -0.891350 0.246981 2.489642 \n", + "\n", + " x26 x27 x28 x29 x30 \n", + "0 -0.954107 -3.925198 -0.779510 -0.430700 1.004298 \n", + "1 -0.879470 -2.141218 4.598401 -4.222797 -2.523024 \n", + "2 0.244633 -1.727550 1.665467 0.562291 -1.553616 \n", + "3 -3.821422 1.629055 -0.220834 -1.185091 -5.462884 \n", + "4 0.319735 -2.810366 0.585826 3.643749 0.147147 \n", + "\n", + "[5 rows x 34 columns]" + ] + }, + "execution_count": 3, + "metadata": {}, + "output_type": "execute_result" + } + ], + "source": [ + "np.random.seed(123)\n", + "data = make_plpr_CP2025(num_id=150, num_t=10, dim_x=30, theta=0.5, dgp_type=\"dgp1\", time_type=\"int\")\n", + "data.head()" + ] + }, + { + "cell_type": "markdown", + "metadata": {}, + "source": [ + "To create a corresponding [DoubleMLPanelData](https://docs.doubleml.org/stable/api/generated/doubleml.data.DoubleMLPanelData.html) object, we need to set `static_panel=True` and specify `id_col` and `time_col` columns." + ] + }, + { + "cell_type": "code", + "execution_count": null, + "metadata": {}, + "outputs": [], + "source": [ + "data_obj = DoubleMLPanelData(data, y_col=\"y\", d_cols=\"d\", t_col=\"time\", id_col=\"id\", static_panel=True)" + ] + }, + { + "cell_type": "markdown", + "metadata": {}, + "source": [ + "## Model" + ] + }, + { + "cell_type": "markdown", + "metadata": {}, + "source": [ + "The partially linear panel regression (PLPR) model extends the partially linear model to panel data by introducing fixed effects $\\alpha_i^*$.\n", + "\n", + "The PLPR model takes the form\n", + "\n", + "\\begin{align*}\n", + " Y_{it} &= \\theta_0 D_{it} + g_1(X_{it}) + \\alpha_i^* + U_{it}, \\\\\n", + " D_{it} &= m_1(X_{it}) + \\gamma_i + V_{it},\n", + "\\end{align*}\n", + "\n", + "where\n", + "- $Y_{it}$ outcome, $D_{it}$ treatment, $X_{it}$ covariates, $\\theta_0$ causal treatment effect\n", + "- $g_1$ and $m_1$ nuisance functions\n", + "- $\\alpha_i^*$, $\\gamma_i$ unobserved individual heterogeneity, correlated with covariates\n", + "- $U_{it}$, $V_{it}$ error terms\n", + "\n", + "Further note $\\mathbb{E}[U_{it} \\mid D_{it}, X_{it}, \\alpha_i^*] = 0$ and $\\mathbb{E}[V_{it} \\mid X_{it}, \\gamma_i]=0$, but $\\mathbb{E}[\\alpha_i^* \\mid D_{it}, X_{it}] \\neq 0$.\n", + "\n", + "Alternatively we can write the partialling-out PLPR as \n", + "\n", + "\\begin{align*}\n", + " Y_{it} &= \\theta_0 V_{it} + \\ell_1(X_{it}) + \\alpha_i + U_{it}, \\\\\n", + " V_{it} &= D_{it} - m_1(X_{it}) - \\gamma_i,\n", + "\\end{align*}\n", + "\n", + "with nuisance function $\\ell_1$ and fixed effect $\\alpha_i$." 
+ ] + }, + { + "cell_type": "markdown", + "metadata": {}, + "source": [ + "#### Assumptions\n", + "\n", + "Define $\\xi_i$ as time-invariant heterogeneity terms influencing outcome and treatment and $L_{t-1}(W_i) = \\{ W_{i1}, \\dots, W_{it-1} \\}$ as lags of a random variable $W_{it}$ at wave $t$.\n", + "\n", + "- No feedback to predictors\n", + "$$ X_{it} \\perp L_{t-1} (Y_i, D_i) \\mid L_{t-1} (X_i), \\xi_i $$\n", + "- Static panel\n", + "$$ Y_{it}, D_{it} \\perp L_{t-1} (Y_i, X_i, D_i) \\mid X_{it}, \\xi_i $$\n", + "- Selection on observables and omitted time-invariant variables\n", + "$$ Y_{it} (.) \\perp D_{it} \\mid X_{it}, \\xi_i $$\n", + "- Homogeneity and linearity of the treatment effect\n", + "$$ \\mathbb{E} [Y_{it}(d) - Y_{it}(0) \\mid X_{it}, \\xi_i] = d \\theta_0 $$\n", + "- Additive Separability\n", + "\\begin{align*}\n", + "\\mathbb{E} [Y_{it}(0) \\mid X_{it}, \\xi_i] &= g_1(X_{it}) + \\alpha^*_i \\quad \\text{where } \\alpha^*_i = \\alpha^*(\\xi_i), \\\\\n", + "\\mathbb{E} [D_{it} \\mid X_{it}, \\xi_i] &= m_1(X_{it}) + \\gamma_i \\quad \\text{where } \\gamma_i = \\gamma(\\xi_i)\n", + "\\end{align*} \n", + "\n", + "For more information, see [Clarke and Polselli (2025)](https://doi.org/10.1093/ectj/utaf011)." + ] + }, + { + "cell_type": "markdown", + "metadata": {}, + "source": [ + "To estimate the causal effect, we can create a [DoubleMLPLPR](https://docs.doubleml.org/stable/api/generated/doubleml.plm.DoubleMLPLPR.html) object. \n", + "\n", + "The model described in [Clarke and Polselli (2025)](https://doi.org/10.1093/ectj/utaf011) uses block-k-fold cross-fitting, where the entire time series of the sampled unit is allocated to one fold to allow for possible serial correlation within each unit which is common with panel data. Furthermore, cluster robust standard error are employed. [DoubleMLPLPR](http://docs.doubleml.org/stable/guide/models.html#partially-linear-panel-regression-model-plpr) implements both aspects by using `id_col` as the cluster variable.\n", + "\n", + "[Clarke and Polselli (2025)](https://doi.org/10.1093/ectj/utaf011) describes multiple estimation approaches, which can be set with the `approach` parameter. Depending on the type of `approach`, different data transformations are performed along the way." + ] + }, + { + "cell_type": "markdown", + "metadata": {}, + "source": [ + "### Approaches\n", + "\n", + "#### Correlated Random Effect\n", + "\n", + "`cre_general` approach:\n", + "\n", + "- Learning $\\ell_1$ from $\\{ Y_{it}, X_{it}, \\bar{X}_i : t=1,\\dots, T \\}_{i=1}^N$,\n", + "- First learning $\\tilde{m}_1$ from $\\{ D_{it}, X_{it}, \\bar{X}_i : t=1,\\dots, T \\}_{i=1}^N$, with predictions $\\hat{m}_{1,it} = \\tilde{m}_1 (X_{it}, \\bar{X}_i) $\n", + " - Calculate $\\hat{\\bar{m}}_i = T^{-1} \\sum_{t=1}^T \\hat{m}_{1,it} $,\n", + " - Calculate final nuisance part as $ \\hat{m}^*_1 (X_{it}, \\bar{X}_i, \\bar{D}_i) = \\hat{m}_{1,it} + \\bar{D}_i - \\hat{\\bar{m}}_i $. " + ] + }, + { + "cell_type": "code", + "execution_count": null, + "metadata": {}, + "outputs": [], + "source": [ + "learner = LassoCV()\n", + "ml_l = clone(learner)\n", + "ml_m = clone(learner)\n", + "\n", + "dml_plpr_cre_general = DoubleMLPLPR(data_obj, ml_l=ml_l, ml_m=ml_m, approach=\"cre_general\", n_folds=5)" + ] + }, + { + "cell_type": "markdown", + "metadata": {}, + "source": [ + "We can look at the the transformed data using the `data_transform` property after the [DoubleMLPLPR](https://docs.doubleml.org/stable/api/generated/doubleml.plm.DoubleMLPLPR.html) object was created." 
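+ "\n",
+ "As a side note, the mean-correction step in the `cre_general` recipe above, $\\hat{m}^*_{1,it} = \\hat{m}_{1,it} + \\bar{D}_i - \\hat{\\bar{m}}_i$, boils down to simple group-wise arithmetic. The following minimal pandas sketch illustrates the formula on a hypothetical long-format frame with columns `id`, `d` and first-stage predictions `m_hat`; it is not the internal implementation of the estimator.\n",
+ "\n",
+ "```python\n",
+ "import pandas as pd\n",
+ "\n",
+ "# hypothetical long-format data: unit id, treatment d, and first-stage\n",
+ "# predictions m_hat of E[D_it | X_it, X_bar_i]\n",
+ "df = pd.DataFrame({\n",
+ "    'id':    [1, 1, 2, 2],\n",
+ "    'd':     [0.2, 0.6, 1.0, 1.4],\n",
+ "    'm_hat': [0.1, 0.5, 0.9, 1.5],\n",
+ "})\n",
+ "\n",
+ "d_bar = df.groupby('id')['d'].transform('mean')          # D_bar_i\n",
+ "m_bar_hat = df.groupby('id')['m_hat'].transform('mean')  # (1/T) * sum_t m_hat_it\n",
+ "df['m_star_hat'] = df['m_hat'] + d_bar - m_bar_hat\n",
+ "print(df)\n",
+ "```"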
+ ] + }, + { + "cell_type": "code", + "execution_count": 14, + "metadata": {}, + "outputs": [ + { + "data": { + "text/html": [ + "
\n", + "\n", + "\n", + " \n", + " \n", + " \n", + " \n", + " \n", + " \n", + " \n", + " \n", + " \n", + " \n", + " \n", + " \n", + " \n", + " \n", + " \n", + " \n", + " \n", + " \n", + " \n", + " \n", + " \n", + " \n", + " \n", + " \n", + " \n", + " \n", + " \n", + " \n", + " \n", + " \n", + " \n", + " \n", + " \n", + " \n", + " \n", + " \n", + " \n", + " \n", + " \n", + " \n", + " \n", + " \n", + " \n", + " \n", + " \n", + " \n", + " \n", + " \n", + " \n", + " \n", + " \n", + " \n", + " \n", + " \n", + " \n", + " \n", + " \n", + " \n", + " \n", + " \n", + " \n", + " \n", + " \n", + " \n", + " \n", + " \n", + " \n", + " \n", + " \n", + " \n", + " \n", + " \n", + " \n", + " \n", + " \n", + " \n", + " \n", + " \n", + " \n", + " \n", + " \n", + " \n", + " \n", + " \n", + " \n", + " \n", + " \n", + " \n", + " \n", + " \n", + " \n", + " \n", + " \n", + " \n", + " \n", + " \n", + " \n", + " \n", + " \n", + " \n", + " \n", + " \n", + " \n", + " \n", + " \n", + " \n", + " \n", + " \n", + " \n", + " \n", + " \n", + " \n", + " \n", + " \n", + " \n", + " \n", + " \n", + " \n", + " \n", + " \n", + " \n", + " \n", + " \n", + " \n", + " \n", + " \n", + " \n", + " \n", + " \n", + " \n", + " \n", + " \n", + " \n", + " \n", + " \n", + " \n", + " \n", + " \n", + " \n", + " \n", + " \n", + " \n", + " \n", + " \n", + " \n", + " \n", + " \n", + " \n", + "
idtimeydx1x2x3x4x5x6...x21_meanx22_meanx23_meanx24_meanx25_meanx26_meanx27_meanx28_meanx29_meanx30_mean
011-1.2904790.9083071.710715-1.853675-1.4739071.366514-0.3220242.944020...1.24018-0.52821-0.7341450.2274941.1647630.412979-1.2726080.459816-0.829863-1.145189
112-2.850646-1.316777-0.3250434.178599-1.159857-0.139527-0.230115-0.631976...1.24018-0.52821-0.7341450.2274941.1647630.412979-1.2726080.459816-0.829863-1.145189
213-4.338502-1.756120-0.8975901.505972-0.9251891.511500-2.2065610.132579...1.24018-0.52821-0.7341450.2274941.1647630.412979-1.2726080.459816-0.829863-1.145189
314-2.7132360.9348661.9878492.596228-0.220666-0.480717-3.966273-0.911226...1.24018-0.52821-0.7341450.2274941.1647630.412979-1.2726080.459816-0.829863-1.145189
415-5.782997-4.357881-3.0865593.796975-1.539641-2.425617-1.020599-1.666200...1.24018-0.52821-0.7341450.2274941.1647630.412979-1.2726080.459816-0.829863-1.145189
\n", + "

5 rows × 64 columns

\n", + "
" + ], + "text/plain": [ + " id time y d x1 x2 x3 x4 \\\n", + "0 1 1 -1.290479 0.908307 1.710715 -1.853675 -1.473907 1.366514 \n", + "1 1 2 -2.850646 -1.316777 -0.325043 4.178599 -1.159857 -0.139527 \n", + "2 1 3 -4.338502 -1.756120 -0.897590 1.505972 -0.925189 1.511500 \n", + "3 1 4 -2.713236 0.934866 1.987849 2.596228 -0.220666 -0.480717 \n", + "4 1 5 -5.782997 -4.357881 -3.086559 3.796975 -1.539641 -2.425617 \n", + "\n", + " x5 x6 ... x21_mean x22_mean x23_mean x24_mean x25_mean \\\n", + "0 -0.322024 2.944020 ... 1.24018 -0.52821 -0.734145 0.227494 1.164763 \n", + "1 -0.230115 -0.631976 ... 1.24018 -0.52821 -0.734145 0.227494 1.164763 \n", + "2 -2.206561 0.132579 ... 1.24018 -0.52821 -0.734145 0.227494 1.164763 \n", + "3 -3.966273 -0.911226 ... 1.24018 -0.52821 -0.734145 0.227494 1.164763 \n", + "4 -1.020599 -1.666200 ... 1.24018 -0.52821 -0.734145 0.227494 1.164763 \n", + "\n", + " x26_mean x27_mean x28_mean x29_mean x30_mean \n", + "0 0.412979 -1.272608 0.459816 -0.829863 -1.145189 \n", + "1 0.412979 -1.272608 0.459816 -0.829863 -1.145189 \n", + "2 0.412979 -1.272608 0.459816 -0.829863 -1.145189 \n", + "3 0.412979 -1.272608 0.459816 -0.829863 -1.145189 \n", + "4 0.412979 -1.272608 0.459816 -0.829863 -1.145189 \n", + "\n", + "[5 rows x 64 columns]" + ] + }, + "execution_count": 14, + "metadata": {}, + "output_type": "execute_result" + } + ], + "source": [ + "dml_plpr_cre_general.data_transform.data.head()" + ] + }, + { + "cell_type": "markdown", + "metadata": {}, + "source": [ + "We can see that the covariates inlcude the original $X_{it}$ and additionally the unit mean $\\bar{X}_i$." + ] + }, + { + "cell_type": "markdown", + "metadata": {}, + "source": [ + "After fitting the model, we can print the [DoubleMLPLPR](https://docs.doubleml.org/stable/api/generated/doubleml.plm.DoubleMLPLPR.html) object.\n", + "\n", + "The Data Summary corresponds to the transformed data. Additional Information at the end also includes a Pre-Transformation Data Summary." + ] + }, + { + "cell_type": "code", + "execution_count": 7, + "metadata": {}, + "outputs": [ + { + "name": "stdout", + "output_type": "stream", + "text": [ + "================== DoubleMLPLPR Object ==================\n", + "\n", + "------------------ Data Summary ------------------\n", + "Outcome variable: y\n", + "Treatment variable(s): ['d']\n", + "Covariates: ['x1', 'x2', 'x3', 'x4', 'x5', 'x6', 'x7', 'x8', 'x9', 'x10', 'x11', 'x12', 'x13', 'x14', 'x15', 'x16', 'x17', 'x18', 'x19', 'x20', 'x21', 'x22', 'x23', 'x24', 'x25', 'x26', 'x27', 'x28', 'x29', 'x30', 'x1_mean', 'x2_mean', 'x3_mean', 'x4_mean', 'x5_mean', 'x6_mean', 'x7_mean', 'x8_mean', 'x9_mean', 'x10_mean', 'x11_mean', 'x12_mean', 'x13_mean', 'x14_mean', 'x15_mean', 'x16_mean', 'x17_mean', 'x18_mean', 'x19_mean', 'x20_mean', 'x21_mean', 'x22_mean', 'x23_mean', 'x24_mean', 'x25_mean', 'x26_mean', 'x27_mean', 'x28_mean', 'x29_mean', 'x30_mean']\n", + "Instrument variable(s): None\n", + "Time variable: time\n", + "Id variable: id\n", + "Static panel data: True\n", + "No. Unique Ids: 150\n", + "No. 
Observations: 1500\n", + "\n", + "------------------ Score & Algorithm ------------------\n", + "Score function: partialling out\n", + "Static panel model approach: cre_general\n", + "\n", + "------------------ Machine Learner ------------------\n", + "Learner ml_l: LassoCV()\n", + "Learner ml_m: LassoCV()\n", + "Out-of-sample Performance:\n", + "Regression:\n", + "Learner ml_l RMSE: [[1.8336022]]\n", + "Learner ml_m RMSE: [[0.99534683]]\n", + "\n", + "------------------ Resampling ------------------\n", + "No. folds per cluster: 5\n", + "No. folds: 5\n", + "No. repeated sample splits: 1\n", + "\n", + "------------------ Fit Summary ------------------\n", + " coef std err t P>|t| 2.5 % 97.5 %\n", + "d 0.490173 0.026161 18.737077 2.468215e-78 0.438899 0.541447\n", + "\n", + "------------------ Additional Information -------------\n", + "Cluster variable(s): ['id']\n", + "\n", + "Pre-Transformation Data Summary: \n", + "Outcome variable: y\n", + "Treatment variable(s): ['d']\n", + "Covariates: ['x1', 'x2', 'x3', 'x4', 'x5', 'x6', 'x7', 'x8', 'x9', 'x10', 'x11', 'x12', 'x13', 'x14', 'x15', 'x16', 'x17', 'x18', 'x19', 'x20', 'x21', 'x22', 'x23', 'x24', 'x25', 'x26', 'x27', 'x28', 'x29', 'x30']\n", + "No. Observations: 1500\n", + "\n" + ] + } + ], + "source": [ + "dml_plpr_cre_general.fit()\n", + "print(dml_plpr_cre_general)" + ] + }, + { + "cell_type": "markdown", + "metadata": {}, + "source": [ + "`cre_normal` approach:\n", + "\n", + "Under the assumption that the conditional distribution $ D_{i1}, \\dots, D_{iT} \\mid X_{i1}, \\dots X_{iT} $ is multivariate normal (see [Clarke and Polselli (2025)](https://doi.org/10.1093/ectj/utaf011) for further details):\n", + "- Learn $\\ell_1$ from $\\{ Y_{it}, X_{it}, \\bar{X}_i : t=1,\\dots, T \\}_{i=1}^N$,\n", + "- Learn $m^*_{1}$ from $\\{ D_{it}, X_{it}, \\bar{X}_i, \\bar{D}_i: t=1,\\dots, T \\}_{i=1}^N$." + ] + }, + { + "cell_type": "code", + "execution_count": null, + "metadata": {}, + "outputs": [ + { + "name": "stdout", + "output_type": "stream", + "text": [ + " coef std err t P>|t| 2.5 % 97.5 %\n", + "d 0.505807 0.027168 18.617927 2.299468e-77 0.45256 0.559055\n" + ] + } + ], + "source": [ + "dml_plpr_cre_normal = DoubleMLPLPR(data_obj, ml_l=ml_l, ml_m=ml_m, approach=\"cre_normal\", n_folds=5)\n", + "dml_plpr_cre_normal.fit()\n", + "print(dml_plpr_cre_normal.summary)" + ] + }, + { + "cell_type": "markdown", + "metadata": {}, + "source": [ + "The `cre_normal` approach uses additionally inlcudes $\\bar{D}_i$ in the treatment nuisance estimation. The corresponding data can be assesses by the `d_mean` property." + ] + }, + { + "cell_type": "code", + "execution_count": 9, + "metadata": {}, + "outputs": [ + { + "data": { + "text/plain": [ + "array([[-0.94278854],\n", + " [-0.94278854],\n", + " [-0.94278854],\n", + " ...,\n", + " [ 0.15320478],\n", + " [ 0.15320478],\n", + " [ 0.15320478]], shape=(1500, 1))" + ] + }, + "execution_count": 9, + "metadata": {}, + "output_type": "execute_result" + } + ], + "source": [ + "dml_plpr_cre_normal.d_mean" + ] + }, + { + "cell_type": "markdown", + "metadata": {}, + "source": [ + "#### Transformation Approaches\n", + "\n", + "`fd_exact` approach:\n", + "\n", + "Consider FD transformation $Q(Y_{it})= Y_{it} - Y_{it-1} $, under the assumptions from above, [Clarke and Polselli (2025)](https://doi.org/10.1093/ectj/utaf011) show that $\\mathbb{E}[Y_{it}-Y_{it-1} | X_{it-1},X_{it}] =\\Delta \\ell_1 (X_{it-1}, X_{it})$ and $\\mathbb{E}[D_{it}-D_{it-1} | X_{it-1},X_{it}] =\\Delta m_1 (X_{it-1}, X_{it})$. 
Therefore, the transformed nuisance function can be learnt as\n", + "\n", + "- $ \\Delta \\ell_1 (X_{it-1}, X_{it}) $ from $ \\{ Y_{it}-Y_{it-1}, X_{it-1}, X_{it} : t=2, \\dots , T \\}_{i=1}^N $,\n", + "- $ \\Delta m_1 (X_{it-1}, X_{it}) $ from $ \\{ D_{it}-D_{it-1}, X_{it-1}, X_{it} : t=2, \\dots , T \\}_{i=1}^N $.\n" + ] + }, + { + "cell_type": "code", + "execution_count": null, + "metadata": {}, + "outputs": [ + { + "data": { + "text/html": [ + "
\n", + "\n", + "\n", + " \n", + " \n", + " \n", + " \n", + " \n", + " \n", + " \n", + " \n", + " \n", + " \n", + " \n", + " \n", + " \n", + " \n", + " \n", + " \n", + " \n", + " \n", + " \n", + " \n", + " \n", + " \n", + " \n", + " \n", + " \n", + " \n", + " \n", + " \n", + " \n", + " \n", + " \n", + " \n", + " \n", + " \n", + " \n", + " \n", + " \n", + " \n", + " \n", + " \n", + " \n", + " \n", + " \n", + " \n", + " \n", + " \n", + " \n", + " \n", + " \n", + " \n", + " \n", + " \n", + " \n", + " \n", + " \n", + " \n", + " \n", + " \n", + " \n", + " \n", + " \n", + " \n", + " \n", + " \n", + " \n", + " \n", + " \n", + " \n", + " \n", + " \n", + " \n", + " \n", + " \n", + " \n", + " \n", + " \n", + " \n", + " \n", + " \n", + " \n", + " \n", + " \n", + " \n", + " \n", + " \n", + " \n", + " \n", + " \n", + " \n", + " \n", + " \n", + " \n", + " \n", + " \n", + " \n", + " \n", + " \n", + " \n", + " \n", + " \n", + " \n", + " \n", + " \n", + " \n", + " \n", + " \n", + " \n", + " \n", + " \n", + " \n", + " \n", + " \n", + " \n", + " \n", + " \n", + " \n", + " \n", + " \n", + " \n", + " \n", + " \n", + " \n", + " \n", + " \n", + " \n", + " \n", + " \n", + " \n", + " \n", + " \n", + " \n", + " \n", + " \n", + " \n", + " \n", + " \n", + " \n", + " \n", + " \n", + " \n", + " \n", + " \n", + " \n", + " \n", + " \n", + " \n", + " \n", + " \n", + "
idtimey_diffd_diffx1x2x3x4x5x6...x21_lagx22_lagx23_lagx24_lagx25_lagx26_lagx27_lagx28_lagx29_lagx30_lag
012-1.560167-2.225084-0.3250434.178599-1.159857-0.139527-0.230115-0.631976...-1.828362-3.010547-0.840202-3.0851591.169952-0.954107-3.925198-0.779510-0.4307001.004298
113-1.487856-0.439343-0.8975901.505972-0.9251891.511500-2.2065610.132579...-0.724172-0.421045-2.012480-2.081784-2.734123-0.879470-2.1412184.598401-4.222797-2.523024
2141.6252662.6909861.9878492.596228-0.220666-0.480717-3.966273-0.911226...1.766109-2.252858-2.919826-1.974066-0.7738810.244633-1.7275501.6654670.562291-1.553616
315-3.069761-5.292747-3.0865593.796975-1.539641-2.425617-1.020599-1.666200...0.8561240.727759-0.5015791.0775042.268052-3.8214221.629055-0.220834-1.185091-5.462884
416-1.0947990.5510510.289315-2.823134-3.137179-1.425923-0.7301160.232687...2.617215-1.231835-0.8913500.2469812.4896420.319735-2.8103660.5858263.6437490.147147
\n", + "

5 rows × 64 columns

\n", + "
" + ], + "text/plain": [ + " id time y_diff d_diff x1 x2 x3 x4 \\\n", + "0 1 2 -1.560167 -2.225084 -0.325043 4.178599 -1.159857 -0.139527 \n", + "1 1 3 -1.487856 -0.439343 -0.897590 1.505972 -0.925189 1.511500 \n", + "2 1 4 1.625266 2.690986 1.987849 2.596228 -0.220666 -0.480717 \n", + "3 1 5 -3.069761 -5.292747 -3.086559 3.796975 -1.539641 -2.425617 \n", + "4 1 6 -1.094799 0.551051 0.289315 -2.823134 -3.137179 -1.425923 \n", + "\n", + " x5 x6 ... x21_lag x22_lag x23_lag x24_lag x25_lag \\\n", + "0 -0.230115 -0.631976 ... -1.828362 -3.010547 -0.840202 -3.085159 1.169952 \n", + "1 -2.206561 0.132579 ... -0.724172 -0.421045 -2.012480 -2.081784 -2.734123 \n", + "2 -3.966273 -0.911226 ... 1.766109 -2.252858 -2.919826 -1.974066 -0.773881 \n", + "3 -1.020599 -1.666200 ... 0.856124 0.727759 -0.501579 1.077504 2.268052 \n", + "4 -0.730116 0.232687 ... 2.617215 -1.231835 -0.891350 0.246981 2.489642 \n", + "\n", + " x26_lag x27_lag x28_lag x29_lag x30_lag \n", + "0 -0.954107 -3.925198 -0.779510 -0.430700 1.004298 \n", + "1 -0.879470 -2.141218 4.598401 -4.222797 -2.523024 \n", + "2 0.244633 -1.727550 1.665467 0.562291 -1.553616 \n", + "3 -3.821422 1.629055 -0.220834 -1.185091 -5.462884 \n", + "4 0.319735 -2.810366 0.585826 3.643749 0.147147 \n", + "\n", + "[5 rows x 64 columns]" + ] + }, + "execution_count": 11, + "metadata": {}, + "output_type": "execute_result" + } + ], + "source": [ + "dml_plpr_fd_exact = DoubleMLPLPR(data_obj, ml_l=ml_l, ml_m=ml_m, approach=\"fd_exact\", n_folds=5)\n", + "dml_plpr_fd_exact.data_transform.data.head()" + ] + }, + { + "cell_type": "markdown", + "metadata": {}, + "source": [ + "We see that the outcome and treatment variables are now labeled `y_diff` and `d_diff` to indicate the first-difference transformation. Moreover, lagged covariates $X_{it-1}$ are added and rows for the first time period are dropped." + ] + }, + { + "cell_type": "code", + "execution_count": 12, + "metadata": {}, + "outputs": [ + { + "name": "stdout", + "output_type": "stream", + "text": [ + " coef std err t P>|t| 2.5 % 97.5 %\n", + "d_diff 0.511822 0.032746 15.630162 4.536510e-55 0.447641 0.576002\n" + ] + } + ], + "source": [ + "dml_plpr_fd_exact.fit()\n", + "print(dml_plpr_fd_exact.summary)" + ] + }, + { + "cell_type": "markdown", + "metadata": {}, + "source": [ + "`wg_approx` approach:\n", + "\n", + "For WG transformation $Q(X_{it})= X_{it} - \\bar{X}_{i} $, where $ \\bar{X}_{i} = T^{-1} \\sum_{t=1}^T X_{it} $. Approximate the model as\n", + "\\begin{align*}\n", + " Q(Y_{it}) &\\approx \\theta_0 Q(D_{it}) + g_1 (Q(X_{it})) + Q(U_{it}), \\\\\n", + " Q(D_{it}) &\\approx m_1 (Q(X_{it})) + Q(V_{it}).\n", + "\\end{align*}\n", + "\n", + "Similarly for the partialling-out PLPR\n", + "\n", + "$$\n", + "Q(Y_{it}) \\approx \\theta_0 Q(V_{it}) + \\ell_1 (Q(X_{it})) + Q(U_{it}).\n", + "$$\n", + "\n", + "- $\\ell_1$ can be learnt from transformed data $ \\{ Q(Y_{it}), Q(X_{it}) : t=1,\\dots,T \\}_{i=1}^N $,\n", + "- $m_1$ can be learnt from transformed data $ \\{ Q(D_{it}), Q(X_{it}) : t=1,\\dots,T \\}_{i=1}^N $." + ] + }, + { + "cell_type": "code", + "execution_count": null, + "metadata": {}, + "outputs": [ + { + "data": { + "text/html": [ + "
\n", + "\n", + "\n", + " \n", + " \n", + " \n", + " \n", + " \n", + " \n", + " \n", + " \n", + " \n", + " \n", + " \n", + " \n", + " \n", + " \n", + " \n", + " \n", + " \n", + " \n", + " \n", + " \n", + " \n", + " \n", + " \n", + " \n", + " \n", + " \n", + " \n", + " \n", + " \n", + " \n", + " \n", + " \n", + " \n", + " \n", + " \n", + " \n", + " \n", + " \n", + " \n", + " \n", + " \n", + " \n", + " \n", + " \n", + " \n", + " \n", + " \n", + " \n", + " \n", + " \n", + " \n", + " \n", + " \n", + " \n", + " \n", + " \n", + " \n", + " \n", + " \n", + " \n", + " \n", + " \n", + " \n", + " \n", + " \n", + " \n", + " \n", + " \n", + " \n", + " \n", + " \n", + " \n", + " \n", + " \n", + " \n", + " \n", + " \n", + " \n", + " \n", + " \n", + " \n", + " \n", + " \n", + " \n", + " \n", + " \n", + " \n", + " \n", + " \n", + " \n", + " \n", + " \n", + " \n", + " \n", + " \n", + " \n", + " \n", + " \n", + " \n", + " \n", + " \n", + " \n", + " \n", + " \n", + " \n", + " \n", + " \n", + " \n", + " \n", + " \n", + " \n", + " \n", + " \n", + " \n", + " \n", + " \n", + " \n", + " \n", + " \n", + " \n", + " \n", + " \n", + " \n", + " \n", + " \n", + " \n", + " \n", + " \n", + " \n", + " \n", + " \n", + " \n", + " \n", + " \n", + " \n", + " \n", + " \n", + " \n", + " \n", + " \n", + " \n", + " \n", + " \n", + " \n", + " \n", + " \n", + " \n", + " \n", + "
idtimey_demeand_demeanx1_demeanx2_demeanx3_demeanx4_demeanx5_demeanx6_demean...x21_demeanx22_demeanx23_demeanx24_demeanx25_demeanx26_demeanx27_demeanx28_demeanx29_demeanx30_demean
0111.5435711.7606602.207607-2.039516-0.6428471.0142040.3841661.826013...-3.162933-2.4759420.000728-3.3442190.082829-1.351303-2.670511-1.2753440.4075962.187878
112-0.016596-0.4644240.1718493.992759-0.328797-0.4918370.476074-1.749982...-2.0587430.113560-1.171550-2.340844-3.821246-1.276666-0.8865314.102567-3.384501-1.339444
213-1.504452-0.903767-0.4006981.320131-0.0941291.159190-1.500371-0.985427...0.431538-1.718253-2.078895-2.233126-1.861004-0.152563-0.4728631.1696321.400587-0.370036
3140.1208141.7872192.4847412.4103870.610394-0.833027-3.260084-2.029232...-0.4784471.2623640.3393520.8184431.180930-4.2186182.883741-0.716668-0.346795-4.279304
415-2.948947-3.505528-2.5896673.611134-0.708582-2.777927-0.314410-2.784206...1.282645-0.697230-0.050420-0.0120801.402519-0.077461-1.5556790.0899914.4820451.330727
\n", + "

5 rows × 34 columns

\n", + "
" + ], + "text/plain": [ + " id time y_demean d_demean x1_demean x2_demean x3_demean x4_demean \\\n", + "0 1 1 1.543571 1.760660 2.207607 -2.039516 -0.642847 1.014204 \n", + "1 1 2 -0.016596 -0.464424 0.171849 3.992759 -0.328797 -0.491837 \n", + "2 1 3 -1.504452 -0.903767 -0.400698 1.320131 -0.094129 1.159190 \n", + "3 1 4 0.120814 1.787219 2.484741 2.410387 0.610394 -0.833027 \n", + "4 1 5 -2.948947 -3.505528 -2.589667 3.611134 -0.708582 -2.777927 \n", + "\n", + " x5_demean x6_demean ... x21_demean x22_demean x23_demean x24_demean \\\n", + "0 0.384166 1.826013 ... -3.162933 -2.475942 0.000728 -3.344219 \n", + "1 0.476074 -1.749982 ... -2.058743 0.113560 -1.171550 -2.340844 \n", + "2 -1.500371 -0.985427 ... 0.431538 -1.718253 -2.078895 -2.233126 \n", + "3 -3.260084 -2.029232 ... -0.478447 1.262364 0.339352 0.818443 \n", + "4 -0.314410 -2.784206 ... 1.282645 -0.697230 -0.050420 -0.012080 \n", + "\n", + " x25_demean x26_demean x27_demean x28_demean x29_demean x30_demean \n", + "0 0.082829 -1.351303 -2.670511 -1.275344 0.407596 2.187878 \n", + "1 -3.821246 -1.276666 -0.886531 4.102567 -3.384501 -1.339444 \n", + "2 -1.861004 -0.152563 -0.472863 1.169632 1.400587 -0.370036 \n", + "3 1.180930 -4.218618 2.883741 -0.716668 -0.346795 -4.279304 \n", + "4 1.402519 -0.077461 -1.555679 0.089991 4.482045 1.330727 \n", + "\n", + "[5 rows x 34 columns]" + ] + }, + "execution_count": 9, + "metadata": {}, + "output_type": "execute_result" + } + ], + "source": [ + "dml_plpr_wg_approx = DoubleMLPLPR(data_obj, ml_l=ml_l, ml_m=ml_m, approach=\"wg_approx\", n_folds=5)\n", + "dml_plpr_wg_approx.data_transform.data.head()" + ] + }, + { + "cell_type": "markdown", + "metadata": {}, + "source": [ + "We see that the outcome, treatment and covariate variables are now labeled `y_deman`, `d_deman`, `xi_deman` to indicate the within-group transformations." + ] + }, + { + "cell_type": "code", + "execution_count": 10, + "metadata": {}, + "outputs": [ + { + "name": "stdout", + "output_type": "stream", + "text": [ + " coef std err t P>|t| 2.5 % 97.5 %\n", + "d_demean 0.495323 0.025841 19.167824 6.872435e-82 0.444675 0.545972\n" + ] + } + ], + "source": [ + "dml_plpr_wg_approx.fit()\n", + "print(dml_plpr_wg_approx.summary)" + ] + }, + { + "cell_type": "markdown", + "metadata": {}, + "source": [ + "For the simple linear data generating process `dgp_type=\"dgp1\"`, we can see that all approaches lead to estimated close the true effect of `theta=0.5`." + ] + }, + { + "cell_type": "markdown", + "metadata": {}, + "source": [ + "The `data_original` property additionally includes the original data before any transformation was applied." + ] + }, + { + "cell_type": "code", + "execution_count": 11, + "metadata": {}, + "outputs": [ + { + "data": { + "text/html": [ + "
\n", + "\n", + "\n", + " \n", + " \n", + " \n", + " \n", + " \n", + " \n", + " \n", + " \n", + " \n", + " \n", + " \n", + " \n", + " \n", + " \n", + " \n", + " \n", + " \n", + " \n", + " \n", + " \n", + " \n", + " \n", + " \n", + " \n", + " \n", + " \n", + " \n", + " \n", + " \n", + " \n", + " \n", + " \n", + " \n", + " \n", + " \n", + " \n", + " \n", + " \n", + " \n", + " \n", + " \n", + " \n", + " \n", + " \n", + " \n", + " \n", + " \n", + " \n", + " \n", + " \n", + " \n", + " \n", + " \n", + " \n", + " \n", + " \n", + " \n", + " \n", + " \n", + " \n", + " \n", + " \n", + " \n", + " \n", + " \n", + " \n", + " \n", + " \n", + " \n", + " \n", + " \n", + " \n", + " \n", + " \n", + " \n", + " \n", + " \n", + " \n", + " \n", + " \n", + " \n", + " \n", + " \n", + " \n", + " \n", + " \n", + " \n", + " \n", + " \n", + " \n", + " \n", + " \n", + " \n", + " \n", + " \n", + " \n", + " \n", + " \n", + " \n", + " \n", + " \n", + " \n", + " \n", + " \n", + " \n", + " \n", + " \n", + " \n", + " \n", + " \n", + " \n", + " \n", + " \n", + " \n", + " \n", + " \n", + " \n", + " \n", + " \n", + " \n", + " \n", + " \n", + " \n", + " \n", + " \n", + " \n", + " \n", + " \n", + " \n", + " \n", + " \n", + " \n", + " \n", + " \n", + " \n", + " \n", + " \n", + " \n", + " \n", + " \n", + " \n", + " \n", + " \n", + " \n", + " \n", + " \n", + " \n", + " \n", + "
idtimeydx1x2x3x4x5x6...x21x22x23x24x25x26x27x28x29x30
011-1.2904790.9083071.710715-1.853675-1.4739071.366514-0.3220242.944020...-1.828362-3.010547-0.840202-3.0851591.169952-0.954107-3.925198-0.779510-0.4307001.004298
112-2.850646-1.316777-0.3250434.178599-1.159857-0.139527-0.230115-0.631976...-0.724172-0.421045-2.012480-2.081784-2.734123-0.879470-2.1412184.598401-4.222797-2.523024
213-4.338502-1.756120-0.8975901.505972-0.9251891.511500-2.2065610.132579...1.766109-2.252858-2.919826-1.974066-0.7738810.244633-1.7275501.6654670.562291-1.553616
314-2.7132360.9348661.9878492.596228-0.220666-0.480717-3.966273-0.911226...0.8561240.727759-0.5015791.0775042.268052-3.8214221.629055-0.220834-1.185091-5.462884
415-5.782997-4.357881-3.0865593.796975-1.539641-2.425617-1.020599-1.666200...2.617215-1.231835-0.8913500.2469812.4896420.319735-2.8103660.5858263.6437490.147147
\n", + "

5 rows × 34 columns

\n", + "
" + ], + "text/plain": [ + " id time y d x1 x2 x3 x4 \\\n", + "0 1 1 -1.290479 0.908307 1.710715 -1.853675 -1.473907 1.366514 \n", + "1 1 2 -2.850646 -1.316777 -0.325043 4.178599 -1.159857 -0.139527 \n", + "2 1 3 -4.338502 -1.756120 -0.897590 1.505972 -0.925189 1.511500 \n", + "3 1 4 -2.713236 0.934866 1.987849 2.596228 -0.220666 -0.480717 \n", + "4 1 5 -5.782997 -4.357881 -3.086559 3.796975 -1.539641 -2.425617 \n", + "\n", + " x5 x6 ... x21 x22 x23 x24 x25 \\\n", + "0 -0.322024 2.944020 ... -1.828362 -3.010547 -0.840202 -3.085159 1.169952 \n", + "1 -0.230115 -0.631976 ... -0.724172 -0.421045 -2.012480 -2.081784 -2.734123 \n", + "2 -2.206561 0.132579 ... 1.766109 -2.252858 -2.919826 -1.974066 -0.773881 \n", + "3 -3.966273 -0.911226 ... 0.856124 0.727759 -0.501579 1.077504 2.268052 \n", + "4 -1.020599 -1.666200 ... 2.617215 -1.231835 -0.891350 0.246981 2.489642 \n", + "\n", + " x26 x27 x28 x29 x30 \n", + "0 -0.954107 -3.925198 -0.779510 -0.430700 1.004298 \n", + "1 -0.879470 -2.141218 4.598401 -4.222797 -2.523024 \n", + "2 0.244633 -1.727550 1.665467 0.562291 -1.553616 \n", + "3 -3.821422 1.629055 -0.220834 -1.185091 -5.462884 \n", + "4 0.319735 -2.810366 0.585826 3.643749 0.147147 \n", + "\n", + "[5 rows x 34 columns]" + ] + }, + "execution_count": 11, + "metadata": {}, + "output_type": "execute_result" + } + ], + "source": [ + "dml_plpr_wg_approx.data_original.data.head()" + ] + }, + { + "cell_type": "markdown", + "metadata": {}, + "source": [ + "### Feature preprocessing pipelines\n", + "\n", + "We can incorporate preprocessing pipelines. For example, when using Lasso, we may want to include polynomial and interaction terms. Here, we create a class that allows us to include, for example, polynomials of order 3 and interactions between all variables." + ] + }, + { + "cell_type": "code", + "execution_count": 9, + "metadata": {}, + "outputs": [], + "source": [ + "class PolyPlus(BaseEstimator, TransformerMixin):\n", + " \"\"\"PolynomialFeatures(degree=k) and additional terms x_i^(k+1).\"\"\"\n", + "\n", + " def __init__(self, degree=2, interaction_only=False, include_bias=False):\n", + " self.degree = degree\n", + " self.extra_degree = degree + 1\n", + " self.interaction_only = interaction_only\n", + " self.include_bias = include_bias\n", + " self.poly = PolynomialFeatures(degree=degree, interaction_only=interaction_only, include_bias=include_bias)\n", + "\n", + " def fit(self, X, y=None):\n", + " self.poly.fit(X)\n", + " self.n_features_in_ = X.shape[1]\n", + " return self\n", + "\n", + " def transform(self, X):\n", + " X = np.asarray(X)\n", + " X_poly = self.poly.transform(X)\n", + " X_extra = X ** self.extra_degree\n", + " return np.hstack([X_poly, X_extra])\n", + "\n", + " def get_feature_names_out(self, input_features=None):\n", + " input_features = np.array(\n", + " input_features\n", + " if input_features is not None\n", + " else [f\"x{i}\" for i in range(self.n_features_in_)]\n", + " )\n", + " poly_names = self.poly.get_feature_names_out(input_features)\n", + " extra_names = [f\"{name}^{self.extra_degree}\" for name in input_features]\n", + " return np.concatenate([poly_names, extra_names])" + ] + }, + { + "cell_type": "markdown", + "metadata": {}, + "source": [ + "For this example we use the non-linear and discontinuous `dgp_type=\"dgp3\"`, with 30 covariates and a true treatment effect `theta=0.5`." 
+ ] + }, + { + "cell_type": "code", + "execution_count": 4, + "metadata": {}, + "outputs": [], + "source": [ + "dim_x = 30\n", + "theta = 0.5\n", + "\n", + "np.random.seed(123)\n", + "data_dgp3 = make_plpr_CP2025(num_id=500, num_t=10, dim_x=dim_x, theta=theta, dgp_type=\"dgp3\")\n", + "dml_data_dgp3 = DoubleMLPanelData(data_dgp3, y_col=\"y\", d_cols=\"d\", t_col=\"time\", id_col=\"id\", static_panel=True)" + ] + }, + { + "cell_type": "markdown", + "metadata": {}, + "source": [ + "We can apply the polynomial and intercation transformation for specific sets of covariates. For example, for the `fd_exact` approach, we can apply it to the original $X_{it}$ and lags $X_{it-1}$ seperately using `ColumnTransformer`.\n", + "\n", + "To achieve this, we pass need to pass the corresponding indices for these two sets. [DoubleMLPLPR](http://docs.doubleml.org/stable/guide/models.html#partially-linear-panel-regression-model-plpr) stacks sets $X_{it}$ and $X_{it-1}$ column-wise. Given our example data has 30 covariates, this means that the first 30 features in the nuisance estimation correspond to the original $X_{it}$, and the last 30 correspond to lags $X_{it-1}$. Therefore we define the indices `indices_x` and `indices_x_tr` as below." + ] + }, + { + "cell_type": "code", + "execution_count": 12, + "metadata": {}, + "outputs": [], + "source": [ + "indices_x = [x for x in range(dim_x)]\n", + "indices_x_tr = [x + dim_x for x in indices_x]\n", + "\n", + "preprocessor = ColumnTransformer(\n", + " [\n", + " (\n", + " \"poly_x\",\n", + " PolyPlus(degree=2, include_bias=False, interaction_only=False),\n", + " indices_x,\n", + " ),\n", + " (\n", + " \"poly_x_tr\",\n", + " PolyPlus(degree=2, include_bias=False, interaction_only=False),\n", + " indices_x_tr,\n", + " ),\n", + " ],\n", + " remainder=\"passthrough\",\n", + ")" + ] + }, + { + "cell_type": "markdown", + "metadata": {}, + "source": [ + "This preprocessor can be applied for approaches `cre_general` and `cre_normal` in the same fashion. In this case the two sets of covariates would be the original $X_{it}$ and the unit mean $\\bar{X}_i$.\n", + "\n", + "**Remark**: Note that we set `remainder=\"passthrough\"` such that all remaining features, not part of `indices_x` and `indices_x_tr`, would not be preprocessed but still included in the nuisance estimation. This is particularly important for the `cre_normal` approach, as $\\bar{D}_i$ is further added to $X_{it}$ and $\\bar{X}_i$ in the treatment nuisance model." + ] + }, + { + "cell_type": "markdown", + "metadata": {}, + "source": [ + "Finally, we can create the learner using a pipeline and fit the model." + ] + }, + { + "cell_type": "code", + "execution_count": 49, + "metadata": {}, + "outputs": [ + { + "data": { + "text/html": [ + "
" + ], + "text/plain": [ + "Pipeline(steps=[('columntransformer',\n", + " ColumnTransformer(remainder='passthrough',\n", + " transformers=[('poly_x', PolyPlus(),\n", + " [0, 1, 2, 3, 4, 5, 6, 7, 8, 9,\n", + " 10, 11, 12, 13, 14, 15, 16,\n", + " 17, 18, 19, 20, 21, 22, 23,\n", + " 24, 25, 26, 27, 28, 29]),\n", + " ('poly_x_tr', PolyPlus(),\n", + " [30, 31, 32, 33, 34, 35, 36,\n", + " 37, 38, 39, 40, 41, 42, 43,\n", + " 44, 45, 46, 47, 48, 49, 50,\n", + " 51, 52, 53, 54, 55, 56, 57,\n", + " 58, 59])])),\n", + " ('standardscaler', StandardScaler()),\n", + " ('lassocv', LassoCV(alphas=20, cv=2, n_jobs=5))])" + ] + }, + "execution_count": 49, + "metadata": {}, + "output_type": "execute_result" + } + ], + "source": [ + "ml_lasso = make_pipeline(\n", + " preprocessor, StandardScaler(), LassoCV(alphas=20, cv=2, n_jobs=5)\n", + ")\n", + "\n", + "ml_lasso" + ] + }, + { + "cell_type": "code", + "execution_count": 50, + "metadata": {}, + "outputs": [ + { + "name": "stdout", + "output_type": "stream", + "text": [ + " coef std err t P>|t| 2.5 % 97.5 %\n", + "d_diff 0.516685 0.018054 28.619386 3.855865e-180 0.481301 0.55207\n" + ] + } + ], + "source": [ + "plpr_lasso_fd = DoubleMLPLPR(dml_data_dgp3, clone(ml_lasso), clone(ml_lasso), approach=\"fd_exact\", n_folds=5)\n", + "plpr_lasso_fd.fit(store_models=True)\n", + "print(plpr_lasso_fd.summary)" + ] + }, + { + "cell_type": "markdown", + "metadata": {}, + "source": [ + "Given that we apply the polynomial and interactions preprossing to two sets of 30 columns each, the number of features is 1050." + ] + }, + { + "cell_type": "code", + "execution_count": 51, + "metadata": {}, + "outputs": [ + { + "data": { + "text/plain": [ + "1050" + ] + }, + "execution_count": 51, + "metadata": {}, + "output_type": "execute_result" + } + ], + "source": [ + "plpr_lasso_fd.models[\"ml_m\"][\"d_diff\"][0][0].named_steps[\"lassocv\"].n_features_in_" + ] + }, + { + "cell_type": "markdown", + "metadata": {}, + "source": [ + "As describes above, for the `cre_normal` approach adds $\\bar{X}_i$ to the features used in the treatment nuisance estimation." + ] + }, + { + "cell_type": "code", + "execution_count": 54, + "metadata": {}, + "outputs": [ + { + "name": "stdout", + "output_type": "stream", + "text": [ + " coef std err t P>|t| 2.5 % 97.5 %\n", + "d 0.552196 0.028428 19.424151 4.822927e-84 0.496478 0.607915\n" + ] + }, + { + "data": { + "text/plain": [ + "1051" + ] + }, + "execution_count": 54, + "metadata": {}, + "output_type": "execute_result" + } + ], + "source": [ + "plpr_lasso_cre_normal = DoubleMLPLPR(dml_data_dgp3, clone(ml_lasso), clone(ml_lasso), approach=\"cre_normal\", n_folds=5)\n", + "plpr_lasso_cre_normal.fit(store_models=True)\n", + "print(plpr_lasso_cre_normal.summary)\n", + "plpr_lasso_cre_normal.models[\"ml_m\"][\"d\"][0][0].named_steps[\"lassocv\"].n_features_in_" + ] + }, + { + "cell_type": "markdown", + "metadata": {}, + "source": [ + "For the `wg_approx` approach, there is only one set of features. We can create a similar learner for this setting." 
+ ] + }, + { + "cell_type": "code", + "execution_count": 13, + "metadata": {}, + "outputs": [], + "source": [ + "preprocessor_wg = ColumnTransformer(\n", + " [\n", + " (\n", + " \"poly_x\",\n", + " PolyPlus(degree=2, include_bias=False, interaction_only=False),\n", + " indices_x,\n", + " )\n", + " ],\n", + " remainder=\"passthrough\",\n", + ")\n", + "\n", + "ml_lasso_wg = make_pipeline(\n", + " preprocessor_wg, StandardScaler(), LassoCV(alphas=20, cv=2, n_jobs=5)\n", + ")" + ] + }, + { + "cell_type": "code", + "execution_count": 14, + "metadata": {}, + "outputs": [ + { + "name": "stdout", + "output_type": "stream", + "text": [ + " coef std err t P>|t| 2.5 % 97.5 %\n", + "d_demean 1.150157 0.014083 81.671376 0.0 1.122555 1.177758\n" + ] + }, + { + "data": { + "text/plain": [ + "525" + ] + }, + "execution_count": 14, + "metadata": {}, + "output_type": "execute_result" + } + ], + "source": [ + "plpr_lasso_wg = DoubleMLPLPR(dml_data_dgp3, clone(ml_lasso_wg), clone(ml_lasso_wg), approach=\"wg_approx\", n_folds=5)\n", + "plpr_lasso_wg.fit(store_models=True)\n", + "print(plpr_lasso_wg.summary)\n", + "plpr_lasso_wg.models[\"ml_l\"][\"d_demean\"][0][0].named_steps[\"lassocv\"].n_features_in_" + ] + }, + { + "cell_type": "markdown", + "metadata": {}, + "source": [ + "We can see that for the more complicated data generating process `dgp3`, the approximation approach performs worse compared to the other approaches." + ] + }, + { + "cell_type": "markdown", + "metadata": {}, + "source": [ + "As another example, below we should how to select a specific covariate subset for preprocessing. This can be useful in case of the data includes dummy covariates, where adding polynomials might not be appropriate." + ] + }, + { + "cell_type": "code", + "execution_count": null, + "metadata": {}, + "outputs": [ + { + "data": { + "text/plain": [ + "39" + ] + }, + "execution_count": 20, + "metadata": {}, + "output_type": "execute_result" + } + ], + "source": [ + "x_cols = dml_data_dgp3.x_cols \n", + "x_cols_to_pre = [\"x3\", \"x6\", \"x22\"]\n", + "\n", + "indices_x_pre = [i for i, c in enumerate(x_cols) if c in x_cols_to_pre]\n", + "\n", + "preprocessor_alt = ColumnTransformer(\n", + " [\n", + " (\n", + " \"poly_x\",\n", + " PolyPlus(degree=2, include_bias=False, interaction_only=False),\n", + " indices_x_pre,\n", + " )\n", + " ],\n", + " remainder=\"passthrough\",\n", + ")\n", + "ml_lasso_alt = make_pipeline(\n", + " preprocessor_alt, StandardScaler(), LassoCV(alphas=20, cv=2, n_jobs=5)\n", + ")\n", + "\n", + "plpr_lasso_wg.learner[\"ml_l\"] = ml_lasso_alt\n", + "plpr_lasso_wg.learner[\"ml_m\"] = ml_lasso_alt\n", + "\n", + "plpr_lasso_wg.fit(store_models=True)\n", + "plpr_lasso_wg.models[\"ml_l\"][\"d_demean\"][0][0].named_steps[\"lassocv\"].n_features_in_" + ] + }, + { + "cell_type": "code", + "execution_count": 22, + "metadata": {}, + "outputs": [ + { + "data": { + "text/html": [ + "
" + ], + "text/plain": [ + "ColumnTransformer(remainder='passthrough',\n", + " transformers=[('poly_x', PolyPlus(), [2, 5, 21])])" + ] + }, + "execution_count": 22, + "metadata": {}, + "output_type": "execute_result" + } + ], + "source": [ + "plpr_lasso_wg.models[\"ml_l\"][\"d_demean\"][0][0].named_steps['columntransformer']" + ] + }, + { + "cell_type": "markdown", + "metadata": {}, + "source": [ + "We can also look at the resulting features.\n", + "\n", + "**Remark**: Note, however, that the feature names here refer only to the corresponding `x_cols` indices, not the column names from the `pd.DataFrame` because [DoubleML](https://docs.doubleml.org/stable/index.html) uses `np.array`'s for fitting the model. Therefore the difference to the names from `x_cols_to_pre`." + ] + }, + { + "cell_type": "code", + "execution_count": 25, + "metadata": {}, + "outputs": [ + { + "data": { + "text/plain": [ + "array(['poly_x__x2', 'poly_x__x5', 'poly_x__x21', 'poly_x__x2^2',\n", + " 'poly_x__x2 x5', 'poly_x__x2 x21', 'poly_x__x5^2',\n", + " 'poly_x__x5 x21', 'poly_x__x21^2', 'poly_x__x2^3', 'poly_x__x5^3',\n", + " 'poly_x__x21^3', 'remainder__x0', 'remainder__x1', 'remainder__x3',\n", + " 'remainder__x4', 'remainder__x6', 'remainder__x7', 'remainder__x8',\n", + " 'remainder__x9', 'remainder__x10', 'remainder__x11',\n", + " 'remainder__x12', 'remainder__x13', 'remainder__x14',\n", + " 'remainder__x15', 'remainder__x16', 'remainder__x17',\n", + " 'remainder__x18', 'remainder__x19', 'remainder__x20',\n", + " 'remainder__x22', 'remainder__x23', 'remainder__x24',\n", + " 'remainder__x25', 'remainder__x26', 'remainder__x27',\n", + " 'remainder__x28', 'remainder__x29'], dtype=object)" + ] + }, + "execution_count": 25, + "metadata": {}, + "output_type": "execute_result" + } + ], + "source": [ + "plpr_lasso_wg.models[\"ml_l\"][\"d_demean\"][0][0].named_steps['columntransformer'].get_feature_names_out()\n" + ] + }, + { + "cell_type": "markdown", + "metadata": {}, + "source": [ + "### Hyperparameter tuning\n", + "\n", + "In this section we will use the `tune_ml_models()` method to tune hyperparameters using the [Optuna](https://optuna.org/) package. More details can found in the [Python: Hyperparametertuning with Optuna](https://docs.doubleml.org/stable/examples/learners/py_optuna.html) example notebook.\n", + "\n", + "As an example, we use [LightGBM](https://lightgbm.readthedocs.io/en/stable/) regressors and compare the estimates for the different static panel model approaches, when applied to the non-linear and discontinuous `dgp3`." 
+ ] + }, + { + "cell_type": "code", + "execution_count": 5, + "metadata": {}, + "outputs": [], + "source": [ + "dim_x = 30\n", + "theta = 0.5\n", + "\n", + "np.random.seed(11)\n", + "data_tune = make_plpr_CP2025(num_id=4000, num_t=10, dim_x=dim_x, theta=theta, dgp_type=\"dgp3\")\n", + "dml_data_tune = DoubleMLPanelData(data_tune, y_col=\"y\", d_cols=\"d\", t_col=\"time\", id_col=\"id\", static_panel=True)\n", + "ml_boost = LGBMRegressor(random_state=314, verbose=-1)" + ] + }, + { + "cell_type": "code", + "execution_count": 6, + "metadata": {}, + "outputs": [], + "source": [ + "# parameter space for both ml models\n", + "def ml_params(trial):\n", + " return {\n", + " \"n_estimators\": 100,\n", + " \"learning_rate\": trial.suggest_float(\"learning_rate\", 0.1, 0.4, log=True),\n", + " \"max_depth\": trial.suggest_int(\"max_depth\", 2, 10),\n", + " \"min_child_samples\": trial.suggest_int(\"min_child_samples\", 1, 5),\n", + " \"reg_lambda\": trial.suggest_float(\"reg_lambda\", 1e-2, 5, log=True),\n", + " }\n", + "\n", + "param_space = {\n", + " \"ml_l\": ml_params,\n", + " \"ml_m\": ml_params\n", + "}\n", + "\n", + "optuna_settings = {\n", + " \"n_trials\": 100,\n", + " \"show_progress_bar\": True,\n", + " \"verbosity\": optuna.logging.WARNING, # Suppress Optuna logs\n", + "}" + ] + }, + { + "cell_type": "code", + "execution_count": 7, + "metadata": {}, + "outputs": [ + { + "name": "stderr", + "output_type": "stream", + "text": [ + "Best trial: 94. Best value: -1.45766: 100%|██████████| 100/100 [03:22<00:00, 2.03s/it]\n", + "Best trial: 91. Best value: -1.2035: 100%|██████████| 100/100 [03:30<00:00, 2.11s/it]\n" + ] + }, + { + "data": { + "text/html": [ + "
\n", + "\n", + "\n", + " \n", + " \n", + " \n", + " \n", + " \n", + " \n", + " \n", + " \n", + " \n", + " \n", + " \n", + " \n", + " \n", + " \n", + " \n", + " \n", + " \n", + " \n", + " \n", + " \n", + " \n", + " \n", + "
coefstd errtP>|t|2.5 %97.5 %
d0.5076950.00815462.2621370.00.4917130.523677
\n", + "
" + ], + "text/plain": [ + " coef std err t P>|t| 2.5 % 97.5 %\n", + "d 0.507695 0.008154 62.262137 0.0 0.491713 0.523677" + ] + }, + "execution_count": 7, + "metadata": {}, + "output_type": "execute_result" + } + ], + "source": [ + "plpr_tune_cre_general = DoubleMLPLPR(dml_data_tune, clone(ml_boost), clone(ml_boost), approach=\"cre_general\", n_folds=5)\n", + "\n", + "plpr_tune_cre_general.tune_ml_models(\n", + " ml_param_space=param_space,\n", + " optuna_settings=optuna_settings,\n", + ")\n", + "\n", + "plpr_tune_cre_general.fit()\n", + "plpr_tune_cre_general.summary" + ] + }, + { + "cell_type": "markdown", + "metadata": {}, + "source": [ + "0.509102" + ] + }, + { + "cell_type": "code", + "execution_count": 8, + "metadata": {}, + "outputs": [ + { + "name": "stderr", + "output_type": "stream", + "text": [ + "Best trial: 71. Best value: -1.46224: 100%|██████████| 100/100 [03:22<00:00, 2.03s/it]\n", + "Best trial: 43. Best value: -1.22011: 100%|██████████| 100/100 [03:20<00:00, 2.00s/it]\n" + ] + }, + { + "data": { + "text/html": [ + "
\n", + "\n", + "\n", + " \n", + " \n", + " \n", + " \n", + " \n", + " \n", + " \n", + " \n", + " \n", + " \n", + " \n", + " \n", + " \n", + " \n", + " \n", + " \n", + " \n", + " \n", + " \n", + " \n", + " \n", + " \n", + "
coefstd errtP>|t|2.5 %97.5 %
d0.4729770.01013446.6708740.00.4531140.492839
\n", + "
" + ], + "text/plain": [ + " coef std err t P>|t| 2.5 % 97.5 %\n", + "d 0.472977 0.010134 46.670874 0.0 0.453114 0.492839" + ] + }, + "execution_count": 8, + "metadata": {}, + "output_type": "execute_result" + } + ], + "source": [ + "plpr_tune_cre_normal = DoubleMLPLPR(dml_data_tune, clone(ml_boost), clone(ml_boost), approach=\"cre_normal\", n_folds=5)\n", + "\n", + "plpr_tune_cre_normal.tune_ml_models(\n", + " ml_param_space=param_space,\n", + " optuna_settings=optuna_settings,\n", + ")\n", + "\n", + "plpr_tune_cre_normal.fit()\n", + "plpr_tune_cre_normal.summary" + ] + }, + { + "cell_type": "code", + "execution_count": 14, + "metadata": {}, + "outputs": [ + { + "name": "stderr", + "output_type": "stream", + "text": [ + "Best trial: 98. Best value: -1.75751: 100%|██████████| 100/100 [01:51<00:00, 1.11s/it]\n", + "Best trial: 90. Best value: -1.51545: 100%|██████████| 100/100 [02:03<00:00, 1.24s/it]\n" + ] + }, + { + "data": { + "text/html": [ + "
\n", + "\n", + "\n", + " \n", + " \n", + " \n", + " \n", + " \n", + " \n", + " \n", + " \n", + " \n", + " \n", + " \n", + " \n", + " \n", + " \n", + " \n", + " \n", + " \n", + " \n", + " \n", + " \n", + " \n", + " \n", + "
coefstd errtP>|t|2.5 %97.5 %
d_diff0.5515950.00865163.7575310.00.5346380.568551
\n", + "
" + ], + "text/plain": [ + " coef std err t P>|t| 2.5 % 97.5 %\n", + "d_diff 0.551595 0.008651 63.757531 0.0 0.534638 0.568551" + ] + }, + "execution_count": 14, + "metadata": {}, + "output_type": "execute_result" + } + ], + "source": [ + "plpr_tune_fd = DoubleMLPLPR(dml_data_tune, clone(ml_boost), clone(ml_boost), approach=\"fd_exact\", n_folds=5)\n", + "\n", + "plpr_tune_fd.tune_ml_models(\n", + " ml_param_space=param_space,\n", + " optuna_settings=optuna_settings,\n", + ")\n", + "\n", + "plpr_tune_fd.fit()\n", + "plpr_tune_fd.summary" + ] + }, + { + "cell_type": "code", + "execution_count": 11, + "metadata": {}, + "outputs": [ + { + "name": "stderr", + "output_type": "stream", + "text": [ + "Best trial: 91. Best value: -2.25528: 100%|██████████| 100/100 [01:26<00:00, 1.15it/s]\n", + "Best trial: 21. Best value: -1.62987: 100%|██████████| 100/100 [01:33<00:00, 1.07it/s]\n" + ] + }, + { + "data": { + "text/html": [ + "
\n", + "\n", + "\n", + " \n", + " \n", + " \n", + " \n", + " \n", + " \n", + " \n", + " \n", + " \n", + " \n", + " \n", + " \n", + " \n", + " \n", + " \n", + " \n", + " \n", + " \n", + " \n", + " \n", + " \n", + " \n", + "
coefstd errtP>|t|2.5 %97.5 %
d_demean1.1374080.004974228.6562330.01.1276591.147158
\n", + "
" + ], + "text/plain": [ + " coef std err t P>|t| 2.5 % 97.5 %\n", + "d_demean 1.137408 0.004974 228.656233 0.0 1.127659 1.147158" + ] + }, + "execution_count": 11, + "metadata": {}, + "output_type": "execute_result" + } + ], + "source": [ + "plpr_tune_wg = DoubleMLPLPR(dml_data_tune, clone(ml_boost), clone(ml_boost), approach=\"wg_approx\", n_folds=5)\n", + "\n", + "plpr_tune_wg.tune_ml_models(\n", + " ml_param_space=param_space,\n", + " optuna_settings=optuna_settings,\n", + ")\n", + "\n", + "plpr_tune_wg.fit()\n", + "plpr_tune_wg.summary" + ] + }, + { + "cell_type": "code", + "execution_count": 15, + "metadata": {}, + "outputs": [ + { + "name": "stdout", + "output_type": "stream", + "text": [ + "True treatment effect: 0.5\n", + "\n", + " Model theta se ci_lower ci_upper\n", + "cre_general 0.507695 0.008154 0.491713 0.523677\n", + " cre_normal 0.472977 0.010134 0.453114 0.492839\n", + " fd_exact 0.551595 0.008651 0.534638 0.568551\n", + " wg_approx 1.137408 0.004974 1.127659 1.147158\n" + ] + }, + { + "data": { + "image/png": "iVBORw0KGgoAAAANSUhEUgAABKUAAAJOCAYAAABm7rQwAAAAOnRFWHRTb2Z0d2FyZQBNYXRwbG90bGliIHZlcnNpb24zLjEwLjMsIGh0dHBzOi8vbWF0cGxvdGxpYi5vcmcvZiW1igAAAAlwSFlzAAAPYQAAD2EBqD+naQAAi+JJREFUeJzt3Qd8U+X+x/FfFy0FWobMCjJVQAQERVw4QJwXnOhFQRG8ep3gAgcIzqsyFOdVUFRU3NcrigMnigvFgaKCCF72bMtoKU3+r+/j/9Q0bWnapmmTft6vV6A5OSd5kpw8yfmd3/N74vx+v98AAAAAAACACIqP5IMBAAAAAAAAQlAKAAAAAAAAEUdQCgAAAAAAABFHUAoAAAAAAAARR1AKAAAAAAAAEUdQCgAAAAAAABFHUAoAAAAAAAARR1AKAAAAAAAAEUdQCgAAAAAAABFHUAoAgCgRFxdnN998c1U3A9XAkUceafvtt1+p6/3+++9uv3niiSci0i7ULNqvtH999dVXVd0UAECUIigFAIgaS5cutX/84x/Wtm1bS0lJsbS0NDv00EPt3nvvtR07dlR181ADeEEe75KUlGR77LGHHXLIIXb99dfbihUrLFZ88MEHBc/z6aefLnYdff50e3CArHXr1nbSSSft9v7PO++8Qq+lPs9du3a1iRMnWm5ubsF6CsTq9g0bNoTUVu99UT8xZMgQ++2330p8/+Lj461hw4Z2/PHH2/z588vw6gAAgHBIDMu9AABQyWbPnm1nnHGGJScnuwNNHQTv3LnT5s2bZ9dcc40tWrTI/v3vf1ssU+AtMZGv7urg7LPPthNOOMF8Pp9t3rzZvvzyS5syZYoLkE6bNs3OOussixUKAD/zzDN2zjnnFFquAM+nn37qbi8vfZ4fe+wx9/eWLVvspZdesquvvtq9ns8991yZ7+/yyy+3Aw880PLy8uzrr792fYL6ju+//95atGhR5P3Lz8+3X375xR588EE76qij3ON26dKl3M8HAACUDb9sAQDV3rJly9xB/l577WXvvfeeNW/evOC2Sy65xJYsWeIOPGORgh4KvunAvyIH/9Fu27ZtVqdOHasuDjjggCJBmuXLl9uxxx5rQ4cOtY4dO7qsn1ig4M1rr73mMpWUFeZRoKpp06bWoUMHF5grDwVZA1/Hf/7zn9arVy+bNWuWTZo0qVAgKRSHH364nX766e7v888/3/bee28XqJoxY4aNGTOmxPdP2ylb6qGHHnIBqmiSk5NjtWrVcllfAABEG769AADV3l133WVbt251GSiBASlP+/bt7Yorrii4vmvXLrvlllusXbt2LhNDQ4k0tCpwSFDgECMN/enZs6fVrl3bZUnourz88svuuoJBPXr0sG+++abI8KO6deu64UH9+/d3QRMdRE+YMMH8fn+hde+55x43xKtRo0bucXR/L774YpHnoiFFl156qc2cOdM6d+7s2j9nzpxia0plZ2fblVde6Z6H1mvSpIn169fPZYgEeuGFF9zj6XEVVNDB+MqVK4t9Llo+cOBA93fjxo1d1oqySUrzn//8x0488UT3/NUWvfZ6D4rb9vPPP3eBjgYNGrjXbP/993cZRsFt0XBNrVevXj0bPHhwQXDqqquuspYtW7rH2WeffdxrG/x6v/POO3bYYYdZ/fr13X1pPe0DgaZOnepe49TUVNcW7QMKtJSXgqaqsaMgovbZQNpHlOmnoWJ6vIMPPrhIINWrz6MMpOKGpnn7ZaAFCxa4/UrvbZs2bezhhx8Oqa2LFy92wRu1R/u3nrsCT8UZMGCAe621HwXSa3XmmWdaQkKChYsCK6qXJcGvQ3kcffTRBYHt3VFQSrTPlWbTpk3uc6G+QfuWhh0qoPXtt98W+74pwKZ9r1mzZm5//9vf/mZ//PFHsTXCSns/vftUFtmNN95oGRkZbn/KysoK+bP+3Xffuc+YNwxa7Ro2bJht3LixyHPVthdccEHB51ptuvjii90+Hkh966hRo1yfoed4yimn2Pr164vc35tvvulea62jz7X6DGW5BlqzZo0LKO65557uMdXnax8Mx/4AAKh+yJQCAFR7//3vf90BlA7WQjF8+HCXGaGDbgUwFAS544477KeffrJXXnml0LrKsvr73//ualXpAE4BjpNPPtkdDOpAUpkbou11AP7zzz8XykhQ0OW4445zQQYFIhRAGjdunAuMKTjlUdBFB6MKruiATgeVClK8/vrr7sAskLLBnn/+eRec0oGlgk7Fueiii1xgS+t16tTJHVRqOKOepzJBvECHDvA0pEnPYe3ata4tn3zyiQuyKWgT+FwUXFOmil6Hd99919X3UYBJB6K7o8fRAboOTPW/nsPYsWPdwfLdd99dKFikQKAONBVI1AGx2qvXITiwqLYosKS26MBbgSe9hu+//747UO7WrZu99dZbbvimDp4nT57sttVBrh5DwS69Bzqw1fus5+x59NFHXQaN9hE9rrJNdLCufUX7Q3n17t3bvV56nh695t
p3t2/f7h5TgUntn3ouev90AF8eyk5S0E77pYajaZ/R+6SsGQUZSqLXR7WgFNAYPXq0CxBoWwUjNXwuuD167RUUePbZZwv2AwVgdD8aeqfXLZy8wJBep0jdlxfwUHCyNAowvvrqq+7zqyCN3t9HHnnE+vTpYz/++GOR7K7bbrvNBZKuu+46W7dunRvm2bdvX1u4cKELHpXn/VTAV8sVHFNASH+H+lnXvqnnoHX1+fOGPuv/zz77zLVVVq1aZQcddJAbVnnhhRfavvvu6z5n2me1L+sxPZdddpl77dT36bXUc1S/pICc56mnnnJZhPpc/+tf/3L3ocw0fcbVPq+fO+2001xbdJ9aptdMbVa9tpL6QgBAFPMDAFCNZWZmKgXGP2DAgJDWX7hwoVt/+PDhhZZfffXVbvl7771XsGyvvfZyyz799NOCZW+99ZZbVrt2bf/y5csLlj/yyCNu+fvvv1+wbOjQoW7ZZZddVrDM5/P5TzzxRH+tWrX869evL1i+ffv2Qu3ZuXOnf7/99vMfffTRhZbr/uLj4/2LFi0q8tx027hx4wqup6en+y+55JISXws9RpMmTdzj7Nixo2D566+/7u5r7NixRZ7LhAkTCt1H9+7d/T169CjxMUp6fvKPf/zDn5qa6s/JyXHXd+3a5W/Tpo173Tdv3lxoXb1uwW0ZPXp0oXVeffVVt/zWW28ttPz000/3x8XF+ZcsWeKuT5482a0X+PoH0/7UuXNnf1ktW7bM3ffdd9+92/vWOtp35corr3TXP/7444J1srOz3WvRunVrf35+vlv2+OOPu/X0GIG0zwXve3369HHLJk6cWLAsNzfX361bN/ee670PbK/u23PMMcf4u3TpUvC+eK//IYcc4u/QoUORx33hhRfcPqPXeMWKFe62a665xt+2bduCtgS/lnqP9TnYHb3PderUce+TLnr/br/9dvc4+++/f8F62udLez+9tk6fPt2tt2rVKv/s2bPd66v7+/LLLwu9HuPHj3frrVmzxr0vBx54YMFzLY1eN+898+h+k5OTC31+vDZlZGT4s7KyCpY///zzbvm9995b5vfTu0+99oGfubJ81ov7rD777LNuvY8++qhg2ZAhQ1xf5L12xX1evX22b9++hT7DI0eO9CckJPi3bNlSsL/Xr1/fP2LEiEL3o9df/Zi3XP1CaZ8vAEBsYfgeAKBa84alaKhHKN544w33vzJ2AiljSoKHTCnDSNktHmUJecN+WrVqVWR54ExeHmUEBA+/UzaUMo08wRkRmZmZbhhL8FA7UcaF2lUaZT4os0cZDcXRNO3KMlC2V2A9KmVmKeuhuDpcyr4KpDYW95yDBT4/DStU/SFtq2wIDRUTZUNoGJWGHAZmaImXnREoODtL762GiinbKPi9VcxOQ4PEu28NKVRNruJonf/973+usHW4KVPMex28divjRBkhgeso+0RZJcquKW89JmX4eZS5out6zzUMrKShZ8piUzaO9z7poiw7ZbD8+uuvRYZ7iWplaaifMvz0Wut/ZfNUlIZjasiXLhqGq+xEfR6DMxpDpYwi3ZeylbSf6/6VlabhiYGU0aP1lCmk/VTZesoK9OpR7Y4y77xsSWUX6rXzhogW93nWxAyB/ZceQ5mCXl9VnvdTGUeBn7myfNYDt1OGoN5/ZXqK1359bpQNpqzR4NeuuM+r9uXAZXpN9dqozpoo00kZV9pnvH1OF32e1bcq+9Frm563himWt04ZACC6EJQCAFRrqtcSeIBfGh0E6YBRB7iBdPCpQIR3kOQJDDxJenq6+181i4pbHnygpMfS0MJAKq4sgTVQNDxNB346YNTBvQ6INXRFwalgGhIUCg0X/OGHH1xbFfRQvanAAJL3XHWwHEwHqsGvhdqmdgXSkJxQDg413EbDvvQ66T3T/XiFpL3n6A2lUu2c0ugAXTVlAqm9CjYEByhVVDzw+Q4aNMgNT9MwThXiVpF8DYUKDFBpKJUCCXrdVKhbBfMDh/dVhOqfiddOtau49yC43WWl1yK4+Htx+14gDWNUUOmmm24qCAZ5FwVqRMGNYElJSW64mupIffTRR64mUkWGOQbucwpY6OLdr96H4M9UqDRkVPelwJuGFSpge+655xZZT0EUraehwSNHjnQzW4ZSO020H2moqPYbBag0xFavnx6vuM+z1guk4I36p+D3qCzvZ3AfUZbPugKTGrKqz4aCQGq7d39e+1UPSicEQvmsFtePesMgvb5DwU4v2B+837399tsF+5xeTw3tU4BZ7TviiCNcP6c6UwCA2ERNKQBAtaYAhw7WFHwpi+Iyb4pTUpHmkpYHF9QOxccff+zqB+kASzN7KUtCB/mPP/54sYW1AzMZdkfZLspIUFaJDuxUu0kHdCrQrsLLZVXegtXKgFB2l94r1XBSTSUFG5R1oeBPSdlKoWajlJVePwU4lH2hDBHV+VJtGx0Q63XS81RASPXBFCzU7aqlpPdGQY3x48dbRWhfVdF5L6Ba0X021GBJKLz3QrWIlBlVnOCArkdBKNVaU/BTMwuGks1XGr0Xqq8ULio+Hsr9KVDkraf6Y2qH6msdddRRxWYGBbr99ttdUE9ZWartpCCz9lVlAJZnXy+PUPuIkvqNTz/91NViU102BWfVbtXGK2/7S+svvftVXSmdICguCO3R66gMLWVqqWacXmvVyFKgsXv37uVqHwCg+iIoBQCo9nTQqEK88+fPLzTUrqQZ0HQApDPzXiaKqOivgie6PZz0WMpO8jIa5JdffnH/e0V5FfBQkEYHWAq2eBSUqigFuDRkRxdlG6jAuQorKyjlPVcFX7xZyDxaFq7XQkNtNIRJwTAF3jzBM54pWOUFbcoTiFB7NSRSWXOB2VLe8MDA56MgwTHHHOMukyZNcoGEG264wQWqvMdWVoqyqnTRcMtTTz3VvXZjxowpNASqLLSPKiPMyxLz2qXXO1hwu73sEu2ngUrKpFIWkIanBWbXBO97wbwMJAVFy/oeaPihMmL0fiv4GUu0b6j4vWa082a7LIkKfSt4pdlAA+l9U9ZUMC9LKDBQo4w1FeKv6PvpCfWzrsyluXPnusCrArAltVEZTAqqlvVkQEm8z76CtaHsd1pfw3J1UdsUPNPwyqeffjos7QEAVB8M3wMAVHvXXnutO1DTcCwFl4IpCKBZpkSzV4lmfwqkwIQEz3QXDvfff3+hA05d10G/AiJeFoGyYAIzXjQcR5kA5aX7Ch4qpAM+ZZVpNi5RxoeWKbvFWyYaGqMaOuF6LbwsicAsMgV5lHkUSAEzDRPSexMceAklA03vrZ534OstGkql19fLDtPwpGA6qBXvdVAQLZDq2CjzR+3Iy8uz8lDw6LzzznP3pSyUwHZ/8cUXLmDlUfBBgVYFG7yMI+/AXVleHj1frVcczVCoWd8CX3NdV0ChR48exW6j/eHII490661evbrI7Rq2VRK9xvfdd58b5lfckLhopqG9qt+kwLFmxSttfw/eX1944YVia3HJk08+WWj4sYJaeu2DsxnL8356Qv2sF/dZLa6/VFBXszFqeKPqVVU0Y1RZeQpyK
Thc3OfL2+9Ug051rgLpc6EgdODzAgDEDjKlAADVng5KNMxNGS3KflLhYNU60UGbhqHogFDBANGwIhUB1oG8N6xMAQEVO9ZBljIcwkkZNcqs0GOqYK8OAjVkTAWbvfpMOiBUUEzDYzQEShlNDzzwgBsmpTo05aGDXNVcUtFkPWcNwVEWkQp3K6NAFBhTRoumftfroCLD3jTxCoaolk44HHLIIS7LR6+BipAreKFhOsEHrjrQVR0tDc1RkEjtUqaXMoZUk0oBgd3Rdnr/lNWioJ6et4bjqaC5hvx4QR0NIVRgR6+7MkT0eitAptfLKzauwt0aRqTaU6pdowN3Bbu0TShF9TU0UVkbypTTfqbXXRlx3nMPzILRsLBnn33WBSH0+mi4l/ZHZZJpG2+YYufOnV3dMWVqKbDmFRZXsKI4CkDq/dVroUw9DVFUQEX7vt77kmjf0+ugoW4jRoxw2VPaLxQ0U/H3b7/9tsRtBwwY4C6hUDbQrbfeWmS5hmCVJyCqz1BqamqhZXrt9FkLB9VZUnDmzjvvdK/77jI3tY9p/9W+//3339vMmTNLrIOl91Gvt9bX66zH0Gdfr3043s+yfNYVGPLqNCk4lJGR4T5DwVmNogCSbtP9qQaX+l4F09Tfzps3r8hkBbujx9VnX8FMBadV503944oVK1x/qc+hPn/KDFMwX0MMFazVsD4NT9Zz0TYAgBhU1dP/AQAQql9++cVNHa5p3mvVquWvV6+e/9BDD/VPnTq10PT2eXl5bsr3Nm3a+JOSkvwtW7b0jxkzptA6u5u2Xl+Pl1xySaFl3lTygVOVe1PaL1261H/sscf6U1NT/U2bNnVT2AdPGT9t2jR/hw4d3LTx++67r5tK3ZvqvrTHDrxN23jTxV9zzTX+rl27utdB7dDfDz74YJHtZs2a5e/evbt77IYNG/oHDx7s/9///ldoHe+5BCuujcX55JNP/AcffLC/du3a/hYtWvivvfZa/1tvveW21TT2gebNm+fv169fQbv3339/9x6W1hZvanlNN6/H0Hur11TvSeB09HPnzvUPGDDAraP9RP+fffbZbv/xPPLII/4jjjjC36hRI/e6tGvXzr2emZmZu32e3n7gXRITE91r2qtXL7ePLV++vNjttI+cfvrp/vr16/tTUlL8Bx10kP/1118vdr2+ffu6Nmlfuv766/3vvPNOkdexT58+/s6dO/u/+uorf+/evd19an++//77i22v9rfgxxkyZIi/WbNm7nXMyMjwn3TSSf4XX3yxYB09nrZ94YUXdvuaeG0JpLYEvk6BlwsuuKDU97m4fbC4S0JCQpnaWtznONB5553n7nPJkiUl3of6kauuusrfvHlzt7+rD5o/f757HXTxeG169tln3b7RpEkTt776nOD9JNT3s7TnGcpnXddPOeUUty+mp6f7zzjjDP+qVasK9S8etVP7SePGjd19tm3b1vVP6n9E+5W2+/LLL4ttZ/BnX9f79+/vHlfPUZ87veZ63rJhwwZ3/+ojtW9oPX22nn/++RLfDwBAdIvTP1UdGAMAIBopO0tDcbzZ1gDAo9pbyuxTZpEyGndHQyo3bNgQthpOAABEC2pKAQAAAAAAIOIISgEAAAAAACDiCEoBAAAAAAAg4qgpBQAAAAAAgIgjUwoAAAAAAAARR1AKAAAAAAAAEZdoNYzP57NVq1ZZvXr1LC4urqqbAwAAAAAAEFNUKSo7O9tatGhh8fEl50PVuKCUAlItW7as6mYAAAAAAADEtD/++MP23HPPEm+vcUEpZUh5L0xaWppFe9bX+vXrrXHjxruNPAIASkZfCgAVR18KABXni6G+NCsryyUEeTGYktS4oJQ3ZE8BqVgISuXk5LjnEe07LABUFfpSAKg4+lIAqDhfDPalpZVNio1nCQAAAAAAgKhCUAoAAAAAAAARR1AKAAAAAAAAEVfjakqFKj8/3/Ly8qy6jzdVGzXmNFbGm1YXSUlJlpCQUNXNAAAAAAAgZhGUCuL3+23NmjW2ZcsWi4a2KjCVnZ1davEwlF39+vWtWbNmvLYAAAAAAFQCglJBvIBUkyZNLDU1tVoHJBSU2rVrlyUmJlbrdkYbva7bt2+3devWuevNmzev6iYBAAAAABBzCEoFDdnzAlKNGjWy6o6gVOWpXbu2+1+BKe0PDOUDAAAAACC8KEQUwKshpQwpwNsPqnttMQAAAAAAohFBqWKQdQRhPwAAAAAAoPIQlAIAAAAAAEDEEZRCxL366qvWvn17V6fpyiuvLHEZAAAAAACIXQSlYkB8fLwbalbS5eabb7bq5B//+Iedfvrp9scff9gtt9xS4rKK+OCDD9xzV+F6AAAAAABQ/TD7XgxYtWpVQf2jWbNm2dixY+3nn38uuL1u3bqFZuzTLIOasa8qbN261c1o179/f2vRokWJywAAAAAAQGwjUyoGNGvWrOCSnp7uAlTe9cWLF1u9evXszTfftB49elhycrLNmzfPzjvvPBs4cGCh+9GwuSOPPLLgus/nszvuuMPatGljtWvXtq5du9qLL76427bk5uba1VdfbRkZGVanTh3r1auXy1oS/a+2yNFHH+3aWdIyUTsPP/xw99gtW7a0yy+/3LZt21bosa677jp3m56Xhv9NmzbNfv/9dzvqqKPcOg0aNHD3qecLAAAAAACqD4JSNcTo0aPtzjvvtJ9++sn233//kLZRQOrJJ5+0hx9+2BYtWmQjR460c845xz788MMSt7n00ktt/vz59txzz9l3331nZ5xxhh133HH266+/2iGHHFKQwfXSSy/Z6tWrS1y2dOlSt91pp53m7kcZYApS6f49Q4YMsWeffdbuu+8+97weeeQRlxWmIJXuS3Tfus977723gq8gAAAAAAAIJ4bvhWjtxyts7bwVpa6XmlHP2g/pWmjZkie/te0rs0vdtulhrazp4a2sMkyYMMH69esX8vrKQrr99tvt3Xfftd69e7tlbdu2dYEhBX/69OlTZJsVK1bY448/7v73huEpa2rOnDluue6vSZMmbnnDhg1dJpcUt0wBscGDBxcUPe/QoYMLPulxH3roIfcYzz//vL3zzjvWt2/fgvZ5dF/efdevX7+crxoAAAAAAKgsBKVClJ+7y/Kycktdb1d6ctFlW3eGtK0eo7L07NmzTOsvWbLEtm/fXiSQtXPnTuvevXux23z//feuXtXee+9dJMDVqFGjMj3+t99+6zKkZs6cWageloYULlu2zD2WZuorLjgGAAAAAACqP4JSIUpITrSktKIBp2CJdWsVuyyUbfUYlUX1nYJn7FOQJ1BeXl7B3yo+LrNnz3b1oQKpflNxtI0CRQsWLHD/Bwosth4K3Zdm5FMdqWCtWrVyQTMAAAAAABC9CEqFSMPqyju0Lng4X3XQuHFj++GHHwotW7hwoSUlJbm/O3Xq5IJPGiYXajaSMqiUKaWZ9FSgvCIOOOAA+/HHH13x8uJ06dLFZU2pvpU3fC9QrVp/BgfVHgAAAAAAUP0QlKqhNNPd3Xff7QqZq2bU008/7YJU3tA8zYinelAqbq7gz2GHHWaZmZn2
ySefWFpamg0dOrTIfWrYnupAqQD5xIkT3X2tX7/e5s6d64qrn3jiiSG3T7PqHXzwwa6w+fDhw12ml4JUqiF1//33W+vWrV0bhg0b5mpNaWbA5cuXu4DYmWeeaXvttZebde/111+3E044wc3gV9ZsLQAAAAAAwm3jnEnuEsiNZPL7XGJFpkYexcW7Y9pAjY4b5S6xhNn3aqj+/fvbTTfdZNdee60deOCBlp2d7YJJgW655Ra3joqOd+zY0c2Gp+F8bdq0KfF+VdBc93PVVVfZPvvsYwMHDrQvv/zSDbkrCwWxlAX1yy+/uKwrBbjGjh1bUEBdVPD89NNPt3/+85+277772ogRI2zbtm3uNg05HD9+vJt1sGnTpoVm7QMAAAAAoKrk78iyXZtXFrrkb1ll+ZlrzLaud//repF1dmRZrInzBxcWinFZWVmWnp7usn6U8RMoJyfHFdFW0CUlJcWqO711u3btssTExCIRVFRctO0PAMpH2aDKstRsnaq3BwAoO/pSACh7ppQvL8d8Wzfp6H43a8dZfN2GFp+UElWZUruLvQRi+B4AAAAAAECEKLCUnNHZVkw60UwJJrvLFYqLM9/2LbbnqNlWt0t/izWcxgAAAAAAAIiQ/G1b7I+pp/0ZjPL7dr+ybvf73fraLtYQlAIAAAAAAIiQLZ/MMH/u9tIDUh6/z/w7t9uWT560WENQCgAAAAAAIEK1oTe9M7Vc2256574/Z+mLIQSlAAAAAAAAIiB/60bLW7e0lOLmxfD73Xb521QYPXYQlAIAAAAAAIgAX87Wim2/I9tiCUEpAAAAAACACIhPqVux7WvXs1iSWNUNiAWTPlxqkz/6rdAyjfP0qZC++S3O4iw+TjM5xhVaZ+QRbW1Un3YRbi0AAAAAAKgKCXUbWVKTdpa3/rc/Z98LVVycJTVuawl1GlosISgVBlk5u2xlZk65tgMAAAAAADWDklUa9rvM1s4cWeZtG/a7vEiyS7Rj+F4YpKUkWkZ6ijVKTbLSdg/drvW0vrZD1fn999/dB3rhwoVV3RQAAAAAQA1R/9ChFpecahYXYkgmLt7iaqVa/UOHWKwhKBUGGoL32JldbcuOXcqo2y3drvW0PkP3AAAAAACoWbZ8PN3iklLM/L7QNvD73PraLtYQlAqDLTvy7PQZX7n6UaojtTtenSmtr+2qSl5e1T12Zdu5c2dVNwEAAAAAgGLl78gy39aNZdpG62u7WENQKgxmfPWHbd+ZX2pAyqP1tP6TX/0R1nb4fD676667rH379pacnGytWrWy2267rWCY2qxZs6xPnz6WkpJiM2fOdNs89thj1rFjR7ds3333tQcffDDkx/v000+tW7dubtuePXvaq6++WmQ43A8//GDHH3+81a1b15o2bWrnnnuubdiwoeD2I4880i6//HK79tprrWHDhtasWTO7+eabCz3Oli1bbPjw4da4cWNLS0uzo48+2r799tuC27W+2qHn0qZNG9cemTNnjh122GFWv359a9SokZ100km2dOnSCr3GAAAAAABURELtNEtskOEuCenNLb52ull8QuGV4hPc8oT6zf9at3aaxRqKGlWQZtm7f96ycm07dd4yu+ywNmErVDZmzBh79NFHbfLkyS4Ys3r1alu8eHHB7aNHj7aJEyda9+7dCwJTY8eOtfvvv98t++abb2zEiBFWp04dGzp06G4fKysry04++WQ74YQT7JlnnrHly5fblVdeWSSYpACSAkpq044dO+y6666zM8880957772C9WbMmGGjRo2yzz//3ObPn2/nnXeeHXroodavXz93+xlnnGG1a9e2N99809LT0+2RRx6xY445xn755RcXyJIlS5bYSy+9ZC+//LIlJPz5Yd62bZu73/3339+2bt3qnuspp5zigmbx8cRjAQAAAACR1+i4Ue4SHFvYlb3B1v3vd2uyZ2tLrLdHzBU1Lw5BqQrauH2nLd24vczbKalK223anmeN6tSqcDuys7Pt3nvvdQEmL6DUrl07F5xSppQoaHTqqacWbDNu3DgXpPKWKcvoxx9/dEGf0oJSCkTpA6IgmAJcnTp1spUrV7qglscLdt1+++0Fy6ZPn24tW7Z0AaW9997bLVPQSG2RDh06uO3mzp3rglLz5s2zL774wtatW+eyv+See+5xWVkvvviiXXjhhQVD9p588kmXTeU57bTTCrVZj63b9Rz322+/cr/WAAAAAACEU1xcnCXUbWQJDfPd/zUhICUEpSpoa25+hbbPzt0VlqDUTz/9ZLm5uS6DqCQaYudRFpGGsl1wwQWFAkm7du1y2Uil+fnnn10wyRsqJwcddFChdTTE7v3333dD94LpsQODUoGaN2/uglDefSjLScPvAinrKnAo3l577VUoICW//vqry45SBpaGDGp4o6xYsYKgFAAAAAAAVYygVAXVTQ4a91lG9ZLD8xZoeFtpNCzPo0CPKNOpV69ehdbzhr9VlB5DQ/z+9a9/FblNgSdPUlJSodsUEfYCSLoPrfvBBx8UuQ/ViiruuXn02ApW6Tm2aNHC3aeCURRCBwAAAACg6hGUqqBGqbWsXaNU+23jdjckL1RKxGvbKNUaphYOyJSXhr0pMKVhb6rhVBoVHVeg5rfffrPBgweX+fH22Wcfe/rpp112ljes7ssvvyy0zgEHHODqPLVu3doSE8u3q+k+1qxZ47bX/YRq48aNLptLAanDDz/cLdNQQAAAAAAAUD1Q7bmClNVz6WFtyrVtOIucaxidiohrFjvVVtLQts8++8ymTZtW4jbjx4+3O+64w+677z5X4+n777+3xx9/3CZNmlTq4/397393mUeq6aShg2+99Zar9STec7rkkkts06ZNdvbZZ7uAldqk9c4//3zLzw9t2GPfvn2td+/eNnDgQHv77bddfSzN+nfDDTfYV199VeJ2DRo0cEP+/v3vf7si6CqsrqLnAAAAAACgeiAoFQZDe7a01FoJFh9ifEnraf0hPVuGtR033XSTXXXVVa6OUseOHW3QoEEFtZmKo4yqxx57zAWiunTpYn369LEnnnjCFTwvTVpamv33v/91M9l169bNBYn0uOLVmVIm1ieffOICUMcee6x7DBVb17C7UGe/U4DrjTfesCOOOMIFs1SH6qyzznKz/SnbqyS6/+eee84WLFjghuyNHDnS7r777pAeEwAAAAAAVL44v+YdrEGysrJcIe/MzEwXWAmUk5Njy5Ytc0GZwALeoXjr53V20mNfmN/85vPvPiAVZ3E2e/hBduw+Tawi3JSRu3a5oW3VoTL/zJkzXeBIr20oNa6qu4rsDwCih7I+FcBv0qRJyAFzAEBh9KUAUHG+GOpLdxd7CRTdz7KamPThUhv+/LdWv3ailRbi0+1a74Lnv3XbRTMNE1SdJgVuXn31VTd88Mwzz4yJgBQAAAAAAKhcBKXCICtnl63MzLGN2/NKLXau27We1td21dXtt99udevWLfZy/PHHu3VUgPycc85xQwU1PO6MM85wNZwAAAAAAABKw+x7YZCWkmgZ6SlFhtZpGJ+G82m4nhu2FzTETttVVxd
ddJHLeiqOlwmlouq6AAAAAAAAlFX1jYpEkVF92rlLLGnYsKG7AAAAAAAAxNzwvY8++shOPvlkN0ubsohUl2h3Vq9ebX//+9/dDGwq+qWZ3AAAAAAAABB9qjQotW3bNuvatas98MADIa2fm5trjRs3thtvvNFtBwAAAAAAgOhUpcP3VDDbK5oditatW9u9997r/p4+fXoltgwAAAAAAACVKeZrSim7ShdPVlaW+9/n87lLIF1XgXLvEqrMr6dY1jf3FVr25/Y+/aEK5y4prUih8+6XW/oBFRuC6LWzLO1FaLz9oLh9BUDs8Pp+PucAUH70pQBQcb4Y6ktDfQ4xH5S64447bPz48UWWr1+/3nJycgoty8vLcy/crl273CVUu3IyLX/ryjK3TduV5XGCaWfNz893fwcHvFBxem+0P2zcuNGSkpKqujkAKok+55mZma5PVb1CAEDZ0ZcCQMX5Yqgvzc7ODmm9mA9KjRkzxkaNGlUoU6ply5auNlVaWlqhdRWk0guXmJjoLqFKTEm3hLoZ5t+VY76cTQoX7WbtOItPaWhxiSluu7I8TkkImITmgw8+sKOPPto2bdpk9evXL3V9vTfqCBo1amQpKSkRaSOAqvnyV2Bf3wvR/uUPAFWFvhQAKs4XQ31pqMfQMR+USk5OdpdgeoOD32Rd1w7gXUJVv8dIq9Wos639zwAXdCotKOXL3WJNj/uPpbY+1ipC0VOvnWRKlS7wtQrl9fLWK25fARBb+KwDQMXRlwJAxcXFSF8aavuj+1lWE/k5W2zd7EF/1o9SHand+rPOlNbXdlVFQxWro507d1Z1EwAAAAAAQARUaVBq69attnDhQneRZcuWub9XrFhRMPRuyJAhhbbx1te2qgulv3/88UerSlt/esr8edtDCEh5fG79rT89HfZUv7vuusvat2/vssNatWplt912m/3+++8u2jpr1izr06ePS6ObOXOm2+axxx6zjh07umX77ruvPfjggyE9lnefL7/8sh111FGWmppqXbt2tfnz5xda76WXXrLOnTu79mj2xIkTJxa6XctuueUW9z5rOOWFF15oTzzxhBte9/rrr9s+++zj7vv000+37du324wZM9w2DRo0sMsvv7ygppY89dRT1rNnT6tXr541a9bM/v73v9u6devC8toCAAAAAIDwqtLhe1999ZULaHi82k9Dhw51gYnVq1cXBKg83bt3L/h7wYIF9swzz9hee+3lgiRVQUPoshY+UK5tsxbeb2ndLgnb0DsF8R599FGbPHmyHXbYYe71W7x4ccHto0ePdkEhvYZeYGrs2LF2//33u2XffPONjRgxwurUqePeg1DccMMNds8991iHDh3c32effbYtWbLE1WPS+3PmmWfazTffbIMGDbJPP/3U/vnPf7oaTeedd17BfWh7tWPcuHHu+scff+wCUPfdd58999xzrs7XqaeeaqeccooLVr3xxhv222+/2WmnnWaHHnqou28v+0sBLgWyFIzS/qTH0foAAAAAAKB6ifMrqlKDqNB5enq6q2hfXKFzZWu1adMm5KJc+Ts22IpHWpS7Pa3+sdoSajcq17Z66zRDnAJAyhxTMTQFmIYPH15oPQXs9JymTJliV1xxRcFyZVQpiKNAkufWW291QRwFkHbHu09lWl1wwQVumTLWlBX1008/uayrwYMHu2y2t99+u2C7a6+91mbPnm2LFi1y15X1pIDYK6+8UrCOApLnn3++C261a9fOLbvoootcJtTatWutbt26btlxxx3ntn/44YdLDHoeeOCBLqilbVToXEHQzZs3h1TovDz7A4DooyxTBbKbNGkS9WP3AaCq0JcCQMX5Yqgv3V3sJVB0P8tqwLdzawW3D22axNIoEJSbm2vHHHNMietoaJtn27ZttnTpUhdQUsDGuygopeWh2n///Qv+bt68ufvfGzKnNimTKZCu//rrr4WG3QW2y6Mhe15ASpo2beoCUF5AylsWODxPmVknn3yyG7aoIXwaqijB2XYAAAAAAKDqxfzse5UtvlbdCm5fLyztqF27dqnraFieR5lVouF+vXr1KrReQkJCyI+blJRU8Lc3DFHR3bIIbFdx9+vdd3HLvMdSkK1///7uomGJyhpTMErXKZ4OAAAAAED1Q1CqguJTGllielvblblMA+rKsGWcJaa3sfiUhmFph2o6KTA1d+7cIsP3iqMsoxYtWrjaTBpmVxlUQP2TTz4ptEzX99577zIFvkKh2lkbN260O++801q2bFkwfA8AAAAAAFRPBKUqSNk6Kla+6cOry7xtWrdLw1bkXDWPrrvuOlezqVatWm6YnOo5qXZTSUP6xo8f72aw0zhP1WfS8D8FclRzySs6XxFXXXWVq+mkulUqRq6Z+VTzKtQZ/spCQ/b0vKdOnerqT/3www/ucQEAAAAAQPVETakwqNvxXItLSi3Dyxnv1q/b8ZywtuOmm25ygSDNZKcsJQWCAmsuBVNGlQqVP/7449alSxdXg0lFxlXYOxwOOOAAe/75590Mevvtt59r14QJEwrNvBcuGq6ntr/wwgvWqVMnlzGlWf0AAAAAAED1xOx7YZptbfvvb9va/wzQlHiqqrSbNeOVXmVNB75mqXv1s4oInH0vXBlX+Auz7wE1QyzNcgIAVYW+FAAqzhdDfSmz70VQ5tdTbMO7/7D45Poh1JXyu/U2vHOh2w4AAAAAAKAmIigVBr7cLMvfutJ8ORtDCkppPbd+bpZVV7fffrvVrVu32Mvxxx9f1c0DAAAAAABRjkLnYRCfnGYJdTMKLftzVKTvz+F8bmhdfJEhdtquulKx8DPPPLPY2zTLHwAAAAAAQEUQlAqD9AOudJdY0rBhQ3cBAAAAAACoDAzfAwAAAAAAQMQRlAIAAAAAAEDEEZQCAAAAAABAxFFTKgwm//ChTV70UZFC5z7zF9Q5j7e4IoXOR3Y+wkbu1yfCrQUAAAAAAKh6BKXCICsvx1ZuzyzXdgAAAAAAADURw/fCIC0pxTJS061RcqoVzoUqSrdrPa2v7cJFmVkXXnihmzFPGVkLFy4sss6RRx5pV14ZW7MEAgAAAACA6ERQKgw0BO/Rw860LTt3aJDebtfV7VpP64dz6N6cOXPsiSeesNdff91Wr15t++23n0Wz33//vcTgGgAAAAAAiH4EpcJgS+4OO+O9Ga5+lOpI7Y5XZ0rra7twWbp0qTVv3twOOeQQa9asmSUmMjITAAAAAABUXwSlwuDJJV/Z9l07Sw1IebSe1n9q6Vdhefzzzz/fLrvsMluxYoXLLmrdurVt27bNhgwZYnXr1nXBqokTJ5bpPnNzc+3qq6+2jIwMq1OnjvXq1cs++OADd1tOTo517tzZDRcMDIrVq1fPpk+f7q5v3LjRzj77bLd9amqqdenSxZ599tnCr4PPZ3fddZe1b9/ekpOTrVWrVnbbbbe529q0aeP+7969u3tOGnoIAAAAAABiB0GpMNRyuv+neeXaduqP89z2FTVlyhSbMGGC7bnnnm7o3pdffmnXXHONffjhh/af//zH3n77bRdQ+vrrr0O+z0svvdTmz5
9vzz33nH333Xd2xhln2HHHHWe//vqrpaSk2MyZM23GjBnu/vPz8+2cc86xfv362bBhwwoCVz169LDZs2fbDz/84AJY5557rn3xxRcFjzFmzBi788477aabbrIff/zRnnnmGWvatKm7zVvv3Xffdc/p5ZdfrvDrBAAAAAAAqg/GeFXQxtzttjR7Y5m3UyhK223K3W6NUupUqA3p6ekuSykhIcEN3du6datNmzbNnn76aTvmmGPcOgogKWgVCmVcPf744+7/Fi1auGXKmlLdKi2//fbbrVu3bnbrrbfa8OHD7ayzzrLly5e7elYeZUhpG48yud566y17/vnn7aCDDrLs7Gy799577f7777ehQ4e6ddq1a2eHHXaY+7tx48bu/0aNGrnnBAAAAAAAYgtBqQrampdboe2z83IrHJQKpqF0O3fudEPuPJqVb5999glp+++//95lP+29995FhvQpSOS56qqr7NVXX3WBpTfffLPQbdpewSsFoVauXOnao+01lE9++uknd90LmgEAAAAAgJqFoFQF1U1KrtD29Sq4fWVQppWyrhYsWOD+D6QaVZ5169bZL7/84tbRsD4N7/PcfffdLhNKQwtVT0p1qa688koXnJLatWtH8BkBAAAAAIDqhppSFdQoOdXa1WtkcWXcTutru4bJf2YOhZOGwSUlJdnnn39esGzz5s0ugBQKFRdXppOCTipCHngJHEqn+lEKOGlo4HXXXeeynzyffPKJDRgwwNWa6tq1q7Vt27bQ43fo0MEFpubOnVtsG2rVquX+VzsAAAAAAEDsIVOqgjQz3KUdD7NRX/ynzNte1ukwt324KZvpggsucMXONaSuSZMmdsMNN1h8fGgxSA3bGzx4sJu9T7P2KUi1fv16F0Daf//97cQTT7QHHnjAFUJXEfSWLVu6guba5rPPPnMBJQWdXnzxRfv000+tQYMGNmnSJFu7dq116tTJPYaKpSuQde2117r1Dz30UPcYixYtcm1XmxW0Uh0r1cLS+qqdBQAAAAAAYgOZUmEwpH1PS02sZfEh5ktpPa1/brueldYmDZ87/PDD7eSTT7a+ffu6AuKaDS9UKmiuoJTqRqkW1cCBA92sfq1atbLFixe7gNeDDz7oAlKivzds2OBm0pMbb7zRDjjgAOvfv78deeSRLsNK9xFI6+r+x44dax07drRBgwa57CxJTEy0++67zx555BFXbF1ZVwAAAAAAIHbE+f1+TQRXY2RlZbmMm8zMTEtLSyt0W05Oji1btszatGnjMnPK4q2VP9vJ7zxmejV9bm69kgNSSo56vd9wOzYjtMLjJdFbt2vXLhfAqYyMq5quIvsDgOjh8/lcQFwZmqFmlAIACqMvBYCK88VQX7q72Eug6H6W1cTkHz60EfOet/q1apt/NwEp0e1ab/i85912AAAAAAAANRE1pcIgKy/HVm7PDGldhaw25m4v2K4qfPzxx3b88cfvdvY9AAAAAACAykRQKgzSklIsIzW9yNA6DePTcD6NrPtz2F5cke2qQs+ePW3hwoVV8tgAAAAAAABCUCoMRu7Xx12ihWa1a9++fVU3AwAAAAAA1GDUlAIAAAAAAEDEEZQCAAAAAABAxBGUAgAAAAAAQMRRUyoMNs6Z5C7Bhc7N77OCSudx8UUKnTc6bpS7AAAAAAAA1DQEpcIgf0eW7dq8slzbAQAAAAAA1EQEpcIgoXaaJTbIMF9ejvm2blKe1G7WjrP4ug0tPinFbQcAAAAAAFATEZQKAw3BS87obCsmnfjnUD0N2StJXJz5tm+xPUfNtrpd+keymQAAAAAAANUGhc7DIH/bFvtj6ml/BqNUR2p3/r/OlNbXdghdXl5eVTcBAAAAAACECUGpMNjyyQzz524vPSDl8fvMv3O7bfnkybA8/uuvv27169e3/Px8d33hwoWuqPro0aML1hk+fLidc8457u9HH33UWrZsaampqXbKKafYpEmT3PahWLp0qQ0YMMCaNm1qdevWtQMPPNDefffdQuu0bt3abrnlFjv77LOtTp06lpGRYQ888EChddS+hx56yI4//nirXbu2tW3b1l588cWC23///Xe3zqxZs6xPnz6WkpJiM2fONJ/PZxMmTLA999zTkpOTrVu3bjZnzpyC4vJ9+/a1/v37/1lo3sw2bdrk1h07dmy5X18AAAAAABB+BKUqSMGPTe9MLde2m965ryB4UhGHH364ZWdn2zfffOOuf/jhh7bHHnvYBx98ULCOlh155JH2ySef2EUXXWRXXHGFC17169fPbrvttpAfa+vWrXbCCSfY3Llz3eMdd9xxdvLJJ9uKFSsKrXf33Xdb165d3ToKjunx3nnnnULr3HTTTXbaaafZt99+a4MHD7azzjrLfvrpp0LreNtquYJN9957r02cONHuuece++6779yyv/3tb/brr7+6INaMGTPsyy+/tPvuu89tr+eqoBhBKQAAAAAAqpc4fziiIlEkKyvL0tPTLTMz09LSChcaz8nJsWXLllmbNm1cZk4odmVvsF8ubVzu9uz9wAZLrNuoXNvqrdu1a5clJiZaz549XWbS1Vdf7bKflME0fvx427hxo3uuyhb65ZdfXCBIgSVlV3mUQaXrW7aUbzjhfvvt54I/l156aUGmVMeOHe3NN98sWEcBJ732b7zxhruuAJK2UbaU5+CDD7YDDjjAHnzwQZcppfdhypQpLijlUYDpkksuseuvv75g2UEHHeSer5eN9cILL9iQIUPsyiuvtKlTp7rAWIcOHcr8vMqzPwCIPsrAXLdunTVp0sTi4zlXAwDlQV8KABXni6G+dHexl0DR/SyrAV/O1optvyM7LO3QEDdlRilQ9fHHH9upp57qAkPz5s1zWVItWrRwgZmff/7ZBXECBV/fHQW0FPjSfWvIn4bwKYspOFOqd+/eRa4HZ0GFso6CbYE79apVq+zQQw8ttI6uB253xhlnuMDcnXfe6TKqyhOQAgAAAAAAlYvZ9yooPqVuxbavXS8s7dDQvOnTp7uhcElJSbbvvvu6ZQpUbd682QWtwkEBKQ3DU7Cnffv2rh7U6aefbjt37rTKoJpUZbV9+3ZbsGCBJSQkuGF9AAAAAACg+iFTqoIS6jaypCbtNB6tbBvGxbntEuo0DEs7vLpSkydPLghAeUEpXfS37LPPPq7mUqDg67ujmlTnnXeey0Tq0qWLNWvWzA21C/bZZ58Vua7sqrKuE0gpf8r4UhuC29SpU6eC61dddZVLddTwQdWWeu+990J+fgAAAAAAIDLIlKog1UZq2O8yWztzZJm3bdjvcrd9ODRo0MD2339/N0Pd/fff75YdccQRduaZZ1peXl5BoOqyyy5zyzXjngqUK2Cj4E2o7dBQuJdfftltq21Uo0rjXoMpUHTXXXfZwIEDXWaV6jzNnj270DpapuF5hx12mGv3F198YdOmTdvt419zzTU2btw4a9eunZt57/HHH3cF27W96DGUMTZ//nxXn0rrDx061BVF12sEAAAAAACqBzKlwqD+oUMtLjnVLC7ElzMu3uJqpVr9Q4eEtR0KPOXn5xdkRTVs2
NBlECmbSRlSXv2lhx9+2AWlNDvenDlzbOTIkSEX8tZ2Cu4ccsghLjCl2e8U/AmmbKWvvvrKunfvbrfeeqvbTusGUiH25557zgXTnnzySXv22WcLZTwV5/LLL7dRo0a5+1emltr/2muvuWDZ+vXr7YILLrCbb765oE16jKZNm7qi6gAAAAAAoPpg9r0wzba29fu3bMWkEzUlnpm/aOZQAQWu4uKs1ag3rG6XY60iAmffq0jG1YgRI2zx4sWuQHo4aPY9zXynS0nU3ldeecVlUlVXzL4H1AyxNMsJAFQV+lIAqDhfDPWlzL4XQRvnTLJV0y6w+NT6fwaldsfvd+utmjbMbVcVVKRcBdGXLFliU6dOtRkzZrghbgAAAAAAAJFCTakwyN+RZbs2rwxxbb/5tm403/9vVxVUu0n1nlQYvW3btq4Y+PDhw91tnTt3tuXLlxe73SOPPGKDBw+OcGsBAAAAAEAsIigVBgm10yyxQUahZX5vGJ/+19A61ZEKGmKn7arC888/X+Jtb7zxhiuMXhzVZgpFcbPxBatho0YBAAAAAEAQglJh0Oi4Ue4SC/baa6+qbgIAAAAAAKgBqCkFAAAAAACAiCMoVQyGlkHYDwAAAAAAqDwEpQIkJSW5/7dv317VTUE14O0H3n4BAAAAAABipKbURx99ZHfffbctWLDAVq9eba+88ooNHDhwt9t88MEHNmrUKFu0aJG1bNnSbrzxRjvvvPPC0p6EhASrX7++rVu3zl1PTU0tUpy8umXy7Nq1yxITE6t1O6ONXlcFpLQfaH/QfgEAAAAAAGIoKLVt2zbr2rWrDRs2zE499dRS11+2bJmdeOKJdtFFF9nMmTNt7ty5Nnz4cGvevLn1798/LG1q1qyZ+98LTFX34InP57P4+KIz+6HiFJDy9gcAAAAAABBDQanjjz/eXUL18MMPW5s2bWzixInueseOHW3evHk2efLksAWlFNxRkKtJkyaWl5dn1ZkCUhs3brRGjRq5wBTCR0P2yJACAAAAACBGg1JlNX/+fOvbt2+hZQpGXXnllSVuk5ub6y6erKysgoCOLrsLTtWqVcuqM7VfQ/fUToJS4be7/QNAbH3WvcxTAED50JcCQMX5YqgvDfU5RFVQas2aNda0adNCy3RdgaYdO3ZY7dq1i2xzxx132Pjx44ssX79+veXk5Fi0v8mZmZlupyUoBQDlQ18KABVHXwoAFeeLob40Ozs79oJS5TFmzBhXGN2jAJYKpDdu3NjS0tIs2ndYZXTpuUT7DgsAVYW+FAAqjr4UACrOF0N9aUpKSuwFpVR0eu3atYWW6bqCS8VlSUlycrK7BNMbHO1vsmiHjZXnAgBVhb4UACqOvhQAKi4uRvrSUNsfVc+yd+/ebsa9QO+8845bDgAAAAAAgOhRpUGprVu32sKFC91Fli1b5v5esWJFwdC7IUOGFKx/0UUX2W+//WbXXnutLV682B588EF7/vnnbeTIkVX2HAAAAAAAABBlQamvvvrKunfv7i6i2k/6e+zYse766tWrCwJU0qZNG5s9e7bLjuratatNnDjRHnvsMTcDHwAAAAAAAKJHldaUOvLII11V+ZI88cQTxW7zzTffVHLLAAAAAAAAUJmiqqYUAAAAAAAAYgNBKQAAAAAAAEQcQSkAAAAAAABEHEEpAAAAAAAARBxBKQAAAAAAAEQcQSkAAAAAAABEHEEpAAAAAAAARBxBKQAAAAAAAEQcQSkAAAAAAABEHEEpAAAAAAAARBxBKQAAAAAAAEQcQSkAAAAAAABEHEEpAAAAAAAARBxBKQAAAAAAAEQcQSkAAAAAAABEHEEpAAAAAAAARBxBKQAAAAAAAEQcQSkAAAAAAABEHEEpAAAAAAAARBxBKQAAAAAAAEQcQSkAAAAAAABEHEEpAAAAAAAARBxBKQAAAAAAAEQcQSkAAAAAAABEHEEpAAAAAAAARBxBKQAAAAAAAEQcQSkAAAAAAABEHEEpAAAAAAAARBxBKQAAAAAAAEQcQSkAAAAAAABEHEEpAAAAAAAARBxBKQAAAAAAAEQcQSkAAAAAAABEHEEpAAAAAAAARBxBKQAAAAAAAEQcQSkAAAAAAABEHEEpAAAAAAAARBxBKQAAAAAAAEQcQSkAAAAAAABEHEEpAAAAAAAARBxBKQAAAAAAAEQcQSkAAAAAAABEHEEpAAAAAAAARBxBKQAAAAAAAEQcQSkAAAAAAABEHEEpAAAAAAAARBxBKQAAAAAAAEQcQSkAAAAAAABEHEEpAAAAAAAARBxBKQAAAAAAAEQcQSkAAAAAAABEHEEpAAAAAAAARBxBKQAAAAAAAEQcQSkAAAAAAABEHEEpAAAAAAAA1Myg1AMPPGCtW7e2lJQU69Wrl33xxRclrpuXl2cTJkywdu3aufW7du1qc+bMiWh7AQAAAAAAEOVBqVmzZtmoUaNs3Lhx9vXXX7sgU//+/W3dunXFrn/jjTfaI488YlOnTrUff/zRLrroIjvllFPsm2++iXjbAQAAAAAAEKVBqUmTJtmIESPs/PPPt06dOtnDDz9sqampNn369GLXf+qpp+z666+3E044wdq2bWsXX3yx+3vixIkRbzsAAAAAAADKJ9Gq0M6dO23BggU2ZsyYgmXx8fHWt29fmz9/frHb5ObmumF7gWrXrm3z5s0rcX1dPFlZWe5/n8/nLtFM7ff7/VH/PACgKtGXAkDF0ZcCQMX5YqgvDfU5VGlQasOGDZafn29NmzYttFzXFy9eXOw2Gtqn7KojjjjC1ZWaO3euvfzyy+5+inPHHXfY+PHjiyxfv3695eTkWLS/yZmZmW6nVTAPAFB29KUAUHH0pQBQcb4Y6kuzs7Orf1CqPO6991433G/fffe1uLg4F5jS0L+ShvspC0s1qwIzpVq2bGmNGze2tLQ0i/YdVq+Bnku077AAUFXoSwGg4uhLAaDifDHUlwaPcKuWQak99tjDEhISbO3atYWW63qzZs2K3UZvzquvvuqynDZu3GgtWrSw0aNHu/pSxUlOTnaXYHqDo/1NFu2wsfJcAKCq0JcCQMXRlwJAxcXFSF8aavur9FnWqlXLevTo4YbgBUYGdb13796lRt0yMjJs165d9tJLL9mAAQMi0GIAAAAAAACEQ5UP39PQuqFDh1rPnj3toIMOsilTpti2bdvckDwZMmSICz6pNpR8/vnntnLlSuvWrZv7/+abb3aBrGuvvbaKnwkAAAAAAACiJig1aNAgV3R87NixtmbNGhdsmjNnTkHx8xUrVhRK+9KwvRtvvNF+++03q1u3rp1wwgn21FNPWf369avwWQAAAAAAAKAs4vwq616DqNB5enq6q2gfC4XO161bZ02aNIn68aYAUFXoSwGg4uhLAaDifDHUl4Yae4nuZwkAAAAAAICoRFAKAAAAAAAA0RGUWrp0qavrdPbZZ7vUMnnzzTdt0aJF4W4fAAAAAAAAYlCZg1IffvihdenSxc2C9/LLL9vWrVvd8m+//dbGjRtXGW0EAAAAAABATQ9KjR492m699VZ75513rFatWgXLjz76aPvss8/C
3T4AAAAAAADEoDIHpb7//ns75ZRTiixXdfgNGzaEq10AAAAAAACIYWUOStWvX99Wr15dZPk333xjGRkZ4WoXAAAAAAAAYliZg1JnnXWWXXfddbZmzRqLi4szn89nn3zyiV199dU2ZMiQymklAAAAAAAAanZQ6vbbb7d9993XWrZs6Yqcd+rUyY444gg75JBD3Ix8AAAAAAAAQGkSrYxU3PzRRx+1m266yX744QcXmOrevbt16NChrHcFAAAAAACAGqrMQSlPq1at3AUAAAAAAACo9KDUsGHDdnv79OnTy9wIAAAAAAAA1CxlDkpt3ry50PW8vDw3jG/Lli129NFHh7NtAAAAAAAAiFFlDkq98sorRZZpBr6LL77Y2rVrF652AQAAAAAAIIbFh+VO4uNt1KhRNnny5HDcHQAAAAAAAGJcWIJSsnTpUtu1a1e47g4AAAAAAAAxrMzD95QRFcjv99vq1att9uzZNnTo0HC2DQAAAAAAADGqzEGpb775psjQvcaNG9vEiRNLnZkPAAAAAAAAKFdQ6v333+eVAwAAAAAAQPWoKQUAAAAAAACENVOqe/fuFhcXF9Idfv311yE/OAAAAAAAAGqmkIJSAwcOrPyWAAAAAAAAoMYIKSg1bty4ym8JAAAAAAAAagxqSgEAAAAAAKD6z76Xn59vkydPtueff95WrFhhO3fuLHT7pk2bwtk+AAAAAAAAxKAyZ0qNHz/eJk2aZIMGDbLMzEwbNWqUnXrqqRYfH28333xz5bQSAAAAAAAANTsoNXPmTHv00UftqquussTERDv77LPtscces7Fjx9pnn31WOa0EAAAAAABAzQ5KrVmzxrp06eL+rlu3rsuWkpNOOslmz54d/hYCAAAAAAAg5pQ5KLXnnnva6tWr3d/t2rWzt99+2/395ZdfWnJycvhbCAAAAAAAgJhT5qDUKaecYnPnznV/X3bZZXbTTTdZhw4dbMiQITZs2LDKaCMAAAAAAABq6ux7999/v51zzjl25513FixTsfNWrVrZ/PnzXWDq5JNPrqx2AgAAAAAAoCZmSt1www3WokULGzx4sL333nsFy3v37u1m4CMgBQAAAAAAgLAHpVTg/OGHH7ZVq1ZZv379rE2bNnbLLbfYH3/8EfKDAQAAAAAAAGUKStWuXdvVjXr//fft119/tXPPPdemTZvmglPHHXecvfDCC5aXl8erCgAAAAAAgPAXOpe2bdvahAkTbNmyZfbmm29ao0aN7LzzzrOMjIzy3B0AAAAAAABqmHIFpTxxcXGWmJjo/vf7/WRKAQAAAAAAoPKCUqojpUwpZUypvpTqTD366KO2evXq8twdAAAAAAAAapjEUFfcuXOnvfzyyzZ9+nQ3+17z5s1t6NChNmzYMBecAgAAAAAAAMIelGrWrJlt377dTjrpJPvvf/9r/fv3t/j4Co3+AwAAAAAAQA0VclDqxhtvdDPuNW7cuHJbBAAAAAAAgJgXclBq1KhRldsSAAAAAAAA1BiMvwMAAAAAAEDEEZQCAAAAAABAxBGUAgAAAAAAQPUPSk2YMMHNwhdsx44d7jYAAAAAAAAg7EGp8ePH29atW4ssV6BKtwEAAAAAAABhD0r5/X6Li4srsvzbb7+1hg0blvXuAAAAAAAAUAMlhrpigwYNXDBKl7333rtQYCo/P99lT1100UWV1U4AAAAAAADUxKDUlClTXJbUsGHD3DC99PT0gttq1aplrVu3tt69e1dWOwEAAAAAAFATg1JDhw51/7dp08YOOeQQS0pKqsx2AQAAAAAAIIaFHJTy9OnTx3w+n/3yyy+2bt0693egI444IpztAwAAAAAAQAwqc1Dqs88+s7///e+2fPlyN5wvkOpMqb4UAAAAAAAAENaglIqZ9+zZ02bPnm3NmzcvdiY+AAAAAAAAIKxBqV9//dVefPFFa9++fVk3BQAAAAAAAJx4K6NevXrZkiVLyroZAAAAAAAAUP6g1GWXXWZXXXWVPfHEE7ZgwQL77rvvCl3K44EHHrDWrVtbSkqKC3p98cUXu11/ypQpts8++1jt2rWtZcuWNnLkSMvJySnXYwMAAAAAACAKhu+ddtpp7v9hw4YVLFNdKRU9L0+h81mzZtmoUaPs4YcfdgEpBZz69+9vP//8szVp0qTI+s8884yNHj3apk+fbocccoibBfC8885zjz1p0qSyPh0AAAAAAABEQ1Bq2bJlYW2AAkkjRoyw888/311XcEpF1BV0UvAp2KeffmqHHnqomwFQlGF19tln2+effx7WdgEAAAAAAKAaBaX22muvsD34zp073RDAMWPGFCyLj4+3vn372vz584vdRtlRTz/9tBvid9BBB9lvv/1mb7zxhp177rnFrp+bm+sunqysLPe/z+dzl2im9itDLdqfBwBUJfpSAKg4+lIAqDhfDPWloT6HMgel5KmnnnIZTcqaUvBIgSoNu2vTpo0NGDAg5PvZsGGDG+7XtGnTQst1ffHixcVuowwpbXfYYYe5N2vXrl120UUX2fXXX1/s+nfccYeNHz++yPL169dHfR0qvcmZmZnudVAwDwBQdvSlAFBx9KUAUHG+GOpLs7OzKyco9dBDD9nYsWPtyiuvtNtuu62ghlT9+vVdYKosQany+OCDD+z222+3Bx98sGAmwCuuuMJuueUWu+mmm4qsryws1awKzJRScfTGjRtbWlqaRfsOq1paei7RvsMCQFWhLwWAiqMvBYCK88VQX6qJ7ColKDV16lR79NFHbeDAgXbnnXcWLO/Zs6ddffXVZbqvPfbYwxISEmzt2rWFlut6s2bNit1GgScN1Rs+fLi73qVLF9u2bZtdeOGFdsMNNxR545KTk90lmNaL9jdZtMPGynMBgKpCXwoAFUdfCgAVFxcjfWmo7S/zs9SQve7duxdZrsCPgkNlUatWLevRo4fNnTu3UGRQ13v37l3sNtu3by/y5BTYEqW4AQAAAAAAoPorc6aU6kYtXLiwSMHzOXPmWMeOHcvcAA2tGzp0qMu0UuFyDQFUcMubjW/IkCGWkZHhakPJySef7GbsU2DMG76n7Ckt94JTAAAAAAAAiLGglIJIl1xyiSsSrswkzYL37LPPuqDRY489VuYGDBo0yBUdV52qNWvWWLdu3VyAyyt+vmLFikKZUTfeeKNLZ9P/K1eudGMtFZBSfSsAAAAAAABEhzh/Oca8zZw5026++WZbunSpu96iRQs3w90FF1xg1Z0Knaenp7uK9rFQ6HzdunXWpEmTqB9vCgBVhb4UACqOvhQAKs4XQ31pqLGXMmdKyeDBg91F9Z22bt3qXjAAAAAAAAAgVOUKSnlSU1PdBQAAAAAAAAh7UOqAAw5wM+I1aNDAFRhXTaeSfP3112VqAAAAAAAAAGqekIJSAwYMsOTkZPf3wIEDK7tNAAAAAAAAiHEhBaXGjRtX7N8AAAAAAABAeZS5nPuXX35pn3/+eZHlWvbVV1+VqxEAAAAAAACoWcoclLrkkkvsjz/+KLJ85cqV7jYAAAAAAAAg7EGpH3/80RU+D6YC6LoNAAAAAAAACHtQSgXP165
dW2T56tWrLTExpBJVAAAAAAAAqOHKHJQ69thjbcyYMZaZmVmwbMuWLXb99ddbv379wt0+AAAAAAAAxKAypzbdc889dsQRR9hee+3lhuzJwoULrWnTpvbUU09VRhsBAAAAAABQ04NSGRkZ9t1339nMmTPt22+/tdq1a9v5559vZ599tiUlJVVOKwEAAAAAABBTylUEqk6dOnbhhReGvzUAAAAAAACoEUIKSr322mt2/PHHu0wo/b07f/vb38LVNgAAAAAAANTkoNTAgQNtzZo11qRJE/d3SeLi4iw/Pz+c7QMAAAAAAEBNDUr5fL5i/wYAAAAAAADKIz6UlRo2bGgbNmxwfw8bNsyys7PL9WAAAAAAAABAyEGpnTt3WlZWlvt7xowZlpOTw6sHAAAAAACAyh2+17t3b1dLqkePHub3++3yyy+32rVrF7vu9OnTy98aAAAAAAAA1AghBaWefvppmzx5si1dutRdz8zMJFsKAAAAAAAAlRuUatq0qd15553u7zZt2thTTz1ljRo1Kv+jAgAAAAAAoEYrc6Hzo446ymrVqlXZ7QIAAAAAAEAMo9A5AAAAAAAAIo5C5wAAAAAAAKj+hc7j4uIodA4AAAAAAIAKodA5AAAAAAAAqmdQKtCyZcsK/la2VEpKSrjbBAAAAAAAgBgXUqHzQD6fz2655RbLyMiwunXr2m+//eaW33TTTTZt2rTKaCMAAAAAAABqelDq1ltvtSeeeMLuuusuq1WrVsHy/fbbzx577LFwtw8AAAAAAAAxqMxBqSeffNL+/e9/2+DBgy0hIaFgedeuXW3x4sXhbh8AAAAAAABiUJmDUitXrrT27dsXO6wvLy8vXO0CAAAAAABADCtzUKpTp0728ccfF1n+4osvWvfu3cPVLgAAAAAAAMSwMs++N3bsWBs6dKjLmFJ21Msvv2w///yzG9b3+uuvV04rAQAAAAAAULMzpQYMGGD//e9/7d1337U6deq4INVPP/3klvXr169yWgkAAAAAAICanSklhx9+uL3zzjvhbw0AAAAAAABqhHIFpWTBggUuQ0o6d+5MPSkAAAAAAABUXlBq3bp1dtZZZ9kHH3xg9evXd8u2bNliRx11lD333HPWuHHjst4lAAAAAAAAapgy15S67LLLLDs72xYtWmSbNm1ylx9++MGysrLs8ssvr5xWAgAAAAAAoGZnSs2ZM8cVOe/YsWPBsk6dOtkDDzxgxx57bLjbBwAAAAAAgBhU5kwpn89nSUlJRZZrmW4DAAAAAAAAwh6UOvroo+2KK66wVatWFSxbuXKljRw50o455piy3h0AAAAAAABqoDIHpe6//35XP6p169bWrl07d2nTpo1bNnXq1MppJQAAAAAAAGp2TamWLVva119/7epKLV682C1Tfam+fftWRvsAAAAAAAAQg8oclJK4uDjr16+fuwAAAAAAAACVNnzvvffec7PsaZhesMzMTOvcubN9/PHHZW4AAAAAAAAAap6Qg1JTpkyxESNGWFpaWpHb0tPT7R//+IdNmjQp3O0DAAAAAABATQ5Kffvtt3bccceVePuxxx5rCxYsCFe7AAAAAAAAEMNCDkqtXbvWkpKSSrw9MTHR1q9fH652AQAAAAAAIIaFHJTKyMiwH374ocTbv/vuO2vevHm42gUAAAAAAIAYFnJQ6oQTTrCbbrrJcnJyity2Y8cOGzdunJ100knhbh8AAAAAAABiUGKoK95444328ssv2957722XXnqp7bPPPm754sWL7YEHHrD8/Hy74YYbKrOtAAAAAAAAqGlBqaZNm9qnn35qF198sY0ZM8b8fr9bHhcXZ/3793eBKa0DAAAAAAAAhC0oJXvttZe98cYbtnnzZluyZIkLTHXo0MEaNGhQlrsBAAAAAABADVemoJRHQagDDzww/K0BAAAAAABAjRByofPKpKF/rVu3tpSUFOvVq5d98cUXJa575JFHuiGDwZcTTzwxom0GAAAAAABAFAelZs2aZaNGjXKz93399dfWtWtXV6Nq3bp1xa6vYuurV68uuPzwww+WkJBgZ5xxRsTbDgAAAAAAgCgNSk2aNMlGjBhh559/vnXq1MkefvhhS01NtenTpxe7fsOGDa1Zs2YFl3feecetT1AKAAAAAAAgelRpUGrnzp22YMEC69u3718Nio931+fPnx/SfUybNs3OOussq1OnTiW2FAAAAAAAAFVe6DxcNmzYYPn5+da0adNCy3V98eLFpW6v2lMavqfAVElyc3PdxZOVleX+9/l87hLN1H7NgBjtzwMAqhJ9KQBUHH0pAFScL4b60lCfQ5UGpSpKwaguXbrYQQcdVOI6d9xxh40fP77I8vXr11tOTo5F+5ucmZnpdlplmAEAyo6+FAAqjr4UACrOF0N9aXZ2dvUPSu2xxx6uSPnatWsLLdd11YvanW3bttlzzz1nEyZM2O16Y8aMcYXUAzOlWrZsaY0bN7a0tDSL9h1WMw/quUT7DgsAVYW+FAAqjr4UACrOF0N9aUpKSvUPStWqVct69Ohhc+fOtYEDBxa8Cbp+6aWX7nbbF154wQ3LO+ecc3a7XnJysrsE0xsc7W+yaIeNlecCAFWFvhQAKo6+FAAqLi5G+tJQ21/lw/eUxTR06FDr2bOnG4Y3ZcoUlwWl2fhkyJAhlpGR4YbhBQ/dUyCrUaNGVdRyAAAAAAAAlFeVB6UGDRrk6juNHTvW1qxZY926dbM5c+YUFD9fsWJFkQjbzz//bPPmzbO33367iloNAAAAAACAiojzq4JWDaKaUunp6a54WCzUlFq3bp01adIk6lP7AKCq0JcCQMXRlwJAxfliqC8NNfYS3c8SAAAAAAAAUYmgFAAAAAAAACKOoBQAAAAAAAAijqAUAAAAAAAAIo6gFAAAAAAAACKOoBQAAAAAAAAijqAUAAAAAAAAIo6gFAAAAAAAACKOoBQAAAAAAAAijqAUAAAAAAAAIo6gFAAAAAAAACKOoBQAAAAAAAAijqAUAAAAAAAAIo6gFAAAAAAAACKOoBQAAAAAAAAijqAUAAAAAAAAIo6gFAAAAAAAACKOoBQAAAAAAAAijqAUAAAAAAAAIo6gFAAAAAAAACKOoBQAAAAAAAAijqAUAAAAAAAAIo6gFAAAAAAAACKOoBQAAAAAAAAijqAUAAAAAAAAIo6gFAAAAAAAACKOoBQAAAAAAAAijqAUAAAAAAAAIo6gFAAAAAAAACKOoBQAAAAAAAAijqAUAAAAAAAAIo6gFAAAAAAAACKOoBQAAAAAAAAijqAUAAAAAAAAIo6gFAAAAAAAACKOoBQAAAAAAAAijqAUAAAAAAAAIo6gFAAAAAAAACKOoBQAAAAAAAAijqAUAAAAAAAAIo6gFAAAAAAAACKOoBQAAAAAAAAijqAUAAAAAAAAIo6gFAAAAAAAACKOoBQAAAAAAAAijqAUAAAAAAAAIo6gFAAAAAAAACKOoBQAAAAAAAAijqAUAAAAAAAAIo6gFAAAAAAAACKOoBQAAAAAAAAijqAUAAAAAAAAIo6gFAAAAAAAACKOoBQAAAAAAAAijq
AUAAAAAAAAamZQ6oEHHrDWrVtbSkqK9erVy7744ovdrr9lyxa75JJLrHnz5pacnGx77723vfHGGxFrLwAAAAAAACom0arYrFmzbNSoUfbwww+7gNSUKVOsf//+9vPPP1uTJk2KrL9z507r16+fu+3FF1+0jIwMW758udWvX79K2g8AAAAAAIAoDEpNmjTJRowYYeeff767ruDU7Nmzbfr06TZ69Ogi62v5pk2b7NNPP7WkpCS3TFlWAAAAAAAAiB5VOnxPWU8LFiywvn37/tWg+Hh3ff78+cVu89prr1nv3r3d8L2mTZvafvvtZ7fffrvl5+dHsOUAAAAAAACI2kypDRs2uGCSgkuBdH3x4sXFbvPbb7/Ze++9Z4MHD3Z1pJYsWWL//Oc/LS8vz8aNG1dk/dzcXHfxZGVluf99Pp+7RDO13+/3R/3zAICqRF8KABVHXwoAFeeLob401OdQ5cP3yvPEVE/q3//+tyUkJFiPHj1s5cqVdvfddxcblLrjjjts/PjxRZavX7/ecnJyLJrptcjMzHQ7rTLMAABlR18KABVHXwoAFeeLob40Ozu7+gel9thjDxdYWrt2baHlut6sWbNit9GMe6olpe08HTt2tDVr1rjhgLVq1Sq0/pgxY1wh9cBMqZYtW1rjxo0tLS3Non2HjYuLc88l2ndYAKgq9KUAUHH0pQBQcb4Y6ktTUlKqf1BKASRlOs2dO9cGDhxY8Cbo+qWXXlrsNoceeqg988wzbj3vTfrll19csCo4ICXJycnuEkzbRvubLNphY+W5AEBVoS8FgIqjLwWAiouLkb401PZX+bNUFtOjjz5qM2bMsJ9++skuvvhi27ZtW8FsfEOGDHHZTh7drtn3rrjiCheM0kx9KnSuwucAAAAAAACIDlVeU2rQoEGuvtPYsWPdELxu3brZnDlzCoqfr1ixolCETUPv3nrrLRs5cqTtv//+lpGR4QJU1113XRU+CwAAAAAAAJRFnF8VtGoQ1ZRKT093xcNioabUunXrXOH3aE/tA4CqQl8KABVHXwoAFeeLob401NhLdD9LAAAAAAAARCWCUgAAAAAAAIg4glIAAAAAAACIOIJSAAAAAAAAiDiCUgAAAAAAAIg4glIAAAAAAACIOIJSAAAAAAAAiDiCUgAAAAAAAIg4glIAAAAAAACIOIJSAAAAAAAAiDiCUgAAAAAAAIg4glIAAAAAAACIOIJSAAAAAAAAiDiCUgAAAAAAAIi4xMg/JAAAAIBoNPmHD23yoo+KvS0/P98SEhKKvW1k5yNs5H59Krl1AIBoQ1AKAAAAQEiy8nJs5fbMcm0HAEAwglIAAAAAQpKWlGIZqemFlvnNb6u2Z7m/W9ROs7i4uGK3AwAgGEEpAAAAACHRELzgYXjb8nIt7ekb3N8/nnKt1UsmAAUACA2FzgEAAACUm9/vL/h7Y862QtcBANgdMqUAAAAAlNmW3B325JKv7L4fPy5Y1u7lO6xdvUZ2acfDbEj7nlY/uXaVthEAUL0RlAIAAABQJm+t/NnOeG+Gbd+1s8htv2VvtFFf/Mdu/PpNe+HoodY/Y58qaSMAoPpj+B4AAACAMgWkTn7nMduxK880UC94sJ63TLdrPa0PAEBxCEoBAAAACHnInjKkVDbKVyQcVZhu13paX9sBABCMoBQAAACAkKiGlIbslRaQ8mg9rf/U0q8qvW0AgOhDUAoAAABAqTSr3v0/zSvXtlN/nMesfACAIghKAQAAACjVxtzttjR7Y4g5Un/R+tpuU+72SmoZACBaEZQCAAAAUKqtebkV2j67gtsDAGIPQSkAAAAApaqblFyh7etVcHsAQOwhKAUAAACgVI2SU61dvUYWV8bttL62a5icWkktAwBEK4JSAAAAAEoVFxdnl3Y8rFzbXtbpMLc9AACBCEoBAAAACMmQ9j0tNbGWxYeYL6X1tP657XpWetsAANGHoBQAAACAkNRPrm0vHD3UlPRUWmBKt2u9F48e6rYDACAYQSkAAAAAIeufsY/9t99wq52Y5MJSwaEpb5luf73fcDs2Y58qaikAoLpLrOoGAAAAAIi+wNSKM2+yp5Z+Zfcu+tiWbd1UcFvbeo1cDSkN9UuvRYYUAKBkBKUAAAAAlJmG5F3W6XA7v/2Blj7zRrfst9Out1b1GlDUHAAQEobvAQAAACi3wABUw+RUAlIAgJCRKQUAAAAgJJN/+NAmL/qo0DK/+Qv+7vTKXcUGpUZ2PsJG7tcnIm0EAEQPglIAAAAAQpKVl2Mrt2eWePuqHVklbgcAQDCCUgAAAABCkpaUYhmp6cXelp+fbwkJCSVuBwBAMIJSAAAAAEKiIXjFDcPz+Xy2bt06a9KkicXHU7YWABAavjEAAAAAAAAQcQSlAAAAAAAAEHEEpQAAAAAAABBx1JSKApM+XGqTP/qt2Nt8+fkWX0JByZFHtLVRfdpVcusAAAAAAADKjqBUFMjK2WUrM3c3jW5eidsBAAAAAABURwSlokBaSqJlpBeeRtfv99uqrFz3d4u0ZIuLiyt2OwAAAAAAgOqIqEUU0BC84GF423J3Wb0b3nR//3TtkVYvpVYVtQ4AAAAAAKDsKHQepZQp5dm4bWeh6wAAAAAAANUdmVJRZsuOPJvx1R9238fLCpa1veN9a9co1S49rI0N7dnS6tdOqtI2AgAAAAAAlIagVBR56+d1dvqMr2z7zvwit/22cbuN+s8iu/HNxfbi0J7Wf58mVdJGAAAAAACAUDB8L4oCUic99oXtyMs3DdQLHqznLdPtWk/rAwAAAAAAVFcEpaJkyJ4ypPzmN18ppaN0u9bT+toOAAAAAACgOiIoFQVUQ0pD9koLSHm0ntZ/8qs/KrtpAAAAAAAA5UJQqprTrHr3z/urqHlZTJ23jFn5AAAAAABAtURQqprbuH2nLd24vUgNqdJofW23aTtD+AAAAAAAQPVTLYJSDzzwgLVu3dpSUlKsV69e9sUXX5S47hNPPGFxcXGFLtouVm3NLTrTXllk5+4KW1sAAAAAAABiJig1a9YsGzVqlI0bN86+/vpr69q1q/Xv39/WrSt59ri0tDRbvXp1wWX58uUWq+omJ1Ro+3rJiWFrCwAAAAAAQMwEpSZNmmQjRoyw888/3zp16mQPP/ywpaam2vTp00vcRtlRzZo1K7g0bdrUYlWj1FrWrlGqxZVxO62v7RqmJlVSywAAAAAAAKI0KLVz505bsGCB9e3b968Gxce76/Pnzy9xu61bt9pee+1lLVu2tAEDBtiiRYssVikAd+lhbcq17WWHtXHbAwAAAAAAVDdVOrZrw4YNlp+fXyTTSdcXL15c7Db77LOPy6Laf//9LTMz0+655x475JBDXGBqzz33LLJ+bm6uu3iysrLc/99P/NTqptTZbftSM+pZu3P2L7Rs6dPf2faV2aU+t6aHtrImh7UsuJ6fu8t+nPK5haLtuV2sTou0gusD6ta2680s5/8LmJcmPs6sdlKCnXNAhv0x+xfb9N3aUrdJ37eRtRqwb6Flix/80vKyd5a67Z7HtbcGXf96D3M2bLNfpy0MoaVm+17c0
5LSkguub/hyla1+r/TZBlP2SLUOF3QvtOz35xdZ9rItpW67R88W1vyYwoG+7//1SUjtbX1mZ6vXpn7BdT2eHjcUXa47tND11XOX2YavVpW6nR5Pjxvo12nfWM6G7aVu2/zoNrbHgS0Krudl5drih74Kqb0dLuhmKXv89RnZ/O1a+9+cJaVul1Svlu37zwMLLVvxn8WWuXhjqds23L+pZRzfvtCyRVM+M18ItdW0/2o/9mxblWW/PfW9haLTlb0sIWCo67p5f9jaT1aUul116SP02uo1Lk18coJ1vvLgQstWvrmkxvcRfvPbjrp5tmbrLxZncfQR9BFF0EfU7D4iGH1E8X1EcF9KH0EfEYg+gj4iGH3EilL70joZaVHdR/h8vpAeL+oKDvXu3dtdPApIdezY0R555BG75ZZbiqx/xx132Pjx44ss3xq/3fyK3uxG3i4rUtsqc1e25cSX/gFN2r7ZbN1fHaBvZ75lh7CdbNi00bYlKgT1p125mXZLeqJdm/ln0XJ/KalvelaP/q2d7czebJtyM0N6XF9ukqUEPdcs/zbbFV/67H0bt26yvHV/vZa5W3aE/FzXbVxvSTm1Cq5v2bo5pG1zfb4i782WvCzbHsK28TlbLCFo25Dfmy0bbMe6v748t23JCv25Bj3mppwtIW2bnxdfdD/0bbWd8X/tIyVJ3rrZfOv++pjnbdsZcnvXb9pgyb5tBdeztm4KadtEf16R9m7OzbJtoTxubqYlBe+Hts388aV3aBuzNlruur++UHZs2hb6c12/3uJr/VW/bdP20PbD6tJHbM0KbV+Ks6L7En3En1/+efE+91x0IEUfQR8RjD6iZvcRwegjin/M4L6UPoI+IhB9BH1EMPqI0vvSXbviorqPyM4uPXhW5UGpPfbYwxISEmzt2sLRdV1XrahQJCUlWffu3W3JkuKjqmPGjHGF1AMzpTTsr64v1V12JzWxnjVp0qTQsuzENZYUQsCvYWqDQtsqMrm+lMfz7NGwkdVp8ldkMnlTgh2TuN6m1suzq7O3uYyp4OCU103XrpVgLwzpYcfu3dhdz0vOMvOV3qGkJ6cVea6b4pZbnq/0sxeN6ja0BgHb5sRvsy2+/1komjRqXOjsRfzyXZbrK/0MREp8apH2bk9abwkhRGMbptQvsu3aUN+b+ntYvSYBZy+21bKtvtLPDknwY+anbDOfr/QPd72kou9NZvxKy/GVPvq2Yd0GtkfAtjp7sckX2sQAjRvuUejsRdJqv+3wbSp1u6S4WkXam5O8yeJ9pf/oaJicXmTb9VbHfL7Sz140Smtk6U0Czl7syrIsX+lnh6Rx48aFzl5Yaq7l+UrvRKtTH7HNt77U7eItoUh76SP+/4yUL89q+5LcgRR9BH1EMPqImt1HBKOPKL6PCO5L6SPoIwLRR9BHBKOPyC61L60T5X1ESkpKSI8X5/f7QxkRVml69eplBx10kE2dOrUgxatVq1Z26aWX2ujRo0vdXsP/OnfubCeccIIrml4aBaXS09Pd0D/N4hdttuzIsye/+sPu/XiZLdv0VyejouaqITW0Z0tLr01xcwAIlb53dFZHX6KqawgAKDv6UgCoOF8M9aWhxl6qfPiespiGDh1qPXv2dMGpKVOm2LZt29xsfDJkyBDLyMhww/BkwoQJdvDBB1v79u1ty5Ytdvfdd9vy5ctt+PDhVhPUr51klx/e1oYd2NLSbpzjli27/ihr1aAORc0BYDcyv55imV/fW/QGv1m+L9/+F5/wV9ppgPQDrrD0A66MSBsBAACAmqTKg1KDBg1yYyzHjh1ra9assW7dutmcOXMKip+vWLGiUIRw8+bNNmLECLdugwYNrEePHvbpp59ap06dLFZN+nCpTf7ot0LLAhPcDr3/02IDUiOPaGuj+rSLSBsBoLrz5WZZ/taVJd6ev5vtAAAAAMRgUEo0VE+X4nzwwQeFrk+ePNldapKsnF22MrPksdqrsnJL3A4A8Kf45DRLqJtReKHfb/nb/qwHkFCnuVlcfLHbAQAAAIjRoBR2Ly0l0TLSiy8S5svPt/iEhBK3AwD8SUPwgofh+fK22fIHGri/Wwz5wRKT61VR6wAAAICah6hFFNAQvOKG4cVSETQAAAAAAFCzEMkAAAAAAABAxBGUAgAAAAAAQMQRlAIA1FiBM5nm79hY6DoAAACAykVQCgBQ4+TnbLHMb6baqqd7FCxb9cTe9r8nOrrluh0AAABA5SIoBQCoUbb//rb9Ma2NbfrwatuV9Xuh23ZlLnPLdbvWAwAAAFB5CEoBAGoMBZrW/meA+fN2aPDe/18C/blMt2s9AlMAAABA5SEoBQCoETQkb93sQSokZWa+Utb2ufW0PkP5AAAAgMpBUAoAUCNs/ekp8+dtDyEg5fG59bf+9HQltwwAAAComQhKAQBinmbVy1r4QLm2zVp4P7PyAQAAAJWAoBQAIOb5cjbarszfiqkhVRq/286Xs6mSWgYAAADUXASlAAAxz7dzawW3zw5bWwAAAAD8iaAUACDmxdeqW8Ht64WtLQAAAAD+RFAKABDz4lMaWWJ6WzOLK+OWcW67+JSGldQyAAAAoOYiKAUAiHlxcXGW1u2Scm2b1u1Stz0AAACA8CIoBQCoEep2PNfiklLL8NUX79av2/GcSm4ZAAAAUDMRlAIA1AgJKfWtyYmzlDYVwtdfvFuvyUnPu+0AAAAAhB9BKQBAjZHa+lhrOuA/FpdU+//rSwUPy/tzmW5vOvA1S92rXxW1FAAAAIh9iVXdAAAAIh2YannBMtv609OW9c1U25W1rOC2xPQ2roZUvU7nWnxyepW2EwAAAIh1BKUAADWOhuSld7/U6nY+z1Y8+OfMei3O/9VqpbWiqDkAAAAQIQzfAwDUWIEBqISUhgSkAAAAgAgiKAUAAAAAAICIIygFAAAAAACAiKOmFACgRsj8eoplfn1v4YV+f8Gfq57czyyu6Lma9AOusPQDroxEEwEAAIAahaAUAKBG8OVmWf7WlSXenr9tdYnbAQAAAAg/glIAgBohPjnNEupmFL3Bb5bvy7eE+ASzuOK3AwAAABB+BKUAADWChuAVNwzP5/PZunXrrEmTJhYfT6lFAAAAIFL49Q0AAAAAAICIIygFAAAAAACAiCMoBQAAAAAAgIgjKAUAAAAAAICIIygFAAAAAACAiCMoBQAAAAAAgIgjKAUAAAAAAICIIygFAAAAAACAiCMoBQAAAAAAgIgjKAUAAAAAAICIIygFAAAAAACAiCMoBQAAAAAAgIgjKAUAAAAAAICIIygFAAAAAACAiCMoBQAAAAAAgIgjKAUAAAAAAICIIygFAAAAAACAiCMoBQAAAAAAgIgjKAUAAAAAAICIS7Qaxu/3u/+zsrIs2vl8PsvOzraUlBSLjye+CADlQV8KABVHXwoAFeeLob7Ui7l4MZiS1LiglN5gadmyZVU3BQAAAAAAIKZjMOnp6SXeHucvLWwVg5HHVatWWb169SwuLs6iPfKo4Noff/xhaWlpVd0cAIhK9KUAUHH0pQBQcVkx1Jcq1KSAVIsWLXab9VXjMqX0
Yuy5554WS7SzRvsOCwBVjb4UACqOvhQAKi4tRvrS3WVIeaJ7kCIAAAAAAACiEkEpAAAAAAAARBxBqSiWnJxs48aNc/8DAMqHvhQAKo6+FAAqLrkG9qU1rtA5AAAAAAAAqh6ZUgAAAAAAAIg4glIAAAAAAACIOIJSAAAAAHbL5/NVdRMAADGIoFQUuOeee+zkk0+u6mYAQFSbPn16QV9KOUUAKJv4+L8OG+hDAQDhQlAqCuyxxx72/vvv2/r166u6KQAQtRISEmz27NmWm5trcXFxVd0cAIgqr732ml100UUuIEUfCgAIF4JS1TA1Ojg9ul+/frZz5077/vvvq6xdABBNdNAU3Jf27t3bateubfPmzauydgFAtMjPz3cXT506dezf//63bdiwwfWjjzzySJW2DwCi3YoVK+zee++1n3/+uUZnoRKUqia8gyelRgemR0tGRoZ16NDB3n333SpqHQBEV1+qs/jBfWnz5s2tc+fO7mx/Tf7iB4BQs0t18Xz44Yeub91zzz3dUOiFCxfa1q1bq7SNABANvN+cv/32m/3vf/8rVFrivvvuszZt2rjfsDU1C5WgVBUJPPMk3sGTvvBVQ+qjjz6yXbt2Fdx+7LHH2jvvvFNoGQDUZMUFlby+dNGiRfb444/bp59+WnBbamqqHXXUUfb2229HtJ0AEC19qLcsMzPTnnjiCTv77LPtyiuvdMt0wJSWlmZnnXWWbd682R566CGrW7duxNsNANFi27Zt7n8FmzTq6cQTT3RBfQWmdFz/0ksv2RVXXGG1atUqcjK1Jqm5z7yKeWee1qxZ4/6fNm2atW/f3s455xx79dVX7cILL7Thw4cXrH/88ce7HXnVqlVV1mYAqA68g6bgs0l5eXn2/PPP23777WdHHHGEPfjgg3bBBRfY5ZdfXtDvKij1yy+/2MaNG2vs2SgANZcCS96J0eL6QC376aef7G9/+5vdfffdrq6psky/+uoru/XWW93BkzKkVJsPAFDUkiVLXP29Vq1a2dFHH+36TvW7Xbp0sQULFliTJk3s8MMPt4kTJ7rh0CczoRlBqco6YFLks7ipc7OystwXu87UJyUluTNQXhT1lltusT/++MON03/55ZftySeftM8++8zd3qtXLxc9/fLLLyP+fACgKvvTwMxSr8BuTk6Ovffee+7LPdCyZcts1KhR7gyU+kvVPNHlm2++cbd37NjRnemfO3duxJ8LAESafosGZkTpt6R3YlT9on5zbt++veB2/X377be7Pvbjjz+2qVOn2lVXXWU9e/Z0tx955JEuaLVy5coqeDYAUP189913du2117rJdNTfjh071pYuXepGPw0ZMsQmTJhgY8aMcZOWKWtf2VGnnHKK3XbbbXbwwQdbgwYNrKaXlSAoVQl0wJSYmOi++AMPpvRFrx1WWU+PPfaYy4h65pln3G2nn366C1DpQErFzrTjyptvvum2S09Pt+7du9tbb71VZc8LACpbcDBf/WlgZqmuayizgktDhw610047zV588UV3IkCBfvWlw4YNc1lTs2bNclmo+luBfmnatKk7uFL/W9N/AACIPcF9mn6LBmZE6eTnZZdd5vpCTaSjTNJTTz3Vna2XLVu2uN+eV199tTVs2NAt029aT48ePSwlJaXgpCl9KICaSIF7ZZTqWH78+PH27bffWv369V3/+cEHH9ill15qZ555pl1yySVuggjVhvbKR2jYs0ZHqSafRkLddtttbnlNzuAnKBWmGUkCLV++3G644QYXRFItKAWglOasyGjXrl3dAZKG6mlMqVKi9YXeokULmzFjhltfw0+U0qcI6n//+1/Lzs5299u/f39Xa2rHjh0RfrYAUHkC+9LA8fSakUTD7DQET1/gffr0cf2p+krVMlFfe8ABB7hhejpLJe3atXMBJ5150pe8Avo64NJZKc1iqoOpvn37uvp9Nf0HAIDYE9yn6cz8P//5T/vkk0/c9U2bNrkA1LPPPmtr1651AXst+9e//uX6SPWZyurXTHvFnSioV6+e619ff/31Yh8PAGKNfou+8MIL9sUXX7jjeNHvSSWTKLh04IEHusSRQw891BYvXuwC+occckjB9iop0bZtW5d96lGmv4qbK5v/scces/POO6/YUVY1BUGpch44Bc9IIgou6XLTTTe54rr6EaDAkq57EVAdQCUnJ7sglOjsvr7Qf/jhBzfe9IwzznBZADpDNXDgQJf6p7NacsIJJ7haKBqnCgDRyOsnA3l9qQLuM2fOtOuvv97+85//uIwmfUlrpicNMdGXus7q64fAcccd5wJYGlaiHwg6KyXqH6+55hrX9yqIrxlNlE2l2U40tE90P6tXr3b9KQBEc12oYAo+6cDJo2HOCkDts88+7nrr1q3dkBLVOVH26ddff+2CUDqLrz5RwSjV5fPO6Kuf9fpsb6a9v//97+6kqbL/NTRFQ6YBIJao3I6G4z3wwAOuFtTIkSNd36fRTF7/q5OeCtQH1oTaf//97ffff3e/ab2+UwEpHf8rE9XbVkOj//GPf9gxxxxjc+bMcX23yvX8+OOPVhMRlCqBdqLgaKV34KSsJ0VLlZJ33XXXuYMdUXBJZ+91UKSdeMSIES51T0XLH330UfeF36FDB3egFfiDwfvS1wGThp4oo0pnrTTribKkFElVW7ST77vvvgUp1gAQLQIL6wafWX/qqafcl7yC9/qS1kFRo0aN3Bkn1YVSUKlbt24uwN+7d2935t+jPrFx48YFNaNEwXwFs5RGreHPun9lAHgz8enMlH4g/PzzzxF7/gAQLoF1oXTwo2Ek4tUyOemkk+y1115zyxSY14GPCpbrdmVCqY/Vb1P9HlX9KPWj6qO9PlJn/nWCQCcHvDp+n3/+uT399NPudh2YaTv1zzpJMGjQoCp7LQAgHHSyMnCW+1deecUFmxSA19/6bakToYoB3HXXXW4djWwKrMknhx12mLsfnRAI/L2r43mdHFDf/f7777vjef3OlV69ernsU8ULOnXqVCOHRf81SByOgj/e+PvgAyftLNrBNIxEB0AaeqfIps7gK4CkgJMCVBp7r/VUG0pDSnQwpKF6Gr+vAruKtipAFThOXzugfjAoiKWaU/qiV40pBcA0La/apakia2r0FEB08w6g9KWss/Ma/qHAk6iP1NARHSApgKQvbZ1h0lA89YUKyIsyTLWdUqT1ha/+U/fhrafZSfW31lOmqoL8ejx92etHgxeEatasGRmnAKoNZSspKKQhIF4QaHfUfym7XoEnBd8VtFedKM0uqkkclG2vjFH9NtXvRw1ZFu++VWtPw0g0dE+Bfp0AVXaUNwxaQSz93tTQaW2rx1O/rdoo6q/1e1SPpwsARBOvH1RQaPPmzTZ58mR7/PHH3e9Hlcq5+eabXfaT6pbecccd7nhfvyPl4osvdn3jG2+84fo//SZVYF71pDp37uz6W10/66yzXEaqJjJTNqmyVcULQun3qE6eqkafZ5//z2atqcOia3SmlH4AFFcQ0kvZU7Qy8Oy7AkT6gle0VDPlqWiZ1tGXuZaJAktKv9MOrR1N9aFU90R
nlxSM0kGUAlA64NKPENEXvKgOyt57723PPfectWzZ0gYMGOACXkqPDiwyWVLKNgBUp+zSQOrfMjIyXGFd1dzTmXzvrLwC8Qou6Qvd+1KuXbu2+xGgHwyqeyLqB1XgXENIAmfd0wGZ+lEV3tUXuWpOabpd/ZhQX6sAv04U3HnnnYXaRF8KoKrpoEWZSfo9WdrBiNfHajiJ+kWdLNXQZQWm1N9pyLIoi0nXFaDSevr96f3GVYBev1MPOuggF5AS3YeG8KmUhGqnKJNKv3HVl6rf1kGV+mtltCogFcps0wBQHamPXbRokUsmUbkI9WEarqxgvo7zvWU6JldwSic6AzOo1Heq31YgSkPydELVm4jM67/VByshRTVPNWRafaeyU70TBBdddJFbRwGsQP4amCFVIzOlvKnF9aUcmPocaOHChS5yqalutTPef//9rv6IhpTo/7322st9QesgyNsxdZCkIJYOihRM0lCSJ5980gWfAgv2KjilND8ddKkGitIBVWPKa4eirToIC/5BojYHtrW4dgNAVfKyS5XZqf5OASYvyK8MptGjR9vw4cPd2XsNN9H/OuOkwLu+8NW3avY8HRgpo1T0g0GBfmWk6oyVqP9UwEkHUd6ZKw090fAVr0aUxuerrw08eArMhPXQlwKIBPVrOjmpi35P6qSlN2GOAkAqiKtgkIrmqoZeSdlS6r+U4aR+UzM8qX9U36dsUp0E1QlS/VZV36dZodTH6XaVk1B2lH6fqkSE+lAFrNQeHVTp96hqTOkkgH6r6iSBHksHVbqUNts0AFRHOiGpflEnKBX8P/LII91y/UZVX6uMJ41eUoBevxvV9yoLVYkhCiDpd6aCT/pt62X3K9tex/FeUomG+Klf1wlTjaYS/a7VCACdFFXgSb9Tg3kxicC+Pq4GZkjVyEwp78tTO4CynvTlr6ioN7udop4q1qidUONKdSZJwSVVxdcBkL7MFajSF/i6desK7ldnoHTQpQMi7Zg6uFIxXn0Q9CFQOrRqpShTQAdFqmWiYXpKsw4+MFIbtY4isl60lAMnANWZauSpz9SXuNKc1Q9qmIeXiaQ+UHXydLZdX9gaJqKsJQ0H8bJMNWxFQ0QC60XpS10zlioo5VH/qqCVN/OT6CSB6vgp8OXRQVnwWfzAgBQARIqy65XdpFoj6sO833Ze4FyzNavv++mnn0o9W64DIWWK6iy7ZnLWb1NlWel3pRe8lxdffNEdPGlItIapqH/2Zn7SWXtNFqE+U797deCl9ilQpZOlxZ3Qrcln8AFEFx2nawidhiLrt6b6MA3NUyaUYgCikwPqj1Vex6M+Wgkm3u9O1TRVvxg4a55GOil5RckkokCXgvn6HRxIfaZ+n3oBqeB+VN8BNTkIFaxG/ULX2SLvS1w7prKgJk2aVFBnRDuwvrj1Ja1o58MPP+zGg+qgxps+XDuw1lOxM48OprSj6cBMB0tK1VPw68orr3Q7onZafSB0tkk7n35AKGils1nF0YGTtmdHBVDdaWiI+jidifcCS+PHj3df/Jri1vsC9zJEPQrOK+XZCy4de+yxLoPKmyFPGjRo4PpcBfQ9OsOvLCv134Ff7AqG7e5EBABUBZ1hV2aTgkIqjqvsJtFwD50E1XLVI9UJUi8otbs+S0Ob1TfqRKiC+99//70biqK/dQDknQyYNWuWCzApc1R/qw/WAZb6XB2I3X333W7qcs3wrJMICmCpzwwOPmkZB08Aool+byr7UwElFSZXzSgF4/Xb1CufowCU+kslqXjUD+qioJOof1YQS/WjVHtKQ/sUH1AZCvXFon5WcQNl9wfy+szAJBP60ZLVqF/qKgg5ffp0dzZp4sSJ7gBIvLommllEY+11Fl9f2MqQ0gGQoqWaBlKU3qeMqsCC4zrDpR1NmVWigyhto4Mm1YfSfWoGE304AndGxuADqE705RzYL4VyZlxnmFQnSv2nAu/KbtLZ+nPPPdfVI/HSpDV8RWeSPBqyotRpb4px9Y96vMC+VcP5lE6tgJf6XU+fPn1cBhUAVHea7U5BeGXPq1/U2XtljuogR7VGVExcB0UKLKn/Cwzel9TnKiilkwCazEH3IboPzZin37aqh6pAmIaOiM786zetMqxUz080pESZq8HZ+Rw0AajOCSb6PVnab1QlgKhkhGpCaeIx1S7VcDoloqhUjyhgr5MG6ps9uk/FBVSQXH2jAlStWrVyySwqTaFtr7vuOld+IpCGRZfUHvrU0CRG64GT3mDvTFJpM5V4dUQUJFLASBlMoiKOOjOlsfTKXNKXtg6CNA5f0VBFQL3H0DA8LxqqHVzFID0KXClYpR8G+jGh+1CbAjOhvDNXgUPxOHsPoKoLkxdXr06BdGWN6os4FMp4UsBJw5s9p5xyiqtXorPwCiypX9RMT/piF2WPKgPqxhtvdP2jDtx0oKVMKwWgFLTyglW6FNd+vugBVHf6famZl9S/qo/UcBIVIddvzAkTJrhAvn53atizglLqS5XRX1IfpxmiFNBSWQhl7uvAS/2mhutpGx1M6bE0A6kOujzqg/U7N5hXZxUAqiOV0NFoJGUiaSKxE044wfWh+u1YUk07DY1W8GrYsGGuXp+G2Cm7SX2n+lkF4tu0aeOO3ZWoot+fGtKs2nwK2CuLyusX1aeqX9ZvWMUASsJv0oqp9t9C3lj2QPqy9WYQUdSytJ1A6+qLO/ALWQdjGmp35plnuqnEFSlVhpTOZukskw6IvJ1RadX60tdYUS1X9FX3p53cozRopT/rMUpK2aM2FICq5vWn3pAMj/qqZ555xmU1aTKHwYMH29ixYwv6ud2dkdLQOw27C6y1p6C8vvQ1I17Dhg3dmSXdv+5XZ+x1lkmTQSh45bVDw1nuu+++goCUR/11cGYpX/4AoiUopSEj6lcVKFI9UZ1t10lSBaS8PvmMM85wB1Gh1JXS702Vo9DvU/WhyubXsED1szrIUuFzDaMOnNmJPhNAdeb1eRo655VyUCaphtApy/Oll15yw5W9jPrSgulKPlFNU5Xq0QkABZr0e9ML/ouWKWNUfbL6Um2jIXqqF+397hw0aJAb8ucN8ystmxUxlinlzThX3IGTzjjpDJGKiyuQpC/4K664wlXFL+nMkiKm+vL3ipB5O7K2V7aUZs/TGSUdhGmnVNFHRWI1Zl9p0IrOesNHVFBSEdWS2hyIHwEAqooXzAkcx+71UTr4UWFb1ccbN26cG06iL24F4AcOHOj6PWWXajYmzSqyO8ogVR+rAJTOPKkfVmBJy3WG69RTT3UHXMoM0Nl8pVLrh4BmONEQPo8C/sXhLD6AaKWsJ/0+1G9QTeygGnyqN6oz/+L10cooVXBJdZ6UNbW7fk/rq6/2gvr169cvso7u15sVFQCqK/3+VB+p34C//vqrmxRHQ5PVJ2qiBs2OpxOdyrzXyKX333/fnfQsKUvKGyGlxBUdvysTX1RSR3EAJaIoIUVZp+qPVXJHv1EVkPKSSwKpRIV+C2uIdceOHYtdBxVXLX7pB49nDz5wUoqevny1M3oRT11X1FJf4Bo3r3S+3dEXtn4EKBIaeDZ//v
z57nG1c4qCUW+//bYbh68aURojqgMoDTPx0qm9gFTwWSwyoQBUJ4GTJmzZssV98evLV7VIVHtEX/SqzaS+TF/WBx10kOtTFVjSl7VmGlXgStvu7sBG6czqO3V/gTSsWbX8vLpRqmWi4uc6S6Uvf52JAoBYp7okyl7yao9qGLPO/OtsvXeAo4MtFdTVgY83K3Twb2UduHklLPS/Tsbq923wTKMSPNU4AFQHgSOg3nnnHZflqYkaRGUjFDhS0ohqPenkpjfLnUY76fhf/ZpOeJZUn9k7PldyiWIHCnIp0K/fnUo+UcBLEz+IJptQ0okCWF75HfWlHt2/Tqh+9NFH7oQtYjwoVdkHTt7OqZ1JO5wCTjqjr+woffkrhdqbAUo7nz4Iqjul4JSyohQh9WYkCbx/vuwBVGcaiqyhI8pG0pe6Zh/REBIdDKlu3kMPPeTG1+tHgM7g60yQvsR1oKN+T9lPypYq6WxUII3XVxarAlBe36jx+zpp4J2lEt1XcQdQABCr1AdqZjwFoVSvTwF6/X696KKLXFFd/X7V71hlCqhmqbL0vWEi3gGcfit7pSC0XP97fSgzjQKIBm+++abr57zRR8pa0pBjHe+r39MxvcrzqE/T/wq6B86OpwQRBad0P1Lc70gvSUTBf808qhMCynBSqR2vpp8mKdNvUd2/+mcFwBR/8PpSj/pUrafs/lB+C6P84mvCgZN3gKRaKfrS1w6vYmYa9qcdWwEojVGVwC/04AMnglAAooUOZHRGSDONKLtUxRsV3NfMovoBoP4vsDC5vrTVvypIr8kfVDhXs4aqBorG25dGM0qpaHlgP6yzX3q8YBxAAahpdGZew5o19EQ0rbhOlKq2qX7PKtA0bdo0e/DBB90Bk+h27wBLmf2qGaXfst6wZ/pQANFEmUn6TXnPPfe462vXrnU1m5RookwmZUkpi16ZSwoEqVSPfo96NFpJx+Ze1mlJo5S8YL4STVTH9K677nK/bwNLW3jrKAZx2mmnlVjEnOP/yEisLgdO2il14KQhdhpeV9qB0zXXXOMOgLSzqWB5qFRBXwdcqqPi3e/555/vCpwrWht4Rj84WgoA0UJnnZRxOmPGDDcVbmCmpw54lNKs/tcb4qETAjpjP2LEiEKz7ulASGevhg4dWuQxtL3uV/2kzl6pOCQAoCj9ptWkDy+88IKrX6LftzpIUvkIBawC6+t5w/l0wlW/WXXiVENINFJAB1CqtQIA0UaBdiWDaCY7ZYwqa0ojkxSkUiaTTqBqVntlSWnElAJYSlZRgEoldRSMUmapElqU2aQ+Vb9DA2eUDq5HLQpkeTONesF873/N0Ieql1jTDpxE6dOakUQ7tDIBtK1qUikgxTTjAGKBCpUr2K5+VdSvecUftUxD61Szz6uRp+C8+mP9SNCsTSrKqx8KGkevmU90Fl99aEmTUGjYnn487K74JADU5LpS+o2qAyuPTqpqkp3AUhP635s1+ttvv3VDozVluU7CKjAFANFM2aH6japhdLfccosbsqykkY8//tj97tSsovqdqt+nmmVUk+Xod6gCWgpIKTFFWVUqD/G3v/2tYEIHL8i0Zs0al8CimUwvvPBC13fyu7T6i6+uB06iZRrCpwMnL1CkA6emTZu6AyftcJ9//rnboZVtpcKQOnDyht15X/A6cArcGTXFucbxKz1QKX0aGqgCaN7jA0AsnJX3xuEH1iSRo48+2tU18aYeF335P/PMMy7wrzNWGovvzUhyzjnnuB8DXvao/leKtYqWn3jiiW649X333efuhy9+AChKASXVMzn33HMLLfd+qwYfWGnEwNNPP+2mMlfGAAEpALFAWaEKFikLVDMyd+rUyWU9qYi5juVVdsKjoPzcuXNdlqh+106fPt1lmOo4XxNDiLKqNKmOYgQaBqhRVxoereF4ihkgOlT50YN2MKUyi3cGvrgDp0MOOaTQgZPOGunASYXKNb5eZ58UefVmMfEOjHTgNHv2bBfAUgBMBdIVYVXkVUUmSxo/CgDRTEND1N/pzFPgUA+dgVL9En1pq1Ck17fqwEhDSBSo1xh+BZqCaXIIDbPWyQD1zRqyp3561KhRbqY9AMDueb91PZwMBVCT6Pemfpcqy0nH40o0ycnJcRn6WqYMKAWpPBo5pYkivL5StfXUj2pInwJSqpcquv6vf/3L3Xfz5s2r7PmhfOL83imaKvLdd9+5He3DDz8s9sBJB0xK21M2lHjD6/R/WQ+cVNxcB06BqdMAEKuOPfZY98V+8cUXu9RnDQXROH6djVLqs/pH76RAcQKHPiuDVdlQ6rPVl/KlDwAAgLLwjuU/+OADV/hcNZ5VN0/BepWCUIBKE+V462kU1JgxY1wWleIFypJS8EnFyWXlypXu9ygTP0S3Kg9KCQdOABB+mrFEM5iqLpSyRvVFrqEjo0ePdoUkvcxSAAAAIFI0/E6/TzVqqkOHDoVuC0xC0f+aPU+1o3Vcr1hB/fr1q6zdiOGgFAdOAFB5NHRZkzt07Nix0HImdgAAAABgNT0o5eHACQDCK7j/DJzNFAAAAKgqHOejWgWlOHACgMrDlz4AAACA6qbaBKU8HDgBAAAAAADEvmpXpp6AFAAAAAAAQOyrdkEpAAAAAAAAxD6CUgAAAAAAAIg4glIAAAAAAACIOIJSAAAAAAAAiDiCUgAAAAAAAIg4glIAAAAAAACIOIJSAAAAAAAAiDiCUgAAAAAAAIg4glIAAAAAAACIOIJSAAAAAAAAsEj7P79gLPPjZegMAAAAAElFTkSuQmCC", + "text/plain": [ + "
" + ] + }, + "metadata": {}, + "output_type": "display_data" + } + ], + "source": [ + "palette = sns.color_palette(\"colorblind\")\n", + "\n", + "ci_cre_general = plpr_tune_cre_general.confint()\n", + "ci_cre_normal = plpr_tune_cre_normal.confint()\n", + "ci_fd = plpr_tune_fd.confint()\n", + "ci_wg = plpr_tune_wg.confint()\n", + "\n", + "comparison_data = {\n", + " \"Model\": [\"cre_general\", \"cre_normal\", \"fd_exact\", \"wg_approx\"],\n", + " \"theta\": [plpr_tune_cre_general.coef[0], plpr_tune_cre_normal.coef[0], plpr_tune_fd.coef[0], plpr_tune_wg.coef[0]],\n", + " \"se\": [plpr_tune_cre_general.se[0], plpr_tune_cre_normal.se[0], plpr_tune_fd.se[0], plpr_tune_wg.se[0]],\n", + " \"ci_lower\": [ci_cre_general.iloc[0, 0], ci_cre_normal.iloc[0, 0], ci_fd.iloc[0, 0], ci_wg.iloc[0, 0]],\n", + " \"ci_upper\": [ci_cre_general.iloc[0, 1], ci_cre_normal.iloc[0, 1], ci_fd.iloc[0, 1], ci_wg.iloc[0, 1]]\n", + "}\n", + "df_comparison = pd.DataFrame(comparison_data)\n", + "\n", + "print(f\"True treatment effect: {theta}\\n\")\n", + "print(df_comparison.to_string(index=False))\n", + "\n", + "# Create comparison plot \n", + "plt.figure(figsize=(12, 6))\n", + "\n", + "for i in range(len(df_comparison)):\n", + " plt.errorbar(i, df_comparison.loc[i, \"theta\"],\n", + " yerr=[[df_comparison.loc[i, \"theta\"] - df_comparison.loc[i, \"ci_lower\"]],\n", + " [df_comparison.loc[i, \"ci_upper\"] - df_comparison.loc[i, \"theta\"]]],\n", + " fmt='o', capsize=5, capthick=2, ecolor=palette[i], color=palette[i],\n", + " label=df_comparison.loc[i, \"Model\"], markersize=10, zorder=2)\n", + "plt.axhline(y=theta, color=palette[4], linestyle='--',\n", + " linewidth=2, label=\"True effect\", zorder=1)\n", + "\n", + "plt.title(\"Comparison across DoubleMLPLPR approaches\")\n", + "plt.ylabel(\"Coefficient Value\")\n", + "plt.xticks(range(4), df_comparison[\"Model\"], rotation=15, ha=\"right\")\n", + "plt.legend()\n", + "plt.grid(True, alpha=0.3)\n", + "plt.tight_layout()\n", + "plt.show()" + ] + }, + { + "cell_type": "markdown", + "metadata": {}, + "source": [ + "We again see that the `wg_approx` leads to a biased estimate in the non-linear and discontinuous `dgp3` setting. The approaches `cre_general`, `cre_normal`, `fd_exact`, in combination with [LightGBM](https://lightgbm.readthedocs.io/en/stable/) regressors, tuned using the [Optuna](https://optuna.org/) package, lead to estimate close to the true treatment effect.\n", + "\n", + "This is line with the simulation results in [Clarke and Polselli (2025)](https://doi.org/10.1093/ectj/utaf011), albeit only for only one dataset in this example." + ] + } + ], + "metadata": { + "kernelspec": { + "display_name": ".venv", + "language": "python", + "name": "python3" + }, + "language_info": { + "codemirror_mode": { + "name": "ipython", + "version": 3 + }, + "file_extension": ".py", + "mimetype": "text/x-python", + "name": "python", + "nbconvert_exporter": "python", + "pygments_lexer": "ipython3", + "version": "3.13.9" + }, + "orig_nbformat": 4 + }, + "nbformat": 4, + "nbformat_minor": 2 +} diff --git a/doc/guide/data/panel_data.rst b/doc/guide/data/panel_data.rst index 6c69a0f4..0b6f7e26 100644 --- a/doc/guide/data/panel_data.rst +++ b/doc/guide/data/panel_data.rst @@ -1,4 +1,4 @@ -The ``DoubleMLPanelData`` class serves as data-backend for :ref:`DiD models ` and can be initialized from a dataframe. +The ``DoubleMLPanelData`` class serves as data-backend for :ref:`DiD models `, as well as the :ref:`DoubleMLPLPR model `, and can be initialized from a dataframe. 
The class is a subclass of :ref:`DoubleMLData ` and inherits all methods and attributes. Furthermore, it provides additional methods and attributes to handle panel data. @@ -7,6 +7,7 @@ Key arguments * ``id_col``: column to with unique identifiers for each unit * ``t_col``: column to specify the time periods of the observation +* ``static_panel``: Indicates whether the data model corresponds to a static panel data approach (``True``, used for the ``DoubleMLPLPR`` model) or to staggered adoption panel data (``False``, for :ref:`DiD models `) which is the default option. * ``datetime_unit``: unit of the time periods (e.g. 'Y', 'M', 'D', 'h', 'm', 's') .. note:: @@ -39,3 +40,29 @@ Example usage ) print(dml_data) + + +.. tab-set:: + + .. tab-item:: Python + :sync: py + + .. ipython:: python + + import numpy as np + import doubleml as dml + from doubleml.plm.datasets import make_plpr_CP2025 + + np.random.seed(42) + df = make_plpr_CP2025(num_id=100, num_t=5, x_dim=5) + dml_data = dml.data.DoubleMLPanelData( + df, + y_col="y", + d_cols="d", + id_col="id", + t_col="t", + x_cols=["x1", "x2", "x3", "x4", "x5"], + static_panel=True + ) + + print(dml_data) diff --git a/doc/guide/models/plm/plm_models.inc b/doc/guide/models/plm/plm_models.inc index 32fceeec..824ab840 100644 --- a/doc/guide/models/plm/plm_models.inc +++ b/doc/guide/models/plm/plm_models.inc @@ -99,6 +99,38 @@ Logistic partially linear regression model (LPLR) dml_lplr_obj.fit().summary +.. _plpr-model: + +Partially linear panel regression model (PLPR) +********************************************** + +.. include:: /guide/models/plm/plpr.rst + +``DoubleMLPLPR`` implements PLPR models. Estimation is conducted via its ``fit()`` method. + +.. tab-set:: + + .. tab-item:: Python + :sync: py + + .. ipython:: python + + import numpy as np + import doubleml as dml + from doubleml.plm.datasets import make_plpr_CP2025 + from sklearn.linear_model import LassoCV + from sklearn.base import clone + np.random.seed(3142) + learner = LassoCV() + ml_l = clone(learner) + ml_m = clone(learner) + data = make_plpr_CP2025(num_id=250, num_t=10, dim_x=30, theta=0.5, dgp_type='dgp1') + obj_dml_data = DoubleMLPanelData(data, 'y', 'd', 'time', 'id', static_panel=True) + dml_plpr_obj = DoubleMLPLPR(obj_dml_data, ml_l, ml_m) + dml_plpr_obj.fit() + print(dml_plpr_obj) + + .. _pliv-model: Partially linear IV regression model (PLIV) diff --git a/doc/guide/models/plm/plpr.rst b/doc/guide/models/plm/plpr.rst new file mode 100644 index 00000000..660b72ee --- /dev/null +++ b/doc/guide/models/plm/plpr.rst @@ -0,0 +1,183 @@ +Suppose a panel study observes each of :math:`N` individuals over :math:`T` time periods (or waves). +For each individual :math:`i=1,\dots,N` and each period :math:`t=1,\dots,T`, the data consists of the +triple :math:`(Y_it, D_it, X_it)`. Let :math:`\{(Y_it, D_it, X_it) : t = 1, \dots , T\}_{i=1}^N` +denote :math:`N` independent and identically distributed (iid) random vectors, where each vector +corresponds to individual :math:`i` observed across all T waves. + +.. note:: + The notation and identifying assumptions are based on `Clarke and Polselli (2025) `_, with some small adjustments to better fit into the general package documentation conventions, sometimes slightly abusing notation. + See also the R package `xtdml `_ implementation and corresponding reference `Polselli (2025) `_ for further details. + +**Partially linear panel regression (PLPR)** models `(Clarke and Polselli, 2025) `_ take the form + +.. 
+ + Y_{it} = \theta_0 D_{it} + g_1(X_{it}) + \alpha_i^* + U_{it}, & &\mathbb{E}(U_{it} | D_{it}, X_{it}, \alpha_i^*) = 0, + + D_{it} = m_1(X_{it}) + \gamma_i + V_{it}, & &\mathbb{E}(V_{it} | X_{it}, \gamma_i) = 0, + +where :math:`Y_{it}` is the outcome variable and :math:`D_{it}` is the policy variable of interest. +Further note that :math:`\mathbb{E}[\alpha_i^* | D_{it}, X_{it}] \neq 0`. The high-dimensional +vector :math:`X_{it} = (X_{it,1}, \ldots, X_{it,p})` consists of other confounding covariates. +:math:`\alpha_i^*` and :math:`\gamma_i` represent unobserved individual heterogeneity terms, +correlated with the covariates. :math:`U_{it}` and :math:`V_{it}` are stochastic errors. + +Alternatively, one can write the *partialling-out* PLPR model as + +.. math:: + + Y_{it} = \theta_0 V_{it} + \ell_1(X_{it}) + \alpha_i + U_{it}, + + V_{it} = D_{it} - m_1(X_{it}) - \gamma_i, + +where :math:`\alpha_i` is a fixed effect. + +To account for the presence of unobserved heterogeneity, `Clarke and Polselli (2025) `_ +use different panel data approaches, under the following assumptions. + +Define potential outcomes :math:`Y_{it}(d)` for individual :math:`i` at wave :math:`t`, where realizations are +linked to the observed outcome by the consistency assumption :math:`Y_{it}(d_{it}) = Y_{it}`. + +Let :math:`\xi_i` represent time-invariant heterogeneity terms influencing outcome and treatment. +Further, define :math:`L_{t-1}(W_i) = \{W_{i1}, \dots, W_{it-1}\}` as lags of a random variable :math:`W_{it}` +at wave :math:`t`. + +Assumptions `(Clarke and Polselli, 2025) `_: + +- **No feedback to predictors** + :math:`X_{it} \perp L_{t-1} (Y_i, D_i) | L_{t-1} (X_i), \xi_i` + +- **Static panel** + :math:`Y_{it}, D_{it} \perp L_{t-1} (Y_i, X_i, D_i) | X_{it}, \xi_i` + +- **Selection on observables and omitted time-invariant variables** + :math:`Y_{it}(\cdot) \perp D_{it} | X_{it}, \xi_i` + +- **Homogeneity and linearity of the treatment effect** + :math:`\mathbb{E} [Y_{it}(d) - Y_{it}(0) | X_{it}, \xi_i] = d \theta_0` + +- **Additive Separability** + :math:`\mathbb{E} [Y_{it}(0) | X_{it}, \xi_i] = g_1(X_{it}) + \alpha^*_i` with :math:`\alpha^*_i = \alpha^*(\xi_i)`, + + :math:`\mathbb{E} [D_{it} | X_{it}, \xi_i] = m_1(X_{it}) + \gamma_i` with :math:`\gamma_i = \gamma(\xi_i)` + + +**Correlated Random Effects (CRE) Approaches** + +These approaches convert the fixed-effects model into a random-effects specification using the +Mundlak device `(Mundlak, 1978) `_. + +Given the set of assumptions, the PLPR model under the CRE approaches takes the form + +.. math:: + + Y_{it} = \theta_0 D_{it} + \tilde{g}_1 (X_{it}, \bar{X}_i) + a_i + U_{it}, + + D_{it} = \tilde{m}_1(X_{it}, \bar{X}_i) + c_i + V_{it}. + +For the *partialling-out* PLPR + +.. math:: + + Y_{it} = \theta_0 V_{it} + \tilde{\ell}_1(X_{it}, \bar{X}_i) + a_i + U_{it}, + + V_{it} = D_{it} - \tilde{m}_1(X_{it}, \bar{X}_i) - c_i, + +where :math:`a_i` and :math:`c_i` are random effects and :math:`\bar{X}_i = T^{-1} \sum_{t=1}^{T} X_{it}` denotes the unit-specific covariate means.
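+
+In practice, the CRE approaches only require augmenting the covariates with their unit-specific means before the nuisance learners are fit. The following minimal pandas sketch illustrates the construction of the Mundlak means :math:`\bar{X}_i`; the column names and the small example panel are purely illustrative, and ``DoubleMLPLPR`` is expected to handle this construction internally when a CRE approach is selected.
+
+.. code-block:: python
+
+    import numpy as np
+    import pandas as pd
+
+    # illustrative panel with unit identifier "id", time period "t" and covariates x1, x2
+    rng = np.random.default_rng(0)
+    df = pd.DataFrame({
+        "id": np.repeat(np.arange(5), 4),
+        "t": np.tile(np.arange(4), 5),
+        "x1": rng.normal(size=20),
+        "x2": rng.normal(size=20),
+    })
+
+    # Mundlak device: unit-specific covariate means, broadcast back to every (i, t) row
+    for col in ["x1", "x2"]:
+        df[col + "_bar"] = df.groupby("id")[col].transform("mean")
+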
+**Transformation Approaches** + +These approaches remove individual heterogeneity from the model by transforming the data. +For some random variable :math:`W_{it}`, define the First-Difference (FD) transformation +:math:`Q(W_{it}) = W_{it} - W_{it-1}` (for :math:`t=2, \dots, T`), and the Within-Group (WG) +transformation :math:`Q(W_{it}) = W_{it} - \bar{W}_{i}`, where :math:`\bar{W}_{i} = T^{-1} \sum_{t=1}^T W_{it}`. + +The PLPR model under the transformation approaches takes the form + +.. math:: + + Q(Y_{it}) = \theta_0 Q(D_{it}) + Q(g_1(X_{it})) + Q(U_{it}), + + Q(D_{it}) = Q(m_1(X_{it})) + Q(V_{it}). + +For the *partialling-out* PLPR + +.. math:: + + Q(Y_{it}) = \theta_0 Q(V_{it}) + Q(\ell_1(X_{it})) + Q(U_{it}), + + Q(V_{it}) = Q(D_{it}) - Q(m_1(X_{it})). + +These transformations remove the fixed effect terms, as :math:`Q(\alpha_i^*)=Q(\alpha_i)=Q(\gamma_i)=0`.
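+
+As a brief illustration, both transformations are simple column-wise operations on the panel. The following pandas sketch uses purely illustrative column names; ``DoubleMLPLPR`` is expected to apply the selected transformation internally, so the sketch is only meant to build intuition.
+
+.. code-block:: python
+
+    import numpy as np
+    import pandas as pd
+
+    # illustrative balanced panel with unit identifier "id" and time period "t"
+    rng = np.random.default_rng(1)
+    df = pd.DataFrame({
+        "id": np.repeat(np.arange(5), 4),
+        "t": np.tile(np.arange(4), 5),
+        "y": rng.normal(size=20),
+        "d": rng.normal(size=20),
+        "x1": rng.normal(size=20),
+    })
+    cols = ["y", "d", "x1"]
+
+    # First-Difference transformation Q(W_it) = W_it - W_i,t-1 (first wave per unit is dropped)
+    df_fd = df.sort_values(["id", "t"]).copy()
+    df_fd[cols] = df_fd.groupby("id")[cols].diff()
+    df_fd = df_fd.dropna(subset=cols)
+
+    # Within-Group transformation Q(W_it) = W_it minus the unit mean of W
+    df_wg = df.copy()
+    df_wg[cols] = df_wg[cols] - df_wg.groupby("id")[cols].transform("mean")
+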
+**Implementation** + +``DoubleMLPLPR`` implements the estimation and requires :ref:`DoubleMLPanelData ` with parameter ``static_panel=True`` as input. +Unit identifier and time period columns are set with ``id_col`` and ``t_col`` in :ref:`DoubleMLPanelData `. + +The model described in `Clarke and Polselli (2025) `_ uses block-k-fold sample splitting, where the entire time series +of the sampled unit is allocated to one fold to allow for possible serial correlation within each unit, which is often the case for panel data. Furthermore, +cluster-robust standard errors are employed. ``DoubleMLPLPR`` implements both of these aspects by using ``id_col`` as the cluster variable internally; see the example notebook +`Python: Cluster Robust Double Machine Learning `_. + +The ``DoubleMLPLPR`` model includes four different estimation approaches. The first two are correlated random effects (CRE) variants, the latter +two are transformation approaches. The approach is selected with the ``approach`` parameter. + +``approach='cre_general'``: + +- Learn :math:`\tilde{\ell}_1 (X_{it}, \bar{X}_i)` from :math:`\{ Y_{it}, X_{it}, \bar{X}_i : t=1,\dots, T \}_{i=1}^N`, + +- First learn :math:`\tilde{m}_1(X_{it}, \bar{X}_i)` from :math:`\{ D_{it}, X_{it}, \bar{X}_i : t=1,\dots, T \}_{i=1}^N`, with predictions :math:`\hat{m}_{1,it} = \tilde{m}_1 (X_{it}, \bar{X}_i)` + + - Calculate :math:`\hat{\bar{m}}_i = T^{-1} \sum_{t=1}^T \hat{m}_{1,it}`, + + - Calculate final nuisance part as :math:`\hat{m}^*_1 (X_{it}, \bar{X}_i, \bar{D}_i) = \hat{m}_{1,it} + \bar{D}_i - \hat{\bar{m}}_i`, + + where :math:`\hat{m}^*_1 (X_{it}, \bar{X}_i, \bar{D}_i) = \mathbb{E}[D_{it} | X_{it}, \bar{X}_i] + c_i`. + +- :math:`g_1` can be learnt iteratively from :math:`\{ Y_{it}, X_{it}, \bar{X}_i : t=1,\dots, T \}_{i=1}^N` using estimates for :math:`\tilde{\ell}_1, \tilde{m}_1`. + +``approach='cre_normal'`` + +Under the assumption that the conditional distribution :math:`D_{i1}, \dots, D_{iT} | X_{i1}, \dots, X_{iT}` is multivariate normal (see `Clarke and Polselli (2025) `_ for further details): + +- Learn :math:`\tilde{\ell}_1 (X_{it}, \bar{X}_i)` from :math:`\{ Y_{it}, X_{it}, \bar{X}_i : t=1,\dots, T \}_{i=1}^N`, + +- Learn :math:`m^*_{1}` from :math:`\{ D_{it}, X_{it}, \bar{X}_i, \bar{D}_i: t=1,\dots, T \}_{i=1}^N`, + +- :math:`g_1` can be learnt iteratively from :math:`\{ Y_{it}, X_{it}, \bar{X}_i : t=1,\dots, T \}_{i=1}^N` using estimates for :math:`\tilde{\ell}_1, \tilde{m}_1`. + +``approach='fd_exact'`` + +Consider the First-Difference (FD) transformation :math:`Q(W_{it})= W_{it} - W_{it-1}`. Under the assumptions from above, +`Clarke and Polselli (2025) `_ show that :math:`\mathbb{E}[Y_{it}-Y_{it-1} | X_{it-1},X_{it}] =\Delta \ell_1 (X_{it-1}, X_{it})` and +:math:`\mathbb{E}[D_{it}-D_{it-1} | X_{it-1},X_{it}] =\Delta m_1 (X_{it-1}, X_{it})`. Therefore, the transformed nuisance functions can be learnt as + +- Learn :math:`\Delta \ell_1 (X_{it-1}, X_{it})` from :math:`\{ Y_{it}-Y_{it-1}, X_{it-1}, X_{it} : t=2, \dots , T \}_{i=1}^N`, + +- Learn :math:`\Delta m_1 (X_{it-1}, X_{it})` from :math:`\{ D_{it}-D_{it-1}, X_{it-1}, X_{it} : t=2, \dots , T \}_{i=1}^N`, + +- :math:`\Delta g_1 (X_{it-1}, X_{it})` can be learnt iteratively from :math:`\{ Y_{it}-Y_{it-1}, X_{it-1}, X_{it} : t=2, \dots , T \}_{i=1}^N` using estimates for :math:`\Delta \ell_1, \Delta m_1`. + +``approach='wg_approx'`` + +Consider the Within-Group (WG) transformation :math:`Q(W_{it})= W_{it} - \bar{W}_{i}`, where :math:`\bar{W}_{i} = T^{-1} \sum_{t=1}^T W_{it}`. +Approximating the model gives + +.. math:: + \begin{align*} + Q(Y_{it}) &\approx \theta_0 Q(D_{it}) + g_1 (Q(X_{it})) + Q(U_{it}), \\ + Q(D_{it}) &\approx m_1 (Q(X_{it})) + Q(V_{it}). + \end{align*} + +Similarly, + +.. math:: + Q(Y_{it}) &\approx \theta_0 Q(V_{it}) + \ell_1 (Q(X_{it})) + Q(U_{it}). + +- Learn :math:`\ell_1` from transformed data :math:`\{ Q(Y_{it}), Q(X_{it}) : t=1,\dots,T \}_{i=1}^N`, + +- Learn :math:`m_1` from transformed data :math:`\{ Q(D_{it}), Q(X_{it}) : t=1,\dots,T \}_{i=1}^N`, + +- :math:`g_1` can be learnt iteratively from :math:`\{ Q(Y_{it}), Q(X_{it}) : t=1,\dots,T \}_{i=1}^N`, using estimates for :math:`\ell_1, m_1`. diff --git a/doc/guide/scores/plm/plm_scores.inc b/doc/guide/scores/plm/plm_scores.inc index 7b6ad0c8..a1122ef1 100644 --- a/doc/guide/scores/plm/plm_scores.inc +++ b/doc/guide/scores/plm/plm_scores.inc @@ -14,6 +14,13 @@ Logistic partial linear regression (LPLR) .. include:: /guide/scores/plm/lplr_score.rst +.. _plpr-score: + +Partially linear panel regression (PLPR) +======================================== + +.. include:: /guide/scores/plm/plpr_score.rst + .. _pliv-score: Partially linear IV regression model (PLIV) diff --git a/doc/guide/scores/plm/plpr_score.rst b/doc/guide/scores/plm/plpr_score.rst new file mode 100644 index 00000000..b9b0c2c5 --- /dev/null +++ b/doc/guide/scores/plm/plpr_score.rst @@ -0,0 +1,111 @@ +For the PLPR model implemented in ``DoubleMLPLPR`` one can choose between +``score='partialling out'`` and ``score='IV-type'``. + +``score='partialling out'`` implements the score function: + +For the correlated random effects (CRE) approaches ``approach='cre_general'`` and ``approach='cre_normal'`` + +.. math:: + + \psi(W_{it}; \theta, \eta) &:= [Y_{it} - \tilde{\ell}(X_{it},\bar{X}_i) - \theta (D_{it} - \tilde{m}(X_{it},\bar{X}_i) - c_i)] [D_{it} - \tilde{m}(X_{it},\bar{X}_i) - c_i] + + &= - (D_{it} - \tilde{m}(X_{it},\bar{X}_i) - c_i) (D_{it} - \tilde{m}(X_{it},\bar{X}_i) - c_i) \theta + (Y_{it} - \tilde{\ell}(X_{it},\bar{X}_i)) (D_{it} - \tilde{m}(X_{it},\bar{X}_i) - c_i) + + &= \psi_a(W_{it}; \eta) \theta + \psi_b(W_{it}; \eta) + +with :math:`\eta=(\tilde{\ell},\tilde{m})`, where + +.. math:: + + \tilde{\ell}_0(X_{it},\bar{X}_i) &:= \mathbb{E}[Y_{it} \mid X_{it}, \bar{X}_i] = \theta_0\mathbb{E}[D_{it} \mid X_{it}, \bar{X}_i] + \tilde{g}(X_{it}, \bar{X}_i), + + \tilde{m}_0(X_{it},\bar{X}_i) + c_i &:= \mathbb{E}[D_{it} \mid X_{it}, \bar{X}_i]. + +The components of the linear score are + +.. math:: + + \psi_a(W_{it}; \eta) &= - (D_{it} - \tilde{m}(X_{it},\bar{X}_i) - c_i) (D_{it} - \tilde{m}(X_{it},\bar{X}_i) - c_i), + + \psi_b(W_{it}; \eta) &= (Y_{it} - \tilde{\ell}(X_{it},\bar{X}_i)) (D_{it} - \tilde{m}(X_{it},\bar{X}_i) - c_i).
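+
+In code, the score is selected together with the panel approach when constructing the model object. A minimal sketch, assuming a ``DoubleMLPanelData`` object ``obj_dml_data`` with ``static_panel=True`` and learners ``ml_l``, ``ml_m`` as in the model example, and assuming ``score`` and ``approach`` are passed as keyword arguments to the constructor:
+
+.. code-block:: python
+
+    from doubleml import DoubleMLPLPR
+
+    # sketch only: keyword names follow the documentation above; the exact signature may differ
+    # partialling-out score combined with the general CRE approach
+    dml_plpr_cre = DoubleMLPLPR(obj_dml_data, ml_l, ml_m,
+                                score='partialling out', approach='cre_general')
+
+    # the same score combined with the exact first-difference transformation
+    dml_plpr_fd = DoubleMLPLPR(obj_dml_data, ml_l, ml_m,
+                               score='partialling out', approach='fd_exact')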
+ + +For the transformation approaches ``approach='fd_exact'`` and ``approach='wg_approx'``, where :math:`Q(W_{it})` denotes the transformed variable :math:`W_{it}`, + +.. math:: + + \psi(Q(W_{it}); \theta, \eta) &:= [Q(Y_{it}) - Q(\ell(X_{it})) - \theta (Q(D_{it}) - Q(m(X_{it})))] [Q(D_{it}) - Q(m(X_{it}))] + + &= - (Q(D_{it}) - Q(m(X_{it}))) (Q(D_{it}) - Q(m(X_{it}))) \theta + (Q(Y_{it}) - Q(\ell(X_{it}))) (Q(D_{it}) - Q(m(X_{it}))) + + &= \psi_a(Q(W_{it}); \eta) \theta + \psi_b(Q(W_{it}); \eta) + +with :math:`\eta=(\ell,m)`, where + +.. math:: + + Q(\ell_0(X_{it})) &:= \mathbb{E}[Q(Y_{it}) \mid Q(X_{it})] = \theta_0\mathbb{E}[Q(D_{it}) \mid Q(X_{it})] + Q(g(X_{it})), + + Q(m_0(X_{it})) &:= \mathbb{E}[Q(D_{it}) \mid Q(X_{it})]. + +The components of the linear score are + +.. math:: + + \psi_a(Q(W_{it}); \eta) &= - (Q(D_{it}) - Q(m(X_{it}))) (Q(D_{it}) - Q(m(X_{it}))), + + \psi_b(Q(W_{it}); \eta) &= (Q(Y_{it}) - Q(\ell(X_{it}))) (Q(D_{it}) - Q(m(X_{it}))). + +``score='IV-type'`` implements the score function: + +For the correlated random effects (CRE) approaches ``approach='cre_general'`` and ``approach='cre_normal'`` + +.. math:: + + \psi(W_{it}; \theta, \eta) &:= [Y_{it} - D_{it} \theta - \tilde{g}(X_{it},\bar{X}_i)] [D_{it} - \tilde{m}(X_{it},\bar{X}_i) - c_i] + + &= - D_{it} (D_{it} - \tilde{m}(X_{it},\bar{X}_i) - c_i) \theta + (Y_{it} - \tilde{g}(X_{it},\bar{X}_i)) (D_{it} - \tilde{m}(X_{it},\bar{X}_i) - c_i) + + &= \psi_a(W_{it}; \eta) \theta + \psi_b(W_{it}; \eta) + +with :math:`\eta=(\tilde{g},\tilde{m})`, where + +.. math:: + + \tilde{g}_0(X_{it},\bar{X}_i) &:= \mathbb{E}[Y_{it} - D_{it} \theta_0 \mid X_{it},\bar{X}_i], + + \tilde{m}_0(X_{it},\bar{X}_i) + c_i &:= \mathbb{E}[D_{it} \mid X_{it}, \bar{X}_i]. + +The components of the linear score are + +.. math:: + + \psi_a(W_{it}; \eta) &= - D_{it} (D_{it} - \tilde{m}(X_{it},\bar{X}_i) - c_i), + + \psi_b(W_{it}; \eta) &= (Y_{it} - \tilde{g}(X_{it},\bar{X}_i)) (D_{it} - \tilde{m}(X_{it},\bar{X}_i) - c_i). + +For the transformation approaches ``approach='fd_exact'`` and ``approach='wg_approx'``, where :math:`Q(W_{it})` denotes the transformed variable :math:`W_{it}`, + +.. math:: + + \psi(Q(W_{it}); \theta, \eta) &:= [Q(Y_{it}) - Q(D_{it}) \theta - Q(g(X_{it}))] [Q(D_{it}) - Q(m(X_{it}))] + + &= - Q(D_{it}) (Q(D_{it}) - Q(m(X_{it}))) \theta + (Q(Y_{it}) - Q(g(X_{it}))) (Q(D_{it}) - Q(m(X_{it}))) + + &= \psi_a(Q(W_{it}); \eta) \theta + \psi_b(Q(W_{it}); \eta) + +with :math:`\eta=(g,m)`, where + +.. math:: + + Q(g_0(X_{it})) &:= \mathbb{E}[Q(Y_{it}) - Q(D_{it}) \theta_0 \mid Q(X_{it})], + + Q(m_0(X_{it})) &:= \mathbb{E}[Q(D_{it}) \mid Q(X_{it})]. + +The components of the linear score are + +.. math:: + + \psi_a(Q(W_{it}); \eta) &= - Q(D_{it}) (Q(D_{it}) - Q(m(X_{it}))), + + \psi_b(Q(W_{it}); \eta) &= (Q(Y_{it}) - Q(g(X_{it}))) (Q(D_{it}) - Q(m(X_{it}))). \ No newline at end of file diff --git a/doc/index.rst b/doc/index.rst index 742a9cf7..bf226f2f 100644 --- a/doc/index.rst +++ b/doc/index.rst @@ -263,7 +263,7 @@ Acknowledgements ---------------- Funding by the Deutsche Forschungsgemeinschaft (DFG, German Research -Foundation) is acknowledged – Project Number 431701914 and Grant GRK 2805/1. +Foundation) is acknowledged – Project Number 431701914, Grant GRK 2805/1 and Project Number 530859036.
References ---------- diff --git a/doc/literature/literature.rst b/doc/literature/literature.rst index 4869223e..ecde52e7 100644 --- a/doc/literature/literature.rst +++ b/doc/literature/literature.rst @@ -142,6 +142,12 @@ Double Machine Learning Literature :octicon:`link` :bdg-link-dark:`URL ` |hr| + - Paul S Clarke, Annalivia Polselli |br| + **Double machine learning for static panel models with fixed effects** |br| + *The Econometrics Journal, utaf011, 2025* |br| + :octicon:`link` :bdg-link-dark:`URL ` + |hr| + - Yusuke Narita, Shota Yasui, Kohei Yata |br| **Debiased Off-Policy Evaluation for Recommendation Systems** |br| *RecSys '21: Fifteenth ACM Conference on Recommender Systems, 372–379, 2021* |br|