1 Learning Goals
A typical secondary aim analysis of a SMART involves using the data to learn a set of decision rules for a more deeply-tailored adaptive intervention. A more deeply-tailored AI is an adaptive intervention that includes additional tailoring variables beyond those embedded by design. The hope is that the proposed AI leads to better outcomes through increased personalization. In this workbook, we will implement Q-learning to address this type of secondary aim.
Q-learning uses moderated regression models to assess the utility of candidate tailoring variables for tailoring first- and second-stage intervention options in a 2-stage SMART. Q-learning leads to a proposal for an AI that uses baseline and time-varying covariates to tailor treatment.
In this workbook we will:
- Learn how to implement Q-learning to estimate the effect of a more deeply-tailored AI
- Fit and interpret moderated regression models
- Implement the R package qlaci
2 Setup
Load required packages
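The package-loading chunk is not shown in the original; a minimal sketch of the libraries used throughout this workbook:

library(dplyr)      # filter(), mutate(), across(), bind_rows()
library(ggplot2)    # interaction plots
library(emmeans)    # marginal means: emmeans(), emmip()
library(knitr)      # kable() tables
library(kableExtra) # kable_styling(), scroll_box()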
3 Load simulated data
These are data that were simulated to mimic data arising from the ADHD SMART study (PI: William Pelham). An accompanying handout (“ADHD_SMART_handout.pdf”) describes the variables in the data set.
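The data-loading code is not shown; a sketch assuming a hypothetical file name for the simulated data set:

# The file name below is hypothetical; point it at your copy of the data
dat_adhd <- read.csv("adhd-simulated-data.csv")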
Examine data
ID | odd | severity | priormed | race | Y0 | A1 | R | NRtime | adherence | Y1 | A2 | Y2 | cell |
---|---|---|---|---|---|---|---|---|---|---|---|---|---|
1 | 1 | 2.880 | 0 | 1 | 2.321 | -1 | 0 | 4 | 0 | 2.791 | 1 | 0.598 | C |
2 | 0 | 4.133 | 0 | 0 | 2.068 | 1 | 1 | NA | 1 | 2.200 | NA | 4.267 | D |
3 | 1 | 5.569 | 0 | 1 | 1.004 | -1 | 1 | NA | 0 | 2.292 | NA | 1.454 | A |
4 | 0 | 4.931 | 0 | 1 | 3.232 | 1 | 0 | 4 | 0 | 3.050 | -1 | 6.762 | E |
5 | 1 | 5.502 | 0 | 1 | 1.477 | 1 | 0 | 6 | 0 | 1.732 | -1 | 3.580 | E |
6 | 0 | 5.497 | 0 | 1 | 1.720 | 1 | 0 | 3 | 0 | 2.400 | 1 | 2.075 | F |
7 | 0 | 6.786 | 0 | 1 | 2.265 | 1 | 0 | 7 | 0 | 2.837 | 1 | 2.594 | F |
8 | 0 | 4.317 | 0 | 1 | 2.814 | 1 | 1 | NA | 0 | 2.751 | NA | 4.051 | D |
9 | 1 | 9.088 | 1 | 1 | 1.896 | 1 | 0 | 6 | 0 | 3.594 | -1 | 2.970 | E |
10 | 0 | 6.094 | 0 | 1 | 2.503 | 1 | 0 | 5 | 1 | 2.546 | 1 | 6.022 | F |
11 | 0 | 2.016 | 0 | 1 | 2.810 | 1 | 0 | 4 | 0 | 1.916 | -1 | 5.561 | E |
12 | 0 | 4.308 | 0 | 1 | 2.650 | -1 | 1 | NA | 0 | 3.088 | NA | 2.476 | A |
13 | 0 | 6.290 | 0 | 1 | 2.188 | 1 | 0 | 2 | 0 | 1.733 | 1 | 3.188 | F |
14 | 0 | 3.972 | 0 | 1 | 2.281 | -1 | 0 | 8 | 1 | 2.845 | 1 | 3.116 | C |
15 | 0 | 5.862 | 0 | 1 | 1.588 | 1 | 0 | 8 | 1 | 1.681 | 1 | 4.182 | F |
16 | 0 | 6.086 | 1 | 1 | 2.105 | 1 | 0 | 8 | 1 | 2.296 | -1 | 1.920 | E |
17 | 0 | 8.259 | 1 | 1 | 1.695 | 1 | 0 | 4 | 0 | 2.279 | -1 | 3.230 | E |
18 | 1 | 3.371 | 1 | 1 | 2.222 | 1 | 0 | 3 | 0 | 1.669 | 1 | 0.920 | F |
19 | 1 | 4.958 | 0 | 1 | 2.084 | 1 | 0 | 5 | 1 | 2.605 | -1 | 3.940 | E |
20 | 1 | 3.581 | 1 | 0 | 0.614 | 1 | 0 | 4 | 0 | 0.681 | 1 | -2.254 | F |
21 | 0 | 2.182 | 0 | 0 | 2.204 | 1 | 0 | 5 | 0 | 2.580 | -1 | 5.381 | E |
22 | 0 | 0.378 | 0 | 1 | 2.608 | 1 | 1 | NA | 1 | 1.469 | NA | 5.617 | D |
23 | 0 | 5.489 | 0 | 1 | 2.278 | -1 | 1 | NA | 0 | 2.958 | NA | 2.872 | A |
24 | 0 | 6.504 | 0 | 1 | 2.098 | 1 | 0 | 5 | 0 | 2.445 | -1 | 4.461 | E |
25 | 0 | 6.598 | 0 | 0 | 1.858 | 1 | 0 | 4 | 0 | 1.701 | 1 | 2.978 | F |
26 | 0 | 1.525 | 1 | 1 | 3.259 | -1 | 0 | 5 | 1 | 4.138 | -1 | 5.591 | B |
27 | 1 | 2.176 | 1 | 1 | 2.010 | -1 | 0 | 7 | 1 | 2.007 | -1 | 2.408 | B |
28 | 0 | 2.147 | 1 | 1 | 2.181 | -1 | 0 | 8 | 1 | 3.052 | -1 | 5.045 | B |
29 | 0 | 3.278 | 1 | 1 | 2.387 | 1 | 0 | 8 | 0 | 2.151 | 1 | 0.497 | F |
30 | 0 | 5.425 | 0 | 1 | 2.381 | 1 | 0 | 5 | 1 | 2.467 | -1 | 4.281 | E |
31 | 0 | 6.486 | 1 | 1 | 1.998 | 1 | 0 | 8 | 1 | 1.226 | 1 | 1.439 | F |
32 | 1 | 2.962 | 0 | 1 | 2.249 | 1 | 1 | NA | 1 | 2.477 | NA | 5.463 | D |
33 | 0 | 7.385 | 0 | 0 | 1.923 | -1 | 1 | NA | 0 | 2.871 | NA | 2.200 | A |
34 | 1 | 6.538 | 0 | 1 | 1.801 | -1 | 1 | NA | 0 | 3.177 | NA | 1.842 | A |
35 | 0 | 3.248 | 0 | 0 | 2.705 | -1 | 0 | 4 | 0 | 3.179 | -1 | 4.034 | B |
36 | 0 | 2.316 | 0 | 1 | 2.339 | -1 | 0 | 5 | 0 | 2.904 | -1 | 4.301 | B |
37 | 0 | 4.250 | 1 | 1 | 2.787 | 1 | 0 | 8 | 0 | 3.016 | 1 | 2.971 | F |
38 | 0 | 5.270 | 0 | 1 | 2.867 | 1 | 0 | 5 | 0 | 2.840 | 1 | 3.508 | F |
39 | 0 | 6.808 | 0 | 1 | 1.975 | 1 | 1 | NA | 0 | 2.459 | NA | 3.786 | D |
40 | 0 | 8.475 | 0 | 1 | 1.953 | 1 | 0 | 7 | 1 | 2.447 | 1 | 5.279 | F |
41 | 1 | 2.878 | 1 | 0 | 1.349 | 1 | 0 | 5 | 0 | 0.680 | 1 | -1.502 | F |
42 | 0 | 5.727 | 0 | 1 | 3.119 | 1 | 0 | 2 | 1 | 3.293 | -1 | 4.111 | E |
43 | 1 | 8.335 | 1 | 1 | 1.923 | -1 | 1 | NA | 1 | 2.990 | NA | 4.761 | A |
44 | 1 | 2.747 | 1 | 1 | 1.990 | 1 | 1 | NA | 0 | 1.736 | NA | 1.723 | D |
45 | 1 | 4.443 | 0 | 1 | 1.788 | -1 | 0 | 8 | 0 | 3.066 | 1 | 1.169 | C |
46 | 0 | 0.571 | 0 | 1 | 2.965 | -1 | 0 | 3 | 0 | 3.159 | 1 | 1.781 | C |
47 | 0 | 5.719 | 1 | 1 | 2.176 | 1 | 0 | 2 | 1 | 1.956 | 1 | 4.637 | F |
48 | 0 | 3.397 | 1 | 1 | 2.850 | 1 | 0 | 4 | 1 | 3.320 | 1 | 4.529 | F |
49 | 0 | 2.688 | 0 | 1 | 2.544 | -1 | 0 | 7 | 0 | 3.180 | -1 | 3.628 | B |
50 | 0 | 3.050 | 1 | 1 | 2.972 | 1 | 0 | 3 | 0 | 3.053 | 1 | 0.634 | F |
51 | 1 | 6.111 | 0 | 1 | 2.159 | 1 | 0 | 6 | 0 | 3.662 | 1 | 3.853 | F |
52 | 1 | 3.936 | 0 | 1 | 1.938 | -1 | 0 | 6 | 0 | 2.685 | 1 | 0.217 | C |
53 | 0 | 2.466 | 0 | 1 | 2.677 | 1 | 1 | NA | 1 | 2.473 | NA | 6.786 | D |
54 | 1 | 2.595 | 0 | 1 | 2.129 | -1 | 1 | NA | 0 | 2.717 | NA | 2.857 | A |
55 | 1 | 6.187 | 0 | 0 | 1.507 | -1 | 0 | 2 | 1 | 2.873 | -1 | 0.820 | B |
56 | 0 | 5.879 | 0 | 1 | 1.872 | 1 | 1 | NA | 1 | 2.391 | NA | 4.780 | D |
57 | 1 | 4.732 | 0 | 1 | 1.189 | 1 | 0 | 3 | 0 | 2.239 | -1 | 2.980 | E |
58 | 0 | 6.344 | 0 | 1 | 2.148 | -1 | 0 | 3 | 1 | 3.665 | -1 | 2.556 | B |
59 | 0 | 3.495 | 0 | 1 | 1.542 | -1 | 0 | 5 | 0 | 1.286 | -1 | 1.233 | B |
60 | 1 | 7.886 | 1 | 1 | 2.282 | -1 | 1 | NA | 0 | 3.416 | NA | 4.560 | A |
61 | 0 | 4.423 | 1 | 1 | 2.077 | -1 | 1 | NA | 1 | 2.955 | NA | 5.083 | A |
62 | 0 | 4.087 | 0 | 1 | 1.597 | 1 | 0 | 8 | 1 | 1.305 | 1 | 3.234 | F |
63 | 1 | 5.394 | 0 | 1 | 1.394 | 1 | 0 | 5 | 1 | 1.786 | 1 | 5.340 | F |
64 | 1 | 6.915 | 0 | 1 | 2.176 | -1 | 1 | NA | 1 | 3.606 | NA | 3.198 | A |
65 | 0 | 6.065 | 1 | 1 | 2.167 | 1 | 0 | 8 | 1 | 2.629 | 1 | 3.299 | F |
66 | 0 | 7.318 | 1 | 1 | 1.382 | 1 | 1 | NA | 0 | 1.680 | NA | 0.335 | D |
67 | 1 | 5.115 | 1 | 1 | 2.059 | 1 | 1 | NA | 1 | 2.599 | NA | 3.242 | D |
68 | 0 | 3.739 | 0 | 1 | 2.006 | -1 | 0 | 6 | 0 | 2.664 | -1 | 1.805 | B |
69 | 0 | 4.249 | 0 | 1 | 1.821 | -1 | 0 | 4 | 0 | 2.355 | -1 | 1.650 | B |
70 | 0 | 7.057 | 1 | 1 | 1.867 | 1 | 0 | 6 | 1 | 1.765 | 1 | 2.779 | F |
71 | 1 | 2.877 | 0 | 1 | 1.927 | -1 | 0 | 5 | 0 | 3.615 | 1 | 0.719 | C |
72 | 1 | 2.616 | 0 | 1 | 2.027 | -1 | 0 | 6 | 1 | 3.535 | 1 | 3.452 | C |
73 | 0 | 5.784 | 0 | 1 | 1.554 | 1 | 0 | 6 | 0 | 0.875 | -1 | 4.558 | E |
74 | 1 | 6.531 | 0 | 1 | 1.131 | -1 | 0 | 5 | 1 | 2.344 | -1 | 0.366 | B |
75 | 0 | 0.772 | 0 | 1 | 1.593 | 1 | 0 | 7 | 0 | 0.769 | 1 | 0.737 | F |
76 | 1 | 6.962 | 1 | 0 | 1.798 | 1 | 1 | NA | 0 | 2.537 | NA | 1.504 | D |
77 | 1 | 5.030 | 0 | 1 | 1.695 | -1 | 0 | 2 | 1 | 3.142 | -1 | 3.474 | B |
78 | 0 | 4.547 | 0 | 1 | 2.394 | -1 | 1 | NA | 0 | 3.609 | NA | 2.942 | A |
79 | 1 | 7.767 | 0 | 1 | 1.587 | -1 | 1 | NA | 0 | 2.875 | NA | 0.960 | A |
80 | 0 | 8.698 | 0 | 0 | 1.852 | -1 | 1 | NA | 1 | 2.972 | NA | 2.292 | A |
81 | 1 | 2.132 | 0 | 1 | 2.038 | -1 | 1 | NA | 0 | 3.048 | NA | 2.602 | A |
82 | 1 | 7.529 | 1 | 1 | 1.406 | -1 | 1 | NA | 0 | 3.005 | NA | 3.043 | A |
83 | 0 | 3.420 | 1 | 1 | 2.948 | -1 | 1 | NA | 1 | 3.493 | NA | 6.463 | A |
84 | 1 | 5.331 | 0 | 1 | 1.752 | 1 | 0 | 2 | 1 | 1.536 | 1 | 5.077 | F |
85 | 1 | 6.202 | 0 | 1 | 1.404 | 1 | 0 | 3 | 1 | 2.111 | 1 | 4.198 | F |
86 | 0 | 5.333 | 0 | 1 | 2.100 | 1 | 0 | 8 | 0 | 1.590 | 1 | 1.029 | F |
87 | 0 | 1.343 | 0 | 0 | 2.868 | -1 | 1 | NA | 1 | 2.099 | NA | 1.309 | A |
88 | 1 | 3.734 | 1 | 1 | 2.061 | 1 | 0 | 6 | 0 | 2.152 | -1 | 2.221 | E |
89 | 0 | 8.849 | 1 | 1 | 1.928 | -1 | 1 | NA | 0 | 3.993 | NA | 4.577 | A |
90 | 0 | 5.432 | 0 | 1 | 2.279 | 1 | 0 | 8 | 0 | 1.751 | 1 | 3.198 | F |
91 | 0 | 2.910 | 0 | 1 | 2.446 | -1 | 0 | 7 | 0 | 2.452 | 1 | -0.182 | C |
92 | 1 | 3.360 | 0 | 1 | 1.553 | -1 | 0 | 6 | 1 | 3.167 | -1 | 1.481 | B |
93 | 1 | 7.644 | 0 | 1 | 0.970 | -1 | 1 | NA | 0 | 1.976 | NA | -0.457 | A |
94 | 0 | 5.865 | 0 | 0 | 1.626 | 1 | 1 | NA | 0 | 1.479 | NA | 2.888 | D |
95 | 1 | 2.757 | 0 | 1 | 2.111 | -1 | 1 | NA | 1 | 3.116 | NA | 2.896 | A |
96 | 0 | 5.016 | 0 | 1 | 2.062 | 1 | 0 | 7 | 0 | 2.080 | 1 | 2.314 | F |
97 | 1 | 6.492 | 0 | 1 | 1.747 | 1 | 1 | NA | 0 | 2.240 | NA | 4.268 | D |
98 | 0 | 9.306 | 0 | 1 | 1.439 | -1 | 0 | 3 | 0 | 2.684 | -1 | 1.358 | B |
99 | 1 | 2.796 | 0 | 1 | 1.608 | 1 | 0 | 6 | 1 | 1.917 | -1 | 3.311 | E |
100 | 0 | 4.050 | 0 | 1 | 1.463 | 1 | 0 | 2 | 1 | 1.846 | -1 | 2.443 | E |
101 | 1 | 7.490 | 0 | 0 | 1.077 | -1 | 0 | 2 | 0 | 2.127 | -1 | 0.846 | B |
102 | 0 | 3.852 | 0 | 1 | 2.190 | 1 | 0 | 3 | 0 | 1.993 | 1 | 2.577 | F |
103 | 0 | 8.651 | 0 | 1 | 1.648 | -1 | 1 | NA | 0 | 2.730 | NA | 1.069 | A |
104 | 0 | 7.608 | 0 | 1 | 2.003 | 1 | 1 | NA | 0 | 2.980 | NA | 4.693 | D |
105 | 1 | 0.852 | 1 | 1 | 2.636 | 1 | 0 | 5 | 1 | 2.200 | 1 | 3.382 | F |
106 | 0 | 6.060 | 1 | 1 | 1.964 | 1 | 0 | 6 | 0 | 1.601 | 1 | -0.108 | F |
107 | 0 | 1.452 | 1 | 1 | 2.756 | -1 | 1 | NA | 1 | 3.350 | NA | 5.912 | A |
108 | 0 | 1.338 | 1 | 1 | 2.648 | -1 | 1 | NA | 1 | 3.104 | NA | 4.817 | A |
109 | 1 | 5.176 | 1 | 1 | 1.416 | -1 | 0 | 2 | 0 | 2.115 | 1 | 1.030 | C |
110 | 1 | 5.142 | 0 | 1 | 1.895 | 1 | 1 | NA | 0 | 3.252 | NA | 3.941 | D |
111 | 0 | 4.918 | 0 | 1 | 2.131 | -1 | 0 | 5 | 1 | 3.795 | -1 | 2.873 | B |
112 | 1 | 8.161 | 0 | 1 | 1.577 | -1 | 0 | 6 | 1 | 2.387 | -1 | 1.688 | B |
113 | 0 | 5.737 | 0 | 1 | 2.624 | -1 | 0 | 2 | 1 | 4.386 | 1 | 4.667 | C |
114 | 1 | 3.317 | 0 | 1 | 1.613 | -1 | 0 | 8 | 0 | 1.872 | 1 | -1.160 | C |
115 | 1 | 5.299 | 0 | 1 | 1.656 | -1 | 1 | NA | 0 | 2.877 | NA | 1.997 | A |
116 | 0 | 3.714 | 0 | 1 | 1.957 | -1 | 1 | NA | 0 | 1.977 | NA | 1.564 | A |
117 | 1 | 4.830 | 1 | 1 | 2.297 | 1 | 0 | 3 | 0 | 2.349 | -1 | 3.213 | E |
118 | 1 | 2.130 | 1 | 0 | 2.114 | -1 | 0 | 8 | 0 | 2.634 | -1 | 3.994 | B |
119 | 0 | 4.712 | 0 | 1 | 1.952 | -1 | 0 | 5 | 1 | 2.637 | -1 | 0.924 | B |
120 | 1 | 5.257 | 0 | 1 | 1.527 | 1 | 1 | NA | 1 | 1.631 | NA | 4.876 | D |
121 | 0 | 3.246 | 0 | 1 | 2.099 | -1 | 0 | 7 | 0 | 1.644 | 1 | 0.379 | C |
122 | 0 | 5.103 | 0 | 1 | 2.133 | 1 | 0 | 2 | 0 | 2.405 | -1 | 5.741 | E |
123 | 0 | 3.378 | 0 | 0 | 1.393 | 1 | 1 | NA | 0 | 1.055 | NA | 2.530 | D |
124 | 1 | 3.390 | 0 | 1 | 1.524 | -1 | 1 | NA | 0 | 2.599 | NA | 1.039 | A |
125 | 0 | 4.489 | 1 | 1 | 1.171 | 1 | 0 | 4 | 1 | 1.001 | -1 | 0.838 | E |
126 | 0 | 6.227 | 1 | 0 | 2.206 | -1 | 0 | 2 | 1 | 3.152 | 1 | 4.635 | C |
127 | 1 | 4.487 | 0 | 1 | 2.005 | -1 | 0 | 4 | 0 | 2.863 | -1 | 2.960 | B |
128 | 1 | 5.477 | 0 | 1 | 2.152 | -1 | 0 | 8 | 0 | 3.665 | -1 | 3.994 | B |
129 | 0 | 7.425 | 1 | 1 | 1.964 | -1 | 0 | 8 | 0 | 2.697 | -1 | 5.092 | B |
130 | 0 | 4.287 | 0 | 0 | 2.123 | -1 | 0 | 2 | 0 | 2.518 | -1 | 2.315 | B |
131 | 0 | 4.741 | 0 | 1 | 1.952 | 1 | 1 | NA | 0 | 2.134 | NA | 3.617 | D |
132 | 1 | 6.278 | 0 | 1 | 1.329 | -1 | 0 | 6 | 1 | 2.891 | -1 | 1.090 | B |
133 | 0 | 1.855 | 0 | 1 | 3.124 | 1 | 1 | NA | 1 | 3.572 | NA | 6.706 | D |
134 | 1 | 6.360 | 1 | 0 | 1.003 | -1 | 0 | 4 | 1 | 1.654 | 1 | 3.676 | C |
135 | 1 | 3.663 | 0 | 1 | 2.656 | 1 | 0 | 7 | 0 | 3.281 | -1 | 6.073 | E |
136 | 0 | 2.155 | 0 | 1 | 2.626 | 1 | 1 | NA | 1 | 2.152 | NA | 5.881 | D |
137 | 1 | 0.580 | 1 | 0 | 2.659 | 1 | 1 | NA | 1 | 2.595 | NA | 3.440 | D |
138 | 0 | 3.213 | 0 | 1 | 3.112 | -1 | 0 | 6 | 0 | 3.405 | 1 | 2.047 | C |
139 | 1 | 2.074 | 0 | 1 | 1.632 | -1 | 0 | 2 | 0 | 1.234 | -1 | 1.646 | B |
140 | 0 | 2.054 | 0 | 0 | 2.698 | -1 | 0 | 5 | 1 | 3.245 | 1 | 3.529 | C |
141 | 1 | 9.068 | 0 | 1 | 1.315 | -1 | 0 | 5 | 0 | 2.759 | 1 | 0.192 | C |
142 | 0 | 1.970 | 0 | 0 | 1.827 | 1 | 1 | NA | 0 | 1.190 | NA | 3.028 | D |
143 | 0 | 7.494 | 1 | 0 | 1.929 | -1 | 0 | 6 | 0 | 2.208 | 1 | 0.591 | C |
144 | 1 | 6.363 | 1 | 1 | 1.712 | -1 | 0 | 8 | 0 | 2.984 | 1 | 1.860 | C |
145 | 1 | 2.788 | 0 | 0 | 1.679 | -1 | 0 | 6 | 1 | 3.986 | -1 | 2.303 | B |
146 | 0 | 3.472 | 0 | 1 | 1.973 | 1 | 0 | 2 | 0 | 1.971 | 1 | 2.107 | F |
147 | 0 | 3.975 | 0 | 1 | 1.789 | 1 | 0 | 2 | 1 | 1.044 | 1 | 3.452 | F |
148 | 0 | 5.476 | 1 | 1 | 2.240 | 1 | 0 | 3 | 1 | 3.096 | -1 | 2.395 | E |
149 | 0 | 4.129 | 1 | 1 | 1.984 | 1 | 1 | NA | 0 | 1.748 | NA | 1.786 | D |
150 | 1 | 7.067 | 1 | 1 | 1.426 | 1 | 0 | 3 | 1 | 1.403 | 1 | 2.204 | F |
4 Q-learning
Q-learning is an extension of moderator analysis that uses backward induction to learn a set of optimal decision rules for a more deeply-tailored adaptive intervention. The proposed adaptive intervention accounts for synergy between the first- and second-stage decision rules. Crucially, working backward one stage at a time bypasses the causal problems (such as conditioning on post-treatment variables) that arise when a single regression model spans both stages.
Q-learning has 3 steps:

1. Fit a stage 2 moderated regression model, \[Y_2 \sim A_2 \mid S_0, A_1, S_1,\] and obtain the optimal stage 2 decision rule \(A^{opt}_2(S_0, A_1, S_1)\).
2. Predict the stage 2 outcome under the optimal decision rule: \(\hat{Y}_{2i}(A_2^{opt})\).
3. Fit a stage 1 moderated regression model, \[\hat{Y}_2(A_2^{opt}) \sim A_1 \mid S_0,\] and obtain the optimal stage 1 decision rule (accounting for the optimal second stage): \(A_1^{opt}(S_0)\).
4.1 Step 1: second-stage tailoring
Recall that in Q-learning, we conduct the analysis backwards, starting with second-stage treatment. We would like to learn whether we can use baseline or intermediate covariates to select an optimal second-stage tactic for non-responders. For example, do covariates such as the initial treatment \(A_1\) and adherence to the initial treatment (adherence) moderate the effect of \(A_2\)?
We hypothesize that for children who are non-adherent to first-stage treatment it will be better to AUGment (\(A_2=-1\)) with the alternative treatment as opposed to INTensifying (\(A_2=1\)) the current treatment.
Regression model
We test this secondary aim by fitting a moderated regression model using data from non-responders. This model examines whether the binary intermediate outcome variable, adherence
\((S_1)\), and first-stage treatment \(A_1\), moderate the effect of second-stage treatment \(A_2\) on end-of-year outcome \(Y_2\), controlling for other baseline covariates \(\mathbf{X}\). The model is as follows:
\[ \begin{align*} E[Y \mid \mathbf{X}, A_1, S_1, R = 0, A_2] &= \beta_0 + \eta_{1:4}^T\mathbf{X}_c + \eta_5 A_{1} + \eta_6S_1 \\ &+ \beta_1 A_2 + \beta_2 A_1 A_2 + \beta_3S_1A_2 + \beta_4A_1S_1A_2 \end{align*} \tag{1}\]
We interact \(A_2\) with the un-centered variable adherence since we are interested in the effect of \(A_2\) at each of its two levels: non-adherent (0) and adherent (1).
First, subset data.frame to non-responders \((R=0)\) and center baseline covariates.
dat_adhd_nr <- filter(dat_adhd, R == 0) # subset data.frame to non-responders
dat_adhd_nr <- dat_adhd_nr %>% mutate(across(c(odd, severity, priormed, race), ~ .x - mean(.x), .names = "{.col}_c"))
head(dat_adhd_nr) %>% kable() # view top 6 rows
ID | odd | severity | priormed | race | Y0 | A1 | R | NRtime | adherence | Y1 | A2 | Y2 | cell | odd_c | severity_c | priormed_c | race_c |
---|---|---|---|---|---|---|---|---|---|---|---|---|---|---|---|---|---|
1 | 1 | 2.88 | 0 | 1 | 2.32 | -1 | 0 | 4 | 0 | 2.79 | 1 | 0.598 | C | 0.604 | -1.866 | -0.307 | 0.139 |
4 | 0 | 4.93 | 0 | 1 | 3.23 | 1 | 0 | 4 | 0 | 3.05 | -1 | 6.762 | E | -0.396 | 0.186 | -0.307 | 0.139 |
5 | 1 | 5.50 | 0 | 1 | 1.48 | 1 | 0 | 6 | 0 | 1.73 | -1 | 3.580 | E | 0.604 | 0.756 | -0.307 | 0.139 |
6 | 0 | 5.50 | 0 | 1 | 1.72 | 1 | 0 | 3 | 0 | 2.40 | 1 | 2.075 | F | -0.396 | 0.752 | -0.307 | 0.139 |
7 | 0 | 6.79 | 0 | 1 | 2.27 | 1 | 0 | 7 | 0 | 2.84 | 1 | 2.594 | F | -0.396 | 2.041 | -0.307 | 0.139 |
9 | 1 | 9.09 | 1 | 1 | 1.90 | 1 | 0 | 6 | 0 | 3.59 | -1 | 2.970 | E | 0.604 | 4.343 | 0.693 | 0.139 |
Next, fit a moderated regression to the subset of non-responders, interacting \(A_2\) with \(A_1\), with adherence \((S_1)\), and with their three-way product.
mod_QL_A2 <- lm(Y2 ~ odd_c + severity_c + priormed_c + race_c + A1 + adherence + A2 + A1:A2 + adherence:A2 + adherence:A1:A2,
                data = dat_adhd_nr)
summary(mod_QL_A2)
Call:
lm(formula = Y2 ~ odd_c + severity_c + priormed_c + race_c +
A1 + adherence + A2 + A1:A2 + adherence:A2 + adherence:A1:A2,
data = dat_adhd_nr)
Residuals:
Min 1Q Median 3Q Max
-2.741 -0.942 0.052 0.825 3.299
Coefficients:
Estimate Std. Error t value Pr(>|t|)
(Intercept) 2.3667 0.1768 13.38 < 2e-16 ***
odd_c -0.6658 0.2873 -2.32 0.0228 *
severity_c -0.0523 0.0719 -0.73 0.4692
priormed_c -0.5818 0.2995 -1.94 0.0552 .
race_c 0.2008 0.4133 0.49 0.6282
A1 0.4332 0.1469 2.95 0.0041 **
adherence 0.8856 0.2826 3.13 0.0023 **
A2 -1.1668 0.1769 -6.60 2.8e-09 ***
A1:A2 -0.2280 0.1830 -1.25 0.2162
adherence:A2 1.7105 0.2769 6.18 1.9e-08 ***
A1:adherence:A2 0.1872 0.2979 0.63 0.5314
---
Signif. codes: 0 '***' 0.001 '**' 0.01 '*' 0.05 '.' 0.1 ' ' 1
Residual standard error: 1.34 on 90 degrees of freedom
Multiple R-squared: 0.494, Adjusted R-squared: 0.438
F-statistic: 8.78 on 10 and 90 DF, p-value: 6.75e-10
Remember we are going backwards to get the effect of \(A_2\) among non-responders. This regression does not estimate the main effect of \(A_1\). We will do that later.
The regression model from Equation 1 gives us the effect of \(A_2\) among non-responders. Looking at the output we find that adherence
is a significant moderator of stage 2 treatment, but \(A_1\) is not.
Knowledge check #1
- Why do we subset to non-responders when fitting a second-stage moderated regression?
Marginal means of stage 2
We will use a very nice package called emmeans to estimate the marginal mean of end-of-year school performance under different treatment/covariate options. We could do this by writing custom contrasts of the model coefficients (as with the estimate() function), but emmeans does this for us.
Marginal means refer to expected values estimated by holding some covariates constant and averaging over the rest.
We use the fitted moderated regression model given in Equation 1 to estimate the average effect of second-stage treatment, among non-responders, conditional on levels of adherence and A1.
# The formula requests the mean outcome under each level of A2, within each combination of A1 and adherence.
# We specify weights = "proportional" since we want to average over the observed distribution of the other baseline covariates.
em2 <- emmeans::emmeans(mod_QL_A2, ~ A2 | A1*adherence, weights = "proportional")
print(em2)
A1 = -1, adherence = 0:
A2 emmean SE df lower.CL upper.CL
-1 2.87 0.354 90 2.17 3.58
1 0.99 0.359 90 0.28 1.71
A1 = 1, adherence = 0:
A2 emmean SE df lower.CL upper.CL
-1 4.19 0.355 90 3.49 4.90
1 1.41 0.301 90 0.81 2.00
A1 = -1, adherence = 1:
A2 emmean SE df lower.CL upper.CL
-1 2.23 0.357 90 1.53 2.94
1 3.40 0.488 90 2.43 4.37
A1 = 1, adherence = 1:
A2 emmean SE df lower.CL upper.CL
-1 3.18 0.432 90 2.33 4.04
1 4.19 0.339 90 3.52 4.86
Results are averaged over the levels of: odd_c, priormed_c, race_c
Confidence level used: 0.95
Interaction plot of stage 2
We can also use the emmeans package to visualize the estimated mean outcome for non-responders under each tactic.
Show the code
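# NOTE (assumption): the object `ep2` used below is not created in the code
# shown; one way to build it is with emmeans::emmip(), e.g.
ep2 <- emmeans::emmip(mod_QL_A2, A2 ~ adherence | A1, weights = "proportional")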
# Prettify the plot
ep2$data %>% mutate(across(1:3, as.factor)) %>%
ggplot(aes(xvar, yvar, color = A2, group = tvar)) +
geom_line() +
geom_point() +
  facet_wrap(vars(A1), labeller = labeller(A1 = c("-1" = "MED (A1 = -1)", "1" = "BMOD (A1 = 1)"))) +
  scale_color_manual("A2", values = c("-1" = "darkgreen", "1" = "purple"),
                     labels = c("-1" = "AUGMENT (-1)", "1" = "INTENSIFY (1)")) +
labs(title = "Moderator analysis of stage 2 intervention options",
x = "Adherence to stage 1",
y = "EOS School Performance \n (higher is better)") +
scale_x_discrete(labels = c("Non-adherent", "Adherent")) +
theme_classic()
Knowledge check #2
- What is the optimal tactic for non-adhering, non-responders to \(A_1\) = BMOD(1)? To \(A_1\) = MED(-1)?
What’s the best decision rule at Stage 2?
The results from the moderated regression and the marginal means output suggest that for children who are non-adherent (adherence = 0) it is better to Augment (\(A_2 = -1\)) rather than intensify, whereas for children who are adherent (adherence = 1) it is better to Intensify (\(A_2 = 1\)). Notice that our optimal tactic does not depend on first-stage treatment \(A_1\).
Therefore, among non-responders, \[ A_2^{opt} = \begin{cases} 1 \text{ INT} & \text{ if } \text{ adherence} = 1 \\ -1 \text{ AUG} & \text{ if } \text{ adherence} = 0 \end{cases} \]
Note that the decision rule is the same regardless of each person’s stage 1 intervention assignment.
4.2 Step 2: predict optimal outcome
Create a new data.frame that assigns every non-responding child the optimal second-stage intervention option learned from the moderated regression, as sketched below.
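This code chunk is not shown in the original; a minimal sketch, assuming we simply overwrite A2 with the optimal rule \(A_2^{opt}\) derived above (intensify if adherent, augment if not):

dat_adhd_nr_optA2 <- dat_adhd_nr %>%
  mutate(A2 = if_else(adherence == 1, 1, -1)) # replace observed A2 with the optimal option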
Predict outcome for non-responders under the optimal treatment assignment:
\[ \hat{Y}_i(A^{opt}_2) \]
dat_adhd_nr_optA2$Y2_optA2 <- predict(mod_QL_A2,
newdata = dat_adhd_nr_optA2) # using the stage 2 moderated regression model
head(dat_adhd_nr_optA2) %>% kable()
ID | odd | severity | priormed | race | Y0 | A1 | R | NRtime | adherence | Y1 | A2 | Y2 | cell | odd_c | severity_c | priormed_c | race_c | Y2_optA2 |
---|---|---|---|---|---|---|---|---|---|---|---|---|---|---|---|---|---|---|
1 | 1 | 2.88 | 0 | 1 | 2.32 | -1 | 0 | 4 | 0 | 2.79 | -1 | 0.598 | C | 0.604 | -1.866 | -0.307 | 0.139 | 2.77 |
4 | 0 | 4.93 | 0 | 1 | 3.23 | 1 | 0 | 4 | 0 | 3.05 | -1 | 6.762 | E | -0.396 | 0.186 | -0.307 | 0.139 | 4.66 |
5 | 1 | 5.50 | 0 | 1 | 1.48 | 1 | 0 | 6 | 0 | 1.73 | -1 | 3.580 | E | 0.604 | 0.756 | -0.307 | 0.139 | 3.96 |
6 | 0 | 5.50 | 0 | 1 | 1.72 | 1 | 0 | 3 | 0 | 2.40 | -1 | 2.075 | F | -0.396 | 0.752 | -0.307 | 0.139 | 4.63 |
7 | 0 | 6.79 | 0 | 1 | 2.27 | 1 | 0 | 7 | 0 | 2.84 | -1 | 2.594 | F | -0.396 | 2.041 | -0.307 | 0.139 | 4.56 |
9 | 1 | 9.09 | 1 | 1 | 1.90 | 1 | 0 | 6 | 0 | 3.59 | -1 | 2.970 | E | 0.604 | 4.343 | 0.693 | 0.139 | 3.19 |
4.3 Step 3: first-stage tailoring
We would now like to use baseline information to learn an optimal first-stage tactic, accounting for our future optimal second-stage decision.
We hypothesize that children already on medication (priormed = 1) will be better off, on average, starting with MED (\(A_1 = -1\)) instead of BMOD (\(A_1 = 1\)) due to parent/child habituation to taking medication.
Data.frame with adjusted outcomes
Merge the predicted outcomes for non-responders with the observed outcomes for responders.
# Responders get assigned their observed outcome (no stage 2 tactic)
dat_adhd_r <- dat_adhd %>% filter(R == 1) %>%
mutate(Y2_optA2 = Y2)
# combine non-responders w/ responders
dat_adhd_optA2 <- bind_rows(dat_adhd_nr_optA2, dat_adhd_r)
# center baseline control variables across responders and non-responders
dat_adhd_optA2 <- dat_adhd_optA2 %>% mutate(across(c(odd, severity, race), ~ .x - mean(.x), .names = "{.col}_c"))
dat_adhd_optA2 %>% kable() %>%
kable_styling() %>%
scroll_box(height = "300px")
ID | odd | severity | priormed | race | Y0 | A1 | R | NRtime | adherence | Y1 | A2 | Y2 | cell | odd_c | severity_c | priormed_c | race_c | Y2_optA2 |
---|---|---|---|---|---|---|---|---|---|---|---|---|---|---|---|---|---|---|
1 | 1 | 2.880 | 0 | 1 | 2.321 | -1 | 0 | 4 | 0 | 2.791 | -1 | 0.598 | C | 0.593 | -1.891 | -0.307 | 0.153 | 2.774 |
4 | 0 | 4.931 | 0 | 1 | 3.232 | 1 | 0 | 4 | 0 | 3.050 | -1 | 6.762 | E | -0.407 | 0.161 | -0.307 | 0.153 | 4.655 |
5 | 1 | 5.502 | 0 | 1 | 1.477 | 1 | 0 | 6 | 0 | 1.732 | -1 | 3.580 | E | 0.593 | 0.732 | -0.307 | 0.153 | 3.959 |
6 | 0 | 5.497 | 0 | 1 | 1.720 | 1 | 0 | 3 | 0 | 2.400 | -1 | 2.075 | F | -0.407 | 0.727 | -0.307 | 0.153 | 4.626 |
7 | 0 | 6.786 | 0 | 1 | 2.265 | 1 | 0 | 7 | 0 | 2.837 | -1 | 2.594 | F | -0.407 | 2.016 | -0.307 | 0.153 | 4.558 |
9 | 1 | 9.088 | 1 | 1 | 1.896 | 1 | 0 | 6 | 0 | 3.594 | -1 | 2.970 | E | 0.593 | 4.318 | 0.693 | 0.153 | 3.190 |
10 | 0 | 6.094 | 0 | 1 | 2.503 | 1 | 0 | 5 | 1 | 2.546 | 1 | 6.022 | F | -0.407 | 1.324 | -0.307 | 0.153 | 4.588 |
11 | 0 | 2.016 | 0 | 1 | 2.810 | 1 | 0 | 4 | 0 | 1.916 | -1 | 5.561 | E | -0.407 | -2.754 | -0.307 | 0.153 | 4.808 |
13 | 0 | 6.290 | 0 | 1 | 2.188 | 1 | 0 | 2 | 0 | 1.733 | -1 | 3.188 | F | -0.407 | 1.519 | -0.307 | 0.153 | 4.584 |
14 | 0 | 3.972 | 0 | 1 | 2.281 | -1 | 0 | 8 | 1 | 2.845 | 1 | 3.116 | C | -0.407 | -0.798 | -0.307 | 0.153 | 3.914 |
15 | 0 | 5.862 | 0 | 1 | 1.588 | 1 | 0 | 8 | 1 | 1.681 | 1 | 4.182 | F | -0.407 | 1.092 | -0.307 | 0.153 | 4.600 |
16 | 0 | 6.086 | 1 | 1 | 2.105 | 1 | 0 | 8 | 1 | 2.296 | 1 | 1.920 | E | -0.407 | 1.316 | 0.693 | 0.153 | 4.007 |
17 | 0 | 8.259 | 1 | 1 | 1.695 | 1 | 0 | 4 | 0 | 2.279 | -1 | 3.230 | E | -0.407 | 3.488 | 0.693 | 0.153 | 3.899 |
18 | 1 | 3.371 | 1 | 1 | 2.222 | 1 | 0 | 3 | 0 | 1.669 | -1 | 0.920 | F | 0.593 | -1.400 | 0.693 | 0.153 | 3.489 |
19 | 1 | 4.958 | 0 | 1 | 2.084 | 1 | 0 | 5 | 1 | 2.605 | 1 | 3.940 | E | 0.593 | 0.187 | -0.307 | 0.153 | 3.982 |
20 | 1 | 3.581 | 1 | 0 | 0.614 | 1 | 0 | 4 | 0 | 0.681 | -1 | -2.254 | F | 0.593 | -1.190 | 0.693 | -0.847 | 3.277 |
21 | 0 | 2.182 | 0 | 0 | 2.204 | 1 | 0 | 5 | 0 | 2.580 | -1 | 5.381 | E | -0.407 | -2.588 | -0.307 | -0.847 | 4.598 |
24 | 0 | 6.504 | 0 | 1 | 2.098 | 1 | 0 | 5 | 0 | 2.445 | -1 | 4.461 | E | -0.407 | 1.734 | -0.307 | 0.153 | 4.573 |
25 | 0 | 6.598 | 0 | 0 | 1.858 | 1 | 0 | 4 | 0 | 1.701 | -1 | 2.978 | F | -0.407 | 1.828 | -0.307 | -0.847 | 4.367 |
26 | 0 | 1.525 | 1 | 1 | 3.259 | -1 | 0 | 5 | 1 | 4.138 | 1 | 5.591 | B | -0.407 | -3.245 | 0.693 | 0.153 | 3.460 |
27 | 1 | 2.176 | 1 | 1 | 2.010 | -1 | 0 | 7 | 1 | 2.007 | 1 | 2.408 | B | 0.593 | -2.595 | 0.693 | 0.153 | 2.760 |
28 | 0 | 2.147 | 1 | 1 | 2.181 | -1 | 0 | 8 | 1 | 3.052 | 1 | 5.045 | B | -0.407 | -2.623 | 0.693 | 0.153 | 3.428 |
29 | 0 | 3.278 | 1 | 1 | 2.387 | 1 | 0 | 8 | 0 | 2.151 | -1 | 0.497 | F | -0.407 | -1.492 | 0.693 | 0.153 | 4.160 |
30 | 0 | 5.425 | 0 | 1 | 2.381 | 1 | 0 | 5 | 1 | 2.467 | 1 | 4.281 | E | -0.407 | 0.655 | -0.307 | 0.153 | 4.623 |
31 | 0 | 6.486 | 1 | 1 | 1.998 | 1 | 0 | 8 | 1 | 1.226 | 1 | 1.439 | F | -0.407 | 1.716 | 0.693 | 0.153 | 3.986 |
35 | 0 | 3.248 | 0 | 0 | 2.705 | -1 | 0 | 4 | 0 | 3.179 | -1 | 4.034 | B | -0.407 | -1.523 | -0.307 | -0.847 | 3.220 |
36 | 0 | 2.316 | 0 | 1 | 2.339 | -1 | 0 | 5 | 0 | 2.904 | -1 | 4.301 | B | -0.407 | -2.455 | -0.307 | 0.153 | 3.469 |
37 | 0 | 4.250 | 1 | 1 | 2.787 | 1 | 0 | 8 | 0 | 3.016 | -1 | 2.971 | F | -0.407 | -0.520 | 0.693 | 0.153 | 4.109 |
38 | 0 | 5.270 | 0 | 1 | 2.867 | 1 | 0 | 5 | 0 | 2.840 | -1 | 3.508 | F | -0.407 | 0.500 | -0.307 | 0.153 | 4.637 |
40 | 0 | 8.475 | 0 | 1 | 1.953 | 1 | 0 | 7 | 1 | 2.447 | 1 | 5.279 | F | -0.407 | 3.705 | -0.307 | 0.153 | 4.463 |
41 | 1 | 2.878 | 1 | 0 | 1.349 | 1 | 0 | 5 | 0 | 0.680 | -1 | -1.502 | F | 0.593 | -1.893 | 0.693 | -0.847 | 3.314 |
42 | 0 | 5.727 | 0 | 1 | 3.119 | 1 | 0 | 2 | 1 | 3.293 | 1 | 4.111 | E | -0.407 | 0.956 | -0.307 | 0.153 | 4.607 |
45 | 1 | 4.443 | 0 | 1 | 1.788 | -1 | 0 | 8 | 0 | 3.066 | -1 | 1.169 | C | 0.593 | -0.327 | -0.307 | 0.153 | 2.692 |
46 | 0 | 0.571 | 0 | 1 | 2.965 | -1 | 0 | 3 | 0 | 3.159 | -1 | 1.781 | C | -0.407 | -4.199 | -0.307 | 0.153 | 3.561 |
47 | 0 | 5.719 | 1 | 1 | 2.176 | 1 | 0 | 2 | 1 | 1.956 | 1 | 4.637 | F | -0.407 | 0.948 | 0.693 | 0.153 | 4.026 |
48 | 0 | 3.397 | 1 | 1 | 2.850 | 1 | 0 | 4 | 1 | 3.320 | 1 | 4.529 | F | -0.407 | -1.373 | 0.693 | 0.153 | 4.147 |
49 | 0 | 2.688 | 0 | 1 | 2.544 | -1 | 0 | 7 | 0 | 3.180 | -1 | 3.628 | B | -0.407 | -2.083 | -0.307 | 0.153 | 3.450 |
50 | 0 | 3.050 | 1 | 1 | 2.972 | 1 | 0 | 3 | 0 | 3.053 | -1 | 0.634 | F | -0.407 | -1.720 | 0.693 | 0.153 | 4.172 |
51 | 1 | 6.111 | 0 | 1 | 2.159 | 1 | 0 | 6 | 0 | 3.662 | -1 | 3.853 | F | 0.593 | 1.341 | -0.307 | 0.153 | 3.928 |
52 | 1 | 3.936 | 0 | 1 | 1.938 | -1 | 0 | 6 | 0 | 2.685 | -1 | 0.217 | C | 0.593 | -0.834 | -0.307 | 0.153 | 2.719 |
55 | 1 | 6.187 | 0 | 0 | 1.507 | -1 | 0 | 2 | 1 | 2.873 | 1 | 0.820 | B | 0.593 | 1.416 | -0.307 | -0.847 | 2.932 |
57 | 1 | 4.732 | 0 | 1 | 1.189 | 1 | 0 | 3 | 0 | 2.239 | -1 | 2.980 | E | 0.593 | -0.038 | -0.307 | 0.153 | 4.000 |
58 | 0 | 6.344 | 0 | 1 | 2.148 | -1 | 0 | 3 | 1 | 3.665 | 1 | 2.556 | B | -0.407 | 1.573 | -0.307 | 0.153 | 3.790 |
59 | 0 | 3.495 | 0 | 1 | 1.542 | -1 | 0 | 5 | 0 | 1.286 | -1 | 1.233 | B | -0.407 | -1.276 | -0.307 | 0.153 | 3.408 |
62 | 0 | 4.087 | 0 | 1 | 1.597 | 1 | 0 | 8 | 1 | 1.305 | 1 | 3.234 | F | -0.407 | -0.683 | -0.307 | 0.153 | 4.693 |
63 | 1 | 5.394 | 0 | 1 | 1.394 | 1 | 0 | 5 | 1 | 1.786 | 1 | 5.340 | F | 0.593 | 0.623 | -0.307 | 0.153 | 3.959 |
65 | 0 | 6.065 | 1 | 1 | 2.167 | 1 | 0 | 8 | 1 | 2.629 | 1 | 3.299 | F | -0.407 | 1.295 | 0.693 | 0.153 | 4.008 |
68 | 0 | 3.739 | 0 | 1 | 2.006 | -1 | 0 | 6 | 0 | 2.664 | -1 | 1.805 | B | -0.407 | -1.031 | -0.307 | 0.153 | 3.395 |
69 | 0 | 4.249 | 0 | 1 | 1.821 | -1 | 0 | 4 | 0 | 2.355 | -1 | 1.650 | B | -0.407 | -0.521 | -0.307 | 0.153 | 3.368 |
70 | 0 | 7.057 | 1 | 1 | 1.867 | 1 | 0 | 6 | 1 | 1.765 | 1 | 2.779 | F | -0.407 | 2.287 | 0.693 | 0.153 | 3.956 |
71 | 1 | 2.877 | 0 | 1 | 1.927 | -1 | 0 | 5 | 0 | 3.615 | -1 | 0.719 | C | 0.593 | -1.894 | -0.307 | 0.153 | 2.774 |
72 | 1 | 2.616 | 0 | 1 | 2.027 | -1 | 0 | 6 | 1 | 3.535 | 1 | 3.452 | C | 0.593 | -2.154 | -0.307 | 0.153 | 3.319 |
73 | 0 | 5.784 | 0 | 1 | 1.554 | 1 | 0 | 6 | 0 | 0.875 | -1 | 4.558 | E | -0.407 | 1.013 | -0.307 | 0.153 | 4.611 |
74 | 1 | 6.531 | 0 | 1 | 1.131 | -1 | 0 | 5 | 1 | 2.344 | 1 | 0.366 | B | 0.593 | 1.761 | -0.307 | 0.153 | 3.114 |
75 | 0 | 0.772 | 0 | 1 | 1.593 | 1 | 0 | 7 | 0 | 0.769 | -1 | 0.737 | F | -0.407 | -3.998 | -0.307 | 0.153 | 4.873 |
77 | 1 | 5.030 | 0 | 1 | 1.695 | -1 | 0 | 2 | 1 | 3.142 | 1 | 3.474 | B | 0.593 | 0.259 | -0.307 | 0.153 | 3.193 |
84 | 1 | 5.331 | 0 | 1 | 1.752 | 1 | 0 | 2 | 1 | 1.536 | 1 | 5.077 | F | 0.593 | 0.561 | -0.307 | 0.153 | 3.962 |
85 | 1 | 6.202 | 0 | 1 | 1.404 | 1 | 0 | 3 | 1 | 2.111 | 1 | 4.198 | F | 0.593 | 1.432 | -0.307 | 0.153 | 3.916 |
86 | 0 | 5.333 | 0 | 1 | 2.100 | 1 | 0 | 8 | 0 | 1.590 | -1 | 1.029 | F | -0.407 | 0.563 | -0.307 | 0.153 | 4.634 |
88 | 1 | 3.734 | 1 | 1 | 2.061 | 1 | 0 | 6 | 0 | 2.152 | -1 | 2.221 | E | 0.593 | -1.036 | 0.693 | 0.153 | 3.470 |
90 | 0 | 5.432 | 0 | 1 | 2.279 | 1 | 0 | 8 | 0 | 1.751 | -1 | 3.198 | F | -0.407 | 0.662 | -0.307 | 0.153 | 4.629 |
91 | 0 | 2.910 | 0 | 1 | 2.446 | -1 | 0 | 7 | 0 | 2.452 | -1 | -0.182 | C | -0.407 | -1.860 | -0.307 | 0.153 | 3.438 |
92 | 1 | 3.360 | 0 | 1 | 1.553 | -1 | 0 | 6 | 1 | 3.167 | 1 | 1.481 | B | 0.593 | -1.410 | -0.307 | 0.153 | 3.280 |
96 | 0 | 5.016 | 0 | 1 | 2.062 | 1 | 0 | 7 | 0 | 2.080 | -1 | 2.314 | F | -0.407 | 0.245 | -0.307 | 0.153 | 4.651 |
98 | 0 | 9.306 | 0 | 1 | 1.439 | -1 | 0 | 3 | 0 | 2.684 | -1 | 1.358 | B | -0.407 | 4.536 | -0.307 | 0.153 | 3.104 |
99 | 1 | 2.796 | 0 | 1 | 1.608 | 1 | 0 | 6 | 1 | 1.917 | 1 | 3.311 | E | 0.593 | -1.975 | -0.307 | 0.153 | 4.095 |
100 | 0 | 4.050 | 0 | 1 | 1.463 | 1 | 0 | 2 | 1 | 1.846 | 1 | 2.443 | E | -0.407 | -0.720 | -0.307 | 0.153 | 4.695 |
101 | 1 | 7.490 | 0 | 0 | 1.077 | -1 | 0 | 2 | 0 | 2.127 | -1 | 0.846 | B | 0.593 | 2.720 | -0.307 | -0.847 | 2.332 |
102 | 0 | 3.852 | 0 | 1 | 2.190 | 1 | 0 | 3 | 0 | 1.993 | -1 | 2.577 | F | -0.407 | -0.918 | -0.307 | 0.153 | 4.712 |
105 | 1 | 0.852 | 1 | 1 | 2.636 | 1 | 0 | 5 | 1 | 2.200 | 1 | 3.382 | F | 0.593 | -3.918 | 0.693 | 0.153 | 3.614 |
106 | 0 | 6.060 | 1 | 1 | 1.964 | 1 | 0 | 6 | 0 | 1.601 | -1 | -0.108 | F | -0.407 | 1.290 | 0.693 | 0.153 | 4.014 |
109 | 1 | 5.176 | 1 | 1 | 1.416 | -1 | 0 | 2 | 0 | 2.115 | -1 | 1.030 | C | 0.593 | 0.406 | 0.693 | 0.153 | 2.072 |
111 | 0 | 4.918 | 0 | 1 | 2.131 | -1 | 0 | 5 | 1 | 3.795 | 1 | 2.873 | B | -0.407 | 0.148 | -0.307 | 0.153 | 3.865 |
112 | 1 | 8.161 | 0 | 1 | 1.577 | -1 | 0 | 6 | 1 | 2.387 | 1 | 1.688 | B | 0.593 | 3.390 | -0.307 | 0.153 | 3.029 |
113 | 0 | 5.737 | 0 | 1 | 2.624 | -1 | 0 | 2 | 1 | 4.386 | 1 | 4.667 | C | -0.407 | 0.967 | -0.307 | 0.153 | 3.822 |
114 | 1 | 3.317 | 0 | 1 | 1.613 | -1 | 0 | 8 | 0 | 1.872 | -1 | -1.160 | C | 0.593 | -1.453 | -0.307 | 0.153 | 2.751 |
117 | 1 | 4.830 | 1 | 1 | 2.297 | 1 | 0 | 3 | 0 | 2.349 | -1 | 3.213 | E | 0.593 | 0.059 | 0.693 | 0.153 | 3.413 |
118 | 1 | 2.130 | 1 | 0 | 2.114 | -1 | 0 | 8 | 0 | 2.634 | -1 | 3.994 | B | 0.593 | -2.640 | 0.693 | -0.847 | 2.031 |
119 | 0 | 4.712 | 0 | 1 | 1.952 | -1 | 0 | 5 | 1 | 2.637 | 1 | 0.924 | B | -0.407 | -0.058 | -0.307 | 0.153 | 3.875 |
121 | 0 | 3.246 | 0 | 1 | 2.099 | -1 | 0 | 7 | 0 | 1.644 | -1 | 0.379 | C | -0.407 | -1.525 | -0.307 | 0.153 | 3.421 |
122 | 0 | 5.103 | 0 | 1 | 2.133 | 1 | 0 | 2 | 0 | 2.405 | -1 | 5.741 | E | -0.407 | 0.333 | -0.307 | 0.153 | 4.646 |
125 | 0 | 4.489 | 1 | 1 | 1.171 | 1 | 0 | 4 | 1 | 1.001 | 1 | 0.838 | E | -0.407 | -0.281 | 0.693 | 0.153 | 4.090 |
126 | 0 | 6.227 | 1 | 0 | 2.206 | -1 | 0 | 2 | 1 | 3.152 | 1 | 4.635 | C | -0.407 | 1.457 | 0.693 | -0.847 | 3.014 |
127 | 1 | 4.487 | 0 | 1 | 2.005 | -1 | 0 | 4 | 0 | 2.863 | -1 | 2.960 | B | 0.593 | -0.284 | -0.307 | 0.153 | 2.690 |
128 | 1 | 5.477 | 0 | 1 | 2.152 | -1 | 0 | 8 | 0 | 3.665 | -1 | 3.994 | B | 0.593 | 0.706 | -0.307 | 0.153 | 2.638 |
129 | 0 | 7.425 | 1 | 1 | 1.964 | -1 | 0 | 8 | 0 | 2.697 | -1 | 5.092 | B | -0.407 | 2.655 | 0.693 | 0.153 | 2.621 |
130 | 0 | 4.287 | 0 | 0 | 2.123 | -1 | 0 | 2 | 0 | 2.518 | -1 | 2.315 | B | -0.407 | -0.484 | -0.307 | -0.847 | 3.166 |
132 | 1 | 6.278 | 0 | 1 | 1.329 | -1 | 0 | 6 | 1 | 2.891 | 1 | 1.090 | B | 0.593 | 1.508 | -0.307 | 0.153 | 3.128 |
134 | 1 | 6.360 | 1 | 0 | 1.003 | -1 | 0 | 4 | 1 | 1.654 | 1 | 3.676 | C | 0.593 | 1.589 | 0.693 | -0.847 | 2.341 |
135 | 1 | 3.663 | 0 | 1 | 2.656 | 1 | 0 | 7 | 0 | 3.281 | -1 | 6.073 | E | 0.593 | -1.108 | -0.307 | 0.153 | 4.056 |
138 | 0 | 3.213 | 0 | 1 | 3.112 | -1 | 0 | 6 | 0 | 3.405 | -1 | 2.047 | C | -0.407 | -1.558 | -0.307 | 0.153 | 3.423 |
139 | 1 | 2.074 | 0 | 1 | 1.632 | -1 | 0 | 2 | 0 | 1.234 | -1 | 1.646 | B | 0.593 | -2.696 | -0.307 | 0.153 | 2.816 |
140 | 0 | 2.054 | 0 | 0 | 2.698 | -1 | 0 | 5 | 1 | 3.245 | 1 | 3.529 | C | -0.407 | -2.716 | -0.307 | -0.847 | 3.814 |
141 | 1 | 9.068 | 0 | 1 | 1.315 | -1 | 0 | 5 | 0 | 2.759 | -1 | 0.192 | C | 0.593 | 4.298 | -0.307 | 0.153 | 2.451 |
143 | 0 | 7.494 | 1 | 0 | 1.929 | -1 | 0 | 6 | 0 | 2.208 | -1 | 0.591 | C | -0.407 | 2.723 | 0.693 | -0.847 | 2.416 |
144 | 1 | 6.363 | 1 | 1 | 1.712 | -1 | 0 | 8 | 0 | 2.984 | -1 | 1.860 | C | 0.593 | 1.593 | 0.693 | 0.153 | 2.010 |
145 | 1 | 2.788 | 0 | 0 | 1.679 | -1 | 0 | 6 | 1 | 3.986 | 1 | 2.303 | B | 0.593 | -1.982 | -0.307 | -0.847 | 3.109 |
146 | 0 | 3.472 | 0 | 1 | 1.973 | 1 | 0 | 2 | 0 | 1.971 | -1 | 2.107 | F | -0.407 | -1.298 | -0.307 | 0.153 | 4.731 |
147 | 0 | 3.975 | 0 | 1 | 1.789 | 1 | 0 | 2 | 1 | 1.044 | 1 | 3.452 | F | -0.407 | -0.796 | -0.307 | 0.153 | 4.699 |
148 | 0 | 5.476 | 1 | 1 | 2.240 | 1 | 0 | 3 | 1 | 3.096 | 1 | 2.395 | E | -0.407 | 0.706 | 0.693 | 0.153 | 4.038 |
150 | 1 | 7.067 | 1 | 1 | 1.426 | 1 | 0 | 3 | 1 | 1.403 | 1 | 2.204 | F | 0.593 | 2.297 | 0.693 | 0.153 | 3.289 |
2 | 0 | 4.133 | 0 | 0 | 2.068 | 1 | 1 | NA | 1 | 2.200 | NA | 4.267 | D | -0.407 | -0.638 | NA | -0.847 | 4.267 |
3 | 1 | 5.569 | 0 | 1 | 1.004 | -1 | 1 | NA | 0 | 2.292 | NA | 1.454 | A | 0.593 | 0.798 | NA | 0.153 | 1.454 |
8 | 0 | 4.317 | 0 | 1 | 2.814 | 1 | 1 | NA | 0 | 2.751 | NA | 4.051 | D | -0.407 | -0.453 | NA | 0.153 | 4.051 |
12 | 0 | 4.308 | 0 | 1 | 2.650 | -1 | 1 | NA | 0 | 3.088 | NA | 2.476 | A | -0.407 | -0.462 | NA | 0.153 | 2.476 |
22 | 0 | 0.378 | 0 | 1 | 2.608 | 1 | 1 | NA | 1 | 1.469 | NA | 5.617 | D | -0.407 | -4.393 | NA | 0.153 | 5.617 |
23 | 0 | 5.489 | 0 | 1 | 2.278 | -1 | 1 | NA | 0 | 2.958 | NA | 2.872 | A | -0.407 | 0.718 | NA | 0.153 | 2.872 |
32 | 1 | 2.962 | 0 | 1 | 2.249 | 1 | 1 | NA | 1 | 2.477 | NA | 5.463 | D | 0.593 | -1.808 | NA | 0.153 | 5.463 |
33 | 0 | 7.385 | 0 | 0 | 1.923 | -1 | 1 | NA | 0 | 2.871 | NA | 2.200 | A | -0.407 | 2.614 | NA | -0.847 | 2.200 |
34 | 1 | 6.538 | 0 | 1 | 1.801 | -1 | 1 | NA | 0 | 3.177 | NA | 1.842 | A | 0.593 | 1.768 | NA | 0.153 | 1.842 |
39 | 0 | 6.808 | 0 | 1 | 1.975 | 1 | 1 | NA | 0 | 2.459 | NA | 3.786 | D | -0.407 | 2.038 | NA | 0.153 | 3.786 |
43 | 1 | 8.335 | 1 | 1 | 1.923 | -1 | 1 | NA | 1 | 2.990 | NA | 4.761 | A | 0.593 | 3.565 | NA | 0.153 | 4.761 |
44 | 1 | 2.747 | 1 | 1 | 1.990 | 1 | 1 | NA | 0 | 1.736 | NA | 1.723 | D | 0.593 | -2.024 | NA | 0.153 | 1.723 |
53 | 0 | 2.466 | 0 | 1 | 2.677 | 1 | 1 | NA | 1 | 2.473 | NA | 6.786 | D | -0.407 | -2.305 | NA | 0.153 | 6.786 |
54 | 1 | 2.595 | 0 | 1 | 2.129 | -1 | 1 | NA | 0 | 2.717 | NA | 2.857 | A | 0.593 | -2.176 | NA | 0.153 | 2.857 |
56 | 0 | 5.879 | 0 | 1 | 1.872 | 1 | 1 | NA | 1 | 2.391 | NA | 4.780 | D | -0.407 | 1.109 | NA | 0.153 | 4.780 |
60 | 1 | 7.886 | 1 | 1 | 2.282 | -1 | 1 | NA | 0 | 3.416 | NA | 4.560 | A | 0.593 | 3.116 | NA | 0.153 | 4.560 |
61 | 0 | 4.423 | 1 | 1 | 2.077 | -1 | 1 | NA | 1 | 2.955 | NA | 5.083 | A | -0.407 | -0.347 | NA | 0.153 | 5.083 |
64 | 1 | 6.915 | 0 | 1 | 2.176 | -1 | 1 | NA | 1 | 3.606 | NA | 3.198 | A | 0.593 | 2.144 | NA | 0.153 | 3.198 |
66 | 0 | 7.318 | 1 | 1 | 1.382 | 1 | 1 | NA | 0 | 1.680 | NA | 0.335 | D | -0.407 | 2.548 | NA | 0.153 | 0.335 |
67 | 1 | 5.115 | 1 | 1 | 2.059 | 1 | 1 | NA | 1 | 2.599 | NA | 3.242 | D | 0.593 | 0.345 | NA | 0.153 | 3.242 |
76 | 1 | 6.962 | 1 | 0 | 1.798 | 1 | 1 | NA | 0 | 2.537 | NA | 1.504 | D | 0.593 | 2.191 | NA | -0.847 | 1.504 |
78 | 0 | 4.547 | 0 | 1 | 2.394 | -1 | 1 | NA | 0 | 3.609 | NA | 2.942 | A | -0.407 | -0.223 | NA | 0.153 | 2.942 |
79 | 1 | 7.767 | 0 | 1 | 1.587 | -1 | 1 | NA | 0 | 2.875 | NA | 0.960 | A | 0.593 | 2.997 | NA | 0.153 | 0.960 |
80 | 0 | 8.698 | 0 | 0 | 1.852 | -1 | 1 | NA | 1 | 2.972 | NA | 2.292 | A | -0.407 | 3.928 | NA | -0.847 | 2.292 |
81 | 1 | 2.132 | 0 | 1 | 2.038 | -1 | 1 | NA | 0 | 3.048 | NA | 2.602 | A | 0.593 | -2.639 | NA | 0.153 | 2.602 |
82 | 1 | 7.529 | 1 | 1 | 1.406 | -1 | 1 | NA | 0 | 3.005 | NA | 3.043 | A | 0.593 | 2.759 | NA | 0.153 | 3.043 |
83 | 0 | 3.420 | 1 | 1 | 2.948 | -1 | 1 | NA | 1 | 3.493 | NA | 6.463 | A | -0.407 | -1.351 | NA | 0.153 | 6.463 |
87 | 0 | 1.343 | 0 | 0 | 2.868 | -1 | 1 | NA | 1 | 2.099 | NA | 1.309 | A | -0.407 | -3.428 | NA | -0.847 | 1.309 |
89 | 0 | 8.849 | 1 | 1 | 1.928 | -1 | 1 | NA | 0 | 3.993 | NA | 4.577 | A | -0.407 | 4.079 | NA | 0.153 | 4.577 |
93 | 1 | 7.644 | 0 | 1 | 0.970 | -1 | 1 | NA | 0 | 1.976 | NA | -0.457 | A | 0.593 | 2.874 | NA | 0.153 | -0.457 |
94 | 0 | 5.865 | 0 | 0 | 1.626 | 1 | 1 | NA | 0 | 1.479 | NA | 2.888 | D | -0.407 | 1.095 | NA | -0.847 | 2.888 |
95 | 1 | 2.757 | 0 | 1 | 2.111 | -1 | 1 | NA | 1 | 3.116 | NA | 2.896 | A | 0.593 | -2.013 | NA | 0.153 | 2.896 |
97 | 1 | 6.492 | 0 | 1 | 1.747 | 1 | 1 | NA | 0 | 2.240 | NA | 4.268 | D | 0.593 | 1.721 | NA | 0.153 | 4.268 |
103 | 0 | 8.651 | 0 | 1 | 1.648 | -1 | 1 | NA | 0 | 2.730 | NA | 1.069 | A | -0.407 | 3.880 | NA | 0.153 | 1.069 |
104 | 0 | 7.608 | 0 | 1 | 2.003 | 1 | 1 | NA | 0 | 2.980 | NA | 4.693 | D | -0.407 | 2.838 | NA | 0.153 | 4.693 |
107 | 0 | 1.452 | 1 | 1 | 2.756 | -1 | 1 | NA | 1 | 3.350 | NA | 5.912 | A | -0.407 | -3.318 | NA | 0.153 | 5.912 |
108 | 0 | 1.338 | 1 | 1 | 2.648 | -1 | 1 | NA | 1 | 3.104 | NA | 4.817 | A | -0.407 | -3.432 | NA | 0.153 | 4.817 |
110 | 1 | 5.142 | 0 | 1 | 1.895 | 1 | 1 | NA | 0 | 3.252 | NA | 3.941 | D | 0.593 | 0.371 | NA | 0.153 | 3.941 |
115 | 1 | 5.299 | 0 | 1 | 1.656 | -1 | 1 | NA | 0 | 2.877 | NA | 1.997 | A | 0.593 | 0.529 | NA | 0.153 | 1.997 |
116 | 0 | 3.714 | 0 | 1 | 1.957 | -1 | 1 | NA | 0 | 1.977 | NA | 1.564 | A | -0.407 | -1.056 | NA | 0.153 | 1.564 |
120 | 1 | 5.257 | 0 | 1 | 1.527 | 1 | 1 | NA | 1 | 1.631 | NA | 4.876 | D | 0.593 | 0.487 | NA | 0.153 | 4.876 |
123 | 0 | 3.378 | 0 | 0 | 1.393 | 1 | 1 | NA | 0 | 1.055 | NA | 2.530 | D | -0.407 | -1.392 | NA | -0.847 | 2.530 |
124 | 1 | 3.390 | 0 | 1 | 1.524 | -1 | 1 | NA | 0 | 2.599 | NA | 1.039 | A | 0.593 | -1.380 | NA | 0.153 | 1.039 |
131 | 0 | 4.741 | 0 | 1 | 1.952 | 1 | 1 | NA | 0 | 2.134 | NA | 3.617 | D | -0.407 | -0.029 | NA | 0.153 | 3.617 |
133 | 0 | 1.855 | 0 | 1 | 3.124 | 1 | 1 | NA | 1 | 3.572 | NA | 6.706 | D | -0.407 | -2.916 | NA | 0.153 | 6.706 |
136 | 0 | 2.155 | 0 | 1 | 2.626 | 1 | 1 | NA | 1 | 2.152 | NA | 5.881 | D | -0.407 | -2.615 | NA | 0.153 | 5.881 |
137 | 1 | 0.580 | 1 | 0 | 2.659 | 1 | 1 | NA | 1 | 2.595 | NA | 3.440 | D | 0.593 | -4.190 | NA | -0.847 | 3.440 |
142 | 0 | 1.970 | 0 | 0 | 1.827 | 1 | 1 | NA | 0 | 1.190 | NA | 3.028 | D | -0.407 | -2.801 | NA | -0.847 | 3.028 |
149 | 0 | 4.129 | 1 | 1 | 1.984 | 1 | 1 | NA | 0 | 1.748 | NA | 1.786 | D | -0.407 | -0.641 | NA | 0.153 | 1.786 |
We now have a data frame of responders and non-responders in which each non-responder carries the estimated outcome under the optimal second-stage tactic, predicted from their adherence and \(A_1\).
Regression model
We fit a moderated regression model for first-stage treatment using the dat_adhd_optA2
data frame, which accounts for the optimal future second-stage tactic. The tailoring variable of interest is priormed
\((S_0)\).
\[ \begin{align*} E[\hat{Y}(A_2^{opt}) \mid \mathbf{X}, S_0, A_1] &= \beta_0 + \eta_{1:3}^T\mathbf{X}_c + \eta_4 S_0 \\ &+ \beta_1 A_1 + \beta_2 S_0 A_1 \end{align*} \tag{2}\]
# Moderator regression for first stage tailoring on priormed, controlling for optimal future tactic. We use uncentered priormed because we want to examine the interaction effect at different levels.
mod_QL_A1 <- lm(Y2_optA2 ~ odd_c + severity_c + race_c + priormed
+ A1 + A1:priormed,
data = dat_adhd_optA2)
summary(mod_QL_A1)
Call:
lm(formula = Y2_optA2 ~ odd_c + severity_c + race_c + priormed +
A1 + A1:priormed, data = dat_adhd_optA2)
Residuals:
Min 1Q Median 3Q Max
-3.1589 -0.1362 0.0287 0.3966 2.2222
Coefficients:
Estimate Std. Error t value Pr(>|t|)
(Intercept) 3.6081 0.0799 45.16 < 2e-16 ***
odd_c -0.6036 0.1398 -4.32 2.9e-05 ***
severity_c -0.1140 0.0327 -3.49 0.00065 ***
race_c 0.7184 0.1863 3.86 0.00017 ***
priormed -0.0280 0.1484 -0.19 0.85077
A1 0.7568 0.0821 9.22 3.5e-16 ***
priormed:A1 -0.9084 0.1489 -6.10 9.5e-09 ***
---
Signif. codes: 0 '***' 0.001 '**' 0.01 '*' 0.05 '.' 0.1 ' ' 1
Residual standard error: 0.817 on 143 degrees of freedom
Multiple R-squared: 0.529, Adjusted R-squared: 0.509
F-statistic: 26.7 on 6 and 143 DF, p-value: <2e-16
The standard errors (and p-values) in the step 3 regression are potentially incorrect because they don’t take into account sampling error in estimation of \(\hat{Y}(A_2^{opt})\). Luckily, we provide software (see qlaci
) to do proper inference!
We use the emmeans package to estimate the expected end-of-year school performance under an adaptive intervention that offers BMOD (1) or MED (-1) at the first stage, for each level of priormed, adjusting for the fact that we optimally tailor second-stage treatment for non-responders by adherence. A sketch of the call follows.
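The original does not show this call; a minimal sketch, mirroring the stage 2 call (the object name qem1 is hypothetical):

qem1 <- emmeans::emmeans(mod_QL_A1, ~ A1 | priormed, weights = "proportional")
print(qem1)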
Interaction plot of optimal tactic
Show the code
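# NOTE (assumption): `qep1` is not created in the code shown; one way to
# build it is with emmeans::emmip(), e.g.
qep1 <- emmeans::emmip(mod_QL_A1, A1 ~ priormed, weights = "proportional")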
# Prettify plot
qep1$data %>% mutate(across(1:2, as.factor)) %>%
ggplot(aes(xvar, yvar, color = A1, group = tvar)) +
geom_line(linewidth = 1) +
geom_point() +
scale_color_manual("A1", values = c("-1" = "red", "1" = "blue"),
labels = c("-1" = "-1 MED", "1" = "1 BMOD")) +
theme_classic() +
labs(title = "Moderator of stage 1 controlling for optimal stage 2",
x = "Medication use in Prior year",
y = "EOS School Performance \n (higher is better)") +
scale_x_discrete(labels = c("No prior med", "Prior med")) +
scale_y_continuous(n.breaks = 8)
5 Q-learning Follow-up: Estimate the mean outcome under the more deeply tailored AI
The results of Q-learning generated a proposal for a more deeply-tailored AI that tailors the first stage on priormed and the second stage on response status and adherence:
To estimate the mean outcome under the proposed decision rules, we use what we learned in the Primary Aims module and create an indicator for the individuals whose observed treatment sequence is consistent with the more deeply tailored AI.
The optimal decision rules at each stage:
\[ A_1^{opt} = \begin{cases} 1 \text{ BMOD} & \text{ if } \text{ priormed} = 0 \\ -1 \text{ MED} & \text{ if } \text{ priormed} = 1 \end{cases} \]
\[ A_2^{opt} = \begin{cases} 1 \text{ INT} & \text{ if } \text{ adherence} = 1 \\ -1 \text{ AUG} & \text{ if } \text{ adherence} = 0 \end{cases} \]
We can write a simple function that returns TRUE if an individual is consistent with the more deeply tailored AI and FALSE otherwise.
# Function that returns TRUE if an individual is consistent with the QL adaptive intervention decision rules
QL_rule <- function(priormed, A1, R, adherence, A2) {
# First-stage rule
if ((priormed == 1 & A1 == -1) | (priormed==0 & A1 == 1)) {
if (R == 1) {
return(TRUE)
}
# Second-stage rule, among non-responders
else if ((adherence == 1 & A2 == 1) | (adherence == 0 & A2 == -1)) {
return(TRUE)
}
else {
return(FALSE)
}
}
else {
return(FALSE)
}
}
dat_adhd_QL <- dat_adhd %>% rowwise() %>%
  mutate(QL = QL_rule(priormed, A1, R, adherence, A2))
mean(dat_adhd_QL$QL)
[1] 0.307
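The display code for the table below is not shown; a sketch following the pattern used earlier:

dat_adhd_QL %>% kable() %>%
  kable_styling() %>%
  scroll_box(height = "300px")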
ID | odd | severity | priormed | race | Y0 | A1 | R | NRtime | adherence | Y1 | A2 | Y2 | cell | QL |
---|---|---|---|---|---|---|---|---|---|---|---|---|---|---|
1 | 1 | 2.880 | 0 | 1 | 2.321 | -1 | 0 | 4 | 0 | 2.791 | 1 | 0.598 | C | FALSE |
2 | 0 | 4.133 | 0 | 0 | 2.068 | 1 | 1 | NA | 1 | 2.200 | NA | 4.267 | D | TRUE |
3 | 1 | 5.569 | 0 | 1 | 1.004 | -1 | 1 | NA | 0 | 2.292 | NA | 1.454 | A | FALSE |
4 | 0 | 4.931 | 0 | 1 | 3.232 | 1 | 0 | 4 | 0 | 3.050 | -1 | 6.762 | E | TRUE |
5 | 1 | 5.502 | 0 | 1 | 1.477 | 1 | 0 | 6 | 0 | 1.732 | -1 | 3.580 | E | TRUE |
6 | 0 | 5.497 | 0 | 1 | 1.720 | 1 | 0 | 3 | 0 | 2.400 | 1 | 2.075 | F | FALSE |
7 | 0 | 6.786 | 0 | 1 | 2.265 | 1 | 0 | 7 | 0 | 2.837 | 1 | 2.594 | F | FALSE |
8 | 0 | 4.317 | 0 | 1 | 2.814 | 1 | 1 | NA | 0 | 2.751 | NA | 4.051 | D | TRUE |
9 | 1 | 9.088 | 1 | 1 | 1.896 | 1 | 0 | 6 | 0 | 3.594 | -1 | 2.970 | E | FALSE |
10 | 0 | 6.094 | 0 | 1 | 2.503 | 1 | 0 | 5 | 1 | 2.546 | 1 | 6.022 | F | TRUE |
11 | 0 | 2.016 | 0 | 1 | 2.810 | 1 | 0 | 4 | 0 | 1.916 | -1 | 5.561 | E | TRUE |
12 | 0 | 4.308 | 0 | 1 | 2.650 | -1 | 1 | NA | 0 | 3.088 | NA | 2.476 | A | FALSE |
13 | 0 | 6.290 | 0 | 1 | 2.188 | 1 | 0 | 2 | 0 | 1.733 | 1 | 3.188 | F | FALSE |
14 | 0 | 3.972 | 0 | 1 | 2.281 | -1 | 0 | 8 | 1 | 2.845 | 1 | 3.116 | C | FALSE |
15 | 0 | 5.862 | 0 | 1 | 1.588 | 1 | 0 | 8 | 1 | 1.681 | 1 | 4.182 | F | TRUE |
16 | 0 | 6.086 | 1 | 1 | 2.105 | 1 | 0 | 8 | 1 | 2.296 | -1 | 1.920 | E | FALSE |
17 | 0 | 8.259 | 1 | 1 | 1.695 | 1 | 0 | 4 | 0 | 2.279 | -1 | 3.230 | E | FALSE |
18 | 1 | 3.371 | 1 | 1 | 2.222 | 1 | 0 | 3 | 0 | 1.669 | 1 | 0.920 | F | FALSE |
19 | 1 | 4.958 | 0 | 1 | 2.084 | 1 | 0 | 5 | 1 | 2.605 | -1 | 3.940 | E | FALSE |
20 | 1 | 3.581 | 1 | 0 | 0.614 | 1 | 0 | 4 | 0 | 0.681 | 1 | -2.254 | F | FALSE |
21 | 0 | 2.182 | 0 | 0 | 2.204 | 1 | 0 | 5 | 0 | 2.580 | -1 | 5.381 | E | TRUE |
22 | 0 | 0.378 | 0 | 1 | 2.608 | 1 | 1 | NA | 1 | 1.469 | NA | 5.617 | D | TRUE |
23 | 0 | 5.489 | 0 | 1 | 2.278 | -1 | 1 | NA | 0 | 2.958 | NA | 2.872 | A | FALSE |
24 | 0 | 6.504 | 0 | 1 | 2.098 | 1 | 0 | 5 | 0 | 2.445 | -1 | 4.461 | E | TRUE |
25 | 0 | 6.598 | 0 | 0 | 1.858 | 1 | 0 | 4 | 0 | 1.701 | 1 | 2.978 | F | FALSE |
26 | 0 | 1.525 | 1 | 1 | 3.259 | -1 | 0 | 5 | 1 | 4.138 | -1 | 5.591 | B | FALSE |
27 | 1 | 2.176 | 1 | 1 | 2.010 | -1 | 0 | 7 | 1 | 2.007 | -1 | 2.408 | B | FALSE |
28 | 0 | 2.147 | 1 | 1 | 2.181 | -1 | 0 | 8 | 1 | 3.052 | -1 | 5.045 | B | FALSE |
29 | 0 | 3.278 | 1 | 1 | 2.387 | 1 | 0 | 8 | 0 | 2.151 | 1 | 0.497 | F | FALSE |
30 | 0 | 5.425 | 0 | 1 | 2.381 | 1 | 0 | 5 | 1 | 2.467 | -1 | 4.281 | E | FALSE |
31 | 0 | 6.486 | 1 | 1 | 1.998 | 1 | 0 | 8 | 1 | 1.226 | 1 | 1.439 | F | FALSE |
32 | 1 | 2.962 | 0 | 1 | 2.249 | 1 | 1 | NA | 1 | 2.477 | NA | 5.463 | D | TRUE |
33 | 0 | 7.385 | 0 | 0 | 1.923 | -1 | 1 | NA | 0 | 2.871 | NA | 2.200 | A | FALSE |
34 | 1 | 6.538 | 0 | 1 | 1.801 | -1 | 1 | NA | 0 | 3.177 | NA | 1.842 | A | FALSE |
35 | 0 | 3.248 | 0 | 0 | 2.705 | -1 | 0 | 4 | 0 | 3.179 | -1 | 4.034 | B | FALSE |
36 | 0 | 2.316 | 0 | 1 | 2.339 | -1 | 0 | 5 | 0 | 2.904 | -1 | 4.301 | B | FALSE |
37 | 0 | 4.250 | 1 | 1 | 2.787 | 1 | 0 | 8 | 0 | 3.016 | 1 | 2.971 | F | FALSE |
38 | 0 | 5.270 | 0 | 1 | 2.867 | 1 | 0 | 5 | 0 | 2.840 | 1 | 3.508 | F | FALSE |
39 | 0 | 6.808 | 0 | 1 | 1.975 | 1 | 1 | NA | 0 | 2.459 | NA | 3.786 | D | TRUE |
40 | 0 | 8.475 | 0 | 1 | 1.953 | 1 | 0 | 7 | 1 | 2.447 | 1 | 5.279 | F | TRUE |
41 | 1 | 2.878 | 1 | 0 | 1.349 | 1 | 0 | 5 | 0 | 0.680 | 1 | -1.502 | F | FALSE |
42 | 0 | 5.727 | 0 | 1 | 3.119 | 1 | 0 | 2 | 1 | 3.293 | -1 | 4.111 | E | FALSE |
43 | 1 | 8.335 | 1 | 1 | 1.923 | -1 | 1 | NA | 1 | 2.990 | NA | 4.761 | A | TRUE |
44 | 1 | 2.747 | 1 | 1 | 1.990 | 1 | 1 | NA | 0 | 1.736 | NA | 1.723 | D | FALSE |
45 | 1 | 4.443 | 0 | 1 | 1.788 | -1 | 0 | 8 | 0 | 3.066 | 1 | 1.169 | C | FALSE |
46 | 0 | 0.571 | 0 | 1 | 2.965 | -1 | 0 | 3 | 0 | 3.159 | 1 | 1.781 | C | FALSE |
47 | 0 | 5.719 | 1 | 1 | 2.176 | 1 | 0 | 2 | 1 | 1.956 | 1 | 4.637 | F | FALSE |
48 | 0 | 3.397 | 1 | 1 | 2.850 | 1 | 0 | 4 | 1 | 3.320 | 1 | 4.529 | F | FALSE |
49 | 0 | 2.688 | 0 | 1 | 2.544 | -1 | 0 | 7 | 0 | 3.180 | -1 | 3.628 | B | FALSE |
50 | 0 | 3.050 | 1 | 1 | 2.972 | 1 | 0 | 3 | 0 | 3.053 | 1 | 0.634 | F | FALSE |
51 | 1 | 6.111 | 0 | 1 | 2.159 | 1 | 0 | 6 | 0 | 3.662 | 1 | 3.853 | F | FALSE |
52 | 1 | 3.936 | 0 | 1 | 1.938 | -1 | 0 | 6 | 0 | 2.685 | 1 | 0.217 | C | FALSE |
53 | 0 | 2.466 | 0 | 1 | 2.677 | 1 | 1 | NA | 1 | 2.473 | NA | 6.786 | D | TRUE |
54 | 1 | 2.595 | 0 | 1 | 2.129 | -1 | 1 | NA | 0 | 2.717 | NA | 2.857 | A | FALSE |
55 | 1 | 6.187 | 0 | 0 | 1.507 | -1 | 0 | 2 | 1 | 2.873 | -1 | 0.820 | B | FALSE |
56 | 0 | 5.879 | 0 | 1 | 1.872 | 1 | 1 | NA | 1 | 2.391 | NA | 4.780 | D | TRUE |
57 | 1 | 4.732 | 0 | 1 | 1.189 | 1 | 0 | 3 | 0 | 2.239 | -1 | 2.980 | E | TRUE |
58 | 0 | 6.344 | 0 | 1 | 2.148 | -1 | 0 | 3 | 1 | 3.665 | -1 | 2.556 | B | FALSE |
59 | 0 | 3.495 | 0 | 1 | 1.542 | -1 | 0 | 5 | 0 | 1.286 | -1 | 1.233 | B | FALSE |
60 | 1 | 7.886 | 1 | 1 | 2.282 | -1 | 1 | NA | 0 | 3.416 | NA | 4.560 | A | TRUE |
61 | 0 | 4.423 | 1 | 1 | 2.077 | -1 | 1 | NA | 1 | 2.955 | NA | 5.083 | A | TRUE |
62 | 0 | 4.087 | 0 | 1 | 1.597 | 1 | 0 | 8 | 1 | 1.305 | 1 | 3.234 | F | TRUE |
63 | 1 | 5.394 | 0 | 1 | 1.394 | 1 | 0 | 5 | 1 | 1.786 | 1 | 5.340 | F | TRUE |
64 | 1 | 6.915 | 0 | 1 | 2.176 | -1 | 1 | NA | 1 | 3.606 | NA | 3.198 | A | FALSE |
65 | 0 | 6.065 | 1 | 1 | 2.167 | 1 | 0 | 8 | 1 | 2.629 | 1 | 3.299 | F | FALSE |
66 | 0 | 7.318 | 1 | 1 | 1.382 | 1 | 1 | NA | 0 | 1.680 | NA | 0.335 | D | FALSE |
67 | 1 | 5.115 | 1 | 1 | 2.059 | 1 | 1 | NA | 1 | 2.599 | NA | 3.242 | D | FALSE |
68 | 0 | 3.739 | 0 | 1 | 2.006 | -1 | 0 | 6 | 0 | 2.664 | -1 | 1.805 | B | FALSE |
69 | 0 | 4.249 | 0 | 1 | 1.821 | -1 | 0 | 4 | 0 | 2.355 | -1 | 1.650 | B | FALSE |
70 | 0 | 7.057 | 1 | 1 | 1.867 | 1 | 0 | 6 | 1 | 1.765 | 1 | 2.779 | F | FALSE |
71 | 1 | 2.877 | 0 | 1 | 1.927 | -1 | 0 | 5 | 0 | 3.615 | 1 | 0.719 | C | FALSE |
72 | 1 | 2.616 | 0 | 1 | 2.027 | -1 | 0 | 6 | 1 | 3.535 | 1 | 3.452 | C | FALSE |
73 | 0 | 5.784 | 0 | 1 | 1.554 | 1 | 0 | 6 | 0 | 0.875 | -1 | 4.558 | E | TRUE |
74 | 1 | 6.531 | 0 | 1 | 1.131 | -1 | 0 | 5 | 1 | 2.344 | -1 | 0.366 | B | FALSE |
75 | 0 | 0.772 | 0 | 1 | 1.593 | 1 | 0 | 7 | 0 | 0.769 | 1 | 0.737 | F | FALSE |
76 | 1 | 6.962 | 1 | 0 | 1.798 | 1 | 1 | NA | 0 | 2.537 | NA | 1.504 | D | FALSE |
77 | 1 | 5.030 | 0 | 1 | 1.695 | -1 | 0 | 2 | 1 | 3.142 | -1 | 3.474 | B | FALSE |
78 | 0 | 4.547 | 0 | 1 | 2.394 | -1 | 1 | NA | 0 | 3.609 | NA | 2.942 | A | FALSE |
79 | 1 | 7.767 | 0 | 1 | 1.587 | -1 | 1 | NA | 0 | 2.875 | NA | 0.960 | A | FALSE |
80 | 0 | 8.698 | 0 | 0 | 1.852 | -1 | 1 | NA | 1 | 2.972 | NA | 2.292 | A | FALSE |
81 | 1 | 2.132 | 0 | 1 | 2.038 | -1 | 1 | NA | 0 | 3.048 | NA | 2.602 | A | FALSE |
82 | 1 | 7.529 | 1 | 1 | 1.406 | -1 | 1 | NA | 0 | 3.005 | NA | 3.043 | A | TRUE |
83 | 0 | 3.420 | 1 | 1 | 2.948 | -1 | 1 | NA | 1 | 3.493 | NA | 6.463 | A | TRUE |
84 | 1 | 5.331 | 0 | 1 | 1.752 | 1 | 0 | 2 | 1 | 1.536 | 1 | 5.077 | F | TRUE |
85 | 1 | 6.202 | 0 | 1 | 1.404 | 1 | 0 | 3 | 1 | 2.111 | 1 | 4.198 | F | TRUE |
86 | 0 | 5.333 | 0 | 1 | 2.100 | 1 | 0 | 8 | 0 | 1.590 | 1 | 1.029 | F | FALSE |
87 | 0 | 1.343 | 0 | 0 | 2.868 | -1 | 1 | NA | 1 | 2.099 | NA | 1.309 | A | FALSE |
88 | 1 | 3.734 | 1 | 1 | 2.061 | 1 | 0 | 6 | 0 | 2.152 | -1 | 2.221 | E | FALSE |
89 | 0 | 8.849 | 1 | 1 | 1.928 | -1 | 1 | NA | 0 | 3.993 | NA | 4.577 | A | TRUE |
90 | 0 | 5.432 | 0 | 1 | 2.279 | 1 | 0 | 8 | 0 | 1.751 | 1 | 3.198 | F | FALSE |
91 | 0 | 2.910 | 0 | 1 | 2.446 | -1 | 0 | 7 | 0 | 2.452 | 1 | -0.182 | C | FALSE |
92 | 1 | 3.360 | 0 | 1 | 1.553 | -1 | 0 | 6 | 1 | 3.167 | -1 | 1.481 | B | FALSE |
93 | 1 | 7.644 | 0 | 1 | 0.970 | -1 | 1 | NA | 0 | 1.976 | NA | -0.457 | A | FALSE |
94 | 0 | 5.865 | 0 | 0 | 1.626 | 1 | 1 | NA | 0 | 1.479 | NA | 2.888 | D | TRUE |
95 | 1 | 2.757 | 0 | 1 | 2.111 | -1 | 1 | NA | 1 | 3.116 | NA | 2.896 | A | FALSE |
96 | 0 | 5.016 | 0 | 1 | 2.062 | 1 | 0 | 7 | 0 | 2.080 | 1 | 2.314 | F | FALSE |
97 | 1 | 6.492 | 0 | 1 | 1.747 | 1 | 1 | NA | 0 | 2.240 | NA | 4.268 | D | TRUE |
98 | 0 | 9.306 | 0 | 1 | 1.439 | -1 | 0 | 3 | 0 | 2.684 | -1 | 1.358 | B | FALSE |
99 | 1 | 2.796 | 0 | 1 | 1.608 | 1 | 0 | 6 | 1 | 1.917 | -1 | 3.311 | E | FALSE |
100 | 0 | 4.050 | 0 | 1 | 1.463 | 1 | 0 | 2 | 1 | 1.846 | -1 | 2.443 | E | FALSE |
101 | 1 | 7.490 | 0 | 0 | 1.077 | -1 | 0 | 2 | 0 | 2.127 | -1 | 0.846 | B | FALSE |
102 | 0 | 3.852 | 0 | 1 | 2.190 | 1 | 0 | 3 | 0 | 1.993 | 1 | 2.577 | F | FALSE |
103 | 0 | 8.651 | 0 | 1 | 1.648 | -1 | 1 | NA | 0 | 2.730 | NA | 1.069 | A | FALSE |
104 | 0 | 7.608 | 0 | 1 | 2.003 | 1 | 1 | NA | 0 | 2.980 | NA | 4.693 | D | TRUE |
105 | 1 | 0.852 | 1 | 1 | 2.636 | 1 | 0 | 5 | 1 | 2.200 | 1 | 3.382 | F | FALSE |
106 | 0 | 6.060 | 1 | 1 | 1.964 | 1 | 0 | 6 | 0 | 1.601 | 1 | -0.108 | F | FALSE |
107 | 0 | 1.452 | 1 | 1 | 2.756 | -1 | 1 | NA | 1 | 3.350 | NA | 5.912 | A | TRUE |
108 | 0 | 1.338 | 1 | 1 | 2.648 | -1 | 1 | NA | 1 | 3.104 | NA | 4.817 | A | TRUE |
109 | 1 | 5.176 | 1 | 1 | 1.416 | -1 | 0 | 2 | 0 | 2.115 | 1 | 1.030 | C | FALSE |
110 | 1 | 5.142 | 0 | 1 | 1.895 | 1 | 1 | NA | 0 | 3.252 | NA | 3.941 | D | TRUE |
111 | 0 | 4.918 | 0 | 1 | 2.131 | -1 | 0 | 5 | 1 | 3.795 | -1 | 2.873 | B | FALSE |
112 | 1 | 8.161 | 0 | 1 | 1.577 | -1 | 0 | 6 | 1 | 2.387 | -1 | 1.688 | B | FALSE |
113 | 0 | 5.737 | 0 | 1 | 2.624 | -1 | 0 | 2 | 1 | 4.386 | 1 | 4.667 | C | FALSE |
114 | 1 | 3.317 | 0 | 1 | 1.613 | -1 | 0 | 8 | 0 | 1.872 | 1 | -1.160 | C | FALSE |
115 | 1 | 5.299 | 0 | 1 | 1.656 | -1 | 1 | NA | 0 | 2.877 | NA | 1.997 | A | FALSE |
116 | 0 | 3.714 | 0 | 1 | 1.957 | -1 | 1 | NA | 0 | 1.977 | NA | 1.564 | A | FALSE |
117 | 1 | 4.830 | 1 | 1 | 2.297 | 1 | 0 | 3 | 0 | 2.349 | -1 | 3.213 | E | FALSE |
118 | 1 | 2.130 | 1 | 0 | 2.114 | -1 | 0 | 8 | 0 | 2.634 | -1 | 3.994 | B | TRUE |
119 | 0 | 4.712 | 0 | 1 | 1.952 | -1 | 0 | 5 | 1 | 2.637 | -1 | 0.924 | B | FALSE |
120 | 1 | 5.257 | 0 | 1 | 1.527 | 1 | 1 | NA | 1 | 1.631 | NA | 4.876 | D | TRUE |
121 | 0 | 3.246 | 0 | 1 | 2.099 | -1 | 0 | 7 | 0 | 1.644 | 1 | 0.379 | C | FALSE |
122 | 0 | 5.103 | 0 | 1 | 2.133 | 1 | 0 | 2 | 0 | 2.405 | -1 | 5.741 | E | TRUE |
123 | 0 | 3.378 | 0 | 0 | 1.393 | 1 | 1 | NA | 0 | 1.055 | NA | 2.530 | D | TRUE |
124 | 1 | 3.390 | 0 | 1 | 1.524 | -1 | 1 | NA | 0 | 2.599 | NA | 1.039 | A | FALSE |
125 | 0 | 4.489 | 1 | 1 | 1.171 | 1 | 0 | 4 | 1 | 1.001 | -1 | 0.838 | E | FALSE |
126 | 0 | 6.227 | 1 | 0 | 2.206 | -1 | 0 | 2 | 1 | 3.152 | 1 | 4.635 | C | TRUE |
127 | 1 | 4.487 | 0 | 1 | 2.005 | -1 | 0 | 4 | 0 | 2.863 | -1 | 2.960 | B | FALSE |
128 | 1 | 5.477 | 0 | 1 | 2.152 | -1 | 0 | 8 | 0 | 3.665 | -1 | 3.994 | B | FALSE |
129 | 0 | 7.425 | 1 | 1 | 1.964 | -1 | 0 | 8 | 0 | 2.697 | -1 | 5.092 | B | TRUE |
130 | 0 | 4.287 | 0 | 0 | 2.123 | -1 | 0 | 2 | 0 | 2.518 | -1 | 2.315 | B | FALSE |
131 | 0 | 4.741 | 0 | 1 | 1.952 | 1 | 1 | NA | 0 | 2.134 | NA | 3.617 | D | TRUE |
132 | 1 | 6.278 | 0 | 1 | 1.329 | -1 | 0 | 6 | 1 | 2.891 | -1 | 1.090 | B | FALSE |
133 | 0 | 1.855 | 0 | 1 | 3.124 | 1 | 1 | NA | 1 | 3.572 | NA | 6.706 | D | TRUE |
134 | 1 | 6.360 | 1 | 0 | 1.003 | -1 | 0 | 4 | 1 | 1.654 | 1 | 3.676 | C | TRUE |
135 | 1 | 3.663 | 0 | 1 | 2.656 | 1 | 0 | 7 | 0 | 3.281 | -1 | 6.073 | E | TRUE |
136 | 0 | 2.155 | 0 | 1 | 2.626 | 1 | 1 | NA | 1 | 2.152 | NA | 5.881 | D | TRUE |
137 | 1 | 0.580 | 1 | 0 | 2.659 | 1 | 1 | NA | 1 | 2.595 | NA | 3.440 | D | FALSE |
138 | 0 | 3.213 | 0 | 1 | 3.112 | -1 | 0 | 6 | 0 | 3.405 | 1 | 2.047 | C | FALSE |
139 | 1 | 2.074 | 0 | 1 | 1.632 | -1 | 0 | 2 | 0 | 1.234 | -1 | 1.646 | B | FALSE |
140 | 0 | 2.054 | 0 | 0 | 2.698 | -1 | 0 | 5 | 1 | 3.245 | 1 | 3.529 | C | FALSE |
141 | 1 | 9.068 | 0 | 1 | 1.315 | -1 | 0 | 5 | 0 | 2.759 | 1 | 0.192 | C | FALSE |
142 | 0 | 1.970 | 0 | 0 | 1.827 | 1 | 1 | NA | 0 | 1.190 | NA | 3.028 | D | TRUE |
143 | 0 | 7.494 | 1 | 0 | 1.929 | -1 | 0 | 6 | 0 | 2.208 | 1 | 0.591 | C | FALSE |
144 | 1 | 6.363 | 1 | 1 | 1.712 | -1 | 0 | 8 | 0 | 2.984 | 1 | 1.860 | C | FALSE |
145 | 1 | 2.788 | 0 | 0 | 1.679 | -1 | 0 | 6 | 1 | 3.986 | -1 | 2.303 | B | FALSE |
146 | 0 | 3.472 | 0 | 1 | 1.973 | 1 | 0 | 2 | 0 | 1.971 | 1 | 2.107 | F | FALSE |
147 | 0 | 3.975 | 0 | 1 | 1.789 | 1 | 0 | 2 | 1 | 1.044 | 1 | 3.452 | F | TRUE |
148 | 0 | 5.476 | 1 | 1 | 2.240 | 1 | 0 | 3 | 1 | 3.096 | -1 | 2.395 | E | FALSE |
149 | 0 | 4.129 | 1 | 1 | 1.984 | 1 | 1 | NA | 0 | 1.748 | NA | 1.786 | D | FALSE |
150 | 1 | 7.067 | 1 | 1 | 1.426 | 1 | 0 | 3 | 1 | 1.403 | 1 | 2.204 | F | FALSE |
Estimate mean of more deeply tailored AI
Finally, we fit a simple regression model to the dat_adhd_QL data.frame, which contains an indicator for the individuals consistent with the learned decision rules.
# Remember to add weights! Responders were randomized once (weight 2) and
# non-responders twice (weight 4), so non-responders are underrepresented.
dat_adhd_QL <- dat_adhd_QL %>% mutate(weights = if_else(R == 1, 2, 4))
mod_QL_rule <- lm(Y2 ~ QL, data = dat_adhd_QL, weights = weights)
summary(mod_QL_rule)
Call:
lm(formula = Y2 ~ QL, data = dat_adhd_QL, weights = weights)
Weighted Residuals:
Min 1Q Median 3Q Max
-8.777 -1.553 0.037 1.670 6.911
Coefficients:
Estimate Std. Error t value Pr(>|t|)
(Intercept) 2.135 0.131 16.2 <2e-16 ***
QLTRUE 2.566 0.254 10.1 <2e-16 ***
---
Signif. codes: 0 '***' 0.001 '**' 0.01 '*' 0.05 '.' 0.1 ' ' 1
Residual standard error: 2.52 on 148 degrees of freedom
Multiple R-squared: 0.407, Adjusted R-squared: 0.403
F-statistic: 102 on 1 and 148 DF, p-value: <2e-16
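estimate() is a helper supplied with the workbook's materials (its definition is not shown here). For reference, a minimal base-R sketch of the same linear-contrast computation:

L <- c(1, 1)                                         # intercept + QLTRUE
est <- drop(L %*% coef(mod_QL_rule))                 # point estimate
se  <- drop(sqrt(t(L) %*% vcov(mod_QL_rule) %*% L))  # standard error
c(estimate = est, lcl = est - 1.96 * se, ucl = est + 1.96 * se)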
# Contrast: intercept + QLTRUE gives the mean outcome under the AI
C <- rbind("Mean Y under more deeply tailored AI" = c(1, 1))
# Calculate the mean outcome for those consistent with the more deeply tailored AI
QL_estimate <- estimate(mod_QL_rule, C)
QL_estimate
Estimate 95% LCL 95% UCL SE p-value
Mean Y under more deeply tailored AI 4.701 4.274 5.128 0.218 <1e-04 ***
---
Signif. codes: 0 '***' 0.001 '**' 0.01 '*' 0.05 '.' 0.1 ' ' 1
Comparing QL adaptive intervention to the four embedded AIs
Knowledge check #3
- Why do we need to use Q-learning to estimate a more deeply-tailored AI that uses intermediate variables? Could we use a single regression model?
The steps we just covered illustrate the basics of Q-learning using moderated regression models. However, if we want to do valid inference on the first-stage decision rule we need some extra software… enter qlaci()!
6 The qlaci package
The qlaci package performs Q-learning on data arising from a two-stage SMART. It is useful when we need valid standard errors for the estimates associated with our more deeply-tailored adaptive intervention.
The qlaci package can be installed from the d3center GitHub repository (d3center-isr/qlaci) using remotes or devtools, as sketched below.
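A minimal installation sketch using the repository named above:

# install.packages("remotes")  # if not already installed
remotes::install_github("d3center-isr/qlaci")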
Start by grand-mean centering the baseline covariates, as sketched below. This does not change the point estimates, but it is useful when interpreting model coefficients or hand-coding contrasts.
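The centering code is not shown; a minimal sketch, assuming dat_adhd_c follows the same pattern used for the non-responder subset above:

dat_adhd_c <- dat_adhd %>%
  mutate(across(c(odd, severity, priormed, race),
                ~ .x - mean(.x), .names = "{.col}_c"))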
Next, we specify the contrast matrix that will be used for the stage 1 regression (step 3 of Q-learning). qlaci() uses this matrix to estimate the mean outcome under each first-stage treatment option (accounting for the future optimal decision) at both levels of priormed, along with the difference between the options at each level.
## contrast matrix - we must transpose this for qlaci
c1 <-
rbind(
"Mean Y under bmod, prior med" = c(1, rep(0, 3), 1, 1, 1),
"Mean Y under med, prior med" = c(1, rep(0, 3), 1, -1, -1),
"Mean diff (bmod-med) for prior med" = c(0, rep(0, 3), 0, 2, 2),
"Mean Y under bmod, no prior med" = c(1, rep(0, 3), 0, 1, 0),
"Mean Y under med, no prior med" = c(1, rep(0, 3), 0, -1, 0),
"Mean diff (bmod-med) for no prior med" = c(0, rep(0, 3), 0, 2, 0)
)
The function qlaci()
maximizes the expected outcome for each stage of treatment given a set of moderators.
attach(dat_adhd_c) # with attach we can be lazy and refer to variables in the data.frame by name directly
q1 <- qlaci::qlaci(H10 = cbind(1, odd_c, severity_c, race_c, priormed),
H11 = cbind(A1 = 1, "A1:priormed" = priormed),
A1 = A1,
Y1 = rep(0, nrow(dat_adhd_c)), # set to zero for everyone; we care only about EOS outcome
H20 = cbind(1, odd_c, severity_c, race_c, priormed_c, A1, adherence),
H21 = cbind(A2 = 1, "A2:A1" = A1, "A2:adherence" = adherence, "A2:A1:adherence" = A1*adherence),
A2 = A2,
Y2 = Y2,
S = 1 - R,
c1 = t(c1))
detach(dat_adhd_c)
6.1 qlaci results
The coefficients estimated by qlaci(), combined with the user-specified contrast matrix, give us the estimated means for the first-stage treatment options by level of priormed, accounting for the optimal second-stage tactic for non-responders. With valid confidence intervals we can test contrasts and even compare against the 4 embedded adaptive interventions found in Virtual Module 2!
est low upp
Mean Y under bmod, prior med 3.429 2.772 4.003
Mean Y under med, prior med 3.732 3.117 4.469
Mean diff (bmod-med) for prior med -0.303 -1.190 0.559
Mean Y under bmod, no prior med 4.365 3.951 4.737
Mean Y under med, no prior med 2.851 2.383 3.315
Mean diff (bmod-med) for no prior med 1.514 0.875 2.104
Knowledge check #4
- Looking at the above table, what do the two contrasts (bmod - med) for levels of prior med tell us about tailoring?