Commit 3957928

Added slide on residual errors
1 parent 17cfc94 commit 3957928

File tree

1 file changed: +26 −2 lines


slides/02-machine_learning_fundamentals.md

Lines changed: 26 additions & 2 deletions
@@ -107,8 +107,6 @@
 </div>
 </div>
 
-<div class="fragment" data-fragment-index="1"></div>
-
 ---
 
 ## Machine Learning
@@ -225,6 +223,32 @@ The loss function $$\mathcal{L}$$ quantifies the difference between the predicte
 
 ---
 
+## Residual Errors
+
+- Data points are generated by a true function $f^*$ plus noise:
+
+<div class="formula">
+$$
+\mathbf{y}_i = f^*(\mathbf{x}_i) + \epsilon_i
+$$
+</div>
+
+where $\epsilon_i$ represents inherent noise or randomness in the data generation process.
+
+- Residual errors measure the difference between predictions and observations:
+
+<div class="formula">
+$$
+r_i = \mathbf{y}_i - f_{\boldsymbol{\theta}}(\mathbf{x}_i)
+$$
+</div>
+
+<div class="highlight" style="padding: 40px 40px">
+Errors persist due to: (1) inherent noise $\epsilon_i$, and (2) approximation error when $f^* \notin \mathcal{F}_{\Theta}$
+</div>
+
+---
+
 ## Example: Linear Regression
 
 <div style="text-align: center;">
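The decomposition on the new slide can be illustrated numerically: fit a model from a restricted family $\mathcal{F}_{\Theta}$ (a straight line) to data generated by a nonlinear $f^*$ plus noise, and observe that the residuals stay well above the noise floor. A minimal sketch; the choice of $f^*$, the noise scale, and all variable names here are illustrative, not taken from the slides:

```python
import numpy as np

rng = np.random.default_rng(0)

# True function f* (nonlinear), illustrative choice
def f_star(x):
    return np.sin(2 * np.pi * x)

# Data generation: y_i = f*(x_i) + eps_i
x = rng.uniform(0.0, 1.0, size=200)
y = f_star(x) + rng.normal(scale=0.1, size=x.shape)

# Model family F_Theta: straight lines f_theta(x) = theta_1 * x + theta_0,
# fitted by least squares; note f* is NOT in this family.
theta_1, theta_0 = np.polyfit(x, y, deg=1)
y_hat = theta_1 * x + theta_0

# Residual errors r_i = y_i - f_theta(x_i)
r = y - y_hat

# The mean squared residual mixes inherent noise (variance 0.1^2 = 0.01)
# with approximation error, so it stays well above the noise variance.
print(f"mean squared residual: {np.mean(r**2):.4f}")
print(f"noise variance:        {0.1**2:.4f}")
```

Because a line cannot represent a full sine period, the residuals are dominated by approximation error here; shrinking the noise scale to zero would not drive them to zero.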
