new issues in lecture notebooks #261

@bertdv

Description

The following issues have low priority but should be addressed for next year.

Prob theory

BML

FFG

  • In the FFG lecture, add an explanation of why large models are necessarily sparse.

Gaussians

Discrete data

  • Move the exercise on the evidence for a die toss to the main text.
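
For reference, a sketch of the evidence in question (my formula, not the exercise text from the notes, assuming the usual Dirichlet prior $\mathrm{Dir}(\theta\mid\alpha)$ over the die's outcome probabilities and observed counts $n_k$ with $\sum_k n_k = N$):

$$p(D) = \int \mathrm{Dir}(\theta\mid\alpha) \prod_{k=1}^{6} \theta_k^{n_k}\, \mathrm{d}\theta = \frac{\Gamma\!\left(\sum_k \alpha_k\right)}{\Gamma\!\left(\sum_k \alpha_k + N\right)} \prod_{k=1}^{6} \frac{\Gamma(\alpha_k + n_k)}{\Gamma(\alpha_k)}$$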

Regression

  • Build the model $p(y,x,w) = p(w) \prod_n p(y_n|x_n,w) \underbrace{p(x_n)}_{\delta(x_n - \hat{x}_n)}$.
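
A possible way to work this out (my sketch, following the delta factor above): since $p(x_n) = \delta(x_n - \hat{x}_n)$, the input factors carry no information about $w$, and the posterior reduces to the familiar conditional-regression form

$$p(w \mid D) \propto p(w) \prod_n p(y_n \mid \hat{x}_n, w).$$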

Classification

  • The decision boundaries are generally (hyper-)parabolic surfaces. I think these should be ellipsoids, because parabolas are "degenerate" ellipses (with a missing quadratic term).

    • Dear Bert,
      Thank you for your kind and quick reply! I kept thinking about it, and it is a bit more subtle: the level curves $x^\top \Sigma^{-1} x = \text{constant}$ are ellipsoids in $\mathbb{R}^n$, which I think is what the text describes. That is probably the simplest thing to state in the text: each class has ellipsoids around it (see the equations after this list).
  • Both generative and discriminative classification can be worked out from a model specification to lead to $$p(y_{\bullet,k}=1 \mid x_\bullet,D) \sim \mathrm{Cat}(\sigma(\beta_k^T x_\bullet))$$

  • The query "Can you work out Bayesian logistic regression with Laplace approximation for the posterior on the weights" in ChatGPT gives a nice calculation with shorter, matrix-form formulas than the ones currently in the lecture notes (see the sketch after this list).
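
Regarding the first item and the thread under it, a sketch of the standard results (my formulas, not quoted from the notes): the level sets of each Gaussian class-conditional density are ellipsoids,

$$(x-\mu_k)^\top \Sigma_k^{-1} (x-\mu_k) = \text{constant},$$

while the decision boundary between two Gaussian classes with unequal covariances is a general quadric,

$$-\tfrac{1}{2}\, x^\top\!\left(\Sigma_1^{-1}-\Sigma_2^{-1}\right) x + \left(\Sigma_1^{-1}\mu_1 - \Sigma_2^{-1}\mu_2\right)^{\!\top} x + \text{constant} = 0,$$

which can be an ellipsoid, paraboloid, or hyperboloid depending on the eigenvalues of $\Sigma_1^{-1}-\Sigma_2^{-1}$.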
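
A minimal sketch of the calculation mentioned in the last item (my code, not taken from the notes or from ChatGPT's answer), assuming a Gaussian prior $p(w) = \mathcal{N}(0, \alpha^{-1} I)$, binary labels $y_n \in \{0,1\}$, and illustrative variable names:

```python
# Sketch: Bayesian logistic regression with a Laplace approximation
# to the weight posterior (assumed prior N(0, alpha^{-1} I), y_n in {0,1}).
import numpy as np

def sigmoid(a):
    return 1.0 / (1.0 + np.exp(-a))

def laplace_logistic_regression(X, y, alpha=1.0, iters=50):
    """Return (w_map, Sigma) such that p(w|D) is approximated by N(w_map, Sigma)."""
    N, D = X.shape
    w = np.zeros(D)
    H = alpha * np.eye(D)
    for _ in range(iters):
        p = sigmoid(X @ w)
        g = X.T @ (p - y) + alpha * w                    # gradient of neg. log-posterior
        R = p * (1.0 - p)
        H = (X * R[:, None]).T @ X + alpha * np.eye(D)   # Hessian of neg. log-posterior
        w = w - np.linalg.solve(H, g)                    # Newton step towards the MAP
    return w, np.linalg.inv(H)                           # Laplace covariance = H^{-1}

def predictive(x_new, w_map, Sigma):
    """Approximate p(y=1 | x_new, D) with the probit (MacKay) correction."""
    mu_a = x_new @ w_map
    var_a = x_new @ Sigma @ x_new
    kappa = 1.0 / np.sqrt(1.0 + np.pi * var_a / 8.0)
    return sigmoid(kappa * mu_a)

# Tiny usage example on synthetic data
rng = np.random.default_rng(0)
X = rng.normal(size=(200, 2))
y = (X @ np.array([2.0, -1.0]) + 0.3 * rng.normal(size=200) > 0).astype(float)
w_map, Sigma = laplace_logistic_regression(X, y)
print(predictive(np.array([1.0, -1.0]), w_map, Sigma))
```

Keeping the gradient, Hessian, and Newton step in matrix form is what makes the formulas compact compared to an element-wise derivation.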

Dynamic systems

  • The first animation of the position prediction is unclear. I would expect $p(z_{10})$, based on the priors alone, to have a lot of noise (see the prediction formula after this list).

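For reference, a sketch of why this is expected (my formula, assuming the usual linear Gaussian state-space model $z_t = A z_{t-1} + q_t$ with $q_t \sim \mathcal{N}(0,Q)$ and prior $z_0 \sim \mathcal{N}(\mu_0, \Sigma_0)$): predicting ten steps ahead without any observations gives

$$p(z_{10}) = \mathcal{N}\!\left(A^{10}\mu_0,\; A^{10}\Sigma_0 \left(A^{10}\right)^\top + \sum_{k=0}^{9} A^{k} Q \left(A^{k}\right)^\top\right),$$

so the process noise accumulates and the marginal should indeed be broad.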
