
Lesson 1: Introduction to Bayesian modeling

  • Probability as the logic of science
  • Notation of parts of Bayes’s theorem (stated for quick reference after this list)
  • Marginalization
  • Bayes’s theorem as a model for learning
  • Probability distributions
  • Tasks of Bayesian modeling
  • Bayesian modeling example: parameter estimation from repeated measurements
  • Choosing likelihoods
  • Choosing priors
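
For quick reference while reading, Bayes’s theorem and the marginalization that defines the evidence can be stated as below. This is a minimal reference statement using generic P(·) notation, which is an assumption made here for illustration; the lesson defines its own notation for each part.

    % A minimal reference statement, not necessarily the lesson's own notation.
    % Posterior = likelihood x prior / evidence.
    \[
      \underbrace{P(\theta \mid y)}_{\text{posterior}}
        = \frac{\overbrace{P(y \mid \theta)}^{\text{likelihood}}
                \; \overbrace{P(\theta)}^{\text{prior}}}
               {\underbrace{P(y)}_{\text{evidence}}}
    \]
    % Marginalization: the evidence follows from integrating the joint
    % distribution of data and parameters over all parameter values.
    \[
      P(y) = \int P(y \mid \theta)\, P(\theta)\, \mathrm{d}\theta
    \]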

