
Lessons

  • Lesson 0: Setting up computing resources
  • Lesson 1: Introduction to Bayesian modeling
  • Lesson 2: Parameter estimation by optimization
  • Lesson 3: Markov chain Monte Carlo and Stan
  • Lesson 4: Display of MCMC results and diagnostics
  • Lesson 5: Prior and posterior predictive checks
  • Lesson 6: Hierarchical models
    • Modeling repeated experiments
    • Choosing a hierarchical prior
    • Implementation of a worm reversal hierarchical model
    • Generalization of hierarchical models
    • General implementation of hierarchical models
  • Lesson 7: Principled pipelines

Exercises

  • Exercise 1: Practice with Bayesian modeling
  • Exercise 2: Parameter estimation by optimization
  • Exercise 3: First foray into MCMC
  • Exercise 4: More Bayesian inference with MCMC
  • Exercise 5: Bayesian modeling with prior and posterior predictive checks
  • Exercise 6: Hierarchical models
  • Exercise 7: Principled pipelines

Schedule

  • Schedule overview
  • Daily schedule

Lesson 6: Hierarchical models

  • Modeling repeated experiments
  • Choosing a hierarchical prior
  • Implementation of a worm reversal hierarchical model
  • Generalization of hierarchical models
  • General implementation of hierarchical models

Last updated on Aug 12, 2024.

© 2021–2024 Justin Bois. With the exception of pasted graphics, where the source is noted, this work is licensed under a Creative Commons Attribution License CC-BY 4.0. All code contained herein is licensed under an MIT license.

