Here are the notes: https://raw.githubusercontent.com/Ceyron/machine-learning-and-simulation/main/english/essential_pmf_pdf/categorical_intro.pdf
The Bernoulli distribution allowed us to model discrete random variables with only two states (think of the weather, which can be either good or bad). Often, however, we want to model discrete variables with more than two states, maybe even several hundred. That is where the Categorical distribution, the generalization of the Bernoulli, comes in. In this video we look at how to define such a distribution and how to use it in TensorFlow Probability.
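As a rough sketch of the probability mass function covered in the video: the Categorical PMF can be written as a product over states, where the indicator function selects the one theta belonging to the observed state. The parameter values below are made-up examples, not taken from the video.

```python
import math

def categorical_pmf(x, theta):
    """p(x) = prod_i theta_i ** indicator(x == i).

    Only the factor for the observed state x survives with exponent 1;
    every other factor has exponent 0 and contributes 1 to the product.
    """
    return math.prod(t ** (1 if x == i else 0) for i, t in enumerate(theta))

# Example theta array (assumed values); the entries must sum to 1,
# so the last state's probability is determined by the others.
theta = [0.2, 0.5, 0.3]
print(categorical_pmf(1, theta))  # picks out theta_1 -> 0.5
```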
Timestamps:
00:00 Intro
00:38 Encoding discrete states
01:09 Parameters of the Categorical
01:43 Important property of the parameters
02:06 Saving the last state’s probability
02:50 Example Theta Array
03:12 The Probability Mass Function
03:49 The Indicator Function
04:09 Example
05:43 TFP: Setup
06:02 TFP: Using the Categorical
07:31 Outro
![Categorical Distribution & Indicator Function | Intro | with TensorFlow Probability | [english]](https://ytimg.googleusercontent.com/vi/421uW9aZHio/maxresdefault.jpg)