This is pretty old school, but my introduction to Bayes's theorem in graduate school was not gentle (as I recall there was a lot of abstruse mathematics involved), so I'm inclined to ease the road for anyone following in my footsteps into probability theory. At its core, the concept is simple: the probability of one thing changes as the probability of a prior thing changes. For example, the probability that George will return home with groceries is higher if you know that George probably went to the grocery store, and lower if you know he probably went to the bar. But what does it mean to say that? And what are the implications for machine learning? This article looks at this concept, as the title says, gently.
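The George example can be run backwards with Bayes's theorem: if George comes home with groceries, how should our belief about where he went change? Here is a minimal sketch; all the specific probabilities are assumptions chosen purely for illustration, not taken from the article.

```python
def bayes(prior, likelihood, evidence):
    """Bayes's theorem: P(H | E) = P(E | H) * P(H) / P(E)."""
    return likelihood * prior / evidence

# Assumed numbers for the George example (hypothetical values).
p_store = 0.7                   # prior: George probably went to the store
p_bar = 1 - p_store             # otherwise, he went to the bar
p_groceries_given_store = 0.9   # likely to bring groceries from the store
p_groceries_given_bar = 0.1     # unlikely to bring groceries from the bar

# Total probability that George returns with groceries (law of total probability).
p_groceries = (p_groceries_given_store * p_store
               + p_groceries_given_bar * p_bar)

# Updated belief that George went to the store, given he returned with groceries.
p_store_given_groceries = bayes(p_store, p_groceries_given_store, p_groceries)

print(round(p_groceries, 2))              # → 0.66
print(round(p_store_given_groceries, 3))  # → 0.955
```

Seeing the groceries raises our belief that George went to the store from 0.7 to about 0.95, which is the "probability of one thing changes as the probability of a prior thing changes" idea made concrete.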