
Information Theory, Lecture 1: Defining Entropy and Information - Oxford Mathematics 3rd Yr Lecture

In this lecture from Sam Cohen’s 3rd year ‘Information Theory’ course, one of eight we are showing, Sam asks: how do we measure the amount of information we learn by seeing the outcome of a random variable? Answer: this can be measured by the variable’s entropy (and related quantities), which we introduce.
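The entropy mentioned above is the Shannon entropy, H(X) = -Σ p(x) log₂ p(x), measured in bits. As a rough illustration (a minimal sketch, not code from the lecture — the function name and examples here are our own):

```python
import math

def entropy(probs, base=2):
    """Shannon entropy H(X) = -sum(p * log(p)) over the distribution.
    Terms with p = 0 contribute nothing, matching the limit p*log(p) -> 0."""
    return -sum(p * math.log(p, base) for p in probs if p > 0)

# A fair coin flip carries exactly 1 bit of information:
print(entropy([0.5, 0.5]))   # 1.0
# A biased coin is more predictable, so observing it tells us less:
print(entropy([0.9, 0.1]))   # about 0.469 bits
```

The more unpredictable the random variable, the higher its entropy, and the more information we gain by observing its outcome.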

You can watch the eight lectures from the course as they appear via the playlist:
   • Student Lectures - Information Theory  

You can also watch many other student lectures via our main Student Lectures playlist (also check out the playlists for specific courses):
   • Student Lectures - All lectures

All first and second year lectures are followed by tutorials where students meet their tutor in pairs to go through the lecture and associated problem sheet and to talk and think more about the maths. Third and fourth year lectures are followed by classes.
