19:07 To celebrate the work of Hartley. Anecdotally, Hartley himself hated probabilities, yet he still made great contributions to information theory.
This YouTube channel is lovely. I like the Oxford Mathematics YouTube page very much.
That's what an information theoretician ought to look like!
Go grab your notebook, this will be a good one for sure. Thank you, Oxford!
This is magnificent on the part of Oxford University.
Claude Shannon, GOAT of information theory, little-known son of Petoskey, Michigan.
Simple and straightforward. Thank you @Oxford Mathematics.
So if he gives that lecture 52 times, the first example with the new random variable is going to fail at some point, right? "This is a new random variable. Ooh look, it's the same card." 04:15
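For what it's worth, a quick sanity check of that joke: assuming the second draw is independent and uniform over a 52-card deck, it matches the first with probability 1/52 per lecture, so over 52 lectures the expected number of "same card" failures is exactly 1, and the chance of at least one is 1 − (51/52)^52 ≈ 0.636. A minimal sketch (my own illustration, not from the lecture):

```python
from random import randrange

# Probability the demo "fails" (same card twice) at least once in 52 lectures,
# assuming two independent uniform draws from a 52-card deck each time.
p_match = 1 / 52
p_at_least_one = 1 - (1 - p_match) ** 52
print(f"P(at least one repeat in 52 lectures) = {p_at_least_one:.3f}")  # ~0.636

# Monte Carlo check of the same quantity.
trials = 100_000
fails = sum(
    any(randrange(52) == randrange(52) for _ in range(52))
    for _ in range(trials)
)
print(f"Simulated estimate = {fails / trials:.3f}")
```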
I did this course (Information Theory) under Signal Processing and Telecommunication Systems for a BEng in Electrical and Electronic Engineering. Does Oxford University offer this course under the Mathematics degree? I am just wondering whether this course is usually taught by the Engineering department in the UK, as it was for me.
Is there a website that shows which full courses are available? Or do they only upload these as samples to encourage people to apply to the school?
Is there an agreed-upon definition of "guessing entropy" as used in side-channel analysis via template attacks? I know it is off topic, but I believe it would somehow be an extension of information theory principles. I could of course be wrong.
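One commonly cited definition (not from this lecture; offered here as context) is Massey's guessing entropy: G(X) = Σ_i i·p_i with the probabilities sorted in decreasing order, i.e. the expected number of guesses an optimal guesser needs before hitting the right value. In template attacks it is typically estimated as the average rank of the correct key across many attack runs. A minimal sketch of the plain definition:

```python
import numpy as np

def guessing_entropy(probs):
    """Massey's guessing entropy: expected number of guesses needed
    when candidates are tried in decreasing order of probability."""
    p = np.sort(np.asarray(probs, dtype=float))[::-1]  # sort descending
    ranks = np.arange(1, len(p) + 1)                   # guess 1, 2, ..., n
    return float(np.sum(ranks * p))

# Example: a skewed key-byte distribution vs. a uniform one.
skewed = [0.5, 0.25, 0.125, 0.125]
uniform = [0.25] * 4
print(guessing_entropy(skewed))   # 1.875: few guesses needed on average
print(guessing_entropy(uniform))  # 2.5:   (n + 1) / 2 for a uniform guess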
Is the formula at 34:00 an error?
Please explore semantic communication.
Great stuff
Please upload the whole course.
Speaking of communication, this is a REALLY good case in which a well-prepared slide deck would improve delivery and help the lecturer and students pay more attention to the message than the medium! Chalkboards can then be used for digressions, Q&A, etc.
Would it be possible to add info in the description about the two textbooks that the lecturer mentions at the beginning? The lecture notes would also be interesting to have (but I understand if they are only for enrolled students).
What does p(X) mean? p(x) = P(X = x) makes sense (the probability of the random outcome being x), but p(X) appears in the definition of entropy as an expectation: H(X) = E[-log p(X)]?
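To answer the question above (my own illustration, not from the lecture): p(X) means the pmf p evaluated at the random variable X itself, so −log p(X) is itself a random variable, and taking its expectation recovers the usual sum H(X) = −Σ_x p(x) log p(x). A minimal Python sketch comparing the two views:

```python
import math, random

pmf = {"a": 0.5, "b": 0.25, "c": 0.25}  # an arbitrary example distribution

# Entropy as the usual sum: H(X) = -sum_x p(x) * log2 p(x)
H_sum = -sum(p * math.log2(p) for p in pmf.values())

# Entropy as an expectation: H(X) = E[-log2 p(X)], where p(X) is the pmf
# evaluated at the random draw X, making -log2 p(X) a random variable.
samples = random.choices(list(pmf), weights=pmf.values(), k=100_000)
H_mc = sum(-math.log2(pmf[x]) for x in samples) / len(samples)

print(H_sum)  # 1.5 exactly
print(H_mc)   # ~1.5 (Monte Carlo estimate of the expectation)
```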
Beautiful lectures! Any chance we can get our hands on Mr. Cohen's lecture notes?