Lab 10
Flowchart:
Average = 59
Above Average = 10/21 students (a grade >59)
Below Average = 11/21 students (a grade <59)
If the student had an "A", then the grade is >89.
If the student had a "B", then the grade is >74.
If the student had a "C", then the grade is >59.
If the student had a "D", then the grade is >44.
If the student had a "F", then the grade is <44.
Shannon and Hartley
Shannon entropy is, on average, how many yes/no questions one needs to ask to establish what the symbol is.
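As a quick illustration, the entropy in bits can be computed directly from a list of probabilities. The distributions below are made-up examples, not the lab data.

import math

def shannon_entropy(probabilities):
    """Entropy in bits: the average number of yes/no questions needed
    to identify a symbol drawn from this distribution."""
    # 0 * log(0) is treated as 0, so zero-probability symbols add nothing.
    return -sum(p * math.log2(p) for p in probabilities if p > 0)

# A fair coin takes exactly one yes/no question on average.
print(shannon_entropy([0.5, 0.5]))    # 1.0
# A heavily skewed distribution takes far fewer questions on average.
print(shannon_entropy([0.99, 0.01]))  # about 0.08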
The questioning works by repeatedly splitting the set of possibilities into segments until a segment contains only one alternative; for a uniform distribution each question can split the possibilities in half. The Shannon-Hartley theorem establishes Shannon's channel capacity, a bound on the maximum amount of error-free digital data (that is, information) that can be transmitted over a communication link with a specified bandwidth in the presence of noise interference, under the assumption that the signal power is bounded. The law is named after Claude Shannon and Ralph Hartley.
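The theorem comes down to one formula, C = B * log2(1 + S/N), where B is the bandwidth in hertz and S/N is the signal-to-noise power ratio. Here is a small sketch with made-up numbers.

import math

def channel_capacity(bandwidth_hz, signal_power, noise_power):
    """Shannon-Hartley capacity in bits per second: C = B * log2(1 + S/N)."""
    return bandwidth_hz * math.log2(1 + signal_power / noise_power)

# Example numbers only: a 3 kHz channel with a signal-to-noise ratio of 1000
# can carry at most about 30 kbit/s of error-free data.
print(channel_capacity(3000, 1000, 1))  # roughly 29,900 bits per second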
Comparisons of Entropy:
Professor Mussolini had the least amount of "error", or "entropy", or "chaos" in his region, with the entropy getting as low as about -1. Professor Matic had the most entropy, with it reaching about -0.5, which reflects fairly high uncertainty. Professor Churchill's entropy was in the middle. In most of Mussolini's cases the probability was 0, Churchill had the second-most cases with probability 0, and Matic had the fewest cases with probability 0.
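That pattern lines up with how the entropy formula behaves: the more probability that is piled onto a few outcomes (with zeros everywhere else), the lower the entropy. A quick check with made-up distributions, not the lab data:

import math

def entropy_bits(probs):
    """Shannon entropy in bits, skipping zero-probability outcomes."""
    return -sum(p * math.log2(p) for p in probs if p > 0)

# Mostly zeros with one dominant outcome -> low entropy (little "chaos").
print(entropy_bits([0.9, 0.1, 0.0, 0.0]))      # about 0.47
# Probability spread evenly across outcomes -> the maximum entropy.
print(entropy_bits([0.25, 0.25, 0.25, 0.25]))  # 2.0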