Pragmatic Semiotic Information • 9
•
https://inquiryintoinquiry.com/2024/03/16/pragmatic-semiotic-information-9/
Information Recapped —
Reflection on the inverse relation between uncertainty and information
led us to define the “information capacity” of a communication channel
as the “average uncertainty reduction on receiving a sign”, taking the
acronym “AURORAS” as a reminder of the definition.
To see how channel capacity is computed in a concrete case
let's return to the scene of uncertainty shown in Figure 5.
Pragmatic Semiotic Information • Figure 5
•
https://inquiryintoinquiry.files.wordpress.com/2024/03/pragmatic-semiotic-i…
For the sake of the illustration let's assume we are dealing with the
observational type of uncertainty and operating under the descriptive
reading of signs, where the reception of a sign says something about
what's true of our situation. Then we have the following cases.
• On receiving the message “A” the additive measure of uncertainty
is reduced from log 5 to log 3, so the net reduction is (log 5 - log 3).
• On receiving the message “B” the additive measure of uncertainty
is reduced from log 5 to log 2, so the net reduction is (log 5 - log 2).
The average uncertainty reduction per sign of the language is computed
by taking a “weighted average” of the reductions occurring in the channel,
where the weight of each reduction is the number of options or outcomes
falling under the associated sign.
• The uncertainty reduction (log 5 - log 3) is assigned a weight of 3.
• The uncertainty reduction (log 5 - log 2) is assigned a weight of 2.
Finally, the weighted average of the two reductions is computed as follows.
• (1/5) ∙ [ 3 ∙ (log 5 - log 3) + 2 ∙ (log 5 - log 2) ]
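The arithmetic above can be checked with a short script. This is just a sketch of the weighted average as stated, using base‑2 logarithms so the units come out in bits.

```python
from math import log2

# Uncertainty reductions for each sign (log base 2, so units are bits).
reduction_A = log2(5) - log2(3)  # receiving "A" cuts the options from 5 to 3
reduction_B = log2(5) - log2(2)  # receiving "B" cuts the options from 5 to 2

# Weighted average: each reduction is weighted by the number of
# outcomes falling under the associated sign, 3 for "A" and 2 for "B".
capacity = (1/5) * (3 * reduction_A + 2 * reduction_B)

print(f"{capacity:.4f} bits")
```

Running it gives a value just under 1 bit, in line with the discussion that follows.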
Extracting the pattern of calculation yields the following worksheet
for computing the capacity of a two‑symbol channel with frequencies
partitioned as n = k₁ + k₂.
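The worksheet's pattern can be sketched as a small function for the two‑symbol case — the function name here is a hypothetical label, not taken from the original.

```python
from math import log2

def two_symbol_capacity(k1: int, k2: int) -> float:
    """Average uncertainty reduction, in bits, for a channel whose
    two signs cover k1 and k2 of the n = k1 + k2 outcomes."""
    n = k1 + k2
    return (k1 * (log2(n) - log2(k1)) + k2 * (log2(n) - log2(k2))) / n

# The 3-to-2 split of 5 from Figure 5:
print(two_symbol_capacity(3, 2))
```

Regrouping the terms shows this is the familiar entropy formula, −p₁ log p₁ − p₂ log p₂ with pᵢ = kᵢ/n, so the 60‑to‑40 odds in the next figure give the same value.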
Capacity of a channel {“A”, “B”} bearing the odds of 60 “A” to 40 “B”
•
https://inquiryintoinquiry.files.wordpress.com/2024/03/channel-capacity-60-…
In other words, the capacity of the channel is approximately 0.971, slightly under 1 bit.
That makes intuitive sense inasmuch as 3 against 2 is a near‑even
split of 5, and the measure of the channel capacity, otherwise known
as the “entropy”, is specifically designed to attain its maximum of
1 bit when a two‑way partition is split 50‑50, that is, when the
distribution is “uniform”.
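That the entropy of a two‑way partition peaks at 1 bit for a 50‑50 split can be checked numerically. This sketch scans splits in steps of one percent; the helper name is a label of convenience.

```python
from math import log2

def entropy2(p: float) -> float:
    """Entropy in bits of a two-way partition with weights p and 1 - p."""
    if p in (0.0, 1.0):
        return 0.0  # a sure thing carries no uncertainty
    return -p * log2(p) - (1 - p) * log2(1 - p)

# Evaluate the entropy over splits p = 0.01, 0.02, ..., 0.99
values = {k / 100: entropy2(k / 100) for k in range(1, 100)}
best = max(values, key=values.get)
print(best, values[best])
```

The maximum falls at p = 0.5 with an entropy of exactly 1 bit, and the value at p = 0.6 reproduces the 60‑to‑40 capacity computed above.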
Regards,
Jon
cc:
https://www.academia.edu/community/L6Pd9M
cc:
https://mathstodon.xyz/@Inquiry/112032763420333668