Information Reduces Uncertainty


  • Information reduces uncertainty
  • For a given information function, information is:
    • sufficient (complete and available when needed)
    • clear (without noise or redundancy)
  • accurate (without error – no changed values or falsehoods).

Example #1

  • 4 = x + y
  • x could take many values, including 0, 1, 2, 3, and 4
  • If we now discover that y = 1
  • 4 = x + 1
  • We can now say with certainty that x = 3
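The elimination of candidates above can be sketched in a few lines of Python (assuming, as the example implies, that x and y are nonnegative integers):

```python
# Before any information about y arrives, every x in 0..4 can still
# satisfy 4 = x + y for some nonnegative integer y.
candidates = [x for x in range(5) if any(x + y == 4 for y in range(5))]
print(candidates)  # [0, 1, 2, 3, 4] -- full uncertainty

# New information arrives: y = 1.
y = 1
candidates = [x for x in range(5) if x + y == 4]
print(candidates)  # [3] -- uncertainty eliminated
```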

Example #2

  • A red, green, and blue ball are placed in a black bag.
  • You put your hand in the bag and pull out a blue ball.
  • The probability of your next pick being a red ball has just increased.
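The probability shift in this example can be verified with exact fractions:

```python
from fractions import Fraction

balls = {"red", "green", "blue"}
p_red_before = Fraction(1, len(balls))  # 1/3 with all three balls in the bag

balls.discard("blue")                   # information: the blue ball was drawn
p_red_after = Fraction(1, len(balls))   # 1/2 with only red and green remaining

print(p_red_before, "->", p_red_after)  # 1/3 -> 1/2
```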

Example #3

  • Router1 receives a packet.
  • It has four ports which it can forward the packet on.
  • Router1 receives a message from Router2 indicating that Router2 is at the far end of the cable attached to Router1’s port1, and that Router2 is the next hop closest to the destination of the packet Router1 just received.
  • The probability of Router1 selecting the best port to send the packet on has increased.
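A minimal sketch of Router1’s decision, using hypothetical message fields rather than any real routing protocol:

```python
# Before Router2's advertisement, each of the four ports is an equally
# likely candidate for forwarding (p = 1/4), and the best port is unknown.
ports = ["port1", "port2", "port3", "port4"]
best_port = None

# Hypothetical advertisement from Router2: "I am attached to port1 and
# I am the next hop closest to the destination."
advertisement = {"neighbor": "Router2", "port": "port1", "next_hop_to_dest": True}

if advertisement["next_hop_to_dest"]:
    best_port = advertisement["port"]

print(best_port)  # port1 -- Router1 can now forward with certainty
```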

Information reduces uncertainty when…

  • i’ = x + y
  • 4 = 3 + 1
  • x & y are required (the information must be complete)
  • x & y must be unaltered and factual (information must be accurate)
  • x & y are both required at the time the function / equation is executed (information must be timely)
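The three conditions (complete, accurate, timely) can be expressed as guards wrapped around the information function. This is a sketch with hypothetical names, not a prescribed implementation:

```python
import time

def information_function(inputs, deadline, now=None):
    """Compute i' = x + y only when the information is complete,
    well-formed, and delivered before the deadline."""
    now = time.monotonic() if now is None else now
    if now > deadline:
        return None, "untimely: inputs arrived after the function completed"
    missing = {"x", "y"} - inputs.keys()
    if missing:
        return None, f"incomplete: missing {sorted(missing)}"
    if not all(isinstance(v, int) for v in inputs.values()):
        return None, "inaccurate: non-numeric input"
    return inputs["x"] + inputs["y"], "ok"

print(information_function({"x": 3, "y": 1}, deadline=float("inf")))  # (4, 'ok')
print(information_function({"x": 3}, deadline=float("inf")))          # incomplete
```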

However, information is subject to communication delay, corruption, and loss.

Information is certain or uncertain

Dimensions of certainty and uncertainty:

  • (in)sufficient
  • (un)clear
  • (in)accurate

Uncertainty is reduced through information that is sufficient, clear, and accurate.


Information is sufficient when it enables an information function to complete with certainty and accuracy. There are multiple reasons information may be insufficient:

  • untimely – the information did not arrive at the information function before the function completed.
  • incomplete – information was incomplete at the source and/or was filtered from the information function, for example, by administrative policy.


In some ways, the opposite of too little information is too much information.

  • noise – information that is irrelevant to the function, lengthening the function’s run time as it searches for the necessary information.
  • redundant – information that repeats what has already been delivered, causing the same information to be processed unnecessarily.
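Both problems can be addressed by filtering before processing. A sketch with hypothetical message shapes:

```python
# Strip noise (fields the function does not need) and redundancy
# (duplicate messages) before they reach the information function.
relevant_keys = {"x", "y"}  # what the function actually needs

messages = [
    {"x": 3, "weather": "sunny"},  # "weather" is noise for this function
    {"y": 1},
    {"y": 1},                      # redundant repeat of the same message
]

seen = []
clean = {}
for msg in messages:
    if msg in seen:
        continue                   # drop redundant information
    seen.append(msg)
    clean.update({k: v for k, v in msg.items() if k in relevant_keys})

print(clean)  # {'x': 3, 'y': 1} -- noise and redundancy removed
```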


There are any number of ways in which information could be inaccurate.

  • The information originally contained false statements: 1+1=3.
  • The information could have initially been accurate, but errors were introduced during transfer, storage, or computation.
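The second case, errors introduced in transit or storage, is commonly detected by carrying a checksum alongside the data. A minimal sketch using CRC-32:

```python
import zlib

# Sender computes a CRC-32 checksum over the payload and transmits both.
payload = b"x=3;y=1"
checksum = zlib.crc32(payload)

# Suppose a bit flips while crossing a noisy channel.
corrupted = b"x=7;y=1"

print(zlib.crc32(payload) == checksum)    # True  -- intact copy verifies
print(zlib.crc32(corrupted) == checksum)  # False -- corruption detected
```

A checksum detects corruption but cannot detect a falsehood that was present at the source; that requires validating the content itself.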

If information is sufficient, clear, and accurate, then information increases certainty.

Information does not reduce uncertainty when…


Insufficient: not complete, or not available for processing. Information can be unavailable for processing because it arrives late due to latency, or because it was blocked or filtered by policy.


Unclear: more information than is necessary, whether from redundancy, noise, or other causes, can complicate information processing and lead to suboptimal function processing times and information output.


Inaccurate: information that is corrupted during an information flow will produce suboptimal or incorrect information output – garbage in, garbage out. Information that was incorrect to begin with (a falsehood, for example 1 = 2) will also result in suboptimal or incorrect information output.

Basic Information goals / principles

  • Information is (in)sufficient, (un)clear, and (in)accurate
  • Latency should be low enough and bandwidth high enough to complete a function in time.
  • All of the information required should be delivered to a function.
  • Information should be necessary, containing neither noise nor redundancy.
  • Information should not contain false statements or have errors introduced into it.