If everyone knows that a tenth of the population dishonestly claims to observe alien spaceships, this can make it very hard for an honest alien-spaceship observer to communicate the fact that she has actually seen one.
Old saying: "Remember that one explanation for an unlikely natural occurrence is 'All of the witnesses are lying'."
I bet it was paper mache, not a paper machete.
I don't usually nitpick typos, but considering the subject of this discussion.....
I don't know that your logic is airtight, as each added assumption increases the chance of an untrue or even illogical assertion. A mistaken belief is more likely than a lie: though there are liars, there are even more people who may hold mistaken beliefs. As for the assertion in Robin's paper that the testimony of a high-status or reliable individual is extraordinary evidence, I submit the following counterexample.
There was a rash of UFO sightings which later proved to be partly due to photographs of a paper machete model found in a man's attic. A very well-respected man and his wife, whom I personally know, stated that they had witnessed a UFO in the swamp which had then taken off. He did not go about talking much about this, but did go on national TV. His reputation was not ruined, nor did his account convince anyone but UFO enthusiasts. He did not press the matter, and later told me it must have been some sort of secret military craft; there is a nearby airbase doing research. The fact is that if the conventional paradigm is violated, you will try to find a way around it, not believe it. That depends on how closely wedded you are to the opinion you hold.
This is not "noise" but "clutter" or "interference" -- real signals that you are not interested in. One way to see the difference is that averaging will not help you. People don't flip back and forth between false alien claims and lack of claims. The same people are consistently making the false claims.
This suggests the title "Can a tiny bit of interference destroy communication?", for which the answer is definitely yes (depending on what you mean by "tiny bit").
Human relationships form a "small world" network (the Facebook friendship graph, for instance, has an average of about four degrees of separation). This means that salient information can propagate to pretty much everybody in the world passing only through a very small number of intermediaries. Thus, unless noise per intermediary is high, it is usually not an issue.
Moreover, human social communication is generally very redundant: there are many short paths between any two nodes in the network. Hence, liars who introduce false claims while pretending to relay claims made by others are usually easy to detect.
Regarding the value of making a claim: if you make a notable claim that has a low prior probability (e.g. some unexpected scientifically observable phenomenon) AND you can back it up with solid evidence, you will gain a large reward. On the other hand, if you can't back it up with solid evidence (e.g. the typical UFO claim), you will usually not be believed by most people and will incur a social cost (unless your claim triggers widespread cognitive biases in a way that is salient to most people).
I think the key point here is the claim's prior probability, not the communication noise: consider the claim "I saw a giant marshmallow in the sky". If you made that claim, most people would be less inclined to believe you than if you claimed "I saw an alien spaceship in the sky", because the prior probability of the former is even lower than that of the latter, even though nobody is known to falsely claim the former while lots of people are known to falsely claim the latter.
This problem is worse when it comes to real-life conspiracies, where a few people who know what's actually going on have an explicit interest in polluting public discourse. With alien spaceships, noise generally comes from epistemologically incompetent sources, so just by being rational and articulate, you are likely to avoid being drowned out; it's the (other) practical jokers who reduce your credibility there.
It seems to me that if the distribution of costs is known, then the actors can mostly work out the true value from the distorted value. Suppose 60% of the population sees A, but only 20% says A. Everyone hears 20% saying A, and should be able to work back to the fact that (in expectation) 60% saw A, because they know about this effect.
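A minimal sketch of that inversion, under made-up assumptions: suppose each person who sees A independently speaks up with a known probability (here 1/3, chosen so that 60% seeing yields roughly 20% saying). Listeners who know that probability can divide it back out:

```python
import random

random.seed(0)

N = 100_000        # hypothetical population size (assumption)
TRUE_SEEN = 0.60   # fraction who actually see A (assumption)
SPEAK_PROB = 1 / 3 # known probability that an observer pays the cost and reports

# Simulate: each person sees A with probability TRUE_SEEN,
# and reports it only with probability SPEAK_PROB.
reports = sum(
    1 for _ in range(N)
    if random.random() < TRUE_SEEN and random.random() < SPEAK_PROB
)

reported_rate = reports / N                 # heard as roughly "20% say A"
inferred_seen = reported_rate / SPEAK_PROB  # invert the known distortion

print(f"reported: {reported_rate:.3f}, inferred seen: {inferred_seen:.3f}")
```

The inferred value lands near the true 60% because the distortion (the speaking probability) is common knowledge and can simply be divided out.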
I wonder if there are cost distributions where the original level is not identifiable: where 40% seeing A and 60% seeing A both result in 20% saying A. It seems like that will only be true for unlikely distributions, and then only for part of those distributions.

Another issue here is that the decision problem doesn't just depend on the cost; it also depends on the benefit of others knowing the truth *and* the probability that your message will change their beliefs. If I have a high cost for saying "A," such that I'd like to switch to "B," I need to model how switching my statement will affect the beliefs of others. If the distributions are such that I could reasonably expect this feedback, then I should take it into consideration when making my decision.
Supposing that people care only about their personal message, and that they have some costs associated with that message, means that each person will have a different probability threshold, and will not care about how their message affects others. Then each distribution over thresholds and probability of seeing A should have an equilibrium point where no one changes their decision. (Hm, only one equilibrium? Maybe multiple, which complicates things even further. It looks like we can reduce it to one if we assume a particular updating strategy.)
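One way to sketch that equilibrium, under an entirely made-up threshold distribution: say 60% of people see A; 20% of observers always speak, and the rest speak only when the share already speaking exceeds their personal threshold (uniform on [0, 1]). The stable reporting rate is then a fixed point of s → p · F(s), where F is the threshold CDF, which simple iteration finds:

```python
P_SEEN = 0.6  # hypothetical fraction who actually see A (assumption)

def threshold_cdf(s: float) -> float:
    """Fraction of observers willing to speak when a share s already speaks.
    Assumed form: 20% always speak; the rest have uniform thresholds on [0, 1]."""
    return min(1.0, 0.2 + 0.8 * s)

def equilibrium(p_seen: float, iters: int = 200) -> float:
    """Fixed-point iteration on s -> p_seen * F(s); converges because the
    map is a contraction (slope 0.8 * p_seen < 1)."""
    s = 0.0
    for _ in range(iters):
        s = p_seen * threshold_cdf(s)
    return s

s_star = equilibrium(P_SEEN)
print(f"equilibrium share saying A: {s_star:.4f}")
```

With this particular CDF the map is linear, so the fixed point is unique and has the closed form 0.2·p / (1 − 0.8·p) ≈ 0.231 for p = 0.6; a nonlinear CDF could instead produce the multiple equilibria worried about above.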