We don’t know everything. This might come as a shock to some; to others it’s understood. We may never know “everything.” It may not even be possible, just as it is not possible to prove all true statements. Kurt Gödel was the 20th-century mathematician who became famous for proving this: his incompleteness theorems show that any consistent formal system rich enough for arithmetic contains true statements it cannot prove. He was also a friend of Albert Einstein, who said something to the effect that not everything that can be counted counts, and not everything that counts can be counted.
Stephen Hawking, author of the seminal “A Brief History of Time,” puts it thus: “We can’t even solve exactly the motion of three bodies in a theory as simple as Newton’s theory of gravity, and the difficulty increases with the number of bodies and the complexity of the theory.”
So, the idea that we have an exact handle on something as “simple” as audio is… ummmm, silly?
Let’s try on a quote from the lead analog designer of a major manufacturer of professional recording equipment (Apogee Electronics), Lucas van der Mee: “I have learned there are a lot of things we don’t know. I have found myself, many times, saying that certain experiences cannot happen because I could not explain them from a technical point of view. The first responses during the development of the Big Ben master clock (used in recording studios/mve) come to mind. The reports I received of an improved soundstage (when the Big Ben was substituted for a studio’s previous master clock/mve) sounded like nonsense to me until I heard them myself. And it was not subtle. I had to go back and find out why this happened instead of trying to deny it. That is a bit of a problem that comes with the job. Technical engineers have to trust in logic and reason, which can make them a bit stubborn, hard to convince – in some cases they start to believe they are godlike in a sense, they think they know everything. Thanks to my hands-on experience, I know that there is more than just logic. Our ears do not lie – our perception is actually really good. The things an experienced audio engineer can hear can never be discounted, even if it is not explainable. The task for a technical engineer is to find out why it is perceived and what are the main contributors.”
Not only do we not know everything, but sometimes we don’t even know what we think we know; i.e., common knowledge may be common, but it is often incorrect. For instance, I was reading a book the other day, “The Structure of Scientific Revolutions,” by T. S. Kuhn. I had read it before, but when I got to one particular page (p. 63 in this edition) and read one particular passage again, it made me stop and re-read it… then it hit me like a bombshell. When I read it for the first time some years ago, I had thought I understood its meaning, but I hadn’t. I had read it under the spell of some ‘common knowledge’ which was in fact a perceptual bias (“prior” assumptions), and that bias had led me to an incorrect understanding of the scientific basis of perceptual bias itself.
What this particular section of T. S. Kuhn’s book said was that psychology states that if you are used to a given result from a known framework, that’s what you will tend to perceive in future experiences with that framework. (J.S. Bruner and Leo Postman, “On the Perception of Incongruity: A Paradigm,” Journal of Personality, XVIII (1949), 206-23.)
This pioneering work on perception uses, as an example of a ‘framework,’ a common deck of playing cards. However, it is a deck with a few extra “jokers”: cards with reversed colors, red for black or vice versa. A test was set up to see how much visual exposure time was necessary for correct identification of the cards. The cards were mounted so they could be flashed one at a time for a viewer in timed exposures. As long as the viewing times were just long enough for most of the cards to be correctly identified, there was no problem. As the viewing times became longer, the “incongruities” in the deck started troubling the viewers.
After a slightly longer viewing time was reached, most viewers caught on that, say, the 4 of spades was red and not black as it was supposed to be. They quickly picked up on the other anomalous cards and had no further problems. However, there were a few viewers who couldn’t accept that some of the cards were colored wrong, even when the viewing time was 40 times the normal minimum exposure time for card identification. A few viewers were actually physically affected by the incongruities and experienced distress, some even nausea. They literally could not accept that the cards were colored wrong. It wasn’t normal, and it certainly wasn’t something they’d had previous experience with.
Doctors. Everyone has experienced them. When you’re sick, you often go to see one. After all, everyone knows they are the authority on getting well. You go to the doctor so that they can make you feel better. Everyone knows this, and has pretty much known it for their entire lives. This is what is known in Bayesian psychology as a “prior.” A prior is an individual’s assumption about how the world works, a hypothesis about reality. In the case of “doctors,” we know that if we go to see a doctor, the result is we feel better. We ‘perceive the world in a better light.’ This is what we expect. This is normal. This is the in-the-box kind of thinking that we are programmed with because it is part of the everyday world around us. This is an almost universal prior here in our country.
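The Bayesian notion of a “prior” can be made concrete with a little arithmetic. Here is a minimal sketch in Python, using made-up illustrative numbers (not data from any actual trial), of how Bayes’ rule combines a prior belief with new evidence. A patient with a strong prior that “the doctor’s medicine works” ends up near certainty after feeling better; a skeptic with a weak prior, given the exact same evidence, does not.

```python
# Minimal sketch of Bayes' rule with hypothetical numbers.
# Hypothesis H: "the treatment works." Evidence E: "I feel better."
# The "prior" is the probability assigned to H before seeing E.

def posterior(prior, p_e_given_h, p_e_given_not_h):
    """Return P(H | E) via Bayes' rule."""
    p_e = prior * p_e_given_h + (1 - prior) * p_e_given_not_h
    return prior * p_e_given_h / p_e

# A lifelong patient: strong prior that doctors' medicine works.
strong = posterior(prior=0.95, p_e_given_h=0.9, p_e_given_not_h=0.3)
# A skeptic: weak prior, but identical evidence.
weak = posterior(prior=0.20, p_e_given_h=0.9, p_e_given_not_h=0.3)

print(round(strong, 3), round(weak, 3))  # prints: 0.983 0.429
```

The same experience (“I feel better”) pushes the strong-prior believer to near certainty while leaving the skeptic unconvinced, which is the mathematical shape of the in-the-box thinking the text describes.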
Now say once again we feel poorly and go to the doctor’s to get prescribed some medicine. Medicine is supposed to make us feel better, isn’t it? That’s normal, an everyday kind of thing. So is thinking that we are getting better; after all, we should be, because we went to the doctor, got a prescription, and took the medicine. This is normal; this is thinking “in-the-box.” You’ve been there and done that many times, and this is the expected result. Is this “wishful thinking?” Kind of, but not really. You’ve experienced this many times before, and so have all of your friends and acquaintances. It is in perfect alignment with your past experiences. It “should” be, and of course, it “is.” It’s part of your normal world. It is also a ‘perceptual bias’ resulting from a well established ‘prior.’
Say that the next time we go to a doctor we get to become part of a clinical trial. After all, that means free medicine! The doctor explains that we may get a placebo, but we don’t feel well and doctors are supposed to make you feel better, so what the doctor gives you must be the real thing. And guess what, we do feel better! This is our prior. This is normal. This is the in-the-box kind of thinking that we are programmed with because it’s part of the everyday world around us… even if what we’ve been taking is a sugar pill, a placebo. The placebo effect, when and if it actually occurs, is a case of thinking in-the-box, getting the expected results from a known framework, i.e. [doctor + medicine = feeling better].
Now, of course, the “normal” interpretation of this clinical trial scenario would be that the placebo effect is to blame because a known object (the sugar pill) had a non-normal effect. However, the correct interpretation of the patient’s response from the clinical trial should be from the patient’s frame of reference, NOT from the point of view of the efficacy of the medicine. The stimulus is the world view of which the actual medicine is just a part.
A known authority figure has interacted with the patient in a known framework. The prior in this case is then: ‘I took the doctor’s medicine and now I will feel better.’ This is normal and fits the patient’s long established world view. The world view is the operative prior. It is the result of thinking in-the-box. A normal object should, and therefore does have a normal effect. You can tell a person that “x” does not equal “y.” But if they’ve always experienced that “x” does equal “y,” as we’ve seen, they sometimes cannot believe differently, even when confronted with information to the contrary.
Several centuries ago, when the Europeans first came upon the Polynesian islands, the islanders saw the sailing ships’ sails slowly rising over the horizon. It didn’t take long before the ships’ sails and masts loomed much larger than anything the islanders had ever seen before. Therefore “it” (whatever it was) wasn’t real, and they “couldn’t” see the ship. It was not normal. It was too big to be anything from their world, so if it wasn’t part of their world, and it clearly wasn’t, obviously it was not there. As the ship came more into view, it got even bigger, and so was even more unbelievable, and even more not-there. It wasn’t until the European sailors landed and interacted with the islanders that their ship became believable.
“It’s not in my text book so therefore it isn’t real.” …sound familiar?
Let’s illustrate ‘prior’ in an audio context. If a buddy comes over to your house and you are for the first time confronted with the concept that a power cord is now an object with a sound of its own– this is not normal. Normal is: power cords “do” power, not “tone.” It is not a case of been there and done that–you aren’t anywhere near “that.” [use one power cord per component + put extras away = normal]
So, you sit down and actually hear a difference between power cords… Besides “Aaaargh, another damn variable!,” you might be feeling that the placebo effect could be playing games with you. OK, let’s see. The placebo effect has to do with getting an expected result from a known framework (see five paragraphs up). So, was this a known framework? Yes, you know things have to have power, and so things need power cords. Was this an expected result? No. You had no prior personal knowledge that individual power cords have a sound of their own.
Was this then the result of the placebo effect? No, it couldn’t have been. A non-normal effect from a normal event would not be considered normal. Therefore the placebo effect cannot be responsible, because we now know that the placebo effect is the result of expecting what you’ve gotten before – in-the-box thinking — nothing new. And for this person in this instance with the power cords, it was new, and so it wasn’t normal, and so hearing a difference wasn’t the result of the placebo effect.
When the “normal” thing isn’t normal, and you still get the “normal” result, this is what’s known as a case of perceptual bias. Perceptual bias is another term for ‘seeing/hearing/feeling (perceiving) what you expect.’ However, “what you expect” is intimately connected with a person’s previous experiences.
Unfortunately, it has been a ‘prior’ that the placebo effect had to do with “wishful thinking”: making something up, a distortion of the truth. It is, however, this viewpoint itself that is actually the distortion of the truth.
If you are exposed to something “new,” you will have a tendency to NOT hear it if you are of the mind that this new thing cannot exist. It is out of your norm and you will tend to ignore it even when it does exist. This behavior is normal and to be expected. Skeptics will obviously be more prone to this than others.
In the same way, if it is out of your norm and you do hear it, it is not because of the placebo effect or from a listening bias. This is what the science behind the understanding of perceptual bias says–it is not the other way around.
Prior assumptions can be rooted in truth or fiction, or both at the same time. Whatever your particular priors may happen to be, the truth is that we seek to understand our lives in terms of the things, persons, places, and experiences we’ve already had in our lives. This is normal and to be expected. However, it is how you deal with new information that is the key to your personal growth.
‘If you’re not learning, you’re dying.’ I read this somewhere–I believe it’s true.
anomalous:
1. deviating from or inconsistent with the common order, form, or rule; irregular; abnormal: Advanced forms of life may be anomalous in the universe.
2. not fitting into a common or familiar type, classification, or pattern; unusual: He held an anomalous position in the art world.
3. incongruous or inconsistent.
4. Gram. irregular.
Random House Unabridged Dictionary, © 1997 by Random House, Inc.
assumption: something taken for granted; a supposition: a correct assumption.
Random House Unabridged Dictionary, © 1997 by Random House, Inc.
authority figure: a person whose real or apparent authority over others inspires or demands obedience and emulation: Parents, teachers, and police officers are traditional authority figures for children.
Random House Unabridged Dictionary, © 1997 by Random House, Inc.
incongruous:
1. Lacking in harmony, compatibility, or appropriateness.
2. Inconsistent with reason, logic, or common sense.
bias: A preference or an inclination, especially one that inhibits impartial judgment.
From Perceptual Bias and Dualistic Illusions, William G. Merriman.
placebo: An inert substance given in a clinical trial instead of actual medicine.
prior: In Bayesian psychology, a prior is an individual’s assumption about how the world works, a hypothesis about reality.
From The Economist, Jan 5th 2006, Psychology: Bayes Rules.
Stephen Hawking’s quote is from A Briefer History of Time, p. 137.
Lucas van der Mee’s quote is from Tape Op #51, p. 62.
Copyright 2006 Mike Vans Evers
Stereo Times Masthead