

| Quantum mechanics |
|---|
| $\hat{H}\,\lvert\psi(t)\rangle = i\hbar\,\dfrac{\partial}{\partial t}\lvert\psi(t)\rangle$ |
In physics, the term observer effect refers to changes that the act of observation makes to the phenomenon being observed. This is often the result of instruments that, by necessity, alter the state of what they measure in some manner. A commonplace example is checking the pressure in an automobile tire; this is difficult to do without letting out some of the air, thus changing the pressure. This effect can be observed in many domains of physics and can often be reduced to insignificance by using better instruments or observation techniques.
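To make the tire example concrete, here is a toy sketch in Python. The volumes, pressures, and the simple isothermal Boyle's-law model are all illustrative assumptions, not measurements of any real gauge:

```python
# Toy model of the tire-pressure example above. All numbers are illustrative:
# the gauge reads the pressure by letting tire air expand into its own small
# dead volume (isothermal Boyle's law), so the act of reading lowers it.
TIRE_VOLUME_L = 10.0   # interior volume of the tire, litres
GAUGE_VOLUME_L = 0.05  # dead volume the gauge adds while attached

def measure(pressure_kpa):
    """Attach the gauge once; return (reading, pressure left in the tire)."""
    new_pressure = pressure_kpa * TIRE_VOLUME_L / (TIRE_VOLUME_L + GAUGE_VOLUME_L)
    return new_pressure, new_pressure

pressure = 220.0  # absolute pressure before anyone checks it, kPa
for check in range(1, 4):
    reading, pressure = measure(pressure)
    print(f"check {check}: gauge reads {reading:.2f} kPa")
# Every check disturbs the tire a little; shrinking GAUGE_VOLUME_L (a better
# instrument) shrinks the disturbance, as the paragraph above notes.
```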
In quantum mechanics, there is a common misconception (which has acquired a life of its own, giving rise to endless speculations) that it is the mind of a conscious observer that causes the observer effect in quantum processes. It is rooted in a basic misunderstanding of the meaning of the quantum wave function ψ and the quantum measurement process.[1][2]
According to standard quantum mechanics, however, it is a matter of complete indifference whether the experimenters stay around to watch their experiment, or leave the room and instead delegate the observing to an inanimate apparatus, which amplifies the microscopic events to macroscopic[3] measurements and records them by a time-irreversible process.[4] The measured state does not interfere with the states excluded by the measurement. As Richard Feynman put it: “Nature does not know what you are looking at, and she behaves the way she is going to behave whether you bother to take down the data or not.”[5]
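The same point can be illustrated with a minimal two-path interference model, a standard textbook construction rather than anything drawn from the sources cited above. In the sketch below, once an inanimate pointer becomes correlated with the path, the fringes vanish regardless of whether the record is ever read:

```python
import numpy as np

# Two-path ("double-slit") interference with and without an inanimate
# which-path record. The state vectors and the function name are mine;
# the construction itself is textbook-standard.
ket0 = np.array([1, 0], dtype=complex)
ket1 = np.array([0, 1], dtype=complex)
screen = (ket0 + ket1) / np.sqrt(2)  # arrival at one point on the screen

def screen_probability(phi, recorded):
    if not recorded:
        # Bare superposition of the two paths with relative phase phi.
        state = (ket0 + np.exp(1j * phi) * ket1) / np.sqrt(2)
        return abs(screen.conj() @ state) ** 2
    # An apparatus pointer entangles with the path: |0>|A0> + e^{i phi}|1>|A1>.
    state = (np.kron(ket0, ket0) + np.exp(1j * phi) * np.kron(ket1, ket1)) / np.sqrt(2)
    # Nobody reads the record: just sum |amplitude|^2 over both pointer states.
    return sum(abs(np.kron(screen, a).conj() @ state) ** 2 for a in (ket0, ket1))

for phi in np.linspace(0, 2 * np.pi, 5):
    print(f"phi={phi:4.2f}  unrecorded: {screen_probability(phi, False):.3f}"
          f"  recorded: {screen_probability(phi, True):.3f}")
# Unrecorded: fringes, P = (1 + cos phi)/2. Recorded: flat 0.5 for every phi,
# whether or not any conscious observer ever looks at the pointer.
```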
Historically, the observer effect has also been confused with the uncertainty principle.[6][7]

Ooh – dabbling in physics :} I’m glad your quotes have pointed out some of the nonsense that is often attached to quantum mechanics. Yes, things do get a bit fuzzy when we probe very closely, but I, personally, do not have a problem with that. We actually have rather hazy ideas of how things move. Newton played a clever trick when he came up with his idea of fluxions (we know it as calculus), where he represented motion as a series of tiny steps, like frames of a cine film; there is a small numerical sketch of the idea below. He assumed that the steps could be made smaller and smaller ad infinitum, but evidence today indicates this is not the case. Should we really be so surprised?
To be blunt, no. :)
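Newton’s tiny steps can be played out numerically. The sketch below is purely illustrative (the falling body x(t) = 4.9t², the step sizes, and the comparison against its known derivative, 9.8 m/s at t = 1 s, are all my choices): the finite-difference estimate of the speed converges as the step h shrinks, until floating-point granularity ruins the subtraction. That limit is an arithmetic echo of the point above, not evidence about physical space.

```python
# Newton's "cine film": estimate the speed of a falling body x(t) = 4.9*t**2
# at t = 1 s from the average speed over one ever-smaller step h.
# The example function and step sizes are illustrative choices.
def x(t):
    return 4.9 * t * t

T, TRUE_SPEED = 1.0, 9.8  # derivative of 4.9*t**2 at t = 1
for h in (1.0, 0.1, 0.01, 1e-4, 1e-8, 1e-12):
    estimate = (x(T + h) - x(T)) / h  # one tiny "frame" of motion
    print(f"h = {h:6.0e}: estimate = {estimate:.10f}, "
          f"error = {abs(estimate - TRUE_SPEED):.1e}")
# The estimates close in on 9.8 as h shrinks (Newton's limit) until
# floating-point granularity ruins the subtraction: even in arithmetic,
# the steps cannot usefully be made smaller ad infinitum.
```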
Achilles and the slug