1 Nicolaas Vroom | Is the 'uncertainty principle' a law of physics? | Wednesday 1 March 2017 |
2 Douglas Eagleson | Re: Is the 'uncertainty principle' a law of physics? | Thursday 2 March 2017 |
3 richard miller | Re: Is the 'uncertainty principle' a law of physics? | Friday 3 March 2017 |
4 John Heath | Re: Is the 'uncertainty principle' a law of physics? | Saturday 4 March 2017 |
5 Lawrence Crowell | Re: Is the 'uncertainty principle' a law of physics? | Sunday 5 March 2017 |
6 JohnF | Re: Is the 'uncertainty principle' a law of physics? | Sunday 5 March 2017 |
7 richard miller | Re: Is the 'uncertainty principle' a law of physics? | Sunday 5 March 2017 |
8 Nicolaas Vroom | Re: Is the 'uncertainty principle' a law of physics? | Saturday 11 March 2017 |
9 Nicolaas Vroom | Re: Is the 'uncertainty principle' a law of physics? | Saturday 11 March 2017 |
10 Roland Franzius | Re: Is the 'uncertainty principle' a law of physics? | Monday 13 March 2017 |
11 Kevin Aylward | Re: Is the 'uncertainty principle' a law of physics? | Monday 13 March 2017 |
12 Jos Bergervoet | Re: Is the 'uncertainty principle' a law of physics? | Tuesday 14 March 2017 |
13 Nicolaas Vroom | Re: Is the 'uncertainty principle' a law of physics? | Tuesday 14 March 2017 |
14 jc_lavau | Re: Is the 'uncertainty principle' a law of physics? | Tuesday 21 March 2017 |
The uncertainty principle is based on Heisenberg's microscope. See: https://en.wikipedia.org/wiki/Heisenberg's_microscope "Then, according to the laws of classical optics, the microscope can only resolve the position of the electron up to an accuracy of: delta(x) = lambda / sin (epsilon)" delta(x) is small for small lambda, i.e. for high frequencies. Only this equation can you call a "law of physics", because it describes the observations of an experiment using a photon and one electron. This is not the case for the uncertainty principle as a whole.
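The quoted resolution formula is easy to evaluate numerically. A minimal sketch (the wavelengths and the aperture half-angle below are illustrative values of my own, not taken from the article):

```python
import math

def resolution_limit(wavelength_m, half_angle_rad):
    """Classical-optics resolution of Heisenberg's microscope:
    delta(x) = lambda / sin(epsilon)."""
    return wavelength_m / math.sin(half_angle_rad)

# Illustrative numbers: green light (550 nm) vs a 1 pm gamma ray,
# both viewed through a 30-degree aperture half-angle.
dx_green = resolution_limit(550e-9, math.radians(30))  # ~1.1 micrometres
dx_gamma = resolution_limit(1e-12, math.radians(30))   # ~2 picometres
print(dx_green, dx_gamma)
```

Shorter wavelengths sharpen delta(x), but in Heisenberg's argument the more energetic photon also disturbs the electron's momentum more, which is where the trade-off enters.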
The wave-particle duality is discussed in: https://en.wikipedia.org/wiki/Wave%E2%80%93particle_duality In par 2.5 de Broglie's wavelength is discussed. In that paragraph two experiments are mentioned that confirm de Broglie's formula. This qualifies de Broglie's formula as a physical law. In par 2.6 Heisenberg's uncertainty principle is discussed. No direct experiment is mentioned to confirm the particle-wave duality. This indicates that the uncertainty principle has "nothing" to do with the particle-wave duality.
The photon is discussed in: https://en.wikipedia.org/wiki/Photon In par 6 "Wave-particle duality and uncertainty principles" is discussed: "In that case, since the wavelength and intensity of light can be varied independently, one could simultaneously determine the position and momentum to arbitrarily high accuracy, violating the uncertainty principle." Such a sentence requires confirmation by experiment, which is not mentioned. How can you influence the intensity when only one photon is involved?
Hawking radiation is discussed in: https://en.wikipedia.org/wiki/Hawking_radiation "Hawking's work followed etc where etc showed him that, according to the quantum mechanical uncertainty principle, rotating black holes should create and emit particles." IMO you cannot use the "uncertainty principle" to explain this phenomenon.
The EPR paradox is discussed in: https://en.wikipedia.org/wiki/EPR_paradox "The essence of the paradox is that particles can interact in such a way that it is possible to measure both their position and their momentum more accurately than Heisenberg's uncertainty principle allows, unless measuring" etc. IMO the uncertainty principle cannot be used to resolve the EPR paradox, specifically because entangled particles are involved.
Free will is discussed in: https://en.wikipedia.org/wiki/Free_will In par 1.1 incompatibilism we can read: "Physical determinism is currently disputed by prominent interpretations of quantum mechanics, and while not necessarily representative of intrinsic indeterminism in nature, fundamental limits of precision in measurement are inherent in the uncertainty principle." The uncertainty principle has nothing to do with free will. IMO the same holds for quantum physics in a broader sense.
The Schrödinger equation is discussed in: https://en.wikipedia.org/wiki/Schrödinger_equation In par 3.3 "Measurement and uncertainty" the Heisenberg uncertainty principle is mentioned: "In classical mechanics, a particle has, at every moment, an exact position and an exact momentum. These values change deterministically as the particle moves according to Newton's laws. Under the Copenhagen interpretation of quantum mechanics, particles do not have exactly determined properties, and when they are measured, the result is randomly drawn from a probability distribution." The issue is (and this has nothing to do with classical mechanics, nor with the Copenhagen interpretation, nor with Newton's laws, nor with the uncertainty principle) that a particle at any instant has an exact position but simply cannot be measured (with that accuracy) at that specific instant. The consequence is that it is not possible to demonstrate whether the trajectories of such particles are in accordance with Newton's laws.
See also: https://en.wikipedia.org/wiki/Uncertainty_principle#Heisenberg.27s_microscope https://www.sciencedaily.com/releases/2012/09/120907125154.htm
Nicolaas Vroom
> | The uncertainty principle is based on Heisenberg's microscope. See: https://en.wikipedia.org/wiki/Heisenberg's_microscope "Then, according to the laws of classical optics, the microscope can only resolve the position of the electron up to an accuracy of: delta(x) = lambda / sin (epsilon)" delta(x) is small for small lambda, i.e. for high frequencies. Only this equation can you call a "law of physics", because it describes the observations of an experiment using a photon and one electron. This is not the case for the uncertainty principle as a whole. |
This is just a perspective comment. I remember reading Bohr's version of the uncertainty principle; he had his own microscope study. I am not in a position to track down Bohr's works right now, but will comment.
The microscope was used as cause and effect in abstraction. It was the equivalent of a cause of probability, as with the diffraction grating. Inverting the paths of the grating's effect on light, the focus of the image appears.
It solved the dilemma of sight: if diffraction destroys focus, how does uncertainty allow focus in the abstract?
So Bohr examined the dilemma and showed the meaning of the pinhole camera. Why does it both focus and diffract?
> |
IMO the uncertainty principle is not a law of physics or science.
As such it cannot be used as a concept to describe physical reality,
nor as an intrinsic part of any other physical law.
For an explanation see: https://en.wikipedia.org/wiki/Physical_law
"Physical laws are typically conclusions based on repeated scientific
experiments and observations over many years" etc. The "Uncertainty Principle" IMO does not fall into this category. It captures the human limitations of making observations. It captures the fact that the laws of nature are fraught with uncertainty. The uncertainty principle is based on Heisenberg's microscope. See: https://en.wikipedia.org/wiki/Heisenberg's_microscope "Then, according to the laws of classical optics, the microscope can only resolve the position of the electron up to an accuracy of: delta(x) = lambda / sin (epsilon)" delta(x) is small for small lambda, i.e. for high frequencies. Only this equation can you call a "law of physics", because it describes the observations of an experiment using a photon and one electron. This is not the case for the uncertainty principle as a whole. |
[Moderator's note: Quoted text trimmed. -P.H.]
From a purely mathematical perspective, this is surely a time-frequency domain duality thing? A simple consequence of Fourier transforms (when just considering time and frequency). The concept becomes especially strong with a wave interpretation of matter. As an adherent to a mathematical-only description of physics (nature is mathematics, there is no real physics), to me that is enough. Do we really need to have an intuitive feel/law and dig deeper for an answer? Whilst a lot of physicists are uncomfortable with a cold, mathematical-only interpretation, there are also probably a lot of chemists uncomfortable with a physics-only (quantum-mechanical) theory of their subject. And a lot of biologists who are uncomfortable with both.
Richard Miller
> | On Wednesday, 1 March 2017 21:40:37 UTC, Nicolaas Vroom wrote: |
>> | IMO the uncertainty principle is not a law of physics or science. As such it cannot be used as a concept to describe physical reality, nor as an intrinsic part of any other physical law. For an explanation see: https://en.wikipedia.org/wiki/Physical_law "Physical laws are typically conclusions based on repeated scientific |
https://www.scientificamerican.com/article/common-interpretation-of-heisenbergs-uncertainty-principle-is-proven-false/
Maybe a tiny URL.
On Wednesday, March 1, 2017 at 3:40:37 PM UTC-6, Nicolaas Vroom wrote:
> | IMO the uncertainty principle is not a law of physics or science. As such it cannot be used as a concept to describe physical reality, nor as an intrinsic part of any other physical law. |
The uncertainty principle can be derived from the commutator [x, p] = iħ. One evaluates Tr([x, p]^2), and with completeness sums one finds the Heisenberg uncertainty principle. One can also see it as a manifestation of wave mechanics: classical EM waves obey the relationship Δω·Δt = 1, and for quantum mechanics E = ħω extends this wave result into the Heisenberg uncertainty principle.
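A quick numerical sanity check of this operator picture (my own sketch, not from the post): discretize x on a grid, build p = -iħ d/dx by finite differences, and evaluate the product of standard deviations σx·σp for a Gaussian state, which should come out at the minimum ħ/2. Grid size and width are arbitrary choices.

```python
import numpy as np

hbar = 1.0
N, L = 2000, 40.0
x = np.linspace(-L / 2, L / 2, N)
dx = x[1] - x[0]

# Gaussian wavefunction of width sigma, normalized on the grid.
sigma = 1.3
psi = np.exp(-x**2 / (4 * sigma**2)).astype(complex)
psi /= np.sqrt(np.sum(np.abs(psi)**2) * dx)

def expval(op_psi):
    """<psi| O |psi> on the grid, for a precomputed O|psi>."""
    return np.real(np.sum(np.conj(psi) * op_psi) * dx)

mean_x = expval(x * psi)
var_x = expval(x**2 * psi) - mean_x**2

# p|psi> = -i*hbar*dpsi/dx, p^2|psi> = -hbar^2 d2psi/dx2 (central differences).
dpsi = np.gradient(psi, dx)
mean_p = expval(-1j * hbar * dpsi)
d2psi = np.gradient(dpsi, dx)
var_p = expval(-hbar**2 * d2psi) - mean_p**2

product = np.sqrt(var_x * var_p)
print(product)  # ~= hbar/2 = 0.5: a Gaussian saturates the bound
```

Any other normalizable state on the same grid gives a product strictly above ħ/2, which is the content of the Kennard/Robertson inequality.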
LC
> | On Wednesday, 1 March 2017 21:40:37 UTC, Nicolaas Vroom wrote: |
>> |
IMO the uncertainty principle is not a law of physics or science.
As such it cannot be used as a concept to describe physical reality,
nor as an intrinsic part of any other physical law.
For an explanation see: https://en.wikipedia.org/wiki/Physical_law
"Physical laws are typically conclusions based on repeated scientific
experiments and observations over many years" etc.
The "Uncertainty Principle" IMO does not fall into this category.
It captures the human limitations of making observations.
It captures the fact that the laws of nature are fraught with uncertainty.
The uncertainty principle is based on Heisenberg's microscope. See: https://en.wikipedia.org/wiki/Heisenberg's_microscope "Then, according to the laws of classical optics, the microscope can only resolve the position of the electron up to an accuracy of: delta(x) = lambda / sin (epsilon)" delta(x) is small for small lambda, i.e. for high frequencies. Only this equation can you call a "law of physics", because it describes the observations of an experiment using a photon and one electron. This is not the case for the uncertainty principle as a whole. |
> |
[Moderator's note: Quoted text trimmed. -P.H.] From a purely mathematical perspective, this is surely a time-frequency domain duality thing? A simple consequence of Fourier transforms (when just considering time and frequency). The concept becomes especially strong with a wave interpretation of matter. As an adherent to a mathematical-only description of physics (nature is mathematics, there is no real physics), to me that is enough. Do we really need to have an intuitive feel/law and dig deeper for an answer? Whilst a lot of physicists are uncomfortable with a cold, mathematical-only interpretation, there are also probably a lot of chemists uncomfortable with a physics-only (quantum-mechanical) theory of their subject. And a lot of biologists who are uncomfortable with both. Richard Miller |
"Time-frequency duality... consequence of Fourier transforms" is >>exactly right<<. Uncertainty has >>nothing whatsoever<< to do with "human limitations of making observations" as suggested by the OP. The best intuitive analogy I've come across to clarify this in the time-frequency domain is the classical example of musical notes. Suppose you want to know a note's frequency and the exact time it played. To know its frequency, you have to accumulate enough amplitude/time samples to determine/reconstruct its waveform. But once you've done that, you can't talk about an "exact time", because your samples necessarily span a finite time interval. Alternatively, your ear can't determine, and you can't talk about, the frequency of an instantaneous finger-snap sound, which necessarily sounds like a white-noise superposition of many different sinusoidal frequencies (in particular, the frequency/amplitude Fourier decomposition of a delta function in amplitude/time space). So this >>is not<< an issue about "human limitations". It's about the >>very meaning<< (in this case, conjugate variables) of the observables you're trying to measure. -- John Forkosh ( mailto: j...@f.com where j=john and f=forkosh )
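The note analogy is easy to reproduce numerically. A sketch (the sample rate, tone frequency, and half-power width criterion are my own choices): the FFT peak of a 440 Hz tone is broad when only 10 ms of it is heard, and narrow when a full second is.

```python
import numpy as np

# A 440 Hz "note" sampled at 44.1 kHz, observed for two different durations.
fs = 44100.0
f0 = 440.0

def spectral_width(duration_s):
    """Width (Hz) of the FFT peak of a pure tone observed for duration_s,
    measured as the span of frequency bins above half the peak height."""
    n = int(fs * duration_s)
    t = np.arange(n) / fs
    sig = np.sin(2 * np.pi * f0 * t)
    spec = np.abs(np.fft.rfft(sig))
    freqs = np.fft.rfftfreq(n, 1 / fs)
    above = freqs[spec >= spec.max() / 2]
    return above.max() - above.min()

w_short = spectral_width(0.01)  # 10 ms "finger snap": broad peak
w_long = spectral_width(1.0)    # 1 s sustained note: essentially a spike
print(w_short, w_long)          # w_short >> w_long
```

The short observation smears the tone over many frequency bins; the long one pins the frequency down but, of course, no longer corresponds to any "exact time" of playing.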
On Friday, 3 March 2017 09:04:23 UTC, richard miller wrote:
> | On Wednesday, 1 March 2017 21:40:37 UTC, Nicolaas Vroom wrote: |
> > |
IMO the uncertainty principle is not a law of physics or science.
As such it cannot be used as a concept to describe physical reality,
nor as an intrinsic part of any other physical law.
For an explanation see: https://en.wikipedia.org/wiki/Physical_law
"Physical laws are typically conclusions based on repeated scientific
experiments and observations over many years" etc.
The "Uncertainty Principle" IMO does not fall into this category.
It captures the human limitations of making observations.
It captures the fact that the laws of nature are fraught with uncertainty.
The uncertainty principle is based on Heisenberg's microscope. See: https://en.wikipedia.org/wiki/Heisenberg's_microscope "Then, according to the laws of classical optics, the microscope can only resolve the position of the electron up to an accuracy of: delta(x) = lambda / sin (epsilon)" delta(x) is small for small lambda, i.e. for high frequencies. Only this equation can you call a "law of physics", because it describes the observations of an experiment using a photon and one electron. This is not the case for the uncertainty principle as a whole. |
> |
[Moderator's note: Quoted text trimmed. -P.H.] From a purely mathematical perspective, this is surely a time-frequency domain duality thing? A simple consequence of Fourier transforms (when just considering time and frequency). The concept becomes especially strong with a wave interpretation of matter. As an adherent to a mathematical-only description of physics (nature is mathematics, there is no real physics), to me that is enough. Do we really need to have an intuitive feel/law and dig deeper for an answer? Whilst a lot of physicists are uncomfortable with a cold, mathematical-only interpretation, there are also probably a lot of chemists uncomfortable with a physics-only (quantum-mechanical) theory of their subject. And a lot of biologists who are uncomfortable with both. Richard Miller |
I thought I'd put a bit more teeth on my reply and then look at some of the assumptions.
Firstly, just the mathematical basics of the Fourier derivation, which I appreciate may be a bit simplistic for the readers, but hopefully harmless.
Using the following definitions
v0 = wave frequency (Hz)
w0 = radial frequency (rad/s) = 2.pi.v0
A = arbitrary amplitude
then a wave, represented simply as a complex function of time f(t) (not spatial length) is given by
f(t) = A.exp(i.w0.t)
Note that I have kept the usual, additional spatial 'k.x' term out for simplicity i.e. f(t,x)=A.exp(i.[k.x-w0.t]) where k=wave number = 2pi/wave length).
The Fourier transform 'F(w)' for f(t), for a finite measuring time period T, is a sinc function centred about w0, i.e.
F(w) = sqrt(2/pi).sin[(w-w0).(T/2)]/(w-w0)
If you measure this wave for an infinite period of time, it has a Fourier transform F(w) which is a single delta function at the exact frequency w=w0, i.e. measuring for an infinite period of time will give you an exact answer, no uncertainty.
However, for any finite measuring time period T (effectively using a rectangular window function in the Fourier analysis), F(w) has a central peak whose width scales as 1/T, as follows from the locations of the zeros, given by
(w-w0).(T/2) = n.pi, for integer n, |n|>0
or in terms of frequencies 'v' (Hz) instead of radial 'w' (rad/s) then
(v - v0).T = n
In other words, if you measure a pure frequency for a finite time T, you will get a spread of frequencies (in the central peak and beyond), with the first zero given by n=1. That is, you will no longer measure an exact frequency but a range of frequencies, and therefore an uncertainty in the result, given by the half-width of the central peak, akin to the standard deviation used in other derivations of the uncertainty principle.
Denoting the spread in frequencies by dv, defined as
dv = v - v0
then (v - v0).T = n can be written as
dv.T = n
and, lastly, re-writing the finite time T as dT (not necessarily small), we get
dv.dT = n
So, going over to Physics, and assuming the Planck relation
E = h.v
then dv.dT is re-written in terms of energy dE (dv = dE/h) and time dT as
dE.dT = n.h
since |n|>0, the minimum uncertainty is thus 'h' here and, more generally, dE.dT>=h.
Using a spatial wave function, i.e. f(x)=A.exp(i.k.x), will give a similar result for the de Broglie form 'dx.dp', where momentum p=(h/2pi).k.
The exact form of the constant on the right (n.h here) seems to vary a lot in the texts, i.e. h, h/2, hbar, hbar/2 etc. Since its exact value is not central to the below points, I'll leave it for now. Readers are, of course, welcome to correct.
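The zero locations in the derivation above can be checked directly. A sketch (v0 and T are arbitrary illustrative values) evaluating the magnitude of the windowed transform, written per unit Hz as |sin(pi.dv.T)/(pi.dv)|, at the predicted first null dv = 1/T and at a point inside the main lobe:

```python
import numpy as np

# Verify dv.T = n: the transform of a tone observed for time T has
# zeros at frequency offsets dv = n/T (first zero at n = 1).
v0 = 100.0  # tone frequency, Hz (illustrative)
T = 0.5     # observation window, s (illustrative)

def window_spectrum(dv):
    """|integral over [-T/2, T/2] of exp(-i*2pi*dv*t) dt|
    = |sin(pi*dv*T) / (pi*dv)|, with the dv=0 limit equal to T."""
    if dv == 0:
        return T
    return abs(np.sin(np.pi * dv * T) / (np.pi * dv))

first_zero = window_spectrum(1 / T)    # ~0: first null exactly at dv = 1/T
half_way = window_spectrum(0.5 / T)    # nonzero, inside the main lobe
print(first_zero, half_way)
```

The value at dv = 0.5/T works out to 1/pi (independent of T), while the null at dv = 1/T vanishes to machine precision, matching (v - v0).T = n.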
I myself have a reservation here. For instance, can we not measure the spread in frequencies and take the central frequency? Well, perhaps, but there is still an uncertainty, i.e. the frequency spread or standard deviation. A good physicist will invariably quote the measurement as mean +/- std dev (or something similar), and never omit the error in the measurement.
Of course, the Fourier transform derivation is not the only method: as mentioned in another post, there is the operator commutation derivation, and textbooks often give the wave group/packet derivation (the finite size of a wave group related to the superposition of a finite group of wavelengths - the dx.dp uncertainty).
So, on to my points (at last)
1) The wave interpretation, manifested here by Planck's relation E=hv, is absolutely central; the same holds for de Broglie and dp.dx >= h. This is pure physics.
2) The above mathematical derivation means that whether the measurement process interferes with the experiment, no matter how slightly, is of no consequence: the uncertainty is inherent in the limitations (a finite measurement time duration) and not a consequence of interfering with the object being measured. This is pure mathematics.
3) The mathematics is based upon continuous functions and a belief that such continuity holds right down to absolute zero. This too is pure mathematics. Also, differential operators, by definition, assume continuity, so the commutator derivation is not immune either.
With the above points in mind, if experimenters are claiming to beat the uncertainty limitation, then it would seem they are implying either that the wave interpretation is wrong or that the mathematics is wrong, or both?
Of course, it could also be that the experiment is simply wrong (faster-than-light neutrinos, anyone?) or, more commonly, a rather liberal interpretation whereby the gain is offset by a loss elsewhere, such that no real rules are broken. In defence of the authors, some of the more popular mags/online science news sites seem to exaggerate findings in ways that were really not the words of the authors of the original paper - I'll assume cynicism on my part for this paragraph and stick with the points!
If there is truly a problem, I would tentatively suggest that the assumption of mathematical continuity is a good candidate but, admittedly, there is a degree of self-interest here, because I myself believe that nature is discrete at its lowest level. On the other hand, the assumption that we can forever apply differential equations and use our continuous functions to mimic nature at its lowest level, Planck lengths, also seems to me to be a leap of faith. Nevertheless, as a reality check, the uncertainty relation and its application to decay lifetimes and spectral widths has certainly treated us kindly, as, indeed, has QM in general. So what's it to be?
Richard Miller http://www.urmt.org
> |
Work at the University of Toronto has put a dent in the strict
mathematical interpretation of the Heisenberg Uncertainty Principle.
There we go. > https://www.scientificamerican.com/article/common-interpretation-of-heisenbergs-uncertainty-principle-is-proven-false/
Maybe a tiny URL. |
In the first message in this thread I mentioned: https://www.sciencedaily.com/releases/2012/09/120907125154.htm
In this article they write: "It is often assumed that Heisenberg's uncertainty principle applies to both the intrinsic uncertainty that a quantum system must possess, as well as to measurements. These results show that this is not the case and demonstrate the degree of precision that can be achieved with weak-measurement techniques."
The two concepts, the supposed "uncertainty inside physical processes" (specifically related to elementary particles) versus "measurements" of the same processes, are completely different issues. The second is a real issue, because every measurement disturbs what you want to measure. This is specifically an issue if you want to measure the same thing twice.
In order to get a grip on the first issue, please study this: https://en.wikipedia.org/wiki/Bohr_model Specifically: https://en.wikipedia.org/wiki/Bohr_model#Shortcomings Here we read: "The model also violates the uncertainty principle in that it considers electrons to have known orbits and locations, two things which can not be measured simultaneously." This is considered a shortcoming of the Bohr model.
IMO the problem is in the uncertainty principle. The Bohr model is in some sense a mathematical model of an atom. This model allows you to calculate the emission and absorption energies of photons, validated by experiments. This mathematical model is rather rigid, implying that the underlying physical processes are also rather rigid, implying 'no' uncertainty.
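That rigidity can be made concrete. A sketch (constants rounded; the 656.3 nm comparison value is the well-known measured H-alpha wavelength) computing the Bohr levels E_n = -13.6 eV / n^2 and the photon they predict for the 3 -> 2 transition:

```python
# Bohr model of hydrogen: E_n = -R/n^2 with R ~ 13.6057 eV.
RYDBERG_EV = 13.605693

def bohr_level_ev(n):
    """Energy of the n-th Bohr level in eV (negative: bound state)."""
    return -RYDBERG_EV / n**2

def photon_energy_ev(n_upper, n_lower):
    """Energy of the photon emitted in the n_upper -> n_lower transition."""
    return bohr_level_ev(n_upper) - bohr_level_ev(n_lower)

# H-alpha (3 -> 2): measured at about 656.3 nm, i.e. about 1.89 eV.
e_halpha = photon_energy_ev(3, 2)
wavelength_nm = 1239.841984 / e_halpha  # hc ~ 1239.84 eV.nm
print(e_halpha, wavelength_nm)
```

The predicted ~656.1 nm agrees with the measured Balmer line to a fraction of a percent (the small residual is the reduced-mass correction), with no probabilistic ingredient anywhere in the calculation.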
My understanding is that validating this by experiment is rather simple when many atoms are involved, but tricky if you only want to use one.
Nicolaas Vroom
> | Suppose you want to know a note's frequency and the exact time it played. |
The subject of this thread is not about music but much more about the behaviour of single photons and electrons.
The study of music is a completely different subject.
See for example:
https://en.wikipedia.org/wiki/Digital_recording
The challenge there is to make an exact copy, and that is, technically
speaking, a very difficult subject, in continuous evolution.
For single photons you should think about a single photon going either through a single or a double slit.
For a single electron you should think about the article
https://www.sciencedaily.com/releases/2012/09/120907125154.htm
The sketch at the beginning of that document shows:
"The incoming gamma ray (in green) which is scattered by the electron
(in blue)"
IMO to draw such a sketch is rather simple.
The incoming gamma ray and the electron are aligned.
The problem I have is: how do you ensure that this is also the
case in an actual experiment, i.e. that the two are aligned?
How do you know where the electron is (in case there is only one)?
How do you know where the electron is when many electrons are
involved around the nucleus of an atom?
These are the types of questions that make the uncertainty principle interesting.
Nicolaas Vroom https://www.nicvroom.be
> | IMO the problem is in the uncertainty principle. The Bohr model is in some sense a mathematical model of an atom. This model allows you to calculate the emission and absorption energies of photons, validated by experiments. |
No. The Bohr model is a nice example for a theory that accidentally gives the right numbers from an abstruse model.
The idea of a Bohr orbit with integer wave number on a circle makes the energy dependent on circular orbits confined to a great circle. Classically, the circular Kepler orbits are the states of minimal energy at fixed angular momentum, which is universally a conserved quantity, quantized in the integers for wave states by the wave number on circles of latitude.
Only the accidental Kepler degeneracy of states with respect to the angular momentum at fixed energy, special to the 1/r and r^2 potentials, relates the L wave number at lowest energy to the degenerate total-energy quantum number.
The correct quantum energy level scaling by the Bohr radius r/rB is no more than another mathematical accident, revealing a greater symmetry group than the evident rotation group.
There is no chance of fitting a bunch of waves of the same wavelength onto any of the set of Kepler ellipses allowed by constant quantized energy and varying integer angular momentum.
The most puzzling fact is that the classical states with angular momentum L=0 at fixed energy are ellipses with small axis = 0, classically rays with the center at one end, while they are represented by spherically symmetric constant charge distributions in quantum theory.
The tragedy of the Bohr model continues if one tries to test the Schrödinger model for the helium atom with four particles.
In the helium atom there are no symmetries at all, and no modelling idea fits the exactly measurable spectral data with acceptable precision until the day before yesterday.
The areas of atomic and nuclear states and molecular binding present a completely unsolved, or unrelated, bunch of stories of n-particle entanglement mechanisms.
The simulation exceeds any computer capacity conceivable even in the times of the intelligent cloud.
A funny example of interaction of modelling and reality is the financial market world.
It is no longer simulated on computers in order to model human action and its statistical time evolution.
The global market today is the bunch of algorithms running on computers to make decisions and the set of statistical analysis tools to produce decision controlling data sets in the microsecond time scale.
--
Roland Franzius
On Friday, March 3, 2017 at 4:04:23 AM UTC-5, richard miller wrote:
> | On Wednesday, 1 March 2017 21:40:37 UTC, Nicolaas Vroom wrote: |
>> |
IMO the uncertainty principle is not a law of physics or science.
As such it cannot be used as a concept to describe physical reality,
nor as an intrinsic part of any other physical law.
For an explanation see: https://en.wikipedia.org/wiki/Physical_law "Physical laws are typically conclusions based on repeated scientific experiments and observations over many years" etc. The "Uncertainty Principle" IMO does not fall into this category. It captures the human limitations of making observations. It captures the fact that the laws of nature are fraught with uncertainty. The uncertainty principle is based on Heisenberg's microscope. See: https://en.wikipedia.org/wiki/Heisenberg's_microscope "Then, according to the laws of classical optics, the microscope can only resolve the position of the electron up to an accuracy of: delta(x) = lambda / sin (epsilon)" delta(x) is small for small lambda, i.e. for high frequencies. Only this equation can you call a "law of physics", because it describes the observations of an experiment using a photon and one electron. This is not the case for the uncertainty principle as a whole. |
[[Mod. note -- 44 excessively-quoted lines snipped here. -- jt]]
> | https://www.scientificamerican.com/article/common-interpretation-of-heisenbergs-uncertainty-principle-is-proven-false/ |
This stuff is actually well known, by those that know, at least since 2001:
https://plato.stanford.edu/entries/qt-uncertainty/#2.3
To wit, the Kennard uncertainty-relation derivation says: take repeated measurements of an object's position and calculate the standard deviation of position; take repeated measurements of momentum and calculate the standard deviation of momentum. The product of these two standard deviations is the uncertainty relation. It is a *statistical* relation; it says nothing about the *measurement* errors in an individual measurement. Nor does it assume the two measurements are made at the same time.
It is debatable whether Heisenberg's version of *simultaneous*, non-statistical, individual measurements has ever been formally derived. Heisenberg's approach was only a hand-waving argument.
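The statistical reading is easy to simulate. A sketch in units where ħ = 1, assuming a minimum-uncertainty Gaussian state, for which position outcomes are normally distributed with width s and momentum outcomes with width ħ/(2s) (the seed, s, and sample count are arbitrary choices):

```python
import numpy as np

# Kennard's relation constrains ensemble standard deviations, not the
# errors of any single simultaneous measurement pair.
rng = np.random.default_rng(0)
hbar, s, n = 1.0, 2.0, 200_000

x_runs = rng.normal(0.0, s, n)               # repeated position measurements
p_runs = rng.normal(0.0, hbar / (2 * s), n)  # separate momentum measurements

product = x_runs.std() * p_runs.std()
print(product)  # ~ hbar/2 for this minimum-uncertainty state
```

Note that no individual (x, p) pair here is constrained at all; only the product of the two ensemble spreads is, which is exactly Kennard's point.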
Kevin Aylward
http://www.kevinaylward.co.uk/gr/index.html
http://www.kevinaylward.co.uk/qm/index.html
> |
The tragedy of the Bohr model continues if one is trying to test
the Schrödinger model for the Helium atom with four particles.
In the helium atom there are no symmetries at all, and no modelling idea fits the exactly measurable spectral data with acceptable precision until the day before yesterday. The areas of atomic and nuclear states and molecular binding present a completely unsolved, or unrelated, bunch of stories of n-particle entanglement mechanisms. The simulation exceeds any computer capacity conceivable even in the times of the intelligent cloud. |
For the more complex cases this might be true, but where "nuclear states" refers to the simple baryons like the proton and neutron, they can quite accurately be modeled by today's lattice calculations, I think within 1% for the energy levels.
And of course for the atomic states, it just depends on how far you go beyond hydrogen and helium (and what accuracy you then still want to obtain). In fact people can call it a "simulation" regardless of that, which is one of the problems.
-- Jos
> | On 11.03.2017 at 03:25, Nicolaas Vroom wrote: |
> > | IMO the problem is in the uncertainty principle. The Bohr model is in some sense a mathematical model of an atom. This model allows you to calculate the emission and absorption energies of photons, validated by experiments. |
> |
No. The Bohr model is a nice example for a theory that accidentally gives the right numbers from an abstruse model. |
I do not want to argue about that (in detail). (I agree the model is most probably not 100% correct, but it is still IMO a good starting point for human understanding.)
My whole point is whether the Bohr model has anything to do with the uncertainty principle. IMO it does not.
At the same time, if you want to validate the model based on measurements as part of certain experiments, you have to take into account that these measurements are not exact and are subject to human or technical limitations.
Thanks for your comments.
Nicolaas Vroom https://www.nicvroom.be/
The principle of cruel uncertainty of the immortal prophet is a mere
commercial repackaging of the properties of the Fourier transform,
established a century before.
If a photon is well defined in frequency, it is looooooooong.
If a photon is well concentrated, say from a femtosecond laser, it is
poorly defined in frequency. That's all.
If long, it gives neat interferences, which are observed via the
repetition of many photons, on a concurrency of many potential
absorbers.
If too short, brfff !
[Moderator's note: While some aspects of the uncertainty principle have classical analogues, as mentioned above, there are of course aspects which have no counterpart in classical physics. -P.H.]
The mythology of cruel uncertainty was elaborated to save the inexcusable mythology of corpuscles and "corpuscular aspects".
However, there are many optical results that are not at all compatible with the corpuscular mythology. In French: http://jacques.lavau.deonto-ethique.eu/Physique/Microphysique_contee_mars2017.pdf
--
http://jacques.lavau.deonto-ethique.eu/Physique/Microphysique_contee.pdf
http://jacques.lavau.deonto-ethique.eu/Physique/4e_couverture.pdf
http://deontologic.org/quantic