John Brodix Merryman Jr.
2 min read · May 12, 2024


Math is model making. As such, it is abstracted reductionism.

The resulting maps cannot include all the information in the territory, or they would revert to noise. Whiteout.

Consider that epicycles were brilliant math as a model of our view of the cosmos, yet the geocentric point of view is subjective, not objective. Using it as a principle resulted in flawed premises, and the "shut up and calculate" that followed became so much garbage in, garbage out.

The crystalline spheres were lousy physics as an explanation.

Given that initial premises always originate from a more limited point of view than the knowledge accumulated afterward, yet provide the foundations for the models, the tendency is to patch rather than falsify.

Consider current cosmology:

When it was realized that cosmic redshift increases in proportion to distance in all directions, it meant either that, as a classic Doppler shift, we are at the center of the universe, or that the redshift is an optical effect. The only known cause for such an effect at the time was some sort of medium slowing the light, i.e. ether and tired light, yet there didn't seem to be any other distorting effect, so the idea was dismissed.
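For reference, the proportionality being described is what is usually written as Hubble's law. A minimal textbook statement, using the conventional symbols $v$ (recession velocity), $H_0$ (the Hubble constant), $d$ (distance), $z$ (redshift) and $c$ (the speed of light), none of which appear in the original essay, is:

$$ v = H_0\, d, \qquad z \approx \frac{v}{c} = \frac{H_0\, d}{c} \quad (z \ll 1). $$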

The solution was to use the premise of spacetime to argue that space itself must be expanding, which totally ignores the central premise on which spacetime is based: the speed of light as a Constant.

If space were to expand, the speed of the light crossing it would have to increase proportionally in order to remain Constant. Yet that would likely negate the expansion as an explanation of redshift.

Two metrics are being derived from the same light: one based on the speed and one based on the spectrum. If the speed were being used as the numerator, it would be a tired light theory, but as an expanding space theory, the speed is still the explicit denominator, the metric used to calibrate the expansion. "Space is what you measure with a ruler." And the Ruler is still the speed.
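For comparison, the textbook definitions behind "the spectrum" and "the speed" can be sketched as follows, again using standard symbols ($\lambda$ wavelength, $\nu$ frequency, $a(t)$ the scale factor) that are assumptions of this aside rather than part of the essay:

$$ z = \frac{\lambda_{\mathrm{obs}} - \lambda_{\mathrm{emit}}}{\lambda_{\mathrm{emit}}}, \qquad 1 + z = \frac{a(t_{\mathrm{obs}})}{a(t_{\mathrm{emit}})}, \qquad c = \lambda\,\nu .$$

In the expanding-space reading, the wavelength stretches with $a(t)$ while $c = \lambda\nu$ is held fixed, which is the constancy the essay is calling the Ruler.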

One way light does redshift over distance is as multi-spectrum packets, with the higher frequencies dissipating faster. Yet that would mean we are sampling a wave front and that the quantization of light is an artifact of its detection and measurement: a "loading," or "threshold," theory.

Imagine how that would go down in Quantum Physics. Better to patch than falsify.

That's why physics has been spinning its wheels for the last 50 years.

The models are tools, not gods.

There is no universal map/model, as models break down over infinities, disappear at equilibrium/zero, and everything in between is relational.
