But in a general theory of relativity (which is forced by Mach's Principle to also be a relativistic theory of gravity), the gravitational field is space. The field doesn't sit inside a background metric; it is the background metric.
So with this sort of model, we've got no obvious starting point – no obvious starting geometry, and not even an obvious starting topology, unless we start cheating and putting in some critical parameters by hand, according to what we believe to be the correct values.
We make an exasperated noise and throw in a quick idealisation. We say that we're going to suppose that matter is pretty smoothly and evenly distributed through the universe (which sounds kinda reasonable), and then we use this assumption of a homogeneous distribution to argue that there must therefore be a fairly constant background field. That then gives us a convenient smooth, regular background shape that we can use as a backdrop, before we start adding features like individual stars and galaxies.
That background level gives us our assumed gravitational floor.
We know that this idea isn't really true, but it's convenient. Wheeler and others tried exploring different approaches that might allow us to do general relativity without these sorts of starting simplifications (e.g. the pregeometry idea), but while a "pregeometrical" approach let us play with deeper arguments that didn't rely on any particular assumed geometrical reduction, getting from first principles to new, rigorous predictions was difficult.
So while general relativity in theory has no prior geometry and is a completely free-standing system, in practice we tend to implicitly assume a default initial dimensionality and a default baseline background reference rate of timeflow, before we start populating our test regions with objects. We allow things to age more slowly than the baseline rate when they're in a more intense gravitational field, but we assume that they can't be persuaded to age more quickly than the assumed background rate (and that signals can't travel faster than the associated background speed of light) without introducing "naughty" hypothetical negative gravitational fields (ref: Positive Energy Theorem).
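To pin down what that "baseline rate" means in textbook terms (this is just the standard weak-field relation, not anything new): a clock sitting at Newtonian potential Φ, with Φ = 0 taken as the assumed background level, ticks at roughly

    dτ/dt ≈ 1 + Φ/c²

relative to the background rate. Ordinary positive massenergy only ever makes Φ more negative, so the factor sits at or below 1 – that inequality is the "floor", written out.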
This is one of the reasons why we've made almost no progress in warp-drive theory over half a century – our theorems are based on the implicit assumption of a "flat floor", and this makes any meaningful attempt to look at the problem of metric engineering almost impossible.
Now to be fair, GR textbooks are often quite open about the fact that a homogeneous background is a bit of a kludge. It's a pragmatic step – if you're going to calculate, you usually need somewhere to start, and assuming a homogeneous background (without defining exactly what degree of clumpiness counts as "homogeneous") is a handy place to start.
But when we make an arbitrary assumption in mathematical physics, we're supposed to go back at some point and sanity-check how that decision might have affected the outcome. We're meant to check the dependencies between our initial simplifying assumptions and the effects that we predicted from our model, to see if there's any linkage.
So ... what happens if we throw away our "gravitational floor" comfort-blanket and allow the universe to be a wild and crazy place with no floor? What happens if we try to "do" GR without a safety net? It's a vertigo-inducing concept, and a few "crazy" things happen:
Result 1: Different regional expansion rates, and lobing
Without the assumption of a "floor", there's no single globally-fixed expansion rate for the universe. Different regions with different "perimeter" properties can expand at different rates. If one region starts out being fractionally less densely populated than another, its rate of entropic timeflow will be fractionally greater, the expansion rate of the region (which links in to rate of change of entropy) will be fractionally faster, and the tiny initial difference gets exaggerated. It's a positive-feedback inflation effect. The faster-expanding region gets more rarefied, its massenergy-density drops, the background web of light-signals increasingly deflects around the region rather than going through it, massenergy gets expelled from the region's perimeter, and even light loses energy while trying to enter, as it fights "uphill" against the gradient and gets redshifted by the accelerated local expansion. The accelerated expansion pushes thermodynamics further in the direction of exothermic rather than endothermic reactions, and time runs faster. Faster timeflow gives faster expansion, and faster expansion gives faster timeflow.
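If you'd rather see the feedback loop as bare arithmetic than as prose, here's a deliberately crude Python sketch. It isn't GR – it's a toy difference equation in which the per-step expansion rate, the coupling constant and the initial 0.1% under-density are all invented numbers, chosen only to make the runaway visible:

```python
# Toy sketch of the positive-feedback loop described above (not GR!).
# Region B starts out very slightly less dense than region A; we *assume*
# that under-density buys a proportional boost in expansion rate, and the
# extra expansion then makes B less dense still.

H = 0.07        # per-step expansion of the reference (denser) region A
couple = 0.5    # assumed expansion-rate boost per unit of under-density
delta0 = 0.001  # region B starts out 0.1% less dense than region A

ratio = 1.0     # (size of region B) / (size of region A)
for step in range(1, 101):
    delta = 1 - (1 - delta0) / ratio**3                  # B's current fractional under-density
    ratio *= (1 + H * (1 + couple * delta)) / (1 + H)    # B expands a touch faster than A
    if step % 20 == 0:
        print(f"step {step:3d}:  B/A size ratio = {ratio:.2f},  under-density = {delta:.3f}")
```

The under-density creeps up slowly at first, then takes off, then saturates – region B has effectively emptied itself out relative to region A.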
The process is like watching the weak spot on an over-inflated bicycle inner tube – once the trend has started, the initial near-equilibrium collapses, and the less-dense region balloons out to form a lobe. Once a lobe has matured into something sufficiently larger than its connection region, it starts to look to any remaining inhabitants like its own little hyperspherical universe. Any remaining stars caught in a lobe could appear to us to be significantly older than the nominal age of the universe as seen from "here and now", because more time has elapsed in the more rarefied lobed region. The age of the universe, measured in 4-coordinates as a distance between the 3D "now-surface" and the nominal location of the big bang (the radial cosmological time coordinate, referred to as "a" in MTW's "Gravitation", §17.9), is greater at their position than it is at ours.
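For reference, the idealised closed-universe geometry being leaned on here is usually written (in FLRW form, with units where c = 1) as

    ds² = −dt² + a²(t) [ dχ² + sin²χ ( dθ² + sin²θ dφ² ) ]

where a(t) is the radius of the expanding hypersphere – the quantity being read above as a "radial" measure of how far a given patch of the 3D now-surface has got from the big bang. The lobing argument is essentially saying that once homogeneity goes, a single global a(t) stops being an adequate description.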
With a "no-floor" implementation of general relativity, the universe's shape isn't a nice sphere with surface crinkles, like an orange; it's a multiply-lobed shape rather more like a raspberry, with most of the matter nestling in the deep creases between adjacent lobes (book, §17.11). If there were no floor, we'd expect galaxies to align in three dimensions as a network of sheets that form the boundary walls that lie between the faster-expanding voids.
And if we look at our painstakingly-plotted maps of galaxy distributions, that's pretty much what seems to be happening.
Result 2: Galactic rotation curves
If the average background field intensity drops away when we leave a galaxy, to less than the calculated "floor" level, then the region of space between galaxies is, in a sense, more "fluid". These regions end up with greater signal-transmission speeds and weaker connectivity than we'd expect by assuming a simple "floor". The inertial coupling between galaxies and their outside environments becomes weaker, and the influence of a galaxy's own matter on its other parts becomes proportionally stronger. It's difficult to get outside our own galaxy to do comparative tests, but we can watch what happens around the edges of other rotating galaxies where the transition should be starting to happen, and we can see what appears to be the effect in action.
In standard Newtonian physics (and "flat-floor" GR), this doesn't happen. A rotating galaxy obeys conventional orbital mechanics, and stars at the outer rim have to circle more slowly than those further in if they're not going to be thrown right out of the galaxy. So, if you have a rotating galaxy with persistent "arm" structures, the outer end of the arm needs to be rotating more slowly, which means that the arm's rim trails behind more and more over time. This "lagging behind" effect stretches local clumps into elongated arms, and then twists those arms into a spiral formation.
When we compare our photographs of spiral-arm galaxies with what the theory predicts, we find that ... they have the wrong spiral. The outer edges aren't wound up as much as "flat-floor" theory predicts, and the outer ends of the arms, although they're definitely lagged, seem to be circling faster than ought to be possible.
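To put rough numbers on the "flat-floor" expectation: the snippet below is plain Newtonian circular-orbit mechanics, v = √(GM/r), with a round 10¹¹-solar-mass "visible" galaxy and a 200 km/s flat rotation speed used purely as illustrative values (they're not fits to any real galaxy):

```python
import math

# Newtonian circular-orbit speed around a (mostly central) visible mass,
# versus the enclosed mass you'd need at each radius to hold the curve flat.
# All numbers are round illustrative values, not fits to a real galaxy.

G = 4.30e-6          # gravitational constant in kpc * (km/s)^2 per solar mass
M_visible = 1.0e11   # toy "visible" mass, in solar masses
v_flat = 200.0       # a typical-looking flat rotation speed, in km/s

for r_kpc in (5, 10, 20, 40):
    v_kepler = math.sqrt(G * M_visible / r_kpc)   # falls off as 1/sqrt(r)
    M_needed = v_flat**2 * r_kpc / G              # a flat curve needs M(<r) to grow with r
    print(f"r = {r_kpc:2d} kpc:  Keplerian v = {v_kepler:5.1f} km/s,"
          f"  mass needed for a flat {v_flat:.0f} km/s = {M_needed:.1e} Msun")
```

The Keplerian speeds fall away as 1/√r, while holding the curve flat needs the enclosed mass to keep growing roughly in proportion to radius – that's the size of the gap that has to be explained.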
So something seems to be wrong (or missing) with "flat-floor" theory. We could try to force the theory to agree with the galaxy photographs by tinkering with the inverse square law for gravity (which is a little difficult, but there have been suggestions based on variable dimensionality and string theory, or MOND), or we could fiddle with the equations of motion, or we could try to find some way to make gravity weaker outside a galaxy, or stronger inside.
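(For what it's worth, the MOND route does this with a single extra constant: below a critical acceleration a₀ of order 10⁻¹⁰ m/s², the effective acceleration is taken to be roughly g ≈ √(g_N·a₀) rather than the Newtonian g_N, which turns the outer orbital speed into v⁴ ≈ G·M·a₀ – a flat curve – without adding any matter. That's MOND's own phenomenological fix, mentioned here only to show the kind of tweak on offer; it isn't something that falls out of the "no-floor" argument.)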
The current "popular" approach is to assume that current GR and the "background floor" approach are both correct, and to conclude that there therefore has to be something else helping a galaxy's parts to cling together – by piling on extra local gravitation, we might be able to "splint" the arms to give them enough additional internal cohesiveness to stay together.
Trouble is, this approach would require so much extra gravity that we end up having to invent a whole new substance – dark matter – to go with it.
We have no idea what this invented "dark matter" might be, or why it might be there, or what useful theoretical function it might perform, other than making our current calculations come out right. It has no theoretical basis or purpose other than to force the current GR calculations to make a better fit to the photographs. Its only real properties are that its distribution shadows that of "normal" matter, it has gravity, and ... we can't see it or measure it independently.
So it'd seem that the whole point of the "dark matter" idea is just to recreate the same results that we'd have gotten anyway by "losing the floor".
Result 3: Enhanced overall expansion
Because the voids are now expanding faster than the intervening regions, the overall expansion rate of the universe is greater, and ... as seen from within the galactic regions ... the expansion seems faster than we could explain if we extrapolated a galaxy-dweller's sense of a local floor out to the vast voids between galaxies. To someone inside a galaxy, applying the "homogeneous universe" idealisation too literally, this overall expansion can't be explained unless there's some additional long-range, negatively-gravitating field pushing everything apart.
So again, the current "popular" approach is to invent another new thing to explain the disagreement between our current "flat-floor" calculations and actual observations. This one, we call "Dark Energy", and again, it seems to be another back-door way of recreating the results we'd get by losing the assumed gravitational background floor.
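In standard-model bookkeeping, that extra ingredient lives in the acceleration form of the Friedmann equation,

    ä/a = −(4πG/3)(ρ + 3p/c²) + Λc²/3

where every ordinary ingredient (positive density, non-negative pressure) drags the right-hand side negative, so an observed acceleration forces you to add a positive Λ, or equivalently a component with strongly negative pressure. The "no-floor" reading sketched above would instead put at least part of the effect down to averaging over regions that aren't actually expanding at the same rate.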
So here's the funny thing. We know that the assumption of a "homogeneous" universe is iffy. Matter is not evenly spread throughout the universe as a smooth mist of individual atoms. It's clumped into stars and planets, which are clumped into star systems, which are clumped into galaxies. Galaxies are ordered into larger void-surrounding structures. There's clumpiness and gappiness everywhere. It all looks a bit fractal.
It might seem obvious that, having done the "smooth universe" calculations, we'd then go back and factor in the missing effect of clumpiness, and arrive at the above three (checkable) modifying effects: (1) lobing (showing up as "void" regions in the distribution of galaxies), (2) increased cohesion for rotating galaxies, and (3) a greater overall expansion rate. It also seems natural that, having done that exercise and made those tentative conditional predictions, the GR community would have been in a happy mood when all three effects were discovered for real.
But we didn't get around to doing it. All three effects took us by surprise, and then we ended up scrabbling around for "bolt-on" solutions (dark matter and dark energy) to force the existing, potentially flawed approach to agree with the new observational evidence.
The good news is that the "dark matter"/"dark energy" issue is probably fixable by changing our approach to general relativity, without the sort of major bottom-up reengineering work needed to fix some of the other problems. At least with the "floor" issue, the "homogeneity" assumption is already recognised as a potential problem in GR, and not everyone's happy about our recent enthusiasm for inventing new features to fix short-term problems. We might already have the expertise and the willpower to solve this one comparatively quickly.
Getting it fixed next year would be nice.
1 comment:
Suns ought to be strongly positively charged by virtue of the mass difference between protons and electrons. Thus, they should repel each other, making G look variable. That would reproduce something like MOND quite easily, and do away with several sources of evidence for dark matter. Also, large amounts of free charge, charge separation and so on could give a CDM mimic. As this would continue unabated, luminous bits of the old universe ought to be repelling - well, attracting less - more strongly than young ones.