PaulHoule 2 days ago

When I worked at arXiv one of my coworkers was a fresh astrophysics PhD who was cynical about the state of the field. He thought that we didn't know what the hell was going on with accretion disks but that a few powerful people in the field created the impression that we did and that there was no dissent because it was so difficult to get established in the field.

When I first saw the ΛCDM model my first impression was that I didn't believe it. It seemed bad enough to have dark matter that we didn't understand (though WIMPs and axions are plausible), but adding equally mysterious and physically unmotivated dark energy made it seem like just an exercise in curve fitting.

There has been a longstanding problem that the history of the universe and the cosmological distance scale haven't made sense.

https://medium.com/starts-with-a-bang/the-hubble-tension-sti...

When I was getting my PhD in condensed matter physics I was going to the department colloquium all the time and seeing astrophysics talks about how some people thought the Hubble constant was 40 km/s/Mpc and others thought it was 80 km/s/Mpc. With timescape cosmology maybe they were both right.

Another longstanding problem in astronomy is that since the 1970s it's been clear we have no idea how supermassive black holes could have formed in the time we think the universe has existed. With the JWST there is a flood of results suggesting that the first 500 million years of the universe probably lasted a lot more than 500 million years.

https://iopscience.iop.org/article/10.3847/2041-8213/ac9b22

  • Keysh a day ago

    > There has been a longstanding problem that the history of the universe and the cosmological distance scale haven't made sense. https://medium.com/starts-with-a-bang/the-hubble-tension-sti...

    > When I was getting my PhD in condensed matter physics I was going to the department colloquium all the time and seeing astrophysics talks about how some people thought the Hubble constant was 40 km/s/Mpc and others thought it was 80 km/s/Mpc. With timescape cosmology maybe they were both right.

    You're (mis)remembering a different (old) problem and confusing it with a new one. The problem in the 1970s and 1980s was: what is the local expansion rate of the universe? Where "local" means "within a few hundred megaparsecs". There were two main groups working on the problem: one tended to find values of around 50 km/s/Mpc and the other values of around 100. Gradually they began to converge (in the early 1990s, the low-H0 group getting values of around 60, the high-H0 group values of around 80), until a consensus emerged that it was in the low 70s, which is where we are now.

    The "Hubble tension" is a disagreement between what we measure locally (i.e., a value in the low 70s) and what theory (e.g., LCDM) says we should measure locally, if you extrapolate the best-fitting cosmological models -- based on cosmological observations of the CMB, etc. -- down to now (a value in the upper 60s). This has only become a problem very recently, because the error bars on the local measurement and the cosmological predictions are now small enough to suggest (maybe/probably) meaningful disagreement.

    > Another longstanding problem in astronomy is that since the 1970s it's been clear we have no idea how supermassive black holes could have formed in the time we think the universe has existed. With the JWST there is a flood of results suggesting that the first 500 million years of the universe probably lasted a lot more than 500 million years. https://iopscience.iop.org/article/10.3847/2041-8213/ac9b22

    That's not a "longstanding" problem, it's a problem from the last 25 years or so. In order for there to be a problem, you have to have what you think are reliable estimates for the age of the universe and evidence for large supermassive black holes very early in the universe. This is something that has emerged only relatively recently.

    (Your link, by the way, is to a paper that has nothing to do with black holes.)

  • Maro a day ago

    I was doing an astrophysics PhD 15 years ago, and one of the many reasons I abandoned it is exactly this. To get published, I would have had to start all my papers by introducing and assuming the ΛCDM model, even though it just didn't seem right to me (too many "dark" components, too many assumptions, inflation).

    To be fair, people a lot smarter than me think it's good, or good enough.

  • somenameforme a day ago

    Slightly tangential, but how does the ad hoc nature of things like cosmic inflation seemingly not bother more people? Quite the opposite: it's a rather lauded discovery. This is more cosmology than astronomy, but at least reasonably related. That topic alone destroyed my interest in an academic pursuit of astronomy.

    'Here's an idea that makes no logical sense, has no physical argument whatsoever (let alone evidence) in support of its existence, and just generally seems completely absurd - but if we ignore all of that, it solves a lot of other pesky problems with reality, as observed, practically falsifying other lauded theories.'

    Just add more epicycles?

    • lutorm a day ago

      Well, it's not like people pulled it out of thin air. Both inflation and the lambda-CDM models are solutions to the GR equations, so in that sense it's perfectly justifiable to see if general relativity can explain the data. I don't think it's fair to say that it "makes no logical sense".

      • misja111 a day ago

        I assume that OP was talking about the cosmic inflation theory that claims there was a rapid expansion immediately after the big bang. I don't see how that's a solution to the GR equations, could you maybe explain/give a link?

        • russdill 19 hours ago

          There is an addition of an inflaton field. Observations show that such an inflation happened, and there are several ideas for inflaton fields that are compatible with the standard model and match observations. The equations that make the observed expansion happen in the presence of such a field are the Friedmann equations, which are GR solutions.

          The field isn't just something that magically expands things, it just affects energy density. The rapid expansion is then a consequence of GR.
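
          Roughly, as a textbook sketch (nothing specific to any particular inflaton model): the first Friedmann equation ties the expansion rate to the energy density,

            H^2 = \left(\frac{\dot a}{a}\right)^2 = \frac{8\pi G}{3}\,\rho

          and for a scalar field slowly rolling in its potential, \rho \approx V(\phi) is nearly constant, which gives a(t) \propto e^{Ht}: exponential expansion that ends when the field falls out of slow roll.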

    • stouset a day ago

      Nobody’s happy with dark energy, it’s just the only framework we have that fits the data. All the other ideas might be brilliant and inspired but are measurably worse at describing the real world.

      Even the name “dark energy” is a tacit acknowledgment that—along with dark matter—we have no clue what the underlying physics actually is.

      • cryptonector a day ago

        How is timescapes measurably worse? It's not a new theory, so perhaps it's been tested, and if it's failed why is it back in the news? (Sometimes failures get back in the news. It's a fair question.)

        • russdill a day ago

          It starts with the assumption that our observations of cosmic expansion are false and due to time dilation from a clumpy universe, then works out how clumpy the universe must be to account for our observations.

          For timescapes to "work" time must run 38% faster in cosmic voids. This works out to a density difference of about 100,000 times what we observe with many existing methods.

          • cryptonector a day ago

            > It starts with the assumption that our observations of cosmic expansion are false

            It starts with the assumption that our measurements (observations!) of attenuation and red-shift of standard candles are correct. Your characterization is wildly wrong, unless you meant to use a different word than 'observations'.

            > and due to time dilation from a clumpy universe, then works out how clumpy the universe must be to account for our observations.

            Yes, it does work backwards. That's insufficient by itself to say that timescapes is incorrect. It's concerning though because they work out what densities must be in order to explain away all of the attenuation/red-shift disparities. However, the reasoning about time dilation is not wrong -- in fact, it's blindingly obviously correct if GR is correct. It's just that it might also be wildly insufficient -- it might just be in the noise.

            > For timescapes to "work" time must run 38% faster in cosmic voids. This works out to a density difference of about 100,000 times what we observe with many existing methods.

            Thanks. This is responsive to my question, but it's not yet dispositive. We could find that:

              - some of our observations are incorrect
              - some of our interpretations of those observations are incorrect
              - timescapes explains some but not all of the apparent acceleration of the expansion of the universe

            Also, density differences need to take into account density differences at the time that a ray of starlight we observe today entered each void along the way to us, not current apparent density disparities. It has to be that 8 billion years ago the difference in density between clusters and voids was much starker, though the voids would have been smaller, and perhaps the difference was not enough orders of magnitude.

            • russdill a day ago

              There's a ton of theoretical work out there that is about building a foundation for future work. This is one of those works. But people tend to overlook the part where the work is just one piece, and many other things must first fall into place, and they overhype it. This of course isn't the fault of the authors.

    • gosub100 a day ago

      What do you mean by "bother"? The universe appears to be expanding, so I assume you don't deny that evidence, correct?

  • colechristensen 2 days ago

    On the topic of early black hole growth I saw this released a couple of months ago, an early black hole apparently growing at 40x the Eddington limit 1.5 billion years after the big bang.

    https://chandra.si.edu/press/24_releases/press_110424.html

    > A super-Eddington-accreting black hole ~1.5 Gyr after the Big Bang observed with JWST

    https://www.nature.com/articles/s41550-024-02402-9

    • dotancohen 2 days ago

      Correct me if I'm wrong, but the term Eddington limit is a bit misleading as it does not describe some physical rate that cannot be exceeded. Lots of super Eddington objects are known.

      • PaulHoule 2 days ago

        It's the point where light pressure can blow off the outer layers of a star.

        https://en.wikipedia.org/wiki/Eddington_luminosity
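
        Roughly, as a sketch (the standard textbook balance, not specific to any paper here): it's where the outward radiation force on free electrons equals the gravitational pull on the protons they're tied to,

          \frac{\sigma_T L}{4\pi r^2 c} = \frac{G M m_p}{r^2}
          \;\Rightarrow\;
          L_{\rm Edd} = \frac{4\pi G M m_p c}{\sigma_T} \approx 1.26 \times 10^{38}\,(M/M_\odot)\ \mathrm{erg/s}

        More generally L_{\rm Edd} = 4\pi G M c / \kappa for gas opacity \kappa, which is why the composition of the gas matters (see the Pop 3 point below).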

        Objects that pulse like

        https://en.wikipedia.org/wiki/Eta_Carinae

        can evade it and there are other ways too.

        When it comes to super-massive black holes there is the question of how quickly stuff can even get close enough to the black hole to get into the accretion disk.

        500M years is a long time for the kind of large star that becomes a black hole (it blows up in 10M years or so), but if one black hole is going to merge with another black hole and that is going to merge with another black hole and so on, there is no Eddington limit (no EM radiation!) but rather the even slower process of shedding angular momentum via gravitational radiation. (One highlight of grad school was the colloquium talk where we got to hear the signal from two black holes colliding, almost 20 years before it was detected for real.)

        I hope JWST sees

        https://en.wikipedia.org/wiki/Stellar_population#Population_...

        Note those Pop 3 stars have a higher Eddington limit because they've got hardly any "metal" in them, which means light interacts with them differently -- although astronomers have the strange (to me) convention that anything heavier than helium is a metal, which includes, say, oxygen. (As a cond-mat PhD I think a metal is something that has free electrons, which could be one of those elements towards the left side of the periodic table, or could be a doped semiconductor or a polymer like polyaniline.)

        • dotancohen a day ago

          Thank you. I did not know that the first cosmic generation of stars had a higher Eddington limit. Why does light interact differently with hydrogen and helium than it does with the other elements? Does it have anything to do with having only a single layer of electrons in the atom?

gammarator 2 days ago

Here’s an extended comment by another astrophysicist: https://telescoper.blog/2025/01/02/timescape-versus-dark-ene...

The most important bit:

> The new papers under discussion focus entirely on supernovae measurements. It must be recognized that these provide just one of the pillars supporting the standard cosmology. Over the years, many alternative models have been suggested that claim to “fix” some alleged problem with cosmology only to find that it makes other issues worse. That’s not a reason to ignore departures from the standard framework, but it is an indication that we have a huge amount of data and we’re not allowed to cherry-pick what we want.

  • throwawaymaths 2 days ago

    the thing is, this is not really an alternative model. it's rather actually bothering to do the hard math based on existing principles (GR) and existing observations, dropping the fairly convincingly invalidated assumption of large scale uniformity in the mass distribution of the universe.

    if anything the standard model of cosmology should at this point be considered alternative as it introduces extra parameters that might be unnecessary.

    so yeah it's one calculation. but give it time. the math is harder.

    • sandgiant 2 days ago

      This has the same number of free parameters as LambdaCDM. Also, this result only looks at supernovae, i.e. low redshift sources. LambdaCDM is tested on cosmological scales.

      Very interesting, but “more work is needed”.

      • throwawaymaths 2 days ago

        that's not the case if, as is increasingly speculated, the lambda is not constant over time. you figure two parameters for a linear expression and three for a quadratic one

    • bsder 2 days ago

      > dropping the fairly convincingly invalidated assumption of large scale uniformity in the mass distribution of the universe.

      The problem with that is then you need a mechanism that creates non-uniformly distributed mass.

      Otherwise, you are simply invoking the anthropic principle: "The universe is the way it is because we are here."

      • marcyb5st a day ago

        I think that can be mitigated in three ways: our understanding of inflation is flawed, there were more "nucleation" sites where our universe came to be, and there are the already theorized baryonic acoustic oscillations that could introduce heterogeneity in the universe.

        Maybe it's a combination of these, maybe something else. If nothing else, uniformity is less probable than a mass distribution with variance (unless there is a phenomenon like inflation that smooths things out, but inflation was itself introduced to explain the assumed homogeneity of the universe). I concede, however, that explaining the small variance in the CMB with our current understanding is hard when dropping the homogeneity assumption.

      • throwawaymaths a day ago

        > The problem with that is then you need a mechanism that creates non-uniformly distributed mass.

        you need no such thing. that's like saying "i refuse to acknowledge the pacific ocean to be so damn large without a mechanism". you don't need that. it just is. this doesn't preclude the existence of such a mechanism. but for any (legit) science, mechanistic consideration should be strictly downstream of observation.

      • jcarreiro a day ago

        > The problem with that is then you need a mechanism that creates non-uniformly distributed mass.

        The mechanism is gravity; and we have good observational evidence that the mass distribution of the universe is not uniform, at least at the scales we can observe (we can see galaxy clusters and voids).

      • zmgsabst 2 days ago

        You don’t need a mechanism to point out a fact contradicts an assumption, eg, our measurements show non-uniform mass at virtually all scales (including billions of light years). There simply is no observable scale with uniform mass.

        Obviously there’s some mechanism which causes that, but the mere existence of multi-billion light year structures invalidates the modeling assumption — that assumption doesn’t correspond to reality.

        • throwawaymaths a day ago

          yeah the ~1b ly nonuniformity is pretty much there. the ~10b ly nonuniformity is still early days, but looking more and more likely as more data roll in (unless there is a systematic problem)

    • User23 2 days ago

      That the calculation is harder, in a world of functionally limitless compute, is sort of interesting. Where do we go from here?

  • austin-cheney 2 days ago

    That sounds like regression.

    If this problem of regression occurs as regularly as your quote implies then the fault is not in these proposed alternatives, or even in the likely faulty existing model, but in the gaping wide holes for testing these things quickly and objectively. That is why us dumb software guys have test automation.

    • abdullahkhalids a day ago

      You are oversimplifying science, especially theoretical physics. At the point where we are, there are no quick/cheap tests, and there is no objectivity. The space of possible correct theories is infinite, and humans are simply not smart enough to come up with frameworks to objectively truncate the space. If we were, we would have made progress already.

      There is a lot of subjectivity and art to designing good experiments, not to mention a lot of philosophical insight. I know a lot of scientists deny the role of philosophy in science, but I see all the top physicists in my fields liberally use philosophy - not philosopher type philosophy but physicist type philosophy - to guide their scientific exploration.

      • naasking a day ago

        > and humans are simply not smart enough to come up frameworks to objectively truncate the space.

        We are, but some people stubbornly resist such things. For instance, MOND reproducing the Tully-Fisher relation and being unexpectedly successful at making many other predictions suggests that any theories purporting to explain dark matter/gravitational anomalies should probably have MOND-like qualities in some limit. That would effectively prune the space of possible theories.

        Instead, they've gone in the complete opposite direction, basically ignoring MOND and positing different matter distributions just to fit observations, while MOND -- against all odds, since it's presumably not ultimately correct -- continues to make successful predictions we're now seeing in JWST data.

        • abdullahkhalids a day ago

          Note that when I say humans, I mean the whole worldwide social institution/network of scientists, with all its inherent hierarchies and politics. Because that is the social network that ultimately comes to some consensus on what the best theories at any given moment are.

          It's indeed possible, in fact "necessary", that some individual scientists within this network, by luck or brains, come up with much better theories; sometimes those theories are accepted by others and sometimes they are not. But ultimately all that matters is the consensus.

          • austin-cheney 21 hours ago

            Holy fuck guy. Take a step back and do some self-reflection. Any time people post about physics on here it's all emotions about how hard life is. With so much longing for sympathy it's amazing anything in the field ever gets published.

            Unless you are looking for research grants, stop crying about consensus and instead return to evidence and proofs. There will always be a million sad tears in your big sad community. If that is your greatest concern it's going to take you a million years to prove what you already know, because all the sad people you are showing it to are just as sad and self-loathing about social concerns as you are.

      • austin-cheney a day ago

        I am not. You are using bias as an excuse to qualify poor objectivity. I am fully aware that astrophysics contains a scale and diversity of data beyond my imagination, but that volume of data does not excuse an absence of common business practices.

        > The space of possible correct theories is infinite

        That is not unique to any form of science, engineering, or even software products.

        > and humans are simply not smart enough to...

        That is why test automation is a thing.

    • bubblyworld a day ago

      I think automated hypothesis testing against new data in science is itself an incredibly difficult problem. Every experiment has its own methodology and particular interpretation, often you need to custom build models for your experimental setup to test a given hypothesis, there are lots of data cleanup and aggregation steps that don't generalise, etc. My partner is in neuroscience, for instance, and merging another lab's data into their own workflows is a whole project unto itself.

      Test automation in the software context is comparatively trivial. Formal systems make much better guarantees than the universe.

      (not to say I think it's a bad idea - it would be incredible! - but perhaps the juice isn't worth the squeeze?)

      • austin-cheney a day ago

        > Every experiment has its own methodology

        That is bias. Bias is always an implicit default in any initiative and requires a deliberate concerted effort to identify.

        None of what you said is unique to any form of science or engineering. Perhaps the only thing about this unique to this field of science, as well as microbiology, is the sheer size and diversity of the data.

        From an objective perspective test automation is not more or less trivial to any given subject. The triviality of testing is directly determined by the tests written and their quality (speed and reproducibility).

        The juice is always worth the squeeze. It's a business problem that can be answered with math in consideration of risk, velocity, and confidence.

        • bubblyworld 3 hours ago

          Respectfully I disagree - the situation is far more complex in science than software engineering disciplines.

          I agree that different tests require different amounts of effort (obviously), but even the simplest "unit tests" you could conceive of for scientific domains are very complex, as there's no standard (or even unique) way to translate a scientific problem into a formally checkable system. Theories are frameworks within which experiments can be judged, but this is rarely unambiguous, and often requires a great deal of domain-specific knowledge - in analogy to programming it would be like the semantics of your language changing with every program you write. On the other hand, any programmer in a modern language can add useful tests to a codebase with (relatively) little effort.

          We are talking hours versus months or even years here!

          The experiment informs the ontology which informs the experiment. I don't think this is reducible to bias, although that certainly exists. Rather to me it's inherent uncertainty in the domain that experiments seek to address.

          Business practice, as you use the term, evolved to serve very different needs. Automated testing is useful for building software, but that effort may be better spent in science developing new experiments and hypotheses. It's very much an open problem whether the juice is worth the squeeze - in fact the lack of such efforts is (weak) evidence that it might not be. Scientists are not stupid.

          • austin-cheney 2 hours ago

            > We are talking hours versus months or even years here!

            That is why there are engineers that specialize in how to perform testing so that it doesn't take so long. For example long tests don't need to run at all if more critical short tests fail. The problems you describe for astrophysics are not unique to astrophysics even if the scale, size, and diversity of the data is so unique. Likewise, all the excuses I hear to avoid testing are the very same excuses software developers make.

            The reality is that these are only 25% valid. On their face these excuses are complete garbage, but a validation cannot occur faster than the underlying system allows. If the system being tested is remarkably slow then any testing upon it will be, at best, just as remarkably slow. That is not a fault of the test or the testing, but is entirely the fault of that underlying system.

JumpCrisscross 2 days ago

Reading this as a layman, it looks like releasing ΛCDM's cosmological principle [1] reveals the nontrivial temporal effects mass clusters have via general relativity. As a result, there could be empty regions of space in which billions of years more have elapsed than in e.g. a galaxy. This not only changes how we interpret supernova data (the acceleration isn't generally happening, but is an artefact of looking through space which is older than our own), but may also negate the need for dark matter (EDIT: dark energy) and the meaning of a single age of our universe.

(I'm also vaguely remembering a multi-universe model in which empty space inflates quicker than massed space.)

[1] https://en.wikipedia.org/wiki/Cosmological_principle

  • throwawaymaths 2 days ago

    > there could be empty regions of space in which billions of years more have elapsed than in e.g. a galaxy.

    important to note that the motivation for releasing the cosmological principle is that we know there are "small" voids, and that there is strong evidence of much larger voids and structure on the scale of tens of billions of light years that is incompatible with the cosmological principle. so it's not just a thing to do on a whim, it's supported by observation.

    • JumpCrisscross 2 days ago

      > we know that there are "small" voids and that there is strong evidence of much larger voids and structure on the scale of tens of billions of light years that is incompatible with the cosmological principle

      Two cosmologists debate which of their town’s bars is better, the small one or the large one. The town has one bar.

      • throwawaymaths 2 days ago

        fair, I should have also put voids in quotes. is the black sea part of the med and is the med part of the atlantic?

  • aeve890 2 days ago

    >Reading this as a layman, it looks like releasing ΛCDM's cosmological principle

    You mean relaxing. Also... "as a layman"? Lol what kind of layman are you. Respect.

    • JumpCrisscross 2 days ago

      > You mean relaxing

      Fair enough, at high redshift the cosmological principle could still hold under timescape. (It doesn't require it, however.)

      All that said, I'm generally sceptical about findings based on supernova data. They require so much statistical work to interpret correctly that the error rate on first publications is remarkably high.

  • Keysh a day ago

    > there could be empty regions of space in which billions of years more have elapsed than in e.g. a galaxy.

    A problem with that idea would be that the ages of galaxies in low-density regions (including voids) tend to be younger than galaxies in denser regions, suggesting that galaxy evolution proceeds more slowly in voids.

    https://www.iaa.csic.es/en/news/galaxies-great-cosmic-voids-...

  • sesm 2 days ago

    Overturning the Lambda-CDM model removes only one of the observations explainable by dark matter (the peaks in the CMB power spectrum). It's not the only observation.

    • throwawaymaths 2 days ago

      well the edges of a galaxy are in less of a deep gravity well than the center, so time and thus rotation should go faster. is that enough to account for flat rotation curves? i don't know enough to do a back of the envelope calculation
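
      for what it's worth, a crude version with toy numbers of my own (assumptions, not from the paper): in the weak-field limit the fractional clock-rate difference is about \Phi/c^2 ~ (v_c/c)^2, so

        # toy estimate of gravitational time dilation at a galaxy's edge
        v_c = 2.0e5              # typical circular speed, m/s (assumed)
        c = 3.0e8                # speed of light, m/s
        print((v_c / c) ** 2)    # ~4e-7 fractional clock-rate difference

      parts in 10^7, which looks far too small on its own to mimic the factor-of-a-few discrepancy in rotation curves.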

  • escape_goat 2 days ago

    Back here on the lay benches, I think the best starting point in Wikipedia is probably the article on inhomogeneous cosmology, of which the timescape cosmology proposed by David Wiltshire (listed as an author on this paper) in 2007 is a notable example; it is discussed in the article.

    <https://en.wikipedia.org/wiki/Inhomogeneous_cosmology>

    • Gooblebrai 2 days ago

      This is a mind-blowing theory!

  • astrobe_ 2 days ago

    > As a result, there could be empty regions of space in which billions of years more have elapsed

    If they are empty, those billion years didn't happen. But nothing is really empty, right?

    • pdonis 2 days ago

      > If they are empty, those billion years didn't happen.

      No, that's not correct. Here's a better way to look at it:

      In our cosmological models, we "slice up" the spacetime of the universe into slices of "space at a constant time"--each slice is like a "snapshot" of the space of the entire universe at a single instant of "cosmological time". The models, which assume homogeneity and isotropy, assume that the actual elapsed proper time at every point in space in each "snapshot" is the same--in other words, that "cosmological time" is also proper time for comoving observers everywhere in space at that instant of cosmological time--the time actually elapsed since the Big Bang on a clock moving with each observer.

      What these supernova papers are examining is the possibility that "cosmological time" and proper time (clock time) for comoving observers do not always match: roughly speaking, in areas with higher mass concentration (galaxy clusters), proper time lags behind cosmological time (the time we use in the math to label each "snapshot" slice of the space of the universe), and in areas with lower mass concentration (voids), proper time runs ahead of cosmological time. The idea is that this mismatch between proper time and cosmological time can be significant enough to affect the inferences we should be drawing from the supernova observations about the expansion history of the universe.
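
      A rough weak-field illustration of the sign of the effect (a sketch only, not the actual timescape calculation): with a perturbed FLRW metric,

        ds^2 = -\left(1 + \frac{2\Phi}{c^2}\right)c^2\,dt^2 + a^2(t)\left(1 - \frac{2\Phi}{c^2}\right)d\vec{x}^2
        \quad\Rightarrow\quad
        \frac{d\tau}{dt} \approx 1 + \frac{\Phi}{c^2}

      with \Phi < 0 in clusters (proper time lags behind cosmological time) and \Phi closer to zero in voids (proper time runs ahead). The contested question is whether this difference, integrated over gigayears, stops being negligible.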

      As far as I know the jury is still out on all this; claims by proponents that what is presented in these papers is already sufficient to require "foundational change" are, I think, premature. But it is certainly a line of research that is worth pursuing.

      • lukasb 2 days ago

        Does the idea of a single cosmological time even make sense? I thought one of the key parts of relativity is that which events happen simultaneously depends on your perspective.

        • pdonis 2 days ago

          > Does the idea of a single cosmological time even make sense?

          Sure, it's just a convenient choice of coordinates. Even in a model that is not exactly homogeneous and isotropic, it can still be a convenient choice of coordinates to have cosmological time track some kind of average of the low mass and high mass regions. As I understand it, that's basically what the alternate models described in the article are doing.

          > I thought one of the key parts of relativity is that which events happen simultaneously depends on your perspective.

          That's true, but it doesn't actually mean very much. In a particular spacetime geometry, you can still have particular things that are picked out physically by the properties of that specific spacetime, and one of them can be "cosmological time". In an exactly homogeneous and isotropic model, that time is picked out by the symmetries of homogeneity and isotropy. But even in a model where homogeneity and isotropy are only average properties, their average is still picked out physically--for example, by the CMB, which is much, much closer to being exactly homogeneous and isotropic than galaxies and galaxy clusters (it's homogeneous and isotropic to within about 1 part in 100,000). So picking "cosmological time" based on observers who see the CMB that way is a physical method of picking them out; it's not an arbitrary choice.

          Bear in mind that in special relativity, when you talk about relativity of simultaneity, you are talking about spacetime that is empty--there are no gravitating bodies anywhere. So all inertial frames are indeed equivalent in that spacetime, not just mathematically but physically. But as soon as you put gravitating bodies in, that symmetry is broken: the gravitating bodies have a definite state of motion, and that picks out certain choices of coordinates as being aligned with the gravitating bodies. So while it's true that you don't have to use those coordinates, they are convenient and they do reflect an actual physical property of the spacetime, and so does the definition of time they give.

        • layer8 2 days ago

          That’s only if you are moving relative to another observer. There is, however, an approximate “universal” rest frame that corresponds to the CMB (cosmic microwave background), and you can define simultaneity relative to that rest frame.

        • hnuser123456 a day ago

          You can define the moment when the universe stopped being a singularity as t=0 and its "end state" at t=1, without knowing what that end state is, and still make general observations about the beginning, middle, and end of the universe. Of course, you can't derive any specifics about specific regions of spacetime when thinking this way.

      • le-mark 2 days ago

        As a layman, what I don’t get is; the speed of light is constant, so wouldn’t that nullify any time/space fluctuations due to lack of mass/gravity?

        • pdonis 2 days ago

          > the speed of light is constant

          That's not a good way of describing the actual law in a curved spacetime, i.e., a spacetime that contains gravitating masses. In such a spacetime there is no single global definition of "speed"; you can't compare speeds at spatially separated points.

          A better way to state the law is that the light cone structure of the spacetime constrains the motion of all bodies: timelike bodies move within the light cones, lightlike bodies (like light itself) move exactly on the light cones. But once you state it that way, it becomes obvious that this law does not impose any constraints on "time/space fluctuations".

          • User23 2 days ago

            Einstein's wonderful book Relativity clarified (sort of) this for me. It's not too hard to develop an intuition about special relativity, but GR is a whole other ball of wax.

        • mgsouth 2 days ago

          As another layman, no, I don't think so.

          The "twin paradox" [1] is a prime example. The two twins depart from a common point in time and space, go about their separate travels, and meet again at a common point in space-time. Despite both twins always having the same constant speed of light, one of the twins takes a shorter path through time to get to the meeting point--one twin aged less than the other. In the paradox case, the shorter/longer paths are due to differences in acceleration. But the same thing happens due to differences in gravitation along two paths. (In fact, IIUC, acceleration and gravitational differences are the same thing.)

          Just thinking about the math makes my head hurt, but it's apparent that two different photons can have taken very different journeys to reach us. For example, the universe was much denser in the dim past. Old, highly red-shifted photons have spent a lot of time slogging through higher gravitational fields. As a layman, that would suggest to me that, on average, time would have... moved slower for them?... they would be even older than naive appearances suggest. I don't think the actual experts are naive, so that's been accounted for, or there's confounding factors. But I could also imagine that more chaotic differences, such as supernovas in denser galactic centers vs. the suburbs, or from galaxies embedded in huge filaments, could be hard to calculate.

          [1] https://en.wikipedia.org/wiki/Twin_paradox
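
          (For concreteness, the standard formula -- nothing specific to supernovae: each twin's elapsed time is the proper time along their own worldline,

            \tau = \int \sqrt{1 - \frac{v^2}{c^2}}\; dt

          in special relativity, with gravity entering as a similar \Phi/c^2 correction in the weak-field limit, so different journeys really do accumulate different amounts of time.)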

        • a_cardboard_box 2 days ago

          From our perspective, the light in the void moves faster than it does here. But so do clocks, so someone in the void would measure their light moving at the same speed as we measure our light moving here.

        • s1artibartfast a day ago

          speed of light is constant, but distance and time are not.

    • ben_w 2 days ago

      Assuming I correctly understood the argument in the link:

      Even if the space was truly empty, the expansion of that space would have gone on for longer, and thus things on opposite sides would eventually notice they were more distant.

      But also yes the space isn't really totally empty.

    • nine_k 2 days ago

      The higher the density, the more curved is the spacetime at that area, and the slower is the passage of time. You don't have to go to extremes like black holes vs absolute vacuum. A sufficient difference should be visible between regions closer to centers of galaxies, or just clusters of nearby galaxies, and really large "voids" between them, which contain some matter, and even some stars, but are vastly more empty. This is what the article explores.

      (This connects in a funny way to Vernor Vinge's SF idea of slower and faster areas of space. The "high" / "fast" space is mostly empty, so the time passes there faster than in the "unthinking depths" around galactic cores, and hugely more progress is done by civilizations in the "fast" space, as observed from the "slow" space.)

      • skirmish 2 days ago

        > Vernor Vinge's SF idea of slower and faster areas of space

        His ideas were more ambitious: there are advanced technologies (e.g. FTL travel) that work in "fast" space but completely stop working in the "slow zone" (where the Solar system is located). On the other hand, even human-level intelligence would stop functioning close to the galactic center, the crew would not be able to operate the ship and would be stranded.

        • lazide a day ago

          Also, if I remember correctly wasn’t it implied that this was constructed/controllable a bit, and was due to some kind of ‘nature preserve’ from a super alien race? I might be misremembering.

asplake 2 days ago

It’s early days on this, so let me ask again what I have asked previously: What does timescape do to estimates of the age of the universe?

  • sigmoid10 2 days ago

    It would mean that we literally can't calculate it anymore, because expansion and everything else we see might just be artefacts of inhomogeneities beyond the scale of the observable universe. But that would crash hard with our observation of the CMB and since this study only looks at supernovae, I would not bet on it holding up for long.

    • geysersam 2 days ago

      How would

      > that we literally can't calculate it (the age of the universe) anymore

      crash with our observation of the CMB?

      I don't see how us being unable to calculate a quantity from one set of observations could possibly clash with another set of observations (the cmb).

      What am I missing?

      • hnuser123456 a day ago

        The CMB suggests we get a picture of the entire early universe.

        However, other things are suggesting we might not be seeing the whole universe just by looking as far away as possible. It could be we can see some regions on the CMB that have already expanded outside of our observable universe. These regions aren't just "even fainter and we need a better telescope", they're "the last photon from that region that will ever reach us came and left billions of years ago."

        Therefore, there might not be one singular hubble constant, there might be two. One that applies to our local observable universe, and one that applies to the entire universe.

        It could be that the universe is 26 billion years old: https://academic.oup.com/mnras/article/524/3/3385/7221343?lo...

        And because at great enough distances (/times) expansion is faster than light, we simply can't see a significantly different epoch of the universe just by looking deeper.
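
        For scale, a back-of-the-envelope with round numbers: with recession speed v = H_0 d, the speed of light is reached at the Hubble radius,

          d_H = \frac{c}{H_0} \approx \frac{3 \times 10^5\ \mathrm{km/s}}{70\ \mathrm{km/s/Mpc}} \approx 4300\ \mathrm{Mpc} \approx 14\ \mathrm{billion\ light\ years}

        so regions beyond roughly that distance are receding from us faster than light today.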

    • User23 2 days ago

      Even more fun: once you abandon isotropy you don't even need to posit matter inhomogeneities. It could just be that spacetime itself has irregular topology.

      Which, incidentally, is probably a better theory than dark matter. For example it can produce the same results without the problem of undetectable matter.

      • cryptonector a day ago

        Presumably we should be able to build a theory of spacetime that yields no need for dark matter, but we're not there yet.

        On the other hand, determining the local time dilation factor based on all mass beyond the local area is essentially not possible. We can talk about how the great voids have less time dilation than galaxy clusters, sure. But what about the universe as a whole? Our universe could be embedded in a larger one that contributes to time dilation in ours and we could never sense that. Time dilation at cosmological scales is relative for this reason.

block_dagger 2 days ago

Maybe Vernor Vinge was right.

  • Vecr 2 days ago

    He wasn't. He stated from the start that all of his stories were gimmicked to remove the singularity.

    This theory does not do that.

SaintSeiya 2 days ago

The ΛCDM model always felt "wrong" in my gut: dark matter? dark energy? It's just the modern equivalent of the aether theory. The fact that an alternative is more complex to calculate is not an excuse to prefer the ΛCDM model. The God theory is even simpler, as it spares us any math and physics, yet we do not use it.

jmward01 2 days ago

The expansion of the universe has always come down to one question for me. In an expanding universe when you throw a ball up what speed does it come down?

  • pezezin a day ago

    My very limited understanding of the topic is that for a gravitationally bound system like the Earth, the usual rules apply, but on cosmological scales the expansion of the universe means that time-translation symmetry does not hold and thus conservation of energy is not well-defined.

    https://en.wikipedia.org/wiki/Conservation_of_energy#General...
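
    A quick order-of-magnitude check of the "bound systems follow the usual rules" part (my own toy numbers, not from the link): compare the rough scale of the cosmological acceleration, ~H_0^2 r, against ordinary gravity at the Earth's surface:

      # toy comparison: cosmological vs. Newtonian acceleration for a thrown ball
      H0 = 70e3 / 3.086e22         # Hubble constant, 70 km/s/Mpc converted to 1/s
      r = 6.371e6                  # Earth's radius, m
      g = 9.81                     # surface gravity, m/s^2
      a_cosmo = H0**2 * r          # crude scale of the expansion term
      print(a_cosmo, a_cosmo / g)  # ~3e-29 m/s^2, i.e. ~3e-30 of g: negligible

    (Strictly the relevant term is (ä/a)r rather than H^2 r, but it's the same order of magnitude.)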

    • jmward01 a day ago

      Getting rid of the time side then, think of it as an orbit. If the universe was expanding then something could be orbiting slower than gravity would allow. Basically this question keeps bringing me back to the ties between mass and the expansion of the universe. No matter how you look at it mass must be special when it comes to expansion because it is either giving off 'free' energy in the form of slow orbits and acceleration between two objects or it -isn't- giving off that energy and something is canceling it out. Following this rabbit-hole is pretty interesting at a minimum.

  • thomquaid a day ago

    if you throw at less than escape velocity, it comes down with about the energy you threw with, less system losses. if you throw at greater than escape velocity, almost any energy is possible, less system losses, if you allow enough time for it to come back after its trip around the solar system. if you throw it into the milky way, same thing, easier potentials. if you throw it at relativistic velocities, expansion of the universe could play a significant role.

nimish 2 days ago

Finally. We should use numerical relativity and simulate using full-fat GR. Not the half-assed approximations.

  • XorNot a day ago

    Yes I'm sure the problem was no physicist working in the field their entire career thought to just do this.

    • nimish 21 hours ago

      Well no one bothered to actually implement it so who cares whether they thought it first or not?

      • russdill 19 hours ago

        Spoiler alert, this paper does not reach these conclusions by doing the thing you are asking.

        • nimish 19 hours ago

          Hence the subjunctive :)

chuckwfinley 2 days ago

It is sure seeming like LCDM needs some work. It's not really clear if a timescape approach solves the outstanding issues though

uoaei 2 days ago

The "shut up and calculate" attitude has done a lot of harm to physics research over the past decades. It is quite remarkable and more than a bit surprising that the primary tenet of general relativity -- that spacetime behaves differently where there is curvature vs where there is not -- was not sufficiently accounted for seemingly by any researchers this entire time.

I am interested to see some retrospective meta-analysis of how many cosmological models have not suffered from this glaring omission. I suspect it's very few, but I also think it would have been difficult to do this kind of modeling before we were able to do analysis in silico, so there would be an obvious bias in the set of theories.

  • JumpCrisscross 2 days ago

    > The "shut up and calculate" attitude has done a lot of harm to physics research over the past decades

    This is a shut up and calculate paper. There is zero theoretical ground being broken. The meat is in the statistical analysis (which I concede is beyond me).

    • uoaei 2 days ago

      This paper is distinct in that it's cogent about the underlying cosmological principles. "Shut up and calculate" poisoned academic physics by eliding the necessity of thinking of systems as systems, and not merely as sets of equations to be manipulated.

  • sampo 2 days ago

    Also, the Pioneer anomaly was solved after dropping simple models that treated the space probe as a simple or spherical particle, and accounting for the 3-dimensional shape of the probe.

    Because of the shape of the space probe, part of the thermal radiation emitted from its surfaces was hitting other surfaces, and thus the probe did not emit radiation evenly in every direction.

    https://en.wikipedia.org/wiki/Pioneer_anomaly

  • caconym_ 2 days ago

    > It is quite remarkable and more than a bit surprising that the primary tenet of general relativity -- that spacetime behaves differently where there is curvature vs where there is not -- was not sufficiently accounted for seemingly by any researchers this entire time.

    This was also my first thought when I heard about this paper. It seems almost impossible that nobody in the entire contentious field of physical cosmology had considered whether our current consensus models account for the relativistic effects of the (known!) large scale structure of space.

    Following from that, my second thought was that maybe there is something more subtle about this analysis---maybe the question the researchers asked is less obvious than the headline makes it seem ("we forgot about relativity"). The subject matter is beyond me, so I can't answer that question myself, and I haven't found any good answers elsewhere.

    • sampo 2 days ago

      > It seems almost impossible that nobody in the entire contentious field of physical cosmology had considered whether our current consensus models account for the relativistic effects of the (known!) large scale structure of space.

      One of the authors of the present study (prof. Wiltshire) first published this idea 18 years ago: https://en.wikipedia.org/wiki/Inhomogeneous_cosmology#Timesc...

    • exe34 2 days ago

      GR calculations are very hard, and it's easy to think the effects aren't relevant outside of extreme conditions. this reminds me of the (sadly not very well supported) paper about gravitomagnetism explaining the rotation curves without dark matter.

    • justlikereddit 2 days ago

      The way I see it, people got stuck in a mindset of a universal background time that is pretty much what earth clocks run at, with any serious relativity effects being locally compartmentalized next to extreme cosmic phenomena.

      Adding dark matter to this mindset makes it even worse because it homogenizes everything even further towards a Universal Standard Timeframe when 80% of all mass is finely dispersed as a background fog.

      Put the Real back in Relativity.

      It's by far a more satisfying solution than magic mystery matter.

      My pet theory is that black holes are also vastly misunderstood because they're always seen from the Universal Standard Timeframe, if we probe a black hole and their local space from strongly relativistic timeframe they'll start to make more sense, but I'll leave that to the daydreams of the reader.

      • User23 2 days ago

        GR says that you can pick any frame that you want. The Earth-centered, Earth-fixed coordinate system happens to be really convenient practically, for things like navigation satellites.

        • uoaei a day ago

          In the context of GR, at cosmological scales, every frame is an inertial frame. That means fictitious forces will arise if you're not careful to account for that.

        • justlikereddit a day ago

          Sure you can do that. But the context of what is going on in different frames will be mostly lost.

          Explain the history of earth seen from a probe hovering one inch above the event horizon of Sagittarius A*?

          From the perspective of the probe: "The solar system, home of mankind was a blink in the sky that lasted 1 second according to probe time. As was most other stars, as seen from here. Surely nothing important could ever happen in such a short timeframe. Fin."

          There are important nuances to what perspective one has. Something that seems to not just be missed but actively fought against in modernity.

  • ANewFormation 2 days ago

    This is also the biggest surprise for me, but I'd frame it as people largely just handwaving in the assumption of a (at scale) isotropic universe, even though that's highly questionable.

    I think the practical issue is that that assumption let a lot more work get done than would have been possible otherwise. Of course if it turns out the universe is not isotropic then most all of that work is worth less than nothing. So publish or perish strikes again?

    • programjames 2 days ago

      It is somewhat surprising, because one of the most famous papers in chaos theory, "The Applicability of the Third Integral of Motion" (Henon & Heiles), basically starts by saying a similar assumption isn't true, that stars aren't ergodically distributed in the axial/radial directions.

      If you have five equations of motion in a six-dimensional phase space (3 space + 3 velocity coordinates), you can compute the future trajectory of each point. Two equations come from constant energy & angular momentum, and these constrain where in phase-space the trajectories can go. The other two equations do not make any such constraints, which implies stars are at least ergodically distributed in a 2D phase-space. Since none of these equations constrain the axial/radial velocity, you would expect the dispersions to be equal in both directions. However, this turns out not to be the case. This means there must be a third isolating equation of motion out there, and the surprising thing Henon & Heiles found is that it's chaotic! Sometimes it constrains points to 2D regions of phase-space (i.e. concentric circles of orbits), and other times it lets them move in a 3D region (i.e. chaotic trajectories filling the space).
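
      For the curious, the system Hénon & Heiles actually integrated is tiny: a 2D harmonic oscillator plus a cubic coupling. A minimal sketch (the step size and initial condition are arbitrary choices of mine):

        import numpy as np

        def deriv(s):
            # s = [x, y, px, py]; H = (px^2 + py^2)/2 + (x^2 + y^2)/2 + x^2*y - y^3/3
            x, y, px, py = s
            return np.array([px, py, -x - 2*x*y, -y - x*x + y*y])

        def rk4_step(s, h):
            # classic 4th-order Runge-Kutta step
            k1 = deriv(s)
            k2 = deriv(s + h/2*k1)
            k3 = deriv(s + h/2*k2)
            k4 = deriv(s + h*k3)
            return s + h/6*(k1 + 2*k2 + 2*k3 + k4)

        def energy(s):
            x, y, px, py = s
            return (px**2 + py**2)/2 + (x**2 + y**2)/2 + x*x*y - y**3/3

        s = np.array([0.0, 0.1, 0.35, 0.0])  # arbitrary low-energy initial condition
        E0 = energy(s)
        for _ in range(100_000):
            s = rk4_step(s, 0.01)
        print(E0, energy(s))  # energy is conserved; whether the orbit is regular or
                              # chaotic depends on that energy

      Their famous result: below E ~ 1/12 almost all orbits look regular (the "third integral" seems to exist), while near E ~ 1/6 most of phase space is chaotic.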

revskill 2 days ago

Not much related, but could we somehow calculate the gravitational constant with only math?