How different might the laws of nature have been (in some other logically possible but nomologically impossible world)? Are there any limits?
Answer by Massimo Pigliucci
The question of laws of nature, their ontological status, and their significance for our understanding of the world, is very much an open one. Moreover, it is a question at the fascinating interface among metaphysics, epistemology, philosophy of science and, of course, natural science itself (particularly fundamental physics, but also chemistry and biology).
To begin with, let us get straight the difference between nomological and logical possibilities. Nomological means ‘relating to or denoting certain principles, such as laws of nature, that are neither logically necessary nor theoretically explicable, but are simply taken as true,’ the word originating from the Greek nomos (law). That standard dictionary definition gets at the heart of the matter by affirming that nomological principles are neither logically necessary nor ‘theoretically explicable.’ Take, for instance, the law of conservation of energy, which states that the total energy of a closed system cannot change over time (i.e., energy can only be transformed, not created or destroyed). This is not logically necessary in the sense that its violation would not entail any logical contradiction (unlike, say, the idea that I can simultaneously be and not be me — which violates the principle of non-contradiction in classical logic). At the same time, we don’t really know why the law of conservation actually applies to the universe as we understand it, i.e., we do not have a theoretical account of it. It is more like a brute empirical fact about nature, or a theoretical axiom from which other things can be derived.
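To make the conservation claim concrete, here is a minimal numerical sketch (my own illustration, not part of the original discussion) using the textbook case of an object in free fall: kinetic and potential energy trade off as the object drops, but their sum stays constant at every instant.

```python
# Free fall under gravity: KE and PE trade off, KE + PE is constant.
# All numbers here are illustrative, not drawn from the essay.

g = 9.81    # gravitational acceleration, m/s^2
m = 2.0     # mass, kg
h0 = 100.0  # initial height, m

for t in [0.0, 1.0, 2.0, 3.0]:
    v = g * t                  # speed after t seconds
    h = h0 - 0.5 * g * t**2    # height after t seconds
    ke = 0.5 * m * v**2        # kinetic energy
    pe = m * g * h             # potential energy
    total = ke + pe            # stays at m*g*h0 (= 1962 J) throughout
    print(f"t={t:.0f}s  KE={ke:7.1f} J  PE={pe:7.1f} J  total={total:.1f} J")
```

The total never changes; energy merely moves between the two forms, which is exactly the sense in which it ‘can only be transformed.’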
In a sense, then, logical possibility is about what is possible in principle, while nomological possibility concerns what is actually physically possible, given the laws of the universe as they stand. The first set is, at first glance, much larger than the second one.
But, of course, things are not quite that simple, as physicists have long played with a few ideas that entail the tantalizing prospect of bringing the range of nomological possibilities closer to that of logical ones.
There are at least three such proposals on the table. One is the so-called many-worlds (or Everett, named after American physicist Hugh Everett III, who first proposed it) ‘interpretation’ of quantum mechanics. This is the idea that the universal wave function is objectively real (and doesn’t ‘collapse,’ contrary to other interpretations of quantum mechanics), so that every time events could take more than one course (say, I may or may not decide to finish writing this essay) the universe literally splits into two, in one version of which course of action A (I finish writing) takes place, while in the other course of action B (I don’t finish) occurs. There’s even a fun app (aptly called ‘Universe Splitter’) which allows you to make random decisions and track the ensuing branching of universes you keep creating while using the app. My understanding, however (and I stand to be corrected by a physicist, should that be the case) is that Everettian quantum mechanics doesn’t fool around with the laws of the universe itself, so that the number of nomological possibilities — while vastly augmented by the splitting procedure — would still be far smaller than the number of logical possibilities, because each course of action (each split universe) would still be subject to the laws of nature as we understand them.
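As a toy illustration of the branch-counting point (my own sketch, with ‘A’ and ‘B’ standing in for the two courses of action): n independent binary choice points yield 2**n branch histories, an exponentially growing set, yet each branch remains governed by the same physical laws.

```python
# Toy model of Everettian branch counting (illustrative only).
from itertools import product

def branches(n):
    """All 2**n branch histories for n binary choice points."""
    return list(product("AB", repeat=n))

for n in range(1, 5):
    print(n, "choice points ->", len(branches(n)), "branches")
# The count grows exponentially, but every branch still obeys the
# same laws of nature, so the set of nomological possibilities,
# while enlarged, never reaches the full set of logical ones.
```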
This limitation, however, does not hold for another bold idea currently entertained by cosmologists: the multiverse. This is the conjecture — as yet with no empirical basis, but compatible with what fundamental physics tells us about the world — that our universe is just one of infinitely many, constantly being created and destroyed in a much larger milieu, i.e., the multiverse. According to some physicists, such as Lee Smolin of the Perimeter Institute in Canada, different universes may be characterized by different sets of laws of physics, which means that a much larger number of logical possibilities would become nomologically implemented as well, just not all in the same ‘universe.’
Thirdly, we have theoretical physicist Max Tegmark’s idea of the ‘mathematical universe,’ the contention that the fundamental structure of the universe is mathematics (as opposed to simply being described by it). The building blocks of reality are, according to Tegmark, mathematical structures, not objects or particles. A corollary of this idea is that all possible (i.e., all logically consistent) mathematical structures do exist, in one way (or one universe) or the other, so that the logical and nomological sets would coincide.
It is hard to know what to make of this notion (in what sense am I, or my laptop ‘made of’ mathematical structures?), but it is certainly a venerably old one, going back to Pythagoras (in the philosophy of mathematics, Tegmark’s proposal is aptly referred to as Pythagoreanism about mathematical objects).
One more thing to consider within the context of this discussion. So far, I have assumed that the nomological set is defined by fundamental laws, i.e., by laws that apply always and everywhere (at least in a given universe). But some philosophers of science — chiefly Nancy Cartwright and Ian Hacking — maintain that there are no such things as universal laws, that all so-called laws of nature are phenomenological, i.e., they are arrived at by approximately applying to real phenomena a set of highly abstract, and quite literally false, idealizations. Consider, for instance, Galileo’s law of inertia, which eventually led to Newton’s first law of motion. Galileo arrived at the formulation of the law by carrying out thought experiments involving idealized frictionless planes, objects that simply do not exist in the real world. His results, however, hold approximately true for real planes and surfaces, and have therefore been generalized from phenomenological (describing a range of phenomena) to fundamental (applying always and everywhere). Cartwright and Hacking, however, invoke basic empiricist principles to deny that leap, suggesting that as far as we know all laws of physics are only approximately and locally valid. If that is the case, then there are no such things as nomological possibilities, strictly speaking.
Answer by Craig Skinner
I’d say no limits at all.
First, I think all logically possible combinations of variants of laws could have occurred provided we accept that possible universes thereby include those that last less than a second, those where nothing ever happens, those with different dimensions, say 2 or 32, and so forth, most of these universes being lifeless of course.
Secondly, why should logic always apply in a universe? We must include the myriad illogical ones too.
I don’t want to get bogged down in what exactly a law of nature is and how it can be differentiated from an accidental exceptionless regularity. Let us just assume, as does your question, that our world is ordered by laws: some causal (such as what happens when two chemical substances combine); some conservational (energy, charge, momentum, parity); some deterministic, others probabilistic (radioactivity for instance). And let’s include the constants of nature such as the strength of each of the four forces, the quantum of charge, the fine-structure constant, and the speed of light.
Could the suite of laws and constants in our world have been different?
Leibniz certainly thought so. He felt God could have decreed whatever laws he wanted, but in fact chose the suite that gave the best possible world, one that included creatures with free will who could relate (or not, as they chose) to a loving God.
And of course scientists of that era mostly felt they were indeed discovering God’s laws.
Later scientists and philosophers, taking a naturalistic viewpoint, asked whether the laws might have been different. A hundred years ago, many, perhaps most, would have said no, there is only one logically consistent set of laws – we can’t yet see this, but when we fully understand all the laws, we will see that no variation is logically possible.
These days, fewer think this. However, it has become clear that slight variation from the actual constants and laws would produce worlds where intelligent life is impossible. If gravity were a tiny bit stronger, the universe would have collapsed moments after it started; with gravity a bit weaker, no stars or planets would have formed. If the strong nuclear force were weaker, no stable elements could form; if stronger, no carbon or oxygen. And there are many other examples. This is known as the fine-tuning problem – how come the laws are just right for us humans to exist?
No problem for religious people: God made the laws that way. No problem for those who accept it as a brute fact: that’s just the way the world is, and if it weren’t, we wouldn’t be here (the simple anthropic principle).
An alternative favoured by some is that all possible combinations of laws/constants occur in different universes (a multiverse), so that, by chance, a few universes will have the fine tuned laws, and, of course, we live in such a universe.
Kant considered that the laws of nature were part of the forms of our perception and the categories of our understanding, without which we could have no coherent experience at all, i.e. any experienced world necessarily exhibits space, time, causality and other regularities. Exhibits them, that is, in our experience (the world of appearances); according to Kant, we can know nothing of the world as it is in itself.
Kant also felt that mathematical truths, although a priori, were not analytic (true by reason of the meaning of the terms) but synthetic. He was thereby obliged to admit that, say, 5+7 might not have been 12. Yet this seems a necessary truth. He solved it nicely by saying that 5+7=12 was necessarily true in any experienced world, such as ours, and could be false only in worlds where there was no experience, so could not be false in any world we found ourselves in. Such worlds would be so disordered and chaotic that ordered entities capable of experience would be out of the question. I rather agree: worlds where logic doesn’t hold might exist, but these worlds couldn’t contain the likes of us.
In short, I think all possible variations of our laws of nature could have been. And for all we know, all are instantiated in some universe or other within a vast, maybe infinite, multiverse.
Finally, if you feel there must be a reason for everything (Principle of Sufficient Reason), and are in a speculative mood, you can ask what metarule decides the choice of laws.
Leibniz favoured Goodness. I’ve rather assumed Fullness (all possibilities occur) or perhaps No Rule (a random selection of all possibilities). It can’t be Simplicity, because the simplest option would be absolutely nothing rather than anything at all, and that’s not the case.
Then one can ask what higher-level metametarule decides whether Fullness, Goodness or No Rule occurs at the metalevel. Simplicity at the metametalevel would favour No Rule. No Rule would favour Randomness. Fullness would favour all rules. Goodness would favour itself, or maybe this is impossible (violates the Foundation Principle, like a cause causing itself, or God creating himself).
More can be said on this issue of levels and Selectors but I will leave it at that.
One thought on “How different might the laws of nature have been?”
I am still learning about all this, but my view is that there is a multiverse within the current universe, and that this is really a matter of perception. This brings in Deleuzian notions of difference and multiplicity. It also helps us to get away from the unsettling problem that there are many universes and replicas of ourselves. Does this match the view of Lee Smolin? I believe there is empirical evidence for difference, though. Any suggestions on how I can pursue this line of thought? And how popular is Deleuze in the philosophy of science? Thanks for the great posts!