Tag Archives: cosmology

Applied Cosmology: The Holographic Principle

The Holographic Principle says that a full description of a volume of space is encoded on the surface that bounds it. The idea arises from black hole thermodynamics, where a black hole’s entropy scales with its surface area, not its volume. Everything there is to know about the black hole’s internal content is encoded on its boundary.

Software components have boundaries defined by interfaces, which encapsulate everything an outsider needs to know in order to use them. Everything about a component’s interior is represented by its surface at the boundary, so the component can be treated as a black box.
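
To make this concrete, here is a minimal sketch in Java (the names are hypothetical, invented purely for illustration): the client sees only the surface, and whatever sits behind it can be treated as a black box.

```java
// Sketch.java - hypothetical names; each piece would normally live in its own file.
public class Sketch {
    // A component's "surface": everything a client needs to know.
    interface TemperatureSensor {
        double readCelsius();
    }

    // One possible "interior" hidden behind the surface.
    static class ThermocoupleSensor implements TemperatureSensor {
        public double readCelsius() {
            // Details of the measurement are encapsulated here.
            return 21.5;
        }
    }

    public static void main(String[] args) {
        TemperatureSensor sensor = new ThermocoupleSensor();
        // The client treats the component as a black box: all it knows is the surface.
        System.out.println(sensor.readCelsius() + " C");
    }
}
```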

Applied Cosmology: Self Similar Paradigm

Robert Oldershaw’s research on the Self-Similar Cosmological Paradigm recognizes that nature is organized in a stratified hierarchy, where every level resembles the others. The shapes and motions of atoms are similar to those of stellar systems, and the similarities extend from the stellar scale to the galactic scale and beyond.

Managing complexity greatly influences software design, and stratified hierarchy is a familiar idea in this discipline.

At the atomic level, we organize our code into units. Each unit is a module with a boundary, which exposes an interface that governs how clients interact with the unit. The unit’s implementation is hidden behind this boundary, enabling it to change independently of other units as much as possible.

We build upon units by reusing modules and integrating them into larger units, which are themselves modular and reusable in the same way. Assembling modules into integrated components is the bread and butter of object-oriented programming. This approach scales up to the level of an application, which exhibits uniformity of platform technologies, programming language, design metaphors, conventions, and development resources (tools, processes, organizations).
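
As a rough sketch of this assembly (again with hypothetical names, not any particular framework), a larger unit is wired together from smaller units purely through their interfaces, and itself exposes nothing but an interface:

```java
// Composition.java - a hypothetical sketch of assembling modules into a larger module.
public class Composition {
    interface Parser { int parse(String text); }
    interface Validator { boolean isValid(int value); }

    // The larger unit is itself a module with its own surface.
    interface OrderIntake { boolean accept(String rawOrder); }

    static class SimpleOrderIntake implements OrderIntake {
        private final Parser parser;
        private final Validator validator;

        SimpleOrderIntake(Parser parser, Validator validator) {
            this.parser = parser;        // smaller units are wired in by interface only
            this.validator = validator;
        }

        public boolean accept(String rawOrder) {
            return validator.isValid(parser.parse(rawOrder));
        }
    }

    public static void main(String[] args) {
        OrderIntake intake = new SimpleOrderIntake(Integer::parseInt, v -> v > 0);
        System.out.println(intake.accept("42"));  // true
    }
}
```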

The next level of stratification exists because that uniformity cannot be maintained across applications. The self-similarity, however, is unbroken. We remain true to the principles of modular reuse: we continue to define a boundary with interfaces that encapsulate the implementation, and we continue to integrate applications as components into larger-scale components that can themselves be assembled further.

Enterprises are attempting to enable even higher levels of stratification. They define how an organization functions and how it interfaces with other organizations, with respect to protocols for human interaction as well as information systems. Organizations are integrated into business units that are integrated into businesses at local, national, multi-national, and global scales. Warren Buffett’s Berkshire Hathaway has demonstrated how entire enterprises exhibit this kind of modular assembly.

The same pattern manifests across enterprises and across industries. A company exposes its products and services through an interface (branding, pricing, customer experience) that encapsulates its internal implementation. Through these protocols, we integrate across industries to create supply chains that provide ever more complex products and services.

Applied Cosmology: Machian Dynamics

Julian Barbour wrote the book “The End of Time: The Next Revolution in Physics” [http://www.platonia.com/ideas.html]. He argues that our failure to unify general relativity with quantum theory comes from an ill-conceived preoccupation with time as a necessary component of such a theory. According to Machian Dynamics, a proper description of reality consists of the relationships between real things, not a description with respect to an imaginary background (space and time). All you have, therefore, is a configuration of things, which undergoes changes in arrangement. The path through this configuration-space is what we perceive as the flow of time.

We apply this very model of the universe in configuration management.

Software release management is a configuration management problem. The things in configuration-space are source files. A path through configuration-space captures the versions of these source files relative to each other as releases of the software are built. Our notion of time is defined with respect to these releases.
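
A minimal sketch of this view, with made-up types rather than any particular tool: a release is a point in configuration-space, a mapping of source files to versions, and “time” is simply position along the ordered path of releases.

```java
import java.util.List;
import java.util.Map;

// ReleasePath.java - hypothetical sketch: releases as points in configuration-space.
public class ReleasePath {
    // A configuration: which version of each source file makes up this release.
    record Release(String name, Map<String, String> fileVersions) {}

    public static void main(String[] args) {
        // The "path through configuration-space" is the ordered list of releases.
        List<Release> path = List.of(
            new Release("1.0", Map.of("core.c", "r101", "ui.c", "r87")),
            new Release("1.1", Map.of("core.c", "r105", "ui.c", "r87")),
            new Release("2.0", Map.of("core.c", "r140", "ui.c", "r92"))
        );
        // "Time" here is nothing more than position along this path.
        path.forEach(r -> System.out.println(r.name() + " -> " + r.fileVersions()));
    }
}
```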

Enterprise resource management in the communications industry involves many configuration management problems in various domains. We normally refer to such applications as Operations Support Systems.

In network resource management, the configuration-space includes the devices and other resources in the network, their connectivity, and the metadata associated with that connectivity arrangement (what is normally called a “device configuration”, a term best avoided in this discussion to prevent confusion with configuration-space).

In service resource management, the configuration-space includes services, their resource allocations, and the subscription metadata or “design” (normally called a “service configuration”, a term likewise avoided here for the same reason).

Such applications need a notion of configuration-space because they cannot operate in a world limited to a fixed background of space and time. We need to be able to travel backward and forward in time arbitrarily, to see how the world looked in the past from the perspective of a particular transaction. These applications also let users hypothesize many possible futures, perhaps only one of which is brought into reality through a rigorous process of analysis, design, planning, procurement, construction, and project management. Reality is always from the perspective of the observer, and one’s frame of reference is always somewhere on the path in configuration-space.
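
A sketch of the same idea with branching, again using hypothetical names: each configuration keeps a reference to the configuration it was derived from, so the past stays reachable and several candidate futures can coexist until one is realized.

```java
import java.util.Map;

// ConfigurationSpace.java - hypothetical sketch of branching configurations.
public class ConfigurationSpace {
    // Each configuration records its parent, so history and alternatives are first class.
    record Configuration(String label, Configuration parent, Map<String, String> state) {}

    public static void main(String[] args) {
        Configuration asBuilt = new Configuration("as-built", null,
                Map.of("routerA-port1", "linked-to-routerB-port3"));

        // Two hypothetical futures branching from the same observed present.
        Configuration planA = new Configuration("plan-A", asBuilt,
                Map.of("routerA-port1", "linked-to-routerC-port2"));
        Configuration planB = new Configuration("plan-B", asBuilt,
                Map.of("routerA-port1", "decommissioned"));

        // "Travelling in time" is walking the parent links from any frame of reference.
        for (Configuration c = planA; c != null; c = c.parent()) {
            System.out.println(c.label() + " " + c.state());
        }
        System.out.println("alternative future: " + planB.label() + " " + planB.state());
    }
}
```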

Software engineering is applied cosmology

Engineering is applied science. Some people believe that software engineering is applied computer science. In a limited sense, it is. But software is not entirely separate from hardware. Applications are not entirely separate from processes. Systems are not entirely separate from enterprises. Corporations are not entirely separate from markets. For this reason, I believe what we do is not really software engineering at all; it is not limited to applied computer science. Our engineering discipline is actually applied cosmology.

dark matter – skepticism

I remain skeptical of both dark matter and dark energy. I don’t believe that either is a valid concept. They are hypothesized as explanations for observations that defy the current theories of how ordinary matter and energy ought to behave. I don’t have an argument to disprove either concept, but I do believe the burden of proof is on those who assert the existence of dark matter and dark energy. In this article, I would like to present an alternative theory to explain the anomalous behavior that proponents of dark matter use to support their hypothesis.

We must begin with an introduction to the topic of dark matter. One reason dark matter is hypothesized is to explain the unexpected rotational speeds of galaxies. In spiral galaxies like our own, stars orbiting close to the central super-massive black hole should appear to orbit very quickly, while stars orbiting farther away should move much more slowly, just as the planets do in our solar system. But they don’t. The galaxy rotates more like a wheel.
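
For reference, the expectation being described is the Keplerian one: for a roughly circular orbit around a dominant central mass M, balancing gravity against centripetal force gives

v = sqrt(G * M / r)

so orbital speed should fall off as 1/sqrt(r) with distance, which is what the planets do. Measured galactic rotation curves stay roughly flat instead.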

A wheel rotates the way it does because it is solid. The molecules that make up the solid are far apart, and the spaces between them are enormous, yet the electrostatic forces that bond the molecules are strong enough to maintain a rigid structure.

Perhaps the stars within a galaxy travel together like a solid. Could the mutual gravitation between neighboring stars be strong enough to hold them together in a somewhat rigid configuration? That seems more plausible than viewing each star as independently orbiting the central super-massive black hole.

The Milky Way has a mass of roughly 1,250 billion solar masses. Its super-massive black hole, Sagittarius A*, has a mass of about 4.1 million solar masses. Compare these proportions to the Solar System, where the Sun, at about 2 * 10^30 kg, is hundreds of times more massive than all of the planets combined (roughly 2.7 * 10^27 kg). The Sun’s dominant mass explains why the planets orbit the Sun. Sgr A* is puny relative to the stars in the galaxy. Perhaps this is why the mutual gravitation of neighboring stars could hold them in a nearly rigid configuration, with these influences dominating over the gravitational force of the central super-massive black hole.
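
Rough back-of-the-envelope ratios using the figures above:

Milky Way / Sgr A* ≈ (1.25 * 10^12) / (4.1 * 10^6) ≈ 3 * 10^5
Sun / planets ≈ (2 * 10^30 kg) / (2.7 * 10^27 kg) ≈ 7 * 10^2

The Sun outweighs everything orbiting it by a factor of several hundred, while Sgr A* is outweighed by the rest of the galaxy by a factor of a few hundred thousand.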

I don’t have the math skills to test that hypothesis. But it’s fun to wonder about such things in the hopes that someone with skills might think of the same idea and publish a legitimate version of the theory.

the shape of a galaxy and its black hole

I was reading this article: Stirred, Not Shaken. Black Hole Antics Puff Up Whopper of a Galaxy.

Articles like this make me wonder whether the shape of a galaxy reflects the influences (gravity, spin, active/inactive state) of the central black hole, or the lack thereof. Maybe a black hole with a mass and spin (moment of inertia) above a certain threshold causes a galaxy to become more spiral. Maybe if it spins with a wobble, the spiral develops a central bar. Maybe if the black hole’s moment of inertia falls below a certain threshold in proportion to the rest of the galaxy’s mass, the galaxy becomes elliptical.

My thoughts return to the weird relationship between the surface area of the black hole’s event horizon (not a physical thing in itself, since “it” has no energy) and its entropy. Since entropy is a measure of information, the black hole behaves like a hologram. It again leads one to wonder whether the shape of the galaxy is a reflection of the information contained in the black hole.

quantum gravity unsolved

Modern physics is fundamentally broken. Quantum theory explains how the universe works at small scales, such as atoms, and it is remarkably accurate, with incredibly good agreement with experiment. The theory of general relativity explains how the universe works at large scales, such as stars and galaxies, and it is equally accurate, with equally good agreement with experiment. However, the two theories disagree with each other. Both are mostly correct, yet each is wrong in some way. After decades of trying to reconcile the two, physicists are not close to formulating a theory of everything. There are several promising avenues of research, such as string theory and loop quantum gravity, but none of these has so far been successful.

It makes me wonder about where the error in thinking could be to have misled physicists throughout the world onto roads that are probably dead ends.

It has been pretty obvious to many where the flaw in quantum theory lies. It treats the time dimension as independent from the three space dimensions, in complete contradiction to general relativity, which states that space and time form a single manifold. Quantum theory treats space and time as an absolute background, as if they existed independently of the particles and fields that make up reality. However, only the particles and fields are real. Space and time as a background do not actually exist; they are mathematical constructs reflecting the geometry of some physical theory.

However, where is general relativity flawed? This is not as obvious. The most obvious conflict with quantum theory has always been gravity, which general relativity claims is the curvature of space-time. Quantum theory has never had a good explanation of gravity. The source of gravity is mass-energy.

Something else we should be aware of is the Standard Model of particle physics. It, too, represents our current best understanding of the universe, and it, too, is flawed: it does not include gravity, among other omissions. Interestingly enough, it predicts a particle called the Higgs boson, which is responsible for mass. However, the Higgs boson has never been experimentally observed. The Large Hadron Collider (LHC) is supposed to confirm or refute the existence of the Higgs boson when it becomes operational.

Maybe it is more than a coincidence that the Standard Model, which does not include gravity, also predicts the origin of mass, which has so far eluded observation. Given that gravity is where the conflict between quantum theory and general relativity lies, perhaps we should take a closer look at mass-energy.

Let’s take a look at the Schrödinger equation from quantum theory:

iℏ ∂Ψ/∂t = HΨ

The H is the Hamiltonian, which is the total energy of the system (potential + kinetic). The kinetic energy is proportional to mass, and special relativity tells us that mass is equivalent to a whole lot of energy: E = mc^2.

Special relativity also tells us that mass is relative: the mass of something depends on how fast it is moving relative to the observer. The observer and this relative motion are further things the Schrödinger equation does not account for.

We keep encountering mass in these equations. Maybe we don’t understand mass as well as we think we do, which leads to a misunderstanding of gravity and, consequently, of the geometry of space-time. I wonder if all of this points to mass as the culprit, the place where we have been getting it all wrong. Mass is where I suspect the problem to be. If the LHC does not find the Higgs boson, I think this will tell us just how wrong we are about mass.

universe of events – cosmology in software

On my second reading of Three Roads to Quantum Gravity by Lee Smolin, the concept of a relational universe stands out as something fundamentally important.

Each measurement is supposed to reveal the state of the particle, frozen at some moment of time. A series of measurements is like a series of movie stills — they are all frozen moments.

The idea of a state in Newtonian physics shares with classical sculpture and painting the illusion of the frozen moment. This gives rise to the illusion that the world is composed of objects. (p.53)

In object-oriented programming, objects correspond to the particles. The focus is on capturing the state of the object, frozen at some moment of time. As methods are called on the object, the changes to its state (variables) are like a series of movie stills.

Lee Smolin goes on to write:

If this were really the way the world is, then the primary description of something would be how it is, and change in it would be secondary. Change would be nothing but alterations in how something is. But relativity and quantum theory each tell us that this is not how the world is. They tell us — no, better they scream at us — that our world is a history of processes. Motion and change are primary. Nothing is, except in a very approximate and temporary sense. How something is, or what its state is, is an illusion. It may be a useful illusion for some purposes, but if we want to think fundamentally we must not lose sight of the essential fact that ‘is’ is an illusion. So to speak the language of the new physics we must learn a vocabulary in which process is more important than, and prior to, stasis. Actually, there is already available a suitable and very simple language which you will have no trouble understanding.

From this new point of view, the universe consists of a large number of events. An event may be thought of as the smallest part of a process, a smallest unit of change. But do not think of an event happening to an otherwise static object. It is just a change, no more than that.

The universe of events is a relational universe. That is, all its properties are described in terms of relationships between the events. The most important relationship that two events can have is causality. This is the same notion of causality that we found was essential to make sense of stories.

If objects are merely an illusion, and it is really causal events that are fundamental to modeling a universe that is relational and dynamical, then perhaps we should re-examine how effective object-oriented programming is at producing software that models real-world processes. Classes of objects definitely focus on the static structure of the universe. The methods on these classes can be considered to correspond to events, which carry information in, perform some computation, and carry information out. However, the causal relationships between events are buried in the procedural code within each method; they are not expressed in a first-class manner.
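
By way of contrast, here is a minimal sketch (hypothetical names, not a framework) in which events, rather than object state, are the first-class citizens, and causality is an explicit reference from an event to the events that caused it:

```java
import java.util.List;

// CausalEvents.java - hypothetical sketch: events with explicit causal links.
public class CausalEvents {
    // An event is a unit of change; its causes are other events, not hidden control flow.
    record Event(String id, String description, List<Event> causes) {}

    public static void main(String[] args) {
        Event orderPlaced   = new Event("e1", "order placed", List.of());
        Event stockReserved = new Event("e2", "stock reserved", List.of(orderPlaced));
        Event orderShipped  = new Event("e3", "order shipped", List.of(stockReserved));

        // The causal history is a property of the data itself, not of procedural code.
        printHistory(orderShipped, "");
    }

    static void printHistory(Event e, String indent) {
        System.out.println(indent + e.id() + ": " + e.description());
        e.causes().forEach(cause -> printHistory(cause, indent + "  "));
    }
}
```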

Personal productivity applications like spreadsheets and word processors model objects (e.g., documents) and relationships that undergo relatively simple processes involving only a few actors. The causal history of events is not as important, because there is only one set of objects in a document to keep consistent, and the series-of-frozen-moments model of the universe works rather well.

Enterprise applications such as Enterprise Resource Planning (ERP) facilitate a multitude of parallel business processes that involve many actors and sophisticated collaborations. Each actor performs transactions against some subset of objects, each of which is progressing through a distinct life cycle. Maintaining integrity among the objects changed by these many concurrent events is incredibly complicated. It becomes important to keep a causal history of events in addition to the current state of the universe, as well as a schedule of future events (for planning) that have not yet come to pass. A series of frozen moments becomes less appealing, whereas a set of processes and events seems like a better description of the universe.
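
Continuing the sketch with hypothetical names: the current state of an object is not the primary fact but something derived by replaying its history up to a chosen frame of reference, and planned events simply sit later on the same path.

```java
import java.util.List;

// EventReplay.java - hypothetical sketch: state derived from a causal history of events.
public class EventReplay {
    record Change(String at, String field, String value, boolean planned) {}

    public static void main(String[] args) {
        List<Change> history = List.of(
            new Change("2009-01-10", "status", "ordered",   false),
            new Change("2009-02-02", "status", "installed", false),
            new Change("2009-06-15", "status", "upgraded",  true)   // a possible future
        );

        // "Now" is just a frame of reference on the path; replay up to it to get the state.
        System.out.println("state as of 2009-03-01: " + stateAsOf(history, "2009-03-01"));
        System.out.println("state if the plan happens: " + stateAsOf(history, "2009-12-31"));
    }

    static String stateAsOf(List<Change> history, String asOf) {
        String state = "unknown";
        for (Change c : history) {
            if (c.at().compareTo(asOf) <= 0) state = c.value();
        }
        return state;
    }
}
```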

cosmological constant

The energy density of empty space is called the cosmological constant. It accounts for the accelerating expansion of the universe. Its value is approximately 10^-29 g/cm^3, an incredibly tiny positive number. They call this stuff dark energy.

As the universe expands, the density of ordinary matter like stars and rocks decreases because new matter is not magically appearing to fill in the space. The incredible thing about the cosmological constant is that the energy density of vacuum does not decrease as the universe expands with time. If this does not surprise you, then let me explore this a little deeper.

mass = density * volume

If the universe is expanding, then its volume is growing with time. If the density remains constant, then the mass-energy of the universe is ever increasing.
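
To make that concrete with a rough illustration (not a careful cosmological calculation): if the vacuum energy density stays fixed while the volume of some expanding region doubles, then

energy before = density * V
energy after = density * (2 * V) = 2 * (density * V)

so the total vacuum energy attributed to that region doubles along with it.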

Reconcile that with the First Law of Thermodynamics.

In any process, the total energy of the universe remains the same.

Are we to believe that the universe itself violates the First Law of Thermodynamics?

expanding universe

(follow-up to 2003/07/27)

Scientists observe the following phenomena:

  1. Based on observations of supernovae, galaxies are known to be moving farther away from each other in the universe.
  2. The farther away a galaxy is from us, the greater the red shift in its light. Similar to the Doppler effect, the faster the galaxy is moving away from us, the more the wavelengths of its light are shifted towards the red end of the electromagnetic spectrum. This means that the farther away a galaxy is, the faster it is moving away from us (the relation is sketched just after this list).
  3. The farther away the galaxy is from us, the more time it takes for light to travel, before it arrives for us to observe. Therefore, the greater the distance travelled, the farther back into history we are observing.
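
The proportionality described in point 2 is Hubble’s law. Roughly,

v ≈ H0 * d

where v is the recession velocity, d is the distance, and H0 is the Hubble constant, on the order of 70 km/s per megaparsec.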

Based on these observations, scientists theorize that the universe has been expanding. Extrapolating back in time, the theory projects that in the distant past (13.7 billion years ago +/- 200 million years), the universe must have been very compact and incredibly hot and dense. This is the Big Bang theory.

They also conclude that the expansion of the universe has been accelerating.

I don’t understand how they can arrive at that conclusion.

If at greater distances we observe greater red shift, then farther back in time we observe a higher velocity of expansion. In other words, as time moves forward, the velocity of expansion decreases. Wouldn’t elementary physics tell us that the expansion of the universe is actually decelerating, NOT accelerating?