Tuesday, February 10, 2009

The danger of the positive approach

A few weeks ago, Luboš Motl won the 2008 Weblog Award for Best European (Non-UK) Blog, which may come as a surprise to some, proponents of dual theories in particular. Since AWT is rather invariant/symmetric with respect to the string/LQG duality, we can attempt an independent analysis of this result. On the positive side, Luboš is a compatriot of mine; we were even both born in the same city, which is not so remarkable, after all, as the Czech Republic is a really tiny country. The Czech Republic is the birthplace of many brilliant and intelligent people, and of beautiful women as well, which is partly due to its location in central Europe at the crossroads of many trade routes, along which various peoples have mixed.

On the other hand, I'm not so sure whether Luboš is a typical representative of Czech science, still less of Czech society, which is traditionally rather balanced in its opinions, if not opportunistic, owing to its sensitive geopolitical role as a small border country between the zones of interest of the Eastern and Western European blocs. Buffer countries often play the role of branes, which leads to the fragmentation of state boundaries in such areas. Aether foam becomes denser at places of density gradient due to their potential energy content, where two dual space-time branes/gradients intersect and interfere with each other.

Although the Weblog Award is a fairly representative competition, it is still based on a surprisingly limited number of votes: first place in the Best European Blog category was a matter of just some 700 votes. That isn't really very much in a world of anonymous proxies, while Google handles well over 1,000 queries per second (about 25 queries per second per server). Anyway, Motl's prize is well deserved for his frenetic activity, and it is even logical to a certain extent, because his postings are often quite entertaining and informative, and the Reference Frame is one of the few blogs I visit regularly. Because modern people are basically consumers, Motl's graphomania serves their needs well: the average visitor can always find something new on his blog every day.

If so, where's the problem?

Even if we ignore the eccentric and, to my eye, ugly design typical of Motl's sites, and the sometimes unstable behavior of their scripts (whose purpose is at times to prevent Motl's opponents from visiting and posting), we shouldn't neglect the fact that the popularity of this blog is partly based on strongly biased opinions and ad hominem attacks, followed by personally motivated censorship of the discussions, which manifests itself in the sectarian character of the people who are allowed to post there (similia similibus curantur). For these reasons, Motl is often perceived as a controversial figure in the blogosphere. Personally, I don't believe that most of the people who voted in the Weblog Award poll failed to notice the autistic and asocial character of the "humble correspondent's" blog; the problem is that the voting system didn't enable them to express that opinion. Negative votes simply don't count here.

This is a general property of contemporary voting systems, which allow only positive votes. It leads to a high degree of populism on the side of politicians, and to ignorance of, and lack of interest in, the negative aspects of politics on the side of the public. Even morally controversial politicians can succeed in such a system, provided they are sufficiently active in other areas, in the promotion of a personality cult in particular. I believe this MAY be one of the reasons why societies have problems with their own political representation: voters simply have no veto privilege, so they can be only partly responsible. In natural evolution, such an unbalanced fitness function would suffer consequences, because it violates the equilibrium of supply and demand.

As I'm not an expert in the social sciences, I don't know whether such an approach has ever been proposed or even tested in history, or what reasons have led people to consider only positive voting in anonymous elections. Maybe it would have adverse effects and lead to an undesirable level of opportunism among politicians; I don't know. But as I've seen in many cases, the most trivial ideas are often ignored for a long time precisely because of their simplicity, or because of their generally low payoff, which can manifest itself only at high civilization density. Maybe such a system could even have saved Germany, which was rather inclined toward Hitler's populism, from Nazism in the mid-1930s. If so, maybe the time for a more dualistic/symmetric voting system has just come.
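The difference between positive-only voting and voting with a veto privilege can be sketched in a few lines of code. This is a purely hypothetical illustration with made-up candidates, ballot counts, and a simple net-score rule (approvals minus vetoes); it is not a model of any real poll or of the Weblog Award itself.

```python
# Toy comparison of positive-only voting with a scheme that also counts
# vetoes. Candidates, ballot counts, and the net-score rule below are
# purely hypothetical illustrations.

def positive_only(ballots):
    """Each ballot names one candidate; only approvals are counted."""
    scores = {}
    for candidate in ballots:
        scores[candidate] = scores.get(candidate, 0) + 1
    return scores

def with_veto(ballots):
    """Each ballot is (candidate, +1) for approval or (candidate, -1) for a veto."""
    scores = {}
    for candidate, sign in ballots:
        scores[candidate] = scores.get(candidate, 0) + sign
    return scores

def winner(scores):
    return max(scores, key=scores.get)

# A divisive candidate "A": more fans than "B", but also many opponents.
approvals = ["A"] * 700 + ["B"] * 500
vetoes = [("A", +1)] * 700 + [("A", -1)] * 400 + [("B", +1)] * 500

print(winner(positive_only(approvals)))  # A (700 vs. 500 approvals)
print(winner(with_veto(vetoes)))         # B (A nets 300, B nets 500)
```

The point of the sketch is that the same electorate produces a different winner once negative votes are allowed to count: a divisive candidate can win any positive-only contest while losing the net-score one.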

Thursday, February 5, 2009

AWT and evolution of life

By AWT, life is a highly organized form of the existence of matter, whose properties and abilities are determined by an extremely high degree of nested condensation from the space-time perspective. Life formation therefore always occurs near a phase interface, where the highest density of space-time gradients can arise through the mutual interference of the energy waves constituting both phases. The highest concentration of gradients promotes the evolution of maximal complexity, so we can expect life to form exactly at the middle of the dimensional scale of the Universe: on the high-dimensional fractal coasts of lakes, on islands of ancient oceans, covering the surfaces of planets inside galaxies forming the fractal surface of black holes, wherever the solid, liquid and gaseous phases can meet together.

Because life is a space-time artifact, not just a spatial one, a high density of temporal events, i.e. mutations, is required to enable the gradual evolution of complexity. This requires an environment capable of periodic changes and enabling the dissipation of energy at each step. The periodic and tidal waves of the ancient oceans could provide such dissipation, because they are paced slowly enough to allow natural selection. Earth's rotation and the inclination of its rotational axis, together with the presence of a sufficiently massive Sun and Moon, provide another level of periodicity through tidal forces, and thus increase the randomness of the evolutionary process.

By AWT, the evolution of life follows Oparin's old coacervate theory. Coacervates are tiny oily droplets that precipitate spontaneously from saturated solutions of various organic compounds, racemic mixtures of amino acids and sugars in particular. Under high concentration and some shaking, so-called reverse micelles or even double-layered liposomes can form. Such liposomes can behave like the walking droplets described recently:

We can imagine that such droplets precipitated from the waves of ancient lakes at places where organic compounds had been pre-concentrated by wind and solar radiation, and that they were thrown onto the coastal surface, which was covered by various surfactants. The droplets were attracted to these, so they started to climb around the coast, collecting the materials into their cells. In this way the most successful droplets became so large that they fragmented into smaller ones under the impact of the next breaker wave, and the whole process repeated many times. Blastulation can be considered a present-day rudiment of this process.

During this, the less successful ("low fitness") droplets gradually disappeared in favor of the better ones, which had collected the proper surfactants into their liposome bodies. Later, competition led to a preference for droplets that were able not only to collect surfactants, but even to collect the chemicals able to synthesize them inside their cells. These droplets then became able to digest food, so they became hunters of the less successful droplets, not just passive collectors of matter from outside. Of course, such competition accelerated evolution greatly.
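The selection process described above, in which each breaker wave removes low-fitness droplets and lets the successful ones fragment into offspring, can be sketched as a toy simulation. All numbers here (population size, fitness range, mutation strength) are illustrative assumptions, not measurements of any real coacervate system.

```python
import random

# Toy sketch of the droplet selection described above: droplets whose
# "fitness" (ability to collect surfactants) is higher are more likely
# to survive a breaker wave and to fragment into offspring. All the
# parameters below are illustrative assumptions.

random.seed(1)

def wave_step(population):
    """One breaker wave: fitness-proportional survival plus fragmentation."""
    survivors = random.choices(population, weights=population, k=len(population))
    # Fragmentation copies the parent, with a small random mutation.
    return [max(0.01, f + random.gauss(0.0, 0.05)) for f in survivors]

population = [random.uniform(0.1, 0.5) for _ in range(200)]
initial_mean = sum(population) / len(population)

for _ in range(100):
    population = wave_step(population)

final_mean = sum(population) / len(population)
print(initial_mean, final_mean)  # mean fitness drifts upward over the waves
```

Even this crude model reproduces the qualitative claim of the paragraph: under repeated fitness-proportional selection, the average "fitness" of the droplet population rises without any genetic machinery being assumed.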

And this saga continues even now...

Note that in this early stage of the evolution of life, inheritance was provided by a completely physical mechanism: simply by the division of cells together with their interior and surface membranes. By AWT, the evolution of life exactly follows the evolution of inorganic matter at more nested dimensional scales, i.e. no ribonucleic acids, chromosomes or other contemporary subtleties were required here. We can assume this mechanism could be reproduced in vitro under the proper conditions without problems. Recently, living examples of walking droplets were found: single-celled giant amoebas of very ancient origin.

From AWT it follows that such amoebas were the first unicellular organisms, in the same way that sponges with their foamy structure can be considered the first multicellular animals. After all, the tissue of higher organisms is a rudiment of foam with flat surfaces as well. Smaller structures (those below the human scale of about 1.7 cm) tend to have concave structures (organelles), while larger ones tend to become convex (trees, fungi), because they are held together by surface tension forces. The first organisms were therefore relatively large from their very beginning, because the electromagnetic interaction itself doesn't provide the necessary level of complexity and inheritance at the molecular level.

Concerning the creationist approach to the formation of life, the "intelligent constructor" idea is dual to the Aether concept and can easily be replaced by it. From a remote space-time perspective, every gradualistic evolution becomes a discontinuous, stepwise artifact, in the same way as the event horizon of a black hole when observed from a large distance. Every logical explanation concentrates non-causal assumptions in its background, so it becomes a sort of religion. Belief doesn't differ too much from adherence to causal logic, because both approaches tend toward tautology through the gradual elimination of postulates.

Deism can be understood as a religious approach to the Occam's razor criterion, whereas AWT is driven by causal logic. For a deeper understanding of the God concept, we should understand the creator paradigm better. Currently it seems it may just be our civilization which created the black hole in which we are living now. Maybe the moment of the final understanding of God will be the very end of civilization at the same moment; maybe quantum uncertainty will protect us from such a destiny. Should we kill people like Zephir well before they can bring us the apple of the ultimate understanding of reality? Or can just such people save us from the destiny of a quantum suicide experiment?

AWT, Genesis and the Cambrian explosion

By AWT, elementary particles are small living creatures which follow the energy density gradients (food) of their environment. Bosons are males, whereas fermions are females. They have genetic information encoded in the helical structure of the density gradients inside their bodies, like other living organisms; they consist of foamy tissue composed of bilayers with different surface tension and superhydrophobic behavior; and they are tactile and sensitive to heat and mechanical stimulation, like other animals.

In general, she-fermions are the more communicative particles, usually rather attractive, having mass (some can become quite corpulent). They generally love company, and most of all they prefer to exchange food and energy with bosons.

Bosons, by contrast, are mobile, unstable and volatile particles. They usually bounce from one she-fermion to another at high speed. Whenever a boson obtains sufficient energy (fitness), it succeeds in mating and is allowed to exchange its information with a fermion. After such collisions, new small particles can emerge, which carry the structural and property signatures of both parents at the same time.

From this point of view, it seems atomic nuclei or black holes are nested, closely packed globular colonies of these creatures, similar to the "globe animalcule" (Volvox globator) chlorophytes. This alga can serve as a brane model of strings, being formed by a 2D foam.

By Genesis, the formation of life occurred in six steps, non-uniformly distributed on the space-time scale but equidistantly separated on the entropy density scale ("days"). The first stage was the formation of space and time ("the heavens and the earth") inside a graviton condensate ("darkness over the deep and God's breath (Aether) hovering over the waters" (waves?)). Gravitons are ambivalent particles, serving both as bosons and as fermions due to supersymmetry.

During the Big Bang event ("let there be light"), a phase transition of space-time occurred, followed by the separation of the first generation of bosons, i.e. photons ("God separated the light from the darkness"), in the process of so-called inflation, which resulted in the condensation of the black hole dome forming the observable generation of the Universe ("let there be a dome in the midst of the waters"), i.e. the vacuum in particular ("God called the dome Sky").

By AWT, the Cambrian explosion was the result of an analogous phase transition: a condensation of genes following from fast cooling. Around 530 million years ago the Earth passed through the so-called "Snowball Earth" episode, i.e. the Cryogenian period of strong cooling, in the same way as the Universe did during inflation. During this period the existing oceans were covered by a thick layer of ice. This abrupt change of climate was followed by a massive extinction, during which the remaining organisms were forced to increase the speed of their evolution and to exchange genes even in diaspora. The diaspora led to the evolution of sexual reproduction, which is an effective (and quite pleasant) method of increasing the speed of gene mixing.

The speed of evolution and mutation must always remain balanced with the conditions of life. Prokaryotes still rely on horizontal gene transfer, simply because they can divide fast. Sexual reproduction is too mutagenic and energetically expensive for tiny organisms with a fast-paced life cycle (protozoa), so they use it only under unfavorable conditions.

Large organisms can reproduce sexually, but sometimes tend toward parthenogenesis under good living conditions: sharks, for example, live in very stable conditions, so they don't evolve fast and don't require mutations, which is why they're cancer resistant and the hammerhead shark can reproduce asexually. Endometriosis and/or male-associated infertility can be understood as an attempt at an evolutionary adaptation of the human organism to wealthy living conditions, under which sexual reproduction leads to unnecessarily high mutagenicity. Good social conditions lead to a unisex lifestyle, and the male population will decline gradually, in analogy to a mixture of particles undergoing the gradual evaporation of smaller particles in favor of larger ones with lower social tension.

Sunday, February 1, 2009

Experimental constraints on the scale of spatial structure

What sort of experimental constraints are there on the scale of spatial structure like that?

By AWT no such constraint exists, but there are more or less significant conceptual limits. For example, the rest mass of the photon appears to be zero, but when the wavelength of the photon becomes comparable to the size of the observable Universe, the photon can no longer move inside it. At that point, the energy of the photon becomes equivalent to its rest mass. Such a mass appears low (~10^-69 kg), but there is an even stronger limit. When the wavelength of a photon becomes comparable to the wavelength of the CMB, the photon dissolves in the CMB noise, which corresponds to the graviton noise of the previous generation of the Universe. In this way the minimal rest mass of the photon is quite large, but it's limited by the observability of the photon, not by the existence of the photon as such.
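The order of magnitude of this conceptual limit can be checked with a back-of-the-envelope calculation: equate the energy of a photon whose wavelength is the radius of the observable Universe, E = hc/λ, with a rest-mass energy E = mc², giving m = h/(λc). The radius used below is an assumed round value, and the whole exercise is only a dimensional estimate, not a measured bound.

```python
# Back-of-the-envelope estimate of the minimal observable photon rest
# mass: a photon whose wavelength equals the size of the observable
# Universe. The radius is an assumed round value.

h = 6.626e-34        # Planck constant, J*s
c = 2.998e8          # speed of light, m/s
R_universe = 4.4e26  # comoving radius of the observable Universe, m (approx.)

# E = h*c/lambda equated with E = m*c^2 gives m = h/(lambda*c)
m_min = h / (R_universe * c)
print(f"{m_min:.1e} kg")  # on the order of 1e-69 kg
```

The result lands around 10^-69 kg, which is the usual order of magnitude quoted for photon-mass limits derived from the Hubble scale.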

If some civilization could look inside a large black hole, it would probably see the same things we can observe around us, but it would probably dissolve if it could visit us. Such a civilization could, however, construct a giant microscope as large as the whole black hole and use focused gravitational waves to observe the life inside. It could then see more than a single generation of the Universe, thanks to the tunnelling of information through the event horizon. The observational limit is therefore just a matter of observational scope, in my opinion.

Lorentz symmetry and String theory

This post is a polemic against Motl's somewhat nervous defense of Lorentz symmetry (LS), which is quoted in italics. I hope it may be interesting for someone. By AWT, the confrontation of ideas in dialectic discussion is the driving tensor of new ideas: full agreement cannot serve as both the subject and the object of further thinking and extrapolation.

Moshe Rozali wrote a very sane text about the importance of LS for the search for the fundamental laws of Nature: The Universe is probably not a quantum computer. I agree with every word he wrote. He says that many people who are following the physics blogosphere want to believe that their area of expertise is actually sufficient to find a theory of everything.

.. in the same way as string theorists and many others. By AWT, any theory of your personal preference can become a TOE if you make it infinitely implicit, i.e. if you compose it from as small a number of postulates as possible. Complex theories mixed from a high number of postulates, like string theory, would of course be strongly handicapped in such an approach.

So Seth Lloyd of the quantum computing fame wants to believe that the world is a quantum computer. Robert Laughlin wants to imagine that quantum gravity is an example of the fractional quantum Hall effect. Other people have their own areas of expertise, too. Peter Woit wants to believe that a theory of everything can be found by mudslinging and defamations while Lee Smolin wants to believe that the same theory can be found by selling caricatures of octopi to the media (following some subtle and not so subtle defamations, too).

..and string theorists believe in vibrating strings. So what? Live and let live. The world of coexisting theories illustrates the space-time world, being a low energy density projection of it into causal space.

Moshe Rozali correctly tells them that if they are going to ignore the Lorentz symmetry, a basic rule underlying special relativity, they are almost guaranteed to fail. Lorentz symmetry is experimentally established and even if it didn't hold quite accurately, it holds so precisely that a good theory must surely explain why it seems to work so extremely well in the real world.

Lorentz symmetry is heavily violated by quantum mechanics; to be more specific, quantum mechanics is simply based on the dual approach. By AWT, even gravitational lensing is rather a quantum mechanical phenomenon than a relativistic one. To defend Lorentz symmetry, you're simply required to fight against quantum mechanics, and vice versa.

It still doesn't mean the Universe computes something for somebody.

Moreover, the state-of-the-art theories of the world are so constrained - i.e. so predictive - exactly because they are required to satisfy the Lorentz symmetry.

Quantum mechanics is based on zero or infinitely many radiative time arrows. It's invariant to LS (and to the other postulates of relativity based on radiative time arrow causality), while still remaining predictive. Aether theory is invariant to both, while still remaining predictive. In fact, precisely because LS and quantum mechanics are apparently mutually inconsistent, there's a question of why not to start once again from the very beginning.

Because of this symmetry, quantum field theories only admit a few marginal or relevant deformations. If you assume that they make sense up to extremely high energy scales, you may accurately predict all of their low-energy physics as long as you know a few important parameters. Such a "complete knowledge" of physics in terms of a few parameters would be impossible in non-relativistic theories.

The same is true for relativistic theories. The emergence concept is still required to connect these two branches of physics seamlessly.

String theory is even more constrained than quantum field theory: it has no adjustable dimensionless non-dynamical parameters whatsoever. In some sense, you may view string theory as a tool to generate privileged quantum field theories with some massless spectrum and infinitely many very special, selected massive fields with completely calculable interactions. So all the Lorentz constraints that apply to quantum field theory can do the analogous job in string theory, too.

String theory is like every other quantum field theory on this point. It's true that most of the formalism was developed under the cover of string theory, because string theory has good marketing, the best experts and some nice faces in front of it. But these approaches can be used in many other theories, and the best string theorists, like Ed Witten, are doing so without any frustration.

However, in string theory, the character of LS is even more direct. The very short distance physics of string theory is pretty much guaranteed to respect the LS. Whenever you look at regions that are much smaller than all the curvature radii of a D+1-dimensional spacetime manifold, the dynamics of a closed string reduces to a collection of D+1 free scalars on the worldsheet which manifestly preserves the Lorentz symmetry. And one can show that the interactions respect it, too.

String theory is based on the combination of quantum mechanics and special relativity. From this point of view it's apparently less general than any theory based on the combination of quantum mechanics and general relativity, like LQG. It's just one of the evolutionary steps of physics, no less, no more. It opened many research perspectives, while quantum gravity has opened others.

Open strings may violate the LS spontaneously, for a nonzero B-field or a magnetic field on the brane, and one can enumerate a couple of related ways to spontaneously break the Lorentz symmetry with the presence of branes and their worldvolume fields. But none of these pictures ever hides the fact that the fundamental theory behind all these possibilities is Lorentz-invariant.

This is just one of many possible perspectives. Others can see an infinitely fractal Universe based upon quantum mechanical units, or even particle units. But fractal geometrodynamics, as expressed by doubly special relativity based on the Poincaré, Cartan and de Sitter groups, is still in the game as well.

There's a lot of confusion in the public about the fate of the LS in general relativity. Be sure that the LS is incorporated into the very heart of general relativity. General relativity generalizes special relativity; it doesn't deny it. General relativity can be defined as any collection of physical laws that respect the rules of special relativity (including Lorentz invariance) in small enough regions of spacetime - regions that can, however, be connected into a curved manifold. All breaking of LS in general relativity can always be viewed as a spontaneous breaking by long-distance effects and configurations.

Every generalization is predestined to violate its roots sooner or later. My personal understanding is that general relativity has nothing to do with LS at all, being much more general than many relativists (especially the special ones) may be willing to admit. Anyway, general relativity has nothing to do with string theory, which doesn't use the postulates of general relativity at all. That belongs to the realm of quantum gravity.

In fact, even in spacetimes with a lot of curved regions - such as spacetimes with many neutron stars or even black holes - one can use the tools of special relativity in many contexts: either in very small regions that are much smaller than all the curvature radii, or in regions that are much larger than stars and black holes. In the latter description, the stars and black holes may be viewed as local point masses or tiny disturbances that follow the laws of relativistic mechanics at much longer distances, anyway.

That's perfectly right. And large systems of such particles follow quantum or Newtonian mechanics at yet other distances, and so on.

So if someone completely neglects Lorentz invariance, the player that became so essential in 1905, he shouldn't be surprised if theoretical physicists simply ignore him or her. It is not necessary for a theory to be Lorentz-invariant from the very beginning. But a theory only starts to be interesting as a realistic theory of our world after one proves that Lorentz invariance holds exactly (or almost exactly).

It was precisely Einstein who, in 1917, completely omitted Lorentz invariance from his further thinking. Just because string theory has chosen Lorentz invariance as one of its postulates doesn't mean this approach is the only universal approach to physics. Even Einstein recognized it, so why not some string theorists?

I am personally convinced that theories that try to break Lorentz invariance by small effects are not well-motivated. But even if I insist on the things that have been established only, the "at least almost accurate" Lorentz symmetry that has been demonstrated is an extremely powerful constraint on any theory. If you invent a random theory for which no reason why it should be Lorentz-invariant is known, it is extremely likely that the LS doesn't work at all and the theory is therefore ruled out.

The small breaking of Lorentz invariance can be observed as quantum chaos. It's not a consequence of violating LS, but rather of applying it along many concurrent time arrows. Because every particle is itself Lorentz invariant, the mutual interaction of many particles brings a causal uncertainty into the global view. A theory based on small effects is Kostelecký's theory, for example.

There are actually approaches to string theory that are not manifestly Lorentz-invariant. For example, the BFSS matrix model, or M(atrix) theory, is a 0+1-dimensional quantum field theory - a U(N) gauge theory with 16 supercharges. You can also say that it is a quantum mechanical model with many degrees of freedom organized into large Hermitean matrices. It resembles non-relativistic quantum mechanics, with some extra indices and a quartic potential.

Every theory should be defined by its tensor of postulates, and string theory is no exception. No theory based on Lorentz symmetry can derive the violation of this symmetry in a rigorous way.

There is no a priori reason to think that such a seemingly non-relativistic theory - whose symmetry actually includes the Galilean symmetry known from non-relativistic physics - should be Lorentz-invariant. Except that one can defend and "effectively prove" this relativistic symmetry by arguments based on string dualities. Although it can't be completely obvious from the very beginning, the original BFSS matrix model describes a relativistic 11-dimensional spacetime of M-theory. But the relevance of the matrix model for M-theory only began to be studied seriously when arguments were found that these two theories were actually equivalent. You simply can't expect your non-relativistic model to be equally interesting for physicists if you don't have any evidence that your model respects Lorentz invariance - or if it even seems very likely that it cannot respect it. Physicists would be foolish to treat your theory on par with QED or the BFSS matrix model because it seems excessively likely that your theory can't agree with some of the basic properties of the spacetime we know.

This is not true. In AWT, the LS is provided by the fact that no object can serve both as the subject and as the means of observation at the same place and time (a singular case of observation, based on a causal tensor of zero degree). The Aether concept therefore cannot violate Lorentz symmetry locally, by its very definition.

Emergence and the role of Lorentz symmetry in the grand scheme of things.

That's right, but emergence has no relevant explanation in physics without the Aether concept, not string theory. And they're both theorems of AWT. The Aether concept neither uses nor requires any other ad hoc concepts. While emergence is required for the explanation of both relativity and quantum mechanics, I believe we can safely avoid LS in the future in the same way that Prof. Einstein did.

The comments above should be completely uncontroversial. But let me add a few more speculations. Because space is emergent in string theory, the LS - a symmetry linking space and time - has to be emergent, too. This symmetry of special relativity is telling us that things can't move faster than light in the newly emergent geometry. What is this constraint good for? Is Nature trying to tell us something deeper than that?

The claim "space is emergent in string theory" simply means that space is composed of many tiny strings. If you cannot see this, then you simply don't know what emergence is based on. Nature is just trying to tell us that it doesn't matter which concept you use in large quantities: it always loses its conceptual subtleties and becomes a pin-point singularity, i.e. a "particle", from a sufficiently distant space-time perspective. This is what the Aether approach is based on: the particle abstraction. The symmetry you're disputing just illustrates that the LS has its principal limits in anti-de Sitter space. From the perspective of an observer sitting inside a dense fluctuation of Aether, energy will spread outside the black hole at superluminal speed without problem.

Well, I am confident that special relativity is important for life as we know it because motion is very helpful for animals and the equivalence of all inertial frames is the simplest (and maybe the only plausible) method for Nature to guarantee that the very motion won't kill the animals. Imagine that you would feel any motion - you would probably vomit all the time and die almost instantly. ;-)

Stop trolling. Special relativity is important for the life of (special) relativists and some fundamentalist string theorists only. Some people can become quite naturalistic when defending their pet theories... ;-)

The Lorentz symmetry and the Galilean symmetry were the two most obvious realizations of the equivalence of all inertial frames that Nature could choose from, and She chose the LS because it treats space and time more democratically than the Galilean symmetry. (I could probably construct more robust anthropic arguments even though they would probably not be based on the motion of animals only - simply because the low value of "v/c" for animals indicates that the finiteness of "c" is not necessary for life itself.)

Nature didn't choose the LS; the Prussian academy under Planck's leadership chose it as its paradigm, to avoid the influence of Poincaré's Sorbonne. That is a difference... ;-)

But in the previous two paragraphs, we were talking about the 3+1 large dimensions of spacetime only. String theory has additional dimensions that can emerge in various ways and that are dual to each other - and the LS applies to all these dimensions as long as they become larger than the curvature (and compactification) radii. In some sense, that's quite shocking.

Emergence isn't a miracle; it has a very simple reason in AWT. Some physicists are apparently becoming cocooned creationists, because they tend to use concepts without firm reasoning for them. This is a consequence of a more or less hidden belief in reality, rather than an understanding of reality through logical implications based on analogies.

The conclusion is that LS violation isn't supposed to be weak at all. If we consider that the particles of matter are all formed by the same vacuum as the rest of cosmic space, then LS violation is responsible for the refractive index of black holes, of elementary particles, of everything. If LS were completely universal, we would see nothing of the Universe, simply because there would be nothing to deflect the path of light.

We can describe this misunderstanding with the proverb "The darkest place is under the candlestick." Many scientists spend money and their lives on an obstinate search for LS violation, whereas they're virtually sitting on it all the time. This just illustrates why it is so important to understand a subject at the nonformal, conceptual level. It could save money for all of us.

From this perspective, every quantum mechanical phenomenon is just a manifestation of nearly singular Lorentz symmetry violation - not to mention weaker effects like the CMB, gravitational lensing, photon-photon interactions and pair formation, the GZK limit, dark matter... Virtually whenever we can observe anything at all, LS is violated there. We can see just this portion of curved space-time, because the places where LS remains well valid are, by definition, transparent to us.

The same, just dual, problem exists with the quest for hidden dimensions. Because scientists refuse the Aether concept, we are forced to pay them for the development of alternative models and for proposals of experiments which could confirm the presence of hidden dimensions - although every case of quantum chaos or complex long-distance interaction demonstrates them clearly. Such ignorance may appear funny, but it's an ineffective and expensive game for the rest of society, because these scientists could get involved in more useful things.

To be sarcastic about string theory, I'd say it tries to use LS to describe exactly that part of the Universe which violates it most pronouncedly. But this paradox is logical, because we can never use the same aspect of reality both as the object of observation/description and as the means of observation/description. We can see that the same logic which introduces the Aether can be applied even to Lorentz symmetry, at another level of reasoning. Theoretical description is dual to experimental observation in this sense. Reality is partly real, partly the consequence of theories, and observable reality forms the boundary between both approaches.

Anyway, quantum gravity suffers from the same conceptual problem, being dependent on the equivalence principle instead of LS. It just means it becomes wrong/singular in a different part of conformal space-time: it can describe the LS violation of free space, assuming a "stringy structure" for it, while it misses the complex multidimensional structure of particles.

Whereas string theory depends on LS, it cannot predict LS violation phenomena in a rigorous way, because it doesn't care about vacuum structure (with the exception of string field theory and some other boundary approaches). But it can describe the complex structure of particles as such quite well. These two nice theories are in fact AdS/CFT dual, being separated by one derivative of the Aether gradient in their descriptions (they're mutually orthogonal to each other via the Lorentz symmetry group).

AWT, emergence and particle - unparticle duality

This post is a reaction to two recent articles (1, 2) from the HEP arXiv section (via KFC's blog), which clearly illustrate the conceptual problems of the formal approach of mainstream physics.

The quoted articles lack definitions of the emergence, duality, particle and unparticle concepts too badly to be able to reliably claim things like "particles are dual to unparticles" or "quantum mechanics is of emergent nature". In my opinion it's impossible to propose a relevant formal description of concepts without a robust definition of them at the semantic level. Without it, no formal derivation can be interpreted and used by other theories. By AWT, the Universe is formed by an infinitely dense environment, the observable part of which corresponds to a system of nested density fluctuations inside a dense gas (a condensing supercritical fluid in particular).

By AWT, our Universe could appear like a fractal cloud similar to Perlin noise, and then every particle or artifact inside our Universe becomes a sort of unparticle when observed from the perspective of another one. This perspective introduces a sort of causality into the chaotic view of our Universe, because only causal gradients (the "particles") are what we can observe within this chaos.
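
As a playful illustration of the "fractal cloud" picture above, one can sum a few octaves of smoothed random noise - a common, simple stand-in for true Perlin noise. The function name and parameters below are purely illustrative, not part of any AWT formalism:

```python
import random

def value_noise_1d(n_points, octaves=4, seed=0):
    """Sum several octaves of smoothed random 'value noise' -- a simple
    stand-in for Perlin-style fractal noise, purely for illustration."""
    rng = random.Random(seed)
    out = [0.0] * n_points
    amplitude, frequency = 1.0, 1
    for _ in range(octaves):
        # random lattice values for this octave
        lattice = [rng.uniform(-1, 1) for _ in range(frequency + 1)]
        for i in range(n_points):
            t = i / (n_points - 1) * frequency
            j = min(int(t), frequency - 1)
            f = t - j
            f = f * f * (3 - 2 * f)          # smoothstep interpolation
            out[i] += amplitude * ((1 - f) * lattice[j] + f * lattice[j + 1])
        amplitude /= 2                        # each octave is finer and weaker
        frequency *= 2
    return out

noise = value_noise_1d(256)
```

Each octave doubles the spatial frequency while halving the amplitude, which is what gives such noise its self-similar, "fractal" look at every zoom level.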

Double relativity (DR) is based on a dynamical relationship between two systems of reference: when one system of reference has been immobilized, it temporarily becomes an absolute point of reference. At least two versions of DR have been proposed so far, based on the de Sitter and Poincaré invariant space-time groups, respectively. The Poincaré space-time group appears slightly less general, being based on the Lorentz symmetry of special relativity, while the de Sitter space-time group relies on the equivalence principle of general relativity.

Therefore I still don't see any evidence for particle-unparticle duality here: by DR, particles are always a subset of unparticles, not a dual representation of them. And if it appears so from the perspective of DR, then DR is demonstrating its limits at this point - probably due to the fact that it's a formal theory, while particle-unparticle duality is relevant for an infinitely dense particle field, i.e. the singular case of every formal theory. By AWT, only an infinitely implicit ("fractal") theory can become an equivalent of the infinitely dense Aether and/or the abstract unparticle model. This leads naturally to the requirement of triple, quadruple, etc. relativity. Only within the exsintric perspective of such an "infinite relativity" can the particle and unparticle models become completely dual.

The second article suffers from a similar causal problem, because by its title, quantum mechanics can be virtually anything until we define what the "Emergent Phenomenon" really is. By AWT, every "deeper level dynamics" is nothing else than the particle dynamics of many other, deeper levels of particle fluctuations, i.e. unparticle dynamics. Therefore unparticle geometrodynamics appears to be the best way to formalize the emergence concept. But because it hasn't been formalized yet, we cannot use it for the derivation of any testable conclusions, much less predictions. Without predictions, every article about "emergent physics" becomes just metaphysics based on formal math, up to the level where we can talk about a duality of rigor and postmodern philosophy. This is because the predictability of both formal and nonformal hypotheses vanishes mutually and drops to zero with increasing scope. The nonformal approach of philosophy becomes quite a powerful tool there, because both philosophy and formal math are based on predicate logic.

Mainstream science very often uses the positivist approach for pragmatic reasons, in the same way that medicine men of ancient eras used their tools to keep their significance in the eyes of the rest of society. It handles phenomena in a formal way, through various regressions of reality, without worrying whether they're valid at the level of robust logic - i.e. whether they're not an apparent nonsense, to say it less diplomatically. Such an approach is analogous to the epicycle solution of the conceptual problems of the geocentric model, and it corresponds to solving homework assignments without first understanding the problem at the abstract level. After all, the contemporary educational system purportedly trains new scientists in the formal way of describing reality, not in understanding it. This positivist approach may be a consequence of the fact that scientists are paid for filling publications with equations - not for explaining the subject - so they have simply adapted to this situation.

From the AWT perspective, the unparticle concept is still ad hoc, as we can see it in the nested field of density fluctuations of a Boltzmann gas. We can paraphrase here the proverb "The optimist sees the doughnut; the pessimist the hole":

Where a physicist sees a particle, a mathematician can see only pure geometry.

But can a "pure geometry" interact with, observe or describe a pure geometry? I really don't think so - or else the Universe is one big cheap illusion and we're observing nothing. There's still some fifth element hidden behind the particle concept. In my opinion it's a consequence of the seemingly trivial fact that we are only part of the Universe. Why?

If we could reveal a general explicit rule in the sequence of prime numbers, or in the Fibonacci spirals inside a growing pile of particles, we could postulate a very general emergence group, which would become nonlocal and very universal in this way. But I don't think such a group exists at all. If we were formed by pure geometry, then we would have to admit that pure geometry can observe and interact with itself. Such an identity would violate Gödel's theorems, the Aether concept - virtually everything we know about reality so far.
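
A side note on the Fibonacci remark above: unlike the primes, the Fibonacci sequence does obey a simple regularity - successive ratios F(k+1)/F(k) converge to the golden ratio, which is why Fibonacci spirals look so uniform. A minimal sketch:

```python
def fib_ratios(n):
    """Return successive Fibonacci ratios F(k+1)/F(k), which converge
    to the golden ratio phi = (1 + sqrt(5)) / 2."""
    a, b = 1, 1
    ratios = []
    for _ in range(n):
        a, b = b, a + b
        ratios.append(b / a)
    return ratios

phi = (1 + 5 ** 0.5) / 2      # golden ratio, ~1.6180339887
ratios = fib_ratios(20)       # converges to phi within ~1e-8 after 20 terms
```

No comparably simple explicit generator is known for the primes, which is the asymmetry the paragraph gestures at.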

Therefore the question is: why is the Universe always larger than our observable scope? I feel the limited speed of information spreading could answer this question, at least partially.

Thursday, January 29, 2009

AWT and definition of intelligence

By AWT, a correct - i.e. physically relevant - definition of intelligence is rather important, as it can give us a clue about the direction of the psychological arrow of time.

From a certain perspective, every free particle appears to be quite an intelligent "creature", because it can unmistakably find the path of the optimal potential gradient, even inside a highly dimensional field where the interactions of many particles mutually overlap. Whereas a single particle is rather "silly" and can follow just a narrow density gradient, complex multidimensional fluctuations of Aether can follow complex gradients and can even, to a certain extent, avoid a wrong path or obstacles. They're "farseeing" and "intelligent". Note that the traveling of a particle along a density gradient leads to its gradual dissolution and "death". The same forces that keep the particle in motion will lead to its gradual disintegration.

The ability of people to make correct decisions in such a fuzzy environment is usually connected with social intelligence. We can say the motion of a particle is fully driven by its "intuition". Particles can react fast in many time dimensions symmetrically (congruently), whereas their ability to interact with the future (i.e. the ability to make predictions) still remains very low, corresponding to the low (but nonzero) memory capacity of a single gradient particle. Nested clusters of many particles are the more clever, the more hidden dimensions they are formed by. The electrochemical waves of neural system activity should form a highly nested system of energy density fluctuations.

Nevertheless, if we consider intelligence as "an ability to obtain new abilities", then the learning ability and memory capacity of single-level density fluctuations still remain very low. Every particle has a surface gradient from the perspective of a single level of particle fluctuations, so it has a memory (compacted space-time dimensions) as well. Therefore, for a single object, we can postulate the number of nested dimensions inside the object as a general criterion of intelligence. The highly compactified character of the neuron network enables people to handle a deep level of mutual implications, i.e. manifolds of causal space defined by implication tensors of high order. Such a definition remains symmetrical, i.e. invariant with respect to both intuitive behaviour driven by parallel logic and conscious behaviour driven by sequential logic.

Every highly condensed system becomes chaotic, because the intelligent activities of individual particles are temporary and compensate for one another. In this way, the behavior of human civilization doesn't differ very much from the behavior of a dense gas, as we can see from the history of wars and economic crises, for instance. The ability of people to drive the evolution of their own society is still quite limited in general. We can consider such an ability a criterion of social self-awareness. The process of phase transition corresponds to the learning phase of a multi-particle system.

An interesting point is that individual members of such systems may not be aware of an incoming phase transition, because their space-time expands (the environment becomes more dense) together with these intelligent artifacts. At a certain moment the environment becomes more conscious (i.e. negentropic) than the particle system formed by it, and a phase transition will occur. The well-known superfluidity and superconductivity phenomena, followed by the formation of a boson condensate, can serve as a physical analogy of the formation of a sectarian community, separated from the needs and feedback of the rest of society. From the outside perspective, members of the community can be characterized by their high level of censorship (a total-reflection phenomenon with respect to information spreading) and by a superfluous homogeneity of the distribution of individual stances, followed by rigidity and fragility of their opinions (i.e. by the duality of odd and even derivatives in space and time).

AWT explains how even subtle forces of interest between individuals crowded around common targets gradually accumulate into the emergence of irrational behavior. Because such an environment becomes more dense, space-time dilation occurs here, and everything seems OK from the insintric perspective. As a result, nobody in the sectarian community will realize that he has just lost control over the situation.

For example, the people preparing LHC experiments cannot be accused of evil motives - they just want to do some interesting measurements at the LHC, finish their dissertations, make some money in an attractive job, raise children, learn French, and so on… Just innocent wishes all the time, am I right? But as a whole, their community has omitted serious precautionary principles in the hope that a successful end justifies the means.

Nobody in this community, for example, has taken care of the difference between charged and neutral black holes in their ability to swallow surrounding matter. As a result, no member of such a community realizes the consequences of his behavior until the very end.

And this is quite silly and unconscious behavior, indeed.

AWT and LHC safety risk

The LHC "black hole" issue, disputed (1, 2, 3) and recently reopened (1, 2, 3), is a manifestation of the previously discussed fact that every closed community undeniably becomes sectarian and separated from the needs of the rest of society, like a singularity, via the total-reflection mechanism. The ignorance of fundamental ideas (Heim theory) or discoveries (cold fusion, surface superconductivity, "antigravity") on behalf of risky and expensive LHC experiments illustrates the increasing gap between the priorities of the physics community and the interests of the rest of society.

The power of human inquisitiveness is the problem here: as we know from history, scientists as a whole never care about morality, just about technical difficulties. If they can do something, then they will do it - sooner or later, undeniably. No matter whether it's a nuclear weapon, a genetically engineered virus and/or a collider. What makes trouble at the moment is that the results of such experiments can threaten the whole civilization. We should know about this danger of human nature, and we should be prepared to suffer the consequences. Max Tegmark's "quantum suicide" experiment doesn't say how large a portion of the original system can survive its experiment.

So, what's the problem with the planned LHC experiments? To this day, no relevant analysis evaluating all possible risks and their error bars is publicly available. The existing safety analyses and reports (1, 2) are very rough and superficial, as they don't consider important risk factors and scenarios, like the formation of charged black holes or the surface tension phenomena of dense particle clusters. There's an obstinate tendency to start LHC experiments without such an analysis and to demonstrate the first successful results even without a thorough testing phase. Because the load of the accelerator was impatiently increased to over 80% of nominal capacity during the first days, a substantial portion of the cooling system crashed due to a massive spill (100 tons) of expensive helium, and the monitoring systems of the whole LHC are now undergoing an extensive upgrade and replacement to avoid avalanche propagation of the same problem through the whole accelerator tube in the future.

To this day, the public has no relevant and transparent data about the probability of supercritical black hole formation during the expected period of the LHC's lifetime, nor about the main factors which could increase the total risk above an acceptable level, in particular the risks associated with:

  1. The extreme asymmetry of head-to-head collisions, during which black holes of zero momentum/speed can be formed, giving them a lot of time to interact with Earth compared to the products of natural cosmic ray protons. This collision geometry has no counterpart in nature, as it's a product of long-term human evolution, not of natural processes.

  2. The avalanche-like character of multi-particle collisions. If some piece of matter appears in the accelerator line, then the whole content of the LHC will feed it with new matter incoming from both directions at nearly luminal speed, i.e. much faster than in the collisions of natural cosmic rays occurring in the stratosphere.

  3. The proximity of a dense environment. Compared to stratospheric collisions of cosmic rays, the metastable products of LHC collisions can be trapped by the gravitational field of Earth and interact with it in a long-term fashion. Some models consider that a black hole could move within the Earth's core for years without notice, thus changing the Earth into a time bomb for future generations.

  4. The formation of charged and magnetic black holes. As we know from theory, real black holes should always exhibit a nonzero charge and magnetic field as a result of their fast surface rotation. Since the force constant of the electromagnetic force is about 10^39 times stronger than that of the gravitational interaction (and the force constant of the nuclear force is even much higher), the omission of such a possibility from the safety analysis is just an illustration of the deep incompetence of high energy physics, and it looks more like intention than mere omission. It's not so surprising, as any introduction of such a risk into the safety analysis would increase the LHC risk estimates by many orders of magnitude, making the project unfeasible in the eyes of society.

  5. The formation of dense clusters of quite common neutral particles which are stable well outside the LHC energy range (presumably neutrons). This risk is especially relevant for the ALICE experiment, consisting of head-to-head collisions of heavy atomic nuclei, during which a large number of free neutrons can be released in the form of so-called neutron fluid. The signs of tetraneutron existence apparently support this hypothesis. The neutron fluid would stabilize neutrons against decay through its strong surface tension, in a way analogous to the neutrons inside neutron stars. The risk of neutron fluid formation is connected to a possible tendency to expel protons from atomic nuclei in contact with the neutron fluid, thus changing them into droplets of further neutron fluid by the avalanche-like mechanism originally proposed for the strangelet risk of the LHC.

  6. The surface tension effects of large dense particle clusters, like the various gluonium and quarkonium states, which CAN stabilize even unstable forms of matter, like neutral mesons and other hadrons, up to the level where they can interact with ordinary matter via the mechanism described above, forming further dense particle clusters, so-called strangelets (a sort of tiny quark star, originally proposed by Ed Witten). The evidence for such states was recently confirmed for tetra- and pentaquark exotic states. By AWT, the surface tension phenomena are related to the dark matter and supersymmetry effects observed unexpectedly at Fermilab (formation of dimuon states well outside the collider pipe), as we may explain later. If this connection is confirmed, we needn't worry about strangelet formation anymore - simply because we have observed it already!
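
Item 4 above quotes a ratio of about 10^39 between electromagnetic and gravitational coupling. That figure can be checked quickly: for an electron-proton pair it comes out near 2×10^39 (for two protons the same ratio is closer to 10^36). The constants below are rounded CODATA values:

```python
# Ratio of Coulomb to gravitational attraction for an electron-proton pair.
# Since both forces fall off as 1/r^2, the ratio is independent of distance.
k_e = 8.9875517873e9      # Coulomb constant, N m^2 / C^2
G   = 6.67430e-11         # gravitational constant, N m^2 / kg^2
e   = 1.602176634e-19     # elementary charge, C
m_e = 9.1093837015e-31    # electron mass, kg
m_p = 1.67262192369e-27   # proton mass, kg

ratio = (k_e * e ** 2) / (G * m_e * m_p)
print(f"{ratio:.3e}")     # about 2.3e39
```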

Compared to black hole formation, the risks of strangelets and neutron fluid aren't connected with the collapse of Earth into a gravitational singularity, but with the release of a vast amount of energy (comparable to that of thermonuclear fusion), during which most of the matter would be vaporized and expelled into cosmic space by the pressure of a giant flash of accretion radiation.

As I have explained already, cosmic ray arguments aren't very relevant to the highly asymmetric LHC collision geometry, so there is no point in repeating them again and again. This geometry - not the energy scale - is what makes LHC collisions so unique and orthogonal to extrapolations based on highly symmetrical thermodynamics. It's a product of very rare human evolution. The whole of AWT is just about the probability of various symmetries.

So we are required to reconsider the LHC experiments in a much deeper, publicly available and peer-reviewed safety analysis. We should simply apply the scientific method even to the safety analysis of scientific experiments - no less, no more. In my opinion these objections are trivial and mostly self-evident - but no safety analysis has considered them so far, for an apparent reason: not to threaten the launch of the LHC. So now we can just ask who is responsible for this situation, and for the lack of persons responsible for a relevant safety analysis of an LHC project with a total cost of 7 billion €.

Safety is the main concern of the LHC experiments. You can be perfectly sure the LHC experiments are safe - because of many theories. After all, the main purpose of these experiments is to verify those theories.

Isn't the only purpose of the LHC, at the very end, to verify its own safety? Is that really enough for everybody?

Tuesday, January 27, 2009

AWT and Bohmian mechanics

This post is a reaction to recent comments by L. Motl (1, 2, reactions) concerning the Bohm interpretation of quantum mechanics (QM), the concept of Louis de Broglie's pilot wave in particular (the implicate/explicate order is discussed here). Bohm's holistic approach (he was a proponent of Marxist ideas) enabled him to see the general consequences of this concept much deeper than de Broglie, with his aristocratic origins, did. It's not surprising that Bohm's interpretation has a firm place in AWT interpretations of various concepts, the causal topology of implications and the famous double-slit experiment in particular. After all, we have already presented a mechanical analogy of the double-slit experiment (DSE) (videos), so it's evident that QM can be interpreted by classical wave mechanics without problem.

Single-particle interference observed for macroscopic objects

AWT considers the pilot wave an analogy of the Kelvin waves formed during an object's motion through a particle environment. The original AWT explanation of the double-slit experiment is that every fast-moving particle creates undulations of the vacuum foam around it, in the same way as a fish swimming beneath the water surface does, in analogy to the de Broglie wave.

These undulations are oriented perpendicular to the direction of particle motion, and they can interfere with both slits whenever the particle passes through one of them. The Aether foam temporarily gets denser under shaking, thus mimicking the mass/energy equivalence of relativity and the probability density function of quantum mechanics at the same moment. The constructive interference makes flabelliform (fan-shaped) paths of denser vacuum foam, which the particle wave preferentially follows, being focused by the denser environment, thus creating the interference pattern at the target.
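
For reference, the interference pattern mentioned above follows in ordinary wave optics from the two-slit (Fraunhofer) formula I(θ) = I₀ cos²(π d sin θ / λ); the slit spacing and wavelength below are arbitrary illustrative values:

```python
import math

def two_slit_intensity(theta, d, lam):
    """Far-field intensity of two ideal narrow slits (Fraunhofer limit),
    normalized to 1 at the central maximum: I = cos^2(pi * d * sin(theta) / lam)."""
    return math.cos(math.pi * d * math.sin(theta) / lam) ** 2

d, lam = 1e-6, 500e-9                                # slit spacing, wavelength (m)
central = two_slit_intensity(0.0, d, lam)            # bright central fringe
first_null = two_slit_intensity(math.asin(lam / (2 * d)), d, lam)  # first dark fringe
```

Any mechanism that reproduces this cos² envelope - whether a quantum amplitude or a classical surface wave - will produce the same fringes at the target, which is why the water-surface analogy works at all.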

By AWT, the de Broglie wave, and even the quantum wave itself, are real physical artifacts. The fact that they cannot be observed directly by using a light wave follows from Bose statistics: surface waves penetrate each other, so they cannot observe one another. But by Hardy's theorem, a weak (gravitational or photon coupling) measurement of the object's location without violating the uncertainty principle is possible. What we can observe is just the gravitational lensing effect of density gradients (as described by the probability function), induced by these waves in the vacuum foam through the thickening effect during shaking.

Another question is whether the pilot wave concept supplies a deeper insight, or even other testable predictions, than for example the time-dependent Schrödinger equation does. In my opinion it doesn't; it's rather a subset of the information contained in the classical QM formalism. This doesn't mean that in certain situations the pilot wave formalism cannot supply a useful shortcut to a formal solution (in the same way as, for example, Bohr's atom model) - whereas in other cases it can become more difficult to apply than other interpretations.

Thursday, January 15, 2009

AWT, theories and Gödel's incompleteness theorems

By AWT, scientific (i.e. causal-logic based) theories are simply density fluctuations of the scale-invariant Aether environment, like any others. Human understanding is energy density driven, and theories accelerate the speed of energy/information propagation through their environment (a human society), in the same way as asymmetric density fluctuations (gradients) accelerate the asymmetric spreading of energy in transversal waves through a particle environment.

Being physical artifacts, even seemingly abstract theories have an independent, tangible impact on observable reality. For example, the aerial view below illustrates the appearance of two neighboring countries (Austria and the former Czechoslovakia), which differ only in their theories of social arrangement, not in natural conditions. The appearance of the landscape in the country applying a socially oriented theory, which leads to less diversity, is apparently less divergent as well. This still doesn't mean the more divergent theory is necessarily better, though, because it's suited just to a richer and more divergent environment - but that's another story.

Because the scope of density fluctuations inside a nested field of density fluctuations is always limited, the scope of theories must remain limited as well. This is because every theory is based on at least a single causal/logical connection between two or more axioms/postulates/assumptions, i.e. an implication tensor defining the cardinality and compactness/consistency of the formal logic system built upon the implication. But the consistency of two different postulates can never be confirmed with certainty - otherwise we could replace them by a single one, and we would no longer have an implication between them, but a tautology. In this way, the scope of every logic is limited, because it remains based on insintrically inconsistent axioms - or we couldn't have any logic at all. In particular, at the moment a TOE defines a time arrow, it becomes tautological, because the validity of every implication depends on the time arrow vector of its antecedent and consequent. Such a conclusion leads us to the understanding that every Theory Of Everything (a TOE based on no assumptions) is necessarily tautological by its very nature, in the same way as the dual concept of God - and as such not very useful, in a causal perspective, for the rest of society.

Gödel's incompleteness theorems (GITs) show that, for any sufficiently complex mathematical system, one of the following two statements is true. Either
  1. There are true statements, expressible within the mathematical system, that cannot be proven from the axioms of that mathematical system. Or:
  2. There are false statements, expressible within the mathematical system, that can be proven from the axioms of that mathematical system.
Most mathematicians lean towards (1), because (2) would basically imply that formal math is BS (causal bifurcations related to imaginary numbers or division by zero are particularly good candidates for it). But (1) is just a limitation upon what can be proven by mathematics: there are true statements, which you can perfectly describe in mathematical terms, which cannot be proven by mathematics.
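
For reference, the standard modern statement of the first theorem can be written as follows: for any consistent, recursively axiomatizable theory T that interprets elementary arithmetic, there is a sentence G_T such that

```latex
T \nvdash G_T \quad\text{and}\quad T \nvdash \neg G_T
```

i.e. T proves neither G_T nor its negation, so T is incomplete - which is option (1) above, the one mathematicians lean towards.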

The whole of GITs is about this dilemma, but the AWT explanation appears more intuitive and general. GITs were derived for the theory of the natural number set based on the axioms of Peano arithmetic, which is supposedly the best-defined human theory (of countable units) so far. The existence of other theories is based on fuzzier logic, including the definition of the theory itself. We can still consider AWT more general than any number theory, because the (natural) number concept is based on countable units, i.e. singular zero-dimensional particles colliding mutually in an infinitely dimensional space, whereas differential calculus is based upon the concept of an observable reality driven by Aether density gradients.

Without the particle concept, the number concept is unthinkable - unless we accept that we are composed of pure numbers - which doesn't appear very probable, because number theory is a product of human evolution and as such is much younger than the Universe, not vice versa. In this way, AWT works even in the case of singular geometry and fuzzy algebras.

Donald Rumsfeld: "As we know, there are known knowns; there are things we know we know. We also know there are known unknowns; that is to say we know there are some things we do not know. But there are also unknown unknowns - the ones we don't know we don't know."

Saturday, January 3, 2009

AWT and human scale

By AWT, the Universe appears to be formed by an infinitely nested field of density fluctuations of Aether. The human brain is one such fluctuation; due to its large time scale it can interact with and observe a huge portion of space-time, both into the past of the Universe's expansion (the cosmic scale) and into its future (the Planck scale). Because of the symmetry of mutual interaction, the human scale appears exactly in the middle of the observable space-time scale. The human scale is defined by the average size of the neurons inside the human brain (the lowest entropy observable inside our Universe generation) and by the wavelength of the cosmic microwave background radiation (CMB) (about 1.7 cm), which is apparently chaotic (the highest entropy density observable). Under favorable conditions, the violation of Lorentz symmetry can be observed by the naked eye as Brownian motion at the Planck scale, or as gravitational lensing at the cosmic scale - due to CPT symmetry violation, the Planck scale appears closer to the human scale than the cosmic one.

The 1.7 cm wavelength is invariant with respect to AdS/CFT duality, because it corresponds to the wavelength at which the character of energy spreading changes from longitudinal waves to transversal ones. From the AWT perspective, the CMB corresponds to the capillary waves at a water surface, which spread along it at the lowest speed at a wavelength of 1.73 cm from the exsintric perspective, enabling interaction with as large a portion of space-time as possible and allowing the most advanced evolution of matter inside it. Classical quantum mechanics cannot handle gravity (phenomena) at all, and quantum noise blurs into CMB noise above the human scale, in the same way that relativity is limited in its predictions by CMB noise (the GZK limit, CMB Doppler anisotropy, etc.).
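
The 1.73 cm figure for the slowest capillary-gravity waves is standard hydrodynamics: the deep-water phase speed c(λ) = √(gλ/2π + 2πσ/ρλ) has its minimum at λ = 2π√(σ/ρg). A quick check with textbook values for water:

```python
import math

# Phase speed of deep-water capillary-gravity waves:
#   c(lambda) = sqrt(g*lambda/(2*pi) + 2*pi*sigma/(rho*lambda))
# Gravity dominates long waves, surface tension dominates short ones;
# the crossover gives the minimum speed near lambda ~ 1.7 cm.
g     = 9.81       # gravity, m/s^2
sigma = 0.0728     # surface tension of water at ~20 C, N/m
rho   = 998.0      # density of water, kg/m^3

lam_min = 2 * math.pi * math.sqrt(sigma / (rho * g))
c_min = math.sqrt(g * lam_min / (2 * math.pi) + 2 * math.pi * sigma / (rho * lam_min))
print(f"{lam_min * 100:.2f} cm, {c_min * 100:.1f} cm/s")   # about 1.71 cm, 23 cm/s
```

Waves both longer and shorter than this wavelength travel faster, which is the "lowest speed" property invoked above.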

From the cosmological perspective, the wavelength of the CMB (1.7 cm) corresponds to the outer diameter of the Universe, or to the wavelength of the Hawking radiation of a tiny black hole whose lifespan corresponds to the age of our Universe generation (13.7 Gyr) - so we can say the CMB is the Hawking radiation of the black hole we are living in, i.e. the red-shifted radiation of the most distant quasars. The foamy character of energy spreading enables us to see the event horizons of our Universe both from inside and from outside via the CMB radiation (i.e. the event horizons of the most distant observable quasars). Longer waves (gravitational waves) and shorter waves (gamma radiation) are of limited scope compared to the CMB, due to dispersion (the GZK limit), in analogy to capillary waves spreading at a water surface (compare the celerity curve below). The energy density of 3D space-time (roughly given by the third power of the Planck constant, i.e. 10^96 J/m³) corresponds to the mass density of the black hole which is forming it.

From AWT it follows that every Aether fluctuation with a diameter below 1.7 cm will dissolve into photons and neutrinos, while larger objects will collapse into heavier objects and evaporate in the same way. Black holes with a diameter below 1.7 cm can evaporate via Hawking radiation during the observable Universe's lifespan, while the larger ones will evaporate by accretion radiation - so we can say that such objects are the most stable objects inside the observable Universe generation, and that accretion radiation is AdS/CFT dual to Hawking's (massive objects below the 1.7 cm scale falling into an event horizon would appear like tiny quantum fluctuations from a distant perspective outside the black hole, due to the immense space-time compactification around it). The density of the largest black holes existing inside the observable Universe (with 10^9 times the mass of the Sun, and an event horizon radius of about 10^9 km) should be comparable with the human scale (1 kg per dm³).
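
The quoted figures can be cross-checked against the standard Schwarzschild radius r_s = 2GM/c². The mean density inside r_s falls as 1/M², so it comes out in the tens of kg/m³ for 10^9 solar masses and reaches water-like density (~1 kg/dm³) closer to 10^8 solar masses. A rough sketch with rounded constants:

```python
import math

# Mean density inside the event horizon of a Schwarzschild black hole:
#   r_s = 2*G*M/c^2,  rho = M / ((4/3)*pi*r_s^3)  ~  1/M^2
G = 6.674e-11        # gravitational constant, m^3 kg^-1 s^-2
c = 2.998e8          # speed of light, m/s
M_sun = 1.989e30     # solar mass, kg

M = 1e9 * M_sun                       # a 10^9 solar-mass black hole
r_s = 2 * G * M / c ** 2              # ~3e12 m, i.e. a few 10^9 km
rho = M / ((4 / 3) * math.pi * r_s ** 3)
print(f"r_s = {r_s:.2e} m, rho = {rho:.1f} kg/m^3")
```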

Curvature instability is scale invariant. During the Big Bang event, all particles were formed by supersymmetric gravitons, whose average size corresponded to the wavelength of the CMB photons. During the Universe's evolution, the larger gravitons condensed into the particles and objects of observable matter, while the smaller fluctuations evaporated into antiparticles of matter, which were dispersed by their repulsive gravity into the clouds of dark matter surrounding the objects of normal matter. The same criterion can be applied to planet and planetoid formation, or even to the predator-prey relationships of the biosphere. Only pieces larger than some 1.7 cm can serve as nuclei for accretion and subsequent gravitational growth; otherwise they become dispersed by the radiation pressure of CMB photons. The smaller pieces of matter tend instead to condense as a whole in large clusters (large meaning > 1.7 cm).

From AWT it follows that the size of photons is given by the interference of the light wave with the graviton background at the Planck length scale, which forms the quantum foam background of the Universe. From the interference condition it follows that the size of the wave packet equals its wavelength exactly at the 1.7 cm scale, which effectively means microwave photons serve both as particles and as waves - i.e. in the same way as gravitons in the previous generation of the Universe, expanded during inflation, or as graviton waves in a future generation of the Universe before its gravitational collapse. Photons of larger wavelength cannot exist, because they tend to condense spontaneously with the smaller ones into solitons of negative rest mass (axions, or the so-called tachyon condensate).

Even tiny droplets and bubbles in mixtures tend to shrink and evaporate below the 1.7 cm scale, while larger droplets and bubbles expand and fragment. The least stable droplets, of 1.7 cm diameter (liposomes), could have started the evolution of life at shallow places of the ancient oceans (i.e. inside a multiphase environment of the largest possible complexity). Repeated breakdown by surf waves enabled them to compete for the collection and/or (later) production of surfactants, which allowed them to remain as stable as possible. The whole evolutionary process lasted the whole age of the Universe, because AWT makes no conceptual difference between the evolution of inorganic matter and organic life. Therefore it's nothing very strange that the quantum nature and size of neural standing waves correspond to the size of the Universe's scope perceivable just by these waves (i.e. the quantum gravity standing wave forming the observable Universe generation). The increasing density of the Universe resulting from vacuum foam collapse corresponds to the expansion of the scope of human consciousness, capable of comprehending an increasing space-time portion of the Aether chaos complexity over time.

The anthropocentric question of whether the 1.7 cm distance scale is adjusted by evolution, or whether it just enables the best visibility of the Universe, remains a tautology in Aether theory, because from AWT it follows that every object which is the product of more or less long-term evolution has a tendency to remain adapted to its environment, and vice versa. The scope of the observable Universe always depends on the entropy density of the observer (i.e. the number of time events/mutations involved) - primitive organisms perceive their Universe as smaller, more intelligent ones as correspondingly larger.

Lord Byron: "Truth is always strange — stranger than fiction."

Friday, January 2, 2009

Motivations of Aether Wave Theory

AWT isn't based on any mysticism at all - on the contrary. AWT is based on the Boltzmann gas model, the basic system for the definition of thermodynamic energy. Furthermore, this model isn't ad hoc at all. It's based on the insight that, from a sufficiently distant perspective, every object appears like a point particle - and every complex interaction in such a system can be modeled by a system of colliding particles. For example, people are complex objects, but if we observed them from a sufficient altitude, they would appear and behave like a chaotic 2D gas composed of colliding particles. It's a natural reduction of virtually every physical system.

Despite its conceptual simplicity, this system becomes irreducibly complex with increasing particle density, because it forms fractally nested density fluctuations composed of density fluctuations. Such behavior can be both simulated by computers and modeled by dense gas condensation (the supercritical fluid in the picture at right), and the resulting complexity is limited just by computational power. This means the AWT principle enables us to model systems of arbitrary complexity just by recursive application of a trivial mechanism. If nothing else, we should consider this model because of its simplicity, and because nobody has yet proposed it for modeling observable reality.
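The basic ingredient of such a simulation - density fluctuations arising in a gas of point particles - can be sketched in a few lines. The toy below scatters non-interacting particles in a box and measures the cell-to-cell density fluctuation (for an ideal gas this relative fluctuation scales as 1/√N per cell; the nested, fractal structure AWT describes would require adding interactions, which this sketch omits):

```python
import random

random.seed(42)

N = 100_000   # particles in a unit square
GRID = 10     # divide the box into GRID x GRID cells

# Scatter point particles uniformly: a snapshot of an ideal gas.
counts = [[0] * GRID for _ in range(GRID)]
for _ in range(N):
    x, y = random.random(), random.random()
    counts[int(y * GRID)][int(x * GRID)] += 1

# Measure the relative density fluctuation between cells.
cells = [c for row in counts for c in row]
mean = sum(cells) / len(cells)
var = sum((c - mean) ** 2 for c in cells) / len(cells)
rel = var ** 0.5 / mean
print(f"mean per cell = {mean:.0f}, relative fluctuation = {rel:.3f}")
# Poisson statistics predict rel ~ 1/sqrt(mean) ~ 0.032 here.
```

Even this trivial model shows spontaneous density inhomogeneity; a full molecular-dynamics version with attractive collisions would be needed to reproduce the condensation and nesting described above.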

The main reason for the reintroduction of Aether theory back into mainstream physics is a better, more consistent and universal understanding of the fundamental connections of reality. Most of these motivations were never presented by mainstream physics, and at the same time they form theorems, i.e. testable predictions of AWT, because they can be derived from ab-initio simulation of the nested density fluctuations of a Boltzmann particle gas. The list below will be extended with new ideas occasionally.
  1. Explanation of energy spreading by light.
    The spreading of inertial energy requires an inertial environment. We cannot use the energy concept for the spreading of light waves while ignoring the mass concept, the mass-energy equivalence in particular.
  2. Explanation of the wave character of light.
    Only a system of mutually colliding particles can spread energy in waves; the vacuum should be no exception.
  3. Explanation of the finite frequency of light.
    Only a system of nonzero mass density can spread waves of finite frequency, as follows from the wave equation.
  4. Explanation of the high light energy density/frequency achievable.
    Classical models of the luminiferous Aether were based on a sparse gas model of Aether, which cannot spread waves of an energy density corresponding to gamma or cosmic radiation frequencies.
  5. Explanation of light speed invariance.
    Light speed invariance is a consequence of the Aether concept and of the fact that the light speed is the fastest observable energy spreading (if we neglect gravity waves, which are too faint to be observable), so we can use only light for the observation of reality, of the light speed/spreading in particular.
  6. Explanation of the absence of a reference frame for light spreading in vacuum.
    If we use light for the observation of light spreading in the luminiferous Aether, its motion/reference frame can never be observed locally just by using light waves, because no object can serve as a subject and as a means of observation at the same moment.
  7. Explanation/prediction of the transverse character of light waves.
    In a particle environment, only transverse waves can remain independent of the environment's reference frame, in the same way as the motion of capillary waves at a water surface.
  8. Explanation/prediction of the foamy structure of vacuum.
    Only a foam structure composed of "strings" and "(mem)branes" can spread energy in transverse waves through a bulk particle environment (string and brane theories) and/or provide the properties of an elastic fluid composed of "spin loop" vortices (LQG theory).
  9. Explanation/prediction of the two-vector character of transverse light waves.
    Only a nested foam structure can promote light spreading with two mutually perpendicular vectors of electric and magnetic intensity (1, 2). The formation of nested density fluctuations can be observed experimentally during the condensation of a supercritical fluid (1).
  10. Explanation/prediction of the uncertainty principle.
    The transverse character of surface waves is always violated in favor of underwater waves. Inside an inhomogeneous particle system, energy always spreads in both transverse and longitudinal waves, thus violating the predictability/determinism of energy spreading and introducing an indeterminism into the phenomena mediated/observed by using it.
  11. Explanation/prediction of particle/wave duality.
    Every isolated energy wave (a soliton) increases the Aether foam density temporarily, in the same way as soap foam gets denser during shaking due to spontaneous symmetry breaking. As a result, every soliton spreads like a more or less pronounced gradient/blob of Aether density and bounces from the internal walls of the surface gradient of this blob like a standing wave packet, i.e. a particle (1).
  12. Explanation/prediction of virtual particles.
    The concept of virtual particles, which appear and disappear temporarily in the vacuum, is typical behavior of the density fluctuations inside every gas or fluid, and physics knows no other way in which such behavior could be realized.
  13. Explanation/definition of the time dimension and the space-time concept.
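Points 3 and 4 lean on the textbook fact that a wave speed in a mechanical medium is fixed by the ratio of a restoring force to an inertial density, e.g. c = √(T/μ) for a stretched string: a medium with zero inertial density supports no finite-speed wave at all, while a denser medium carries slower waves for the same tension. An illustrative sketch of that relation (the numbers are arbitrary examples, not values from the text):

```python
import math

def string_wave_speed(tension, linear_density):
    """Wave speed on an ideal stretched string: c = sqrt(T / mu)."""
    if linear_density <= 0:
        raise ValueError("a massless medium defines no finite wave speed")
    return math.sqrt(tension / linear_density)

# Same tension, increasingly dense strings -> slower waves.
for mu in (0.001, 0.01, 0.1):  # linear density, kg/m
    c = string_wave_speed(100.0, mu)
    print(f"mu = {mu:>5} kg/m -> c = {c:7.1f} m/s")
```

This is of course a classical mechanical analogy; the point the list makes is that assigning the vacuum an effective mass density is what lets an Aether model produce a finite propagation speed for light in the first place.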

"..People have often tried to figure out ways of getting these new concepts. Some people work on the idea of the axiomatic formulation of the present quantum mechanics. I don't think that will help at all. If you imagine people having worked on the axiomatic formulation of the Bohr orbit theory, they would never have been led to Heisenberg's quantum mechanics. They would never have thought of non-commutative multiplication as one of their axioms which could be challenged. In the same way, any future development must involve changing something which people have never challenged up to the present, and which will not be shown up by an axiomatic formulation..."

Paul A. M. Dirac, "Development of the Physicist's Conception of Nature", in The Physicist's Conception of Nature, ed. Jagdish Mehra, D. Reidel, 1973, pp. 1-14.