At the request of a number of people, we have tried to make this basic explanation as simple as possible, with links to the papers where the references are given. This summary has been completely re-written to catch up with Barry's research.
A Basic Summary
Helen Setterfield (with constant help from Barry), November 2008
What you are in for here: The results of Barry’s research so far deal with two main categories: the five anomalies and plasma physics. An anomaly is data (or facts) which don’t agree with a theory. Suppose I said all dogs are black. That was my theory. Then you showed me a white or tan dog. I would have a choice. I could declare what you showed me was not a dog or I could change my theory. Science comes up against this constantly. Unfortunately, in some fields, the theory takes precedence over any conflicting data and the conflicting data is ignored, ridiculed, excused away, or all three.
Five anomalies in astronomy and physics which Barry has found standard science refuses to deal with straightforwardly are the red shift of light, Planck's Constant, atomic mass, the speed of light, and the run rate of atomic clocks.
In looking at these problems, Barry found they were all the ‘children’ of one ‘parent:’ something called the Zero Point Energy.
So the first part of this basic summary will deal with the Zero Point Energy and its effect on these anomalies, as well as the conclusions which are somewhat inescapable as a result.
As far as plasma physics is concerned, it is not as scary as the name sounds. Right now the accepted model for universe formation and maintenance depends on the actions of gravity. However gravity is a very weak force – you defy it every time you pick something up. Because it is so weak, physicists who believe gravity is the controlling force in the universe find themselves having to invent dark matter, dark energy, and dark force to account for what they are seeing. By way of contrast, what we have learned about plasma shows that everything we see ‘out there’ is to be expected from the way plasma behaves, and the evidence we have is that the entire universe is filled with the stuff.
Zero Point Energy -- If you take a container of some sort, and get rid of every atom and particle in it, you have a vacuum, right? Well, yes, but there is still heat energy producing radiation. OK, now turn down the thermostat. To absolute zero. No heat energy left.
Problem: there is still radiation energy which can be measured in your container. A lot of it! Because it is in evidence at the zero mark of the Kelvin scale (absolute zero, where no molecules can move), it is called the Zero Point Energy, or ZPE for short.
To give some comprehension of the vast strength of the Zero Point Energy, consider this: the light bulbs in our homes are rated at about 50-150 watts each. That is the energy the bulbs give out. Our sun shines with the brightness of about 3,000,000,000,000,000,000 such light bulbs. Our galaxy comprises about 150,000,000,000 suns something like ours. If all those suns were to shine for 10,000,000,000 years, that would still be less energy than the Zero Point Energy packs into one cubic centimeter of space. This energy is in each of us and in everything around us, throughout the universe.
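The comparison above can be checked with quick arithmetic. Here is a sketch in Python using the round figures quoted in the text; the 100 W bulb rating is an assumption taken from within the stated 50-150 W range.

```python
# Rough energy output of a galaxy over 10 billion years,
# using the round figures quoted in the text above.

WATTS_PER_BULB = 100          # assumed; the text gives a 50-150 W range
BULBS_PER_SUN = 3e18          # "3,000,000,000,000,000,000 light bulbs"
SUNS_PER_GALAXY = 1.5e11      # "150,000,000,000 suns"
YEARS = 1e10                  # "10,000,000,000 years"
SECONDS_PER_YEAR = 3.156e7

galaxy_watts = WATTS_PER_BULB * BULBS_PER_SUN * SUNS_PER_GALAXY
total_joules = galaxy_watts * YEARS * SECONDS_PER_YEAR

print(f"{total_joules:.1e} J")  # on the order of 1e49 joules
```

So the figure being compared against one cubic centimeter of space is on the order of 10^49 joules.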
The matter in the universe, then, is almost insignificant compared to the sea of energy it is immersed in. For this reason, the Zero Point Energy is not reduced by its interaction with matter. Thus, its strength remains fairly constant in our time.
Red shift of light -- "Red shift" can actually be explained fairly easily. A photon of light is emitted from an atom when an electron is forced out of position and then snaps back into position. The energy required to force it out of position is released when the electron snaps back and that energy is then emitted as a photon of light. Interestingly, this means every element can emit light. What has been found is that each element, when it emits light, does so in a pattern unique to that element. It is very much like the bar codes on the goods that we buy -- the analyzed light wave has a series of dark 'bars' in it. The position of the bars in the light spectrum identifies the element. This is how we know, for instance, that iron is in galaxies far out there. The light contains the iron 'bar codes.'
Here on earth, each element's 'bar code' or light signature, is at an exact place on the color spectrum.
What is observed is that the further we look out into space, the more these 'bar codes' are shifted toward the red end of the spectrum. So, for instance, a line in sodium's pattern, instead of being in the yellow range, would be shifted toward the orange. The pattern for mercury would keep the same spacing, but every line would be shifted a bit more toward the red, and the further out we look, the more the pattern is shifted toward the red end of the light spectrum.
When this red shift was first noticed, around 1925, it was thought to be an indication of something called a Doppler shift. We can hear what this means in sound when a siren passes us: the pitch drops suddenly as it goes by. This is because, as the siren moves away from us, the sound waves are stretched out and the pitch we hear is lower. Using this idea, it was thought that the red shifting of light from distant galaxies meant the galaxies were moving away from us. Thus the idea of an expanding universe came into play.
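The Doppler reading of red shift rests on the standard textbook relation z = (observed wavelength - emitted wavelength) / emitted wavelength; for speeds well below light speed, z is roughly v/c, so a measured shift implies a recession speed. A minimal sketch, with illustrative wavelengths of my own choosing:

```python
# Doppler interpretation of red shift: z = (observed - emitted) / emitted.
# For speeds well below c, z ~ v/c, so a measured z implies a recession speed.

C = 299_792.458  # speed of light, km/s

def redshift(lambda_emitted_nm: float, lambda_observed_nm: float) -> float:
    """Dimensionless red shift z from emitted and observed wavelengths."""
    return (lambda_observed_nm - lambda_emitted_nm) / lambda_emitted_nm

# Illustrative example: a line emitted at 589 nm, observed at 595 nm.
z = redshift(589.0, 595.0)
v = z * C  # implied recession speed under the Doppler reading, km/s
print(f"z = {z:.4f}, implied v = {v:.0f} km/s")
```

Under the Doppler reading, a shift of six nanometres in that line would imply a galaxy receding at roughly three thousand kilometres per second.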
But starting in the 1970s and continuing since, something has been challenging this concept of an expanding universe. Imagine, if you will, a car on a freeway going 50 miles an hour. Then, suddenly, with NO intermediate speeds, it is going 55 miles per hour. Then, again with NO intermediate speeds, it is going 60 miles per hour. This jerking from one speed to another defies physics as we know it in regard to moving objects. And it is this phenomenon which has been observed in red shift measurements the further out we look. If the red shifting of light were an indication of an expanding universe, we would expect to see a smooth increase in the measurements as we look further and further out. But what we actually see are clumps of measurements and then a sudden jump to another distinct clump of measurements.
What has also been seen is that in the centers of galaxy clusters, where movement is known to be very fast, there is a 'smearing out' of the measurements exactly as we would expect movement to produce. So why is the presumed movement of the presumed expansion of the universe not doing the same thing? Is it expanding in jumps and starts?
And why, at the most distant known reaches of the universe, are the measurements suddenly so severely red shifted that they are off all expected scales? We read in the press and journals that the universe is expanding ever faster at its edges. This is an attempt to make sense of those enormous red shift measurements at such distances.
Is it possible that red shift is NOT an indicator of an expanding universe, but due to something else?
Planck’s Constant -- Planck's constant is the measure of something called 'uncertainty.' The jiggling motion of subatomic particles means it is impossible to tell where one is at any given instant in time. They move too fast. But we can see the general area a particle takes up in its jiggling motion. The size of this area indicates the amount of uncertainty in the actual position of the particle at any instant. So as particles get jiggled more, they take up more space, and the uncertainty about exactly where they are at any instant increases. This uncertainty is what Planck's Constant measures. Yet the actual measurements show it is not a constant at all: it has varied over time. Planck's Constant, written 'h' mathematically, therefore measures two things: the strength of the ZPE as well as the resulting uncertainty connected with subatomic particles.
Planck’s constant has been measured as changing. It does not appear that it is a constant at all.
Planck Particle Pairs – extremely tiny positive and negative pairs of matter in the original universe. If an electron were the size of the Golden Gate Bridge a Planck Particle would be the size of a bit of dust on that bridge. How do we know they existed? Because of the general fuzziness we get when we look out to the farthest reaches of space. That fuzziness is constant and uniform, unlike dust clouds.
Atomic mass – Subatomic mass is measured in an electromagnetic environment. It is not a weight measurement or an actual size measurement such as we can make with things we can see. It is measured by the amount of deflection exerted on a stream of subatomic particles, such as electrons, when they are shot through a magnetic field. The place and size of the ‘splatter’ on the end of the tube they are shot down (which has a phosphorescent coating on it) tell us the mass of the particles themselves.
Speed of light – Light can be emitted from any atom. Incoming energy can force an electron out of its proper position in relation to the nucleus of the atom. When the electron snaps back into position, a photon of light is released. The speed of this light is measured in a variety of ways; what is being measured is the time it takes light to leave its source and reach its final destination of absorption.
For about 300 years, despite all error-bar accommodations and all allowances for instrumental and human error, the speed of light was measured as dropping continuously, if only slightly. This was being discussed in scientific journals and lecture halls until 1941. What happened that year was nothing short of bizarre.
Dr. Birge, of the University of California at Berkeley, was considered the 'keeper of the constants.' He kept the official records of the measurements of a number of constants which had actually been measured as changing. He did not like the changes, but he was admitting to them. Until August of that year, when an article he authored appeared in Reports on Progress in Physics. Here is the first paragraph of that paper. Read it carefully:
If you read that paragraph carefully, you will see the last two sentences are in direct contradiction to everything else before them. Nevertheless, the speed of light was, from that time on, declared a constant -- meaning its value was unchanging. This, despite measurements to the contrary. Today the attitude is the same. Any data disagreeing with the declaration that the speed of light is constant must be wrong due to one cause or another, because the speed of light has been declared a constant. Period.
The way the speed of light is measured today does not help the situation. It is normally measured by atomic means, which means the measurements depend on atomic processes, and it is assumed those atomic processes have remained the same. However, light itself, and therefore its speed, is also the result of atomic processes. Measuring the one by the other is something like using two elastic bands. Neither is stretched. One is marked off in inches in the unstretched position. The other is marked too, but only with lines matching the inch marks on the first band. If you hold both pieces of elastic together and stretch them, the second piece will still have its marks matching the first, and the distance between marks will still appear to be an inch, as shown by the numbers on the first piece. But both are stretching together, so you are not getting an accurate picture of what is going on. This is what happens when the speed of light is measured in terms of other atomic constants.
Atomic constants – Atomic constants refer to the rate of atomic processes. When we are in school, be it high school physics or college or university, one of the things we NEVER hear is the idea that some of the atomic constants might not be so constant. This absolute 'constancy' is the backbone of a good part of physics today.
It was not always so. Up until 1941, when the above-quoted Birge article came out, the varying measurements being seen for some of the constants were one of the major topics in the relevant journals. Planck’s Constant was increasing. The speed of light was decreasing. What was interesting was that the speed of light MULTIPLIED by Planck's Constant was always the same: as one went up, the other went down in a precise inverse ratio. The product of Planck’s Constant, h, and the speed of light, c, is a true constant. It is something like the number 12, which you can reach a number of ways in multiplication: 1 x 12, 2 x 6, or 3 x 4. As one multiplicand goes up the other has to go down proportionately, but the product is always 12.
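The "always 12" point can be sketched in a few lines. This is a minimal illustration of the claim in the text, not a physical calculation; the "past epoch" factors are arbitrary.

```python
# The article's point: if c rises by some factor while h falls by the
# same factor, the product h*c stays fixed -- like 1x12, 2x6, and 3x4
# all giving 12.

H_NOW = 6.626e-34   # Planck's constant today, J*s
C_NOW = 2.998e8     # speed of light today, m/s
HC = H_NOW * C_NOW  # the claimed true constant

for factor in (1.0, 10.0, 1e6):      # arbitrary hypothetical epochs
    c_past = C_NOW * factor          # faster light speed...
    h_past = H_NOW / factor          # ...paired with proportionately smaller h
    assert abs(h_past * c_past - HC) / HC < 1e-12  # product unchanged

print("h*c identical in every case")
```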
There are true atomic constants, and then there are those which are not truly constant but have been measured as changing. It is these which provide the ‘anomalous’ data which must be accounted for.
When the concept of an expanding universe entered the secular scientific arena, it was ridiculed. It was condescendingly nicknamed the "Big Bang", even though the idea did not include any kind of explosion. It was rejected as being too close to the "silly ideas" of the Bible. Since the Bible was 'clearly' mythology, there was no way the truth of the cosmos could be allowed to come anywhere near what the Bible said happened.
The Bible does say the universe expanded. But it uses another term for it. In the Bible, God says He stretched the heavens. He says this twelve times. There are two major differences between the idea of the "Big Bang" and the Biblical explanation, however. First, the Big Bang says the expansion continues to this day, whereas the Bible says it was a one time, complete event. Second, the Big Bang has no source for the initial energy fuelling the expansion. In the Bible, God says He did it.
When God stretched the universe, or the heavens, He invested what is referred to as the ‘fabric of space’ with a vast amount of energy. Stretch a rubber band. Blow up a balloon and stretch the fabric of it. Both times you have put your energy into the stretching. Since the energy is not doing anything unless you let it go, it is a sort of 'hidden' energy called potential energy. But let that rubber band go, or don't tie the balloon and let go of it, and all that energy explodes into motion. This energy in motion is called kinetic energy.
Well, God didn't let things go, and they didn't all pop back into something tiny, like a collapsed balloon. Instead some of that energy God invested into the universe when He stretched it out was transformed into tiny, tiny (much tinier than electrons) particles called Planck Particle Pairs. Each pair has one positive and one negative member.
To understand what happened next, the easiest way to think about it is to do an experiment yourself. Half-fill a tub or a large sink with water. Put your hands in the water, under the surface, flat and palms together. Keeping your hands stiff, pull them apart as quickly as you can. Whirlpools form. These whirlpools, or vortices, have three stages: formation, maintenance, and dying down.
Just as the rubber band or balloon let most of its energy go at once, so did the stretched-out heavens. Kazillions of Planck Particle Pairs were formed very rapidly (excuse the technical language there) and began spinning and whirling about. This was the initial, very rapid buildup of the Zero Point Energy, because the whirling of electrically charged particles creates electric and magnetic fields, and the Zero Point Energy is made up of all manner of electric and magnetic fields and waves.
The ZPE continued to build until the whirling ceased. As long as there was motion, however, new PPPs were forming. Then you might expect things to settle down, but... no. Remember that each Planck Particle Pair has one negative and one positive member? They started to recombine, or snap back together. Not all at once, but bit by bit. Each recombination released a tiny amount of energy, but the billions upon billions of PPPs combining at any one time created an enormous amount of energy, and the ZPE kept building.
And then they had all finished recombining. So why doesn’t the Zero Point Energy decrease? Aside from the fact that, as mentioned in the definition section, it is so enormous an amount of energy that interaction with matter does not affect it at all, there is a feedback mechanism. The only drain on the Zero Point Energy is the formation of virtual particles, which also come in pairs of positive and negative members.
To understand virtual particles, you need to understand a bit more about the Zero Point Energy. ZPE waves are of all different lengths and strengths and going in all different directions at once. This means they hit each other. Think of the ocean, with its waves. Now imagine a speed boat going in a direction different from the wave pattern in the water. It creates, in its wake, its own wave pattern. When the ocean's wave hits the boat's wave a whitecap forms. Then it is gone. This same sort of thing happens with the Zero Point Energy. Only its whitecaps are called 'virtual particles.' There are billions upon billions of virtual particles in every bit of space at any given time. Virtual particles form in pairs, one negative and one positive. For this reason, they snap back together and disappear very quickly. But they exist for a very small amount of time and for that amount of time, act like actual particles as we know particles.
When these particles are formed, it is a drain on the ZPE, but when they snap back together, they release the energy they took to form and the energy goes back into the ZPE, thus maintaining its strength.
So we are surrounded by an enormous field of uniform energy which exists everywhere in the universe, inside and outside of everything, including us.
These Zero Point Energy waves batter subatomic particles rather fiercely. Their battering sets up a motion called, in German, the zitterbewegung, or jitter motion. Because subatomic particles jitter so intensely and rapidly, it is impossible to tell where one is at any given instant. This is called their ‘uncertainty.’ We can tell the area in which they are jittering, but not the precise location of the particle doing the jittering. An idea of this would be to look at a fan. You cannot tell, with your eyes, where any given blade is at any one instant in time, but you can clearly see the amount of space the whirling blades are taking up. This is the same idea with subatomic particles.
Something called Planck’s Constant measures the uncertainty of subatomic particles. If the space a particle is taking up is small, then the uncertainty about where the actual particle itself is at any instant is also small, but if the space it takes up is larger, then the measurement of the uncertainty also increases. Picture a boy swinging a rope over his head. The more rope he lets out, the larger the circle of the swing and thus the more space is being taken up. The boy is the same size himself, but he is taking up more and more space with his rope swinging around him. If he were an electron, his size would be determined by how much rope is played out rather than by the size of the boy himself.
In the case of subatomic particles, however, it is an outside force, the ZPE, which increases their jitter motion and thus the amount of space they are taking up, which, in turn, increases their effective size, or mass. So the ZPE is the parent cause of a change in the mass of subatomic particles. Planck’s Constant, in measuring the uncertainty of atomic particles is then also measuring the strength of the Zero Point Energy.
And both were measured as increasing.
As the Zero Point Energy built up, it affected every atom in the universe. But atoms, like all matter, resist change. As the ZPE built up, atoms and their composite particles would resist the change until it could no longer be resisted, then they would react, absorbing that amount of energy change, and so moving to a higher energy state. And every time an atom took up a higher energy state, the atom would emit light that was a little more energetic, or bluer. The red end of the color spectrum is the lower energy level, and the blue end a higher energy level.
If the Zero Point Energy had been affecting the atoms throughout space in this way, causing them to jerk to higher levels of energy as the ZPE built up, then we should see two things in particular. First we should see the resultant red shift of light go in a step fashion and not as a smooth function. Second, since the transition from potential to kinetic energy happens most quickly at first, we should see a very sharp increase in the red shift measurements as we get to the earliest (and thus farthest) reaches of space that we are able to detect.
Both of these things are true. The red shift measurements are quantized and the red shift increases dramatically as we get to the frontiers of the cosmos. This indicates not a more rapid expansion of the universe at its edges, but rather the massive increase in the Zero Point Energy at the beginning of our universe's history as potential energy quickly converted to kinetic energy.
This is interesting because every time in the Bible when God states He stretched out the heavens, the stretching is in the context of Creation Week and the verb indicates a past, completed act.
So we have the Zero Point Energy connected to both atomic mass and the red shift. What about Planck’s Constant? The greater the amount of jiggle caused by the Zero Point Energy, the more space a subatomic particle takes up. That means its uncertainty is greater, and that is what Planck’s Constant measures. Therefore as the Zero Point Energy changes, we should see corresponding changes in Planck’s Constant. That is true, although in practice we work the other way around: it is the uncertainty we can measure, and it is that measurement which tells us whether the strength of the Zero Point Energy has changed. And yes, both have changed through time.
Then there is light speed. How is the Zero Point Energy related to that? Remember virtual particles? They are the result of the waves from the ZPE hitting each other and forming ‘white caps.’ When virtual particles form, they may only last the merest fraction of an instant, but in that amount of time they act like real particles and can absorb a photon of light. As a photon of light is speeding through space, it will hit billions of virtual particles. Each time it hits one, it is absorbed and then the two halves of the virtual particle snap back together, and thus out of existence, releasing the photon of light to go on its way. Then it hits another virtual particle. Although the time delay for the photon is extraordinarily small because of the almost instantaneous disappearance of virtual particles, there is, nevertheless, a very small time delay. So the more virtual particles the photon of light encounters, the longer it will take it to reach its final point of absorption in your shirt or your wall, or a leaf, or wherever.
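The mechanism described above can be put into a toy model: a photon travels at full speed between virtual particles, but each absorption and re-emission adds a tiny delay, so the average speed over a path drops as the particle count rises. The particle densities and delay below are illustrative numbers of my own, not measured values.

```python
# Toy model of the absorption/re-emission picture: full speed between
# encounters, plus a small fixed delay per encounter. Densities and the
# per-hit delay are illustrative, not measured.

C = 3.0e8        # in-between travel speed, m/s
DISTANCE = 1.0   # path length, m

def effective_speed(encounters_per_metre: float, delay_per_hit_s: float) -> float:
    """Average speed over DISTANCE given absorption/re-emission delays."""
    travel_time = DISTANCE / C
    delay_time = encounters_per_metre * DISTANCE * delay_per_hit_s
    return DISTANCE / (travel_time + delay_time)

# More virtual particles per metre -> lower average speed:
sparse = effective_speed(1e3, 1e-15)
dense = effective_speed(1e6, 1e-15)
print(f"{sparse:.3e} m/s vs {dense:.3e} m/s")
```

In this toy picture the in-between speed never changes; only the accumulated delays grow, which is exactly the distinction the text draws.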
Very early in the life of the universe, the Zero Point Energy started as almost nothing and then built up extraordinarily quickly. Think of how fast a balloon travels when you blow it up and then release it. Or how fast a rubber band flies if you stretch it and then release it. The buildup of the Zero Point Energy followed that same mathematical curve -- very, very fast at first and then much more slowly as most of the potential energy expended itself as kinetic energy.
Thus, the speed of light at the beginning was also extraordinarily fast. There were very few virtual particles to impede its progress. But as the ZPE built up quickly, so did the number of virtual particles in any given space at any given time, and the speed of light necessarily slowed as a result. Actually, the speed of light in between the virtual particles never slowed -- it was still extraordinarily fast. But it took a longer and longer time to reach its destination.
To give you an idea of the current number of virtual particles in existence at any given moment, the body of a six-foot man contains, at any instant, something like 100 billion billion virtual particles. Read that as 100,000,000,000,000,000,000, or 10^20. Now, a photon of light would not hit them all, as that is a volume of virtual particles and light travels in a straight line, but it does give you an idea of the hurdles a photon of light must negotiate coming through space.
There is one last anomaly related to the Zero Point Energy, and it is this anomaly and its necessary conclusion (when admitted to) that brings down the ax from standard physics and cosmology and science in general. This is the ‘run rate’ of atomic clocks. An ‘atomic clock’ is time measured by atomic processes. If atomic processes have not remained constant, then radio decay has not remained constant. And if radio decay has not remained constant, then trying to judge the age of a rock, or the earth, or the entire universe from atomic processes and, in particular, from radiometric dates is not going to give us accurate orbital dates (time as we measure it in days and years).
As already mentioned, atomic mass is governed by the Zero Point Energy. What about atomic processes?
The first clue Barry had regarding the implications of his early research on the speed of light was that the speed of light appears in the numerator of every reduced radio decay rate equation, or equivalently its inverse partner, Planck’s Constant, appears in the denominator. This meant that if the speed of light had been faster in the past (or, with the same effect, Planck’s Constant had been lower in the past), then radio decay rates would have been proportionately faster in the past. This means a number of atomic years could tick off during one orbital year.
Let’s try a short explanation of how one form of radio decay works. We are often shown pictures of an atom which indicate the electrons are moving extraordinarily quickly around the nucleus, and that is true. What we laymen do not usually understand is that the subatomic particles in the nucleus are also jittering around, battered by the ZPE. Unlike the freer electrons, however, the neutrons and protons in the nucleus are contained electromagnetically. The particles in the nucleus are battering against this electromagnetic ‘wall,’ and the larger the nucleus, the more particles are banging around and the more likely one will get out. This explains why the heaviest elements tend to decay faster than lighter elements. But there is another part to this: the less mass any of the subatomic particles in the nucleus has, the faster it bangs around and the more times it hits the electromagnetic wall. The more hits, the more chances of escape.
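The escape argument above can be sketched as a small simulation. This is a toy picture only, under the assumptions in the text: a particle's hits per unit time scale with its speed, each hit has the same tiny escape chance, so faster (lighter) particles escape sooner on average. The escape probability and trial counts are arbitrary.

```python
# Toy simulation of the "banging on the wall" picture: hits per unit time
# are proportional to speed; every hit has the same small escape chance.
import random

def mean_escape_time(speed: float, p_escape: float = 1e-3,
                     trials: int = 2000, seed: int = 0) -> float:
    """Average time to escape, counting hits and dividing by hit rate."""
    rng = random.Random(seed)
    total = 0.0
    for _ in range(trials):
        hits = 0
        while True:
            hits += 1
            if rng.random() < p_escape:  # this hit gets through the wall
                break
        total += hits / speed  # time = number of hits / (hits per unit time)
    return total / trials

slow = mean_escape_time(speed=1.0)    # heavier, slower-jiggling particle
fast = mean_escape_time(speed=10.0)   # lighter particle, banging more often
print(fast < slow)  # prints True: more hits per unit time, quicker escape
```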
In the past, as mentioned above, subatomic particles were much less massive as they were not getting jiggled as much by a lower ZPE. Therefore the particles in the nuclei of atoms were able to escape much more quickly and yes, radio decay happened much more quickly at the beginning than it does now. In pretending that radio decay rates have remained constant through the ages, scientists have provided a false age for the universe, for the earth, and for the rocks they measure. The ‘atomic ages’ may be correct, but they do not correspond with our calendar ages until the rate of change has been taken into account.
One of the major shocks Barry got during his work was the results of working through the math in this area. The results indicated that the Bible was absolutely correct in not only the age of the entire universe, but the age of the earth as well. It is primarily for this reason that his work is ridiculed by standard science. However what he has done is simply to collect the data and work with it. Where it leads is where it leads.
A Static Universe?
If this sort of static universe is true, then we should see evidence of this oscillation. And that is precisely what happened in 1970, as all the ‘constants’ which had been measured as changing in one direction switched and started changing in the other. We had evidently reached a minimum in size, and for hundreds of years the speed of light had been measured as slowing. Then, in 1970, the speed of light started to be measured as increasing a bit. Planck’s Constant, conversely, which had been increasing up until 1970, started decreasing. The mass of subatomic particles, which had been increasing, showed a slight decrease. The universe itself was oscillating outwards after having reached its minimum. The measured changes are not dramatic, but they are there. They are caused by the contraction and then expansion of the space in which the Zero Point Energy is operating. The more space, the more spread out the energy and the less intense its effects. The less space, the more compact the energy and the more intense its effects.
In conclusion for this part: the massive increase of the Zero Point Energy at the beginning of creation was caused by the fast conversion of potential to kinetic energy through the action of the newly created Planck Particle Pairs. The initially lower ZPE, however, meant that light speed was about 6 x 10^11 times faster than we measure it today. Planck’s Constant was correspondingly lower, as the subatomic particles were not getting bashed about as much. As a result, atomic processes themselves were faster, and radio decay rates started out enormously fast. The atomic clock ticked off about six billion years in the first four days as a result.
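The arithmetic behind that last sentence can be checked. Treating the initial 6 x 10^11 speed-up as roughly constant over those first days (a simplification, since the text says the rate fell off quickly):

```python
# Rough check of the article's figure: atomic processes running about
# 6 x 10^11 times faster tick off billions of atomic years in a few
# orbital days. The constant-rate assumption is a simplification.

RATE_FACTOR = 6e11        # claimed initial speed-up of atomic processes
ORBITAL_DAYS = 4          # "the first four days"
DAYS_PER_YEAR = 365.25

atomic_days = ORBITAL_DAYS * RATE_FACTOR
atomic_years = atomic_days / DAYS_PER_YEAR
print(f"{atomic_years:.2e} atomic years")  # ~6.6e9, i.e. about six billion
```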
How do we get this kind of figure? The red shift tells us. The red shift curve is a direct result of the ZPE, which is the parent cause of all of these things. So by looking at the red shift curve, calculations can be made for the other constants. That curve shows us how the speed of light and the atomic clock have both slowed down. Charting that curve against orbital time gives us a timing for the world’s events which ends up, and again this was a surprise Barry did not expect, exactly correlating with the biblical record.
God knew what was going to happen here. This is why, in Genesis 1:14, He told us to keep time by astronomical events: the sun, the moon, and the stars. The gravitational rate is steady. Atomic rates are not.
The graph below shows the current speed of light on the left and works back toward the beginning of the universe as we proceed to the right. Thus we can see that the speed of light dropped very swiftly at first, as the ZPE increased just as rapidly, interfering with the travel of the photons. The ZPE measurements are on the left of the graph and the light speed measurements are on the right. Remember that 'now' is the bottom left corner!
Plasma Physics

Why is gravity assumed to be the controlling force of the universe? Perhaps because it is gravity that keeps the moon going around the earth, and the earth and all the planets going around the sun. But whatever the original reason, the standard model for the universe, the galaxies, and the stars is based on the force of gravity. Gravity certainly exists! We can see its effects every day. But, all things considered, gravity is a very weak force. You defy it every time you lift something up.
And because gravity is such a weak force, scientists cannot figure out how galaxies came together. They cannot explain the formation of the first stars. The fact that the outer sections of the spiral arms of galaxies spin around the center as fast as the inner sections do is something they cannot deal with... without supposing that the vast majority of the universe must be made up of dark matter. Dark matter is stuff we cannot see. We never have seen it. We have never found it in any form using any instruments. But it must be there for gravity to work in forming and keeping the universe the way it is. To the imaginary dark matter have been added dark energy and dark force, all necessary to support the gravitational model.
But over a hundred years ago Kristian Birkeland was exploring some electrical phenomena and, although he did not call it plasma, found that electrical phenomena were probably responsible for our beautiful auroras.
In the early twentieth century, however, a well-respected theoretical physicist named Sydney Chapman ridiculed the work on plasma and discounted its effects, and until his death it was not taken seriously, even though Hannes Alfvén received a Nobel prize in physics for his work with plasma. Alfvén also predicted that we would find space full of plasma filaments. But because of Chapman, our educational institutions were firmly grounded in the gravitational model, and plasma was ignored. This is still the case most of the time today.
But let’s take a quick look at plasma, what it is and what it does, and see if men like Birkeland and Alfvén were right, or if Chapman was right in telling us to ignore plasma in favor of gravity as the main force in the universe.
Our space probes proved Birkeland was right. The auroras result when the ionized gases, or plasma, surrounding the earth are put into glow mode. This glow mode is produced by the particles in the solar wind, which are effectively an electric current, being channeled to the magnetic poles of the earth and charging the plasma there.
A plasma is electrically charged because some, if not all, of the electrons are separated from their nuclei, leaving the nuclei with positive charges and the loose electrons with negative charges. Any movement at all in this plasma constitutes an electric current. Around every electric current is a circling magnetic field. There is no exception to this. This circling magnetic field causes plasma to form long, stringy lines called filaments. Alfvén said we would see them in space. See if you think he was right.
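The 'circling magnetic field' around a current is standard electromagnetism. For a long, straight current it is given by Ampère's law; this is textbook physics added here only as an illustration:

```latex
% Magnetic field strength at distance r from a long,
% straight current I (Ampère's law):
B(r) = \frac{\mu_0 I}{2\pi r}
% The field lines form closed circles around the current.
% It is this circling field that squeezes a current-carrying
% plasma into long, stringy filaments.
```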
When we work with small plasma filaments in the lab, some interesting things happen. When two of them approach each other, they start to interact. This interaction has been filmed and what we see is that just two filaments end up producing every structure we see in outer space, quickly and efficiently. No gravity needed. Here are stills of the progress. In these pictures we are looking down on the two filaments as they start interacting.
Now look at what a spiral galaxy looks like (the one below is named M81).
See how the stars in this galaxy are more or less lined up on the filaments? The filaments, just like lightning, are very unstable and can pinch easily. These pinches come in many forms, but the most common is the 'z-pinch,' also known as the Bennett pinch. When a plasma filament pinches, a star forms. That is why we see series of stars, like beads on a string, along the plasma filaments.
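For the curious, the Bennett pinch has a simple equilibrium condition in standard plasma physics. Again, this is textbook material added for illustration, not something from the original article:

```latex
% Bennett relation for a z-pinch in equilibrium:
%   N        = number of particles per unit length of filament
%   T_e, T_i = electron and ion temperatures
%   I        = current flowing along the filament
N k_B \left(T_e + T_i\right) = \frac{\mu_0 I^2}{8\pi}
% If the current I grows large enough, the magnetic squeeze
% on the right-hand side exceeds the thermal pressure on the
% left, and the filament compresses: a 'pinch.'
```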
There is a lot more to the whole thing, but the end result is that we can see exactly how the early plasma formed the quasars and various types of galaxies and the stars without the need for gravity or dark anything.
Sometimes a plasma filament will fracture within itself, becoming something like a cable, with the 'daughter' filaments bound together by the outside magnetic field. When a pinch happens in such a cable, the outer daughter filaments respond first, forming objects which then circle the center. As the pinch moves inward, more and more objects are formed. Finally only the central core of the filament is left, and when the pinch reaches it and compresses it, it lights up. This may be why the Bible indicates the earth was formed before the sun. A plasma beginning explains it without any complications.
Plasma also sorts matter. The more highly ionized elements are channeled into the center of the filament. In a cable type filament, the inner daughter filaments contain more heavy metals than the outer ones do. Within each of these separations, additional separations also take place so that whatever elements are in each daughter filament are also sorted. This is seen. This is known.
And this explains why little Mercury is so dense and heavy, with its large iron core. As we go out from the sun, each of the planets has a smaller and smaller percentage of iron and other heavy elements in its core, and more and more of the lighter elements surrounding the core. Each planet's elements were sorted as it formed in response to the pinch, and this was after the initial sorting within the original plasma filament itself, which brought the most easily ionized elements (which are usually the heaviest elements) into its center.
One last point. When the Zero Point Energy was lower, in the same way that atomic processes were faster, so were the processes taking place in the plasma filaments in space. Thus the stars, the galaxies, and the planets were able to form much more quickly than we would see happen today. In short, the plasma model is not only based on what we can see and have worked with, it has none of the problems the gravitational model has.
We have also reviewed a lot of this material in a biblical context in our Bible Study "Genesis 1 - 11, Can You Believe It?"
The following articles deal with the subjects above:
The data concerning the speed of light measurements was originally published by Flinders University in Australia after Lambert Dolphin, a (now retired) senior research physicist at Stanford Research Institute International, requested a paper regarding the light speed changes: Atomic Constants, Light and Time
Two papers dealing with the redshift and what it means are here:
The paper dealing with whether or not the redshift means the universe is expanding now is Is the Universe Static or Expanding?
More discussion on the vacuum of space is Exploring the Vacuum.
The link between general relativity and the Zero Point Energy is General Relativity and the Zero Point Energy.
A discussion of the two types of time-keeping, atomic and orbital, can be found in section 3.16 of Behavior of the Zero Point Energy and Atomic Constants.
Reviewing the Zero Point Energy was published in 2007 and provides some timely updates.
We have a group of three charts showing measured changes of the speed of light, Planck's constant, and the mass of the electron with the references.
More explanation about plasma can be found on our website here: Was this How God did it?
An article published in 2008 combining both the Zero Point Energy and Plasma physics is Reviewing a Plasma Universe with Zero Point Energy
Understanding that standard plasma physics is NOT creationist and does not try to combine plasma physics with faster processes in the past, the research that is going on is nevertheless fascinating. Some links are at Peratt's site, The Electric Universe and Thunderbolts.
Also on the Setterfield website is a whole series of questions that have been asked of Barry, along with his responses. Many of these are in the Discussion Section.