Big Bang

The Big Bang Theory and Our Present State of Knowledge

For the latter portion of the twentieth century it has been the Big Bang Theory that has prevailed in theoretical physics and, by convincing argument and fit with observation, in other branches of science. Also known in cosmology as the “standard model,” it offers a partly empirical, but mostly theoretical, explanation of how the universe began and how it continues to operate. It is this theory, as well, that has become the subject of everything from gentle discussion to heated debate among scientists, theologians and philosophers as to the origin(s), history and, indeed, the future of the universe in which we abide.

In 1927 Georges Lemaître (b. A.D. 1894 – d. A.D. 1966), a Belgian physicist/cosmologist and a priest, began to put together theoretical inferences that built upon earlier (1922) work by the Russian mathematician Alexander Friedmann (b. A.D. 1888 – d. A.D. 1925); much later (1948), the Russian-American physicist George Gamow (b. A.D. 1904 – d. A.D. 1968) developed these results into the theory that would eventually acquire the label “Big Bang Theory.” 1

In a nutshell, the popularly accepted tenets of today’s “Big Bang Theory” state that:

The universe originated from an initially dense state of high energy that was extremely hot (approximately 100 billion degrees Celsius);

This primordial energy burst forth in an explosive array that formed an initial dense concentration of matter that has continued to expand outward since the actual “big bang” itself;

The “big bang” universe is isotropic (i.e., “exhibiting properties with the same values when measured along axes in all directions”) and homogeneous (i.e., “of uniform structure”);

We know that the universe continues to expand out from its primordial state;

We know what the “big bang” was like, but can know that only as far back as 10⁻⁴³ seconds after it took place. Anything before that very tiny moment of time cannot be accounted for with our present state of knowledge.

With regard to research about the “Big Bang Theory,” many scientists, theologians and philosophers in the latter half of the twentieth century have come to recognize that, within our observable universe, two basic types of knowledge seem to be attainable: analytic knowledge 2 is that which we seem to be able to know almost with 100% certainty, and synthetic knowledge 3, which lies outside of analytic knowledge and cannot be known with 100% certainty given the most current level of information and experimental methods available.

Scientists, theologians and philosophers seem to be able to handle, at least to a certain extent, two types of events in our universe: reproducible events, which can be observed and experimented with because they either happen over and over again or can be made to take place over and over again; and unpredictable events which can, at least, be observed and reported when they happen via statistical inquiry. There is, however, a third type of event, one called a singular event (or singularity) that cannot be reduced to observable or statistical inquiry because it was only capable of happening once in our universe and it will not ever happen again.

From the best theoretical physics and theoretical mathematics of the Big Bang Theory comes the assertion of a singular event that is, admittedly, impossible to reproduce (although many scientists working with supercollider machines believe that it may become possible within a few decades, as scientific information and observation methods improve): the Creation of the universe.

In 1965, two scientist-researchers at the Bell Telephone Laboratories, Arno Penzias and Robert Wilson, were using an extremely sensitive antenna to measure galactic radio waves. In using their antenna they observed some very weak and unexplained electromagnetic radiation that “seemed to be coming simultaneously from all directions in outer space.” They soon perceived that this radiation was precisely what had been predicted by the Big Bang Theory. This rather constant emission is now known as the Cosmic Background Radiation and is almost universally accepted as a measurable remnant of the original Big Bang radiation.

Although this is admittedly what makes discussion among theologians, scientists and philosophers about the origins of the universe a difficult subject to wrestle with, it is also the framework that is making it more possible for all who are willing to come together in dialogue. It seems, after all, that science and religion may have more common ground than has been believed since the time of the Enlightenment. The best of our knowledge as we approach the new millennium does not allow us, whether from the vantage point of religion or of science, to know empirically what or who the origin of the universe is based in. Many continue to speculate, and many have come to believe that appropriate dialogue has brought to the discussion table people who would not have thought of joining it as recently as 50 years ago.

Quotable Quotes:

Wernher von Braun (b. A.D. 1912 – d. A.D. 1977): “I find it as difficult to understand a scientist who does not acknowledge the presence of a superior rationality behind the existence of the universe as it is to comprehend a theologian who would deny the advances of science.” [from Morris, Henry M. (1982, 1988). Men of Science, Men of God: Great Scientists Who Believed the Bible. Master Books.]

Paul Davies (British-born mathematical physicist/professor of philosophy): “It is impossible to be a scientist, even an atheist scientist, and not be struck by the awesome beauty, harmony, and ingenuity of nature. What most impresses me is the existence of an underlying mathematical order, an order that led the astronomer Sir James Jeans to declare, ‘God is a pure mathematician.’”

Gregg Easterbrook (Journalist): “if nothing else, the theological idea of creation ex nihilo – out of nothing – is looking better all the time as ‘inflation’ theories increasingly suggest the universe emerged from no tangible source. The word ‘design,’ rejected by most 20th-century scientists as a theological taboo in the context of cosmology or evolution, is even creeping back into the big-bang debate.” (In U.S. News and World Report, August 10, 1998)

Leon Lederman (American experimental physicist): “We don’t know anything about the universe until it reaches the mature age of a billionth of a trillionth of a second – that is, some very short time after creation in the Big Bang.”

Lederman again: “When you read or hear anything about the birth of the universe, someone is making it up. We are in the realm of philosophy. Only God knows what happened at the very beginning (and so far She hasn’t let on).”


1 The term “Big Bang” was coined by Fred Hoyle during a 1949 BBC radio broadcast. He used it, reportedly in a derogatory sense, to set the theory apart from the one he himself held at the time, the “Steady State Theory.” The term “Big Bang,” however, stuck.

2 An example of analytic knowledge within the field of mathematics that has been acquired and seems to be 100% certain might be: 2+2=4.

3 Two examples of synthetic knowledge, which could also be asserted on the basis of “faith” and/or “belief” as well as of observable/theoretical information, and which are, therefore, nowhere near 100% certain, may suffice for our discussion:

It is far from 100% certain that all known matter is built from six quarks and six leptons, as many theoretical physicists presently assert. However, some of the latest findings in theoretical physics and theoretical mathematics have impelled many to believe that these were the earliest building blocks for all matter in the “Big Bang” universe, even though our present state of knowledge is probably decades away from demonstrating this and, therefore, from approaching anything like 100% certainty about the matter; and

It is far from 100% certain that an Intelligent Designer, a Creator, a God, actually created the universe and all within it. The theoretical and/or allegorical information about creation available from various sacred texts (e.g., The Bible, The Quran, etc.) and from the undemonstrated theories of many theologians and/or scientists and/or philosophers has not yet been substantiated with anything near 100% empirically certain observation.


Aviezer, Nathan. (1999). The Big Bang Theory. On the Torah and Science Web site.

Brinton, Crane, John B. Christopher and Robert Lee Wolff. (1967). Man’s Fate in the Twentieth Century: The Sciences. In A History of Civilization (Volume Two). (pp.652-654), Englewood Cliffs, New Jersey: Prentice-Hall, Inc.

Ferris, Timothy. (1997). Preface. In The Whole Shebang: A State-of-the-Universe(s) Report. (pp.14-17), New York: Touchstone Books.

Gange, Robert. (1994). The Fingerprints of God: The Origin of the Universe. The Forerunner.

Haines-Stiles, Geoff. (1996). Why Do Scientists Believe in the Big Bang Theory? On the Live from the Hubble Telescope Web site/NASA.

Lederman, Leon with Dick Teresi. (1993). The invisible soccer ball (chapter 1). In The God Particle: If the Universe is the Answer, What is the Question? (p.1), Boston: Houghton Mifflin Co.

Morris, Henry M. (1998). Can Scientists Be Christians? The Mandate, 18(29).

Odenwald, Sten. (1987). Beyond the Big Bang. Kalmbach Publishing Co. Available on the Web.

Weinberg, S. (1977). The First Three Minutes: A Modern View of the Origin of the Universe. (p.188), New York: Basic Books.

Weisstein, Eric W. (1996-98). Friedmann, Alexander/Gamow, George/Lemaître, Abbé. On scientific biography Web site.

Whitmore, Bradley C. (1993-1996). Cosmology. In the Encarta encyclopedia (CD-ROM version). Microsoft Corporation.

DNA Structure and Function

by Timothy R Swift


Deoxyribonucleic Acid (DNA) is a biochemical molecule found within all living things. It determines the basis of heredity and dictates, in blueprint fashion, the shape, composition, and function of those living things. It is analogous to the world’s most complex organic computer program in that it contains the coding for the construction and operation of every cell in the human body.

The complexity and precision of DNA are vital for life to exist. DNA is specific not only in constituting the differences between species but also in sustaining the similarities that underlie shared biological processes. The complexity of the DNA code offers one of the greatest challenges to macro-evolutionary theory.

Due to its precision and complexity, it is unlikely that DNA could have evolved from a spontaneous event. Just as all chemical reactions must be specifically directed to yield specific results, so DNA, and with it the creation of life and the variety of species, each having specific needs and functions, must also be directed. In opposition to macro-evolution, DNA is programmed to resist mutation and maintain the integrity of its code. Observed changes in organisms are either: a) the result of pre-programmed allowable variations, whose results can be positive, or b) the result of random copying errors, which always result in negative mutations.


The DNA in a single cell, far too long to fit there if stretched out, is able to reside within it because it coils itself up so tightly that it occupies only a microscopic space. DNA does this by the use of topoisomerases, enzymes that catalyze the supercoiling of DNA and its relaxation by cutting and rejoining strands. (This is similar to what happens to a rubber band that is tightly twisted in one direction until it collapses into a little ball.) DNA is present in every living cell; in plants and animals it resides inside the nucleus. Each species carries a characteristic amount of DNA. Large viruses have less than a millimeter of it; large bacteria have about 1.5 millimeters; fruit flies have about 2 inches.

Humans have approximately six feet of DNA in each and every cell.

Human bodies carry a total of about five grams of DNA. If all the DNA from one human body were collected and placed end to end it would span about 29 million miles, or 4.65 × 10⁷ kilometers. That is enough to circle the earth more than a thousand times, to make the trip to the moon and back roughly sixty times, or even to reach the roughly 26 million miles to Venus at its closest approach!

In 1953, James Watson (b. A.D. 1928 – ) and Francis Crick (b. A.D. 1916 – ), with the eventual assistance of Maurice Wilkins (b. A.D. 1916 – ), discovered the structure of DNA by working on a chemical model with the correct bond angles and lengths. By building the model using the correct structure of each molecule, they were able to determine the shape and size of DNA.

When DNA is coiled up it resembles a spring within a spring. Scientists call this a “double helix.” The center of the helix is filled with molecules called “base pairs” connected to one another. Each base is, in turn, attached to a sugar called “deoxyribose,” which, together with phosphate groups, forms the backbone of each strand.


There are four different molecules, or chemical letters, that make up “base pairs.” They are called “adenine,” “cytosine,” “guanine,” and “thymine.” These are called base pairs because they are chemical bases (as opposed to acids) which pair up together inside the coil of DNA and resemble the rungs of a ladder. These “rungs” join one strand of DNA to the other strand, holding the double helix together to resemble a spring within a spring. A sequence of three of these four letters forms a “word” (codon) that codes for one amino acid. Certain codons, called stop codons, instead mark the end of an amino acid sequence.
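The three-letter “word” scheme just described can be sketched in a few lines of code. The dictionary below is only a small, hypothetical excerpt of the standard 64-codon genetic code, chosen for illustration; the reading logic, however, matches the description above: step through the strand three letters at a time and stop at a stop codon.

```python
# Minimal sketch of codon-by-codon reading.
# GENETIC_CODE is a tiny excerpt of the standard table, for illustration only.
GENETIC_CODE = {
    "ATG": "Met",  # methionine; also the usual "start" signal
    "TGG": "Trp",  # tryptophan
    "GAA": "Glu",  # glutamic acid
    "AAA": "Lys",  # lysine
    "TAA": "STOP", "TAG": "STOP", "TGA": "STOP",  # stop codons
}

def translate(dna: str) -> list[str]:
    """Read a DNA coding strand three letters at a time,
    stopping at the first stop codon."""
    protein = []
    for i in range(0, len(dna) - 2, 3):
        codon = dna[i:i + 3]
        amino = GENETIC_CODE.get(codon, "?")
        if amino == "STOP":
            break
        protein.append(amino)
    return protein

print(translate("ATGGAAAAATGGTAA"))  # ['Met', 'Glu', 'Lys', 'Trp']
```

The stop codon TAA at the end of the sample strand terminates the reading, just as the text describes for the end of an amino acid sequence.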


It is apparent that specific base pairing is very important in DNA, and so is the sequence in which those base pairs occur: it is the order of the bases along the DNA strand that determines the structure, function and makeup of the organism. The sequence of the base pairs acts as the “chemical code” or program, i.e., the genetic information that tells the cell what to build and when to build it.

Genes are those portions of DNA defined by a specific sequence of base pairs. Each gene is the basic functional unit of heredity and carries the information required for constructing the building blocks of proteins. Human genes vary widely in length, often extending over thousands of bases; an average-sized gene has about 3000 base pairs, and the entire human genome has about 3 billion base pairs. Proteins are made of amino acids, and DNA codes for each particular amino acid that is built.

Amino acids are biochemicals that, when put together in long chains, make proteins. Amazingly, although thousands of amino acids exist, most living things use only the same twenty! These acids assemble according to the DNA blueprint instructions to form proteins, each of which can contain from 50 to 1000 amino acids. Proteins are the micro-machine workhorses of the biological organism: a typical cell has many thousands of different kinds of proteins, each performing a specific function, and without them life could not exist. Not only do the DNA base pairs code for each particular amino acid and, thus, every protein; they also code for when to start and stop building them.

“One million bases (called a megabase and abbreviated Mb) of DNA sequence data is roughly equivalent to 1 megabyte of computer data storage space. Since the human genome is 3 billion base pairs long, one would need 3 gigabytes of computer data storage space to store the entire genome. This includes nucleotide sequence data only and does not include data annotations and other information that can be associated with sequence data.” 1

If compiled in book form, the entire base sequence of the human genome would fill 200 volumes of a Manhattan-style telephone book, each volume having 1000 pages. Working around the clock and reading 10 bases per second, it would take a person roughly nine and a half years to read the whole sequence.
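The quoted rule of thumb (one base of sequence is roughly one byte of storage) and the telephone-book comparison can be checked with a little arithmetic. The figures below are only a back-of-the-envelope sketch under those stated assumptions:

```python
# Back-of-the-envelope check of the storage and book figures,
# assuming one byte per base, as in the quoted rule of thumb.
GENOME_BASES = 3_000_000_000

bytes_needed = GENOME_BASES * 1          # 1 base ~ 1 byte
gigabytes = bytes_needed / 1_000_000_000
print(f"{gigabytes:.0f} GB")             # 3 GB

# Phone-book comparison: 200 volumes of 1000 pages each means
# each page must hold this many letters:
letters_per_page = GENOME_BASES / (200 * 1000)
print(f"{letters_per_page:,.0f} letters per page")  # 15,000
```

Fifteen thousand letters on a densely printed telephone-book page is plausible, which is why the 200-volume comparison holds up.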

Interestingly, the protein coding function involves only about 10% of the base pairs.

The functions of much of the rest of the base pairs remain a mystery. Through the Human Genome Project, man may one day understand their purpose. The 3 billion base pairs are divided into 23 physically distinct groupings called “chromosomes,” and the smallest human chromosome has about 50 million bases.


If we unzip a strand of DNA by breaking just the bonds between the base pairs, we have a sort of “template” for making more DNA. Since each base can pair only with its specific partner, we could insert one open strand into a “soup” of bases, deoxyribose and phosphate, and come out with a complete new double helix of DNA.

That is, in fact, what occurs in every living cell: the DNA is unwound and opens at the base pairs, a new strand builds off each template, and the process known as “DNA replication” takes place.
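The template idea is simple enough to sketch in code. Given one strand, the base-pairing rules (A with T, C with G) fully determine the strand that would assemble against it; this toy version ignores strand direction and the enzymes involved, and is only an illustration of the pairing logic:

```python
# Sketch of template-directed copying using the base-pairing rules.
PAIR = {"A": "T", "T": "A", "C": "G", "G": "C"}

def complement(strand: str) -> str:
    """Build the strand that would assemble against this template."""
    return "".join(PAIR[base] for base in strand)

template = "ATGCCGTA"
copy = complement(template)
print(copy)                          # TACGGCAT
assert complement(copy) == template  # copying the copy restores the original
```

The final assertion captures why one unzipped helix yields two identical helices: each separated strand uniquely specifies its partner.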

DNA also has a redundancy, or error-tolerance, mechanism in place: several codons exist for one amino acid. In other words, several different base sequences can code for the same amino acid, which helps ensure that the correct amino acid is made every time. In order for the correct proteins to be built, the correct amino acids must first be built, and it is essential that this synthesis happen at the correct time, as it is needed. (Living organisms are constantly making and breaking down amino acids and proteins according to what is needed at a specific time; excess amino acids are degraded in the liver to become fuel for energy.)
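This redundancy can be illustrated with one real family from the standard genetic code: all four codons beginning with “GG” specify the same amino acid, glycine, so a mutation in the third position leaves the protein unchanged. The lookup function below is a toy covering only that family:

```python
# Illustration of the code's redundancy: the four "GG_" codons
# all specify glycine, so a third-position mutation is silent.
GLYCINE_CODONS = {"GGT", "GGC", "GGA", "GGG"}

def amino_for(codon: str) -> str:
    # Toy lookup covering only the glycine family.
    return "Gly" if codon in GLYCINE_CODONS else "?"

original = "GGT"
mutated = original[:2] + "A"   # third base mutates: T -> A
print(amino_for(original), amino_for(mutated))  # Gly Gly
```

Even though the DNA sequence changed, the amino acid, and hence the protein, did not: this is the buffering effect the paragraph above describes.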

The complexity and perfection of the DNA code are vital for life to exist.

Due to the precision and complexity of DNA, it is unlikely that it could have evolved from a spontaneous event. Just as all chemical reactions must be specifically directed for specific results, so DNA, and the creation of life and the variety of species, each having specific needs and functions, must also be directed.


Undirected DNA replication and composition do occur, however, and they have consistently had devastating results. Every change or mutation large enough to cause a change from species to species has, instead, caused genetic disease that adversely affected the particular living organism.

Representative genetic diseases include Turner syndrome, Klinefelter syndrome, Down syndrome, sickle cell disease/anemia, leukemia and other cancers, hemophilia and many others, as well as spontaneous abortions and birth defects. Many of these major diseases are caused by a single altered base. In essence, whenever DNA changes without direction, if the changes are big enough to matter for macro-evolution, they produce unproductive results.


The Human Genome Project is a 15-year international effort to discover all the genes that exist in the human body and map them down to the base pair level. It began in October 1990 and is being carried out by the DOE Human Genome Program (directed by Ari Patrinos) and the NIH National Human Genome Research Institute (directed by Francis Collins). Other participants, in addition to the United States, include Australia, Brazil, Canada, China, Denmark, France, Germany, Israel, Italy, Japan, Korea, Mexico, the Netherlands, Russia, Sweden, the United Kingdom, and the European Union.

It is estimated that there are between 60,000 and 80,000 human genes and a total of 3 billion base pairs. The goals of the project are: 1) to discover these genes and make them accessible for further biological study; 2) to determine the complete sequence of the 3 billion DNA base pairs; 3) to store this information in databases; and 4) to develop tools for data analysis.

The cost of this project is enormous due to the innovative technology being developed, the databases required for storage, and the construction of labs and equipment. Thus far the cost has exceeded $300 million.

Many other people heading portions of the project are subcontracted and funded by the two organizations mentioned. Laboratories around the world and researchers at major universities in the United States are involved, with about 200 scientists taking part in the core study. In addition, many large and small privately funded companies are conducting their own research, and about 1000 individuals from 50 countries are members of the Human Genome Organization, which helps to coordinate international collaboration on the project.

There is also a group called the Genome Database (GDB), the public repository for human genome mapping information. In addition to mapped genes, the GDB stores information on genes whose locations have not yet been determined unequivocally. The GDB currently lists over 7000 genes that have been mapped to particular chromosomes. Tens of thousands of human gene fragments have also been identified as “expressed sequence tags,” and these are being assigned positions on chromosome maps as well.

The recording and organization of the data discovered are handled in two basic ways: physical mapping and sequencing. The goal of physical mapping is to establish a marker every 100,000 bases across each chromosome (which should result in about 30,000 markers). The most complete map yet was published in the summer of 1997 and featured about 8000 landmarks, roughly twice the density of previous maps. Similarly detailed maps have been produced for a few individual chromosomes, but this map offers landmarks across the entire human genome that are also positioned relative to each other.

Physical mapping can be thought of as piecing together jigsaw puzzles. Since there are 23 different human chromosomes, imagine buying 23 different jigsaw puzzles, each with 130 million pieces, for a total of 3 billion pieces. Then mix all those pieces in a large trash bag and begin putting the puzzles together piece by piece. That would actually be easier than mapping the human genome, because the base pair “pieces” are microscopic.

“Sequencing” is the process of determining the order of the base pairs.

Only about 4% of the human genome has been sequenced so far, and the current resolution of most human genetic mapping is about 10 megabases, largely because more efficient DNA sequencing technology is still needed. Research is currently being done to speed the project along. A great deal of progress has been made in developing automated sequencing machines that produce more accurate sequences in much less time, but much remains to be done in this area.

Besides automating current methods, researchers are also trying new ways to sequence the genome, including using a silicon “DNA chip.” The chip holds tens of thousands of short sequences. Although much progress has been made, many challenges remain in developing faster and cheaper sequencing technologies as well as in database development and sequence-analysis software.


The complexity and perfection of DNA are vital for life to exist. DNA is specific: it not only constitutes the differences between species but also sustains the similarities among them. DNA makes the many species of animals and plants look different from one another, yet it also gives every living thing the similar functions and actions that keep it alive.

Due to its irreducible complexity, it is unlikely that DNA could have evolved from a spontaneous event. Just as all chemical reactions must be specifically directed for specific results, so DNA, and the creation of life and the variety of species, each having specific needs and functions, must also be directed.



1 Human Genome Project Web site.


Cellular Effects of Sickle Cell/The Genesis of Sickle Cell/Inheritance of an Illness/What Is Sickle Cell? (1996). On the MCET Web site.
