The dictionary definition of chaos is turmoil, turbulence, primordial abyss, and undesired randomness, but scientists will tell you that chaos is the behavior of a system that is extremely sensitive to its initial conditions. Chaos also refers to the question of whether or not it is possible to make good long-term predictions about how a system will act. A chaotic system can actually develop in a way that appears very smooth and ordered.
Determinism is the belief that every action is the result of preceding actions. It began as a philosophical idea in Ancient Greece thousands of years ago and entered science around 1500 A.D. with the idea that cause-and-effect rules govern everything. Sir Isaac Newton was closely associated with the establishment of determinism in modern science. His laws were able to predict the behavior of systems very accurately; they were deterministic at their core because they implied that everything that occurs is based entirely on what happened right before. The Newtonian model of the universe is often depicted as a billiard game in which the outcome unfolds mathematically from the initial conditions in a pre-determined fashion, like a movie that can be run forwards or backwards in time. Determinism remains one of the more important concepts of physical science today.
Ilya Prigogine showed that complex structures could come from simpler ones, which is like order coming from chaos. Henry Adams had earlier described this with his quote "Chaos often breeds life, when order breeds habit". Henri Poincaré, however, was the real "Father of Chaos [Theory]". The planet Neptune was discovered in 1846, and its existence had been predicted from observed deviations in Uranus' orbit. King Oscar II of Sweden and Norway offered a prize to anyone who could prove or disprove that the solar system was stable. Poincaré submitted his solution, but when a colleague found an error in the calculations, the prize was withheld until he could produce a solution that worked. Poincaré eventually found that there was no such solution. Not even Sir Isaac Newton's laws provided one for this enormous problem. Poincaré had been trying to find order in a system where none was to be found.
During the 1960s Edward Lorenz was a meteorologist at MIT working on a project to simulate weather patterns on a computer. He stumbled upon the butterfly effect by accident, after input values that had been rounded off by only a few thousandths drastically changed the resulting simulations. The Butterfly Effect describes how changes on the small scale affect things on the large scale. It is the classic example of chaos: small changes lead to large changes. An example of this is how a butterfly flapping its wings in Hong Kong could change tornado patterns in Texas. Lorenz also discovered the Lorenz Attractor, a set of states that pulls nearby trajectories toward itself, while running a simplified three-variable weather simulation.
![]() The Lorenz Attractor
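A rough sense of this sensitivity can be had with a few lines of code. The sketch below is a minimal illustration, not Lorenz's original program: it integrates the Lorenz equations with plain Euler steps (the classic parameter values 10, 28, and 8/3; the step size and starting points are illustrative choices), starting from two points that differ by only one part in a million, and prints how far apart the two trajectories have drifted.

```python
# Minimal sketch: Euler integration of the Lorenz equations from two nearly
# identical starting points, printing the growing separation between them.

def lorenz_step(x, y, z, dt=0.01, sigma=10.0, rho=28.0, beta=8.0 / 3.0):
    """Advance the Lorenz system one Euler step."""
    dx = sigma * (y - x)
    dy = x * (rho - z) - y
    dz = x * y - beta * z
    return x + dx * dt, y + dy * dt, z + dz * dt

a = (1.0, 1.0, 1.0)          # first trajectory
b = (1.0, 1.0, 1.000001)     # second trajectory, shifted by one millionth

for step in range(1, 3001):
    a = lorenz_step(*a)
    b = lorenz_step(*b)
    if step % 500 == 0:
        gap = sum((p - q) ** 2 for p, q in zip(a, b)) ** 0.5
        print(f"step {step:5d}  separation {gap:.6f}")
```

The separation starts out microscopic and ends up as large as the attractor itself, which is the butterfly effect in miniature.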
Chaos theory describes complex motion and the dynamics of sensitive systems. Chaotic systems are mathematically deterministic but nearly impossible to predict. Chaos is more evident in a system's long-term behavior than in its short-term behavior. Behavior in chaotic systems is aperiodic, meaning that no variable describing the state of the system undergoes a regular repetition of values. A chaotic system can nevertheless evolve in a way that appears smooth and ordered. Chaos refers to the issue of whether or not it is possible to make accurate long-term predictions of a system's behavior even when the initial conditions are known to a high degree of accuracy.
Chaos occurs when a system is very sensitive to initial conditions. Initial conditions are the values of measurements at a given starting time. The phenomenon of chaotic motion was considered a mathematical oddity at the time of its discovery, but now physicists know that it is very widespread and may even be the norm in the universe. The weather is an example of a chaotic system. In order to make long-term weather forecasts it would be necessary to take an infinite number of measurements, which would be impossible to do. Also, because the atmosphere is chaotic, tiny uncertainties would eventually overwhelm any calculations and defeat the accuracy of the forecast. The presence of chaotic systems in nature seems to place a limit on our ability to apply deterministic physical laws to predict motions with any degree of certainty.
One of the most interesting issues in the study of chaotic systems is whether or not the presence of chaos may actually produce ordered structures and patterns on a larger scale. It has been suggested that the presence of chaos may actually be necessary for larger-scale physical patterns, such as mountains and galaxies, to arise. The presence of chaos in physics is what gives the universe its "arrow of time", the irreversible flow from the past to the future. For centuries mathematicians and physicists dismissed dynamical systems of this kind as random and unpredictable. The only systems that could be understood in the past were those believed to be linear, but in actuality we do not live in a linear world at all; genuine linearity is incredibly scarce. The reason physicists did not study chaos earlier is that the computer is our "telescope" for studying it, and they had nothing that could carry out such an enormous number of calculations in a reasonable time. Now, thanks to computers, we understand chaos a little bit better each and every day.
![]() Chaotic systems are unstable

Instability, in this context, is a special kind of behavior in time found in certain physical systems. No measurement can be made to infinite precision, but until the time of Poincaré the assumption was that if you could shrink the uncertainty in the initial conditions, any imprecision in the prediction would shrink in the same way. In reality, a tiny imprecision in the initial conditions will grow at an enormous rate. Two nearly indistinguishable sets of initial conditions for the same system will result in two final situations that differ greatly from each other. This extreme sensitivity to initial conditions is called chaos. Equilibrium is very rare, and the more complex a system is, the more disturbances there are that can threaten its stability; even so, conditions must be right for an upheaval to occur.
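How fast a tiny imprecision grows can be seen in a very small numerical experiment. The sketch below uses the quadratic map x -> x² − 1.9 (the same kind of map that appears in the fractal section further below) as a stand-in for a chaotic system; the starting value 0.3 and the offset of 10⁻¹⁰ are arbitrary illustrative choices.

```python
# Two "nearly indistinguishable" initial conditions for the same chaotic map:
# their difference grows rapidly until the two orbits bear no resemblance.

def f(x):
    return x * x - 1.9

x, y = 0.3, 0.3 + 1e-10      # differ only in the tenth decimal place

for n in range(1, 61):
    x, y = f(x), f(y)
    if n % 15 == 0:
        print(f"iteration {n:2d}: difference = {abs(x - y):.3e}")
```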
In the real world, there are three good examples of instability: disease, political unrest, and family and community dysfunction. Disease is unstable because at any moment there could be an outbreak of some deadly disease for which there is no cure, causing terror and chaos. Political unrest is very unstable because people can revolt, overthrow the government, and start a vast war; a war is another type of chaotic system. Family and community dysfunction is also unstable because a tiny problem among a few people can grow into a huge one, with many people involved and many lives in ruin. Chaos is also found in systems as complex as electric circuits, measles outbreaks, lasers, clashing gears, heart rhythms, electrical brain activity, circadian rhythms, fluids, animal populations, and chemical reactions, and in systems as simple as the pendulum. It has also been suggested that chaos occurs in the stock market.
Complexity can occur in natural and man-made systems, as well as in social structures and human beings. Complex dynamical systems may be very large or very small, and in some complex systems large and small components live cooperatively. A complex system is neither completely deterministic nor completely random; it exhibits characteristics of both. The causes and effects of the events that a complex system experiences are not proportional to each other. The different parts of complex systems are linked and affect one another in a synergistic manner. There is positive and negative feedback in a complex system. The level of complexity depends on the character of the system, its environment, and the nature of the interactions between them. Complexity is also called the "edge of chaos". When a complex dynamical system becomes unstable, an attractor (such as the one Lorenz discovered) draws in the stress and the system splits; this is called bifurcation. The edge of chaos is the stage at which the system can carry out the most complex computations. In daily life we see complexity in traffic flow, weather changes, population changes, organizational behavior, shifts in public opinion, urban development, and epidemics.
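The splitting described above can be seen numerically. The sketch below uses the logistic map x -> r·x·(1 − x), a standard textbook example that is not part of the text above, as a stand-in for a system with a control parameter: as r increases, the long-run behavior bifurcates from one value to two, then four, and finally becomes chaotic.

```python
# For each value of the control parameter r, discard the transient and list
# the values the orbit keeps returning to.

def long_run_values(r, x0=0.5, transient=1000, keep=8):
    x = x0
    for _ in range(transient):          # let the transient die out
        x = r * x * (1.0 - x)
    seen = []
    for _ in range(keep):               # record where the orbit settles
        x = r * x * (1.0 - x)
        seen.append(round(x, 4))
    return sorted(set(seen))

# Typically one settled value for r = 2.8, two for 3.2, four for 3.5,
# and many scattered values for 3.9 -- the period-doubling route to chaos.
for r in (2.8, 3.2, 3.5, 3.9):
    print(f"r = {r}: settles on {long_run_values(r)}")
```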
Fractals are geometric shapes that are very complex and infinitely detailed. You can zoom in on a section and it will have just as much detail as the whole fractal. They are recursively defined, and small sections of them are similar to large ones. One way to think of fractals for a function f(x) is to consider the sequence x, f(x), f(f(x)), f(f(f(x))), and so on. Fractals are related to chaos because they, too, are complex objects produced by repeating simple, definite rules.

![]() Fractals are recursively defined and infinitely detailed
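As a tiny illustration of this kind of repeated application (the function f(x) = x² − 1 below is just an arbitrary example, not one taken from the text):

```python
# Build the sequence x, f(x), f(f(x)), ... by applying f over and over.

def iterate(f, x, n):
    """Return [x, f(x), f(f(x)), ...] with n applications of f."""
    orbit = [x]
    for _ in range(n):
        x = f(x)
        orbit.append(x)
    return orbit

print(iterate(lambda x: x**2 - 1, 0.5, 6))
# prints the orbit 0.5, -0.75, -0.4375, -0.80859375, ...
```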
![]() Benoit Mandelbrot
Benoit Mandelbrot was a Poland-born French mathematician who greatly advanced the study of fractals. When he was young, his mathematician uncle introduced him to the Julia set of fractals; he was not greatly interested in fractals at the time, but in the 1970s he became interested again and built greatly upon the subject, laying out the foundation for fractal geometry. He also advanced the field by showing that fractals cannot be described with whole-number dimensions; they must instead be given fractional dimensions. Benoit Mandelbrot believed that fractals were found nearly everywhere in nature, in places such as coastlines, mountains, clouds, aggregates, and galaxy clusters. He worked at IBM's Watson Research Center and was a professor at Yale University. He was awarded the Barnard Medal for Meritorious Service to Science, the Franklin Medal, the Alexander von Humboldt Prize, the Nevada Medal, and the Steinmetz Medal for his work.
Sierpinski's Triangle is a great example of a fractal, and one of the simplest ones. It is recursively defined and thus has infinite detail. It starts as a single triangle, and each new iteration connects the midpoints of the sides of every remaining triangle, splitting it into smaller triangles and leaving out the middle one (a small code sketch of this construction follows the figures below). Sierpinski's Triangle contains an infinite number of triangles.
![]() The first recursion of Sierpinski's Triangle
![]() The second recursion of Sierpinski's Triangle
![]() The third recursion of Sierpinski's Triangle
![]() The fourth recursion of Sierpinski's Triangle
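Here is a minimal sketch of that midpoint construction (the coordinates of the starting triangle are arbitrary): each triangle is stored as its three corner points, and every recursion replaces it with the three corner triangles formed by joining the midpoints of its sides, leaving the middle one out.

```python
# One recursion step turns every triangle into three smaller corner triangles.

def midpoint(p, q):
    return ((p[0] + q[0]) / 2, (p[1] + q[1]) / 2)

def subdivide(triangles):
    """Replace each triangle with its three corner triangles."""
    result = []
    for a, b, c in triangles:
        ab, bc, ca = midpoint(a, b), midpoint(b, c), midpoint(c, a)
        result += [(a, ab, ca), (ab, b, bc), (ca, bc, c)]
    return result

triangles = [((0.0, 0.0), (1.0, 0.0), (0.5, 1.0))]   # the starting triangle
for n in range(1, 5):
    triangles = subdivide(triangles)
    print(f"recursion {n}: {len(triangles)} triangles")   # 3, 9, 27, 81
```

The count triples with every recursion, which is why the full figure contains infinitely many triangles.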
The Koch Snowflake is another good example of a fractal. It starts as a triangle, and at every iteration the middle third of each side is replaced by two sides of a smaller outward-pointing triangle, forever. After a few iterations this makes it look like a snowflake.
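A quick way to see the effect of that rule is to track only the number of sides and their length; the sketch below assumes a starting triangle with sides of length 1.

```python
# Each iteration cuts every side into thirds and replaces the middle third
# with two new sides, so one segment becomes four shorter ones.

sides, side_length = 3, 1.0            # the starting triangle
for n in range(1, 6):
    sides *= 4                         # each segment is replaced by four
    side_length /= 3                   # each new segment is a third as long
    print(f"iteration {n}: {sides} sides, perimeter = {sides * side_length:.3f}")
```

Each iteration multiplies the perimeter by 4/3, so the snowflake's boundary becomes infinitely long even though it encloses a finite area.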
The Mandelbrot set is generated by one of the simplest nonlinear functions, defined recursively as f(x) = x^2 + c. Feeding each output back in as the next input (x, f(x), f(f(x)), and so on) generates a sequence of values. The plots below are time series of this iteration, meaning that each one shows the sequence of values produced for a specific c. They help to demonstrate chaos: when c is -1.1, -1.3, or -1.38 the sequence settles into a regular repeating pattern, whereas for c = -1.9 it never does. In other words, when c is -1.1, -1.3, or -1.38 the behavior is periodic and predictable, whereas when c = -1.9 it is chaotic (a small sketch of this iteration follows the plots below).
![]() Time Series for c = -1.1
![]() Time Series for c = -1.3
![]() Time Series for c = -1.38
![]() Time Series for c = -1.9
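The time series above can be reproduced, at least roughly, with a few lines of code. The sketch below is an illustration, not the program used for the plots: it iterates f(x) = x^2 + c starting from x = 0 and prints the last few values of the orbit. For c = -1.1, -1.3, and -1.38 those values cycle through a short repeating pattern, while for c = -1.9 they never repeat.

```python
# Iterate f(x) = x*x + c from x = 0 and look at the tail of the orbit.

def orbit(c, x0=0.0, n=200):
    x = x0
    values = []
    for _ in range(n):
        x = x * x + c
        values.append(x)
    return values

for c in (-1.1, -1.3, -1.38, -1.9):
    tail = orbit(c)[-8:]
    print(f"c = {c}: last values {[round(v, 4) for v in tail]}")
```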
When the values generated for the Mandelbrot set are plotted as points in the plane, colored according to how the iteration behaves, rather than as time-series lines, a much more complicated picture arises. You can also change the system you iterate and the sets you plot in order to generate increasingly complex fractals. The following fractals are very mathematically complex (a sketch of how such pictures are computed follows the images below):
![]() Mandelbrot Set Fractal |
![]() Julia Set Fractal |
![]() ![]() Julia Set Fractals |
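Pictures like the ones above are usually drawn with an escape-time calculation. The sketch below is a minimal, assumed version of that idea, not the exact program behind these images: for every point c on a grid in the complex plane, it iterates z -> z² + c from z = 0 and marks the points whose orbit stays bounded; the marked points form a crude text rendering of the Mandelbrot set. A Julia set is drawn the same way, except that c is held fixed and the starting point z varies over the grid.

```python
# Crude ASCII rendering of the Mandelbrot set via the escape-time test.

def in_mandelbrot(c, max_iter=60):
    z = 0j
    for _ in range(max_iter):
        z = z * z + c
        if abs(z) > 2.0:       # the orbit escapes, so c is outside the set
            return False
    return True                # the orbit stayed bounded within max_iter steps

for row in range(21):
    y = 1.2 - row * 0.12                       # imaginary part of c
    line = ""
    for col in range(64):
        x = -2.1 + col * 0.05                  # real part of c
        line += "#" if in_mandelbrot(complex(x, y)) else " "
    print(line)
```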