CHAOS : Review/Essay of "CHAOS: Making a New Science" by James GLEICK (1987) © H. J. Spencer [10 Feb. 2021] 7,000 words; 10 pages.

ABSTRACT
This book describes the birth of the new theory of Chaos. This is a difficult new concept that is still evolving, but the book popularized the term 'Butterfly Effect', introduced new concepts, such as fractals, to a popular audience, along with pioneering thinkers, such as Feigenbaum and Mandelbrot; it even inspired the novel and movie Jurassic Park. This concept opens up a new view of nature: where previously randomness had to be forced in to explain unpredictable variations, chaos is now seen as spanning both order (patterns) and disorder. This phenomenon helps explain the shape of clouds, smoke, water eddies, mountain ranges and coast-lines. Implicitly, it shows how Newtonian mathematics has constrained physics (and science in general) to make the simplifying assumptions that enabled the calculus to become the universal tool-set of the scientific viewpoint. The book describes, with a novelist's eye, how this tough problem was cracked by the five theoreticians described herein. Key to the solution was the early use of computers to repeat simple calculations very many times. The viewpoint changed from static 'state' to dynamic process: becoming rather than being. Chaos is everywhere; it is supplanting the simple mathematical models of classical physics. It is the science of the global nature of systems. I show here (but not in the book or Wiki) that this is the start of the Death of Newtonian Physics and the Calculus: a TRUE REVOLUTION.

AUTHOR'S BIOGRAPHY
James Gleick (born 1954) is an American prize-winning author, specializing in the history of science; he has written 6 books. Recognized for his writing about complex subjects through the techniques of narrative nonfiction, he has been called "one of the great science writers of all time". He is part of the inspiration for the Jurassic Park character, Ian Malcolm.
Born in New York City, Gleick attended Harvard, where he was an editor of The Harvard Crimson, graduating in 1976 with a BA degree in English and linguistics. In 1979, he joined the staff of The New York Times, working there for ten years as a science reporter. His writings have also appeared in The New Yorker, The Atlantic and The Washington Post; he is a regular contributor to The New York Review of Books. His eight-year-old son was killed (and Gleick was seriously injured) in 1997, when the plane Gleick was piloting landed short of the runway in New Jersey. His previous books include:
• The Nervous System, 1974.
• Genius: The Life and Science of Richard Feynman, 1992.
• Faster: The Acceleration of Just About Everything, 1999.
• What Just Happened: A Chronicle from the Electronic Frontier, 2002.
• Isaac Newton, 2003.
• The Information: A History, a Theory, a Flood, 2011.
• Time Travel: A History, 2016.

REVIEWER'S WEBSITE
All of the reviewer's prior essays and papers (referenced herein) may be found, freely available, at:
• https://jamescook.academia.edu/HerbSpencer

1. INTRODUCTION

1.1 OVERVIEW
This 350-page best-seller by a well-respected science-journalist (James Gleick) was one of the first books written on the hot new topic of Chaos for the general public. Its objective was to alert the public to a new scientific paradigm that was spreading across several disciplines and contributing to a new subject, now called Non-Linear Systems Analysis. Unfortunately, these ideas were launched by mathematicians, as was modern physics, after Isaac Newton created the new mathematics of the infinitesimal Calculus.
Since many subjects are only considered "scientific" when they are based on mathematics, perhaps this new style of dynamic mathematics will have an impact on older subjects that have become obsessed with timeless presentations, when reality is grounded more in Time and Change than in the static imagery preferred by our visually dominated brains and our preservation of frozen knowledge in the universities. The book certainly documents some of the unusual characters who were brave enough to cross the Orthodoxy Frontier, especially in the cumulative area of mathematics; Gleick is an experienced journalist who knows that the public can relate to personalities, but many are bedazzled by mathematics, as Gleick himself shows. However, I think that this book was too early to uncover the deep significance of the central findings of Chaos: a subject too important to be ignored by anyone who wishes to remain informed about that area of knowledge, known as Science, which has emphasized stability and perfect equilibrium maintained by Laws of Nature, with its implicit support for social rigidity. Chaos seems to promise new insights into the complexity of biology and living systems. Darwinists must be nervous.

1.2 MOTIVATION
After climbing to one of the peaks of theoretical physics (Solid State Quantum Field Theory), I became deeply disillusioned about the ability of mathematics alone to provide explanations of reality. It encourages simplifications and an over-emphasis on local activity (Analysis), rather than attempts to create a new Synthesis. The resulting reductionism broadened the gap between detailed mathematics and reality at the human scale; this is reflected in the growing skepticism about science amongst the general public, who have been socialized (through school) to exaggerate the importance of mathematics and quantity in our lives, compared to their personal intuition and preference for quality and relationships.
Mathematics is the secret Uncle of western philosophy and theology. I have been suspicious of the utility of mathematics in science since I resigned from physics 50 years ago.

2. BUTTERFLY EFFECT

2.1 LORENZ
Edward Norton Lorenz (1917-2008) was an American mathematician and meteorologist, who established the theoretical basis of weather and climate predictability, plus computer-aided atmospheric physics and weather forecasting. He is best known as the founder of modern chaos theory: the mathematics of complex dynamical systems that are highly sensitive to initial conditions. Lorenz was born in 1917 in West Hartford, Connecticut. He acquired an early love of science from both sides of his family. His father, Edward Henry Lorenz, majored in mechanical engineering at the Massachusetts Institute of Technology (MIT), while his maternal grandfather, Lewis Norton, developed MIT's first course in chemical engineering in 1888. As a boy, Edward was hooked on chess by his mother. He received a bachelor's degree in mathematics from Dartmouth College in 1938 and a master's degree in mathematics from Harvard in 1940. He worked as a weather forecaster for the US Army Air Corps in World War II, leading him to pursue graduate studies in meteorology at MIT, earning a master's degree in 1943 and a doctorate in 1948. His doctoral dissertation, titled "A Method of Applying the Hydrodynamic and Thermodynamic Equations to Atmospheric Models", described an application of fluid-dynamical equations to the real problem of predicting the motion of storms. Lorenz spent the entirety of his scientific career at MIT: in 1948, he joined the Department of Meteorology as a research scientist; in 1955, he became an assistant professor in the department and was promoted to professor in 1962.
From 1977 to 1981, Lorenz served as head of that Department, until the Department of Meteorology and Physical Oceanography merged with the Department of Geology in 1983 to become the current MIT Department of Earth, Atmospheric and Planetary Sciences, where Lorenz remained a professor before becoming an emeritus professor in 1987. On April 16, 2008, Lorenz died at his home in Cambridge, MA, from cancer at the age of 90, after a decade of vigorous hiking around New England.

In the 1950s, Lorenz started work on numerical weather prediction, using computers to forecast weather from observational data on such things as temperature, pressure and wind. This interest was sparked by a visit to the Institute for Advanced Study (IAS) in Princeton, where he met Jule Charney, then head of the IAS Weather Research Group (Charney would later join Lorenz at MIT in 1957 as a professor of Meteorology). In 1953, Lorenz took over leadership of a project at MIT that ran complex simulations of weather models, which he used to evaluate statistical forecasting techniques. By the late 1950s, Lorenz was skeptical of the appropriateness of linear statistical models in meteorology, as most atmospheric relationships (like that between pressure and wind speed) in weather forecasting are non-linear. There was a lot of intuitive guesswork in creating these equations, so Lorenz used his own set of 12 equations that were reasonably good for short-range (a few days) forecasts. In 1961, Lorenz had a lucky break, but he was paying attention: when he restarted his computer during a long calculation, he re-entered a number already printed out earlier. The continuation run should have exactly duplicated the remaining numbers from the earlier run, but they became increasingly different.
It took Lorenz a while to identify the problem: his computer was using six-digit numbers (like 0.506127), but the printout only displayed the first three digits (like 0.506). The difference was tiny (i.e. 0.000127), and everyone thought such small differences (here, about one part in a thousand) should not matter, since the measuring devices that read the real starting values usually only worked to three-digit accuracy; but in Lorenz's equations these small differences were significant, as he showed when he compared two full forecasts differing only by tiny amounts in their starting values. That day, he decided that long-range weather forecasting was doomed; it was acceptable for a few days, but the errors grew larger as the forecast duration was extended. Everyone else just assumed that 'better' equations would solve the problem; they did not: it was the accuracy of the starting numbers that was critical. This Sensitive Dependency on Initial Conditions was amusingly called the Butterfly Effect, on the whimsical idea that whether a butterfly flapped its wings somewhere could later change the weather. Even reducing the system to three equations, involving three inter-connected variables, always produced this sensitivity (also called Chaos). Ironically, this connects to Newton's Three-Body Problem [see §6.1] that stopped Newton doing exact planetary calculations with three interacting celestial objects mutually attracting each other all the time under gravity. In other words, this is a bigger problem for all of physics than for weather alone; it had been hiding in plain sight for 300 years. It is not even the fact that we are using computers to evaluate the equations: the problem goes right back to René Descartes, who suggested using REAL numbers, with an infinite number of digits, to represent all numbers, including 'nasty' fractions like 1/3 or 1/7. This was his solution to linking arithmetic (or Algebra) to Geometry.
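Lorenz's rounding accident is easy to reproduce in a few lines of Python (my illustration, not from the book). The sketch below steps his later three-variable system with a crude Euler integrator, using the conventional textbook parameters (sigma=10, rho=28, beta=8/3, which Gleick does not quote); one run starts from 0.506127, the other from the truncated 0.506.

```python
# Sketch of Lorenz's rounding accident (illustration only, not from the book):
# integrate his three-variable system twice, once from x = 0.506127 and once
# from the truncated x = 0.506, with crude explicit-Euler steps. The
# parameters sigma=10, rho=28, beta=8/3 are the conventional textbook values.

def lorenz_step(x, y, z, dt=0.005, sigma=10.0, rho=28.0, beta=8.0 / 3.0):
    """Advance the Lorenz equations by one explicit-Euler step."""
    dx = sigma * (y - x)
    dy = x * (rho - z) - y
    dz = x * y - beta * z
    return x + dt * dx, y + dt * dy, z + dt * dz

def max_separation(x0a, x0b, steps=4000):
    """Largest distance reached between two runs started a hair apart."""
    a, b = (x0a, 1.0, 1.0), (x0b, 1.0, 1.0)
    max_sep = 0.0
    for _ in range(steps):
        a, b = lorenz_step(*a), lorenz_step(*b)
        sep = sum((p - q) ** 2 for p, q in zip(a, b)) ** 0.5
        max_sep = max(max_sep, sep)
    return max_sep

# A starting difference of only 0.000127 blows up to the size of the attractor:
print(f"max separation: {max_separation(0.506127, 0.506):.2f}")
```

The exact numbers depend on the step size and starting point, but the qualitative result does not: the one-part-in-a-thousand truncation grows to a macroscopic difference, exactly the sensitivity Lorenz stumbled upon.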
This Cartesian step was one that I criticized in April, 2018 [as described in my critical essay NotRealRené]. This ties back to the fact that Geometry is an Imaginary 'Science' that relates to perfectly defined objects in our imaginations that do NOT exist in reality: they are perfectly smooth and perfectly thin. Lorenz did not see this deeper connection but saw it in terms of 'Order Masquerading as Randomness'. In fact, we shall see that non-linearity can manifest itself even in simple one-dimensional equations when they require finite recursion instead of Newton's Continuous Calculus [see my Infinitesimal essay]. Indeed, Lorenz plotted his three-variable equations in a 3D space (one direction for each variable) and generated his famous interwoven pathway, now known as the Lorenz Attractor, that resembles an owl's face; this shows that the system NEVER repeats itself, it loops around and around forever: a beautiful metaphor for the uniqueness of the Universe or even our individual lives. Lorenz documented his realization ('discovery') in his best-cited scientific paper in 1963 (volume 20) in the Journal of the Atmospheric Sciences, with the title: "Deterministic Nonperiodic Flow".

3. NATURE'S GEOMETRY - CHAOTIC IMAGES
Geometry is the oldest mathematical science invented by man. It was created by several clever intellectuals in classical Greece, who were inspired by the practical architects of ancient Egypt, but the Greek pioneers only wished to construct a perfect scheme. This scheme used perfect idealizations, like the point of no size, and lines of no thickness connecting two points to define 'the' line. If every point on a curve is at exactly the same distance from a special point (the 'center'), then the perfect circle can be defined. The fact that there were no real examples of these ideal shapes to be found in nature was not to be held against them. Mind trumps body.
3.1 MANDELBROT BIOGRAPHY
However, in the 20th century there arose one person who wanted to describe the actual shapes found in nature, and he was to revolutionize geometry; his name was Benoit Mandelbrot (1924-2010). Mandelbrot was a Polish-born French-American mathematician, who was attracted to the 'Art of Roughness'. He invented the concept of the fractal and the associated fractal geometry, which explored the idea of self-similarity. Mandelbrot was born into a Lithuanian Jewish family in Warsaw, while Poland was a free republic. His father was a clothing wholesaler and his mother was a dental surgeon. When he was eleven, his family (sensitive to political realities) moved to Paris to live with his father's brother Szolem, a mathematician, who privately tutored him. As the Nazis invaded France, the family moved again to Tulle, in central France. After D-Day in 1944, the Germans retreated from France, so that Mandelbrot could return to Paris, where he attended the Polytechnique, France's top school for science. He studied at the California Institute of Technology (CalTech) from 1947 to 1949, earning a Master's degree in aeronautics. He returned to France and obtained his PhD in Mathematical Sciences at the University of Paris in 1952. Over the next six years, he worked at France's leading scientific agency (the National Centre for Scientific Research), spent 12 months at the Institute for Advanced Study in Princeton, and worked at the University of Lille. In 1958, he and his new wife moved permanently to the United States, where Mandelbrot joined the research staff at the IBM Research Center in Yorktown Heights, New York. He stayed at IBM for the next 35 years, becoming an IBM Fellow. He had to leave in 1987, when IBM closed their Pure Research Division, so he joined the mathematics department at Yale University, becoming a tenured professor at age 75. He died from pancreatic cancer in Cambridge, MA at age 85.
One of his first projects was to see if there were any patterns in the prices of cotton, which seemed to reflect both the economy and a random variation. He was surprised to find there were too many large jumps, while the ratio of the many small price changes to large ones was not as high as he expected: the distribution certainly did not fit the Normal (or Gaussian) curve and did not fall off quickly enough - it had a long tail. Actually, this is not too surprising, as the derivation of the normal curve requires an infinite number of very tiny contributions to any single value, with equal numbers of positive and negative deviations from the norm; but although markets have many 'players', they are not infinite, and buy/sell orders are limited to small fractions at any time. Mandelbrot then realized that the sequences of change were independent of scale: curves for daily price changes exactly matched those for monthly price changes, just across a smaller range. He found that the degree of variation had remained constant over sixty years: through two World Wars and a global depression. He decided to explore the phenomenon of scaling. Increasingly, he came to question Euclidean Geometry with its perfect shapes: he would often remark that "clouds are not spheres, mountains are not cones". He was impressed when he found an old paper about nature written by Lewis F. Richardson, who asked the simple but provocative question: How long is the coast of Britain? As Mandelbrot realized, the answer depended on the length of one's measuring stick. In 1975, Mandelbrot invented the word fractal while playing with simple shapes that grew by repeating a simple rule over and over; it was based on the Latin word fractus, meaning a broken fragment. His fractal idea implied self-similarity at different scales, generated by a finite mathematical technique known as recursion (looping across time).
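Richardson's measuring-stick paradox and the idea of a rule repeated over and over can be illustrated with the Koch curve, a textbook self-similar fractal (my example, not one from the book): each recursion level replaces every segment with four segments, each one-third as long, so the 'coastline' measured with an ever-shorter stick grows without limit.

```python
# The measuring-stick paradox in miniature, using the Koch curve (a textbook
# self-similar fractal; illustrative example, not from the book). Each
# recursion level replaces every segment with 4 segments, each 1/3 as long,
# so the measured length grows by a factor of 4/3 at every level.

def koch_length(levels, base_length=1.0):
    """Total length of the Koch curve after 'levels' recursive refinements."""
    segments, seg_len = 1, base_length
    for _ in range(levels):
        segments *= 4          # every segment becomes four...
        seg_len /= 3.0         # ...each one-third the length
    return segments * seg_len

for n in (0, 1, 2, 5, 10):
    print(f"level {n:2d}: measured length = {koch_length(n):.3f}")
```

The shorter the stick (the deeper the recursion), the longer the answer: precisely Richardson's point about the coast of Britain.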
3.2 MANDELBROT SET
If Benoit Mandelbrot is best known for inventing the fractal concept, more people have seen the intriguing pictures illustrating the Mandelbrot Set (M-S). These images correspond to the boundary of a finite subset of the M-S, exhibiting ever-finer recursive details at increasing magnifications, making the boundary a fractal curve. Author Gleick exposes his mathematical naiveté when he introduces the M-S as "the most complex object in mathematics - an eternity would not be enough time to see it all, its disks studded with prickly thorns, its spirals and filaments curling outward around, bearing bulbous molecules that hang, infinitely variegated, like grapes on God's personal vine." The M-S is NOT unique; it is simply an infinite set produced by an open recursive process. Any function could generate a comparable self-similar series of images. In fact, because it is self-similar, once one has seen a part of it, one has 'seen' all of it. Only the Core-Function (see later), and the fact that it is to be calculated recursively, need be known. The M-S is an example of a complex structure arising from the application of simple rules; i.e. the conjoining of complexity (the pictures) and simplicity (the Generator). Gleick notes that images from the M-S are often used to provide graphics for conferences and brochures relating to Chaos (really hype). I reject the notion that these images are beautiful; give me a photo of nature any time. All math is boring, even M-S 'wall-paper'. Before computers, these recursive calculations were possible but too tedious for most people. With computers, trial-and-error geometry became possible. There are a few examples where changing the rules resulted in new geometries, as with the Non-Euclidean geometries defined on spheres and parabolas rather than flat (Euclidean) space [see my MathematicaCritica essay §3.7]. Cartesian algebra related a simple equation (like X² + Y² = R²) to a circle of radius R in 2D flat space.
But when an equation is iterated, instead of solved, the equation becomes a process instead of a description: dynamic instead of static. Mandelbrot used an implicit selection rule for his process: if the recursive process diverged (went to infinity), then that point was rejected from membership in the set. The heart of the M-S is simple: complex-number mathematics [see my MathematicaCritica essay §3.5], along with recursion, since complex-number arithmetic (addition, multiplication) is closed. The M-S process ('Core-Function') is: Z => Z² + C. That is, take a number (initially Z = 0), multiply it by itself and then add a complex number C; then repeat, using the previous answer each time. So if C = 1/10, then: Z0 = 0, Z1 = 0.1, Z2 = 0.11, Z3 = 0.1121, Z4 = 0.11256641, ... . If we define the start number as C0, then Mn+1(C0, C) => {Mn(C0, C)}² + C, with C the test point. So, a test point C belongs to the M-S if all the numbers {Mn(C0, C)}, for n = 0, 1, 2, ..., ∞, satisfy {Mn}² ≤ 4; in our example: Mn = Mn(0, 0.1). There is nothing random about the M-S: it is a direct result of the arithmetic rules of complex numbers, the definition of the key symbol, the imaginary square root of minus one, i.e. i = √(-1), and the process of iteration. In calculating the points in the M-S, there are regions (called 'boundaries') where most (75%) of the points occur. When real, physical phenomena that exhibit features of the M-S are studied experimentally, these boundary regions usually correspond to transitions, such as phase-transitions (e.g. melting) or breaking. These insights lead to the idea that nature abhors randomness (or probability): randomness is death; order is vital for living systems to survive. This suggests that it is only our ignorance that is named 'randomness'. It has been shown that the M-S is 'connected', i.e. it cannot be represented as the union of two or more disjoint subsets.
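The Core-Function and its rejection rule translate directly into a short membership test (a minimal Python sketch; the 100-iteration cut-off is a practical choice of mine, since a true test would iterate forever):

```python
# Minimal escape-time test for Mandelbrot-set membership, following the
# text's rule: iterate Z -> Z**2 + C from Z = 0 and reject C as soon as
# |Z|**2 exceeds 4. The iteration cap of 100 is an arbitrary practical
# choice (a true test would iterate forever).

def in_mandelbrot(c, max_iter=100):
    z = 0j
    for _ in range(max_iter):
        z = z * z + c
        if (z.real * z.real + z.imag * z.imag) > 4.0:
            return False        # diverged: C is outside the set
    return True                 # still bounded after max_iter steps

print(in_mandelbrot(0.1))       # the text's example C = 1/10: stays bounded
print(in_mandelbrot(-1.0))      # settles into the cycle 0, -1, 0, -1, ...
print(in_mandelbrot(1.0))       # 0, 1, 2, 5, 26, ... escapes quickly
```

Coloring each rejected point by how many iterations it survived is what produces the familiar psychedelic pictures; the set itself is just the points that never escape.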
This connectedness is contrary to Mandelbrot's guess ('conjecture') that the M-S was disconnected, based on the early computer images that were unable to detect the thin filaments connecting different parts of the global M-S.

3.2.1 PEITGEN
It was the German mathematician Heinz-Otto Peitgen (born 1945) and his co-author, physicist Peter Richter, whose book The Beauty of Fractals (1986) and photographs, along with their promotional touring exhibit on behalf of the Goethe-Institute in 1985, did much to publicize the M-S and fractals. Peitgen was a strong advocate of the use of computers to investigate mathematics experimentally, even though mathematicians still insisted on the tradition of stepwise proofs before results were accepted as "true". The huge numerical power of computation, and the visual cues to intuition, could suggest promising avenues for valuable investigation, sparing mathematicians from useless blind-alleys.

4. UNIVERSALITY of RANDOMNESS
Physics was the first successful science, starting with the simplest of concepts: a single object, initially of zero size - the point, or "corpuscle" as Newton called it, to avoid the atheistic term 'atom' (of the Greek Democritus and the Roman Lucretius). Solid bodies, treated as single objects, then became the focus of Classical Mechanics; this progress was extended to regular, repetitive solids, or crystals. The mathematical trick that underlay this success was Newton's Calculus, which analyzed continuous motion in terms of instantaneous changes in the position of the point, i.e. two locations connected by movement across a temporal difference of zero duration: the "infinitesimal" [see my essay].
There was little theoretical progress with the next level of matter, fluids, until 1757, when the blind Swiss genius Leonhard Euler (1707-1783) introduced partial differential equations to describe incompressible (perfect) or Newtonian liquids (zero viscosity and zero thermal conductivity), based on the continuity of matter and the conservation of momentum; he thereby invented the subject of Fluid Dynamics, which he understood through his invention of the Power-Series (extending to infinity). The next development was adding in real fluid 'stickiness' (technically, "viscosity"), introduced by two mathematicians: the French engineer Claude-Louis Navier (1785-1836), published in 1822, and the Anglo-Irish George Gabriel Stokes (1819-1903), published in 1843. Their combined effort defines fluid mechanics and is referred to as the Navier-Stokes equations, which mathematically express the conservation of momentum and mass for Newtonian fluids (with very special mathematical conditions). While no real fluid fits the definition perfectly, many common liquids and gases, such as water and air, can be assumed to be Newtonian for practical calculations under ordinary conditions. The Navier-Stokes equations are a central pillar of fluid mechanics. However, even basic properties of the solutions to these equations have never been proven; for example, in three dimensions, no one has yet proved that smooth solutions always exist: there is a million-dollar prize from the Clay Institute for this solution. [Note: once again, the troubling situation arises with the first odd prime number, three]. The problem is to make progress towards a mathematical theory that will give insight into these equations, by proving either that smooth, globally defined solutions exist that meet certain conditions, or that they do not always exist and the equations break down. However, theoretical understanding of their solutions is incomplete.
In particular, solutions of the Navier-Stokes equations often include turbulence, the general solution for which remains one of the greatest unsolved problems in physics, despite its immense importance in science and engineering.

4.1 TURBULENCE
The most desirable flow of liquids or gases is smooth (called 'laminar'), which occurs when a fluid flows in parallel layers, with no disruption between the layers. However, in turbulent flow, unsteady vortices (swirls) of many sizes appear, interacting with each other and increasing drag (friction); this means that more energy is needed to pump a given volume through a pipe. Turbulence is commonly observed in everyday phenomena such as surf, fast-flowing rivers, billowing storm clouds, or smoke from a chimney; sadly, most fluid flows occurring in nature or created in engineering applications are turbulent. Turbulence is caused by excessive kinetic energy in parts of a fluid flow, which overcomes the damping effect of the fluid's viscosity. Turbulence has long resisted detailed physical analysis: the interactions within turbulence create a very complex phenomenon that could not be described with mathematics. Theorists had mainly slunk away defeated, leaving real problems to engineers and technicians.

4.1.1 FEIGENBAUM
The first person to pioneer this new science was Mitchell Feigenbaum (1944-2019), born in Philadelphia to Jewish immigrants from Poland and Ukraine. He was raised in Brooklyn, New York and got his BS at City College in 1964, moving on to MIT, first in electrical engineering, then in theoretical physics. His PhD thesis was on the spread (dispersion) of waves. After brief post-doctoral positions at Cornell University (1970-72) and Virginia Polytechnic Institute (1972-74), at 29 he was offered a position in the Theoretical Division at the Los Alamos National Laboratory in New Mexico to study turbulence in fluids (a problem in exploding nuclear bombs).
He appeared to be the classic 'eccentric' scientist: uncaring about his appearance, working 22 hours at a stretch, and not caring to publish any papers (the expected path to promotion, but this was a tough area). Feigenbaum was quite unorthodox: rather than accept Newton's reductionist theory of prisms and colors, he was more inspired by rival Goethe's holistic theory of colors. Feigenbaum realized that calculus had been developed to solve problems involving linear equations, but turbulence implied non-linearity. So, he decided to investigate the quadratic equation of the parabola {Y = R(X - X²)} using recursive techniques [§3.2], repeating endlessly as a feedback loop: the output of one calculation was fed back as input for the next. The trick was to plot the sequence as a graph, bouncing between the parabola and the diagonal line {Y = X}: where the current X meets the parabola gives the output Y, the diagonal converts that Y into the next X, and the process starts over. The sequence bounces from place to place on the parabola until it comes to a single point with X and Y equal; sometimes it oscillates between two values: these single or double solutions depend on the parameter R. It was not realized then, but Lorenz had looked at the same equation in 1964, in an attempt to answer the question: does a climate exist? (i.e. a long-term average). In effect, Feigenbaum had converted a continuous differential equation into a finite-difference equation, exposing more than one stable solution. Feigenbaum suspected that the series of solutions (based on changing R) was converging like a geometric series, so he calculated the ratio of convergence to the finest precision (3 decimals) on his HP-65 hand-held calculator; the number came to 4.669. Feigenbaum realized that something in his equation was scaling, which implied that some quality was being preserved. He tried another function {Y = R sin(π X)}; again he got the same limit number (later called the Feigenbaum Constant, F): F = 4.669.
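Feigenbaum's feedback loop is simple to replicate (a Python sketch of mine; the three sample values of R are illustrative choices, not taken from the book): iterate Y = R(X - X²), discard the transient, and count how many distinct values the orbit settles into. As R grows, the single solution doubles into two, then four - the period-doubling cascade whose spacings converge at Feigenbaum's ratio.

```python
# Feigenbaum's feedback loop on the parabola Y = R*(X - X**2): iterate,
# discard the transient, and count the distinct values the orbit settles
# into. The sample R values (2.8, 3.2, 3.5) are illustrative choices,
# not taken from the book.

def attractor(r, x=0.4, transient=2000, sample=64):
    """Return the settled cycle of the map x -> r*(x - x*x)."""
    for _ in range(transient):          # let the orbit settle
        x = r * (x - x * x)
    orbit = set()
    for _ in range(sample):             # record the settled cycle
        x = r * (x - x * x)
        orbit.add(round(x, 6))
    return sorted(orbit)

for r in (2.8, 3.2, 3.5):
    cycle = attractor(r)
    print(f"R = {r}: period {len(cycle)} -> {cycle}")
```

Running this shows period 1 at R = 2.8, period 2 at R = 3.2 and period 4 at R = 3.5: the first steps of the cascade that Feigenbaum timed on his calculator.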
This was exciting: there was no theory of why two such diverse functions would converge to the same value, F. It was 1975; he called his mother to say he had discovered something that was going to make him famous. He was right: within a few years, he was awarded several international prizes. Feigenbaum learned the new computer language FORTRAN (FORmula TRANslation) and then found that F = 4.6692016090 ≈ 14/3. It was eventually shown that virtually any mathematical equation with a period-doubling cascade (bifurcations) produces the Feigenbaum Constant. It is not a surprise to discover that Feigenbaum had great difficulty getting his discovery published in a professional journal; eventually it appeared in an annual report from the Los Alamos Theoretical Division (1975). In 1983, he was awarded a MacArthur Fellowship and, in 1986, the Wolf Prize in Physics for "his pioneering theoretical studies demonstrating the universal character of non-linear systems, which has made possible the systematic study of chaos". He was a member of the Board of Scientific Governors at The Scripps Research Institute, and was Toyota Professor at Rockefeller University from 1986 until his death. Universality implies that different systems will behave identically, leading to conceptual consolidation. Even though Feigenbaum was only studying simple numerical functions, he believed that his theory expressed a real natural law about systems at the point of transition between the orderly and the turbulent. His universality was not just qualitative, it was quantitative; not just structural but numerical: with huge implications.

4.1.2 LIBCHABER
Gleick includes a detailed section on the experimental investigation of turbulence in liquid helium by an experimental French physicist, Albert Libchaber (born 1934), who first demonstrated the bifurcation cascade that leads to chaos and turbulence by measuring minute temperature changes in liquid helium.
For this achievement, he shared the 1986 Wolf Prize with Feigenbaum: a major recognition of a paradigm change.

5. DISAPPOINTMENTS
This review deliberately omits several chapters of the book, as they introduced interesting topics but failed to give any useful insights into the book's subject. I suspect that the author did not understand what these people were saying, or that the conversations were too premature, but he felt obligated to include them in return for the time they gave him; this also doubled the book's size to the over-300 pages that publishers seem to desire, so they can produce a standard-sized hardback (at standard prices).

5.1 DYNAMICAL SYSTEMS
The first dynamical system to be investigated was the pendulum, which sparked Galileo's interest in physics. It has been much investigated, as its mathematics is well known, but the solutions are only accurate for small swings, when its motion is linear, so that multiple possibilities may be added together. Gleick expends several pages on trying to relate topology to the motion of the pendulum but generates little illumination about either. Even more pages are wasted on a discussion of the mystery of the erratic behavior of Jupiter's Great Red Spot, which had been seen by a Voyager close fly-by to be a hurricane-like system of swirling flows, made worse by the realization that Jupiter's surface was not solid (as assumed) but was itself virtually all fluid in motion.

5.2 ECOLOGY
Ecology is the branch of biology that studies the spatial and temporal patterns of the occurrence of organisms and the interactions between them and their environment. Topics include biodiversity and populations, as well as cooperation and competition within and between species. Ecosystems are dynamically interacting systems of organisms, the communities they make up, and the non-living components of their environment. Ecosystem processes, such as production, nutrient cycling and soil production, regulate the flow of matter and energy through an environment.
These processes are sustained by organisms with specific life-history traits. Biology, like many sciences, has been influenced by mathematics, especially in the study of population changes. This area strongly influenced the emergence of chaos in the 1970s through mathematical models that were crude approximations of reality; the trick was to invent a plausible set of equations, choose some starting values and try to predict the situation a short time ahead: often the next year. Many creatures breed only once a year, which greatly simplifies the situation, relating next year's population to this year's numbers. Calculating changes over several years just involves repeating the process several times: a natural form of recursion. The simplest model treats population like money subject to compound annual interest; if Xn is the population this year and Xn+1 the number next year, then the equation is: Xn+1 = (1+r) Xn = R Xn, with R = 1+r, where r is the annual percentage increase rate (if r = 0.1, the population grows by 10% annually). This is called the Malthusian equation, because the population grows without limit until there is not enough food. Introducing self-restraint produces what has been called the Logistic-Difference equation: Xn+1 = R Xn (1 – Xn). The new term (1 – Xn) keeps the growth within bounds, since as Xn gets closer to 1 this factor shrinks toward zero. Note: here population is normalized to unity, where 1 is the maximum population (in a given situation), while zero corresponds to extinction. 5.2.1 MAY Author Gleick tells the story of Australian biologist Robert May (1936-2020), who achieved fame and glory by pushing the Logistic-Difference equation to its limits. May was born to a lawyer in Sydney and first studied chemical engineering and physics at the University of Sydney, gaining his BSc in 1956; he stayed on for a doctorate in theoretical physics, which he completed in 1959. 
Early in his career, May developed an interest in animal population dynamics, explored through the relationship between complexity and stability in natural communities. He revolutionized the field of population biology through the analysis of this one equation alone. Later, he became a professor at Oxford University and was raised to a Life Peerage by the UK government. 5.3 HEART RHYTHMS There are two flows in the heart: obviously blood flows through the four chambers, but electrical signals also flow across all the muscles to synchronize their beating. Although the electrical waves do not suffer turbulence, they can experience wave interference that disrupts the regular activity. Irregularities in the heartbeat have long been known, investigated and categorized, as they can be responsible for many deaths. The most life-threatening is fibrillation, when the muscles 'flutter' and lose their needed rhythm. Researchers have recently discovered that traditional cardiology was making the wrong generalizations about irregular heartbeats, using superficial classifications that obscure deep causes. A common problem is that specialists diagnose many different arrhythmias by looking at short strips of electrocardiograms for standard patterns, but the heart's dynamics are much richer than most have imagined. Trial and error has been the method for designing the artificial valves that replace the original heart valves when needed; but by changing the patterns of fluid flow in the heart, artificial valves can create regions of turbulence and stagnation where clots may form, break off and travel to the brain, causing strokes. The elasticity of the heart walls adds a whole further level of complexity to the standard fluid-flow problem. Ventricular fibrillation causes hundreds of thousands of deaths annually in the USA; in many fibrillations the synchronizing wave breaks up. 
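The behavior May found in the Logistic-Difference equation, and the period-doubling route to chaos behind Feigenbaum's constant (§4.1.1), are easy to reproduce by the very method the pioneers used: repeating a simple calculation very many times. The sketch below is my own illustration, not from the book; the parameter choices and helper names are assumptions, and the bifurcation thresholds are standard published values.

```python
# Iterating the Logistic-Difference equation X[n+1] = R * X[n] * (1 - X[n]).
# Illustration only: function names and parameter choices are my own.

def attractor_period(R, x0=0.5, transient=2000, max_period=64, tol=1e-9):
    """Iterate past the transient, then return the length of the cycle
    the orbit has settled into, or None if no short cycle is found."""
    x = x0
    for _ in range(transient):
        x = R * x * (1 - x)
    ref = x
    for p in range(1, max_period + 1):
        x = R * x * (1 - x)
        if abs(x - ref) < tol:
            return p
    return None  # effectively chaotic: no cycle up to length 64

# The period-doubling cascade: 1, 2, 4, 8, ... and then chaos.
for R in (2.8, 3.2, 3.5, 3.55, 3.9):
    p = attractor_period(R)
    print(f"R = {R}: period {p if p else 'none (chaotic)'}")

# Ratios of successive bifurcation intervals approach Feigenbaum's
# constant, ~4.669 (thresholds are standard published values).
bifs = [3.0, 3.449490, 3.544090, 3.564407, 3.568759]
for i in range(len(bifs) - 2):
    print("ratio:", (bifs[i + 1] - bifs[i]) / (bifs[i + 2] - bifs[i + 1]))

# Sensitive dependence on initial conditions: at R = 4, two orbits
# starting 1e-10 apart separate to macroscopic distance.
x, y = 0.3, 0.3 + 1e-10
steps = 0
while abs(x - y) < 0.1:
    x, y = 4 * x * (1 - x), 4 * y * (1 - y)
    steps += 1
print("orbits separated after", steps, "steps")
```

Raising R through 3.0, 3.449..., 3.544... doubles the cycle length each time, and the shrinking gaps between these thresholds are exactly what Feigenbaum measured; the final loop shows why long-range prediction fails once chaos sets in.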
At autopsy, the muscle tissue may reveal no damage at all, indicating that the parts seem to be working; yet the whole system goes fatally wrong: a global problem of a complex system. Researchers have used the methods of non-linear dynamics to show that small changes in a single parameter (timing or conductivity) could push an otherwise healthy heart across a bifurcation point into a qualitatively new behavior. 5.4 DYNAMICAL SYSTEMS COLLECTIVE Author Gleick dedicates a 30-page chapter to a group of researchers at the University of California at Santa Cruz, located at Monterey Bay. Several of the physicists there became interested in dynamical systems, so they called themselves the Dynamical Systems Collective; others called them the Chaos Cabal. Gleick briefly describes four of them, who were viewed as rebels by most of the faculty; apparently they wrote many papers, but no details (or even titles) are given. They dispersed in the early 1980s after gaining their PhDs. Why this chapter was included in the book is as mysterious as their scientific contributions. Quantity trumps quality? 6. CONCLUSIONS Reading this book made me realize that I had been strangely (but unconsciously) attracted to Chaos over the last few years, with my growing critique of Mathematical Physics that has manifested as 20 essays and book reviews. Indeed, my early resignation from a career in academic physics back in 1967 may have been triggered by a subconscious awareness of this problem that took 50 years to surface into my conscious thinking - who knows? 
Reading this book reminded me of my dissatisfaction with traditional physics; at the very least, it has inspired me to write a summary essay on MetaMathematics - not the esoteric speciality found in a very few mathematics departments, but a critique of the foundations and assumptions of mathematics from a philosophical viewpoint, as Aristotle's own MetaPhysics was a critique of the foundations of physics and as Luther analyzed Catholicism. 6.1 CHAOS THEORY Chaos promises to re-integrate a science that is disintegrating into too many separate specialities. Systems are the new focus, with an emphasis on real interactions, not the artificial simplification to a few objects so that calculus can be readily used. The key is to observe the real patterns in nature. This will reinforce the move to synthesis - away from analysis (or classical reductionism): it is not the parts but the whole that is important. Chaos makes strong claims to explain apparent complexity. This revolution is more profound than the fashionable attention paid to Relativity and Quantum Mechanics, which are now seen as special areas of classical physics. The price is the loss of the dream of predictability: that humans can control nature. One of the dirty secrets of physics, discovered by Isaac Newton himself, is known as the Three-Body Problem: calculus can only solve simple systems that involve continuous interactions between TWO objects (e.g. Sun and Earth under gravity), but not three or more (e.g. adding the Moon to that simple system). As author Gleick points out in his introduction, Particle Physics has not produced a new idea in two generations - theoretical physics has stalled and has not contributed a useful technology in spite of the billions spent on it. The gain is that we can appreciate Nature at the human scale, not at the cosmic or atomic levels. Perhaps humanity can stop spending billions on particle smashers or gigantic telescopes or even inter-planetary explorations. 
We can return to improving our home planet and, most importantly, our human relationships; we need more order and far less weapons research. There is a real need to relate the mathematical ideas of Chaos to the real features of biology; it is the Cell that needs to be the focus. When confronted with complex results, we no longer have to seek complex causes or add random 'noise'. We now know from Chaos theory that tiny differences in starting conditions can quickly produce overwhelming differences in output: now called "sensitive dependence on initial conditions". In weather, for example, this is half-jokingly called the Butterfly Effect. This puts the nail in the human fantasies of predictability and control. The most famous icon of Chaos, the Mandelbrot Set [see §3.2], was one of the earliest examples of how TIME (a recursive process) readily generates SPATIAL complexity, i.e. a simple rule generating dynamic behavior (motion between the points) that appears random in a static picture. 6.2 WEAKNESSES One of the roots of these problems can be traced back 300 years to Descartes' assumption that infinite decimal numbers, also called "real" numbers, can be used to represent nature: they cannot. Infinite ideas have a religious origin and cannot be applied to the real world by finite human beings, except in our imaginations. The integers (and rational fractions) are valid numbers because they can connect human imagination and the reality of finite human counting of distinct, physical existents. Newton himself introduced an imaginary mathematical concept into physics with his definition of the INSTANTANEOUS velocity of a body (even a corpuscle), as he tried to justify his method of relating change in time to change along a perfectly smooth curve, smuggling in the contentious subject of the infinitely small [see my Infinitesimal essay]. 
Newton needed this to focus on continuous changes in time, as he linked the human intuition of (muscular) FORCE to his brilliant idea of change in MOMENTUM, using Descartes' algebraic suggestion of multiplying two disparate physical ideas, change in location and inertial mass (P = M V and F = dP/dt). Actually, Newton's first version of his Law of Motion used the instantaneous idea of Interaction (Impulse) and can be used for finite changes in location (∆X) over a finite time difference (∆T) between interactions (V = ∆X/∆T); so the Laws of Motion can relate human imagination to finite measurements (∆X and ∆T) and preserve linear mathematics, but we would have to forgo continuous interactions and instantaneous momentum, which eliminates the popular tool of "Phase Diagrams" that has been a key technique for physicists and mathematicians for a hundred years. One of the greatest difficulties we face in expanding the study of chaos is that Chaos Theory is almost completely mathematical, so that it is too abstract and timeless. The challenge is that nature is real and full of change: something the hidden, implicit assumptions behind the calculus reject, as mathematical physics has found to its cost. 7 RECOMMENDATIONS I cannot recommend this book at all, even though both it and its author are well known. I found it far too fragmented and disorganized: in a word, too "chaotic" (in the traditional sense). Unfortunately, I don't think this book is very well written. The purpose and point of the book are quite hard to see. Does Gleick want to tell us about chaos? In that case the explanations are few and far between, and when an explanation of chaos is attempted, it doesn't quite adequately explain it. Perhaps this is more a book about the human side of chaos: how recognition might not always come, but the pursuit of truth is more important? In that case I am unconvinced, as the structure doesn't allow that to come across. 
The book seems to be more about individuals and reads as a series of short biographies. Finishing this book, I am not really sure what I was supposed to get out of it. One interesting point was how Kuhn's theory of scientific revolutions was exemplified, illustrating the conservative nature of science and the intense efforts to preserve the standard orthodoxy (and the personal investment in it) that most scientists have built up throughout their lives: they accept minor changes, but big ones are too radical. This, then, is not a book about chaos theory; it is more a collection of stories of the people who helped bring about chaos theory and their struggles in doing so. People who enjoy biography may enjoy it. Finally, the author is too enamored of the reputation of mathematics and mathematicians, who have successfully inflated their reputations across society. The ideas of quantity have perniciously swamped quality, just as mathematicians have hijacked too many of the sciences, including physics.