Logically the Best

In keeping with the New Year's theme, this month's column lists the five most important accomplishments from the world of logic, mathematics, computing, and philosophy in the last 100 years.  This idea is an offshoot of the books Five Golden Rules: Great Theories of 20th Century Mathematics – and Why They Matter and Five More Golden Rules: Knots, Codes, Chaos and Other Great Theories of 20th Century Mathematics, both written by John Casti.  In these works, Casti devotes a chapter to each of the following:

  1. Minimax Theorem (Game Theory)
  2. Brouwer Fixed-Point Theorem (Topology)
  3. Morse’s Theorem (Singularity Theory)
  4. Halting Theorem (Theory of Computation)
  5. Simplex Method (Optimization Theory)
  6. Alexander Knot Polynomial (Knot Theory)
  7. Hopf Bifurcation (Dynamical System Theory)
  8. Kalman Filter (Control Theory)
  9. Hahn-Banach Theorem (Functional Analysis)
  10. Shannon Coding Theorem (Information Theory)

with items 1-5 being from the first book and 6-10 from the second.

The idea of ranking my own top 5 had been pinballing around the back of my mind since I first encountered these books in 2015, and when the opportunity presented itself with this month's theme, I thought I would seize it and run.  It is important to note, however, that the list I've compiled here is my own: inspired by Casti's work, but independent in both content and analysis.  In fact, I think Casti misses the two most important contributions even with a list of 10 – but that is a matter of judgment and taste.

So, without any more preamble, let’s get to the list.

  5. Kalman Filter

It is hard to overemphasize the Kalman filter's importance to real-time computing.  The algorithm allows fitting and estimation to be performed, in the presence of noise and uncertainty, during the course of a process, such as the motion of a robotic arm or the flight of a spacecraft.  The filter takes in measured points as they become available and 're-fits' each new point in the context of all previous points, thereby giving an improved estimate of the state of the system.  The key point is that the filter directly accounts for both the uncertainty in the measurements and the shortcomings of any physical model of the motion.  The introduction of the Kalman filter gave rise to a whole industry of algorithm development (extended Kalman filters, unscented filters, Bayesian networks, etc.) that balances model-based prediction (e.g. Newton's laws) with uncertainty in knowledge and measurement – the essential ingredient for computation in the real world.  Final note:  Casti also recognizes the Kalman filter.
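To make the re-fitting idea concrete, here is a minimal one-dimensional sketch of my own (an illustration, not production code): estimating a constant value from noisy readings.  Real applications use a multidimensional state and a motion model, but the gain-weighted blend of prediction and measurement is the same.

```python
# Minimal 1-D Kalman filter: estimate a constant from noisy measurements.
# Illustrative sketch only; real systems carry a state vector and a
# motion model in the predict step.

def kalman_1d(measurements, meas_var, init_est=0.0, init_var=1000.0):
    """Return the successive state estimates for a static 1-D system."""
    est, var = init_est, init_var
    history = []
    for z in measurements:
        # Predict: the model is static, so the estimate carries over.
        # (A motion model would update est and inflate var here.)
        # Update: blend prediction and measurement via the Kalman gain.
        gain = var / (var + meas_var)   # how much to trust the measurement
        est = est + gain * (z - est)    # refined estimate
        var = (1.0 - gain) * var        # reduced uncertainty
        history.append(est)
    return history

estimates = kalman_1d([1.1, 0.9, 1.05, 0.98, 1.02], meas_var=0.04)
```

Note how each new measurement tightens the variance, so later points move the estimate less than earlier ones – exactly the 're-fitting in context' described above.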

  4. Optimization Techniques

This entry is a catch-all for a variety of algorithms and approaches that have been stirred into action with the advent of the computer.  While not particularly glamorous, optimization methods solve a host of real-world problems associated with getting the best, as long as 'best' can be mathematically encoded in something called an objective function.  There are two basic families of optimization algorithms:  indirect and direct.  In the indirect methods, additional equations, typically called adjoint equations, are added to the problem being solved, and the response of the new variables in these equations indicates when and how controls should be applied.  In the direct methods, a set of controls is assumed at the outset and their values are adjusted to achieve optimality.  Optimization techniques find application in thousands of practical problems, including trajectory optimization, operations research, and economic modeling.  Final note: Casti singles out the simplex method for special recognition.  While that method is of particular importance in both the history of optimization and its application to economics, its limitation to linear problems is a significant liability, hence the larger scope of this entry.
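A tiny sketch of the direct approach, under assumptions of my own choosing: the control is a single number, the objective is a quadratic, and the adjustment rule is plain gradient descent.  The point is only to show controls being tuned against an objective function.

```python
# Direct optimization sketch: adjust a control value to minimize an
# objective function via gradient descent.  The quadratic below is a
# stand-in for whatever mathematically encodes "best".

def minimize(grad, x0, lr=0.1, steps=200):
    """Crude gradient-descent minimizer for a single scalar control."""
    x = x0
    for _ in range(steps):
        x -= lr * grad(x)   # nudge the control toward lower cost
    return x

# Objective: f(x) = (x - 3)^2 + 1, which is minimized at x = 3.
f = lambda x: (x - 3.0) ** 2 + 1.0
df = lambda x: 2.0 * (x - 3.0)

x_best = minimize(df, x0=0.0)
```

An indirect method would instead introduce adjoint variables and solve the enlarged system; the direct method's appeal is exactly this simplicity of just iterating on the controls.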

  3. Shannon’s Mathematical Model of Information

In 1948, Claude Shannon published A Mathematical Theory of Communication in The Bell System Technical Journal.  This landmark paper set the stage for the modern digital age and for the advances in the field of telecommunications (wired and wireless).  Building on the ideas of Nyquist and Hartley, Shannon essentially created the field of information theory and the digital signal processing that follows in its train.  His theory defined a unit of information (the bit), set the theoretical limits on the amount of information that can be transferred, adapted the idea of entropy to communications, and provided a set of theorems establishing what could and could not be done in a communications system.  Final note: Casti recognizes the importance of Shannon’s work in his second book, which deepened my own appreciation of it.
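Shannon's entropy formula, H = -Σ p·log₂(p), makes the bit concrete: it gives the average information per symbol from a source.  A small sketch (my own illustration) shows why a fair coin carries exactly one bit per toss while a biased coin carries less:

```python
import math

# Shannon entropy: the average information, in bits, carried by a
# symbol drawn from a given probability distribution.

def entropy(probs):
    """H = -sum(p * log2(p)), skipping zero-probability outcomes."""
    return -sum(p * math.log2(p) for p in probs if p > 0)

fair = entropy([0.5, 0.5])     # a fair coin: exactly 1 bit per toss
biased = entropy([0.9, 0.1])   # a predictable coin: well under 1 bit
```

The biased coin's lower entropy is precisely why its tosses can be compressed – the quantitative heart of Shannon's coding theorems.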

  2. Fast Fourier Transform

Deemed one of the most important numerical algorithms ever created, the fast Fourier transform, or FFT, is the workhorse of the modern digital age.  There are so many applications that it is hard to list even a fraction of them, but some of the most important are:  1) signal processing in telecommunications, 2) image construction in medical imaging, and 3) spectrum analysis of scientific data, ranging from the mechanics of materials, to the response of electrical components, to quantum mechanics.  An FFT resides in virtually every home in the US, in high-definition televisions, cellphones, videogame systems, and so on.  Final note:  Casti makes no mention, that I’ve found, of the FFT.
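The trick that makes the FFT fast is divide and conquer: a transform of length n splits into two of length n/2, giving O(n log n) work instead of the O(n²) of the naive sum.  A textbook radix-2 Cooley-Tukey sketch (lengths must be powers of two; real work uses tuned libraries like FFTW or numpy.fft):

```python
import cmath

# Radix-2 Cooley-Tukey FFT sketch.  Input length must be a power of 2.
# Production code uses optimized libraries; the recursion is the idea.

def fft(x):
    n = len(x)
    if n == 1:
        return x[:]
    even = fft(x[0::2])            # transform of even-indexed samples
    odd = fft(x[1::2])             # transform of odd-indexed samples
    out = [0j] * n
    for k in range(n // 2):
        # Combine the halves with the "twiddle factor" e^(-2*pi*i*k/n).
        t = cmath.exp(-2j * cmath.pi * k / n) * odd[k]
        out[k] = even[k] + t
        out[k + n // 2] = even[k] - t
    return out

# A pure tone at frequency bin 1 of an 8-sample window: its spectrum
# should concentrate entirely in that one bin.
signal = [cmath.exp(2j * cmath.pi * k / 8) for k in range(8)]
spectrum = fft(signal)
```

The spectrum of the pure tone is zero everywhere except bin 1 – the frequency-picking behavior that underlies all the applications listed above.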

  1. Gödel’s Theorem

Rounding out the top of this list is the far-reaching analysis of the limits of logic (mathematical and otherwise) that Kurt Gödel provided the world in 1931.  This theorem, along with the closely related theorems due to Tarski (undefinability), Church (undecidability), and Turing (the halting theorem), states that any sufficiently powerful formal system must be either incomplete or inconsistent.  All of these related theorems deal with how logic views or expresses statements about itself, and they have profound implications for human thought, philosophy, and artificial intelligence.  I also interpret them as saying something essential about man as a rational animal and, therefore, about the human condition, as I’ve written in earlier posts (Gödel’s Theorem – General Notions; Turing, Godel, and the Universe; Logic and Limits; Bare Bones Halting; and Self-Reference and Paradoxes).  The full content of these results will no doubt be explored for years to come.  Final note:  Casti focuses more on the halting theorem and its implications for computing.
