John von Neumann
Hungarian-Born American Mathematician and Computer Pioneer
Mathematics Ranking 29th of 46
Von Neumann with his computer design. Hungarian stamp from 1992.
John von Neumann’s contributions ranged from pure logic and set theory to the most practical areas of application. For example, von Neumann helped to develop the U.S. hydrogen bomb. He analyzed the mathematics of quantum mechanics, founding a new area of mathematical research (algebras of operators in Hilbert space), and established the mathematical theory of games (game theory) in 1928. Most significantly, his design of the first modern computer, completed in 1951, proved to be the model for the more advanced computers built in succeeding years.(1)
Physics–Quantum Mechanics, Hilbert Space
Quantum mechanics came out of the quantum theory founded by Karl Ernst Ludwig (Max) Planck in 1900. Planck announced his radiation law, according to which electromagnetic radiation from heated bodies was not emitted as a continuous flow but was made up of discrete units or quanta of energy, the size of which involved a fundamental physical constant (Planck’s constant). The quantum concept was immediately used to explain atomic structure and the photoelectric effect. Hilbert space was invented by David Hilbert (1862–1943), a German mathematician, as an infinite-dimensional analog of Euclidean space. Von Neumann carried out a complete theoretical analysis of the foundations and the basic tenets of quantum mechanics, which he published as The Mathematical Foundations of Quantum Mechanics, one of the most influential books in physics.(2) Von Neumann’s later work in pure mathematics marks him as one of the greatest mathematicians of all time.
Game theory, a branch of mathematics, deals with the selection of best strategies for participants in competitive situations where the outcome of a person’s choice of action depends critically on the actions of other players. The entire edifice of game theory rests on two theorems: (1) Von Neumann’s min-max theorem of 1928 and (2) (John) Nash’s equilibrium theorem of 1950.(3) As well as being influential for economics and business, game theory has found uses in biology, military tactics, sports, and other fields.
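The min-max idea can be illustrated with a short sketch: for a zero-sum game given by a payoff matrix, the row player compares his best guaranteed outcome (the maximin) with the column player’s least bad outcome (the minimax). The matrix below is invented purely for illustration; when the two values coincide, as here, the game has a saddle point in pure strategies. (Von Neumann’s 1928 theorem guarantees equality in general only once mixed strategies are allowed.)

```python
# Hypothetical 2x3 zero-sum payoff matrix (entries are the row player's winnings).
payoff = [
    [4, 1, 3],
    [2, 0, 1],
]

# Row player: for each row, assume the opponent minimizes; take the best worst case.
maximin = max(min(row) for row in payoff)

# Column player: for each column, assume the opponent maximizes; take the least worst case.
minimax = min(max(payoff[r][c] for r in range(2)) for c in range(3))

print(maximin, minimax)  # prints "1 1": the game has a saddle point of value 1
```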
Von Neumann on a 2005 stamp from the United States.
Perhaps his most influential contribution was in the design and operation of electronic computers.(4) The computers of today originated in the 1940s. In 1946, the first general-purpose electronic computer was completed. Although this computer—called ENIAC, an acronym for Electronic Numerical Integrator and Calculator—could be programmed to do different tasks, this programming required a partial rewiring of the machine. Von Neumann formulated the concept of the stored-program computer that brought about the development of digital computers in their present form. In this computer, the instructions (the program) and the data are kept in a common memory so that they can be processed in a uniform fashion, as appropriate to their respective roles in the computation. The first operational stored-program electronic digital computer (what is today called simply a computer) was completed at the University of Cambridge in 1949 under the direction of Maurice Wilkes, an English mathematician.(5)
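The stored-program idea can be sketched in a few lines: instructions and data sit side by side in one memory, and a simple fetch-execute cycle interprets each cell as the computation requires. The opcodes and memory layout below are invented for illustration and do not correspond to any historical machine’s order code.

```python
# Minimal sketch of a stored-program machine: program and data share one memory.
LOAD, ADD, STORE, HALT = 0, 1, 2, 3   # hypothetical opcodes

# Cells 0-3 hold (opcode, address) instruction pairs; cells 8 and 9 hold data.
memory = [
    (LOAD, 8),     # acc <- memory[8]
    (ADD, 9),      # acc <- acc + memory[9]
    (STORE, 8),    # memory[8] <- acc
    (HALT, 0),
    None, None, None, None,   # unused cells
    5,             # cell 8: a data value
    7,             # cell 9: a data value
]

acc, pc = 0, 0                 # accumulator and program counter
while True:
    op, addr = memory[pc]      # fetch: an instruction is just a value read from memory
    pc += 1
    if op == LOAD:
        acc = memory[addr]
    elif op == ADD:
        acc += memory[addr]
    elif op == STORE:
        memory[addr] = acc
    elif op == HALT:
        break

print(memory[8])  # prints 12: the sum 5 + 7 was written back into memory
```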
Immediately after the Second World War von Neumann gathered a brilliant group of scientists and engineers at the Institute for Advanced Study in Princeton. Their task was to take the experience developed during the war years in the development of two early computers, the ENIAC and the EDVAC, and combine it with recently developed theoretical knowledge to develop what one of its backers called "the most complex research instrument now in existence." The group under von Neumann decided to organize the computer under four main sections: (1) an arithmetic unit, (2) a memory, (3) a control, and (4) an input-output device.
The arithmetic unit, now generally called the central processing unit, is the place where the machine performs the elementary operations, those that cannot be reduced any further. These elementary operations are wired directly into the machine, while every other operation is built out of them by a set of instructions. The number system of Charles Babbage's Analytical Engine was decimal. But with the advent of electronic, rather than mechanical, devices for representing numbers, it turned out to be simpler to represent numbers in binary, so that any particular device holding a digit would need only two states, on and off, to represent the two possibilities 1 and 0. Von Neumann was in fact instrumental in designing efficient sets of decimal-binary and binary-decimal conversion instructions, so that the operator could enter numbers in the normal decimal mode and receive answers in that mode as well, without compromising the speed and ease of construction of the machine.
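The two conversions described above can be sketched briefly: repeated division by two turns a decimal number into its binary digits, and a doubling scheme turns them back. The function names are illustrative, not taken from any historical instruction set.

```python
def decimal_to_binary(n):
    """Repeated division by 2; the remainders, read in reverse, are the binary digits."""
    bits = []
    while n > 0:
        bits.append(n % 2)
        n //= 2
    return bits[::-1] or [0]

def binary_to_decimal(bits):
    """Horner's scheme: each step doubles the running value and adds the next bit."""
    value = 0
    for b in bits:
        value = value * 2 + b
    return value

print(decimal_to_binary(13))            # prints [1, 1, 0, 1]
print(binary_to_decimal([1, 1, 0, 1]))  # prints 13
```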
The memory unit of the machine needed to handle two different tasks: storing the numbers that were to be used in the calculations and storing the instructions by which the calculations were to be made. But because instructions themselves can be stored in an appropriate numerical code, the machine only needed to be able to distinguish between the actual numbers and the coded instructions. Moreover, to compromise between the "infinite" memory desired by the user of the machine and the finite memory constructible by the engineer, it was decided to organize the memory in hierarchies, such that some limited amount of memory was immediately accessible while a much larger amount could be accessed at a somewhat slower rate. It was also decided that, in order to achieve a sufficiently large memory in a reasonable physical space, the units storing an individual digit needed to be microscopic parts of some larger piece.
The control unit was the section where the instructions to the machine resided, the orders that the machine could actually obey. Again, compromises had to be worked out between the desire for simplicity of equipment and the usefulness, for the sake of speed, of a large number of different types of orders. In any case, one of the more important aspects of the control procedure was the ability of the machine to use a given sequence of instructions repeatedly. But because the machine must be made aware of when the repetition should end, it was also necessary to design a type of order that let the machine decide when a particular iteration was complete. Furthermore, the control unit needed a set of instructions that integrated the input and output devices into the machine. Von Neumann was particularly interested in assuring that the latter devices would allow for both printed and graphical outputs, because he realized that some of the more important results of a particular computation might best be explored graphically.
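The role of such a conditional order in ending a repetition can be sketched with a toy instruction set (the order codes below are invented): a jump-if-not-zero order sends control back to the start of the loop body until a counter reaches zero, here summing the numbers 5 down to 1.

```python
# Hypothetical order codes for a tiny two-register machine.
SET, ADDR, DEC, JNZ, HALT = range(5)

# Registers: r[0] holds the running total, r[1] the loop counter.
program = [
    (SET, (0, 0)),    # address 0: r0 <- 0
    (SET, (1, 5)),    # address 1: r1 <- 5
    (ADDR, (0, 1)),   # address 2: r0 <- r0 + r1   (start of the loop body)
    (DEC, (1,)),      # address 3: r1 <- r1 - 1
    (JNZ, (1, 2)),    # address 4: if r1 != 0, jump back to address 2
    (HALT, ()),       # address 5: repetition is complete
]

r, pc = [0, 0], 0
while True:
    op, args = program[pc]
    pc += 1
    if op == SET:
        r[args[0]] = args[1]
    elif op == ADDR:
        r[args[0]] += r[args[1]]
    elif op == DEC:
        r[args[0]] -= 1
    elif op == JNZ:
        if r[args[0]] != 0:
            pc = args[1]   # the conditional order: repeat the loop body
    elif op == HALT:
        break

print(r[0])  # prints 15, i.e. 5 + 4 + 3 + 2 + 1
```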
The computer eventually constructed at the Institute for Advanced Study, based on von Neumann's design and completed in 1951, proved to be the model for the more advanced computers built in succeeding years.
(1) Victor J. Katz, A History of Mathematics: An Introduction, 3rd edition (Boston, 2009), p. 917.
(2) Lloyd Motz and Jefferson Hane Weaver, The Story of Mathematics (New York, 1993), p. 275.
(3) Sylvia Nasar, A Beautiful Mind: The Life of Mathematical Genius and Nobel Laureate John Nash (New York, 1998), p. 96.
(4) Ibid., p. 97.
(5) Encyclopaedia Britannica, 15th edition, Macropaedia, vol. 16 (1993), p. 630.