
The XY files

What is algebra and how did it develop? Victoria Neumark traces the history of modern algebraic thinking.

Babylonian tablets show that scribes knew how to work out the base and height of a rectangle if they had any key measurements. With four variables established (base, height, the sum of these two numbers and the product gained by multiplying them) they were able to discover any one or two if the others were known, using a simple rule. These Babylonian sums have been described as the first appearances of algebra, since the procedure is equivalent to solving the general quadratic. They certainly show that, thousands of years ago, intellectuals were concerned to discover general solutions. However, the Babylonians did not develop specific algebraic notation and always presented these general rules as specific cases. Thus they could not present proofs or derivations. Their methods, working through sequences of steps, are more in tune with the methods of electronic calculators and computers than the deductive proof used by later thinkers.
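The Babylonian rule can be sketched in modern terms: two numbers with sum s and product p are the roots of the quadratic x² − sx + p = 0. Here is a minimal Python illustration of that recovery (the function name and the sample rectangle are my own, not from any tablet):

```python
import math

def sides_from_sum_and_product(s, p):
    """Recover two numbers (e.g. a rectangle's base and height)
    given their sum s and product p, by solving x**2 - s*x + p = 0."""
    disc = s * s - 4 * p  # discriminant of the quadratic
    if disc < 0:
        raise ValueError("no real sides have this sum and product")
    root = math.sqrt(disc)
    return (s - root) / 2, (s + root) / 2

# A rectangle whose base + height = 7 and base * height = 12
print(sides_from_sum_and_product(7, 12))  # (3.0, 4.0)
```

The scribes, of course, expressed the same procedure as a recipe on specific numbers rather than as a general formula.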

Diophantus of Alexandria (active around the 3rd century AD) wrote a huge work called Arithmetica. He also began with an arithmetical form of analysis of particular problems rather than general solutions. To find his answers, he developed a method of reformulating the problem in terms of one of the unknowns and then manipulating it as if it were known until the explicit value for the unknown emerged. He went so far as to adopt an abbreviated notation for these operations, using an "S" figure for the unknown.

Diophantus had a great influence on later mathematicians like Pierre de Fermat, whose famous "last theorem" was the subject of an acclaimed piece of recent work by Andrew Wiles at Princeton, in its turn the subject of a TV programme. Diophantus only worked with rational numbers and his symbolic notations were rudimentary, but the ingenuity of his thinking transcended his conceptual tools. His work was later refined by Hindu mathematicians, who elaborated the symbolism needed.

It was the Arabic mathematician (Muhammad ibn Musa) al-Khwarizmi who gave algebra its name. His Book of Restoring and Balancing, which like the rest of his work served to introduce Hindu arithmetic to the Arab world, provided a systematic introduction to algebra, including one of the theories of quadratic equations. The Arabic word for "restoring" is al-jabr. Restoring and balancing is still a good image to keep in mind if we are asking what it is exactly that algebra does.

The next great leap forward in mathematics came in the 16th and 17th centuries when François Viète codified the use of symbols to denote variables (he used capital vowels A, E, I, O, U). Two other Frenchmen, Albert Girard and the philosopher René Descartes, made great conceptual advances, Girard analysing negative quantities and Descartes applying his formidable logic to new areas of thought like algebraic geometry. Such problems as how to find the curve or locus traced by a point whose distances from several fixed lines satisfy a given relation were then at the cutting edge of algebraic thinking.

Descartes also adapted Viète's notation by setting aside letters at the end of the alphabet for variables (x, y, z) and letters at the beginning for parameters (a, b, c). Descartes' work was a precursor to the development of true calculus.

It was Newton who baptised algebra as Universal Arithmetic in 1707, and it was Newton who first published the calculus in the first book of his Philosophiae Naturalis Principia Mathematica (1687), but in this work he eschewed algebraic analysis of curves and vectors in favour of geometric or mechanical calculations. It was the philosopher Leibniz who developed a mathematical or algebraic way to solve the differential and integral calculus, with d(x+y)=dx+dy and d(xy)=xdy+ydx published in 1684 in an article with the glorious title of "A New Method for Maxima and Minima as well as Tangents, Which Is Impeded Neither by Fractional Nor by Irrational Quantities, and a Remarkable Type of Calculus for This".
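Leibniz's product rule d(xy) = x dy + y dx can be checked numerically. The sketch below (the functions chosen are arbitrary examples, not from Leibniz) compares a finite-difference estimate of the derivative of a product against the rule:

```python
def derivative(f, t, h=1e-6):
    # central finite difference: an approximation to f'(t)
    return (f(t + h) - f(t - h)) / (2 * h)

x = lambda t: t ** 2   # example function x(t)
y = lambda t: 3 * t    # example function y(t)

t0 = 2.0
lhs = derivative(lambda t: x(t) * y(t), t0)                   # d(xy)/dt
rhs = x(t0) * derivative(y, t0) + y(t0) * derivative(x, t0)   # x dy + y dx
print(abs(lhs - rhs) < 1e-4)  # True: the two sides agree
```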

Later mathematicians such as Leonhard Euler and Joseph-Louis Lagrange continued to develop the Leibnizian algebraic analysis of calculus as the "true metaphysics", but their view has since been rejected. They also worked on the fundamental theorem of algebra, and this has been incorporated into modern algebraic thinking, which might be dated from the first successful proof of the fundamental theorem by Carl Friedrich Gauss in 1799.

Why was Gauss' proof so important, and what is all this algebra up to? To answer this question one needs to consider carefully what numbers are and what can be done with them.

Numbers are abstractions. Natural numbers are what we use to count with. Rational numbers describe ratios. Real numbers describe all the numbers on a number line. And there are more complicated categories of numbers yet. All numbers in algebra are subject to the four operations of arithmetic: add, subtract, multiply, divide.
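Rational numbers, Diophantus' territory, and the four operations on them can be played with directly in Python's standard library (a small illustration of my own, not part of the article's argument):

```python
from fractions import Fraction

# Two ratios, kept exact rather than as decimal approximations
a, b = Fraction(2, 3), Fraction(1, 6)

print(a + b)  # 5/6  (add)
print(a - b)  # 1/2  (subtract)
print(a * b)  # 1/9  (multiply)
print(a / b)  # 4    (divide)
```

Notice that applying the four operations to rationals always yields another rational; it takes equations like x² = 2 to force the jump to real numbers.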

But supposing one wants to make general statements about numbers and their relationships. Just as one no longer wants to push beads around to talk about "9 divided by 3", so one does not want always to explain "this is true for all numbers satisfying these criteria". Thus a "meta-language" is needed, to symbolise relationships between symbols or sets of symbols.

Has algebra got any practical applications or is it just bug-eyed mathematicians contemplating the essential harmonies of the universe? It has been used in theoretical physics, computing, economics and the statistical side of the social sciences.

But basically, yeah, it's great fun.
