The PolyNumbers.

This topic is fundamentally groundbreaking.

I will provide links to the videos which Norman has posted on YouTube, but they require much listening, deep thought, meditation and letting go!

Recently I realised my early geometrical training had fixated me on the reference frame of the page. Horizontal and vertical were unconsciously linked to the page, not to the space around me. At a deep level my apprehension of space was limited. By studying other geometers with greater abilities I gradually identified and corrected my own limitations. I let them go.

Polynumbers marks one of those key moments in my understanding.

http://youtu.be/-Ad6pYjCAmg
http://www.youtube.com/watch?v=UMWbb-WAPsk&feature=youtube_gdata_player

I have concluded by other means that mathematics now belongs with computer programming and programming languages, a subsection of computational science. I am not the only person who thinks that. Wolfram Research is of the same opinion, or rather they are pragmatically entwined in this concept. They use software to provide tools for the sciences and for all who need computational power and assistance.

I have programmed computers with some simple runtime programmes to do specific things. My experience is scatty: I only learned what I needed. My greatest satisfaction was creating pointers, common in C and C++ and other higher-level languages, for QBasic. I only had access to QBasic at home and I wanted to access the register structure flexibly. Bits of hardware design I had never understood before became familiar. I had a concept of the 866x chip, but only as much as I needed. Once I achieved what I wanted, I gradually lost the understanding. My memory shifted back to less specialist use, but I have retained that sense of achievement to this day. If I do not need stuff in my head, I let it go!

I understood that whatever I wanted to do mathematically, I had to design a programme for it, and I appreciated how helpful good labelling was in reminding me what a procedure, function or subroutine was designed to do. I also commented within the blocks to make plain how they worked or what my thinking was.

When I got a function designed, implemented and working, I often found I could call it from a container function designed to do more complex things, by a choice either set by me or made by some conditional statement. Code warriors know that when code works it is a beautiful, creative thing. It liberates the mind and opens up many previously unplanned opportunities and options. But when it does not work, it can be a nightmare that closes you down, restricting your thinking to the lower levels of the chip, down into the registers and machine code. Sometimes you need to know how the system actually processes things.
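
Something like the following Python sketch captures what I mean by a container function; the names (area_of_square and so on) are invented for illustration, not taken from anything I actually wrote.

```python
# Small, well-labelled worker functions, called from a "container"
# function that chooses between them by a conditional statement.

def area_of_square(side):
    """Area of a square with the given side length."""
    return side * side

def area_of_circle(radius, pi=355/113):
    """Approximate circle area using a rational stand-in for pi."""
    return pi * radius * radius

def area(shape, size):
    """Container function: dispatch to a worker by a condition."""
    if shape == "square":
        return area_of_square(size)
    elif shape == "circle":
        return area_of_circle(size)
    raise ValueError("unknown shape: " + shape)

print(area("square", 3))   # 9
print(area("circle", 2))   # roughly 12.566
```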

The memory, the Arithmetic Logic Unit, the clock cycle, the address section in a word of machine code, and so on.
The binary, octal and hexadecimal number systems were explored to explain how the ALU added and multiplied. The goto loop explained how a program could flow out of sequence, the address becoming a label.

The ALU doubled for the logical AND, NOT and OR operations. I learned just enough to get the basic idea. I did not do electronics; in fact I was taught the magnetic memory core! I never really grasped how flip-flop transistor circuits stored data, and I did not care. I wanted a higher-level language that was more natural than even BASIC.
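
Since the number bases and the ALU's logical operations come up again below, here is a minimal Python sketch of them; the particular value 172 is arbitrary.

```python
# One integer viewed in binary, octal and hexadecimal, plus the logical
# operations an ALU provides alongside addition.

n = 172
print(bin(n), oct(n), hex(n))   # 0b10101100 0o254 0xac

a, b = 0b1100, 0b1010
print(bin(a & b))   # AND  -> 0b1000
print(bin(a | b))   # OR   -> 0b1110
print(bin(a ^ b))   # XOR  -> 0b110
print(bin(a + b))   # the same unit also adds -> 0b10110
```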

I came to appreciate the conciseness of C, but I wrapped it in expressive labels so I knew what was going on. I wanted the language to become more like the one I used every day. I discounted how the machine actually worked, and how I made it behave, in order to mimic my everyday concepts.

I realise now that I need to turn that on its head!

Computers have successfully done mathematics in an entirely different way from the way I do. That means we have created concrete definitions of some concepts we struggle with in mathematics. The fundamental one is that of a variable. The second is that of a function. The third is that of a continuum, as in continuous functions. The fourth is that of algebraic manipulation. All of these have been successfully coded.
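
To make that claim concrete, here is a minimal Python sketch of how each concept acquires a coded counterpart; the names are mine, chosen only to illustrate the correspondence.

```python
x = 7                      # "variable": a named, addressable storage cell

def f(t):                  # "function": an explicit rule applied to an input
    return t * t + 1

# "continuum": a continuous curve handled as a finite sample of points
samples = [(t / 10, f(t / 10)) for t in range(0, 11)]

# "algebraic manipulation": a mechanical rule on symbols, here squaring
# the polynomial 1 + t given as its coefficient list [1, 1]
def square_poly(coeffs):
    out = [0] * (2 * len(coeffs) - 1)
    for i, a in enumerate(coeffs):
        for j, b in enumerate(coeffs):
            out[i + j] += a * b
    return out

print(square_poly([1, 1]))   # [1, 2, 1], that is 1 + 2t + t^2
```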

It was time to redefine mathematics. Norman Wildberger is the only public teacher I know of who has attempted to do this using modern, up-to-date computational insights.

From my perspective I had divested myself of many of the overarching concepts, but I had not realised how the fundamental mosaic concept could be, or has been, implemented in computer technology and culture. Polynumbers is the first time I have made the connection.

I have written before about the prevalence of the Arithmoi in this modern technological superstructure. The array has been a standard design icon since mosaics were developed. Bricks and mortar, blocks and pegs have all shaped how we have civilised the world. In contrast the Shunyasutras are more rounded, flowing and organic, but their fractal nature shows this Arithmoi structure in free form.

Before we could advance to computers we had to understand the polynomial arrays. Leibniz was among the first to use a lineal polynomial structure on a rod to perform geared calculations of an arithmetical nature, but of course the Hindus perfected the decimal polynomial calculus first.

This decimal calculus inspired Napier to form calculating arrays of rods, which were the first Hindu-Arabic polynumbers, but prior to both these developments Greek technologists were developing calculating gear ratios. Archimedes was among the most notable, but he was inspired by the clockmakers who sought to create accurate water clocks to count the hours of the sun, or to ratio the time or spatial position of the sun and moon.

Pendulums were thus first accurately timed by water clocks and found to be regular, or periodic, against the water clock. Thus they became useful for gearing time more accurately.

Among the Hindu treasures that Napier explored was Brahmagupta's wheel. By it Napier was able to use the sines that were being calculated to 20 or more digits to derive a proportion we now call the logarithm. The movement of this geared wheel in equal beat with a pendulum allowed Napier to compare two lengths: the sine of an arc against the length of the arc. The arc was so small that it could be approximated by a straight line.

There were no real numbers then, only fractions or ratios of integers. The arc was approximated by an n-gon, and each of its sides was used to count the proportion of the sines. The Arithmos was thus the array of sectors in this n-gon and the logos was the sine ratio. This sequence of polynomials made it possible to create a table, extracted from the sine tables, which turned multiplication into a look-up-and-add process.
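
To make the look-up-and-add idea concrete, here is a small Python sketch; it uses modern base-10 logarithms in place of Napier's sine-based construction, and the four-figure rounding mimics the later school tables.

```python
import math

# a small four-figure "table" of logarithms, indexed by the number itself
table = {n: round(math.log10(n), 4) for n in range(1, 1000)}
antilog = {v: n for n, v in table.items()}

a, b = 23, 37
log_sum = round(table[a] + table[b], 4)   # add instead of multiplying

# look the sum back up: the entry whose logarithm is nearest wins
product = min(antilog.items(), key=lambda kv: abs(kv[0] - log_sum))[1]
print(product, a * b)   # 851 851
```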

Much later, Cotes and Newton used the same kind of polynomial to calculate the ratio e, and Newton used it to calculate certain actuarial tables by a difference method. The Napierian logarithms followed this scheme for those in the actuarial professions, while those in the astronomical ones found the original Napierian logarithms, based on the sines, more relevant.

The base 10 logarithms were devised by Briggs for general calculation.

The Napierian logarithms were based on a difference calculus devised by Newton and studied by De Moivre, who went on to use them in his analysis of games of chance. The underlying sine tables were well understood by Cotes, De Moivre and Newton in a way that astonished their fellows in the Royal Society. Because of their cyclical nature, even though they were written out to tens of digits, De Moivre surmised that closed systems like card games must conform to some equation of these sine polynomials. In fact he found they conformed to the actuarial tables better, and derived a relation between the sines and the ratio e.

While this is all quite remarkable, we have to recall that these were exceptional men; calculating long answers was a particular joy to them, and setting out detailed differences hardly laborious! Thus they could see, further than anyone else, the very cycles of these Arithmoi, where others were floundering in the first few digits!

Eventually these long-digit tables had to be cut down to four digits for those who were less able!

We were taught, as children, to use these tables as look-up tables. We had no concept of the difference calculus or the polynomials of increasing degree that underpinned their construction. However, their layout was the next example of polynumbers in the historical record.
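
As a sketch of what that difference calculus does (the function name here is mine), take the cubes: their third forward differences are constant, so the whole table can be extended by additions alone rather than by recomputing each entry.

```python
def forward_differences(values):
    """Return the successive rows of forward differences of a table."""
    rows = [values]
    while len(rows[-1]) > 1:
        prev = rows[-1]
        rows.append([b - a for a, b in zip(prev, prev[1:])])
    return rows

vals = [x**3 for x in range(6)]       # 0, 1, 8, 27, 64, 125
for row in forward_differences(vals):
    print(row)
# the third differences are the constant 6; adding back up the diagonal
# gives the next cube, 216, without any multiplication.
```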

I am reminded that the Egyptians and the Babylonians possessed tables. While these were tables of squares and cubes, they were not polynomial per se.

Much of what came to be called function theory is based on these extant tables and their polynomial underpinnings. As calculators became possible with the new electronics, as opposed to the winding of gears in decimal polynomial form, the use of tables declined and the direct connection to function theory was lost. Real number theory also misled mathematicians into seeing the decimal expansion of rational numbers as somehow more accurate.

When more and more data points could be easily computed, it became fashionable to view certain functions as continuous, and to eschew the extrapolation and interpolation of former times as unnecessary. This is when we lost sight of polynumbers, and why mathematicians were so opposed to the early computers.

Early computers were fast but could only use integers. This was thought to be a severe limitation, and a reason to deride the advances in computation afforded by these fast machines. The idea that computers were error-prone developed, and their inaccuracy was highlighted. The intellectual snobbery of this view was hard to delineate when computer memory was so small and expensive, but in principle the use of integers was never a limitation. Computer scientists and mathematicians who were enthusiastic about computation and number theory went down the misguided route of creating floating point technology.

It was misguided computationally because it was more compute-intensive and more memory-expensive. But to overcome objections they had to be seen to be mathematically subservient. In fact these issues would disappear once more memory became cheaply available.

The acceptance of floating point arithmetic in computers is the next manifestation of polynumbers, if you consider the calculation, in subroutines, of many trig and log tables as equivalent to the tables themselves.

Were it not for the cost of memory, storing the tables as polynumbers or bipolynumbers in memory arrays would have been computationally faster in terms of look-up.

The floating point system required polynumber storage before assigning the point position by shifting the polynumber up or down in the register or ALU.
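
The toy class below shows that idea in decimal for readability; the name Float10 is made up, and real hardware uses binary IEEE formats rather than anything this simple.

```python
class Float10:
    """Toy decimal float: a stored digit string plus a power-of-ten shift."""
    def __init__(self, digits, shift):
        self.digits = digits      # the stored "polynumber" of digits
        self.shift = shift        # moves the point up or down

    def value(self):
        mantissa = int("".join(str(d) for d in self.digits))
        return mantissa * 10 ** self.shift

x = Float10([3, 1, 4, 1, 5], shift=-4)   # 31415 shifted down four places
print(x.value())                         # 3.1415 (up to binary rounding)
```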

As an aside, we could consider the binary, octal and hexadecimal systems as polynumber systems, which of course were only of interest to the electronic engineers who had to access registers or input data into them.
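
Read this way, a digit string is just a coefficient list and the number is that polynomial evaluated at the base; the little Horner-style sketch below (my own illustration) recovers the same value 172 used earlier from its binary, octal and hexadecimal digits.

```python
def eval_digits(digits, base):
    """Treat digits (most significant first) as a polynomial in the base."""
    total = 0
    for d in digits:
        total = total * base + d
    return total

print(eval_digits([1, 0, 1, 0, 1, 1, 0, 0], 2))   # 172 from binary
print(eval_digits([2, 5, 4], 8))                  # 172 from octal
print(eval_digits([10, 12], 16))                  # 172 from hex (0xac)
```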

Polynumbers have been with us for a while, but we have been led to demean them, whereas Newton, Wallis and Euler would all have quickly apprehended and mastered them.

It was Norman Wildberger, probably inspired by Wolfram's Mathematica, who recognised the foundational significance of the modern computational paradigm. He set out to rewrite all of mathematics from this standpoint.

http://en.wikipedia.org/wiki/Polynumbers.
This rather dry reference places polynumbers in a wider set of algebras and distinguishes them from Clifford algebras.

The basic insight of the hardware designers was that they could store data in specific locations in a mosaic. But to implement that they needed a simple algebra, and the logical choice was binary. Due to the liberation of algebra from analysis, both symbolic logic and symbolic arithmetic shared the same binary algebra. An ALU was thus, from the outset, much more than a Babbage difference engine or a Babbage polynomial. Both numerical and logical combinations could be "processed".

The rotational nature of that processing was crucial, and so was the shift along the register. The idea of a register comes from the Babbage difference engine and from proposals by Leibniz. There is a lady mathematician, Ada Lovelace, who essentially single-handedly promoted the Babbage machine arithmetic and laid the foundation for modern computing register behaviours.

So the binary polynomial became the foundational design for memory storage, addressing and logical and arithmetical manipulation. In the process the concept of the unknown, place-holding variable, the mathematician's x and y, was subsumed. It was subsumed into the general process of memory storage, access, processing and manipulation.

Computer technologists were not restricted by years of tradition; they were innovative and they were creative. They deconstructed the issue of what a variable is into concrete terms. A variable was an addressable mosaic!

In short, the Arithmoi were the fundamental concept behind a variable. But what had been downplayed and split off was the addressable nature of any position in a mosaic. In fact it was totally obscured by the Hindu-Arabic notation. The implementation of reference frames comes from the re-establishment of the mosaic as a fundamental object, called a graph plane or a 3-d space in Cartesian coordinates.

Again, Dedekind established the real numbers as a cut in a continuous magnitude, called a measuring line by Wallis, and a number line by 19th-century mathematicians responding to Berkeley's rant!

Once number took over as a concept, the notion of magnitude became arcane. In particular, algebra became divorced from so-called geometry, and it was very fashionable to belittle geometrical presentations in higher mathematics. There was a belief that logic could solve all problems, and logical positivism motivated mathematicians and scientists alike.

It was therefore not considered intellectually stimulating to consider the trivial or the elementary cases. It was also not considered proper to get your hands dirty making stuff. That was for those who were less able to engage in higher mathematical abstractions.

The electronic engineer was therefore almost on his own. Babbage, Turing and von Neumann were perhaps the few intellectual giants who considered the design and functioning of a universal coding machine. The impetus was war and the decoding of enemy codes. The mathematical representation of codes as polynomial strings or linear clock-arithmetic combinations led to the employment of probability, group and ring theoretic ideas, and of algorithms or programmed instructions.

A lot of this was top secret, so again few knew that a body of intellectual documents existed across these diverse fields which could guide the electronic engineer. It was therefore a small, highly specialised group of government-sponsored individuals who pioneered these speculative theories. It was only later, during various peacetime initiatives, that commercial attempts were made to further this kind of development.

The topic languished until German and American scientists could replace the thermionic valve with a crystal transistor. Then logic circuits could be built and ALU designs from the wartime literature implemented. The military played a key role in the development of these speculative technologies. Huge computing arrays were built, but they were temperamental (sometimes needing a kick to start working!) and unreliable; yet they were much faster than the mechanical analogues from which they were designed by an electronic engineer called Tommy Flowers.

OK, so the top secret was that mathematics, or more precisely the arithmetic of polynomials, could be done on these machines. There was no variable, just a set-up to store and shift data around in the store, and to move data in and out of the ALU to "process" it.

Essentially, data was now strings of polynumbers. The human processing was replaced by automatic machine processing, and the variables were replaced by addressable memories and stores of sequenced data.

Everything in the stores of a processing unit is the variable element of the processing system. But underpinning it all is the polynomial mosaic array.

The fundamental building block of this polynomial array is the dynamic memory block: the memory cell, the polynumber [0 1], is the first polynomial from which all other polynomial memory arrays can be built by register shifting. This is the alpha that Norman defines, and the register shifting gives the powers of alpha. This is also why alpha can suddenly take a definite data value: because alpha is a bit of the register for storing data!
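
Here is how I picture that in a few lines of Python; the coefficient-list representation is my own illustration of the [0 1] reading above, not Norman's code.

```python
def poly_mul(p, q):
    """Multiply two polynumbers given as coefficient lists (low power first)."""
    out = [0] * (len(p) + len(q) - 1)
    for i, a in enumerate(p):
        for j, b in enumerate(q):
            out[i + j] += a * b
    return out

def poly_eval(p, x):
    """Give alpha a definite value: evaluate the polynumber at x."""
    return sum(c * x**k for k, c in enumerate(p))

alpha = [0, 1]
p = [5, 3, 2]                 # 5 + 3*alpha + 2*alpha^2
print(poly_mul(p, alpha))     # [0, 5, 3, 2]: a pure shift up the register
print(poly_eval(p, 10))       # 235, when alpha is given the data value 10
```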

Calculus using polynumbers. Check it out!
http://youtu.be/DAHBgcDJQjw
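
As a taste of what such a calculus can look like (my own minimal sketch, not Norman's notation or code), the derivative of a polynumber is a purely arithmetical re-indexing of its coefficients, with no limits required.

```python
def poly_derivative(p):
    """Formal derivative of a polynumber (coefficients, low power first)."""
    return [k * c for k, c in enumerate(p)][1:]

p = [1, -3, 0, 2]             # 1 - 3*alpha + 2*alpha^3
print(poly_derivative(p))     # [-3, 0, 6], that is -3 + 6*alpha^2
```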
