Refraction, Reflection and Rotation of light and waves.

It was just pointed out to me that the laws of reflection are empirically based and well known. Since I have not known them beyond a childlike acceptance, it is time to explore.

http://en.wikipedia.org/wiki/Snell%27s_law
http://en.wikipedia.org/wiki/File:Snell_Law_of_Sines_1837.png
http://en.wikipedia.org/wiki/File:Ibn_Sahl_manuscript.jpg

So far the history seems to involve optical lenses and water in curved receptacles. A light ray is intimated to be refracted, and by varying the amount of refraction light rays are focused onto a point, giving burning. Light rays, for Newton, were corpuscular and received a motive in passing into materials; this could be accelerative or decelerative.
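For reference, the law linked above can be stated compactly. In modern notation, Snell's law relates the angles of incidence and refraction to the refractive indices of the two media:

\[ n_1 \sin\theta_1 = n_2 \sin\theta_2 . \]

Both pictures reproduce this law, but they disagree about the speed change: Newton's corpuscles must speed up on entering the denser medium, while waves must slow down, which is why the dispute was in principle empirically decidable.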

Huygens then demonstrates that wave fronts would have the same property.
http://en.wikipedia.org/wiki/Huygens–Fresnel_principle
The theorem of Pappus plays a role in the understanding of this phenomenon, but crucially it required Descartes to develop his Cartesian coordinate system using general coordinates; it was Wallis who later fixed the ordinate and coordinate at right angles.

Descartes used lines in fixed relationships, and fixed or invariant relationships of points on those lines. Thus he was setting out a function for a curve of points whose ordinates and coordinates lay akimbo; but because they were fixed, he could choose 2 and write all the others in terms of these 2. These 2 became known as the axes of the system.

This method became popular and was adapted for each situation. It was Wallis who arranged all the lines regularly and all the angles at right angles, so the lines lay akimbo in a regular array. Pappus' theorem thus underpins the straight lines of all graphed equations of degree 1!
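To make that last claim concrete: in Wallis' standardised arrangement, any fixed degree-1 relationship between the two chosen axes, say

\[ ax + by = c \quad\text{i.e.}\quad y = -\tfrac{a}{b}x + \tfrac{c}{b}\ \ (b \neq 0), \]

has constant slope, so its points all lie on one straight line.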

Secondly, Apollonius was considering a problem in projective geometry. Pappus showed this lineal coincidence for a fixed system of lines, Descartes showed how a coordinate system could be derived from any general lines in fixed relation in the plane, and Wallis showed how it could be standardised. Thus projective geometry underpins the geometry of the Cartesian plane.
Now we also know that these lines were considered as rays of light in optics, so the optical theories based on these rays are evident in the geometrical descriptions of focus and directrix. The parabola received its name because it resembled the burning mirror shape. By solving Apollonius' problem, curves could be found that focused lines onto a line or a single point.
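The focusing referred to here can be stated exactly. Taking the parabola with focus (0, f) and directrix y = -f,

\[ x^2 = 4fy, \]

every point of the curve is equidistant from focus and directrix, and any ray arriving parallel to the axis is reflected through the focus; this is exactly the property the burning mirror exploits.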

If such a curve was fashioned in mirrors, that is, by taking short segments of lines between the points of the solution curve and replacing them with mirrors, and revolving the curve as a solid of revolution, very powerful burning mirrors could be made with precise focal points.

The fact that this worked implied that the sun emitted light to any given point on the mirror in straight lines! Thus burning mirrors and other contraptions supported the view that light was emitted in straight lines. It was thought that it was emitted from a burning object as heat and light, but also from the eyes, enabling vision. It took some considerable time to realise that it is reflected or direct light that enters the eyes.

I do not think the ancients were ignorant of these facts; rather, those that followed interpreted their lines as emitted from the eyes rather than received at the eyes. No one much cared, because they felt optics was a closed subject and there was nothing new to be found. It took Isaac Newton and a few others to disabuse the intellectuals of this arrogant position.

In 1678, Huygens[1] proposed that every point to which a luminous disturbance reaches becomes a source of a spherical wave; the sum of these secondary waves determines the form of the wave at any subsequent time. He assumed that the secondary waves travelled only in the "forward" direction and it is not explained in the theory why this is the case. He was able to provide a qualitative explanation of linear and spherical wave propagation, and to derive the laws of reflection and refraction using this principle, but could not explain the deviations from rectilinear propagation that occur when light encounters edges, apertures and screens, commonly known as diffraction effects.[2]
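A minimal numerical sketch of the quoted construction, assuming only what the passage states; the geometry, spacing and source count are my own illustrative choices, not anything from Huygens:

```python
import numpy as np

# A minimal sketch of the Huygens-Fresnel construction (illustrative only):
# points along the line y = 0 act as secondary sources of spherical
# wavelets exp(i*k*r)/r, and the field beyond the line is their sum.
wavelength = 1.0
k = 2 * np.pi / wavelength                  # wavenumber
sources = np.linspace(-200.0, 200.0, 8001)  # wavelet source positions on y = 0

def field(x, y):
    """Complex amplitude at (x, y): superposition of all secondary wavelets."""
    r = np.hypot(x - sources, y)
    return np.sum(np.exp(1j * k * r) / r)

# Sampling along x = 0 at steps of one wavelength in y, the phase of the
# summed field is essentially constant: the wavelets rebuild a
# forward-travelling wavefront, just as the principle claims.
for y in (20.0, 21.0, 22.0):
    print(f"y = {y}: phase = {np.angle(field(0.0, y)):+.3f} rad")
```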

Huygens' theoretical model, based on hypothesis and more complicated than Descartes' pressure pulse, was argued against by Newton. Newton required empirical data, not clever supposition. It was not that he disagreed with the aether required for this to propagate; rather, he desired empirical evidence that the aether behaved in this way. The behaviour was specified by Huygens' theoretical model to fit the facts, but it failed to fit all the known empirical observations, even while it explained others that Newtonian corpuscularism failed to.

The addition of Fresnel's laws made Huygens' ideas a better fit, but some still argue with Huygens' hypothesis:
http://www.mathpages.com/home/kmath242/kmath242.htm
Kirchhoff's so-called mathematical model supposedly justifies Fresnel, but of course this is mathematical window dressing.

[Huygens' principle does not explain why the wave front] is the upper rather than the lower envelope of the secondary wavelets. Why does an expanding spherical wave continue to expand outward from its source, rather than re-converging inward back toward the source? Also, the principle originally stated by Huygens does not account for diffraction. Subsequently, Augustin Fresnel (1788-1827) elaborated on Huygens' Principle by stating that the amplitude of the wave at any given point equals the superposition of the amplitudes of all the secondary wavelets at that point (with the understanding that the wavelets have the same frequency as the original wave). The Huygens-Fresnel Principle is adequate to account for a wide range of optical phenomena, and it was later shown by Gustav Kirchhoff (1824-1887) how this principle can be deduced from Maxwell's equations. Nevertheless (and despite statements to the contrary in the literature), it does not actually resolve the question about "backward" propagation of waves, because Maxwell's equations themselves theoretically allow for advanced as well as retarded potentials. It's customary to simply discount the advanced waves as "unrealistic", and to treat the retarded wave as if it was the unique solution, although there have occasionally been interesting proposals, such as the Feynman-Wheeler theory, that make use of both solutions.
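For concreteness, the Fresnel elaboration just described is usually written, in its standard textbook form, as a superposition integral over a wavefront \(S\):

\[ U(P) = \frac{1}{i\lambda}\iint_S U(Q)\,\frac{e^{ikr}}{r}\,K(\chi)\,dS, \]

where \(r\) is the distance from the secondary source \(Q\) to the field point \(P\). Kirchhoff's derivation fixes the obliquity factor as \(K(\chi) = \tfrac{1}{2}(1+\cos\chi)\), which suppresses, but does not strictly forbid, the backward wave the passage complains about.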

What the wave corpuscle debate shows is that the reflection, refraction and diffraction of light are not what they seem! Certain laws and practices have arisen based on matching or constructing theoretical, or rather, as Newton put it, hypothetical models to empirical data. Newton's method was to start with the data and then, by degrees of complexity or scale, to develop the general model or law. He used proportional reasoning to do this, that is, Logos Analogos.

Newton was entitled to ask for the empirical data that could then be proportioned to derive the law. In the case of light there were beams of light, and these could be portioned into corpuscles of light. However, Huygens could not provide the waves of light. No one could, so his whole theory was an unempirical conjecture!
Have we found the waves of light in our modern science?

We certainly have empirical evidence of waves, and powerful predictions that confirm the analogue in aetheric terms, but we had not observed waves of light until recently, with the femto-speed cameras we have now constructed. However, it could be argued that we have constructed these waves through the wave assumptions inherent in the processing.

The difficulty is really what basis or criteria we use for evidence, empirical or otherwise. When we consider this question as philosophers have, we find that we are faced with tautological reasoning. Now this has been portrayed as a problem: circular reasoning is a supposed logical faux pas! However, to make this determination we have to adopt the unjustified convention that the laws of logic are immutable and independent of the utiliser. This is clearly another tautology, and one which some are unwilling to recognise.

The resolution is to accept tautology at the fundamental level of all our systems of logic and language. In so doing, one can then study the role of tautology in the development of concepts. What we find is that tautology is fundamental to the iterative development of concepts. No empirical data, as an experience, is free from tautology. We start at a given level, and as we construct our definitions we maintain a respectful consistency. However, when a new apprehension of the data is found, we have to rebuild our constructions starting at a different level. The old level is superseded by the new information based upon it!
As an example: if I define a meter as the length of a cord kept in a given secure place, it may be that I eventually find that humidity affects that length. But I only find that out by some other experiment using another material to transport that standard length. The other material behaves differently, and when the investigation is done it is found that the second material is impervious to humidity. Thus pragmatically we replace the cord with this new material.
However, that means we never really know what a meter is precisely. Today we define it by the distance light travels in a vacuum in a fixed fraction of a second, realised in practice by counting wavelengths of laser light emitted at a specified frequency. This rests on the agreed consensus that light speed in a vacuum is invariant.
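As a worked example of the counting involved, using round figures for a stabilised helium-neon laser (my choice of illustration, not part of the formal definition): with \(c = 299\,792\,458\) m/s fixed by convention and \(f \approx 473.6\) THz,

\[ \lambda = \frac{c}{f} \approx \frac{2.998\times10^{8}\ \text{m/s}}{4.736\times10^{14}\ \text{Hz}} \approx 6.33\times10^{-7}\ \text{m}, \]

so one meter corresponds to roughly \(1/\lambda \approx 1{,}580{,}000\) wavelengths to be counted interferometrically.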

We cannot see wavelengths, but we can use interference spectroscopy to count them over a fixed distance, involving reflection, pulse generation and invariant spectrographs. Spectroscopy is in fact a fundamental notion of measurement in physics, enshrining the wave concept in our standards. The tautology involved in arriving at this state of affairs was never questioned! It was merely hailed as technological progress!

We do ourselves damage when we allow tautology in one area but denounce it in another. We have to man up and recognise the important iterative role tautology plays in our apprehension of space.

In that case, what do I mean by empirical data? Ultimately tautology is involved in all data, in definition, sampling and recording alike. It makes no odds whether I hand off this task to an independent observer or a computer. The issue lies in the agreed consensus.

By empirical I mean that due and exacting consideration has been given to the sensors I use in experiencing events. I have agreed a model of how those sensors actually behave in the event, and I have agreed at what cut-off level a sensor will be deemed to have transmitted a safe signal.

This actually means that all data is sampled from an infinite variety of sensor states! Data is therefore a crude sample of all that is possibly happening. Further, instead of attending to the actual variation between sensor signals, I impose a crude formal curve or lineal relationship. This is so that crude proportions can be elicited. These crude proportions may then be examined and some best-fit simplified rule deduced.

The test of the rule is how accurately it adheres to and interpolates results. In addition, if it extrapolates results it is given a gold star and may be called a law of nature!
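A minimal sketch of this fitting procedure; the readings here are invented purely for illustration:

```python
import numpy as np

# Invented sensor readings: a "true" lineal response sampled crudely,
# with noise standing in for the unattended variation between signals.
rng = np.random.default_rng(0)
x = np.linspace(0.0, 10.0, 21)
y = 2.5 * x + 1.0 + rng.normal(scale=0.4, size=x.size)

# Impose a crude lineal (degree-1) relationship by least squares.
slope, intercept = np.polyfit(x, y, deg=1)

def rule(t):
    """The deduced best-fit simplified rule."""
    return slope * t + intercept

# Test the rule: interpolation within the sampled range...
print("interpolated at x = 5.5:", rule(5.5))
# ...and extrapolation beyond it -- the "gold star" test.
print("extrapolated at x = 15:", rule(15.0))
```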

Because of this tautological iteration, as instruments become more sensitive old laws will inevitably be superseded. However, we are emotionally wedded to the current laws and will not allow this natural process to occur. Instead of iterating to an evolved state, our laws cease to be relevant, stagnate and die. They may be propped up by patches and sticking plaster, but a dead corpse is a can of worms!

Empirical data is therefore a formal concept: doing the best we can to record experience as it is, in as direct a way as possible, involving only sensor processes as far as we understand them.
In a simplistic way our tools of measurement are our sensors, and so by analogy our biological sensors are to be treated as tools of measurement, as if free from subsequent processing. Computers have made this distinction as clear as possible. The materials on and around us have properties which vary and can be used to sense impinging environmental variations; thus they are used as sensor models in mechanical tools for measurement. We tautologically model our sensors on our biology, and our biological models are expounded in terms of our mechanical or electromechanical sensors!

Understanding this relationship between tautology and iteration to a more general model relies on the observation that all is dependent on what one subjectively accepts. Consequently I find it opportune to accept the most general formal principles as tools and the most basic signals as empirical data. I have determined over time that the most general tool is that derived from spherical or spheroidal geometry, or, as I prefer, Spaciometry. This general tool is in fact dynamic and requires observation of its fundamental demands.

This is not new; it is in fact the teaching of the Pythagorean school. My fundamental mosaic would be that which is popularly called Sacred Geometry.

Assuming that Huygens wished to apply this type of general tool, we can see that he did so effectively. But he did not give full consideration to all the properties of this general system.
We have seen that Apollonius and Pappus were at the heart of this subject, in that they were considering light rays as straight lines. But Euclid and Apollonius were also considering conics as curves naturally arising from these investigations. Thus for Huygens to concentrate solely on the sphere for his wave propagation theory was always going to be a first attempt!

Apollonius' analysis of the Conics was extremely sophisticated, and Fresnel is really using a more detailed analysis of Apollonius' theorems.

The complaint that his coefficients were arbitrary was therefore justified, because he was applying a general theory of spherical measurements to a specific example.

Kirchhoff's analytical method similarly relies fundamentally on Apollonius' theorems, but, being a fuller treatment of those theorems, the arbitrary constants are naturally identified. Apollonius identified them when he constructed the theory!

Now we can see that the wave corpuscle debate was really about 2 metrical systems, one easier to measure directly than the other. In fact the lineal system is always easier to measure once you have constructed ideal straight lines. However, you need the curvilinear system to construct the ideal straight lines in the first place!

I have derived these constructions in previous posts, but this too is not new. It was shown, in what is now called the Mohr–Mascheroni theorem, that every Euclidean compass-and-straightedge construction can be carried out by compass alone!

The question is: can we now make the empirical measurements that Huygens' theory requires to satisfy Newton?

I say yes we can. We can use photography, film and even radar mapping to make these measurements. We should now be able to derive Huygens' theory from empirical data!

However, the stricter empirical data of actual waves in a luminiferous aether we will only be able to measure by tautological means. But at this juncture we find that our measurement scheme needs to be iterated to a new level! Ivor Catt and his team have demonstrated what Heaviside suspected: that the signal that passes around 2 guide wires is a blob of energy, not a fluctuating sine wave! The question is: is the wave model fit for purpose?

The reason I am researching these fundamentals of optics is that certain behaviours now need to be known precisely at this level.
http://arxiv-web3.library.cornell.edu/pdf/1302.6287.pdf
This article gives a brief historical development that was not previously known to me.
