Physics (a term derived from the Latin physica, “nature,” which in turn derives from the ancient Greek τὰ φυσικά, tà physiká, “[the] natural things,” from φύσις [phýsis], both of common Indo-European origin) is the science of nature that studies matter, its fundamental constituents, its motion and behavior through space-time, and the related entities of energy and force. Physics is one of the natural sciences and one of the oldest academic disciplines; its main goal is to understand the behavior of the universe. Physics, chemistry, biology, and some branches of mathematics form the pure sciences, also called the fundamental sciences, and were historically part of natural philosophy.

They were created with the aim of studying natural phenomena, that is, all events that can be described and measured. This goal is achieved through the rigorous application of the scientific method, whose ultimate aim is to provide a simplified outline or model of the phenomenon being described: the set of physical principles and laws associated with a given class of observed phenomena defines a deductive, consistent, and relatively self-contained physical theory, typically constructed by induction from experiment.

The meaning and scope of physics have evolved over time, precisely because of its speculative and interpretive character. In the ancient world physics remained closely linked to philosophy, except for the causal interpretation of certain phenomena by the school of Alexandria. This changed only with the Renaissance, when the introduction of the experimental method marked the beginning of classical physics. As physics subsequently developed, it was subdivided into different branches corresponding to the many aspects under which natural phenomena manifest themselves: mechanics, thermology (the study of heat), acoustics, electrology (the study of electricity), optics, magnetism.

This subdivision, with its derived branches, is still followed to a first approximation and for practical reasons deriving from the close relationship with technology, but it no longer matches the picture opened up by the great mass of discoveries made in the 20th century. Thus modern physics was born, beginning with the rejection of the mechanism proper to classical physics, the introduction of new methods of experimental investigation and new mathematical models and, above all, two revolutionary theories: relativity and quantum mechanics.

Contemporary physics aims not so much to formulate specific laws as to study and understand the intimate structure of matter and to define the real connections and interdependencies between phenomena occurring at each level of organization of matter itself. The extreme specialization of research has led, on the one hand, to the definition of the fields of study of physics in the narrow sense (atomic physics, see atom; nuclear physics, see nucleus; molecular physics; solid-state physics, see solid; plasma physics, see plasma; space physics; quantum physics) and, on the other, to the emergence of interdisciplinary sciences (astrophysics, physical chemistry, biophysics, etc.), which analyze and interpret the interdependent relationships between the microcosm and the macrocosm, both understood as resulting from the intimate connection, arrangement, and functionality of essentially equal sets of infinitesimal entities.

Among the interdisciplinary sciences in which physics plays a leading role are medical physics (see below), which provides medicine with very powerful tools for diagnosis and treatment, and computational physics (see below), which on the other hand provides physics with a mathematical and computational tool so powerful that it constitutes a veritable new branch of physics alongside theoretical and experimental physics.

Historical background

The Greek Age

The beginning of physics, like that of the sciences in general, is traditionally located in the Greek cultural world of the 6th century BC, when some of the pre-Socratic schools attempted to explain natural phenomena through sensible primordial elements, namely earth, water, air, and fire (Thales, Anaximander, Anaximenes), or through abstract concepts such as the numbers of the Pythagorean school. This line of thought eventually led to the atomistic explanation of Leucippus and Democritus, the first great mechanistic conception, according to which all reality, including the soul, is material and composed of atoms moving in empty space. Owing to the particular social and cultural conditions of Greece, this conception was defeated in the 4th century BC by the great systems of first Plato and then Aristotle.

For Plato, the order found among things and in cycles is not the result of chance, but derives from an intelligence that seeks the common good, and the occurrence of each phenomenon thus fulfills a purpose. Plato identified the true principles of all things in geometric entities (planes, triangles, etc.), from the composition of which the basic constituents of objects are derived, namely regular polyhedra, each of which represents the elementary entity of a particular body and determines, by its own shape, its physical properties. Thus, fire would come from regular tetrahedra, air from octahedra, water from icosahedra, and earth from cubes. Aristotle, responding to Platonic thought, attempted to ground physics in empirical observation within a general world view. Common observation shows the regularity of celestial phenomena, the circularity of the motions of the stars around the earth, and their uniformity and continuity; it also shows that there are falling bodies (e.g., a stone) and rising bodies (smoke, fire, etc.) on the earth, and that the phenomena that take place there are generally neither uniform nor eternal. Hence the Aristotelian idea of two distinct worlds: the heavenly and the earthly. The celestial world, incorruptible and unchanging, consists of concentric spheres, each supporting a planet, bounded by the sphere of fixed stars; the motions of the spheres, considered natural and eternal, are determined by an immovable prime mover.

The terrestrial or sublunar world, stationary at the center of the universe, corruptible and mutable, is a mixture of various elements that tend to move toward their natural places, represented by the concentric spheres of earth, water, air, and fire. Such motions are considered natural, as opposed to violent motions, which prevent or divert objects from reaching those places. For Aristotle, any violent motion requires a force to sustain it, and its speed is directly proportional to the force and inversely proportional to the resistance of the medium in which it occurs. Since the resistance in a vacuum is zero, the velocity of a body there would have to be infinite; hence the exclusion of the vacuum from the Aristotelian system (horror vacui). The Aristotelian conception went unchallenged for centuries because of its coherence and its ability to justify Ptolemy's astronomical system and to provide a finalistic worldview acceptable to Christianity.

But physics had other moments in antiquity that were very important for later developments in modern times. It is enough to recall the names of Euclid and Archimedes (3rd century BC). The former was the author of a true treatise on geometrical optics, based on the hypothesis of the rectilinear propagation of light. The latter laid the foundations of statics and hydrostatics; he studied the problems of the equilibrium of bodies with great acuity, going so far as to enunciate the law of the lever, the concept of the center of gravity (barycenter), and the concept of the moment of a force with respect to a straight line; he also formulated the principle that bears his name and the concept of relative specific gravity. The mathematical approach, based on expertly performed experiments, makes Archimedes' work the highest point reached by ancient physics. However, economic and cultural reasons prevented the development of the methodological direction proposed by the school of Alexandria, of which Archimedes was the highest representative. Ancient society, partly because it was based on slave labor, did not favor those innovations in the field of technology that would have allowed the studies already begun to continue along this modern path.

The re-examination of Aristotelian physics from Leonardo to Galileo

The Aristotelian philosophical system and the physics that was an integral part of it remained the fundamental heritage of all medieval culture. However, some important contributions were made, in particular by the Arab Alhazen (11th century), who revived the study of optics, experimenting with spherical, cylindrical and conical mirrors and studying the phenomenon of refraction. At the same time, the discovery of the directional property of the magnetic needle brought the strange phenomena of magnetism to the attention of scholars. Later, old problems of Aristotelian physics that had remained unsolved resurfaced and took shape. Why does an arrow continue to move even though the force acting on it has ceased? The Paris and Oxford schools of physicists (14th century) began to look critically at Aristotle’s theory of motion. Later, achievements in the technical field raised problems in hydraulics, ballistics, and the strength of materials. New personalities emerged, linked to active and enterprising social classes.

L. da Vinci, with his numerous inventions and his studies in mechanics, hydraulics and hydrostatics (principle of communicating vessels), was the most distinguished representative of the Renaissance “engineers”. Besides these particular contributions, developed further by G. Benedetti, N. Tartaglia and G. Cardano, which made the dominant Aristotelian physics increasingly inadequate, some decisive events of the 16th century should be mentioned: first, the publication of the works of Archimedes, which made it possible to recover the technical and mathematical tradition of ancient physics; second, the revival of Platonism and Pythagoreanism and the renewed interest in the philosophy of nature with B. Telesio and G. Bruno; and finally, the introduction of the astronomical system of N. Copernicus, which, by assuming heliocentrism, not only stood in clear opposition to the Aristotelian cosmological system taken up and perfected by Ptolemy, but was also completely incompatible with Aristotelian physics.

With no alternative theory of motion available, the Copernican system was subject to serious objections. If the Earth rotates on its axis, why is no westward deflection of falling bodies observed? Why does a cannonball not travel a shorter distance when fired toward the east? In such a situation, Copernicus opened rather than closed a whole series of new problems, made even more complicated by the hypothesis of elliptical orbits introduced by J. Kepler to explain the observed irregularities in the motion of the planets. It was Galileo who tackled the most important of these problems and laid the first foundations of the new physics. The refinement and scientific use of the telescope, the systematic observation of sunspots, of the phases of Venus and of meteors, and the discovery of Jupiter's satellites led him to deny the existence of two distinctly separate worlds, one perishable and the other immutable. There is only one world and, contrary to Aristotle's claim, the laws derived from the study of earthly phenomena apply to the entire universe. There remained the problem of a new theory of motion, and Galileo realized that solving it required a rigorous critique of the everyday experience that seemed to legitimize Aristotle's conception. For Galileo, such experience cannot form the basis of science: it is too unstable and imprecise to lend itself to rigorous mathematical discourse.

The new science must therefore not deal with objects as they present themselves to our senses, but must distinguish clearly between their primary and secondary qualities. Only the former (shape, motion, and position) can be treated precisely and objectively by mathematics, while the latter (smell, taste) are entirely subjective. Taking up the study of motion on this new basis, Galileo decisively refuted the Aristotelian thesis of natural places and of the “lightness” or “heaviness” of bodies per se: all bodies are heavy, and whether they rise or fall depends only on their specific gravity relative to the surrounding medium. At the base of the new dynamics Galileo placed two fundamental principles: the principle of inertia and the principle of relativity. The first undermines the Aristotelian distinction between natural and violent motion. The second allows the same mathematical form of the mechanical laws to be preserved in all inertial systems by means of simple transformation formulas, called Galilean transformations, and strips of all value the Aristotelian objections to the motion of the Earth. Galileo's precise formulation of the law of falling bodies also contributed to this renewal of physics. However, a very important question remained: given the principle of inertia, what were the forces that held the planets in their elliptical orbits?
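The Galilean transformations mentioned above can be written compactly in their standard textbook form (a frame S′ moving at constant velocity v along the x-axis of a frame S; notation assumed here, not given in the source):

```latex
x' = x - vt, \qquad y' = y, \qquad z' = z, \qquad t' = t .
```

Velocities simply add (u′ = u − v), while accelerations are unchanged, so the laws of mechanics keep the same form in every inertial frame; this is exactly why, contrary to the Aristotelian objections, no experiment confined to the Earth can reveal its uniform motion.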

Newton and the Theory of Universal Gravitation

It was Isaac Newton who, building on Galileo's studies of falling bodies and C. Huygens' studies of centrifugal force, solved the problem with an ingenious hypothesis: the force that holds the planets around the Sun is the same force that makes a heavy body fall to the earth. The theory of universal gravitation, which launched the scientific revolution, was formulated by Newton in 1687 in his famous work Philosophiae naturalis principia mathematica. In this work, every body is reduced to a point (its barycenter) to which is attributed a mass, defined as the amount of matter the body possesses, regardless of its state of motion or rest. Newton placed all phenomena in absolute space and time in order to provide a solid foundation for the mathematical-mechanistic treatment of the new physics, in which the relationships between events are strictly causal-deterministic. Despite the mechanistic framework of the treatment, objections came from the Cartesians, who saw gravitational force acting at a distance as a magical or mysterious concept.
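Newton's identification of planetary attraction with terrestrial weight can be checked numerically. A minimal sketch, using the standard inverse-square law F = G m₁m₂/r² and modern values of the constants (the numbers below are assumed, not from the source): the same law that binds the planets yields the familiar acceleration of falling bodies at the Earth's surface.

```python
# Sketch: surface gravity from universal gravitation, g = G * M / R^2.
G = 6.674e-11        # gravitational constant, m^3 kg^-1 s^-2 (assumed value)
M_EARTH = 5.972e24   # mass of the Earth, kg (assumed value)
R_EARTH = 6.371e6    # mean radius of the Earth, m (assumed value)

g = G * M_EARTH / R_EARTH**2
print(f"g = {g:.2f} m/s^2")  # close to the measured 9.81 m/s^2
```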

Newton met such objections with a radical rejection, stating that the new science merely acknowledged the existence of such a force and gave its most precise mathematical formulation. Newton did much research on the nature of light, and in contrast to Huygens’ theory that light consists of waves propagating in the etheric medium that fills the universe, he argued that light consists of corpuscles of different colors, which, when thrown into space by luminous bodies, give rise to all the phenomena of reflection and refraction. Newton’s research conditioned, in its fundamental directions, subsequent developments in physics. In fact, in the eighteenth century, studies were carried out in two different directions. On the one hand, on the basis of the great advances made in mathematics, efforts were made to give Newtonian mechanics greater analytical rigor and completeness.

Important examples are J.L. Lagrange's Analytical Mechanics (1788), which completed the research begun by L. Euler and J. d'Alembert, and P.S. Laplace's Celestial Mechanics (1799-1825). On the other hand, there was an intensification of research in various fields of physics, favored by technical progress and by the prevalence of empirical-rationalist currents in philosophy. Thus the development of studies on electricity and magnetism (B. Franklin, A. Volta and L. Galvani), heat (A.L. Lavoisier and P.S. de Laplace), optics (the study of the achromatism of lenses) and acoustics (the problem of vibrating plates) led, in short, to an overall enrichment of the whole of physics. The prevailing tendency was to explain the new phenomena using the fundamental concepts of Newtonian mechanics (mass, velocity and acceleration of particles, forces proportional to the inverse square of distances, etc.).

19th century

Along this line of research, G. Green and S.D. Poisson, encouraged in the early 19th century by the successes of C.A. Coulomb's investigations of electric and magnetic forces, sought to give this field a rigorous mathematical treatment that would amount to its analytical reduction to mechanics. J. Fourier carried out the same kind of study for heat. But these intentions and hopes met with increasingly substantial obstacles. The birth of thermodynamics (S. Carnot), a science linked to research on heat engines and to the great development of industrialization, seemed to shift the whole issue in favor of the new concept of energy. If physical science could not be reduced to mechanics, it seemed possible to reduce it to the new principles of energy conservation, which could also underpin mechanics. But this tendency, too, encountered various difficulties.

The discovery of electromagnetic induction, electrolysis, the polarization of light, the study of gases, the Joule effect, and the photoelectric effect presented a picture too contradictory for the enterprise to be completed. Moreover, with the advent of the kinetic theory of gases and of statistical mechanics, which sought to explain the behavior of gases through the disordered motions of their constituent molecules, physics arrived at the idea of the probabilistic law (L. Boltzmann). In this context, the second principle of thermodynamics lost the character of an absolute natural law and took on that of a highly probable statistical law. Although most physicists at the time believed it possible to bring statistical laws back into the realm of ordinary mechanical laws, the principle of deterministic causality was for the first time seriously questioned.

The ultimate crisis of mechanism, i.e. its inability to provide acceptable interpretations of reality, became more and more evident with the introduction of new theories in individual areas of physics, such as: A. Fresnel's wave theory of light, based on the hypothesis that light waves are transverse (i.e., that they vibrate perpendicular to the direction of propagation); A.M. Ampère's mathematical theory of electrodynamics; and the concept of the line of force, with which M. Faraday succeeded in constructing an important model for dielectrics.

An attempt at a partial synthesis of these various results was made by J.C. Maxwell, who, by critically revisiting Faraday’s work and reconsidering the problem of dielectrics from a mathematical point of view, arrived in 1873 at a unified theory of optics, electricity and magnetism based on the concept of field and condensed into four very famous equations. Maxwell’s unification of electromagnetism, while a remarkable theoretical achievement, left open major problems whose solution would bring about a radical theoretical change in physics. In the last decades of the 19th century, critical analysis of the foundations of Newtonian mechanics also intensified, and the notion that the fundamental principles and primitive concepts of mechanics should form the ultimate basis of all physics was questioned. In addition to a number of studies by H. Hertz, the fundamental epistemological works of E. Mach, J.H. Poincaré, and P. Duhem are in this vein.

Within the new Maxwellian physics, the first problem physicists faced was that of inertial systems. In Newtonian mechanics there is an infinite class of reference systems (more precisely, inertial systems) for which the laws of mechanics remain invariant, that is, they do not change their formal expression under Galilean coordinate transformations. Maxwell's equations do not have this property: they are not invariant under Galilean transformations. H.A. Lorentz proposed to resolve this serious difficulty by keeping the classical space-time framework intact and introducing some conceptual novelties (local time, contraction, etc.) that could be interpreted physically within that framework.

Einstein and the foundations of relativistic physics in the 20th century

A. Einstein, however, in a 1905 memoir, addressed the problem by revolutionizing the entire framework of classical Newtonian mechanics. Influenced by epistemological criticism, especially Mach's, Einstein redefined the concept of simultaneity of events on the basis of actually performable measurement operations. This led to the rejection of the idea of absolute time introduced by Newton and the adoption of the concept of time relative to a reference system. At the foundation of mechanics Einstein placed two new fundamental principles, the principle of relativity and the principle of the constancy of the speed of light (the latter in clear contradiction with Galilean-Newtonian mechanics), from which he mathematically derived the Lorentz transformation formulas and other surprising results.
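The Lorentz transformation formulas that follow from Einstein's two postulates take the standard form (frame S′ moving at velocity v along the x-axis; γ is the Lorentz factor; textbook notation assumed here):

```latex
x' = \gamma\,(x - vt), \qquad t' = \gamma\left(t - \frac{vx}{c^{2}}\right),
\qquad \gamma = \frac{1}{\sqrt{1 - v^{2}/c^{2}}} .
```

For v much smaller than c, γ approaches 1 and these reduce to the Galilean transformations, which is why Newtonian mechanics works so well at everyday speeds.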

These included the variability of mass with velocity (verified between 1909 and 1919 by studying the electrons emitted in β decay); the slowing down of time in moving “clocks”, i.e., in physical systems that can in some way provide a measure of time (verified by the lengthening of the mean lifetime of particles in relativistic motion, that is, at speeds close to that of light, compared with that of the same particles at rest: measurements on mesons observed in cosmic rays were used for this verification); and the substantial identity of mass and energy, expressed in the famous formula E=mc2, where c is the speed of light. The first experimental verification of this last result was made by J. Cockcroft and E. Walton in 1932 in a nuclear reaction: bombarding 7Li nuclei with protons (1H, hydrogen nuclei), they obtained two α particles (helium nuclei, 4He); comparing the mass of the incident particles with that of the reaction products revealed a mass defect Δm corresponding to an energy Δmc2 = 17.35 MeV. This result, which a few decades later became the foundation of nuclear physics, was tragically confirmed by the explosion of the atomic bomb at Hiroshima.
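The Cockcroft-Walton figure can be checked arithmetically from E=mc². A sketch using modern atomic masses in unified mass units (the numerical values below are assumed, not from the source): the mass defect of the reaction 7Li + 1H → 2 4He reproduces the quoted 17.35 MeV.

```python
# Sketch: mass defect of the reaction 7Li + 1H -> 2 4He, converted via E = m c^2.
# Atomic masses in unified atomic mass units u (assumed modern values).
M_LI7 = 7.016003   # 7Li
M_H1  = 1.007825   # 1H
M_HE4 = 4.002602   # 4He

U_TO_MEV = 931.494  # energy equivalent of 1 u in MeV (assumed value)

delta_m = (M_LI7 + M_H1) - 2 * M_HE4  # mass defect, in u
q_value = delta_m * U_TO_MEV          # released energy, in MeV
print(f"Q = {q_value:.2f} MeV")       # matches the 17.35 MeV quoted above
```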

These deep and extensive studies led Einstein to formulate, in 1916, the theory of general relativity, in which he established the fundamental principle that the laws of physics should apply to any system in any state of motion, that is, they should be invariant not only under Lorentz transformations (special relativity), but under any transformation. This led to a denial of the three-dimensional Euclidean character of space and to a close connection between the type of geometry space assumes and the particular distribution of masses within it. It also became necessary to direct studies toward deeper mathematical investigation in order to develop a reasonably satisfactory synthetic theory of gravitation, fields, and relativity. Such investigations, while remaining within the causal-deterministic framework, involved a profound reconsideration of the foundations on which classical physics rested, and contributed greatly to the establishment of modern scientific consciousness, which also developed in relation to attempts to explain the new phenomena found at the atomic scale.

General relativity was fundamental to the development of modern cosmology and astrophysics. In the 1960s and 1970s, the general properties of space-time as described by the field equations of general relativity were thoroughly investigated by R. Penrose, S. Hawking, and G. Ellis, who developed innovative mathematical techniques in an attempt to unify general relativity with quantum theories. The results of these investigations have made it possible to study and understand, for example, black holes and the conditions under which the universe might have been found in the first moments after its formation according to the big bang theory.

Max Planck’s quantum

In the last decades of the 19th century, the study of electrons, the discovery of X-rays and of natural radioactivity, and the photoelectric effect had opened up to physicists a new field of phenomena that could not easily be interpreted in the light of Maxwell's electromagnetic theory. Subsequent studies of blackbody radiation, and the attempts to trace it back to electromagnetism or thermodynamics, represented a critical point at this stage of research, which was overcome in 1900 with the introduction of the new quantum physics by M. Planck. To explain blackbody radiation, Planck introduced a constant, h, whose physical effect was to transform energy-exchange processes, previously considered continuous and treated with the integral calculus, into sums of discrete packets (quanta).
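Planck's hypothesis can be stated in the standard form (notation assumed here): a material oscillator of frequency ν exchanges energy with the radiation field only in integer multiples of a minimal quantum,

```latex
E = n\,h\nu, \qquad n = 1, 2, 3, \dots, \qquad
h \approx 6.626 \times 10^{-34}\ \mathrm{J\,s} .
```

The smallness of h is what hides the discreteness at everyday scales, where the packets are far too fine to distinguish from a continuum.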

This hypothesis caused great perplexity among physicists, which only grew when, in 1905, Einstein extended the use of this constant to all radiation phenomena with the notion of photons, or light quanta, and to the emission of electrons in the photoelectric effect. The constant finally also appeared in the atomic model proposed by N. Bohr in 1913, a model which, despite its success (it explained the series of emission lines of the hydrogen atom excellently, and was later confirmed in a famous experiment by J. Franck and G.L. Hertz), contains assumptions very difficult to reconcile with classical mechanics and electromagnetism (according to those laws, for example, an electric charge in accelerated motion emits radiation with a continuous spectrum and, losing energy, would describe a spiral until it collapsed onto the nucleus around which it orbits).
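Bohr's model accounts for the hydrogen emission lines mentioned above through the energy levels E_n = -13.6 eV / n², with each line corresponding to a jump between two levels. A minimal illustrative sketch (constant values assumed, not from the source) computing the first Balmer line, the red Hα line:

```python
# Sketch: hydrogen line wavelengths from Bohr's levels E_n = -13.6 eV / n^2.
RYDBERG_EV = 13.605693  # hydrogen ground-state binding energy, eV (assumed value)
HC_EV_NM = 1239.841     # h * c in eV * nm (assumed value)

def line_wavelength_nm(n_upper: int, n_lower: int) -> float:
    """Wavelength of the photon emitted in the n_upper -> n_lower transition."""
    delta_e = RYDBERG_EV * (1 / n_lower**2 - 1 / n_upper**2)  # photon energy, eV
    return HC_EV_NM / delta_e

# First Balmer line (H-alpha, n = 3 -> 2): about 656 nm, in the red.
print(f"H-alpha: {line_wavelength_nm(3, 2):.0f} nm")
```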

Birth of quantum mechanics

All this made it necessary to construct a new mechanics, a task approached in the 1920s starting from the existence of photons, by then confirmed by the discovery of the Compton effect and the Raman effect, and from the need to resort to wave theory to account for the frequency that appears in the definition of the photon and in the explanation of interference phenomena. Added to this, on the atomic scale, Bohr's model suggested that wave-corpuscle dualism should also be introduced for electrons and the other elementary constituents of matter. To overcome this difficulty, the new mechanics therefore had to account for this dualism. Here again, as in the case of relativity, the diversity of proposals involved a precise judgment on classical physics and its methodology.

L. de Broglie argued that wave-corpuscle dualism is an undeniable fact: in order to save the classical framework of explanation, it is necessary to take up Fresnel’s wave theory, compare its equations with those of analytic mechanics in the formulation given by W. Hamilton, and study those solutions (waves) that present singularities of a certain type (corpuscles). Mathematical difficulties, which were insurmountable at the time, however, forced de Broglie and his supporters to declare their proposed path impracticable and to accept, albeit reluctantly, at the Solvay congress in 1927, quantum mechanics founded and supported by the Copenhagen group and in particular by N. Bohr and W. Heisenberg, for whom, on the contrary, only a definitive renunciation of traditional representations could allow the full explanation of atomic laws. Classical physics thus underwent a change that can be compared to the scientific revolution of Galilei and Newton.

Deterministic causality, the mainstay of earlier physical theories, was set aside; probability took on new meanings; the concept and the instrument of measurement underwent radical changes of meaning; and the mathematics-nature-theory relationship was completely revised. In quantum mechanics, the experimental data of atomic physics (the transition frequencies) are directly encapsulated in mathematical tables (the matrices), which obey computational rules implying, among other things, the noncommutativity of their product. Taking as the basis of the theory an assumption about the difference between the two products of the matrices representing the momenta and the corresponding coordinates of the particles, quantum mechanics arrives at the very important uncertainty (indeterminacy) principle: the characteristic quantities of particles admit, in principle, a limit on the simultaneous precision of their measurement.
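The noncommutativity described above is expressed by the canonical commutation relation between position and momentum, from which the uncertainty principle follows (standard textbook form, notation assumed here):

```latex
[\hat{x}, \hat{p}] = \hat{x}\hat{p} - \hat{p}\hat{x} = i\hbar,
\qquad
\Delta x \,\Delta p \;\geq\; \frac{\hbar}{2} .
```

Because the product of the two matrices depends on their order, no state can assign sharp values to both quantities at once; the bound ħ/2 is the quantitative limit on their simultaneous precision.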

According to Bohr, these quantities are “complementary” in the sense that, contrary to the intuitive notion of a complete image of a particle, they cannot be used simultaneously: each excludes the other. W. Pauli formulated the exclusion principle, according to which only a single electron can occupy a given quantum state. According to quantum mechanics, each electron can be described by a set of quantum numbers, which define its orbital or energy state (its shell), its angular momentum, and the orientation of that momentum in a magnetic field: a multiplicity of electronic states can exist in each orbital, each identified by a set of quantum numbers different from all the others. Quantum mechanics, supplemented by Pauli's exclusion principle, could thus explain the periodicity of the table of the elements and the recurrence of elements with similar chemical behavior.

The successes achieved by the new mechanics facilitated its acceptance among physicists, although the depth of the break made with classical physics aroused strong perplexity and aversion in physicists such as Einstein and E. Schrödinger, who were unwilling to accept all the many consequences of the new theory (foremost among them, that of undermining the principle of causality underlying determinism). The debate continued in the following years, and both the difficulties encountered by quantum mechanics in controlling and explaining the new world of elementary particles and the difficulty of making a synthesis between its theoretical framework and that of general relativity led to the revival and reconsideration of de Broglie’s old proposal.

In 1926 E. Schrödinger gave a formulation of quantum mechanics in terms of a wave function obeying a partial differential equation, the Schrödinger equation, building on the ideas of de Broglie, who had proposed a representation of particles in terms of waves (with each particle of momentum p, de Broglie associated a wavelength λ = h/p). Experimental confirmation of the validity of this description came from C.J. Davisson and L.H. Germer in an electron diffraction experiment on a nickel crystal lattice, where fringes quite similar to those produced by the diffraction of a light wave were observed.
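De Broglie's relation λ = h/p can be checked against the Davisson-Germer setting: for electrons accelerated through a few tens of volts, λ turns out to be comparable to the atomic spacing of a nickel lattice, which is why diffraction fringes appear at all. A sketch with assumed constant values; 54 eV is the electron energy used in the historical experiment.

```python
import math

# Sketch: de Broglie wavelength lambda = h / p for an electron of kinetic
# energy E, using the non-relativistic relation p = sqrt(2 * m * E).
H = 6.626e-34         # Planck constant, J*s (assumed value)
M_E = 9.109e-31       # electron mass, kg (assumed value)
EV_TO_J = 1.602e-19   # joules per electronvolt (assumed value)

def de_broglie_nm(energy_ev: float) -> float:
    p = math.sqrt(2 * M_E * energy_ev * EV_TO_J)  # momentum, kg*m/s
    return H / p * 1e9                            # wavelength, nm

# 54 eV electrons, as in the Davisson-Germer experiment: about 0.167 nm,
# the same order as the nickel lattice spacing.
print(f"lambda = {de_broglie_nm(54):.3f} nm")
```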

The commonly accepted interpretation of the Schrödinger wave function was given by M. Born, who interpreted the square of the modulus of the wave function as the probability of finding the particle it describes in a given state (i.e., at a given position, with a given momentum, etc.). In 1927, W. Pauli also included in Schrödinger's formulation the dependence on electron spin, the intrinsic angular momentum (i.e., not related to its motion in space, unlike orbital angular momentum) discovered experimentally by G. Uhlenbeck and S. Goudsmit in 1925.

Another, more general formulation of the new quantum mechanics was proposed in 1926 by P.A.M. Dirac, who went on to propose a relativistic wave equation for the electron, which was very successful and widely used in the following years. Much of the success of Dirac's equation comes from the fact that, once the constraints of special relativity are required to be satisfied, spin enters the formulation naturally. A peculiarity of the Dirac equation is that for each electron state of positive energy it admits a “companion” state of negative energy.

Negative-energy states of a free particle make no physical sense, and the interpretation eventually given (after several conflicting attempts) was that they correspond to particles with electric charge of opposite sign to the electron's. For several years this interpretation of Dirac's equation remained controversial, until the antiparticle of the electron, the positron, was discovered in cosmic rays by C.D. Anderson in 1932-33. This success showed the maturity and depth of understanding that quantum theory had reached: it could predict the existence of particles not yet observed.

Quantum theory was thus applied to a great many fields of physics, providing explanations for effects that classical physics had left unexplained, such as the fine structure of the hydrogen spectrum, the binding energy of the simplest molecules, and electrical and thermal conductivity in metals (explained using the concept of free electrons in metals, subject to the Pauli exclusion principle). The systematic study of the interaction between radiation and matter led to a quantum treatment of the electromagnetic field: J.S. Schwinger and R. Feynman (1948), and earlier S. Tomonaga (1943), independently constructed a relativistic theory of quantum electrodynamics (QED), with which it became possible to account for very fine effects due to the interaction of the electromagnetic field generated by the electron with the electron itself. One is the so-called Lamb shift (named after its discoverer, W.E. Lamb) of some lines in the spectrum of the hydrogen atom, explained by H.A. Bethe (1947); another is the anomaly (with respect to Dirac's theory) in the value of the electron's magnetic moment, measured by P. Kusch and H.M. Foley and explained by Schwinger in 1949.
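Schwinger's 1949 explanation of the magnetic-moment anomaly gives, to first order in QED, a = (g - 2)/2 = α/2π. A quick numerical sketch (the value of the fine-structure constant is assumed, not from the source):

```python
import math

# Sketch: Schwinger's leading-order QED correction to the electron
# magnetic moment, a_e = (g - 2) / 2 = alpha / (2 * pi).
ALPHA = 1 / 137.035999  # fine-structure constant (assumed value)

a_e = ALPHA / (2 * math.pi)
print(f"a_e ~ {a_e:.7f}")  # close to the measured anomaly, ~0.0011597
```

The leading term alone already lands within about 0.2% of the measured anomaly; higher-order QED corrections account for the remaining difference.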
