Abstract:
The role of models in the development of physics is considered. It is shown that all physical theories are based on models: idealized constructions that reproduce parts of reality in a simplified form. The development of physics is presented as a succession of models.

Keywords:
model, physics, physical theory, objective reality, method, analogy, symmetry

At the beginning of the XXI century it is quite obvious that physics, as the science that studies the most general laws of nature, is not only the leader of natural science and the scientific basis of most modern technologies, but also one of the most important elements of the culture of society. The general cultural significance of physics is due primarily to the fact that its achievements form the foundation of the modern natural-science worldview and the basic scientific ideas of mankind about the surrounding world. Physics is closely related to technology: it is difficult today to find a branch of technology that did not grow out of physics. The entire history of technology is a gallery of theoretical and experimental physical discoveries brilliantly realized in engineering structures and technologies. Physics has become a direct productive force of human society. Many physical effects and phenomena that at first glance are very far from technical applications become the basis of new branches of technology and ensure further technical progress. That is why physics is the basis of the fundamental training of engineers and bachelors in technical fields [1].

In the process of teaching physics at a university it is important not only to impart to the student the knowledge necessary for further professional activity, but also to form a natural-science worldview. For this it is necessary to pay serious attention to methodological questions and to the genesis of physical knowledge. Neglect of these questions leads to gaps in the theoretical knowledge and worldview of graduates: an unformed understanding of the physical picture of the world and of the limits of applicability of physical theories, which later interferes with the correct explanation of the physical phenomena of the surrounding world and with the successful application of physical methods of cognition in the development of other sciences. It is very important in the learning process to focus students' attention on the model character of theoretical knowledge and of the process of cognition in physical science, and to form an idea of physics as a combination of a large number of different models reflecting reality to one degree or another. Theoretical knowledge is of a model nature; it cannot be given the status of complete adequacy to physical reality.

Physical theories operate not with real objects but with their idealizations: ideal models that abstract from some real but secondary aspects of objects and therefore give an incomplete picture of reality (the ideal gas, the ideal liquid, etc.). Many researchers have emphasized the model character of our scientific ideas about the natural world. Thus, in the article "The Role of Models in Science" A. Rosenblueth and N. Wiener wrote: "No part of the universe is so simple that it could be understood and controlled without abstraction. Abstraction is the replacement of the part of the universe under consideration by a model of similar but simpler structure. Thus the construction of formal or ideal ('mental') models, on the one hand, and of material models, on the other, necessarily occupies a central place in the procedure of any scientific research" (quoted in [2, 24]). In that article all of science is interpreted as a process of modeling, and the development of science as the construction of ever more accurate models. In the natural sciences a model in the broad sense is understood as "a mentally or practically created structure that reproduces one or another part of reality in a simplified (schematized or idealized) and visual form" [3, 8]. Ideal objects (models), in contrast to real ones, are characterized not by an infinite but by a quite definite number of properties. A model is an abstract reflection of reality in some form, designed to represent certain aspects of this reality, and it allows one to obtain answers to some of the questions under study and new information about this reality. Models are a necessary element of scientific knowledge and an important component of the changing natural-science picture of the world.

Acting as an idealization of reality, a model can change over time; that is, the degree of idealization and simplification of reality changes. From the initial ideal objects a theoretical model of a given concrete phenomenon is built, and it is assumed that this model, in its essential aspects and in certain respects, corresponds to reality. As a result, a theory that describes the properties of ideal objects, the relations between them, and the properties of constructions formed from the primary ideal objects is able to describe the variety of data that a researcher encounters at the empirical level. In theory a scientist deals with an intellectually controlled object, while at the empirical level he deals with a real object possessing an infinitely large number of properties. A scientific theory built on a system of abstract models is not a direct but an idealized reflection of reality. "The concepts and statements of a theory in the strict sense of the word describe not the properties and relations of real phenomena or systems, but the features of the behavior of an idealized scheme, or conceptual model, constructed as a result of the study of a particular real system" [4, 14]. From the point of view of model representations, the content of a theory is the description of its model [5, 34]. The model, therefore, is one of the forms of cognition, a specific means by which man reflects the material world.

Physical science is a collection of a large number of different models that reflect reality to one degree or another. "Physical theory is usually based on a few selected experimental results... These results are at first greatly simplified and systematically arranged into a model, which may contain many additional human inventions originating in the scientist's imagination rather than in direct experimental facts. Proceeding from this, a theoretical system is built, not too accurately and not too logically, and predictions are made" [6, 65-66]. Thus the model, as an element of the scientific picture of the world, also contains an element of fantasy, being a product of the scientist's creative imagination.

Let us consider the main stages in the development of physics from the point of view of model concepts. Models as a means and form of scientific knowledge appeared in antiquity (the continual model of Aristotle, the ideas of Democritus and Epicurus about atoms, the geocentric model of the world of Ptolemy), though these early models were purely speculative. Models were widely used in the creation of classical mechanics. Galileo used idealized mental systems as logical and methodological devices (for example, the motion of a body along an inclined plane without friction), which allowed him to discover the law of inertia. He understood the role of the mental model as a kind of abstract reproduction of the object under study. Models were also widely used by I. Newton, together with such logical devices as abstraction and idealization (absolute space and time, the material point, etc.). Thus, the material points with which mechanics deals have only mass and the ability to exist in space and time. A theory specifies not only ideal objects but also the relations between them, which are described by laws. Derived objects (for example, a system of material points) can be constructed in the theory from the primary ideal objects. In classical mechanics the following abstractions acquired the status of fundamental models: the material point and the system of material points, the absolutely rigid body, the elastic or plastic medium, the ideal and viscous fluids, etc. [7, 43]. These models were later widely used as basic ones.

The successes of classical mechanics gave rise to a tendency to regard all physical phenomena as mechanical, to reduce them to the mechanical motion of particles or their combinations. The greatest physicists of past centuries argued that they did not understand a phenomenon until they had built a mechanical model of it. After Faraday's discovery of electromagnetic induction in the XIX century, attempts were made to explain electromagnetic phenomena on the basis of mechanical models (W. Thomson, H. Lorentz, J.C. Maxwell and others). W. Thomson wrote: "I never feel satisfied until I can build a mechanical model of the thing being studied. If I can build a mechanical model of it, I understand it. As long as I cannot build a mechanical model of it, I do not understand it; that is why I do not understand the electromagnetic theory" (quoted in [3, 40]). Attempts to build a mechanical model of electromagnetic phenomena led to the idea of a special hypothetical medium, the ether, endowed with mutually exclusive mechanical properties. J.C. Maxwell made extensive use of mechanical analogies and of models built on their basis. To create his theory, later called electromagnetic, Maxwell used the mechanical model of field lines he had devised and the vortex model of the magnetic field [8]; he thereby solved the problem of constructing a mechanical model of non-mechanical phenomena. Maxwell's model is based on the concept of vortex motions of an ideal fluid, which represented the world ether filling space. In his work "On Physical Lines of Force" he builds a mechanical model of electromagnetism from hexagonal magnetic vortices, between which electrically conducting spheres are placed in the form of friction rollers. This speculative model provided a reliable representation of the laws of electricity and magnetism known at that time.
Using his model, Maxwell generalized the previously discovered experimental laws of Coulomb, Biot-Savart-Laplace and Ampère, together with Faraday's phenomenon of electromagnetic induction, and derived the famous equations that now bear his name. Maxwell's system of equations fully describes electromagnetic phenomena and is a logically harmonious and complete theory, comparable to Newtonian mechanics. Very important predictions followed from Maxwell's equations about the independent existence of an electromagnetic field not "tied" to charges, i.e. that the field can exist and propagate in space on its own. The analogy between Maxwell's equations describing the electromagnetic field and the wave equation made it possible to predict the existence of electromagnetic waves.
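This prediction can be made explicit. In empty space (no charges or currents) Maxwell's equations combine into a wave equation; the rendering below uses modern SI notation, not the form Maxwell himself wrote:

```latex
% Maxwell's equations in vacuum (no charges or currents):
\nabla \cdot \mathbf{E} = 0, \qquad
\nabla \cdot \mathbf{B} = 0, \qquad
\nabla \times \mathbf{E} = -\frac{\partial \mathbf{B}}{\partial t}, \qquad
\nabla \times \mathbf{B} = \mu_0 \varepsilon_0 \frac{\partial \mathbf{E}}{\partial t}.
% Taking the curl of the third equation and substituting the fourth gives
\nabla^2 \mathbf{E} = \mu_0 \varepsilon_0 \frac{\partial^2 \mathbf{E}}{\partial t^2},
% a wave equation whose propagation speed coincides with the speed of light:
c = \frac{1}{\sqrt{\mu_0 \varepsilon_0}} \approx 3 \times 10^{8}\ \text{m/s}.
```

It was this coincidence of the calculated propagation speed with the measured speed of light that identified light itself as an electromagnetic wave.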

The "catastrophe" of mechanical models is associated with the penetration of physics into the world of electromagnetic phenomena, with the concept of the electromagnetic field and other new concepts. "Being at first only an auxiliary model, the field becomes more and more real... The attribution of energy to the field is a further step in this development, in which the concept of the field becomes more and more essential, while the substantial concepts inherent in the mechanistic point of view increasingly recede into the background" [9, 445-446]. The special theory of relativity did away with the hypothesis of the world ether.

There are many phenomena that differ in their physical nature but exhibit the same features and regularities. In such cases models are often built by analogy. The method of analogy was widely used to create models of physical phenomena and the theories based on them [10]. Newton's idea of the analogy between the motion of celestial bodies (the Moon around the Earth) and the motion of bodies thrown on the Earth (the fall of an apple), of the similarity of the Moon's acceleration to the gravitational acceleration on Earth, led him to the creation of the theory of gravitation. The analogy with the flow of a liquid in a pipe played an important role in the creation of the theory of electric current: a hydrodynamic model of the current was created. The current was considered as the flow of an electric fluid through a conductor, similar to the flow of a fluid in a pipe, and by analogy with hydraulic resistance the concept of electrical resistance was introduced. G.S. Ohm used this analogy and, transferring the laws of hydrodynamics to electricity, formulated the now well-known Ohm's law. A mechanical analogy was also used by Maxwell in deriving the law of the distribution of gas molecules over velocities. There are other examples of the successful application of analogy as a method of scientific cognition for creating models. A. Einstein widely used analogy in his works; the main ideas that led him to the special theory of relativity are based on it. By analogy, Galileo's principle of relativity was extended to electromagnetic phenomena. The postulate of the independence of the speed of light from the speed of the source was put forward by Einstein by analogy with the independence of the speed of sound in a medium from the speed of the source. The general theory of relativity is based on the principle of equivalence of inertial and gravitational forces, formulated by analogy with the principle of equivalence of inertial and gravitational masses.
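The hydrodynamic analogy behind Ohm's law can be displayed term by term. The correspondence below is a standard modern illustration using Poiseuille's law for laminar flow in a pipe, not Ohm's own derivation:

```latex
% Poiseuille's law for a pipe of radius r and length L, viscosity \eta:
Q = \frac{\pi r^{4}}{8 \eta L}\,\Delta p
\qquad\Longleftrightarrow\qquad
I = \frac{U}{R} \quad \text{(Ohm's law)}.
% Correspondence: pressure difference \Delta p \leftrightarrow voltage U,
% volume flow rate Q \leftrightarrow current I, and
R_{\text{hyd}} = \frac{8 \eta L}{\pi r^{4}}
\qquad\Longleftrightarrow\qquad
R = \rho \frac{L}{A} \quad \text{(electrical resistance)}.
```

In both cases the resistance grows with the length of the pipe or conductor and falls as its cross-section increases, which is precisely what made the analogy fruitful.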

An important moment in the development of modern physics was M. Planck's hypothesis of the discreteness of radiation, although Planck himself regarded the idea of the quantum as merely a successful mathematical device. Developing Planck's idea, Einstein treated quanta as real particles in his theory of the photoelectric effect. At the beginning of the XX century physics arrived at the understanding that different models can be used to describe the same phenomenon, each describing it from a different angle. The concept of wave-particle duality was formed. The combination of different models in describing one phenomenon (object) is a consequence of Bohr's fundamental principle of complementarity: the various models complement each other.

Modeling is of particular importance in the study of the microworld. The model of a micro-object is a substitute for the studied object, which cannot be directly perceived by the human senses. Since the objects of the microworld are not directly observable, physicists are forced to create rough models of them on the basis of a few data. Analogy becomes an important means of creating models that replace the sensory image of an invisible material entity. A model rests on empirical data about the objects of the microworld obtained in material experiments. Such models help to interpret the experimental data about the objects under study, to reveal the laws of their functioning and their connection with the rest of the world. In cognizing the microworld, physicists were initially forced to use the objects and laws of the macroworld as analogues, and only later, as the micro-objects became better understood, to refine and change the models created. Attempts were made to visualize the objects of the microworld using such representations as the material point, the trajectory, etc. Later this had to be abandoned. "Visibility is a very conditional and changeable thing... The word 'visual' is not clearly defined. In essence it means only something familiar and accessible to us without deep thought. It is clear that, in accordance with this, atomic physics is devoid of visibility. Only mathematical formalism can help to understand it. Those who do not understand it should leave physics alone. Nature does not care about our mathematical abilities; she is a much better mathematician than we are, and she conforms her laws not to the simplest but to the highest and most effective mathematical methods" [11, 122-123].

Modeling of micro-objects began with the study of the structure of the atom. Consider how the model of the atom gradually changed and became more complex. The first model of the atom was that of J.J. Thomson, according to which the atom consists of a positively charged sphere inside which negatively charged electrons move. In 1904 the Japanese physicist H. Nagaoka, by analogy with the theory of the stability of the rings of Saturn developed by Maxwell, put forward the hypothesis that the atom consists of a heavy positive nucleus surrounded by rings of a large number of electrons, whose vibrations in the plane of the rings should be accompanied by the emission of atomic spectra. It was replaced by Rutherford's planetary model of the atom, based on a large amount of experimental data and more accurately reflecting the processes in the atom. In this model electrons revolve around the atomic nucleus like planets orbiting the Sun. This model, however, came into conflict with classical physics: according to classical electrodynamics, such an atom would be unstable, which contradicted reality.

In 1913 N. Bohr proposed a model of the atom based on two postulates that contradicted classical electromagnetic theory. According to Bohr's hypothesis, an electron in a stationary orbit does not radiate; radiation occurs when an electron passes from a higher orbit to a lower one, and these transitions form the spectrum of the atom. Bohr formulated quantization rules and introduced quantum numbers. On the basis of Bohr's model the spectrum of the hydrogen atom was calculated, in good agreement with experiment. Bohr's theory was a major step forward in the development of atomic physics and an important stage in the creation of quantum mechanics. However, it contained internal contradictions (it was based on quantum postulates but applied the laws of classical physics to the description of the motion of the electron), and it failed to predict the spectra of more complex atoms.
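The calculation of the hydrogen spectrum mentioned above is easy to reproduce. The sketch below (the function name is ours, and the Rydberg constant is rounded) computes Balmer-series wavelengths from Bohr's frequency condition 1/λ = R(1/n₁² − 1/n₂²):

```python
# Bohr-model estimate of hydrogen spectral lines (illustrative sketch).
RYDBERG = 1.0973731e7  # Rydberg constant, m^-1 (rounded)

def bohr_wavelength_nm(n_upper: int, n_lower: int) -> float:
    """Wavelength (nm) of the photon emitted when an electron drops
    from orbit n_upper to orbit n_lower, per Bohr's postulates."""
    if n_upper <= n_lower:
        raise ValueError("n_upper must exceed n_lower")
    inv_wavelength = RYDBERG * (1.0 / n_lower**2 - 1.0 / n_upper**2)
    return 1e9 / inv_wavelength  # convert metres to nanometres

# Balmer series (transitions ending on n = 2): the visible lines of hydrogen
for n in range(3, 7):
    print(f"n={n} -> 2: {bohr_wavelength_nm(n, 2):.1f} nm")
```

The n = 3 → 2 transition comes out near 656 nm, matching the observed red H-alpha line; it was exactly this agreement that made Bohr's crude model convincing.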

Based on the analogy with the solar system, the crude atomic models of Rutherford and Bohr made it possible to take the first steps in understanding the structure of the atom and became the starting point for further study of this complex material formation. By analogy with the wave-particle duality of light, Louis de Broglie suggested that wave-particle duality is also inherent in microparticles. In each case scientists tried to make the new theory similar to earlier ones: light was likened to sound, waves of matter to light waves. Often further research revealed a breakdown of the analogy due to the specifics of the object under study, which led to the need to expand the boundaries of the theory while preserving its productive side.

Later a quantum-mechanical model of the atom appeared, taking into account the wave properties of the electron. It made it possible to understand more accurately the structure of the atom, the laws governing the interaction of the nucleus and the electrons, etc. Many historians of science note that E. Schrödinger arrived at the basic equation of wave mechanics, which bears his name, by analogy with the de Broglie relation describing the wave properties of matter. Some models based on analogy, as it turned out later, had no physical meaning, yet played a large role in the development of physics. The Rutherford-Bohr planetary model of the atom, which compared the motion of an electron around the nucleus with the motion of a planet around the Sun, led to the question of the position and velocity of an electron in an atom. Further research showed that definite answers cannot be given to these questions, but from this idea atomic and quantum physics arose. In 1925, on the basis of an analysis of spectroscopic data, G. Uhlenbeck and S. Goudsmit put forward the hypothesis of electron spin, by analogy with the angular momentum of planets rotating about their axes. Later it was found that spin has a specifically quantum nature not associated with the motion of the particle as a whole.
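The analogy that led from de Broglie to Schrödinger can be stated compactly (in modern notation): de Broglie associated a wavelength with any particle of momentum p, and Schrödinger's equation is the wave equation whose free solutions reproduce exactly this relation.

```latex
% De Broglie relation: wavelength of a particle with momentum p
\lambda = \frac{h}{p}.
% Schr\"odinger equation for a particle of mass m in a potential V:
i\hbar \frac{\partial \psi}{\partial t}
  = -\frac{\hbar^{2}}{2m} \nabla^{2} \psi + V\psi.
% A free plane wave \psi = e^{i(px - Et)/\hbar} solves it for V = 0
% precisely when E = p^{2}/2m, the classical energy-momentum relation.
```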

The more complex the object of cognition and the deeper it lies in the structure of the material world, the more difficult it is to find an analogue for it in the macroworld, and hence to create its model. "As our mental gaze penetrates to ever smaller distances and ever shorter periods of time, we find that nature behaves so differently from what we observe in the visible and tangible bodies of our environment that no model created on the basis of our large-scale experience can be 'true'. A completely satisfactory model of this kind is not only practically unattainable, but even unthinkable. More precisely, we can of course invent one, but however we invent it, it will be wrong..." [12, 28-29]. In such cases scientists are at first forced to use not one but several models, each of which helps to cognize some side, property or regularity of a most complex material formation. This is how the atomic nucleus was studied. The first model of the nucleus was the droplet model, developed independently by N. Bohr and Ya.I. Frenkel (1936). It was based on the analogy between the behavior of nucleons in a nucleus and the behavior of molecules in a liquid droplet. The droplet model made it possible to explain the mechanism of nuclear fission and of nuclear reactions. In 1949-1950 the American physicist M. Goeppert-Mayer and the German physicist J.H.D. Jensen developed the shell model of the nucleus, which made it possible to explain the differing stability of atomic nuclei, the periodicity of changes in their properties, and the spins and magnetic moments of nuclei. To explain the scattering of neutrons by various nuclei and the interaction of a nucleus with incident particles, V.F. Weisskopf and H. Feshbach developed the optical model of the nucleus. With the accumulation of experimental data on the properties of atomic nuclei, new facts appeared that these models could not explain. A. Bohr and B. Mottelson proposed the collective (generalized) model of the nucleus. Later, statistical, cluster and other models of the atomic nucleus were developed.
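The droplet analogy has a well-known quantitative expression: the Weizsäcker semi-empirical formula for the binding energy of a nucleus with A nucleons and Z protons (the coefficients are fitted to measured nuclear masses and vary slightly between fits):

```latex
% Semi-empirical (droplet-model) binding energy
E_B = a_V A - a_S A^{2/3}
      - a_C \frac{Z(Z-1)}{A^{1/3}}
      - a_A \frac{(A-2Z)^{2}}{A} + \delta(A,Z).
% Volume, surface, Coulomb, asymmetry and pairing terms: the first two
% mirror a charged liquid drop (bulk cohesion minus surface tension),
% which is exactly the analogy with molecules in a droplet.
```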

Physicists depart even further from "visibility" in the study of elementary particles. Particular attention is paid here to the methods of symmetry and group theory, which establish relationships between objects, phenomena and theories that are outwardly unrelated. The interconnection of various kinds of groups in group theory reflects the objectively existing variety of common aspects, relations, properties, motions and states of material systems of the most diverse nature. The hierarchy of groups constructed in mathematical theory reflects to a certain extent the hierarchy of ever more general levels of reality. Fundamental symmetry groups are applied consistently in physics: development proceeds from classical mechanics with the Galileo-Newton group, to special relativity with the Poincaré group, to relativistic quantum mechanics with the Lorentz group, and on to the physics of elementary particles with the SU(2) and SU(3) groups. In 1961 M. Gell-Mann and Y. Ne'eman suggested that the presence of supermultiplets of hadrons is a manifestation of the SU(3) unitary symmetry of the strong interaction, SU(3) being a group of transformations in three-dimensional complex space. In 1964 M. Gell-Mann and G. Zweig, analyzing the mass spectra of hadrons in supermultiplets on the basis of the mathematical structure of the representations of the unitary group SU(3), put forward the idea that hadrons are built of quarks. By analogy with the experiments of E. Rutherford, E. Marsden and H. Geiger (the bombardment of atoms with alpha particles), J. Friedman, H. Kendall and R. Taylor performed experiments on the deep inelastic scattering of electrons by protons and bound neutrons. The scattering occurred as if there were three point objects inside the nucleons in which the entire mass of the nucleon was concentrated. These experiments confirmed the quark model of hadrons.
On the basis of symmetry concepts the Standard Model of elementary particles has been constructed, which describes quite well the properties of the elementary particles discovered so far that participate in the strong, electromagnetic and weak interactions.

The general direction in the physics of the microworld is the creation of a theory in which the four currently known fundamental interactions (strong, electromagnetic, weak and gravitational) would be special cases of a single fundamental interaction. Confidence in the possibility of creating such a theory increased after the unification of the electromagnetic and weak interactions into the electroweak interaction and the discovery of the Higgs boson. The models used in developing these theories include such concepts as multidimensional spaces; supersymmetries, which unite physical systems obeying different statistics; superstrings, i.e. relativistic supersymmetric extended objects incorporating fermionic degrees of freedom; and other mathematical objects.

Thus the entire history of physics shows that modeling is the most important method of cognizing the surrounding reality. Every physical theory is based on one model or another. A model is formed on the basis of limited experimental data and the available theoretical knowledge about the object; therefore a model often cannot be constructed unambiguously. Physical theories should be regarded as provisional working models, not as complete knowledge: they will be changed, refined and revised as new experimental data become available. The modern understanding of the processes of the microworld does not provide visual representations of its objects; in models describing the microworld the role of abstractions, including mathematical ones, keeps increasing. The ability to choose a mathematical model adequate to a micro-object lies at the boundary between science and art [7, 43]. Therefore the development of physics can be viewed from the point of view of the art of modeling.

References

1. Popkov, V.I. Physics is the basis for the professional training of an engineer / V.I. Popkov // Bulletin of the Bryansk State Technical University. - 2008. - № 4. - P. 127 - 133.

2. Gusinsky, E.N. Introduction to the philosophy of education / E.N. Gusinsky, Yu.I. Turchaninov. - M.: Publishing Corporation "Logos", 2000. - 224 P.

3. Shtoff, V.A. Modeling and philosophy / V.A. Shtoff. - M.-L.: Publishing house "Science", 1966. - 303 P.

4. Ruzavin, G.I. Scientific theory (Logical and methodological analysis) / G.I. Ruzavin. - M.: Thought, 1978. - 237 P.

5. Markov, M.A. Thinking about physics / M.A. Markov. - M.: Science, 1988. - 301 P.

6. Brillouin, L. Scientific uncertainty and information / L. Brillouin. - M.: Mir, 1966. - 272 P.

7. Gladun, A.D. Physics as a culture of modeling / A.D. Gladun //Journ. of the Mosc. phys. soc. Series "B" "Physical education in universities". - 1996. - V. 2. - № 3. - P. 41 - 45.

8. Maxwell, J.C. Selected Works on the Theory of Electromagnetic Field / J.C. Maxwell. - M.: GITTL, 1954. - 687 P.

9. Einstein, A. Collection of scientific works: in 4 v. / A. Einstein. - M.: Science, 1967. - V. 4. - 599 P.

10. Popkov, V.I. The role of analogy in the development of physics / V.I. Popkov // Actual Science. - 2017. - № 1. - P. 6 - 13.

11. Sommerfeld, A. Ways of knowledge in physics: Coll. articles / A. Sommerfeld. - M.: Science, 1973. - 319 P.

12. Schrödinger, E. Science and humanism / E. Schrödinger. - Izhevsk: Research Center "Regular and Chaotic Dynamics", 2001. - 64 P.
