Primary theoretical models and laws. Developed theory. The formation of primary theoretical models and laws

Theoretical models reflect the structure, properties and behavior of real objects and make it possible to visualize objects and processes that are inaccessible to direct perception (a model of the atom, of the Universe).

I. Lakatos noted that the formation of primary theoretical models rests on programs of three kinds: 1) the Euclidean program (everything can be deduced from a finite set of trivially true statements; truth enters at the top of the theory; relies on intuition); 2) the empiricist program (built on basic propositions of a known empirical character; truth enters from below; also relies on intuition); 3) the inductivist program (arose as part of the effort to construct a channel through which truth flows upward from the basic propositions, and thus to establish an additional logical principle, the principle of the retransmission of truth). All three proceed from the organization of knowledge as a deductive system.

V. S. Stepin: "the main feature of theoretical schemes is that they are not the result of a purely deductive generalization of experience." In developed science, theoretical schemes are first constructed as hypothetical models using previously formulated abstract objects. At the early stages of scientific research, the constructs of theoretical models are created by direct schematization of experience. But then they are used to build new theoretical models, and this method comes to dominate. Experience is turned to only when science encounters objects for whose theory sufficient means have not yet been developed; on its basis the necessary idealizations are gradually formed as means for constructing the first theoretical models in a new field of research (as at the beginning of the theory of electricity).

Abstract objects (the ideal gas, the absolutely black body, the point) appear as theoretical constructs. In reality there are no isolated systems, so all of classical mechanics, oriented toward closed systems, is built using theoretical constructs. Constructive modification of observed conditions, the advancing of idealizations, the creation of a new scientific subject matter that is not found ready-made, the integrative crossing of principles at the "junction of sciences" that previously seemed unrelated to one another: these are the features of the logic of the formation of primary theoretical models.

A law of science reflects objectively existing interactions in nature. Aimed at reflecting natural regularities, laws are formulated in the artificial languages of their discipline. One distinguishes "statistical" laws, based on probabilistic hypotheses, and "dynamic" laws, i.e., laws in the form of universal conditionals. Laws are generalizations that are changeable and subject to refutation, which raises the problem of the nature of laws. Kepler and Copernicus understood laws as hypotheses. Kant: laws are not derived from nature but prescribed to it. A. Poincaré: the laws of geometry are not statements about the real world but arbitrary conventions about how to use terms such as "straight line" and "point". E. Mach: laws answer the mental need to order physical sensations.

The formation of laws presupposes that an empirically grounded hypothetical model can be transformed into a scheme, which is first introduced as a hypothetical construct, then adapted to a particular set of experiments and in this process justified as a generalization of experience. There follows its application to the qualitative variety of things (qualitative extension), and after that the stage of quantitative mathematical formulation and the phase of the emergence of the law. So: model → scheme → qualitative and quantitative extensions → mathematization → formulation of the law. Scientific research in its various fields strives not only to generalize events in the world of experience but also to identify regularities and establish general laws.

The role of analogies. The transfer of abstract objects from one field of knowledge to another, widely used in modern theoretical knowledge, rests on the method of analogies, which point to relations of similarity between things. Analogies are distinguished as: 1) analogies of inequality (different objects bear the same name: heavenly body, earthly body); 2) analogies of proportionality (physical/mental health); 3) analogies of attribution (the same relations are attributed to an object in different ways: healthy lifestyle / healthy body / healthy society). Inference by analogy thus allows one to liken a new individual phenomenon to another, already known one. With a certain degree of probability, analogy makes it possible to expand existing knowledge by including new subject areas within its scope; the question of the reliability of an analogy is therefore always relevant. Analogies are recognized as an indispensable means of scientific and philosophical understanding. There are analogies of objects and analogies of relations, as well as strict analogy (which provides a necessary connection between the transferred feature and the feature of similarity) and non-strict analogy (which is problematic in character). Analogy differs from deduction in that it likens individual objects rather than subsuming a particular case under a general proposition (cf. the analogy between selective breeding in cattle raising and Darwin's theory of natural selection).

In the field of technology, when objects similar to inventions are created, some groups of knowledge and principles are reduced to others. Of great importance is the procedure of schematization, which replaces a real engineering object with an idealized representation (a model); a prerequisite here is mathematization. It is customary to distinguish invention (the creation of an original) from improvement (the transformation of something existing). Sometimes an invention represents an attempt to imitate nature, an analogy between the artificial and the natural.

While the role of analogy may still need to be argued for, the procedure of justification has always been recognized as a significant component of scientific research. Justification has always had to face counterexamples. Justification may proceed from analytical (dissecting) procedures or from generalizing (synthetic) ones.

Analytical procedures allow one to clarify details and reveal the full potential of the content present in the original basis. The main essential aspects and regularities of the phenomenon under study are assumed to be given; research is carried out within the already outlined area and assigned task and is aimed at analyzing its internal potential. The analytical form of justification is associated with deduction and the concept of "logical consequence". Example: the discovery of new chemical elements.

Synthetic procedures of justification lead not merely to well-founded generalizations but bring out fundamentally new content that was not contained in the isolated elements. Example: clarifying the relation between "theoretical terms" and "observational terms" (the electron and the term itself). Hempel showed that if the meaning of theoretical terms is reduced to the meaning of a set of observational terms, theoretical concepts turn out to be redundant; they also turn out to be unnecessary if one relies on intuition when introducing and justifying them. This is why the two kinds of terms are distinct.

The procedure of justification involves: a) empirical verification of the statements describing certain conditions; b) empirical testing of the universal hypotheses on which the explanation rests; c) examination of whether the explanation is logically convincing.

One can speak of the structural equality of the procedures of justification and prediction. A prediction consists of a statement about some future event: the initial conditions are given, but the consequences have not yet occurred. In justification, the reasoning is structured as if the event had already happened, i.e., the full potential of retrospective analysis is used. Sometimes justifications are formulated so completely that they reveal their predictive character.

The logic of scientific discovery: developing fail-safe rules of creativity is an impossible task, and no rational justification can be given for the spontaneous creative process. Much space is given to bold guesses, intuition, the switching of "patterns", and analogue modeling. Heuristics accompanies the process of discovery; it is perceived as a surprising domain of search and discovery under conditions of uncertainty. Heuristic methods and models presuppose the use of non-trivial scenarios, tools and methods; they are opposed to formal logical techniques. The logic of discovery cannot, in principle, be formalized. Reduction, the borrowing of methods, the integration of techniques from the humanities and the technical sciences, the choice of practical implementation for particular scientific developments, and the decisive experiment itself rest, explicitly or implicitly, on heuristic assumptions. And although heuristics as a branch of methodology has not yet received official recognition, it is valued as a strategy for finding effective solutions, as a measure of creative risk.

A characteristic feature of the logic of discovery is its fundamental interdisciplinarity. Creative activity relies on methods that differ both from simple enumeration and from traditionally accepted and established ones. Search models are highly individualized, closely tied to the mental and motivational activity of the cognizing subject, and fairly resistant to external constraints imposed on the parameters of research.

Heuristics enriches the researcher with a variety of non-standard methods, among them the method of analogy, based on the imitation of all kinds of structures; the precedent method, which points to cases already existing in scientific practice; the method of reintegration ("Ariadne's thread"), based on creating complex structures from simpler ones; the method of organismic imitation (Toynbee's construction of the theory of local civilizations); and the method of pseudomorphization, i.e., the use of a form not one's own (an umbrella-shaped weapon).

The logic of discovery does not presuppose stereotypes or regulations arranged in a strict sequence and formulated in general form. It is a surprising domain in which novelty accompanies the research process itself, the choice of search methods and techniques, and its results.


From A. V. Tonkonogov, Philosophy of Science and Technology: Lecture Notes:

4.2. Formation of theoretical knowledge and its justification

The formation of theoretical knowledge is one of the important aspects of the development of the philosophy of science. It is obvious that science cannot exist without the correlative existence of factual and theoretical knowledge, of the individual and the general, of the perceptual and the cognitive (the mutual accompaniment of feelings and thoughts), of individual and universal statements. The correlation of these concepts is manifested at the everyday-event, perceptual-cognitive and logical-linguistic levels.

Classification plays a significant role in the formation of scientific knowledge: it promotes the transition of science from the stage of the empirical accumulation of knowledge to the level of theoretical synthesis. A classification built on a scientific basis not only presents a detailed picture of the state of a science and of its fragments, but also allows well-founded predictions to be made about still unknown facts and regularities.

The foundations of science include fundamental principles, the conceptual apparatus, and the ideals and standards of scientific research. The maturity of a particular science can be judged by its conformity to the scientific picture of the world. According to the modern classification, the sciences are divided, on the one hand, into natural, technical and social; on the other hand, one distinguishes fundamental and applied sciences, theoretical and experimental ones. When people speak of "big science" or of "the cutting edge of science", they emphasize its hypothetical character. Modern science develops with deep specialization and also at the intersections of interdisciplinary fields, which testifies to its integration. Common to all sciences are their integrating properties: a) the ideals and norms of knowledge characteristic of a given era, specified with respect to the specifics of the area under study; b) the scientific picture of the world; c) philosophical foundations. Thus, the integrating properties imply the functioning and development of science as a whole, as well as of its various branches, on common axiological (value) and methodological principles.

Primary theoretical models and laws. In the process of cognition the formation of primary theoretical models and laws has a definite significance. The concept "model" (from the Latin modulus: measure, sample) means a norm or sample (standard). In the logic and methodology of science, a model is understood as an analogue, a structure, a sign system that serves to represent a fragment of natural or social reality generated by human culture, that is, the original; the model expands knowledge about the original and serves to construct and transform it. From the logical point of view, this rests on the relations of isomorphism and homomorphism obtaining between the model and that of which it is, with its help, an isomorphic or homomorphic image; these are relations of equivalence. A model can acquire the status of a law: a necessary, essential, stable, repeating relation between phenomena. A law expresses the connection between objects, between the constituent elements of a given object, between the properties of things, and between properties within a thing. There are laws of functioning and laws of development. They are objective in character and exhibit statistical or dynamic regularities. The operation of laws depends on conditions: in nature they act spontaneously, while in social practice the regulating influence of human beings is possible.

Analogy. A definite role in theoretical research is played by analogy (from the Greek analogia: correspondence, similarity). In considering an object (a model), its properties are transferred to another, less studied or less accessible object. Conclusions obtained by analogy are, as a rule, only plausible; they are one of the sources of scientific hypotheses and of inductive reasoning, and they play an important role in scientific discoveries. The term "analogy" is also used in the sense of the "analogy of being" (Latin analogia entis). In Catholicism this is one of the principles of scholasticism, grounding the possibility of knowing God from the existence of the world he created. Analogy played a large role in the metaphysics of Aristotle, who interpreted it as a form of the rule of a single principle in individual bodies. The significance of analogy can be grasped from the reasoning of the medieval thinkers St. Augustine and Thomas Aquinas: Augustine wrote of the similarity between the Creator and his creation, while Thomas Aquinas considered the "analogies of beings", which testify to the unequal and ambiguous distribution of perfection in the universe.

Modern researchers distinguish the following types of analogy: 1) the analogy of inequality, when different objects bear the same name (heavenly body and earthly body); 2) the analogy of proportionality (physical health and mental health); 3) the analogy of attribution, when the same relations or qualities are ascribed to different objects (a healthy lifestyle, a healthy body, a healthy society, etc.).

According to researchers, the analogy between the motion of a thrown body and the motion of celestial bodies played an important role in the development of classical mechanics. The analogy between geometric and algebraic objects was realized by Descartes in analytic geometry. The analogy of selective breeding in cattle raising was used by Darwin in his theory of natural selection. The analogy between light, electrical and magnetic phenomena proved fruitful for Maxwell's theory of the electromagnetic field. Analogies are used in modern urban planning, architecture, pharmacology, medicine, logic, linguistics, etc.

Thus, inference by analogy allows one to liken a new individual phenomenon to another, already known phenomenon. With a certain degree of probability, analogy makes it possible to expand knowledge by including new subject areas within its scope. Hegel called analogy "the instinct of reason".

Often the inventor (author) of a concept arrives at terms intuitively, by accident. To test the correctness or incorrectness of proposed concepts, one can use the argument of the logician and historian of knowledge Carl Gustav Hempel (1905-1997). Its essence is as follows.

1. Theoretical terms either fulfill or do not fulfill their function.

2. If theoretical terms do not fulfill their functions, then they are not needed.

3. If theoretical terms fulfill their functions, then they establish connections between observed phenomena.

4. These connections can be established without theoretical terms.

5. If empirical connections can be established without theoretical terms, then theoretical terms are not needed.

6. Consequently, theoretical terms are not needed both when they perform their functions and when they do not perform these functions.
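Hempel's argument has the form of a dilemma. Schematically (our formalization of steps 1-6, not Hempel's own notation; F stands for "theoretical terms fulfill their function", N for "theoretical terms are needed"):

    % The theoretician's dilemma in propositional form.
    %   (1) F v ~F         (they either fulfill their function or not)
    %   (2) ~F -> ~N       (if they do not, they are not needed)
    %   (3)-(5) F -> ~N    (if they do, the observational connections
    %                       they establish can be had without them)
    %   (6) therefore ~N   (by constructive dilemma)
    \[
      (F \lor \lnot F),\quad (\lnot F \to \lnot N),\quad (F \to \lnot N)
      \;\vdash\; \lnot N
    \]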

In 1970 Hempel, using modern logical-mathematical tools of research, showed the incorrectness of Popper's definition of verisimilitude. Irrefutable counterarguments were found against the skepticism of Karl Popper (1902-1994) expressed in his maxim "We do not know; we can only guess." A hypothesis, a specific form of comprehension of objective truth, becomes a reliable theory when conclusions admitting practical verification are drawn from its basic assumption. Are the negative results of individual experiments a final "verdict" on a hypothesis? Hempel believed not, because:

a) an erroneous interpretation of these experiments is possible;

b) confirmation of other effects predicted by the hypothesis is possible;

c) the hypothesis itself admits further development and improvement.

The relationship between the logic of discovery and the logic of justification. In form, a theory appears as a system of consistent, logically interconnected statements. Theories use a specific categorial apparatus and a system of principles and laws. A developed theory is open to the description, interpretation and explanation of new facts, and is also ready to include additional metatheoretical constructions: hypothetico-deductive, descriptive, inductive-deductive, or formalized by means of a complex mathematical apparatus. Thomas Kuhn (1922-1996), listing the most important characteristics of a theory, argued that it should be accurate, consistent, broadly applicable, simple, fruitful, novel, and so on; yet none of these criteria is self-sufficient on its own. From this Popper concluded that any theory is in principle falsifiable and subject to the procedure of refutation. On these grounds Popper put forward the principle of fallibilism, concluding that only the statement "all theories are wrong" is free of error.
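The refutation procedure Popper relies on can be stated as a minimal modus tollens schema (a textbook rendering, not a formula from the text; T is a theory, O an observational consequence deduced from it):

    % Falsification: a failed prediction deductively refutes the theory.
    \[
      \frac{T \to O, \qquad \lnot O}{\lnot T}
    \]
    % Confirmation is not symmetric: from T -> O and O, nothing follows
    % deductively about T (that would be affirming the consequent); hence
    % theories can be refuted but never conclusively proved.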

It is easy to see that the development of scientific concepts is repeatedly mediated by linguistic conceptual definitions. Investigating this question, the Russian scholar T. G. Leshkevich writes: "Language does not always possess adequate means of reproducing alternative experience; the basic vocabulary of a language may lack the necessary symbolic fragments. Therefore, the study of the specifics of language as an effective means of representing and coding basic information, and of the relationship between linguistic and extra-linguistic mechanisms of theory construction, is important for the philosophy of science."


In the philosophical and methodological literature of recent decades, the fundamental ideas, notions and representations that form the relatively stable foundations on which specific empirical knowledge and the theories explaining it develop have increasingly become a subject of research.

Identifying and analyzing these foundations requires considering scientific knowledge as an integral developing system. In Western philosophy such a vision of science took shape relatively recently, mainly in the post-positivist period of its history. At the stage when ideas about science developed within positivist philosophy dominated, their most vivid expression was the so-called standard conception of the structure and growth of knowledge. In it, the unit of analysis was a single theory and its relation to experience. Scientific knowledge was presented as a set of theories plus the empirical knowledge regarded as the basis on which theories are built. Gradually, however, it became clear that the empirical basis of a theory is not pure, theoretically neutral empirical material: it is not observational data but facts that constitute the empirical basis on which theories rest, and facts are theory-laden, since other theories take part in their formation. The problem of the interaction of a particular theory with its empirical basis then also appears as the problem of its relations with other, previously established theories that make up the theoretical knowledge of a given scientific discipline.

This problem of the interconnection of theories emerged from a somewhat different angle in the study of their dynamics. It turned out that the growth of theoretical knowledge proceeds not simply as a generalization of experimental facts, but as the use of theoretical concepts and structures developed in previous theories and applied in the generalization of experience. Thus the theories of a given science were presented as a kind of dynamic network, an integral system interacting with empirical facts. The systemic character of the knowledge of a scientific discipline posed the problem of the system-forming factors that determine the integrity of the corresponding system of knowledge. In this way the problem of the foundations of science began to take shape: the foundations thanks to which the diverse knowledge of a scientific discipline is organized into a systemic whole at each stage of its historical development.

Finally, consideration of the growth of knowledge in its historical dynamics revealed special states associated with critical epochs in the development of science, when a radical transformation of its most fundamental concepts and representations occurs. These states have been called scientific revolutions, and they can be regarded as a restructuring of the foundations of science.



Thus, the expansion of the field of methodological problems in post-positivist philosophy of science brought the analysis of the foundations of science to the fore as a real methodological problem.

These foundations and their individual components have been recorded and described in such terms as "paradigm" (T. Kuhn), "hard core of the research programme" (I. Lakatos), "ideals of the natural order" (S. Toulmin), "basic themes of science" (G. Holton), and "research tradition" (L. Laudan).

In the course of discussions among the proponents of various conceptions, the problem of a differentiated analysis of the foundations of science became acute. Indicative in this regard are the discussions around the key concept of Kuhn's conception, the "paradigm", whose extreme polysemy and vagueness were noted by his numerous opponents.

Under the influence of this criticism, Kuhn attempted to analyze the structure of the paradigm. He identified the following components: "symbolic generalizations" (mathematical formulations of laws), samples of the solution of concrete problems, the "metaphysical parts of the paradigm", and values. This was a step forward compared with the first version of the conception, but at this stage the structure of the foundations of science remained unclarified. First, it is not shown in what connections the identified components of the paradigm stand; strictly speaking, its structure has not been revealed. Second, the paradigm, according to Kuhn, includes both components relating to the deep foundations of scientific research and forms of knowledge that grow on those foundations. For example, "symbolic generalizations" include the mathematical formulations of particular laws of science (such as the formulas expressing the Joule-Lenz law or the law of mechanical oscillations). But then it turns out that the discovery of any new particular law would have to mean a change of paradigm, i.e., a scientific revolution; the distinction between "normal science" (the evolutionary stage of the growth of knowledge) and scientific revolution is thereby erased. Third, in singling out such components of science as the "metaphysical parts of the paradigm" and values, Kuhn fixes them only "ostensively", through the description of relevant examples. From the examples he gives it is clear that the "metaphysical parts of the paradigm" are understood by him sometimes as philosophical ideas and sometimes as principles of a concrete-scientific character (such as the principle of short-range action in physics or the principle of evolution in biology). As for values, Kuhn's description of them likewise looks like only a first and very rough sketch: what is meant are essentially the ideals of science, taken in a very limited range, as the ideals of the explanation, prediction and application of knowledge.



In principle one can say that even in the most advanced studies of the foundations of science, among which are the works of T. Kuhn, Western philosophy of science is not analytical enough. It has not yet established what the main components of the foundations of science are and how they are connected, and the connections between the foundations of science and the theories and empirical knowledge based on them remain insufficiently clarified. This means that the problem of the structure of the foundations, of their place in the system of knowledge and of their functions in its development requires further and deeper discussion.

In an established and developed system of disciplinary scientific knowledge, the foundations of science are revealed, first, in the analysis of the systemic connections between theories of varying degrees of generality and of their relation to the various forms of empirical knowledge within a given discipline (physics, chemistry, biology, etc.), and second, in the study of interdisciplinary relations and the interactions of the various sciences.

The most important components forming the foundations of science can be identified as: 1) the scientific picture of the world; 2) the ideals and norms of scientific knowledge; 3) the philosophical foundations of science.

These components express general ideas about the specifics of the subject of scientific research, about the characteristics of the cognitive activity that masters one or another type of object, and about the nature of the connections between science and the culture of the corresponding historical era.

Concept. A concept is the unity of the essential properties, connections and relations of objects or phenomena reflected in thought; a thought or system of thoughts that singles out and generalizes the objects of a certain class according to features that are common to them and, taken together, specific to them. Scientific concepts reflect essential and necessary features, and the words and signs (formulas) expressing them are scientific terms. In a concept one distinguishes its content and its scope: the set of objects generalized in a concept is its scope, and the set of essential features by which those objects are generalized and singled out is its content. The development of a concept involves change in both its scope and its content.

The transition from the sensory stage of cognition to logical thinking is characterized above all as a transition from perceptions and representations to reflection in the form of concepts. In its origin a concept is the result of a long process of the development of knowledge, a concentrated expression of historically achieved knowledge. The formation of a concept is a complex dialectical process carried out by means of comparison, analysis, synthesis, abstraction, idealization, generalization, experiment, etc. A concept is a non-figurative reflection of reality expressed in words; it acquires its real mental and verbal existence only in the unfolding of definitions, in judgments, as part of a certain theory.

A concept first of all singles out and fixes what is general, which is achieved by abstracting from all the peculiarities of the individual objects of a given class. But it does not exclude the individual and the particular: only on the basis of the general is it possible to single out and cognize the particular and the individual. A scientific concept is the unity of the general, the particular and the individual, i.e., it is concretely universal. In the approach to the concept in the history of philosophy two opposed lines took shape: the materialist line, which holds that concepts are objective in their content, and the idealist line, according to which a concept is a spontaneously arising mental entity wholly independent of objective reality. For the objective idealist Hegel, for example, concepts are primary, and objects and nature are only pale copies of them. Phenomenalism regards the concept as the ultimate reality, unrelated to objective reality. Neopositivists, reducing concepts to auxiliary logical-linguistic means, deny the objectivity of their content.

Being a reflection of objective reality, concepts are as plastic as reality itself, of which they are the generalization. Scientific concepts are not something finished and complete; on the contrary, a concept contains within itself the possibility of further development. The main content of a concept changes only at certain stages in the development of science. Such changes are qualitative and are connected with the transition from one level of knowledge to another, to knowledge of the deeper essence of the objects and phenomena conceived in the concept. The movement of reality can be reflected only in dialectically developing concepts. By a concept Kant understood any general representation insofar as it is fixed by a term. For Hegel the concept is "first of all a synonym for the real understanding of the essence of the matter, and not merely the expression of any general feature, any similarity among objects of contemplation". The concept reveals the true nature of a thing, not its similarity to other things, so it must express not only abstract generality (this is only one moment of the concept, which makes it akin to representation) but also the particularity of its object.

In formal logic a concept is an elementary unit of mental activity possessing a certain integrity and stability and taken in abstraction from the verbal expression of that activity. The process of concept formation is naturally described in terms of homomorphism: dividing a set of objects that interests us into classes of elements that are "equivalent" in some respect (i.e., ignoring all differences between elements of one class that do not interest us at the moment), we obtain a new set, homomorphic to the original one with respect to the equivalence relation we have singled out. The elements of this new set (the equivalence classes) can now be thought of as single, indivisible objects obtained by "gluing together" all the original objects indistinguishable in the relations we have fixed. These "clumps" of identified images of the initial objects are what we call concepts, obtained by the mental replacement of a class of closely related representations by a single "generic" concept.

In considering the semantic aspect of the problem of the concept, one must distinguish between a concept as an abstract object and the word naming it (which is a wholly concrete object): the name, the term. The scope of a concept is the denotation (reference) of the name denoting it, and the content is the concept (sense) that this name expresses.
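The homomorphism account of concept formation sketched above can be illustrated with a short program; a minimal sketch, in which the miniature "world" of objects, the chosen feature and all the names are invented for the example:

    # Concept formation as quotienting by an equivalence relation:
    # objects indistinguishable with respect to a chosen feature are
    # "glued" into one class, which then behaves as a single concept.
    from collections import defaultdict

    def form_concepts(objects, feature):
        """Partition `objects` into equivalence classes under `feature`.

        Two objects are treated as equivalent ("indistinguishable")
        when `feature` maps them to the same value; each resulting
        class plays the role of one generic concept.
        """
        classes = defaultdict(list)
        for obj in objects:
            classes[feature(obj)].append(obj)
        return dict(classes)

    # Hypothetical miniature "world" of figures, described by side count.
    figures = [
        ("triangle", 3), ("square", 4), ("rhombus", 4),
        ("pentagon", 5), ("rectangle", 4),
    ]

    # The mapping figure -> side count is a homomorphism onto the quotient
    # set {3, 4, 5}; the class keyed by 4 is the concept "quadrilateral".
    concepts = form_concepts(figures, feature=lambda fig: fig[1])
    print(concepts[4])  # [('square', 4), ('rhombus', 4), ('rectangle', 4)]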

Models play an important role in scientific and theoretical knowledge. They make it possible to present in visual form objects and processes inaccessible to direct perception: for example, a model of the atom, a model of the Universe, a model of the human genome, etc. Theoretical models reflect the structure, properties and behavior of real objects. The construction of a scientific model is the result of the interaction between the subject of scientific-cognitive activity and reality. There is a view on which primary models can be regarded as metaphors, based on observations and on inferences drawn from observations, which facilitate the visual representation and retention of information. The well-known Western philosopher of science Imre Lakatos noted that the process of forming primary theoretical models can rest on programs of three kinds, each of which proceeds from the organization of knowledge as a deductive system:

1) the Euclidean program (the Euclidean system);

2) the empiricist program;

3) the inductivist program.

The Euclidean program, which assumes that everything can be deduced from a finite set of trivially true statements consisting solely of terms with a trivial semantic load, is usually called the program of the trivialization (simplification) of knowledge. This program contains only true judgments; it works with neither conjectures nor refutations. Knowledge as truth is introduced at the top of the theory and, without any deformation, "flows down" from the primitive terms to the defined terms.

Unlike the Euclidean program, the empiricist program is built on basic propositions of a known empirical character. Empiricists can admit no introduction of meaning other than from the bottom of the theory. If these propositions turn out to be false, that assessment flows upward through the channels of deduction and floods the whole system. The empiricist theory is therefore conjectural and falsifiable. And if the Euclidean theory places truth at the top and illuminates it with the natural light of reason, the empiricist theory places it at the bottom and illuminates it with the light of experience. Both programs, however, rely on logical intuition.

Of the inductivist program Lakatos says: "The mind expelled from the upper level seeks refuge below. The inductivist program arose as part of the effort to construct a channel through which truth flows upward from the basic propositions, and thus to establish an additional logical principle, the principle of the retransmission of truth." Its emergence was associated with the dark times when refutation was considered indecent and conjecture was despised. Inductive logic was later replaced by probabilistic logic, and the final blow to inductivism was dealt by Popper, who showed that even a partial transmission of truth and meaning cannot proceed from the bottom upward.

According to academician V.S. Stepin, “the main feature of theoretical schemes is that they are not the result of a purely deductive generalization of experience.” In advanced science, theoretical schemes are first constructed as hypothetical models using previously formulated abstract objects. In the early stages of scientific research, the constructs of theoretical models are created through direct schematization of experience.

Important characteristics of a theoretical model are its structure and the possibility of transferring abstract objects from other areas of knowledge. According to Lakatos, its main structural units are the hard core, the protective belt of hypotheses, and the positive and negative heuristics. The negative heuristic forbids directing refutations at the hard core of the program; the positive heuristic allows the further development and expansion of the theoretical model.
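A toy sketch of this structure (all class and method names are ours, and the planetary example is only a loose illustration) shows the division of labor: refutations are absorbed by revising the protective belt, never the hard core:

    # A toy model of Lakatos's structure of a research programme:
    # the hard core is immutable; anomalies are handled by modifying
    # the protective belt of auxiliary hypotheses (negative heuristic),
    # while the positive heuristic proposes revised hypotheses.
    from dataclasses import dataclass, field

    @dataclass
    class ResearchProgramme:
        hard_core: tuple                 # immutable fundamental postulates
        protective_belt: list = field(default_factory=list)

        def absorb_anomaly(self, refuted_hypothesis: str) -> None:
            """Negative heuristic: redirect refutation away from the core."""
            self.protective_belt = [
                h for h in self.protective_belt if h != refuted_hypothesis
            ]
            # Positive heuristic: replace the refuted auxiliary hypothesis
            # with a modified one that accommodates the anomaly.
            self.protective_belt.append(
                f"revised hypothesis for {refuted_hypothesis!r}"
            )

    # Hypothetical example: the Newtonian programme facing the anomalous
    # orbit of Uranus revises an auxiliary assumption (a new planet is
    # postulated), leaving the laws of motion and gravitation untouched.
    newtonian = ResearchProgramme(
        hard_core=("three laws of motion", "law of universal gravitation"),
        protective_belt=["seven known planets"],
    )
    newtonian.absorb_anomaly("seven known planets")
    print(newtonian.protective_belt)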

Theoretical objects convey the meaning of such concepts as "ideal gas", "absolutely black body", "point", "force", "circle", "segment", etc. Abstract objects are meant to stand in for certain connections of reality, but they cannot exist in the status of real objects, for they are idealizations. In reality there are no isolated systems free of all influence; hence all of classical mechanics, oriented toward closed systems, is built using theoretical constructs.

The transfer of abstract objects from one field of knowledge to another presupposes a sound basis in analogies, which point to relations of similarity between things. Modern interpreters distinguish: 1) the analogy of inequality, when different objects bear the same name (heavenly body, earthly body); 2) the analogy of proportionality (physical health, mental health); 3) the analogy of attribution, when the same relations are attributed to an object in different ways (healthy lifestyle, healthy body, healthy society, etc.). Thus inference by analogy allows one to liken a new individual phenomenon to another, already known phenomenon. With a certain degree of probability, analogy makes it possible to expand existing knowledge by including new subject areas within its scope. Notably, Hegel valued the method of analogy highly, calling it "the instinct of reason".

Abstract objects must satisfy the connections and interactions of the emerging field of knowledge, so the question of the reliability of an analogy is always relevant. There are analogies of objects and analogies of relations, as well as strict and non-strict analogies. A strict analogy provides a necessary connection between the transferred feature and the feature of similarity; a non-strict analogy is problematic in character. The difference between analogy and deductive inference is that in analogy individual objects are likened to one another, rather than a particular case being subsumed under a general proposition, as in deduction.
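Schematically, inference by analogy can be written as follows (a standard textbook rendering, not a formula from the text):

    % Inference by analogy: object b is likened to object a.
    %   a has features P1, ..., Pn and additionally Q;
    %   b has features P1, ..., Pn;
    %   therefore, probably, b has Q.
    \[
      \frac{P_1(a),\dots,P_n(a),\; Q(a); \qquad P_1(b),\dots,P_n(b)}
           {\text{probably } Q(b)}
    \]
    % In a strict analogy the shared features P1, ..., Pn are necessarily
    % connected with Q, so the conclusion is reliable; in a non-strict
    % analogy the connection is only plausible.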

As V. N. Porus notes, "an important role in the development of classical mechanics was played by the analogy between the motion of a thrown body and the motion of celestial bodies; the analogy between geometric and algebraic objects was realized by Descartes in analytic geometry; the analogy of selective breeding in cattle raising was used by Darwin in his theory of natural selection; the analogy between light, electrical and magnetic phenomena proved fruitful for Maxwell's theory of the electromagnetic field." An extensive class of analogies is used in the modern scientific disciplines: in architecture and the theory of urban planning, bionics and cybernetics, pharmacology and medicine, logic and linguistics, and in the technical sciences, for which the procedure of schematization, replacing a real engineering object with an idealized representation (a scheme, a model), is of great importance.

There are also numerous examples of false analogies, such as the analogy between the flow of a fluid and the spread of heat in the doctrine of "caloric" of the 17th-18th centuries: caloric was a hypothetical thermal substance, a weightless fluid supposedly poured into heated bodies, whose presence was invoked to explain observed thermal phenomena (the heating of bodies, heat transfer, thermal expansion, thermal equilibrium, etc.).

The formation of laws presupposes that an experimentally or empirically grounded hypothetical model has the potential to be transformed into a scheme. Moreover, "theoretical schemes are introduced at first as hypothetical constructions, but then they are adapted to a certain set of experiments and in this process are justified as a generalization of experience." There then follows the stage of the scheme's application to the qualitative variety of things, that is, its qualitative expansion. And only after this comes the stage of quantitative mathematical formulation in the form of an equation or formula, which marks the phase of the emergence of the law.

So: model → scheme → qualitative and quantitative extensions → mathematization → formulation of the law. At every stage without exception there actually took place both the correction of the abstract objects themselves and of their theoretical schemes, and the correction of their quantitative mathematical formalizations. Theoretical schemes could also be modified under the influence of the mathematical means, but all these transformations remained within the limits of the hypothetical model that had been put forward.
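The chain can be traced on the ideal gas, one of the abstract objects mentioned above (our illustration, not the author's): the model of a gas as non-interacting point particles is schematized, extended qualitatively to various gases, and only then receives a quantitative mathematical formulation:

    % A worked instance of the chain: model -> scheme -> extensions ->
    % mathematization -> law, using the ideal gas.
    % Boyle's law (fixed temperature), a first quantitative formulation:
    \[ pV = \mathrm{const} \quad (T = \mathrm{const}) \]
    % Quantitative extension over temperature and amount of substance
    % yields the ideal gas law (the Clapeyron-Mendeleev equation):
    \[ pV = \nu R T \]
    % Here p is pressure, V volume, \nu the amount of substance, R the
    % universal gas constant, T absolute temperature. The "ideal gas"
    % remains an idealization: no real gas satisfies the equation exactly.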

V. S. Stepin emphasizes that "in classical physics we can speak of two stages of the construction of particular theoretical schemes as hypotheses: the stage of their construction as meaningful physical models of a certain area of interactions, and the stage of the possible restructuring of the theoretical models in the process of their connection with the mathematical apparatus." At the higher stages of development these two aspects of hypothesis merge, while at the early stages they are separated. The concept of "law" indicates the presence of internally necessary, stable and repeating connections between events and the states of objects. A law reflects objectively existing interactions in nature and in this sense is understood as a natural law. The laws of science resort to artificial languages to formulate these natural regularities; one distinguishes "statistical" laws, based on probabilistic hypotheses about the interaction of a large number of elements, and "dynamic" laws, i.e., laws in the form of universal conditionals. The laws developed by the human community as norms of human coexistence differ from them and, as a rule, have a conventional (agreed-upon) character.

The laws of science, which reflect the most general and profound natural and social interactions, strive to reflect the laws of reality adequately. Yet the very measure of this adequacy, and the fact that the laws of science are generalizations that are changeable and subject to falsification, raise a very acute philosophical and methodological problem about the nature of laws. It is no coincidence that Kepler and Copernicus understood the laws of science as hypotheses, and Kant was convinced that laws are not derived from nature but prescribed to it.

Hence one of the most important procedures in science has always been considered to be the justification of theoretical knowledge, and science itself has often been interpreted as a purely "explanatory enterprise". Recent advances, however, show that many processes in the modern physical picture of the world are in principle unvisualizable and unimaginable. This suggests that justification is deprived of its model-based character and clarity and must rely on purely conceptual techniques, in which the very procedure of reducing the unknown to the known is called into question.

Another paradoxical phenomenon arises: the objects that need to be explained turn out to be unobservable in principle (the quark, a fundamental particle carrying electric charge, is an unobservable entity). Scientific-theoretical knowledge thus acquires a non-experiential character: a non-experiential reality admits non-experiential knowledge of itself. This conclusion, at which modern philosophy of science has arrived, is not accepted by all scientists as scientific, because the procedure of scientific justification then rests on what cannot itself be explained.

Proper justification is aided by singling out one or several important groups of facts that must be specified in the initial conditions, and by asserting that the event in question is "determined" and must therefore be explained in terms of that group of facts alone.

Scientific explanation includes the following elements:

a) empirical verification of the statements describing certain conditions;

b) empirical testing of universal hypotheses on which the explanation is based;

c) examining whether an explanation is logically convincing.

The explanation of a regularity is carried out by subsuming it under another, more general regularity. On this basis a two-part structure of explanation is derived: the explanandum is the description of the phenomenon; the explanans is the class of sentences adduced to explain the given phenomenon. The explanans, in turn, divides into two subclasses: one describes the antecedent conditions; the other states general laws.

The explanandum must be logically deducible from the explanans: this is the logical condition of adequacy. The explanans must be confirmed by all the available empirical material and must be true: this is the empirical condition of adequacy.
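This two-part structure is the deductive-nomological (covering-law) schema of Hempel and Oppenheim; in the standard textbook notation (our rendering, not a formula from the text):

    % Deductive-nomological schema of explanation:
    %   C1, C2, ..., Ck   statements of antecedent conditions  \
    %   L1, L2, ..., Lr   general laws                          } explanans
    %   ---------------------------------------------------    /
    %   E                 description of the phenomenon (explanandum)
    \[
      \frac{C_1, C_2, \dots, C_k; \qquad L_1, L_2, \dots, L_r}{E}
    \]
    % Logical adequacy: E follows deductively from the premises.
    % Empirical adequacy: the sentences of the explanans are true and
    % confirmed by the available empirical material. In a statistical
    % explanation the laws L are probabilistic, and E follows only with
    % high probability rather than deductively.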

Incomplete explanations omit part of the explanans as obvious. Causal, or deterministic, laws differ from statistical laws in that the latter establish only that a certain percentage of all cases satisfying a given set of conditions will be accompanied by a phenomenon of a certain kind.

Prediction, in contrast to explanation, consists of a statement about some future event: here the initial conditions are given, and the consequences have not yet occurred but must be established. One can speak of the structural equality of the procedures of justification and prediction. Very rarely, however, are explanations formulated so completely that they can demonstrate their predictive character; more often explanations are incomplete. There are "causal" explanations and "probabilistic" explanations, the latter based on probabilistic hypotheses rather than on general "deterministic" laws, that is, laws in the form of universal conditionals.

The most general view of the mechanism of the development of scientific knowledge from the standpoint of rationalism suggests that knowledge can be generalizing (synthetic) and dissecting (analytical). Synthetic knowledge leads not simply to a generalization but to the creation of fundamentally new content, contained neither in the isolated elements nor in their summative whole. The essence of the analytical approach is that the main essential aspects and regularities of the phenomenon under study are regarded as contained in the given, taken as the source material. The synthetic approach directs the researcher to find dependencies outside the object itself, in the context of externally occurring systemic relations. It is the synthetic movement that presupposes the formation of new theoretical meanings, new types of mental content, new horizons, a new layer of reality. The synthetic is the new that leads to the discovery of a qualitatively different basis, distinct from the one previously available.

Analytical knowledge allows you to clarify details and particulars, to reveal the full potential of the content present in the original basis. The analytical movement involves a logic aimed at identifying elements that were not yet known, but which were contained in the previous basis. The analytical form of obtaining new knowledge records new connections and relationships of objects that have already fallen into the sphere of human practical activity. It is closely related to deduction and the concept of “logical consequence.” An example of such an analytical increment of new knowledge is the finding of new chemical elements in the periodic table of Mendeleev.