Arts and Humanities › History and Philosophy of Science

Philosophy and History of Science

Description

This cluster of papers explores the concept of mechanistic explanation in scientific discovery and evolution, encompassing topics such as evolutionary synthesis, causation, genetic information, scientific models, social mechanisms, and the philosophy of science. It delves into the intricacies of biological explanation, genetic code, and developmental biology.

Keywords

Mechanistic Explanation; Evolutionary Synthesis; Causation; Genetic Information; Scientific Models; Social Mechanisms; Philosophy of Science; Biological Explanation; Genetic Code; Developmental Biology

Contents: 1. Defining and Revising the Structure of Evolutionary Theory. Part I: The History of Darwinian Logic and Debate. 2. The Essence of Darwinism and the Basis of Modern Orthodoxy: An Exegesis of the Origin of Species. 3. Seeds of Hierarchy. 4. Internalism and Laws of Form: Pre-Darwinian Alternatives to Functionalism. 5. The Fruitful Facets of Galton's Polyhedron: Channels and Saltations in Post-Darwinian Formalism. 6. Pattern and Progress on the Geological Stage. 7. The Modern Synthesis as a Limited Consensus. Part II: Towards a Revised and Expanded Evolutionary Theory. 8. Species as Individuals in the Hierarchical Theory of Selection. 9. Punctuated Equilibrium and the Validation of Macroevolutionary Theory. 10. The Integration of Constraint and Adaptation (Structure and Function) in Ontogeny and Phylogeny: Historical Constraints and the Evolution of Development. 11. The Integration of Constraint and Adaptation (Structure and Function) in Ontogeny and Phylogeny: Structural Constraints, Spandrels, and the Centrality of Exaptation in Macroevolution. 12. Tiers of Time and Trials of Extrapolationism, With an Epilog on the Interaction of General Theory and Contingent History. Bibliography. Index.
 From the subjectivist point of view (de Finetti, 1937/1964), a probability is a degree of belief in a proposition. It expresses a purely internal state; there is no “right,” “correct,” or “objective” probability residing somewhere “in reality” against which one's degree of belief can be compared. In many circumstances, however, it may become possible to verify the truth or falsity of the proposition to which a probability was attached. Today, one assesses the probability of the proposition “it will rain tomorrow.” Tomorrow, one looks at the rain gauge to see whether or not it has rained. When possible, such verification can be used to determine the adequacy of probability assessments.
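The verification idea in the last sentence can be made concrete with a scoring rule. The abstract names no particular rule; the Brier score, the mean squared difference between the forecast probability and the 0/1 outcome, is one standard choice, sketched here as a minimal example:

```python
def brier_score(forecasts, outcomes):
    """Mean squared difference between forecast probabilities and
    observed outcomes (1 = the event happened, 0 = it did not).
    Lower is better; a perfect forecaster scores 0."""
    return sum((p - o) ** 2 for p, o in zip(forecasts, outcomes)) / len(forecasts)

# The forecaster said 0.9 and 0.8 "rain" on two days it rained,
# and 0.2 on a day it did not:
print(round(brier_score([0.9, 0.8, 0.2], [1, 1, 0]), 4))  # 0.03
```

Note that the score evaluates the adequacy of the assessments after the fact without presupposing any "objective" probability, which is consistent with the subjectivist reading above.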
 Described by the philosopher A.J. Ayer as a work of 'great originality and power', this book revolutionized contemporary thinking on science and knowledge. Ideas such as the now legendary doctrine of 'falsificationism' electrified the scientific community, influencing even working scientists, as well as post-war philosophy. This astonishing work ranks alongside The Open Society and Its Enemies as one of Popper's most enduring books and contains insights and arguments that demand to be read to this day.
Book review by Mary Hesse of Conjectures and Refutations: The Growth of Scientific Knowledge, by Karl R. Popper (London: Routledge, 1963. Pp. xiv + 412. Price 55s). The Philosophical Quarterly, Volume 15, Issue 61, October 1965, pp. 372–374. https://doi.org/10.2307/2218271. Published: 01 October 1965.
Thomas S. Kuhn. Chicago: University of Chicago Press, 1962; 2nd enlarged edition, 1970; 3rd edition, 1996. Thomas S. Kuhn's The Structure of Scientific Revolutions is a classic text in the history and philosophy of science.
Summary: In this essay I have tried to indicate briefly what, in my view, is essential in the questions and methods of ethology, and why we see in Konrad Lorenz the founder of modern ethology. In doing so I have perhaps construed the field of ethology more broadly than is usual among ethologists. But anyone who surveys the varied work of the researchers who call themselves ethologists is practically forced into this broad conception. In my account I have aimed at neither completeness nor balance and, in order to stimulate further discussion, have freely ridden my hobby-horses: above all the relationship between ethology and physiology, the danger of neglecting the question of species preservation, questions of method in ontogenetic research, and the tasks and methods of evolutionary research. In assessing the part Lorenz has played, and still plays, in the development of ethology, I have identified as his chief contribution that he showed us how proven "biological thinking" can be applied consistently to behaviour. That he built on the work of his predecessors is no more surprising than that every father himself has a father. In particular, the essential point of Lorenz's work seems to me to be that he saw clearly that behaviour patterns are parts of "organs", of systems of species preservation; that their causation can be investigated just as exactly as that of any other life process; that their survival value can be demonstrated as systematically and exactly as their causation; that behavioural ontogeny can be studied in fundamentally the same way as the ontogeny of form; and that research on the evolution of behaviour runs parallel to the study of structural evolution.
And although Lorenz has assembled an enormous body of facts, ethology has nevertheless been advanced more by his questions and bold hypotheses than by his own testing of those hypotheses. Without underrating the value of such testing, without which there would of course be no further development, I would maintain that the modifications testing has made necessary are comparatively minor beside the achievement of the original approach. It should also be remembered, in passing, that one of the many salutary after-effects of Lorenz's work is the growing interest that human psychology is taking in ethology, the first sign of a development whose scope we can as yet hardly foresee. Finally, a remark on terminology. I have applied the word "ethology" here to a vast complex of sciences, some of which, such as psychology and physiology, have long borne recognized names. This does not, of course, mean that I propose the name ethology for this whole field; that would simply be historically wrong, since the word historically designates only the work of a small group of zoologists. The name itself is immaterial; what matters to me above all is to show that we are witnessing the growing together of many separate disciplines into one comprehensive science, for which there is only one fitting name: "behavioural biology". Naturally this synthetic development is not the work of one man, or even of the ethologists alone. It is the consequence of a general tendency to build bridges between related sciences, a tendency that has arisen in many disciplines. Among zoologists it is Lorenz who has contributed most to it and who, moreover, has influenced several neighbouring disciplines more strongly than anyone else.
I am convinced, indeed, that these influences on neighbouring sciences will continue for a long time to come, and that behavioural biology is only at the beginning of its ontogeny.
Scientific work is heterogeneous, requiring many different actors and viewpoints. It also requires cooperation. The two create tension between divergent viewpoints and the need for generalizable findings. We present a model of how one group of actors managed this tension. It draws on the work of amateurs, professionals, administrators and others connected to the Museum of Vertebrate Zoology at the University of California, Berkeley, during its early years. Extending the Latour-Callon model of interessement, two major activities are central for translating between viewpoints: standardization of methods, and the development of 'boundary objects'. Boundary objects are both adaptable to different viewpoints and robust enough to maintain identity across them. We distinguish four types of boundary objects: repositories, ideal types, coincident boundaries and standardized forms.
 To explain the phenomena in the world of our experience, to answer the question “why?” rather than only the question “what?”, is one of the foremost objectives of all rational inquiry; and especially, scientific research in its various branches strives to go beyond a mere description of its subject matter by providing an explanation of the phenomena it investigates. While there is rather general agreement about this chief objective of science, there exists considerable difference of opinion as to the function and the essential characteristics of scientific explanation. In the present essay, an attempt will be made to shed some light on these issues by means of an elementary survey of the basic pattern of scientific explanation and a subsequent more rigorous analysis of the concept of law and of the logical structure of explanatory arguments.
In this work the distinguished physical chemist and philosopher, Michael Polanyi, demonstrates that the scientist's personal participation in his knowledge, in both its discovery and its validation, is an indispensable part of science itself. Even in the exact sciences, knowing is an art, of which the skill of the knower, guided by his personal commitment and his passionate sense of increasing contact with reality, is a logically necessary part. In the biological and social sciences this becomes even more evident.
Woodward's long-awaited book is an attempt to construct a comprehensive account of causation and explanation that applies to a wide variety of causal and explanatory claims in different areas of science and everyday life. The book engages some of the relevant literature from other disciplines, as Woodward weaves together examples, counterexamples, criticisms, defenses, objections, and replies into a convincing defense of the core of his theory, which is that we can analyze causation by appeal to the notion of manipulation.
 One possible reason for the continued neglect of statistical power analysis in research in the behavioral sciences is the inaccessibility of or difficulty with the standard material. A convenient, although not comprehensive, presentation of required sample sizes is provided here. Effect-size indexes and conventional values for these are given for operationally defined small, medium, and large effects. The sample sizes necessary for .80 power to detect effects at these levels are tabled for eight standard statistical tests: (a) the difference between independent means, (b) the significance of a product-moment correlation, (c) the difference between independent rs, (d) the sign test, (e) the difference between independent proportions, (f) chi-square tests for goodness of fit and contingency tables, (g) one-way analysis of variance, and (h) the significance of a multiple or multiple partial correlation.
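As a rough cross-check on such tables, the required sample size for the first listed test (the difference between two independent means) can be computed in a few lines. This is a normal-approximation sketch under standard assumptions (two-sided test, equal group sizes), not Cohen's exact t-based tabulation, so it lands one or two subjects per group below his tabled values for medium and large effects:

```python
from math import ceil
from statistics import NormalDist

def n_per_group(d, alpha=0.05, power=0.80):
    """Approximate per-group sample size for a two-sided two-sample
    test detecting a standardized mean difference d (Cohen's d),
    using the normal approximation n = 2 * ((z_a + z_b) / d)^2."""
    z = NormalDist().inv_cdf
    z_a = z(1 - alpha / 2)   # critical value for two-sided alpha
    z_b = z(power)           # quantile corresponding to desired power
    return ceil(2 * ((z_a + z_b) / d) ** 2)

# Cohen's conventional small, medium, and large effects for mean differences:
for label, d in [("small", 0.2), ("medium", 0.5), ("large", 0.8)]:
    print(label, n_per_group(d))
```

For .80 power at alpha = .05 this gives 393, 63, and 25 per group for small, medium, and large effects respectively, close to the tabled t-test values.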
 The concept of mechanism is analyzed in terms of entities and activities, organized such that they are productive of regular changes. Examples show how mechanisms work in neurobiology and molecular biology. Thinking in terms of mechanisms provides a new framework for addressing many traditional philosophical issues: causality, laws, explanation, reduction, and scientific change.
An adaptationist programme has dominated evolutionary thought in England and the United States during the past 40 years. It is based on faith in the power of natural selection as an optimizing agent. It proceeds by breaking an organism into unitary 'traits' and proposing an adaptive story for each considered separately. Trade-offs among competing selective demands exert the only brake upon perfection; non-optimality is thereby rendered as a result of adaptation as well. We criticize this approach and attempt to reassert a competing notion (long popular in continental Europe) that organisms must be analysed as integrated wholes, with BauplÀne so constrained by phyletic heritage, pathways of development and general architecture that the constraints themselves become more interesting and more important in delimiting pathways of change than the selective force that may mediate change when it occurs. We fault the adaptationist programme for its failure to distinguish current utility from reasons for origin (male tyrannosaurs may have used their diminutive front legs to titillate female partners, but this will not explain why they got so small); for its unwillingness to consider alternatives to adaptive stories; for its reliance upon plausibility alone as a criterion for accepting speculative tales; and for its failure to consider adequately such competing themes as random fixation of alleles, production of non-adaptive structures by developmental correlation with selected features (allometry, pleiotropy, material compensation, mechanically forced correlation), the separability of adaptation and selection, multiple adaptive peaks, and current utility as an epiphenomenon of non-adaptive structures. We support Darwin's own pluralistic approach to identifying the agents of evolutionary change.
Contents: Preface to the second edition. Introduction. 1. Knowledge in context. 2. Theory, observation and practical adequacy. 3. Theory and method I: abstraction, structure and cause. 4. Theory and method II: types of system and their implications. 5. Some influential misadventures in the philosophy of science. 6. Quantitative methods in social science. 7. Verification and falsification. 8. Popper's 'falsificationism'. 9. Problems of explanation and the aims of social science. Notes and references. Bibliography. Index.
 In recent years increasing need has been felt for a body of systematic theoretical constructs which will discuss the general relationships of the empirical world. This is the quest of General Systems Theory. It does not seek, of course, to establish a single, self-contained “general theory of practically everything” which will replace all the special theories of particular disciplines. Such a theory would be almost without content, for we always pay for generality by sacrificing content, and all we can say about practically everything is almost nothing. Somewhere however between the specific that has no meaning and the general that has no content there must be, for each purpose and at each level of abstraction, an optimum degree of generality. It is the contention of the General Systems Theorists that this optimum degree of generality in theory is not always reached by the particular sciences.
How do we go about weighing evidence, testing hypotheses, and making inferences? According to the model of Inference to the Best Explanation, we work out what to infer from the evidence by thinking about what would actually explain that evidence, and we take the ability of a hypothesis to explain the evidence as a sign that the hypothesis is correct. In Inference to the Best Explanation, Peter Lipton gives this important and influential idea the development and assessment it deserves. The second edition has been substantially enlarged and reworked, with a new chapter on the relationship between explanation and Bayesianism, and an extension and defence of the account of contrastive explanation. It also includes an expanded defence of the claims that our inferences really are guided by diverse explanatory considerations, and that this pattern of inference can take us towards the truth. This edition of Inference to the Best Explanation has also been updated throughout and includes a new bibliography.
Strategies for hypothesis testing in scientific investigation and everyday reasoning have interested both psychologists and philosophers. A number of these scholars stress the importance of disconfirmation in reasoning and suggest that people are instead prone to a general deleterious "confirmation bias." In particular, it is suggested that people tend to test those cases that have the best chance of verifying current beliefs rather than those that have the best chance of falsifying them. We show, however, that many phenomena labeled "confirmation bias" are better understood in terms of a general positive test strategy. With this strategy, there is a tendency to test cases that are expected (or known) to have the property of interest rather than those expected (or known) to lack that property. This strategy is not equivalent to confirmation bias in the first sense; we show that the positive test strategy can be a very good heuristic for determining the truth or falsity of a hypothesis under realistic conditions. It can, however, lead to systematic errors or inefficiencies. The appropriateness of human hypothesis-testing strategies and prescriptions about optimal strategies must be understood in terms of the interaction between the strategy and the task at hand. A substantial proportion of the psychological literature on hypothesis testing has dealt with issues of confirmation and disconfirmation. Interest in this topic was spurred by the research findings of Wason (e.g., 1960, 1968) and by writings in the philosophy of science (e.g., Lakatos, 1970; Platt, 1964; Popper, 1959, 1972), which related hypothesis testing to the pursuit of scientific inquiry. Much of the work in this area, both empirical and theoretical, stresses the importance of disconfirmation in learning and reasoning. In contrast, human reasoning is often said to be prone to a "confirmation bias" that hinders effective learning. However, confirmation bias has meant different things to different investigators, as Fischhoff and Beyth-Marom point out in a recent review (1983). For example, researchers studying the perception of correlations have proposed that people are overly influenced by the co-occurrence of two events and insufficiently influenced by instances in which one event occurs without the other (e.g.,
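The failure mode of the positive test strategy is easy to simulate. In Wason's 2-4-6 task, cited above, a subject who hypothesizes "ascending even numbers" and tests only triples the hypothesis endorses can never discover that the hidden rule, "any ascending triple", is broader. A toy sketch, with the rule and hypothesis chosen purely for illustration:

```python
def hidden_rule(triple):
    """The experimenter's rule: any strictly ascending triple."""
    a, b, c = triple
    return a < b < c

def hypothesis(triple):
    """The subject's narrower guess: ascending even numbers."""
    a, b, c = triple
    return a < b < c and all(x % 2 == 0 for x in triple)

# Positive tests: cases the hypothesis says "yes" to. Because the
# hypothesis is strictly contained in the hidden rule, every positive
# test gets the expected "yes" and the mismatch stays invisible.
positive_tests = [(2, 4, 6), (6, 8, 10), (20, 40, 60)]
assert all(hidden_rule(t) == hypothesis(t) for t in positive_tests)

# A negative test exposes the mismatch: (1, 3, 5) fits the hidden
# rule even though the hypothesis rejects it.
assert hidden_rule((1, 3, 5)) and not hypothesis((1, 3, 5))
```

This is the structural point of the abstract: whether positive testing succeeds or fails depends on how the hypothesized set sits relative to the true one, i.e. on the task, not on the strategy alone.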
The demarcation of science from other intellectual activities, long an analytic problem for philosophers and sociologists, is here examined as a practical problem for scientists. Construction of a boundary between science and varieties of non-science is useful for scientists' pursuit of professional goals: acquisition of intellectual authority and career opportunities; denial of these resources to pseudoscientists; and protection of the autonomy of scientific research from political interference. 'Boundary-work' describes an ideological style found in scientists' attempts to create a public image for science by contrasting it favorably to non-scientific intellectual or technical activities. Alternative sets of characteristics available for ideological attribution to science reflect ambivalences or strains within the institution: science can be made to look empirical or theoretical, pure or applied. However, selection of one or another description depends on which characteristics best achieve the demarcation in a way that justifies scientists' claims to authority or resources. Thus, science is no single thing: its boundaries are drawn and redrawn in flexible, historically changing and sometimes ambiguous ways.
 This classic work in the philosophy of physical science is an incisive and readable account of the scientific method. Pierre Duhem was one of the great figures in French science, a devoted teacher, and a distinguished scholar of the history and philosophy of science. This book represents his most mature thought on a wide range of topics.
Book review by David Bohm of The Structure of Scientific Revolutions, by Thomas S. Kuhn. International Encyclopaedia of Unified Science, Vol. II, No. 2 (Chicago and London: University of Chicago Press, 1962. Pp. xvi + 172. Price 22s 6d or $3.00). The Philosophical Quarterly, Volume 14, Issue 57, October 1964, pp. 377–379. https://doi.org/10.2307/2217783. Published: 01 October 1964.
When first published in 1959, this book revolutionized contemporary thinking about science and knowledge. It remains one of the most widely read books about science to come out of the twentieth century.
 Abstract This book presents an empiricist alternative (‘constructive empiricism’) to both logical positivism and scientific realism. Against the former, it insists on a literal understanding of the language of science and on an irreducibly pragmatic dimension of theory acceptance. Against scientific realism, it insists that the central aim of science is empirical adequacy (‘saving the phenomena’) and that even unqualified acceptance of a theory involves no more belief than that this goal is met. Beginning with a critique of the metaphysical arguments that typically accompany scientific realism, a new characterization of empirical adequacy is presented, together with an interpretation of probability in both modern and contemporary physics and a pragmatic theory of explanation.
 Two books have been particularly influential in contemporary philosophy of science: Karl R. Popper's Logic of Scientific Discovery, and Thomas S. Kuhn's Structure of Scientific Revolutions. Both agree upon the importance of revolutions in science, but differ about the role of criticism in science's revolutionary growth. This volume arose out of a symposium on Kuhn's work, with Popper in the chair, at an international colloquium held in London in 1965. The book begins with Kuhn's statement of his position followed by seven essays offering criticism and analysis, and finally by Kuhn's reply. The book will interest senior undergraduates and graduate students of the philosophy and history of science, as well as professional philosophers, philosophically inclined scientists, and some psychologists and sociologists.
 Abstract Nancy Cartwright argues for a novel conception of the role of fundamental scientific laws in modern natural science. If we attend closely to the manner in which theoretical laws figure in the practice of science, we see that despite their great explanatory power these laws do not describe reality. Instead, fundamental laws describe highly idealized objects in models. Thus, the correct account of explanation in science is not the traditional covering law view, but the ‘simulacrum’ account. On this view, explanation is a matter of constructing a model that may employ, but need not be consistent with, a theoretical framework, in which phenomenological laws that are true of the empirical case in question can be derived. Anti‐realism about theoretical laws does not, however, commit one to anti‐realism about theoretical entities. Belief in theoretical entities can be grounded in well‐tested localized causal claims about concrete physical processes, sometimes now called ‘entity realism’. Such causal claims provide the basis for partial realism and they are ineliminable from the practice of explanation and intervention in nature.
Vitaly Tambovtsev | Science management theory and practice
The interaction of society with the social sciences differs from its relations with the natural sciences. The development of the latter yields information from which technologies are created that improve the conditions and quality of life of large masses or particular groups of the population, while many results of the modern social sciences benefit primarily those who produce them. There are exceptions, of course. The article discusses two main reasons for this. First, popular (naive, intuitive) social theories are widespread among all citizens, and these often replace decision makers' reliance on scientific results. Second, a number of social-science methodologists contrast the objects studied by the natural and social sciences, emphasizing that in the latter it makes sense to conduct mainly qualitative research, which amounts to identifying subjective understandings of the reasons or meanings behind why people behave one way and not another, while identifying regularities is held to be too difficult. In conclusion, the article discusses whether these reasons can be overcome.
Mark P. Hertenstein | Oxford University Press eBooks
 Abstract Karl Barth’s Church Dogmatics and Wolfhart Pannenberg’s Systematic Theology are now considered classic texts of twentieth-century theology. Befitting their deserved status, Barth and Pannenberg elaborate deep theologies of the divine attributes in general and of omnipresence in particular. While both share common concerns, foremost that omnipresence is an absolute attribute of the trinitarian God, their distinctiveness is instructive and illuminating. Barth describes a divine spatiality in God’s perfect inner-trinitarian presence, which is the basis for God’s general presence in creation and his special presence in reconciliation in Christ. Pannenberg’s doctrine describes a diversity of ways in which God is omnipresent, enduring through time, founded upon the trinitarian relations that resolve the tension of transcendence and immanence. This chapter explores the distinct doctrines of these important twentieth-century theologians and analyses the potential advantages and flaws of Barth’s and Pannenberg’s approaches.
Andrew Cooper | British Journal for the History of Philosophy
 Abstract Mona Simion and Christoph Kelp (2020) have recently challenged the traditional conceptual engineering project. They defend a reorientation of this project that moves away from correcting conceptual shortcomings and emphasizes conceptual innovation instead. Central to their proposal is the role played by etiological functions. The present paper argues that this approach leaves them without a specific mechanism for conceptual innovation. It then proposes one such mechanism, which operates independently of functions, by identifying the crucial role of conceptual refinement. After developing the refinement‐based approach, it illustrates how this approach works by considering instances of refined conceptual change from mathematics and logic as well as the sciences, and argues that the approach avoids the challenges faced by the function‐based view.
 Abstract This paper augments Mark Bickhard’s interactivist model by incorporating mechanistic explanation, computation, and a non-encodingist correspondence. It argues for a mechanistic framework (drawing on the new mechanism’s focus on organized entities and activities) that preserves core interactivist principles: process ontology, representation as emergent from interaction, rejection of encodingism, and the centrality of system-detectable error. However, grounding representational normativity solely in action outcomes or self-maintenance faces limitations. This paper proposes instead that normativity arises from epistemic norms within cognitive problem-solving, encompassing alethic, pragmatic, and economic dimensions addressed through bounded rationality under functional constraints. A novel framework integrating interactivism with this mechanistic view, computation, and infocorrespondence (a specific, non-encodingist correspondence based on information channels) is developed. This framework emphasizes semantic information processed by computational mechanisms, with inconsistency detection between information vehicles (leveraging informational redundancy) serving as a key system-accessible proxy for representational error (falsity). This synthesis offers a more robust account of error detection, the truth-success relationship, and the grounding of epistemic normativity, thereby enriching interactivist theory.
Michele Pizzochero | Journal for General Philosophy of Science
 The term “epistemic object” has been recently used by some scholars in the history and philosophy of science to refer to the peculiar history of objects of inquiry such as RNA, genes, electrons, or phlogiston. Despite the relative success of this neologism as an analytical tool, a comprehensive analysis of its many versions is still lacking. In this article, an attempt has been made to sketch such an analysis first by comparing three main versions of this idea: epistemic things, epistemic objects, and representations of theoretical entities. Second, these conceptions are compared with the notion of scientific concept, arguing that, although similar, they are not the same thing. However, a proposal suggested from the history of concepts program, Klaus Hentschel's semantic layered methodology, could be usefully adapted for epistemic things. Third, accomplishing such adaptation by drawing from the tradition of evolutionary epistemology is recommended, analyzing the potential fit between historical epistemology and the Evolutionary Epistemology of Theories programme.
Ave Mets | Acta Baltica Historiae et Philosophiae Scientiarum
 I continue to examine the characteristics of φ-science in detail, this time focusing specifically on their empirical, material side: the material world studied by the sciences that could be classified as φ. Vihalemm left us only a few indications of what he thought about the material world as a domain of science. I seek further clues in Galileo's studies, which for Vihalemm were the epitome of φ-ness, along with contemporary recreations of them. I discuss contemporary guides on addressing the manifestations of the "imperfections" of materiality and their impact on measurement (errors and uncertainties), applying them to Galileo's cases, to other previously explored case studies from Vihalemm's repertoire, and to comparisons with non- or less-φ sciences. Finally, I look into Norman Robert Campbell's ideas of quantities, briefly discussed previously, and apply these to the cases at hand. The article concludes with a recourse to the idea of essence.
 Abstract How are we to understand the historical origins of contemporary predictive systems? How have criminologists designed methods that were built into these systems? The article responds to these questions by narrating a prehistory of predictive systems and the work of British criminologist Leslie Thomas Wilkins (1915–2000). Co-author of the first Home Office research study, Prediction Methods in relation to borstal training (1955), Wilkins spent the 1960s translating cybernetic ideas and methods to the discipline of criminology. Through leadership positions in the Home Office, the United Nations in Japan and at UC Berkeley, Wilkins formulated foundational methodological ideas that influenced a generation of policy and systems-oriented criminologists and the products of their work. The article turns to archival records from a 25-year period in Wilkins’s career (1945–70) to advance the claim that Wilkins’s unrealized proposals for an ‘operational criminology’ reveal several key latent methodological commitments designed into contemporary predictive systems.
 Abstract This article is a response to Ronald J. Allen's “Reflections on Complexity, Evidence, and Law.” I begin by analyzing three key concepts that Allen employs in his argument: reductionism, emergence, and complexity. On the basis of this analysis, I question Allen's criticism of the reductionist approach that, according to him, legal scholarship has traditionally taken to the study of law. There is a neutral sense of the word “reductionism” according to which most disciplines and sciences can be considered reductionist, and a pejorative sense according to which reductionism is epistemically objectionable. I try to distinguish these two senses, arguing that while legal scholarship is clearly reductionist in the first sense, it is not at all clear that it is also reductionist in the second sense. I conclude by putting some questions to Allen about the practical advantages that the study of law may gain by adopting methods and concepts drawn from complexity theory.