This volume collects twelve essays from a diverse group of scholars (mainly philosophers). Each contribution addresses foundational issues in the growing, and so far epistemologically underexplored, field of molecular medicine. Focusing philosophers’ attention on such a field of biomedical inquiry is a valuable goal in itself, and a merit of this book. But this volume is especially valuable because, as the editors, Giovanni Boniolo and Marco J. Nathan, affirm in the Introduction, it exemplifies how philosophers of science are contributing to the theoretical development of scientific disciplines. In many fields, we are witnessing the spread of an anti-intellectual attitude that considers conceptual reflection “an expensive ‘luxury’”. But, “discounting the significance of conceptual reflection is myopic” (p. 1), especially for disciplines that are experiencing a rapid accumulation of empirical data but do not yet possess a well-established theoretical framework able to adequately account for those data. This book tries to counter that tendency.
Some of the reflections in it will also be of interest to scientists and practitioners involved in the development of molecular medicine. Indeed, even though many chapters deal with an analysis of some traditional philosophical issues (e.g. causation, explanation) in the context of molecular medicine, these contributions do not merely review current research. Many chapters advance and defend precise theoretical proposals, which are documented and empirically informed.
More concretely, this volume tries to contribute to the debate on some controversial issues in molecular medicine (e.g. cancer heterogeneity, clinical trials, data mining of biological databases), and to do that in a proactive way. This is a relevant feature of the book, which is worth noting, since it makes this book interesting whether or not one agrees with any of the theses proposed in it. Even if every thesis advanced and defended turns out to be wrong in some sense, the relevance of such an attempt would remain intact. Indeed, its main aim is not to provide an exhaustive treatment of acquired knowledge, but to foster the current debate at the frontier of biomedical research, and provide fertile hypotheses to be discussed, tested, rejected, or accepted. In other words, this book tries to contribute to the theoretical advancement of a scientific discipline from a philosophical perspective. Advancement of science is impossible when new theoretical hypotheses are not proposed. This book faces the challenge of showing that philosophers of science can provide fertile theoretical hypotheses that are able to stimulate the scientific debate and the search for better answers.
The book has four parts, each containing three chapters. Each part focuses, respectively, on one of the following “key aspects of molecular medicine: What are its nature, scope and limits? How does it provide explanations? How does it represent and model phenomena of interest? How does it infer new knowledge from data and experiments?” (p. 6). The editors’ Introduction is a useful tool. It traces a brief historical sketch of the origin of molecular medicine, and recapitulates some of the crucial steps in the development of the philosophy of science, particularly focusing on the philosophical investigation of biology. In this way, the Introduction helps the reader to locate the volume in the current philosophical landscape. The editors also give a brief overview of the collected essays. An index completes the book.
In what follows, for reasons of space, I will briefly describe each chapter in order. These descriptions vary in length for two reasons: first, how close the topics addressed in each chapter are to my own area of competence; second, how close they are to the overall topic of molecular medicine.
In “Molecular Medicine: The Clinical Method Enters the Lab”, Boniolo addresses the issue of whether molecular medicine represents a genuine novelty. He discards the hypothesis that the novelty of molecular medicine is given by the level at which reality is investigated in this discipline, since investigations of the molecular level are not the exclusive concern of molecular medicine. He also discards the hypothesis that molecular medicine displays some methodological novelty, since this discipline has so far conformed to the traditional standards of the experimental method, which was introduced in medicine in the 19th century by Claude Bernard. To support the view that molecular medicine conforms to this method, Boniolo considers how cell lines are artificially made homogeneous and clonal in the laboratory. Obtaining such artificially produced cell lines is crucial, because it allows scientists to set up experiments through which a well-defined and isolated object can be investigated and manipulated in repeatable and controlled ways. In this perspective, molecular medicine is a science that studies kinds rather than particulars in the same way that physics does.
But something is changing. Current research on tumor heterogeneity (i.e. research devoted to investigating the relevant differences found among patients affected by the same kind of tumor, and among different parts of the same tumor) is blurring the traditional distinction between the clinical method and the experimental method. The clinical method deals with observations made at the bedside of individual patients, while the experimental method manipulates standardized objects in order to formulate general law-like statements. Molecular medicine has so far used the experimental method. But this method seems inadequate to face the challenges posed by tumor heterogeneity. To better understand the peculiarity of each tumor, tumor specimens have recently started to be cultivated in a different way, which no longer leads to a clearly individuated and general ‘object’ (i.e. a homogeneous and clonal cell line), but which takes into account the complex and peculiar cellular environment of a specific tumor (primary tumor cells). Boniolo claims that this methodological shift constitutes the genuine novelty of molecular medicine. In this view, molecular medicine adopts techniques developed according to the experimental method to investigate a specific individual ‘object’ in order to make diagnoses and prognoses in a way that is analogous to the clinical method. According to Boniolo, in this way, contrary to the traditional opinion, it is possible to do science of the individual, since “the tumor cancer cell population actually is an individual (patient) in vitro” (p. 29).
Mariacarla Gadebusch Bondio and Francesco Spöring (“Personalized Medicine: Historical Roots of a Medical Model”) discuss the historical development of the concept of ‘personalized medicine’. By analyzing the historical roots and the methodological difficulties that the proponents of personalized medicine had to face in order to quantify qualities and features of individuals, they allow us to better understand important and often neglected aspects of contemporary medicine, such as psychodynamics and culture-specific aspects of illness. These considerations are also philosophically relevant, because they help to highlight the limits of genetic reductionism.
“From the Concept of Disease to the Geneticization of Diseases: Analyzing and Solving the Paradox of Contemporary Medical Genetics”, by Marie Darrason, addresses what Darrason calls the ‘paradox of contemporary medical genetics’. Until the 1960s, ‘genetic disease’ meant ‘monogenic disease’: an inherited Mendelian mutation in one gene was the cause of one disease. But things have changed. On the one hand, nowadays almost any disease can be considered ‘genetic’, at least to some extent (this is the ‘geneticization of diseases’). On the other hand, we now consider almost no genetic disease to be truly ‘monogenic’. Thus, it seems that we lack a plausible definition of genetic disease, while simultaneously considering almost every disease to be genetic. Darrason scrutinizes and discards three proposed solutions to this paradox, and finally proposes her own. According to her, we should not continue trying to neatly distinguish between genetic and non-genetic diseases. That strategy is based on the idea that identifying a disease as a ‘genetic disease’ allows us to identify its most relevant cause. But if diseases widely display multi-causal dependencies, this strategy cannot lead to clinical success. We should instead abandon the concept of ‘genetic disease’ and develop genetic theories of diseases, i.e. explanations of the common role of genes in diseases, even when genes are not ‘the most’ or ‘the only’ relevant cause of a disease. Interestingly, she notes that solving or clarifying such epistemic issues might also be relevant to solving ethical issues.
In “Molecular Complexity: Why Has Psychiatry Not Been Revolutionized by Genomics (Yet)?”, Maël Lemoine tries to answer the question: why has psychiatry not been a successful field of application for genomics? Lemoine focuses mainly on Genome-Wide Association Studies (GWAS), which consist in investigating “all genes in the genomes of a group of diseased people, as compared with healthy people’s genome in search for a statistically significant association with particular disease” (p. 81). GWAS have been extensively used in molecular psychiatry. Nevertheless, they have been unsuccessful in investigating schizophrenia, bipolar disorder, and major depressive disorder, i.e. the three most investigated mental disorders in molecular psychiatry. According to Lemoine, GWAS have been unsuccessful because mental disorders display a form of causal complexity that this technique is unable to manage. The intricate way in which the environmental and genetic levels interact in triggering mental disorders cannot be adequately dealt with by tools like GWAS. More sophisticated statistical models are needed to make progress in molecular psychiatry.
“How Cancer Spreads: Reconceptualizing a Disease”, by Katherine E. Liu, Alan C. Love, and Michael Travisano, suggests that cancer should be conceptualized as an infectious disease and not just as a genetic disease. Despite huge investments, and the tremendous increase in our knowledge of the biology of cancer, cancer mortality has not declined significantly in recent decades. Since cancer is usually considered a genetic disease, and this way of conceptualizing it has not led to the development of effective treatments, Liu, Love, and Travisano suggest that it may be useful to model cancer as an infectious disease. They support their view by drawing an analogy between cancer and cystic fibrosis. Cystic fibrosis is “both a genetic and an infectious disease, and its pathological facets require taking the latter into account explicitly” (p. 102). Analogously, in this perspective, cancer should be modeled as an infectious disease in order to better understand and counter the formation and spread of metastases, which are what actually kill people in many cases. The way cancer spreads is an under-investigated topic precisely because modeling cancer as a genetic disease has led scientists to focus especially on tumorigenesis (how cancer grows).
“Evolutionary Perspectives on Molecular Medicine: Cancer from an Evolutionary Perspective”, by Anya Plutynski, maintains that an evolutionary perspective is necessary in order to make sense of cancer progression. In this view, “cancer arises from a Darwinian process of mutation and selection among somatic cells” (p. 126). An evolutionary approach to cancer investigates how competition among cancer cells for space and resources affects cancer progression. To illustrate the theoretical gains that can be derived from modeling cancer progression from an evolutionary perspective, Plutynski presents two mathematical models of cancer progression in which two cell populations interact. This kind of model, widely used in ecology and evolutionary biology, describes the coevolution of two interacting populations by means of a set of differential equations. According to Plutynski, by describing the dynamics of interacting cancer cell populations, these models can help scientists predict when and why cancer progresses or fails to progress. Moreover, modeling cancer in evolutionary terms can have a heuristic role, helping to formulate new hypotheses worthy of further investigation.
Federico Boem and Emanuele Ratti (“Toward a Notion of Intervention in Big-Data Biology and Molecular Medicine”) address the issue of how to conceive of data mining in big-data driven projects. They maintain that data mining of big biological databases can be considered a sort of ‘formal manipulation’, which is more akin to traditional experimentation than is usually thought. In this view, searching for relevant correlations in the vast sea of data stored in a database by re-ordering those data according to some criterion is in some sense analogous to what experimenters routinely do, i.e. playing with phenomena in order to discover something new about them through empirical manipulation.
In “Pathways to the Clinic: Cancer Stem Cells and Challenges for Translational Research”, Melinda Bonnie Fagan tries to explain why clinical translations of cancer stem cell research have so far been ineffective. Cancer stem cells are a small sub-population of self-renewing cells within a tumor or blood-borne cancer, posited to be responsible for maintaining and growing the malignancy. According to Fagan, it is necessary to clarify the very concept of ‘cancer stem cell’ in order to settle the controversy over the validity of research devoted to understanding the role of these cells in tumorigenesis. In her view, we must distinguish between two cancer stem cell concepts with different content, i.e. the clinical concept and the basic one, each of which has to be evaluated by its own specific success criteria.
Nathan focuses on counterfactual reasoning in medicine in “Counterfactual Reasoning in Molecular Medicine”. He claims that counterfactuals are central in modeling diagnosis and prognosis in medical practice. To support his view, he shows that diagnosis can be conceived of as an inference to the best explanation, while prognosis can be construed as an inference to the best prediction. According to Nathan, modal reasoning is crucial for both kinds of inference. Indeed, in selecting the best explanation (or prediction), we have to deal not only with what is the case, but also with what could have been the case. How should we conceive of counterfactuals? The debate on counterfactuals usually deals with semantic issues. According to Nathan, counterfactuals can instead be better interpreted as placeholders. In this view, counterfactuals stand in for predictions or explanations that a speaker commits to producing. Thus, counterfactuals are neither explanations nor predictions per se (and so there is no need to refer to their truth or falsity when analyzing them), but useful tools (in some sense analogous to models) for reasoning about certain circumstances and conveying information about them in order to make decisions about future courses of action.
“Forms of Extrapolation in Molecular Medicine”, by Pierre-Luc Germain and Tudor Baetu, is devoted to clarifying the concept of extrapolation in the context of biomedicine. In biomedical research, scientists routinely draw inferences from some source system to some target system, relying on the existence of some degree of similarity between the two. Just think of the use of model organisms (e.g. mice) in the development of new drugs, where a model organism is supposed to be sufficiently similar to humans, or the use of clinical trials in the validation of a new treatment, where a trial population is considered to be sufficiently similar to the overall target population. According to Germain and Baetu, the problem is that the validity of extrapolation cannot be ensured by purely statistical means. Indeed, the validation of an extrapolation requires an assessment of relevant similarities and dissimilarities. But the assessment of relevant similarities is a process that not only “escapes statistical solutions” (p. 218), but also “keeps resisting philosophical analysis” (p. 230).
David Teira (“Testing Oncological Treatments in the Era of Personalized Medicine”) argues for a ‘moderate reformist position’ on cancer drug testing rules. There is indeed a tension between governmental agencies’ requirement of large randomized clinical trials to authorize the commercialization of a new treatment, and improvements in our ability to distinguish among diverse populations of patients on a molecular basis in order to produce targeted therapies. The more we restrict our target population, the more difficult it is to reach the population size needed to meet the statistical requirements of clinical trials. As a result, many remedies risk not being accessible to many patients. According to Teira, in certain circumstances, smaller phase II trials may provide enough ground for regulatory agencies to authorize targeted therapies (for patients who give their informed consent) during the period needed to test those therapies in large and long randomized trials.
In “Opportunities and Challenges of Molecular Epidemiology”, Federica Russo and Paolo Vineis describe how epidemiology has changed in recent years as a result of the adoption of a molecular approach. They focus in particular on the introduction of the concept of ‘exposome’ and the development of the ‘meeting-in-the-middle’ methodology. The concept of exposome refers to the study of molecular risk factors that can be detected both at the exterior and in the interior of the exposed organism. The ‘meeting-in-the-middle’ methodology is designed to detect not merely correlations between dangerous factors and disease, but all the relevant causal steps that lead from exposure to disease. Russo and Vineis also stress that new -omics technologies produce a huge amount of data, which cannot simply be treated statistically, but requires prior knowledge and hypotheses in order to be analyzed.
At the beginning of this review, I said that this book’s general philosophical attitude is valuable. It is more difficult to assess whether the book completely succeeds in fulfilling its self-assigned task, i.e. showing that philosophy integrates and complements the scientific enterprise. It is fair to conclude that this volume is a clear example of an increasing tendency towards integration among different fields of human knowledge in the process of better understanding what science is as a human enterprise. Like many collections, the chapters display differing degrees of fit with the volume’s main topic. They also differ in their capacity to formulate fertile new theoretical hypotheses. Indeed, some of the theoretical proposals advanced seem directed more to the analytic philosophy community than to the molecular science community, while others should confront more systemic viewpoints that are already part of the contemporary debates on the same issues. Such proposals thus risk being deemed less relevant for scientists and philosophers alike, since they address questions that are of major concern for only a limited portion of the professional philosophy community. Other proposals, however, look very promising and worthy of serious consideration. A clear evaluation of the fertility of these promising hypotheses will be possible only after they have been widely discussed. I hope to have offered a useful summary of each chapter with this objective in mind.
Finally, the volume addresses some relevant issues but does not consider others, nor are relevant alternative perspectives on some of the controversial issues it treats presented and discussed. For example, the somatic mutation theory of carcinogenesis is discussed (or, more often, tacitly assumed), while its main rival, the tissue organization field theory, and its explanatory models are not mentioned. Another neglected issue is the difficulty with which traditional molecular techniques (and the molecular perspective in general) deal with the complexity of the inter-level regulatory dynamics of biological organisms. So, it is not easy to agree with the theses presented in some of the chapters. But, since exhaustiveness and shared consensus are not declared aims of this book, these features need not be considered real weaknesses of this stimulating volume, which will certainly inspire further research on the topics it addresses and discussion of the theses it advances.
Originally published by ndpr.nd.edu on June 04, 2017.