Lecture notes for Theory of Science at the University of Groningen - 2014/2015
Lecture 1
1. What is this thing?
A common view of science is that scientific knowledge is simply derived from the facts. In the first part of the book this view is criticized, because much of the statement cannot be justified; this chapter describes why the view is nevertheless not completely incorrect. The claim is that science is based on facts and that these can be collected with the senses: it is about what we can perceive, not about personal opinions. If observation is done carefully and without prejudice, the facts provide a secure, objective basis for science.
In the seventeenth century, modern science became possible. Until that time, knowledge was based on the authority of Aristotle and the Bible. Two schools of thought formally state this common view. Empiricism claims that scientific knowledge is based on sense perception. Positivism expresses the same idea in a broader, less psychologically oriented sense. The logical positivists built further on positivism and paid more attention to the logical form of the relationship between scientific knowledge and facts. Two issues are involved in the statement that science is derived from facts. The first is the nature of the facts and how scientists can have access to them; the second is how laws and theories are derived from facts once they have been obtained. The assumed basis of science can be summarized in three statements. Firstly, observers perceive the world via their senses in an unprejudiced fashion. Secondly, the observed exists independently of any theory. Thirdly, the observed is treated as facts that serve as a reliable foundation for scientific knowledge.
Seeing is believing
Sight is the sense most used to observe the world; therefore only sight is addressed here, but the argument extends to the other senses. Two points in the empiricist view of science are important. First, human observers have direct access to knowledge of facts about the world insofar as they can see them. Second, two observers who look at the same object from the same place will see the same thing. However, both ideas are misleading.
Visual experiences not determined solely by the object viewed
There is much evidence that observers do not always see the same thing even when they are looking at the same object. For example, optical illusions show that two observers do not always see the same objects in a given picture. Moreover, they cannot voluntarily switch between the objects they see in a picture. What you see also depends on culture, as experiments with African participants have shown. These examples show that one's sensory experience is not only determined by the outside world but also by one's inner world: one's beliefs and former experiences. Recording what one observes is not simply writing down one's perceptions; it involves background information. Phenomena can only be understood if one knows the context. It is therefore important to have knowledge prior to one's perceptions. Under most circumstances, however, what we see remains stable, so science is not made impossible.
Observable facts expressed as statements
The meaning of the word 'fact' is ambiguous: it can refer to a statement that expresses the fact, or to the state of affairs referred to by such a statement. When we say that knowledge is derived from facts, it is the former meaning, facts as statements, that is relevant. Additionally, we have to distinguish statements of facts from the perceptions that might occasion the acceptance of those statements as facts. If you claim that knowledge is derived from facts, you have to think of statements, not of perceptions or objects. These points count against the claims made earlier about the nature of facts. Even if we assume that we directly see things, it is obvious that we do not perceive statements that describe states of affairs (observation statements) via the senses; statements do not enter the brain via the senses. To formulate an observation statement, one needs a conceptual framework. We do not first observe facts and then derive knowledge about the topic from them, because the facts already presuppose knowledge about the topic. For instance, a child has already learned a lot about apples before it can indicate the presence of an apple.
Why should facts precede theory?
The idea that we must establish facts before we can build a fitting theory around them does not survive closer scrutiny. How could we establish facts through observation if we did not know what kind of knowledge we are looking for or which problems we are trying to solve? We therefore abandon the idea that establishing facts must precede the formulation of theories, and acknowledge that the formulation of observation statements presupposes knowledge and that the search for facts is guided by knowledge. Despite this, the question remains which statements can be derived from observation and which cannot.
The fallibility of observation statements
In the previous part we presupposed that observation statements can be established by observation, but is this presupposition legitimate? As already mentioned, observers can perceive things differently, which can result in disagreements about what the observable states of affairs exactly are. If we use categories to describe observations, there is a risk that the categories are mistaken and hence the observation statements as well. This is illustrated by the example of the motion of the earth (see book). From this example it can also be concluded that judging the truth of an observation statement depends on the knowledge that forms the background against which the judgement is made. In addition to adjusting the theory, the description of the observable facts also had to be changed (for an illustration with regard to Copernicus, see book). Here we see that relying on observable facts caused a mistake that had to be corrected. It also shows that, whether or not you believe that scientific knowledge is based on facts acquired by observation, you have to acknowledge that knowledge and facts are fallible and subject to correction. Science based on observation is therefore not as straightforward as the public believes.
2. Why, and how?
Science is nowadays seen as the embodiment of rationality; society grants it almost unconditional trust and esteem. Associations made with science include a critical attitude and open-mindedness. Science is further associated with liberal democracy and thus functions as a characteristic of modern society. Science reveals new knowledge and communicates this knowledge to society. Epistemology, a main branch of philosophy, is the theory of knowledge. The questions asked in epistemology deal with the origin and the legitimacy of knowledge. Rationalism is one direction in epistemology. Rationalists believe that knowledge is based on innate, i.e. naturally given, ideas. Empiricists believe the exact opposite, namely that knowledge comes from the senses: everything a person knows has first been experienced through one of the five senses, and at birth a human being does not possess any innate knowledge.
Related to epistemology are realism and idealism. Realists believe that knowledge pictures the objective world; truth is the correspondence between knowledge and the world. There is, however, no way to prove realism, since it is impossible for us to establish the truth definitively. We will always have to be satisfied with theoretical knowledge that cannot be conclusively proven. Idealists, on the other hand, regard knowledge as a subjective social construction. All knowledge has originally been made up by individuals and has been carried along in society. The knowledge of the individual is his or her own construct; knowledge is mind-dependent. For the individual, truth is coherence with his or her own knowledge.
Idealism is in line with relativism. Relativism holds that no theory or concept is absolutely true or valid; all are dependent on subjective views and social or historical contexts. Objective truth does not exist; truth is dependent on the individual.
The pragmatic view holds that knowledge is functional and interactive: knowledge exists so that human beings can cope with the world, and truth is what enables us to succeed. The pragmatic view can be regarded as an intermediate position between objectivism and subjectivism, and between realism and idealism.
Several concepts characterize the scientific method: systematic procedures, well-defined methods, reduction (reducing phenomena to underlying principles), objectivity and clarity. Further, science is revisable. The hallmark of science is explanation: describing the underlying mechanisms that cause a phenomenon. In contrast to non-scientific common-sense concepts, science is testable. Science is continuous; a sharp division between prescientific and scientific knowledge does not exist. Science is driven by theories, defined as coherent statements that organize, predict and explain behavior. Hypotheses about events can be derived from theories.
Bem explains the difference between deductive and inductive reasoning tasks. If a problem has a well-defined structure within a system of formal logic, the conclusion is certain and we are dealing with a deductive reasoning task. Background knowledge from long-term memory is not necessary to solve such problems; they can be solved using logically valid arguments alone. The content is not important; only the form of the arguments counts. The truth of the conclusion depends on the truth of the premises.
Inductive reasoning tasks, on the other hand, yield a highly probable conclusion which is, however, not necessarily true. The problem may still have a well-defined structure in a system of formal logic.
Inductive problem-solving involves processes of hypothesis generation and testing. Accumulating information from positive or negative instances is used to come to a solution to the problem.
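To make the contrast concrete, both argument forms can be written schematically; this is a standard textbook illustration rather than an example taken from Bem's text.

```latex
% Deductive (modus ponens): valid by form alone; content is irrelevant
P \rightarrow Q,\quad P \;\;\therefore\;\; Q
% e.g. "If it rains, the street gets wet; it rains; so the street gets wet."
% If the premises are true, the conclusion is certain.

% Inductive (enumerative generalization): conclusion only probable
Fa_1,\; Fa_2,\; \ldots,\; Fa_n \;\;\therefore\;\; \forall x\, Fx
% e.g. "Swan 1 is white, ..., swan n is white; so all swans are white."
% However many instances n are observed, the conclusion is not guaranteed.
```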
In science we are dealing with two different contexts: the context of discovery and the context of justification. In the context of discovery the focus lies on describing the historical, social and psychological circumstances and influences relevant to the invention and discovery of scientific theories; the aim is to find out under which conditions science works. In the context of justification we focus on normative criteria for holding a theory true. According to the traditional view, philosophy of science is only about justification, not about the circumstances of the problem-solving situation.
Lecture 2
3. Falsificationism
Karl Popper was the main advocate of falsificationism. Genuine scientific theories and hypotheses must make definite predictions that allow them to be ruled out by observations; they must be falsifiable. The observations that would falsify a theory or hypothesis must be clearly definable. The easier a theory would be to falsify, i.e. the broader and more definite its predictions, the better the theory. If observations are made that contradict a theory, the theory is refuted; a universal, well-established theory can be refuted by a single observation. As in evolution, in science only the fittest survive: only theories that have not been refuted by any observation remain. Scientists' aim should be to develop highly falsifiable theories and hypotheses, and then to make deliberate attempts to falsify them. Falsificationism implies that a theory can never be proven correct; all that can be achieved is for a theory not yet to have been proven wrong, i.e. not falsified.
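The logic behind this asymmetry is the deductively valid schema of modus tollens, which is how Popper's point is standardly formalized:

```latex
% Modus tollens: one failed prediction deductively refutes the theory
T \rightarrow O,\quad \neg O \;\;\therefore\;\; \neg T
% The reverse inference (T -> O, and O is observed, therefore T) is the
% invalid fallacy of affirming the consequent, which is why confirming
% observations can never prove a theory correct.
```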
Sophisticated falsificationism
The aim of science is to falsify theories and hypotheses and to replace them by better ones. If a hypothesis replaces another, it should be falsifiable to a greater extent in order to guarantee progress. A technical problem with this idea is that no absolute measure of falsifiability exists. Ad hoc hypotheses are modifications introduced during research solely to rescue a falsified theory; they do not lead to new tests. Progress in science is made through the confirmation of bold conjectures and through the falsification of cautious conjectures.
Inductivism can be regarded as the opposite of falsificationism. Inductivism requires empirical observations in order to claim that a statement is true, whereas falsificationism values the absence of observations that falsify a statement. Chalmers describes the advantages of falsificationism over inductivism. That theories and knowledge are fallible is not a problem, since science is seen as a search for constant improvement; the aim is progress rather than truth. Falsificationism can also specify the circumstances under which a theory would be rejected. Further, knowledge of the unobservable can be derived from observed facts: as long as no observation falsifying our theory has been made, we retain it. Lastly, unlike inductivism, falsificationism does not involve inductive inferences, i.e. it does not draw conclusions about the whole from observations of a small part.
Limitations of falsificationism
Chalmers then looks at the limitations of falsificationism. If an observation clashes with a theory, it is not always the theory that is rejected; in some cases the observation statement is rejected and the theory retained. This leads to the conclusion that conclusive falsification of theories by observation is not achievable. How are we to know in which cases to reject the observation and in which cases to reject the theory?
A further limitation concerns the complex nature of theories. Most theories consist of many constituent statements: premises, auxiliary assumptions and initial conditions. A falsification only tells us that some part of this large complex does not hold. In most cases, ruling out the entire theory would be rash when possibly only one of the premises is incorrect.
Moreover, many historical examples do not speak in favor of the falsificationist method. Many important and today well-established theories would never have been successful if falsificationism had been strictly applied: observations were often accepted that were inconsistent with a theory and would have ruled it out. The Copernican Revolution serves as an example.
Chalmers concludes his chapter by referring to Popper’s notion that the condition of being falsifiable is easy to satisfy. This alone is not sufficient for a good theory. A good theory must further not be falsified.
4. Philosophy of Science
In the history of the philosophy of science there has long been a quest for a demarcation criterion, i.e. an unfailing criterion separating rational scientific knowledge from metaphysical speculation. Metaphysics is the branch of philosophy dealing with the abstract nature of reality. Metaphysics, and related concepts such as consciousness, intentionality and causality, are not accessible to the scientific method. A clear account of the scientific method is required, however, to account for scientific progress.
Logical positivism is the philosophical position that emphasises empirical data and scientific method. According to logical positivists, science consists of statements describing positive observed facts plus logical relations between these statements. The connections between statements and observed facts have to obey so-called correspondence rules. Logical positivists believe in verifiability as the test of meaningfulness: the meaning of a statement is the way in which it can be verified, and if it is not specified how a claim can be verified or falsified, the claim is meaningless. Verification is defined as assessing the fit between a theory and empirical facts. Several problems with logical positivism exist. The first is that theory and observation are not independent; a completely objective observation is hence impossible. Further, no satisfactory demarcation criterion has been found, so no cumulative progress is guaranteed. Moreover, there is the view that objective knowledge is not even desirable: the humanities are about understanding individual meaningful actions, not about looking at the world from an objective perspective. Lastly, it is impossible to verify general laws; by today's standards they can only be confirmed or falsified.
Popper believed in falsification. He abandoned the ideas of verification and confirmation and claimed that only falsification is logically possible. In science a critical attitude is necessary to prevent a meaningless accumulation of confirmations. Falsifiability holds up until today as a hallmark of rationality and of science, even though in actual scientific practice falsification is rare: often ad hoc hypotheses are formulated instead of rejecting theories completely. Carnap believed in a less strict and more practical alternative to verification. According to him, for a statement to be meaningful, some degree of confirmation must be possible; the degree of confirmation depends on the number of observations that support a statement.
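Carnap later made this idea quantitative with his confirmation function; the formula below is a sketch of his standard definition from inductive logic, added here for illustration and not spelled out in the notes themselves.

```latex
% Carnap's degree of confirmation of hypothesis h by evidence e,
% defined as a logical (conditional) probability via a measure m
% over state descriptions:
c(h, e) = \frac{m(h \wedge e)}{m(e)}, \qquad 0 \le c(h, e) \le 1
% Additional supporting observations in e raise c(h, e), but the value
% never amounts to conclusive verification of h.
```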
Several other schools of thought, and the positions of individual philosophers and scientists on the origin and legitimacy of knowledge, are discussed in the chapter.
Post-positivism states that no rule can guarantee scientific rationality and that scientists have a dogmatic faith in their theories. Theory choice is socially and historically determined.
Hanson believed that observations are theory-laden; having different theories makes observers literally see different worlds. Data is not interpreted independently of the theories researchers hold.
Quine is an opponent of positivism. He is in favour of what he called epistemological holism: no knowledge is immune to scientific refutation, and no knowledge is completely theory-independent. Statements of science cannot be evaluated in isolation. The Quine-Duhem thesis, named after Quine and the earlier physicist Pierre Duhem, claims that any statement can be held true if we make adjustments elsewhere in the system. Quine describes this as the underdetermination of theory by observation.
The standard view of science today holds that the basic elements of scientific knowledge are sense data. The senses give us access to the world, and observation statements reflect elementary facts. Scientific statements are either observation statements or based on theoretical ideas. Abstract, unobservable terms in theories must be translated into observational terms, a process called operationalization. All sciences should use the same methods of observation so that the explanations and theories of different sciences can be unified. The scientific process is cumulative.
Lecture 3
5. Kuhn’s paradigms
Having identified a range of limitations of the falsificationist approach, Chalmers concludes that we need a more adequate framework in order to understand scientific theories, and introduces the need to view theories as structures. Historically, progress in major science has not worked according to the falsificationist approach; a more structured account is needed. A more general philosophical argument rests on the fact that observation depends on theory: there is a close connection between the precision of the meaning of a term and the role played by that term in a theory. Concepts derive their meanings from the role they play in a theory, which leads directly to the need for structured theories.
There are different ways in which a concept can acquire its meaning. First, concepts can acquire meaning by way of definition. Definition must be rejected as a fundamental way of establishing meaning, because concepts can only be defined in terms of other concepts whose meanings would in turn already have to be established; using a dictionary, for instance, is only useful if you already know the meaning of some other words. Second, concepts might acquire meaning by way of ostensive definition. Yet even for a child it is difficult to learn the meaning of an elementary word such as 'apple' by pointing alone, and it is more difficult still for abstract terms such as 'mass'. Concepts therefore acquire meaning at least partially from the role they play within a theory. This can be seen in several historical examples, for example the term 'electric field', which became clear as the other electromagnetic quantities became better defined.
Kuhn (1962) summarized the progress of science as follows: pre-science, normal science, crisis, revolution, new normal science and new crisis. First we find ourselves in a state of pre-science. Next we enter a period of normal science, in which scientists work within the paradigm they currently accept. A mature science is governed by a single paradigm. An explicit characterization of the paradigm might not exist; scientists acquire their knowledge of it indirectly through their scientific education. Paradigms are made up of theoretical assumptions, laws and the techniques applied by the community. After some time within one paradigm a crisis evolves: scientists begin to doubt the paradigm. A crisis can be caused by anomalies that are seen as conflicting with the current paradigm at a very fundamental level, and it becomes more serious when a rival paradigm emerges. The crisis develops into a revolution; a "gestalt switch" takes place. Kuhn added that this transition must occur suddenly or not at all, a statement many later philosophers and scientists have found puzzling, since a "gestalt switch" cannot take place all of a sudden before a new theoretical framework has been developed. The revolution ends in the establishment of a new period of normal science, and after a while the cycle can start anew.
Paradigms and normal science
Normal science is directed by one paradigm that sets the standards for legitimate work within the field. Normal science involves articulating the paradigm in order to improve the match between it and nature. Kuhn therefore perceives normal science as puzzle-solving guided by the standards of a paradigm. If a puzzle cannot be solved, this reflects on the scientist rather than on the paradigm; unsolved puzzles are called anomalies rather than falsifications of the paradigm. A normal scientist should not be critical of the paradigm; lack of disagreement over fundamentals is precisely what distinguishes normal science from pre-science, which is often relatively disorganized. Kuhn argues that any attempt to give a precise description of a paradigm will find some work within the paradigm that does not meet the description.
Crisis and revolution
How can a paradigm be replaced by an alternative? Normal scientists work within an area defined by a paradigm. Failures within the paradigm can result in a crisis, and finally the paradigm might be rejected and replaced by an alternative. The mere existence of anomalies (there will always be anomalies) does not have to undermine confidence in a paradigm. If the crisis becomes serious enough, a revolution can start. Rival paradigms focus on different questions and contain incompatible standards. It is not possible to demonstrate that one paradigm is better than another, because many factors are involved in choosing a paradigm; moreover, scientists using different paradigms also use different sets of standards and metaphysical principles. The factors that cause people to change to another paradigm are to be discovered by psychologists and sociologists. For these reasons Kuhn argues that rival paradigms are incommensurable. According to Kuhn, after the crisis and revolution some relevant theories will remain, whereas others will be excluded and finally disappear.
The function of normal science and revolutions
It might be thought that Kuhn's account of science is descriptive only, that he merely wants to describe theories and paradigms, but this is not the case: Kuhn argues that his account also explains the function of its various components. In the period of normal science, the scientist can work on matching the paradigm with nature without questioning the paradigm itself too much. Revolutions, in turn, are necessary for progress: the replacement of a paradigm is necessary when a serious crisis arises, and revolutions are therefore opportunities to make progress. This is Kuhn's alternative to the cumulative progress of inductivist accounts of science, in which scientific knowledge grows simply by making more observations from which new concepts can be formed. According to Kuhn this is not correct, because the guiding role of the paradigm is neglected. However, paradigms are not as precisely defined as suggested above: every scientist interprets the paradigm slightly differently, which multiplies the attempts to articulate the paradigm and increases the chance of success.
The merits of Kuhn’s account of science
Kuhn's idea that science is about solving problems within a framework that is not itself criticized is in a certain way descriptively correct. The notion that revolutions replace frameworks was not first mentioned by Kuhn; Popper too argued that theories should be critically evaluated and replaced if necessary. According to Kuhn, however, the replacement is not merely the replacement of one set of standards by another: it involves a change in how the world is viewed and carved up. But if problems vary from paradigm to paradigm, which standards should be used to judge the paradigms? And how exactly can revolutions constitute progress in science?
Kuhn’s ambivalence on progress through revolutions
The answer to the question how progress can be derived from revolutions is quite ambiguous. What Kuhn states in his book (in the 1970 edition) includes a relativist and a non-relativist strand, which are actually incompatible. It is therefore possible either to choose and develop the relativist strand in Kuhn's thought, as in sociological research, or to ignore the relativism and rewrite Kuhn in a way that is compatible with a broader notion of progress in science. Which option Chalmers finds best will become clear at the end of the book.
Objective knowledge
Kuhn states that the transition between paradigms should occur all at once or not at all. According to Chalmers, Kuhn here confuses two sorts of knowledge that should be distinguished. Knowledge often refers to a state of mind or attitude; the author calls this knowledge in the subjective sense and distinguishes it from knowledge in the objective sense. He argues that scientists might be unaware of the relevance of some findings, but that objective relationships exist between parts of the structure independently of whether individuals are aware of them. It often happens that the unexpected consequences of a theory conflict with another theory. It is therefore relevant to specify the sense, objective or subjective, in which a paradigm can be seen as an improvement.
When Kuhn talks about gestalt switches, this belongs to the subjective side of the dichotomy and suggests that different viewpoints cannot be compared. The author suggests setting aside what Kuhn said about gestalt switches and focusing on the objective characterization of paradigms and the relationships between them. The question why one paradigm might be better than another is different from the question why scientists change their allegiance from one to the other; such decisions are often influenced by subjective factors. Yet another question is what the relationship between one paradigm and another is. If we want to find the sense in which science progresses, we have to look at the relationship between paradigms.
6. Kuhn
The post-positivist Anglo-American philosophers criticized empiricism and positivism; in reaction they focused more on the knowing subject instead of the observed object. These second-generation philosophers became more interested in the way scientists establish their theories and hypotheses than in the logical structure of theories. They increasingly emphasized context and history, which resulted in more subjectivity and finally led to relativism.
A role for history
With the publication of Kuhn's book The Structure of Scientific Revolutions (1962), a revolution and a new paradigm came into being. Whereas in the time of the positivists logical theorizing and the context of justification were most important, now historical, social and personal factors became more important. Kuhn thought that the standards for justification vary with history, theory and social practices. Presentism is the view that whatever turned out to be the correct hypothesis is real science and whatever turned out to be wrong is pseudo-science. If you want to avoid presentism, you have to acknowledge that rationality is directly connected to context, place and time.
Paradigms
A paradigm is described as a whole complex consisting of (1) theories and concepts within worldviews, (2) laboratory techniques and materials, and (3) social processes and institutional structures. Paradigms determine which problems and solutions are legitimate in a field of scientific research. Paradigms are incommensurable; each applies its own categories in its own settings. In periods of "normal science" a paradigm is used as an agreed-upon framework. This framework is then filled with new data and extended to new domains; methods and findings become more exact and precise. The general framework, however, is not falsified or even doubted.
Revolutions
Kuhn argues that the positivists' idea of cumulative progress in science is incorrect: science progresses through revolutions followed by periods of normal science. A revolution means a change of paradigm, which occurs after a crisis of the old methods. Such revolutions are irrational in the sense that paradigms cannot be compared with one another and criteria for rationality can only be applied within one paradigm. Paradigms arise through circular reasoning: they aim to give a better explanation of the facts, but those facts are first created by the paradigm. Hence ''facts are theory-laden and paradigms make their data''. Nevertheless, if one paradigm is abandoned, a new one must be adopted in order to continue doing research. A revolution does not contribute to a cumulative process but starts a completely new era of science.
Normal science
Revolutions are the opposite of periods of normal science, in which frameworks are not questioned or criticized. Scientists aim to refine what is already known, establishing more facts that fit the paradigm. From this perspective falsification would not be useful, because progress lies in finding more detail within the paradigm. It is like solving a puzzle: only when the puzzle cannot be solved may people develop a wish to adjust their strategies. Revolutions are thus necessary for renewal, whereas normal science is necessary for gathering deeper knowledge and extending the framework.
Summary of the cycle of paradigms
The cycle of paradigms is summarised in the phase model of scientific development: we begin in the preparadigmatic phase, in which a paradigm is developed. After a while this paradigm starts to be criticised; a crisis arises. A revolution takes place and a new paradigm is developed and implemented.
Laboratory practices
A paradigm is more than just a theory, because instruments and methods are included as well; according to Kuhn, paradigms also include more fundamental shared commitments. Over time, the focus on theory within the philosophy of science decreased, and practice, intervention and laboratory skills became more important, a shift associated with pragmatism.
Incommensurability and relativism
The underlying cause behind paradigm shifts is not truth or the attempt to approach reality more accurately, but a struggle between competing research communities in which grants, publications and jobs are involved. Relativism, the idea that not truth but non-rational social and historical factors determine the outcome of a crisis, might therefore seem unavoidable. Kuhn, however, did not agree with the relativist view and attempted to show that communication between research groups is possible. Paradigms might be similar to language games (a notion developed by Wittgenstein), which are forms of life. Language games are about practice rather than rules; communication between them may be difficult, but understanding is possible. Because language games are not totally irrational, relativism might be avoided. Rationality in the sense the positivists strove for, however, is not preserved: the positivists viewed rationality as the cumulative progress of objective data, and according to Kuhn this kind of rationality is not possible, since social and historical factors are always part of science and the context of discovery cannot be separated from the context of justification.
7. Philosophy of Science – Recent proposals and debates on scientific knowledge
Bem discusses several ideas related to the philosophy of science in this chapter. One of these is hermeneutics. Hermeneutics was a method for understanding difficult texts before it turned into a philosophical approach related to epistemology. It claims that one needs to understand the meaning of an idea to be able to experience it, and that the meaning of an idea is embedded in a historical and social context. According to hermeneutics, understanding becomes a complicated and insecure interpretation of the world: interpretations never reach conclusive objectivity and are constantly changing.
Social constructionism claims, as the name states already, that scientific knowledge is the product of social construction. Language does not depict reality, it creates reality. Objectivity does not exist.
Several theories of truth exist. The correspondence theory of truth states that truth consists in the correspondence between a thought and reality. This theory is related to realism. The problem with this approach, however, is how to assess the correspondence.
According to the coherence theory of truth, truth consists in the coherence between a thought and other beliefs. This theory is related to idealism or relativism. The issue arising with this approach is that there is no mind-independent reality.
The consensus theory of truth argues that truth is what is agreed upon by common consent. This idea is associated with relativism and social constructionism. Again, the arising problem is that no mind-independent reality seems to exist. In accordance with this approach everything would be constructed.
The pragmatic theory claims that truth is only perceived in activities; it cannot be seen apart from its practical consequences. Critics ridicule this approach with the slogan "true is what works" and do not regard it as a valid description of truth.
Bem goes on to explore relativism. Relativism claims that there is no objective or absolute aspect to knowledge; knowledge depends on differences in perception and consideration. There are several varieties of relativism. Ontological relativism claims that the existence of objects depends on our own thoughts and concepts. Epistemological relativism holds that we cannot know a mind-independent world: even if it existed, we could not perceive it, since we have to perceive it with minds that make subjective changes to the perceptions. According to the relativism of truth, we cannot find truth outside of individual human interests and the interests of human communities. Relativism of rationality asserts that universal standards for rationality and rational discourse do not exist. Likewise, relativism of morality argues that universal standards for morality do not exist. Several problems arise with relativism. The main problem is that if relativism holds true, there is no room for relativism itself to be true; it is self-defeating. Further, the evaluation of other viewpoints becomes impossible, because one cannot judge them objectively; it is impossible to step back from one's own subjective perception.
An extreme version of relativism is solipsism, the idea that only what exists in one’s thoughts is real: “I am the only reality”.
Bem then goes on to explore realism. Realism claims that the world exists independently of its observer. Bem describes different varieties of realism. In direct realism the object is perceived directly; there is a direct connection between the organism and the environment, and no mental mediation is necessary. According to internal realism such mediation is necessary: the world is perceived through mediation, and knowledge is a belief or mental representation that supposedly corresponds with objects in the real world. Scientific realism describes the dichotomy between everyday perception and the real underlying world. Scientific laws should be interpreted realistically and should be confirmable, and theories should be based on previous theories.
Several problems exist in relation to realism. According to realism scientific theories are fallible; we do not possess absolute knowledge about the world. Further, an indubitable foundation for language does not exist.
Pragmatism is the belief that knowledge should be used to guide our actions: we should deal with knowledge in a practical rather than a theoretical way. Beliefs should help us to deal with reality; they should not be regarded as a representation of reality. Knowledge should be seen as skills. According to Hacking, experimental work provides the strongest evidence for scientific realism.
Naturalism aims to look at how science is actually done instead of laying down universal standards beforehand.
8. The neurolaw debate
Introduction
Developments in neuroscience and neurotechnology are currently important topics of debate. It is argued that neuroscience challenges our notions of free will and responsibility, and that this threatens societal practices concerning accountability and punishment. Possible applications include adjusting legal doctrines to neuroscientific findings or improving practices in forensic settings, such as the assessment of the credibility of testimony. In some cases neurogenetic and neuroimaging evidence has led to mitigated sentences because it indicated a tendency towards aggressive behaviour or a mental disorder. Research has shown that providing neuroscientific knowledge or images increases perceived scientific credibility. Some believe that everything will stand or fall with the answers brain research will give, whereas others argue that these technologies are not yet ready for practical application because of methodological and conceptual limitations. This article analyses two domains that are important in neurolaw: first, how neuroscientific knowledge is understood psychologically in the context of free will, and second, the identification of dangerous brains in forensic settings. Neuroscientific explanations depend at least partially on social and psychological context knowledge and therefore face scientific challenges. Neuroscience will probably gradually improve legal practice rather than overthrow social practices.
Free will in context
Freud's idea that the ego is not master in its own house is already a century old, but it has also become popular in neuroscience research. An example is the research conducted by Libet in the 1980s, which showed that the intention to press a button is preceded by a readiness potential that can be recorded over the premotor cortex. This finding, among others, led to the idea that the causes of our actions may not be conscious. The study has been criticized for various reasons; notably, the finding that the readiness potential does not differ between a decision to move and a decision not to move has been reproduced, and this is interpreted as evidence against unconscious movement initiation. A more recent study by Soon and colleagues found that up to ten seconds before an intention becomes conscious, unconscious determinants of the action, pressing either the left or the right button, become visible in brain activity. The important point is that behaviour might be caused by processes over which one has no control. This experiment can be criticized on various grounds: for instance, the predictive value was only 60% where 50% is chance level, and many participants were excluded after a pre-test and even after the data had been collected.
Subjects were told that, at some point when they felt the urge to do so, they were to decide freely between one of two buttons and press it immediately (Soon et al., 2008). In the supplementary material, however, the instructions were different: subjects were told to relax and to press either the left or the right button with the index finger of the corresponding hand immediately when they became aware of the urge to do so. The first instruction assumes that there is a decision process; the second does not. This reveals a conceptual problem about what it exactly means to have the urge to press a button, an urge that rarely occurs in daily life, and it undermines the interpretation of the kind of decision the subjects had to make, if it was a decision at all.
The second critical point concerns the brain areas from which the predictions were made: lateral areas of the prefrontal cortex. One would expect activation in brain areas typically related to unconscious biases, but the contrary was found. The subjects had to indicate the time at which the conscious intention to act arose, and this presumes that a psychological process is unconscious and then suddenly pops up into consciousness, ruling out the possibility that such a process enters consciousness gradually. Especially because high-level control areas were involved, it is neuroscientifically implausible to argue that unconscious determinants undermine free will here.
The compatibilist stance, the idea that determination of actions is compatible with free will, is no solution to the free will problem here, because if our actions were caused by unconscious processes beyond our control, this would undermine our self-understanding as agents who are to some extent rational. It is possible that, over time, the assessment of the mental capacities of a defendant will be complemented or replaced by neuroscience, but only if these capacities can be translated into assessable neuroscientific properties. The question is to what extent neuroscience can add anything to the behavioural ascriptions of mental capacity that are currently used. There are also various statistical and theoretical considerations: findings should generalize from experimental settings to the real world, practical relevance has to be demonstrated (significance on average is not sufficient to refute rationality in general), and irrationality should be found across all kinds of situations, since not being rational in some situations does not mean that people are irrational in general. One should also look at evidence against the notion that we are irrational actors. For example, in priming tasks a subliminal (unconscious) prime should lead to a reduced reaction time if it were effective, which is indeed found in a study by Jaskowski (2008). Another study found that the more mindful participants are, the less susceptible they are to unconscious priming. This can be interpreted as favouring the idea that we have at least some conscious control over our actions.
Dangerous brains in context
The debate on the forensic implications of neuroscience often refers to cases in which acquired brain damage is related to criminal behaviour. The case of Phineas Gage is one of the most famous examples, but various aspects of this case have been misinterpreted, leading to an overestimation of the contribution of the brain damage. Gage lost brain tissue in the ventromedial prefrontal cortex (VMPFC) after an accident with a tamping iron. It is often stated that Gage became a pseudo-psychopathic, immoral person and that this change was the direct consequence of the accident that damaged his VMPFC. Macmillan has shown that over the past 150 years the descriptions of the accident, and hence of the personality changes, have changed considerably. For instance, directly after the accident Gage was described as clear and rational, yet while he was recovering from the shock many people wanted to put their fingers into the wound to see whether the fingers could touch each other inside the skull; this is relevant because Gage suffered severe infections afterwards. Only a few statements in the report of Harlow (his physician) provide evidence for personality changes, and other people who saw Gage in the first year after the accident did not mention any personality issues. Only the report Harlow published twenty years after the accident, and eight years after Gage's death from epileptic fits, says more about possible personality changes. The evidence for personality change is thus scarce and abstract, and it does not refer to criminal behaviour. In addition, attributing the changes to the brain damage neglects the effects of the trauma and of damage due to infections and epileptic fits.
More recent cases that are often referred to are those of EVR and JZ, who had both suffered from a brain tumour that was surgically removed. JZ fulfilled five of the core DSM-III criteria for antisocial personality disorder (ASPD), such as lack of consistent work behaviour and planning failures. Although these problems led to impaired functioning, they do not amount to real criminal behaviour. Moreover, the criteria for sociopathic disorder and ASPD have changed over time, which means that when the criteria change, the interpretation of whether these patients were antisocial can change as well.
Another case often mentioned is that of a 40-year-old man who started collecting child pornography and molesting his stepdaughter. He could choose between treatment and jail. Although he chose the treatment, he was expelled from the programme and had to go to prison. The evening before, he experienced increasing neurological problems, and an MRI scan revealed a large tumour in the right orbitofrontal cortex. After resection of the tumour, the man successfully completed a treatment programme. A few months later he started collecting pornographic material again and the headaches returned; regrowth of the tumour was discovered, and after its removal his behaviour improved again. It is tempting to conclude that the brain tumour caused the paedophilia, but there are strong reasons to doubt this. The patient's problem was much more general than child pornography alone, and although this might be interpreted as limited capacity for self-control, there is also evidence that he did have some self-control: for instance, he hid the pornographic material and went to the hospital on his own. He was apparently aware that his actions were wrong and could to some degree resist his urges. He was therefore minimally rational, although his neurological problem may have been an aggravating factor. It becomes clear that each case has to be analysed individually and thoroughly. Note also that, since parts of his VMPFC had been removed, antisocial behaviour could have been expected, yet his behavioural problems disappeared.
Two contrary cases
The following two cases provide evidence against a direct link between damage to the VMPFC and criminality. A man who fell onto a spiked iron gate at age 20, damaging large parts of his VMPFC, did not come to attention until 60 years later. After the accident the man became dependent on others and was incapable of planning and fulfilling his responsibilities, but he was also cheerful and had no difficulties controlling his emotions. He had a very caring environment and never needed psychiatric services. Another man, 33 years old, had previously shown pathological aggressive and violent behaviour. After a failed suicide attempt in which his VMPFC was damaged, he became docile, indifferent to his situation and inappropriately cheerful.
As the described cases show, associations between VMPFC damage and antisocial or criminal behaviour are frequently drawn in the literature. Closer analysis shows, however, that the problems following such brain lesions need not be criminal, and that the course of a life also depends on social support. The VMPFC cannot be seen as an area where morality or appropriate behaviour is localized. According to the author, the cases mentioned receive too much attention while their contextual features are neglected. Meta-analyses, moreover, show that criminals do not share common neural conditions at all. It is doubtful that behaviour as complex as that covered by ASPD can be localized in the brain, because ASPD is such a general category: it can be diagnosed in individuals who do not share a single symptom, since 99 different combinations of symptoms satisfy the diagnostic criteria.
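The figure of 99 follows from elementary combinatorics, assuming the DSM rule that at least three of seven adult criteria must be met (the calculation itself is not spelled out in the article):

```latex
% Number of symptom combinations satisfying "at least 3 of 7 criteria"
\sum_{k=3}^{7} \binom{7}{k}
  = 35 + 35 + 21 + 7 + 1
  = 99
% Two diagnosed individuals may even have disjoint symptom sets,
% e.g. criteria {1,2,3} versus {4,5,6,7}.
```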
Conclusion
Neuroscience may improve forensic science in the long run, especially where legal questions pertain to neural questions, but it is far from overthrowing social institutions or settling the free will problem. The evidence so far cannot be generalized to most aspects of decision-making and, as this analysis shows, can be re-interpreted. That sentence mitigation took place in an Italian case on such grounds is therefore surprising.
Lecture 4
9. Towards responsible use of drugs
Cognitively enhancing drugs are prevalent and already used by many individuals. Society should accept the advantages of these improvements and make cognitively enhancing drugs accessible while managing the risks involved. Stimulants such as Ritalin and Adderall are often prescribed for medical purposes, for instance to treat the symptoms of attention deficit hyperactivity disorder (ADHD). These drugs, originally developed for treatment, also increase cognitive functioning in healthy people. In most cases human beings are in favor of intellectual innovation. Greely and colleagues see enhancing drugs as being on a par with, for instance, education and healthy eating habits: ways to use available means to improve our capacities. The argument that these drugs alter brain function does not hold, since, strictly speaking, every action one undertakes to improve cognition alters one's brain. Many people simply reject the idea of cognitive enhancers because they perceive it to be "against the rules", but there are no rules that allow tutoring and caffeine yet forbid enhancing drugs. The argument that drugs are unnatural is invalidated by the observation that most elements of today's daily life are not very natural anymore, so why would it make a difference in this particular case? Greely and colleagues suggest that mentally healthy adults should have access to the drugs in question.
The authors acknowledge three ethical issues related to the use of cognitive enhancers. Firstly, they are concerned about the safety of the drugs; specifically, they mention the possibility of a new class of side-effects, such as unwanted recollections, and side-effects in children. More clarity should be achieved through research. A second concern is freedom: an implicit or explicit pressure to take drugs may emerge from the fear of not keeping up with other members of society. A third, related, concern is fairness. Unfairness already exists in access to methods of enhancing cognitive abilities, and drugs could increase this unfairness depending on their general availability. Greely and colleagues suggest enforceable policies to deal with these issues: not laissez-faire, but also not primarily characterized by laws. A many-sided approach based on four mechanisms is proposed. The first step should be to conduct more research and reach greater certainty concerning the use, advantages and possible risks of these drugs. Secondly, organizations and associations should formulate guidelines for their members concerning the use of cognitive enhancers, with physicians offering their knowledge and help; professional standards rather than laws would then regulate use. Thirdly, society should be broadly educated about the advantages and disadvantages. Lastly, laws should be enacted, but in a limited and careful way.
The authors conclude with the appeal not to ignore the problems enhancing drugs can bring, but to address them by maximizing the benefits while simultaneously minimizing the harms.
10. A phantom debate
Quednow, a pharmacopsychologist himself, announces that his review is a reply to the book "Neuro-Enhancement: Ethik vor neuen Herausforderungen" rather than a classical book review. The book contains a variety of articles dealing with cognitive and mood enhancement and can be divided into three parts according to the issues dealt with. The first topic is the distinction between treating sick and treating healthy individuals with enhancing drugs. The second part deals with the general problems and opportunities these drugs bring with them.
One article addresses the question of whether enhancement would morally have to be made obligatory. The last part outlines several issues of emotional enhancement. Quednow critically examines some of the pharmacological and epidemiological presumptions on which the recent literature on neuro-enhancement is based. He disagrees with the claim of the book's concluding article, a claim found in most recent publications: that the drugs in question will soon be effective and agreeable and will therefore be attractive to potential users. According to Quednow, the present drugs are far from attractive. So-called stimulants (e.g. methylphenidate and modafinil) increase attention, arousal and motivation and only indirectly affect cognitive functioning. Further, stimulants only improve performance in subjects who are not very highly functioning cognitively in the first place: if an optimal neurotransmitter concentration has already been reached before the drug is taken, no cognitive gain is to be expected.
Moreover, a disadvantage of cognitively enhancing drugs is that an improvement in one cognitive domain frequently goes along with a decrease in another; it is very unlikely that several domains can be improved at once. Both the effects and the many remaining side-effects vary between individuals, and not enough certainty about the effects has been reached yet. Quednow warns that the drugs come with many dangerous side-effects and that many of them carry a high potential for addiction. He forecasts that no safe and attractive drug for cognitive enhancement will become available in the near future.
The article concludes with a note on the frequent abuse of enhancing stimulants: especially students aged 19 to 24 use them to improve both study skills and mood. Males are more likely than females to use stimulants, whereas women tend to use antidepressants rather than stimulants. On the whole, the lifetime prevalence of stimulant use seems to have decreased recently, and Quednow does not expect this to change in the near future, since no unquestionably attractive enhancing drugs are available at the moment.
11. Cognitive enhancement
The article "Just How Cognitive Is 'Cognitive Enhancement'? On the Significance of Emotions in University Students' Experiences with Study Drugs" by Scott Vrecko was published in 2013 in the journal AJOB Neuroscience. The article deals with the effects of study drugs on cognitive enhancement, which is claimed to be the main reason why students use stimulant drugs such as Ritalin and Adderall. These are stimulant medications normally used in the treatment of mental illnesses, disturbances and disabilities, but the number of individuals using them without having any of these problems steadily increases. This raises many social, ethical and policy questions, for example whether such use should be seen as a form of cheating, or whether these drugs should be legalized for non-therapeutic purposes. The increasing prevalence of study drugs might even lead to peer pressure, resulting in more and more students taking them. Even though there has been a lot of research in this area, none of it has so far examined the everyday uses and users of these medications; in-depth research and the collection of qualitative rather than quantitative data is lacking. Furthermore, the research conducted to date portrays the effects of stimulant medication as merely affecting intellectual capacities such as executive function, working memory and information processing, framing ADHD pills as "smart pills". The study reported here instead looked at the associated changes in emotional states and how these might underlie improved academic performance.
Methods and procedure
The researchers collected their data via 24 semistructured interviews. Participants gave verbal informed consent so that no personal information was recorded. The participants were all students from an elite university on the East Coast of the United States and had to have experience with taking prescription drugs to improve their academic performance, without having any psychiatric condition associated with impaired academic performance. Gender was roughly balanced. The interviews were recorded and transcribed into textual data. The transcripts were analyzed with software and coded for general themes the participants had mentioned. For analyzing the data the researchers used a grounded theory approach: they identified categories or themes purely as they emerged from the data. Using this method the researchers were able to identify four broad terms describing the affective components participants reported having experienced while using study drugs. The identified categories were mentioned in the majority of the interviews in connection with enhanced academic performance after the use of study drugs. These categories are “Feeling Up”, “Driveness”, “Interestedness” and “Enjoyment”. Each of them is elaborated below, after a short illustration of the coding step.
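To make the coding step concrete, here is a toy sketch in Python (purely illustrative; the theme keywords, function names and sample snippets are my own assumptions, not the researchers’ actual software or coding scheme). In grounded theory the themes first emerge from the data; once identified, software merely helps to tag passages and count in how many interviews each theme occurs:

from collections import defaultdict

THEME_KEYWORDS = {                        # hypothetical coding scheme
    "feeling up": ["energy", "mood", "awake"],
    "driveness": ["urge", "push", "pressure"],
    "interestedness": ["interested", "engaged"],
    "enjoyment": ["fun", "enjoy", "into it"],
}

def code_transcript(text: str) -> set[str]:
    # Return the set of themes whose keywords appear in one transcript.
    text = text.lower()
    return {theme for theme, words in THEME_KEYWORDS.items()
            if any(word in text for word in words)}

def theme_counts(transcripts: list[str]) -> dict[str, int]:
    # Count in how many interviews each theme was mentioned.
    counts = defaultdict(int)
    for transcript in transcripts:
        for theme in code_transcript(transcript):
            counts[theme] += 1
    return dict(counts)

# Hypothetical usage with two mock interview snippets:
print(theme_counts(["I felt a huge urge to work and much more energy.",
                    "Studying was actually fun, I was really into it."]))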
Feeling Up
Many respondents reported that taking the stimulant medication enhanced their general level of well-being and their perceived energy. This phenomenon was experienced mentally as well as physically and might be a driving force behind the increased productivity study drug users report. Many respondents reported tiredness to be a major obstacle to getting study work done without taking any drugs. This tiredness was not attributed to lack of sleep but rather to the draining thought of studying; thus the stimulants might help to overcome the negative feelings many students associate with academic work. In addition to the energization, participants reported improved moods following drug use, which they perceived to enhance their ability to study.
Driveness
The feeling of driveness was described as an internal push or even pressure to do something. Respondents reported that this urge to get started in some way generalizes to other domains like cleaning or doing sports. To prevent themselves from getting distracted by these urges, many students try to focus their attention, and thus their overwhelming energy, on studying in particular. To do so they might repeat specific internal dialogues (e.g. “Ok, it’s work time. Ok, it’s work time…”) or practices like opening and staring at their textbooks until the effects of the medication start to occur, which usually takes 20-30 minutes. Once the effects of the drug have kicked in, participants reported an inner urge to work until their work is completed. This might lead them to study for seven hours without a single break; sometimes they even forget to eat. Some of the participants described the experience of driveness as associated with feelings of tension or stress that might even continue after all the work has been done. Still, the functional improvements it provokes seem to be valued highly enough by the participants to continue the use of study drugs.
Interestedness
Many of the participants reported that taking stimulant drugs to engage in academic work actually made them more interested in the topic they had to study or write about. Thus their emotional state concerning their specific work was altered by the stimulant drug, leading to significant study benefits. Their enhanced interest made them not only become more engaged in their work but also stay engaged without interrupting themselves by, for instance, checking e-mails or webpages. Furthermore, participants reported being less interested in social interactions while on study drugs, since these no longer seem important to them. Many of the participants reported having very little interest in academic work as a whole, oftentimes not seeing why they should complete a certain assignment; taking Adderall or Ritalin helps them to overcome these kinds of obstacles.
Enjoyment
The feeling of enjoying their academic work is another emotional phenomenon reported by participants. They feel really connected to their work, get into it and have fun while completing it. Many of them reported losing track of time over their work and being surprised at how long they had been working without really noticing it. This correlation between enjoyment and productivity might be driven from both directions: the enjoyment might enhance the productivity, and the productivity might enhance the enjoyment of the work.
Discussion and conclusion
Based on these findings it can be stated that emotional dynamics seem to play an important role in improved academic performance after the use of stimulant-based medication. These changes in emotional states might not only be due to the stimulants’ effects on the central nervous system of the user, but might also be caused by the users’ experiences and expectations of the effects of the drugs. Furthermore, it is interesting that most participants do not believe that the drug enhances their intelligence or cognitive abilities, which is often said to be the case (e.g. “smart pills”). Most of the interviewed study drug users have strong or even exceptional records in their studies and on standardized tests. They also report being able to maintain focus and attention in activities they enjoy doing. Rather than improving their cognitive capacities, study drugs may simply help to focus attention on academic work many students struggle to engage with.
The findings are in line with current research:
Research findings that indicate only little improvement of performance in cognitive tests after stimulant use.
Research findings that stimulant drugs strongly affect the brain’s dopamine system, which is associated with attention as well as with pleasure and emotion.
The well-established capacity of stimulants to produce euphoric effects, which provides further confirming evidence.
Future research should investigate the link between the changes in cognitive functioning and affective states in more detail in order to understand them. A more representative sample that varies in ethnicity and socioeconomic status would enhance the generalizability of the study findings.
12. Methylphenidate
John Harris and Anjan Chatterjee, respectively, outline the positive and negative aspects of taking drugs that enhance cognitive performance such as methylphenidate, better known as Ritalin.
Harris points out that Ritalin is successful and, moreover, clinically proven to be safe. Safe means that the benefits of taking the drug outweigh the number and severity of observed side effects. Ritalin is already used safely by many children and adults who suffer from attention deficit hyperactivity disorder (ADHD).
Ritalin fulfills the safety guidelines that are required for the safe use of drugs for medical purposes, and Harris suggests that these guidelines should hold for cognitively enhancing drugs as well. The benefits of the drug are the following: a clear improvement in focused attention, hence enhanced study skills, and improved executive functioning in general. Harris contrasts “real life”, where we may be dealing with life or death and where technically no rules against taking enhancing drugs exist, with sports, where taking drugs is cheating simply because rules have been laid down that forbid them.
According to Harris, the ethical phenomenon of advantages that people gain from taking enhancing drugs is not unique; many earlier technical developments also bestowed advantages on their users. The solution was not to ban those but to make them accessible to everyone. Likewise, the idea of brighter human beings who learn more in a shorter time due to enhancing drugs should be embraced.
According to Chatterjee, the risks of enhancing drugs do outweigh the benefits. He refers to the high rates of abuse and the reported cases of sudden death in relation to the drug. The side effects further include possible cognitive trade-offs in the long run, such as decreased creativity. To Chatterjee it is clear that drugs exist for treating diseases, not for improving naturally well-working functions. He critically remarks that the distinction between illness and shortcoming is not always straightforward and, further, that many doctors do not agree with this position and happily prescribe drugs in unnecessary cases.
Another point Chatterjee makes concerns the issue of equity. Enhancing drugs will be more available to those with sufficient financial means, and the counterargument that such inequities already exist does not justify producing more of them. Chatterjee is afraid that the choice to take enhancing drugs will transform into some form of coercion: implicit and explicit pressures to improve by means of drugs in order to acquire or maintain employment might occur. He concludes by suggesting that more research needs to be conducted on the effects of enhancing drugs before they can be used to improve healthy individuals’ cognitive skills.
13. Whose well-being? The enhancement debate
In this article different forms of brain stimulation, pharmacology and psychobiological training are discussed. According to ethicists, these methods can only be called “human enhancement” if they are a “change in the biology or psychology of a person which increases the chances of leading a good life in the relevant set of circumstances” (Savulescu et al., 2011). For other authors, too, the notion of improvement and well-being is the core issue in the enhancement debate. In this article the conceptualization of well-being, the framing of enhancement, and the translational promises found in the literature will be addressed.
Whose Well-Being?
In the enhancement debate, clinical studies on well-being are often cited; however, this is risky because clinical benefits are not the same as overall well-being, so equating them risks a normative fallacy. Another risk is that people are viewed purely psychobiologically, without taking environment and context into account. This risk can be reduced if other methods of assessing well-being are used, such as the Better Life Index developed by the OECD (2013), in which people can choose their own standards. Results of this kind suggest that enhancement can also be achieved by socio-political reform instead of at the level of individual psychobiology. Research by Savulescu suggests that it is not the individual that should be changed with respect to the circumstances but the circumstances with respect to the subject.
Framing and Relevance
A media analysis of newspaper articles found that 94% of the articles on the prevalence of psychopharmacological enhancement use the terms “common” and/or “increasing”. However, such statements are difficult to justify: what exactly does “common” mean? There is a lot of variance in research findings. In the case of methylphenidate, researchers intend to measure non-medical use, but non-medical use does not inherently mean that people use the drug to enhance their cognitive capacities. In addition to enhancing study performance, recreational or lifestyle use was also often mentioned; another study found that students use the stimulant to increase their motivation or positive feelings. In the media, the term “common” often stands for relevance and “non-medical use” for cognitive enhancement, which is, according to the author, not in line with the available evidence. Dropping this framing would reduce the apparent urgency of the issue. Both medical sociologists and neuro-ethicists have a conflict of interest in framing stimulant consumption, given the competition for research funds and high-impact publications.
Promises
Not much is known about how stimulants work, and it is not clear whether it is actually better to change the individual to fit its circumstances than to change the circumstances for the individual. Some researchers call the enhancement debate a “phantom debate” or “neuroenhancement bubble” because in many countries people are quite happy and their happiness will not increase through the use of enhancement technology. Too strong an emphasis on increasing well-being beyond clinical populations might even make more people unhappy.
Lecture 5
14. The Post-normal science of precaution
The article “The post-normal science of precaution” written by Jerry Ravetz was published in the scientific journal “Futures” in 2004. It deals with two different ways of looking at science in general, namely the “mainstream style” and the “post-normal” approach.
The mainstream approach to science
Ravetz states that mainstream science embodies the reductionist tradition of Western science. It assumes that complex systems can be taken apart, studied in their elements to understand them in detail, and then reassembled again. Science helps to discover pieces of facts and enables their application; the revealed discoveries thus become tools that improve human welfare. The downside of this procedure is that it treats systemic properties as incapable of scientific study and thus ignores them. Furthermore, mainstream science operates on the assumption that all innovations are safe until they are proved to be dangerous. Working on this assumption can have fatal consequences, as can be seen in our degraded and destabilized natural environment. Thus the way science works and has worked for centuries has now become a threat to the survival of our civilization. Another threat that has to be noticed is the fact that the social processes of research have changed in ways that cause much of the discovered knowledge to be withheld from the public. Many researchers depend on funders, who oftentimes decide against publication of the discovered knowledge in order to preserve an advantage for their company.
Biomedicine and molecular genetics have become the leading areas of science and combine every researcher’s aim to discover something spectacular with corporate greed in commercialization. Everything that can be done at some point in time will be done. This approach to science does not seem to evaluate the potential risks for humanity that certain discoveries might involve; examples are human clones or xeno-transplants.
The myth of value-free science
The mainstream approach to science is built on the doctrine of progress, and its application is directed at the growth of profit and power. Safety and ethics, on the other hand, seem to be in the way and are therefore treated as less important. Additionally, it has by now been shown that science’s claim to be value-free is not true. In fact all statistical tests are value-loaded in some way, since they are designed to avoid one or another type of error. A test might be overly selective, which means that it has a high risk of rejecting correlations that are actually real, or overly sensitive, accepting correlations that happened to occur by accident. Since every researcher has to decide which kind of error he or she wants to minimize, a value-loaded choice is made. This choice determines what knowledge we attain and what knowledge we are prone to ignore. The fact that values are present in all areas and kinds of research reminds us that it is impossible to eliminate uncertainty completely. This is contrary to the widespread belief that science can indeed achieve certainty, preferably in numerical form. This belief has relegated so-called “soft” sciences like the social and behavioural sciences to a lower status. Precautionary science (to which we will turn later on) operates exclusively in areas with severe uncertainties and is therefore at a high risk of being despised or neglected by the mainstream scientific elite. Post-normal science teaches us the great lesson that we do not need to eliminate all uncertainty in order to get scientific results of quality; rather, we should attain skills that help us to deal with it.
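The value-loaded trade-off Ravetz describes can be made concrete with a small simulation. The following is a minimal sketch in Python (my own illustration, not from the article; the effect size, sample size and thresholds are arbitrary assumptions). Tightening the significance level makes a test more “selective” (fewer accidental correlations accepted) but also makes it miss more real effects, and vice versa:

import numpy as np
from statistics import NormalDist

rng = np.random.default_rng(42)

def error_rates(alpha, effect=0.3, n=50, trials=5000):
    # One-sided z-test on the mean of n draws with known sigma = 1:
    # z = sample_mean * sqrt(n) is standard normal under the null.
    z_crit = NormalDist().inv_cdf(1 - alpha)
    z_null = rng.normal(0.0, 1.0, (trials, n)).mean(axis=1) * np.sqrt(n)
    z_real = rng.normal(effect, 1.0, (trials, n)).mean(axis=1) * np.sqrt(n)
    false_positives = np.mean(z_null > z_crit)   # accidental "discoveries"
    missed_effects = np.mean(z_real <= z_crit)   # real correlations rejected
    return false_positives, missed_effects

for alpha in (0.10, 0.05, 0.01):
    fp, miss = error_rates(alpha)
    print(f"alpha={alpha:.2f}: false positives ~{fp:.3f}, "
          f"missed real effects ~{miss:.3f}")

With these assumed numbers, lowering alpha from 0.10 to 0.01 cuts the false-positive rate tenfold but roughly triples the share of real effects that go undetected (from about 20% to about 58%), which is exactly the kind of value-loaded choice Ravetz has in mind.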
Post-normal science of precaution
Up to this point it should have become clear that, even though a lot of our knowledge has been discovered with it, the approach suggested by mainstream science does not seem to fit today’s time and society anymore. In the following part I will elaborate on the suggestion of a post-normal science of precaution. The mainstream approach has embodied the principle that innovations are safe until they are proved to be dangerous; the precautionary principle turns this around, claiming that all innovations are dangerous until they are proved to be safe. It argues that we need to combine nature, science, and society. Precautionary science deals with issues that typically involve uncertain facts, contentious values, high stakes and the need for urgent decisions. Examples are some areas of medicine and public health, such as AIDS. An example where it would be needed is the study of environmental toxicants, since their effects on human health can only be discovered after a long delay. Thus, if we operate in this field on the mainstream principle which says that something is safe until it is proved to be dangerous, much of the damage might already be irreversible.
Ravetz argues that endorsement of the twin goals of science, namely the advancement of knowledge and the conquest of nature, is not sufficient for guiding post-normal science. Ethics, society and ecology need to be taken into account as well. All of these might be summed up as “safety” and “sustainability”. Thus post-normal science contains science and technology as well as positive visions of humanity, its welfare and destiny. It does not deal with abstractions of real-world problems anymore but rather with the real-world problems themselves: it has to deal with interactions of the natural world as well as with synergies with profit, bureaucracy, poverty, exploitation and war. The problems that are researched arise and become salient through public debate. The values that this approach aims to incorporate are precaution, sustainability, community and citizens. The term “post-normal” seems to include all of these and serves as a reminder that science, like everything else, needs to adapt to new standards. It asks for a “paradigm shift”, a “scientific revolution” as Kuhn calls it.
Essential to this post-normal approach is its methodology: both systems uncertainties and decision stakes might be high. The latter means that investments and commitments of a personal, commercial and institutional nature might be at stake in the inquiry. The post-normal approach additionally asks for an extended peer community, which should incorporate people from all kinds of domains, including local knowledge and investigative journalism, since this will bring extended facts to the discussion.
Issues of culture and quality preservation
Next we turn to a cultural perspective and the issue of scientific quality. It appears almost paradoxical that everybody knows about the important link between science and freedom of thought, yet we mostly let our students learn facts without giving them space for their own judgments or independent thinking. Thus, there might be a need for an educational reform. The downside of such a reform might be that established standards could be lost, as can happen in every reform. On the other hand, it is equally probable that if we do not aim for these new understandings, quality in our prevailing mainstream science might become compromised and corrupted. We already face the problem of privatized knowledge, due to the fact that most research, and thus the discovery of knowledge, depends on funding which is often provided by private companies. These companies will oftentimes use the gained knowledge for their own benefit rather than society’s. Further, we are in need of effective external quality assessors to ensure the quality of conducted research. Post-normal science suggests mutual learning by taking in the perspectives and commitments of other parties in the extended peer community mentioned earlier. Still, this has its problems, since it is built on mutual trust and depends on policies.
15. Medicalisation
Panels with conflicting interests
Although overall health has improved over the years, the prevalence of specific diseases, and with it medication consumption, has increased remarkably. How can this paradoxical phenomenon be explained? The problem is that many of the people who sit on the panels which decide what counts as a disease, as well as on treatment thresholds, have conflicting interests: most of them have financial ties to, for instance, pharmaceutical companies, which benefit from wider patient pools.
A well-known example of this dilemma is the Diagnostic and Statistical Manual of Mental Disorders. Allen Frances chaired the taskforce for its fourth edition (DSM-IV) and now believes that this edition led to a lot of unnecessary diagnoses, especially in the fields of autism, bipolar disorder and attention deficit. By now it is clear that 56% of the panel members had financial ties to drug companies; in the panel for mood disorders it was 100%. Frances states that the problem lies not merely in financial conflicts but also in intellectual ones, since all researchers aim for recognition. He therefore advises that people outside the profession of psychology/psychiatry would be more appropriate members of these panels, since experts in a field can never assess the risks and benefits of defining a new disease without bias.
A positive example
Common sense would advise us to only allow people who are not paid by any drug or device company to become members of these panels, but it is argued that such people are rather hard to find. A positive example seems to be the US National Institutes of Health, where experts with financial or intellectual conflicts are prohibited from being members of its panels.
Furthermore, their panels often include representatives from diverse disciplines, like nurses, social workers and doctors from different fields of expertise. They would even like to broaden their pool of representatives further, including biostatisticians, epidemiologists, non-health professionals and people representing the wider public, as well as health economists to assess the cost effectiveness of changing diagnostic categories.
But even with more independent and diverse panels, the key problem of a gold standard for how to define a disease remains. Frances emphasizes the risks of new diagnoses, since they can lead to the unnecessary treatment of people with drugs that might even harm them.
Lecture 6
X. The examples of free will and “dangerous” brains
This article was used before and can be found on page 19 of this summary (lecture 3, article 8).
16. Brain Imaging, biopower and practical neuroscience
The occurrence of neuroscience in public discourse has recently increased. A range of self-help books, for instance, refers to knowledge and wisdom derived from neuroscientific research. One of these books is “Making a Good Brain Great: The Amen Clinic Program for Achieving and Sustaining Optimal Mental Performance” by Daniel Amen. Johnson focuses on this publication in order to illustrate the persuasive nature of brain imaging and the responsibility to be healthy that has been assigned to individuals through the development of brain imaging techniques. Readers of this kind of self-help literature are seen as an active audience: it is communicated that various techniques exist to change the brain in order to improve well-being.
Our current political and economic system supports the notion of responsibility for one’s own brain and well-being. First, neoliberalism translates political, personal and social issues into technical problems ready for planned intervention; second, it emphasizes individual freedom, including the responsibility of the individual. Individuals, collectively governed as “biopower”, play an important role in the governing process. The aforementioned self-help books support the idea of “biopower” since they look at life in calculable and scientific terms and view individuals as active agents capable of self-management. Johnson relates this notion to “healthism”, a growing consciousness of healthy living.
Amen’s book is subdivided into two parts. The first part functions as an instruction on the brain and on imaging technologies. The second half is a program for improving the brain within 15 days. It suggests many activities that are beneficial for the functioning and well-being of your brain, including both physical and psychological advice. Even the thoughts one is having should be paid attention to, since positive and negative thoughts can change the brain in positive or negative ways, respectively. Johnson critically describes the second part as common-sense knowledge which sounds a lot more elaborate and special when brought in relation to the brain. An important characteristic of the book is the brain scan images; much of the advice is illustrated with the help of these images. They are so-called SPECT images, derived from nuclear medicine procedures that measure blood flow. The procedures are based on the assumption that blood flow in the brain is correlated with brain activity. SPECT images are three-dimensional; one can produce surface and active images. They are perceived by many as evidence for states and developments of the brain, yet their role is more persuasive than diagnostic. Pictures have been proven to be powerful; they convince the patient, or in this case the reader of the book, to adhere to treatment.
The healthy brain in Amen’s book is defined by external behavior: a citizen who behaves well and is well-adjusted must have a healthy brain. Individuals must understand their problems in neuroscientific terms and with neuroscientific vocabulary. An understanding of this kind leads to a transformation of the self, the willingness to work on oneself by working on one’s brain. A biological understanding of this kind further leads to reduced feelings of shame and guilt, since one can blame one’s behavior on one’s brain structures. This leads to self-forgiveness, which can be the first step of working on oneself.
In the context of Amen’s book, brain scans not only display one’s biological features but also function as images of a person’s character.
Amen postulates a direct relationship between a person’s biological features and a person’s character. An individual’s character can hence be calculated through technologies.
Health encompasses both biological and social aspects. One’s character functions as a statement about the brain, about whether someone possesses a healthy or a sick brain. A healthy character is driven by the will, whilst an unhealthy character is driven by the unhealthy brain. Health functions as a mediator between the brain and the character. The natural state of a person is the healthy state; humans are originally good. By defining healthy as “being a good citizen” Amen promotes the will to be a good citizen. People want to be healthy, and if the state of health is equivalent to being a good citizen, people should want to be good citizens as well.
Health is not an end state, but an ever-receding aim. We can always increase our health but can never reach a state of absolute health. We are impelled to keep reaching for health and to manage our own brains. It is possible to change our brains for the better and to choose health; the good citizen must choose health and must work for it.
17. How has neuroscience affected lay understandings of personhood?
The article “How has neuroscience affected lay understandings of personhood? A review of the evidence” written by Cliodhna O’Connor and Helene Joffe was published by SAGE in 2013. It deals with the growing prominence of neuroscience and how this affects our society and people’s self-concepts. The authors reviewed several empirical studies in order to investigate this rather broad phenomenon.
New brain scan technologies have led to applications of neuroscience in more and more social fields, such as law, marketing, public policy, education, parenting and economics. Thus, it has become a concern of social scientists how members of society deal with this newly acquired knowledge. The deficit model evaluates public understandings of neuroscience in terms of their accuracy, but whether a conception is scientifically true or false is often irrelevant for its effect on laypersons’ thinking; what matters is the meaning people infer from neuroscientific ideas for their personal and social lives. It would be interesting to investigate in which ways neuroscience impacts folk psychology, since folk psychology guides people’s behavior, how they perceive their environment and how they interact with others. According to O’Connor and Joffe, some even claim that recent developments in neuroscience have led to revolutionary changes in the ways individuals and society are understood; neuroscience is supposed to alter the dynamic between personal identity, responsibility and free will.
Opposed to this view stands the well-established social representations theory. It states that people tend to look for scientific findings that validate their already existing values, identities and beliefs. Thus, new scientific information, such as that gained by neuroscience, can either challenge and change present understandings or be assimilated to strengthen established concepts. The present article tries to identify which of these views comes closer to the truth.
The prominence of neuroscience
First, we look at how aware people really are of neuroscience, that is, how prominent it has become. Indeed, we can observe a sharp increase in media coverage between 2000 and 2010. The main topics of neuroscience-related news are neuro-realism, neuro-essentialism and neuro-policy.
Neuro-realism describes how people use neuroscientific information to let phenomena appear to be objective, real.
Neuro-essentialism refers to the idea that the brain and its properties are the essence of a person.
Neuro-policy describes how brain research can be deployed in order to support political agendas.
A reasonable explanation for the rising media coverage is the rhetorical force that neuroscientific information carries: it makes claims seem more credible and confers legitimacy on arguments. Three-dimensional brain images are especially convincing. To what extent neuroscience has entered the public’s everyday thoughts and conversations remains unclear; studies suggest a rather low familiarity.
Neuroscience’s effect on self-concepts
A second question that comes to mind when thinking about neuroscience is whether it leads people to see their self-concepts as more strongly rooted in biology rather than in their experiences and environments. Materialist theories of the person argue that the mind is physical matter; dualist theories see mind and body as separate, with the mind belonging to a non-physical plane. The establishment of terms such as “neurochemical self”, “cerebral subject” and “brainhood” in the sociological literature suggests that neuroscientific findings may have led to a stronger endorsement of the materialist view. Nevertheless, studies have revealed that behavior is mainly regarded as shaped by an interaction of relationships with parents, teachers and society. At least this holds true for the general population.
The neurochemical self
Looking at clinical populations, research has found a much stronger incorporation of neuroscience into the self-concept. This might have several reasons. First, it allows for objectification of the disorder, making qualitative information quantitative. Furthermore, it is regarded as a neutral tool that legitimizes people’s experiences and makes a disorder easier to understand for healthy populations. Perhaps foremost, it helps to sustain a positive self-identity, which actually led to the “neurodiversity movement”. This movement emphasizes that disorders are simply an alternative form of biological expression.
These diverging research findings on the salience of neuroscientific ways of seeing the self in clinical and non-clinical populations suggest that the “neurochemical self” might be triggered by events such as diagnosis and medication. Thus people who are confronted with their body’s biological components, and with their impact, might see themselves in relation to these more frequently. Another idea that is introduced is the multi-dimensionality of disorder meanings. This concept describes a split of the self-concept into two distinct elements, namely one that is based on the brain and one that is completely separate from it.
Neuroscience: Predeterminism vs. free will
Given the idea that who we are and how we behave is controlled by our biological set-up, the question arises to what extent we are still free agents. To what extent can we be held responsible for our actions when the development of certain brain areas has led us, for instance, to cheat on our spouse, kill our neighbor or hit our child? This philosophical battle between predeterminism and free will has now been raised again, integrating neuroscientific ideas. Apparently the newly discovered connections between certain brain areas, or a person’s neurotransmitter constitution, and his or her actions support the idea of a predetermined life.
How can we hold a person responsible when science shows us that his or her genetic make-up led to certain actions, or at least made their occurrence much more likely? This would have major social implications, since our whole society is built on the assumption of personal responsibility; we would need to reconsider our educational as well as our legal system. Luckily our brain is much more complex; in fact it has plasticity. Plasticity means that the brain can change due to, for instance, experience, nutrition and environment. This weakens the idea of biological predeterminism and allows for the idea of personal responsibility. It is further supported by implicit theories of agency, robust cultural theories that have existed for many years and have been transmitted from one generation to the next; they define the entities that act autonomously and intentionally to cause events. Most people prefer the idea that everyone possesses free will and dislike the concept of predetermined actions. This might be a reason why the concept of brain plasticity has become so popular, predominantly in terms of training or boosting one’s own brain. People like to embrace the idea that they can enhance or limit their neural functions through their lifestyle choices: good nutrition and mental exercise should maximize their abilities, whereas substance abuse and risky behaviors might endanger their capacities. Dementia research has revealed that humans especially fear the idea that the disease will dissolve their personal identity, independence, self-determination and self-control.
Neuroscience and society
The last question the authors investigate is whether neuroscientific explanations are capable of reducing stigma set by society. Indeed, evidence has been found that supports this idea. Neuroscientific ideas have been embraced in order to explain the deviance of subpopulations, such as mentally ill people, from the “norm”. This has taken away some of the blame and moral condemnation society has put on them, thus promoting tolerance. Unfortunately, it also enhances social distance, perceived dangerousness, fear, perceived unpredictability and harsh treatment. This also applies to attitudes towards gender, race and obesity, and may even act as a self-fulfilling prophecy: for instance, a man acts more aggressively because it is expected of him, or a woman performs badly in mathematics because her gender stereotype prescribes it. It may also increase the calorie intake of someone suffering from obesity, or promote fatalism among mentally ill people regarding their recovery.
A negative social implication of taking biological elements as the basis of social categories is, for instance, the reinforcement of psychological essentialism. Essentialism can be defined as “the attribution of a group’s characteristics to an unalterable and causal essence” and involves several elements: the establishment of discrete, impermeable category boundaries, the perception of within-category homogeneity, the practice of explaining and predicting a group’s superficial traits in terms of the essence, and the naturalization of a category. This accentuates the differences between groups, reinforcing an “us-them split” now also on a biological basis. It gives stereotypes of particular social groups a natural constitution, which makes them seem even more legitimate and may thus reinforce stigmatization and discrimination.
In sum, it is hard to disentangle whether neuroscience has positive or negative effects on society’s attitudes towards social subgroups. The effect seems to differ across domains, improving attitudes towards homosexuality more than attitudes concerning race, gender, mental illness or obesity.
18. The anatomy of violence
The book “The Anatomy of Violence: The Biological Roots of Crime” by Raine deals with the idea that we will soon be able to use brain scan techniques to identify the risk that somebody will become a criminal. By now we are already able to identify correlations between the development of certain brain areas and criminal acts. Raine introduces several ideas about how this might affect our future legal system.
He starts by introducing LOMBROSO, the Legal Offensive on Murder: Brain Research Operation for the Screening of Offenders. LOMBROSO is a programme in which all males are supposed to go to a hospital to have their brain scanned as soon as they turn eighteen. By this, Raine hypothesizes, it will be possible to identify criminal risk groups. Men identified as Lombroso Positive-Violence will have a 79% chance of committing a violent crime within the next five years.
Members of the category Lombroso Positive-Sex have an 82% chance of committing rape or pedophilic offenses. Males put in the category Lombroso Positive-Homicide have a 51% chance of killing someone in the upcoming five years. Those identified as falling into one of these categories will be held in detention indefinitely. These detention centers will be highly secure, but built as a “home away from home” since their inhabitants have not committed a crime yet.
Raine takes this idea further by arguing that, due to the system’s success, it will be extended. By 2049 the government will introduce the national screening programme NCSD, in which all 10-year-old boys will undergo a comprehensive medical, psychological, social and behavioral evaluation.
Once a high risk of becoming a criminal has been identified, the parents of these children are advised to place them in residential treatment programs, which will be able to cut the odds of becoming criminal by more than half. Two years later this intervention is no longer a parental choice but compulsory, since the parents of such a “rotten apple” are thought incapable of taking responsibility for such an important decision.
8 years later
The Minority Report scenario proposes stopping crime before it happens. Following this reasoning, the Parental License Act is introduced, which states that all prospective parents must obtain a license before they are allowed to have children.
Raine believes that the dystopia he describes in his book might become reality. He argues that first steps in this direction have already been taken, naming Guantanamo Bay as a present-day example.
Lecture 7
19. Kinds of people: Moving targets
The following text summarizes the British Academy Lecture “Kinds of people: Moving targets”, which was held by Ian Hacking in 2006. The lecture is about classifications of people, how they impact the people who are classified, and how these people themselves impact their classifications. Hacking states that he has been interested in classifications for many years and has written two books on the topic. Furthermore, he coined two new terms concerning this topic.
Making up people and the looping effect
First, “making up people” refers to the process of creating a new kind of person through classification. By giving a kind of person a name, we impose a certain way of behaving and thinking on it, and so create kinds of people that did not exist before. How this is done will be clarified with two examples, namely multiple personality and autism.
Second, he coined the “looping effect”, which describes how a classification and its classified target interact with one another. He thus calls those who are classified “moving targets”, since our observations and investigations interact with the targets themselves and change them.
He sees himself as operating in the “human sciences”, which include many social sciences, psychology, psychiatry as well as parts of clinical medicine. Hacking classifies his thoughts and reflections about the classification of people as a form of nominalism. He departs from traditional nominalism in that he sees his work as dynamic rather than static, as it looks at how names interact with the people named.
Example: Multiple personality disorder
To clarify the concepts introduced, he uses the example of multiple personality disorder, describing how people with multiple personality were made up in the following sentence:
“In 1955 this was not a way to be a person, people did not experience themselves in this way, they did not interact with their friends, their families, their employers, their counsellors, in this way; but in 1985 this was a way to be a person, to experience oneself, to live in society.”
He states that this process of making up involves five elements. First, we have a classification, namely “multiple personality disorder”, which is now our target. Second, we have people whom we can put in such a classification; for this they have to fulfill certain criteria, for instance being unhappy or unable to cope. Third, we have institutions, such as clinics or the annual meetings of the International Society for the Study of Multiple Personality and Dissociation. Further, there are talk shows like Oprah Winfrey’s, which emphasize the importance and prevalence of our new kind of person, as well as special training programs for therapists so that they are prepared for their new kind of patient.
Fourth, we have knowledge, which Hacking defines as the “presumptions that are taught, spread and refined within the context of the institutions” rather than as true belief. On the one hand we have expert knowledge, the knowledge of the professionals of a certain field; on the other hand we have popular knowledge, which is spread in the wider population of interest. In our running example this might be the belief that people diagnosed with multiple personality disorder have diverse personalities, that they cannot control who they are at a given moment, or even that they have different handwritings. Fifth, we have the experts that generate the knowledge, decide how valid it is and whether they should integrate it into their practices. To guarantee their legitimacy, authenticity and status they work in institutions. Further, they investigate, study, try to help and advise those people who are classified as being of a certain kind. As you may have noticed, this framework operates in a circular manner: the process of making up people starts with a classification and works its way up through people, institutions and knowledge until it arrives at experts, and from the experts it loops back to the classification. Hacking describes this framework:
Classification
People
Institutions
Knowledge
Experts
as a positivist list. He argues that all five elements are needed and interact in the process of making up people and the looping effect. Finding historical or earlier manifestations of a certain classification helps it to appear more legitimate, for example when people state that homosexuals have always existed, referring to ancient Greek paintings of sexual acts between people of the same sex.
Turning to our example of multiple personality disorder, Hacking notes that it was renamed dissociative identity disorder and terms it a transient disorder: transient because it disappeared as soon as its name and expected symptoms disappeared.
Example: Autism
A more recent example is the conception of autism. It was invented in 1908 as an abnormal introversion and self-absorption, a definition that remained valid until 1992. In 1943 it was termed infantile autism since it was thought to refer to children; today it is known that autism usually lasts a lifetime. It can be recognized early, no later than 30 months after birth. So far it has not been discovered what causes autism, even though many correlations have been drawn, for instance with the mother’s finger length. It is assumed that a combination of neurological, biological and genetic abnormalities causes autistic disorder. No cure has been identified either. At least it has been found that behavioural therapy, more specifically pure operant conditioning, can help to compensate for certain deficits; a loving and caring environment also helps to overcome some of the symptoms.
In 1973 autism was rare and characterized by a definite and narrow stereotype. Since then we have developed an entire spectrum of autistic disorders, including Asperger’s syndrome, which involves high-functioning people with autism: people who have all the symptoms of autism except for the language difficulties. A famous example is Temple Grandin, who says that she sees the world more as an animal than as a human and used this to help develop more animal-friendly slaughterhouse techniques. An autism liberation front has been founded, which argues against trying to assimilate autists to normal people: autists are better at some things, just as everyone else is better at other things. Hacking argues that the class of high-functioning autists expanded rapidly. By definition these are autists who have largely recovered from their disorder and grown out of most of their symptoms. Once these “recovered” autists were established in society, more and more adults recognized similar behavioural patterns in themselves. Even though they had never been diagnosed with autism, they classified themselves as high-functioning autists, which led to the rapid expansion of this classification. You may have noticed that the previously described framework of a) classification, b) people, c) institutions, d) knowledge and e) experts also fits this example.
Engines of discovery
These two examples of how the five-part framework works give us an idea of how Hacking’s making up people takes place. But what are the driving forces behind it? Hacking calls these the engines of discovery:
Count
Quantify
Create Norms
Correlate
Medicalise
Biologise
Geneticise
Normalize
Bureaucratize
Reclaim our identity
The first seven (count, quantify, create norms, correlate, medicalise, biologise, geneticise) are identified as classical engines of discovery. Normalize is seen as an engine of practice and bureaucratize as an engine of administration. The tenth belongs to those who are made up: they discover the process by which they are made up and claim their own identity back.
To explain each of these engines in depth, Hacking guides his audience through them using the examples of autism and obesity.
Count! People start to look at the prevalence of persons displaying a certain type of behavior. The first attempt at counting autistic children yielded 4.5 per 10,000; today, eighty counts have been published, yielding 40 autistic children per 10,000. Obesity has increased all over the world in the past twenty years.
Quantify! In the 1970s the Body Mass Index was introduced. In 1988 a BMI above 25 was defined as overweight and a BMI above 30 as obese; 18.5 is defined as the cut-off score for being underweight. To give you a sense of the meaning of these numbers: Marilyn Monroe’s BMI varied between 21 and 24, and the models in Playboy magazine have gone down from a BMI of 19 to 16.5.
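To make the quantification concrete: the BMI is a person’s weight in kilograms divided by the square of his or her height in metres. Below is a minimal sketch in Python applying the cut-offs just mentioned (the function names and sample values are my own illustrative choices, not from Hacking):

def bmi(weight_kg: float, height_m: float) -> float:
    # BMI = weight in kilograms divided by height in metres squared.
    return weight_kg / height_m ** 2

def category(bmi_value: float) -> str:
    # Apply the cut-offs from the text: 18.5, 25 and 30.
    if bmi_value < 18.5:
        return "underweight"
    if bmi_value < 25:
        return "normal"
    if bmi_value < 30:
        return "overweight"
    return "obese"

# Hypothetical example: 70 kg at 1.75 m gives a BMI of about 22.9 ("normal").
print(category(bmi(70, 1.75)))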
Create norms! Define what counts as normal and what counts as deviant. For weight we have the normal range of the BMI; for most disorders it is rather hard to state specific norms. Further, it cannot be said which came first, normalcy or deviance.
Correlate! To find explanations for all sorts of phenomena and disorders, we correlate them with everything we can think of. For the social sciences, correlations are especially fundamental. It was found that most autistics are male, and researchers tried to identify risk factors such as the mother’s nutrition. We say being overweight is bad for a person because of its high correlation with a range of diseases as well as its social stigma.
Medicalise! We aim to medicalise all kinds of people who deviate from the norm in order to make them more normal again. The increasing prescription of anti-craving medicines is one example, as is the expansion of the DSM to include more and more symptoms and disorders.
As soon as a child is diagnosed with autism, it has a mental disorder and thus a medical problem. Another prominent example is the diagnosis of ADHD, whose number of sufferers has also been rising rapidly over the last two decades. This, too, might be due to the process of medicalization.
Biologise! After we have stated that something is a disease or disorder that needs treatment (medicalise), we assume that this observed deviance must have a biological, more specifically neurobiological, cause. For instance, overeating might be explained by a chemical imbalance, and autism might be due to a shortage of mirror neurons. In every case, biologising takes the responsibility at least partly away from the person. This might be good in that certain stereotypes, such as “fat people are lazy”, are loosened; on the other hand it can lead to a feeling of helplessness.
Geneticise! One step further, from medicalisation via biologisation, is geneticisation. This refers to our aim to find explanations for certain deviations in a person’s genetic make-up. A popular example is the claim that criminal behavior might have genetic origins. Due to new brain imaging techniques this debate has recently been raised again. Of course, the question of responsibility is a prevalent issue here as well.
Normalize! Normalization is an engine of practice rather than of discovery. It involves our urge to make deviant people as normal as possible. For this we use behavioural therapies for autistic children as well as the anti-craving drugs for obesity mentioned before.
Bureaucratize! Bureaucracy is an engine of administration. We have a system that identifies people who deviate from the norm and assumes that they need help to get back on track and function properly. An example is that we screen children in the early years of schooling for developmental problems; if problems are detected, we place these children in special schools where they should receive the best support. Hacking argues that this might lead to a feedback effect in which the developmental problem becomes more salient for the children themselves, so that they tend to express it more often than they would in a normal school. Obesity is a contrast case here, since it has not been bureaucratized yet.
Resist! Resistance is the engine that people who get medicalised, normalized and bureaucratized might use to reclaim their identity. They might use it to get back some of the control that the experts and institutions took from them when putting them in a certain kind of category. A very famous example of this process of resistance is the gay pride movement: its followers, as well as their predecessors, were able to restore some control over the classification into which they fall by redefining it. In autism we have the “autism liberation front” mentioned earlier. In obesity there are several organizations that call for pride and dignity in heavy bodies; an example is the French organization Groupe de Réflexion sur l’Obésité et le Surpoids, whose acronym, GROS, is the French word for “fat”.
All ten engines just described act and interact in a dynamic manner. They constantly set the limitations and boundaries of the newly made-up kinds of people. This is why Hacking calls these kinds of people moving targets.
Kinds of People: “autistic child” vs. “child with autism”
Hacking states that the process of making up people can be done in different ways. The species mode would involve forming a new species, for instance the autistic child. On the one hand this might be problematic, since it depersonalizes the people and turns them into objects of scientific inquiry. On the other hand, one could argue that something like having autism is more than just a characteristic; it is part of the nature of a person, an essential property. Thus, the term “autistic child” would capture it much better than “child with autism”. A contrast case is again obesity, since being overweight is usually only a characteristic of a person and not part of his or her enduring identity; it is seen as a property like someone’s hair color. This can be further elaborated by considering the similarities people in these two kinds of categories have. Autistic children tend to have a lot in common, ranging from language and social problems to an obsession with order and literalness; obese people, on the other hand, do not have much in common except for being overweight.
Further, Hacking lists some more examples of how our society has been making up kinds of people in the past decades. He mentions the introduction of the poverty line in the 1890s, which defined who and what is poor and who and what is not. Now we use “the poor” almost in the sense of a species.
Suicide is another example of how the five-part framework interacted to change a concept. Suicide has always existed, but now it is tied to depression and read as a cry for help, or it is used as a ruthless and terrifying weapon indifferent to what kind of people it kills. Hacking argues that the way we see and define suicide is not a human universal, but rather something that became true in our Western society.
Genius is the last example Hacking mentions. He refers to Galton’s Hereditary Genius and the introduction of IQ tests, which help us to identify the geniuses in our population.
20. What kinds of things are psychiatric disorders?
The article “What kinds of things are psychiatric disorders?” written by Kendler, Zachar and Craver was published in the journal “Psychological Medicine” in 2011. It deals with four different models of how we can classify psychiatric disorders, namely essentialist kinds, socially constructed kinds, practical kinds and mechanistic property cluster (MPC) kinds. The authors present each of these models and then evaluate them according to their practicability and usefulness.
Essentialist kinds
Our first model, the essentialist kinds model, proposes that everything, in our case each specific disorder, has one underlying essence. This essence is thought to be inherent in every individual who suffers from a disorder D, while all individuals who do NOT suffer from disorder D will not have it. The essence is a single, well-defined etiological agent that stands in direct causation to each key defining feature of the disorder.
To clarify the concept we will use the eating disorder anorexia nervosa as an example. We start by choosing an essence as the fundamental cause of the disorder: everyone who suffers from anorexia nervosa should have this essence, and people who do not suffer from it should not. The constant urge to lose weight might be our essence. This impacts and causes the key features of anorexia, like the preoccupation of thoughts with food, loss of pleasure and interest in many activities, excessive exercising and so forth. You may have noticed some flaws in this concept: can we ever be sure that we chose the right underlying essence? And can we ever be sure that this essence is really the only causal agent in our disorder? Most of the time we cannot. Usually disorders are caused by several very different factors, as we have learned so far (remember “Nature and Nurture”!). Thus, the concept of essentialism simplifies our concept of psychiatric disorders too much and reaches its limits. In addition, it ignores the fact that features of a disorder are interdependent and thus interact with one another. Regarding biology, it would not allow for biological explanations, since biological factors such as genetics are too diverse, flexible and ever-changing.
Socially constructed kinds
The second model of interest is that of socially constructed kinds. This model suggests that kinds of psychiatric disorders are constructed by the categorizations that societies and cultures impose. It is by now well established that social factors have a large impact on the development of psychiatric disorders, and the authors agree that cultures define and conceptualize psychiatric disorders. The idea they reject is that the existence of a specific psychiatric disorder is based merely on a culture’s conceptual distinction of it. In their opinion, we should find classifications of psychiatric disorders that integrate common biological, psychological and social factors and account for the distribution of specific disorders across cultures and history. Depression is a popular example of a disorder that has existed for hundreds of years and occurs across different cultural contexts.
Practical kinds
Third, the authors present the model of practical kinds. As its name suggests, it takes an instrumentalist approach to science, focusing on the elements that are most useful in practice. For psychiatric disorders this means that a good categorization of a certain kind of disorder should support reliable diagnosis, prognostication and treatment selection. If diagnosing someone with a psychiatric disorder helps us to describe in detail how this person will behave and how he or she can be treated, the kind is very useful for science. This atheoretical, descriptive approach was adopted by the DSM-III, DSM-III-R and DSM-IV.
The authors regard the practical kinds approach as more useful than essentialism and social constructionism, since it embraces aspects of both without reaching its limits as quickly. Still, they see its limitations: it does not give any clear advice as to how classifications should be built.
Mechanistic property cluster (MPC)
To solve the problems of the models described so far, the authors introduce the mechanistic property cluster (MPC) model. This model is supposed to capture the various and complex mechanisms that underlie, produce and sustain psychiatric syndromes. The MPC model suggests that there are indeed essence-like underlying explanatory structures inherent to most psychiatric disorders, which makes them somewhat universal. But it also cautions us that these structures are messy and interrelated, and therefore hard to disentangle. Further, it emphasizes that these “essences” should rather be regarded as relatively stable complex interactions between behavior, environment and physiology, which may have arisen through development, evolution and interaction with the environment.
The authors emphasize that the MPC model is open to systematic differences that might arise within subpopulations. Just as hair color, body weight and food preferences vary across cultures, there might be hybrid variants of psychiatric disorders. Still, the fuzziness of boundaries that may result does not deprive these kinds of their stability. As soon as we have identified the underlying properties that interact within a psychiatric disorder, we can use the MPC model to predict, explain and control these kinds. Knowing the MPC kinds and their clusters will enable us to make inferences about the past, present and future of an item that belongs to a specific kind.
The mechanisms just described will typically span several levels and thus also include interactions among specific symptoms. For example, the psychiatric disorder of phobia leads to avoidance of the feared stimulus, and this avoidance in turn prevents the patient from habituating to that stimulus. In schizophrenia, hallucinations often cause delusions. The authors therefore conclude that individual symptoms of a psychiatric disorder interact with and sustain one another, which is why illnesses are often regarded as composed of a relatively stable set of traits. Such interactions can also create self-fulfilling negative expectations that worsen the overall condition of a patient. For instance, the cognitive biases towards negative information found in depression compound the depressive state even further.
A cause of a psychiatric illness might be psychological as well as biological. It might also be that a cause that was psychological at first, for instance substance use, results in changes at the biological level which in turn contribute to sustaining the dependence.
Mechanistic property cluster vs. essentialist kind
You may have noticed the similarity between the MPC model and the essentialist kinds model: both argue that kinds of psychiatric disorders are grounded in shared features of the causal structure of the world. Still, they differ in various ways:
- MPC recognizes that there might be numerous causes for a specific kind of psychiatric disorder, including evolutionary, developmental, genetic, physiological, psychological, behavioral and social causes. Essentialists argue for a single underlying mechanism (essence) that causes a specific kind of psychiatric disorder.
- MPC acknowledges probabilistic relationships between the relevant causal factors and the occurrence of specific symptoms: as a cause changes, the probability that a specific symptom or set of symptoms arises may change. Essentialists, on the other hand, emphasize deterministic causes.
- MPC assumes that the same network of symptoms might be produced by different underlying mechanisms. MPC kinds are thus “multiply realizable” by the causes or sets of causes that produce them.
The authors predict that more information about the underlying mechanisms will provide new possibilities for classification, although obtaining this information may be challenging because of the potential overlap between mechanisms.
As the authors state, the concept of the mechanistic property cluster was taken from the philosophy of biology. Borsboom used a similar approach, but relied more on a psychometric perspective. He argues for a “causal systems perspective” in which psychiatric disorders should be seen as “sets of symptoms that are connected through a system of causal relations”. Factor-analytic work has suggested that there are indeed underlying liabilities to illness which cause their symptoms.
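The causal systems perspective, and the mutually sustaining symptom interactions described earlier (avoidance preventing habituation, hallucinations feeding delusions), can be illustrated with a small simulation. The sketch below is not taken from the article or from Borsboom’s work; the symptom names, edge weights and probabilities are hypothetical. It only shows the general idea: symptoms probabilistically activate one another, so a single trigger can push the network into a self-sustaining state.

```python
import random

# A minimal sketch of a "causal systems" symptom network (illustrative only).
# Symptoms, edge weights, base rate and recovery rate are hypothetical.
EDGES = {
    "insomnia": [("fatigue", 0.8)],
    "fatigue": [("concentration_loss", 0.6), ("low_mood", 0.4)],
    "concentration_loss": [("low_mood", 0.5)],
    "low_mood": [("insomnia", 0.5), ("anhedonia", 0.7)],
    "anhedonia": [("low_mood", 0.6)],  # feedback loop: symptoms sustain each other
}
SYMPTOMS = list(EDGES)
BASE_RATE = 0.05   # probability of spontaneous symptom onset per step
RECOVERY = 0.2     # probability that an active symptom remits per step

def step(state):
    """One probabilistic update of all symptoms."""
    # Summed causal input each symptom receives from currently active neighbours.
    incoming = {s: 0.0 for s in SYMPTOMS}
    for s, targets in EDGES.items():
        if state[s]:
            for target, weight in targets:
                incoming[target] += weight
    new_state = {}
    for s in SYMPTOMS:
        if state[s]:
            new_state[s] = random.random() > RECOVERY    # stays on unless it remits
        else:
            p_on = min(1.0, BASE_RATE + incoming[s])     # active neighbours raise onset risk
            new_state[s] = random.random() < p_on
    return new_state

random.seed(1)
state = {s: False for s in SYMPTOMS}
state["insomnia"] = True   # a single trigger, e.g. a stressful life event
for _ in range(20):
    state = step(state)
print("active symptoms after 20 steps:",
      sorted(s for s, on in state.items() if on))
```

Because activation spreads probabilistically, different triggers or different edge weights can settle into the same symptom pattern, which gives a simple picture of what “multiply realizable” means in the list above.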
Conclusion
The authors conclude their article by arguing that the MPC model provides prescriptive guidance for classifying the complex concept of psychiatric disorders. It helps to link psychiatric illnesses as closely as possible to our growing knowledge of the causal factors that produce and sustain them, and may one day help to prevent and treat them. Even though it is not perfect, the authors argue that the MPC kinds approach offers the most promising answer to the question with which they started their article: “what kinds of things are psychiatric disorders?”
21. What has neuroscience ever done for us?
Despite the developments in neuroscience over the past 25 years, the standard treatments of mental health problems have hardly changed. This article describes why such a disconnect between knowledge and application exists, and asks how we could increase the effectiveness of existing treatments and develop new ones.
Although there is strong evidence that pharmacological and psychological interventions can be effective in treating depression, there are still many people who suffer from depression and die from suicide. The costs of mental health problems are very high, but why do they remain so high even though treatments have improved? It is a great challenge to choose the right treatment for the right individual, and there is large variability in the effectiveness of treatments across people. In a study by Trivedi and colleagues (2006), only one third of the participants with depression recovered fully after antidepressant prescription, and Cuijpers and colleagues (2014) found that less than half recovered fully across various kinds of psychotherapy, of which cognitive behavioural therapy performed best.
The difficulty in applying neuroscience research
The only positive influence of neuroscience research on mental health practice so far has been the use of animal models to develop new drugs, which has resulted in a few new treatments over the past ten years. The gap between neuroscience and mental health practice reflects the problem of consciousness: how does the brain generate experience? Linking neurons to experience is still a huge challenge. Measurements can be related to behaviour, but experience can only be inferred indirectly. Mental health practice takes subjective experience as its starting point; there is no objective test for it. Practitioners rely on descriptive definitions in which the symptoms specify the spectrum of diagnoses.
Why can’t we diagnose disorders using brain scans?
Mapping symptomatically and categorically defined mental disorders onto brain circuits is an impossible task, because two individuals who are both diagnosed with depression can have different combinations of symptoms, and these presentations therefore do not necessarily correspond to one underlying causal mechanism. Conversely, the same symptoms can be caused by different mechanisms. The difference with conditions such as a cough is that these are defined mechanistically and not according to symptoms. This example illustrates the limitations of a symptom-based diagnostic system when selecting a treatment. There is no single mechanism for depression, and therefore depression cannot be diagnosed with a brain scan. Neuroscientific measurements fit spectrum approaches better than categorical models, which is one of the reasons that the American National Institute of Mental Health decreased its funding of studies based on descriptive categorical diagnoses. An experienced therapist might be able to adjust an intervention according to identified underlying factors, but there is still a lot of trial and error in treatment selection.
What do we mean by a ‘cause’?
According to the author, symptoms need to be specified at the level of the brain. It might be argued that this ignores the psychosocial context, but brain function is influenced by both genetics and the social environment. Brain-based explanations of symptoms should therefore include both biological and psychosocial explanations; these are different levels of the same question, and integration of both frameworks is required.
Causes lie on a spectrum, with proximal causes on one side and distal causes on the other. Proximal causes are directly related to the mechanisms causing symptoms and are targets for treatment; they are identified by fundamental research. Distal causes are indirectly related to the mechanisms causing symptoms and are targets for prevention; they are identified through epidemiological research. More knowledge about proximal causes might contribute to better treatment.
Causes of depression
Research on depression has focused on distal causes such as heritability, genetics, life experience and personality. Proximal causes include various forms of stress, psychological constructs such as dysfunctional negative schemata, information-processing biases, and disrupted transmission in neurotransmitter systems. These factors are not specified at the level of activity in brain circuits. There is also no strong evidence for the serotonin hypothesis, the idea that low levels of this neurotransmitter directly elicit depressive symptoms. Attempts have been made to specify depression at the level of brain circuits: several studies suggest that negative affective perception and experience in depressed people are caused (proximally) by disrupted function in the brain circuits that support normal emotional processing, and robust abnormalities have been found in the subgenual anterior cingulate cortex (sgACC) during emotional processing (the sgACC was reduced in volume).
Interesting, but useful?
It can be concluded that ‘depression’ does not have one mechanism and is not a single entity at the level of the brain. Despite statistical evidence at group level for the reduced volume of the sgACC, this reduction cannot be identified at the individual level. Will we be able to exploit this variability in brain structure and function among individuals with the same diagnosis to improve treatment choice? Several studies have been conducted on emotional processing in individuals with depressive symptoms. The results show that psychological treatment was most effective in individuals with normal baseline sgACC activation during negative emotional processing, whereas pharmacological treatments worked better if the baseline activation of the sgACC was abnormal. The identification of these abnormally functioning regions has led to both invasive and non-invasive methods, such as deep brain stimulation (DBS). The interesting preliminary results of research on DBS still have to be replicated in randomised controlled trials. Non-invasive methods such as repetitive transcranial magnetic stimulation (rTMS) have already been approved in America. Recently, a new method has been developed in which visuo-spatial distraction is used to prevent the consolidation of memories after a traumatic experience.
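To make the idea of treatment stratification concrete, the sketch below encodes the finding reported above as a simple decision rule. It is purely illustrative: the z-score scale and the threshold are hypothetical assumptions, since the studies report a group-level pattern, not a validated clinical cut-off.

```python
# A purely illustrative decision rule for treatment stratification based on
# baseline sgACC activation, following the group-level pattern reported above.
# The activation scale and threshold are hypothetical; no validated clinical
# cut-off exists.

NORMAL_RANGE = (-1.0, 1.0)  # hypothetical z-score range for "normal" activation

def suggest_treatment(sgacc_activation_z: float) -> str:
    """Map a (hypothetical) baseline sgACC activation z-score to the treatment
    type that performed better for that profile in the studies above."""
    lo, hi = NORMAL_RANGE
    if lo <= sgacc_activation_z <= hi:
        return "psychological treatment"   # better with normal baseline activation
    return "pharmacological treatment"     # better with abnormal baseline activation

print(suggest_treatment(0.3))   # -> psychological treatment
print(suggest_treatment(2.4))   # -> pharmacological treatment
```

Even so simple a rule shows what the author means by exploiting individual variability: the biomarker, not the categorical diagnosis, drives the choice.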
A new era?
Although neuroscience has not had a large impact on mental health practice yet, an interesting period is coming. In the short term we should change the way we think about symptoms, focusing on proximal causes at the level of the brain and on how these relate to psychological processes. By accepting mechanistic variety, improved classification systems and new approaches and tools can be developed in order to improve treatment selection for the individual.
Source
Lecture notes are based on the course in 2014/2015.