Saturday, January 20, 2007

Philosophy and the cognitive sciences. Toward an ethic of cooperation in scientific research

The cognitive sciences take as their object of study the natural world and the functioning of the mind in a thinking system, whether natural or artificial. They constitute one of the most exciting and fascinating endeavors in the contemporary cultural landscape. Their research program aspires to explain mental processes so clearly and transparently that machines could reproduce them, by simulating the procedures involved in our daily activities: walking, seeing colors, recognizing shapes, smelling fragrances, remembering people and things, making decisions, formulating sentences and discourse, even loving someone.
It is no doubt a very ambitious objective. Different disciplines have occupied themselves with similar issues: philosophy certainly has, as have psychology, neurology, linguistics, and computer science. It is no coincidence that, from the time of its inception, these disciplines have represented the pillars of cognitivism. The history of cognitive science, however, cannot be identified with the history of any of these disciplines considered separately. More than with a general program of declared intents and objectives, the cognitive sciences identify themselves with a method: an interdisciplinary method adopted worldwide by a growing number of researchers from a variety of disciplines. Above all, the cognitive sciences have the merit of having resolved problems that had become intractable, closed off as they were within single, isolated disciplines.
The opening up of new and original perspectives of inquiry, thanks to a new ethic of integrated research; the production of effective results; the cooperation between different sources and techniques of knowledge: these, in general, are the great boons that the cognitive sciences offer the entire scientific community, regardless of its specific disciplinary field. Here, at the beginning of the new millennium, they lead us to a crossroads that is not only cultural but moral: one at which we must decide whether to approach the scientific enterprise in ways that multiply the synergies of creative resources, or in ways that instead dissipate them, in the attempt to keep nurturing the heroic myth of solitary research.
Let us take a minute to reflect on this crucial juncture, and indulge in the luxury of asking ourselves a couple of somewhat inconvenient questions. Why are we so attached to scientific individualism? Why are we more attracted to taking refuge in extreme specialization than to seeking out what it is like to speak a common language? What renders us so devoid of curiosity about the questions and answers that other sciences, from other perspectives, offer to our very own problems? Why do physics, biology, and mathematics struggle to make themselves understood, and loved, by our younger generations? Why is it that, even within these disciplines, many display an attitude of scorn and self-sufficiency toward the popularization of the big ideas of science? And on the so-called "humanities front," what is it that feeds, to an extreme, the fear of imitation, the horror of being plagiarized, the overvaluation of the "monograph" to the detriment of teamwork, brainstorming, and group creativity?
I am sorry, and I beg your forgiveness, but I have no non-rhetorical answer to these questions. What use would it be, in fact, for me to take issue with the guilds of academic tradition, with the history of our educational institutions? None whatsoever: whoever denies the value of cultural and scientific cooperation is already so self-mutilated that it would be completely useless to remind him of the ethical responsibilities he has undertaken.
It might be far more useful to review, as a genuine antidote to every kind of autism in scientific research, some of the results obtained by the cognitive sciences.
From studies in ethology, for example, we have gained the certainty that the species-specificity that has rendered the history of man so different from that of other animals is due to the cumulative nature of cultural evolution. We might find it surprising, but it is not so much our capacity for invention or problem-solving, that "creative component," that distinguishes us from other primates; it is instead the ability to set in motion what Michael Tomasello (1999) calls the "ratchet effect" of mental processes. This is the possibility, afforded to us by language, of handing down, perfecting, and preserving in our collective memory the artifacts of our culture, saving them irreversibly from disappearing, a fate to which every other animal practice or procedure is instead condemned. This discovery, substantiated by a great deal of experimental evidence, teaches us that humans are human precisely because they are able to "con-dividere," to share, their own cognitive resources in a way that is unthinkable for other animal species.
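The logic of the ratchet can be made vivid with a toy simulation. The following Python sketch is purely illustrative, with every quantity invented; it is not a model drawn from Tomasello's work. The two lineages have identical inventive ability, but only one preserves its gains:

    # A toy sketch (all quantities invented for illustration) of the
    # "ratchet effect": identical inventive ability in two lineages, but
    # in one each generation's gains are preserved and built upon, while
    # in the other they die with their inventors.

    import random

    random.seed(42)
    GENERATIONS = 25

    def invention() -> float:
        """A generation's creative contribution: modest, partly a matter of luck."""
        return random.uniform(0.0, 1.0)

    skill_without_culture = 0.0   # re-invented from zero each generation
    skill_with_ratchet = 0.0      # accumulated across generations

    for _ in range(GENERATIONS):
        step = invention()
        skill_without_culture = step    # previous gains are lost
        skill_with_ratchet += step      # previous gains are locked in

    print(f"without transmission: {skill_without_culture:.2f}")
    print(f"with the ratchet:     {skill_with_ratchet:.2f}")

However many generations pass, the first lineage never rises above what a single generation can invent; the second grows without bound, which is the whole point of the ratchet.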
Natural history has taught us the central role that cooperative cognition has played; it has now also become a crucial factor in the history of culture, that is, in the technologies of knowledge transmission: from oral knowledge to writing, from printed newspapers to modern media, up to the present-day fusion of local cultures in the boundless, holistic global fabric of the Internet. Let us pause a minute to think about this, so as not to risk falling victim once again to the criticisms of Plato's Phaedrus, leveled against writing and the other techniques that would entrust our memory to superficial and trivial signs external to us. This ancient aristocratic prejudice, which held that the more public culture became, the more vulgar it was rendered, is still the basis today of that age-old presumptuousness that G.B. Vico called "la boria dei dotti e delle nazioni" ("the arrogance of the learned and of nations"). "In all of my life," he wrote in his De nostri temporis studiorum ratione, "one single thought has caused me anguish beyond compare: that I should be the only one to know. An extremely dangerous thing, like being a god or a fool." Cognitive information technologies have transformed the abstract sense of this plea into the reality of telematic cooperation: that wonderful work method that, for the first time in the history of man, places before our eyes, in real time, the thoughts of thousands of other knowledge workers engaged in solving our very same problems. Bergson was right when he said that "memory is never lost, it is attention to life." It does not belong to single individuals, or to any one group, but is eternally available to us all. It lives, in fact, a virtual life. For biologists, it is the memory of a species. For historians, it is the memory of humankind.
Culture, born of the nature of man, is thus a social thing. Other important voices from the cognitive science arena illuminate this very same concept from other perspectives. The Nobel laureate Gerald Edelman, founder of "Neural Darwinism," reminds us, for example, how "even if a biological basis for our values exists, it is only by way of our social exchanges that we, as human beings, are able to develop them (…) From a causal point of view, it is only as the result of the value systems found within selective brains that the basis of the extraordinary phenomenological gift of consciousness can emerge" (2004: 115).
In the end, there are no better examples of the social advantages to be gained from fruitful scientific cooperation than the results attained by the cognitive neurosciences: the most successful example of synthesis between the "natural" sciences and the "human" sciences. My intention is not to make mere propaganda by reviewing a list of successes. What I think is important here is understanding how these successful results were attained. So as not to risk boring you, I will show how by recalling a famous episode that represented a turning point in the study of neurology.
We are in the year 1947. A woman suffering from aphasia arrives, accompanied by her daughter, at the office of one of the greatest neurologists of the century: Théophile Alajouanine. The professor asks the woman to tell him her daughter's name. The patient becomes agitated, trying desperately to find in her memory the answer to his question. She remains immobile and looks at her child. Several anguished minutes pass, and then the woman, bursting into tears, says: "Ah! My little Jacqueline, I no longer know your name!"
What is happening? This woman is weeping because she cannot say her daughter's name, while at the same time she is pronouncing it. And yet nineteenth-century neurology had established itself (thanks to Paul Broca and Karl Wernicke) on the idea that cases of aphasia depended on organic lesions of certain brain areas in which our words were stored. If these words, however, are able to re-emerge in a strongly emotionally charged situation, then they have not really been completely and materially erased from the brain.
This case became famous and led to the discovery of many others. Some aphasics who are unable to articulate the words "three" or "February" can nonetheless pronounce the series "one, two, three, four" or "January, February, March." In other patients, the words "buried" within their minds come forth, almost miraculously, when they sing or swear. Others lose only adjectives and nouns, but not verbs. Still others are unable to say common nouns, but can say proper nouns. Cases like these, uncovered through linguistic and psychological studies, keep multiplying.
In this way, linguistics and psychology continue to enrich clinical neurology with questions as well as answers. We have discovered that language is not a shapeless collection of words squeezed into a group of cells: it has its own internal organization, a genuine hierarchy of phonetic, syntactic, lexical, and semantic structures. We have also come to understand how the speech act requested of the aphasic involves not only the reactivation of what the naturalists of the nineteenth century called the "mécanique des langues," but also the rekindling of the psychological will and motivation to speak, and to live.
A new paradigm was born: several sciences began to contribute to rendering ever more complex the image of the injured mind that has lost its words. Practical results followed: rehabilitation therapies produced surprising effects. As a consequence, all theories on the relationships between higher functions and cortical areas have been reformulated from the ground up.
It is important to emphasize, from a very general point of view, how this fortunate passage from the neuroscience of the nineteenth century to the cognitive neuroscience of today has been made possible by the medical field's generous renunciation of its self-sufficient clinical organicism. The rising number of successful neuroscientific results has been proportional to that of new and evolving computer technologies applied to brain imaging, and to our growing capacity to recognize the complexity and heterogeneity of the cognitive processes underlying physiological causality. Cognitive neuroscience has thus come to symbolize team research: it is modular, interactive, and effective because it is highly cooperative.
If we were to use a metaphor to describe the field of cognitive neuroscience, we could compare it to an integrated department in which a team of experts from a variety of professional backgrounds works on crime prevention, using a data bank of information that arrives from a variety of official government sources and is stored in one single place. As if by magic, through the integration of data coming in from separate and distinct sources of intelligence, the identikit of a wanted but still unknown criminal is reconstructed little by little, piece by piece, until an ever clearer picture emerges of who it is they are looking for. A certain expression of the eyes, a particular physical characteristic, a certain tone and timbre of voice, a bad habit, a psychological tendency: the profile of the wanted criminal takes on shape and form. The advantages of such a methodological approach, which derives a new and homogeneous picture from the comparison of apparently heterogeneous pieces of information, are unimaginable. Implicit in this process, however, is the renunciation of many small and particular privileges.
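To make the metaphor concrete, here is a minimal Python sketch of such an integration in the abstract. All names and data are invented for illustration, and nothing here corresponds to any actual system:

    # A purely illustrative sketch: each "source" contributes partial,
    # heterogeneous evidence, and a shared profile grows sharper as the
    # pieces are merged in one place.

    from typing import Dict

    def merge_evidence(profile: Dict[str, str], report: Dict[str, str]) -> Dict[str, str]:
        """Fold one source's partial report into the shared profile.

        Fields already established are kept; new fields fill the gaps,
        so every source refines rather than replaces the picture.
        """
        merged = dict(profile)
        for field, observation in report.items():
            merged.setdefault(field, observation)
        return merged

    # Separate, heterogeneous sources: none sees the whole suspect.
    reports = [
        {"eyes": "narrow, restless gaze"},                # an eyewitness
        {"voice": "low timbre", "habit": "bites nails"},  # a wiretap analyst
        {"build": "tall, left-handed"},                   # a forensic team
    ]

    profile: Dict[str, str] = {}
    for report in reports:
        profile = merge_evidence(profile, report)
        print(f"profile so far: {profile}")

No single report identifies anyone; only their integration does. And the precondition of the method is exactly the renunciation described above: each source gives up exclusive ownership of its data.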
It is important here to emphasize that the renunciation of disciplinary self-sufficiency is an issue that does not concern the neurosciences alone. We can find analogous examples in all the "human sciences" committed to the cognitive perspective: linguistics, anthropology, psychology, and philosophy. Their renunciations have certainly been no less important. What is asked of these sciences is to abolish every corporative mindset and attitude from their disciplines. They are furthermore asked, in a certain sense, to be more willing to question the very principles behind their idea of "self-sufficiency," in favor of more universal research values and a more effective practice of problem-solving. Thus the anthropologist is asked to give up the idea that there are no natural constraints of any kind on the historical determination of culture. The linguist is asked to give up her belief that describing languages is the same as explaining them, considering that languages live inextricably linked to the physiological complexity of organisms and the cultural complexity of societies. The psychologist is asked to give up the opinion according to which all we need is an accurate method for measuring reactions and certain types of actions in order to understand and predict the behavior, and even the knowledge, of the human soul.
What will philosophy be asked to renounce? What will the philosopher have to give up?
Unlike all the other branches of knowledge, philosophy does not love specialization. Its corporate tendency is of a completely different nature. What Konrad Lorenz (1954) said about the human species with respect to other animal species can also be said of philosophy with respect to the other sciences: it is the specialist in non-specialization. In following its own natural proclivity, philosophy is in itself an excellent candidate to serve as propellant for the cognitive mix. A bringer of important critical questions, a provoker of doubts, and a consumer of experimental curiosities regarding the problems and procedures of the mind: this is exactly what Vico had always hoped for, when he wished upon the learned the gift of ingenium, in which "verum et factum convertuntur."
Over the past thirty years, this process has already taken place in Anglo-Saxon culture, giving philosophy a new season of success and restoring the kind of importance and prestige that fortune bestows only on those sciences that are useful to humanity.
In the Old World, however, and often in the Italian tradition, the non-specialization of philosophy has turned into a kind of systematic skepticism, not only about its specific role, but about the role that a conscious and positive meditation on the things of the world, in their entirety, can play in the world. This philosophical tradition has ended up wound ever more tightly around its own "historicity," to the point of projecting (in the name of the just but misguided cause of valuing variability) the inconclusive shadow of cultural relativism, as well as the defamatory shadow of the "pseudo-conceptuality" of the sciences. In doing so, it has relinquished any practical value and has in fact marginalized itself from an area of knowledge capable of transforming the world; and it has done this with an air of self-satisfaction that is ethically questionable. From here we arrive at the flight of what S. Auroux today calls "the genitive philosophies": bioethics and the philosophies of science, of language, and of mind, compromised and contaminated by scientific knowledge and by utilitarian objectives.
The first consideration I would like to make on this point is that the separation between philosophy and science, responsible for this skepticism, has in reality never existed in the history of ideas. Aristotle was a great natural scientist: he treated the categories, the soul, and the morphology of animal bodies with the same method and the same clarity. Mathematics and medicine were born in the Greek and Arab traditions from the same philosophical base. Medieval and Renaissance logic, ontology, and psychology were the fruits of a unitary cosmological way of reasoning that considered inseparable the physical and mental universes, the movements of the celestial bodies and human volitions. To the culture of Descartes, Leibniz, and Locke we owe rationalism and empiricism, as well as algebraic analysis, combinatorial calculus, optics, and mechanics. The encyclopedists of the eighteenth century studied political societies and civil pedagogies together with the laws of economics, of anatomy, and of iatromechanics. Even in the age of spiritualism, Henri Bergson, before writing a single line of his Matière et mémoire, systematically read the psycho-physiological journals of his time, handing down to us the most extraordinary synthesis of philosophy and science ever written. Only historicism and its provincial dialects separate philosophy and science, with all the negative consequences we must deal with today.
The second consideration I would like to make is that the process by which the bonds between philosophy and science have been loosened has dangerously exacerbated another divide that weighs upon the ethics of research: the one between the human sciences and the natural sciences, degrading it to a mere ideological opposition between "humanism" and "scientism." Ernst Mayr, one of the most illustrious evolutionary biologists of the twentieth century, recently ridiculed this opposition in his last book, written at the venerable age of almost one hundred:

“If we consider just how profound the similarity between evolutionary biology and the historical sciences is, or on the contrary, how much the former differs from physics in both its conceptual formulation and its method, it is not in any way surprising that it is so difficult, or even almost impossible, to trace a clear demarcation line between the natural sciences and the human sciences” (Mayr, 2001: 13).

With these words, Mayr, a Darwinist, seems to have brought the experience of an entire century to a close: an experience in which the nineteenth century's opposition not only between the human sciences and the natural sciences, but also between religious and scientific thought, slowly lost its sense.
From the heated controversies over Darwin to our current debates on neo-evolutionary genetics, a lot of water has passed under the bridge: apart from a few pathetic nostalgics here and there, no one is still willing to rationally contest the idea that man descended from the apes in the flow of the interminable "chain of being" (A. Lovejoy).
Sometimes, nonetheless, advocates of the social sciences have interpreted the acceptance of the evolutionary basis of man's nature as an unconditional surrender to scientism. On the other hand, supporters of a mythological idea of science (the fundamentalist biologist Richard Dawkins, for instance) consider the Darwinian turning point the certified death of philosophy and the other "weak branches of knowledge."
The first end up in an irrationalism that ignores the very idea of physical causality and searches in human thought for an escape route that transcends it. The second fool themselves into thinking that reducing all phenomena to physical causality is enough to explain them entirely.
The first, in short, manifest an ethical deficiency, having lost the sense, or the direction, of their research: how many humanists have felt the duty to ask themselves about the utility of what they study? The second manifest a semantic deficiency, in that they do not even ask themselves whether, and how, the signs of physical causality can be interpreted culturally: how many scientists refuse to look, from a bird's-eye view, at the forest that contains the tree to whose leaf they themselves are attached? Nothing, in either case, could be more meaningless.
In conclusion, what I think emerges from this discussion is that the ideal of cooperation and solidarity characteristic of the cognitive sciences may well constitute the pars construens of a new research ethic, one born of the renunciation of cultural corporatism.
This is an ethic that is beginning to weigh on the responsible choices made by universities the world over, those universities committed to the globalization of resources and knowledge. An ethic that is even, why not say it, local: one in which the obstinate, backward-looking opposition between the "two cultures" can be replaced by pilot experiences of integration among different branches of knowledge, turned toward the scientific future of our new generations.