Beyond the codification debate: Knowledge as emergence

EAEPE 2003 Conference
The Information Society: Understanding Its Institutions Interdisciplinary
November 7-10, Maastricht

Jorge Bateira*

* Instituto Superior de Serviço Social do Porto, Av. Dr. Manuel Teixeira Ruela, 370, 4460-362 Senhora da Hora, Portugal (E-mail: [email protected])

1. Introduction

Information has become a catchword in media discourse when referring to a new dimension of the current changes in developed societies. The same is evident in the discourse of economists and political authorities, although in a large number of cases information appears side by side with the word knowledge, sometimes as a substitute, at other times meaning something different. The lack of science-based concepts of knowledge and information in the economics literature on growth, science and technology, and in the broader field of innovation, has been acknowledged in recent papers (Ramlogan and Metcalfe, 2002; Smith, 2002). This may be due to a defensive attitude of mainstream economists, who refrain from interdisciplinary dialogue and readily conclude that the discipline should economise on this kind of speculation. As a result, a large number of texts adopt some kind of folk psychology or, at best, a widely diffused (although problematic) model of cognition.

Indeed, in the sixties of the last century economic science considered knowledge the output of science activities viewed as a market. Information and knowledge were much the same thing, a 'public' good within the neoclassical theoretical framework, which underpinned public support for R&D investment (Arrow, 1962a; Nelson, 1959). In this context, knowledge per se was not an object of inquiry but was treated as an exogenous variable of formal models. Economics was more interested in the applications of knowledge (basic science) leading to technological change, and in their effects on economic growth. It was Arrow (1962b) who first considered that knowledge could enter economic models as an endogenous variable, when he focused on the effects of learning by doing. However, only in the eighties were more refined models proposed (Romer, 1986; Lucas, 1988), which put forward the role of education and research investments in creating externalities with potential increasing returns. The current attraction of these models comes from the fact that they inscribe economic growth within the framework of neoclassical economics and in this way attempt to eliminate what was a serious theoretical shortcoming (Foss, 1998). Despite this formal manoeuvre, knowledge continues to be conceptualised as a 'personal stock' that determines both the specific knowledge held by firms and the 'general level' of a society's knowledge.

In the eighties a rival stream of reflection emerged. Under the influence of Kuhn's philosophy of science (who in turn received intuitions from Michael Polanyi) and the work of sociologists of science, a few economists began to distance themselves from the Marshallian individualist viewpoint and focused their research on the institutional structures of scientific research. At the same time, the pioneering dialogue of Herbert Simon with cognitive science, dating back to the fifties, became recognised by a few economists and organizational scientists.


In fact, both Simon and Polanyi inspired the foundational work of Nelson and Winter (1982), which was decisive in the awakening of evolutionist thinking in economics, a current now quickly expanding and gaining scientific status (Cantner and Hanusch, 2002). However, throughout the nineties the majority of economists went on applying neoclassical analysis to the 'production' and 'transfer' of knowledge, although in the second half of the decade heterogeneous approaches emerged. A new strand became more assertive as distinctions between knowledge and information, and between tacit and codified knowledge, were adopted. It acknowledged the importance of context and of economic incentives in determining the mix of these two types of knowledge. This 'new economics of science' accepts insights from new institutionalism and rejects the so-called linear model of "basic science - applied science - technology" (Sent, 1999). Information is identified with codified knowledge, and codification is recognised as a necessary condition for the commodification of knowledge.

In this intellectual context, the paper aims to bring a fresh look at the foundations of the current debate about knowledge codification and the importance of tacit knowledge, as held for instance in Industrial and Corporate Change (2000) 9(2).1 The crucial question is: what if the major results of contemporary research in the natural sciences were inconsistent with the core assumptions of the economics literature on knowledge and information, particularly in the economics of science and technology? According to Cohendet and Steinmueller (2000), current debates on the importance of tacit knowledge, and on its potential for codification, certainly reflect underlying epistemological tensions that need to be clarified. However, it seems doubtful that the debate has so far produced useful advances, and it certainly will not be settled by empirical research in economics, insofar as it deals with crucial issues pertaining to the philosophy of science. The paper argues that the participants in the above-mentioned debate share a problematic paradigm of human cognition, and therefore will not produce new insights as long as they ignore the major results of the last two decades in neurobiology, psychology, and linguistics. This lack of interdisciplinary dialogue has been acknowledged by Sent (1999: 114) as "too much self-absorption", although it could also be related to the fact that those results actually challenge core assumptions taken for granted in most of the economics literature.

Considering this background, the paper brings together ideas from different sources in order to support an interdisciplinary understanding of knowledge and information that could help economic science bridge its current gap with the natural sciences. In doing so, the paper joins the effort of many economists who feel the need to find sounder underpinnings for a science that some argue should be evolutionary (Hodgson, 2002a; Lawson, 2002).

1 More recently the debate received a new contribution from Nightingale (2003), which appeared when this paper was already finished. Given that the two texts have much in common, but also disagree on basic issues, I found it necessary to comment briefly on it in footnotes.


The second part of the paper reviews the 'tacit - codified' knowledge dichotomy and confronts the established interpretation of these concepts with the original formulations of Michael Polanyi. The discussion takes Cowan et al. (2000) and Ancori et al. (2000) as good representatives of the different sensibilities in the debate. This part ends with a critical analysis of the cognitivist paradigm that frames both texts and is widely taken for granted in the economics literature. The third part describes an alternative paradigm of human cognition, taking as its point of departure important reflections of twentieth-century Western philosophy. The inquiry into human knowledge proceeds with a review of recent research in neurobiology, artificial intelligence and embodied semantics, in order to identify a model of human cognition that fits a revised modern synthesis of Darwinism. The ideas summed up in this part are called the 'naturalist paradigm', which in the perspective of the paper should underlie economic science. The fourth part confronts mainstream concepts of knowledge and information with the implications that flow from the 'naturalist paradigm'. In this framework it accepts an understanding of information in line with the Cybersemiotics model proposed by Brier (2002/03), at the crossroads of three self-organised systems: the social, the biological, and the psychological. Some implications for the development of new information and communication systems and for so-called 'Knowledge Management' projects are also discussed.

2. Disentangling a debate

2.1 Back to Polanyi

Since Lundvall and Johnson's (1994) arguments on the emergence of a 'knowledge economy', economists have shown an increased concern for this fuzzy reality called knowledge. However, with a few exceptions, most of the economics literature has not built on recent human sciences research addressing knowledge. A possible reason for this may reside in the pivotal work of Nelson and Winter (1982) (from now on N&W), which exerted a strong influence on economics by highlighting the importance of Michael Polanyi's (1958; 1966) analysis of human cognition.2 Taking stock of this crucial work, different streams of non-orthodox economists joined other social scientists in adopting the concept of 'tacit knowledge', leaving behind both the original texts of Polanyi and his epistemological framework.

2 The same occurred in the management literature with Nonaka and Takeuchi (1995).


Although this movement was important for the revitalisation of an alternative to the neoclassical school, the focus on N&W's interpretation of Polanyi had the consequence of diverting attention from cognitive science's own evolution.

The mediated reception of the 'tacit knowledge' concept may be exemplified by the definitions used in the texts under discussion. Following Ancori et al.'s (2000) definition, "basically, explicit knowledge refers to knowledge that is transmittable in formal, systematic language, while tacit knowledge has a personal quality which makes it hard to formalize and communicate" (p. 273). According to Cowan et al. (2000), the fact that certain types of knowledge do not appear articulated does not authorise classifying them as tacit knowledge. They use a tripartite typology comprising articulated, unarticulated, and unarticulable knowledge, the latter being a residual type to be put aside "as not very interesting for the social sciences" (p. 230). For instance, after citing N&W on the nature and significance of tacitness in the knowledge that supports a skilful performance, Cowan et al. (2000) claim that those authors "accept Polanyi's (1967) account of such situations as being contextual, rather than absolute" (p. 219). Moreover, to support their views they point to the detailed analysis in N&W's chapter 'Skills and Tacit Knowledge', which concludes that costs matter: "Rather, the question is whether the costs associated with the obstacles to articulation are sufficiently high so that the knowledge in fact remains tacit" (p. 82).

These and other passages document in two ways the crucial role of N&W in shaping the debate: on the one hand, their book called attention to the work of Michael Polanyi and its potential for a reformulation of the theory of the firm; on the other hand, it diffused a particular interpretation of Polanyi's concepts that in my view is not accurate. N&W is certainly a most valuable landmark in economics' opening to interdisciplinary dialogue. However, its prominent role as translator of Polanyi became a serious obstacle to the progress of economic thinking, as I intend to show in the next paragraphs.

In fact, Cowan et al. (2000) found enough material in N&W to consolidate the core idea (supposedly subscribed to by Polanyi) that a skilful performance needs two types of knowledge – articulated knowledge corresponding to focal awareness and tacit knowledge corresponding to subsidiary awareness – and that there exists a boundary between the two types of knowledge that is not absolute but contextual. However, when we read Polanyi's texts we do not find this dichotomy. On the contrary, we find a deep analysis of a unitary process of knowing, even if the author identifies different dimensions underlying this complex process. In the very 'Introduction' of Polanyi (1966) we may read:

all thought contains components of which we are subsidiarily aware in the focal content of our thinking, and that all thought dwells in its subsidiaries, as if they were parts of our body. Hence thinking is not only necessarily intentional, as Brentano has taught: it is also necessarily fraught with the roots that it embodies. It has a from-to structure. (p. x)


This paragraph, and many others in the same book, reveals that Polanyi considers knowledge a process, not a thing, as clearly stated in the title of the book's first part ('Tacit Knowing'), which sharply contrasts with widely diffused ideas on the subject. Further, Polanyi considered this process to be based on the articulation of two dimensions, awareness and engagement. The former comprises two mutually exclusive poles, focal and subsidiary awareness. The latter ranges from the more bodily to the more conceptual pole (Chia, 2002). Polanyi thus conceived human cognition at the intersection of these two dimensions, which gives rise to a richness of knowledge processes between two poles: explicit and tacit knowledge. Explicit knowledge falls under focal awareness and is more conceptual, while tacit knowledge falls under subsidiary awareness and is of a more bodily (less conscious) nature. Contrary to the current fad of 'tacit versus explicit knowledge', Polanyi argued for the unitary nature of all knowing. For Polanyi there is no purely explicit knowledge since, beyond the bodily mechanisms involved, focal awareness always needs the support of subsidiary awareness. Highlighting the tacit dimension of human perception, Chia (2002: 11) summarises Polanyi's thinking in the following terms:

In other words, the object produced for our conceptual awareness is always already a product of the tacit structure operating to abstract figure from the ground. Such a claim implies that a hidden order underpins the creation of tacit knowledge. It is an unconscious order well traced by the art theorist Anton Ehrenzweig in his classic study of the psychology of artistic vision.

As a former medical scientist and chemist, Polanyi was attentive to the research of other disciplines, which led him to stress the embodiment of the overall process, in broad accordance with current neurobiological research on the mind:

We may venture, therefore, to extend the scope of tacit knowing to include neural traces in the cortex of the nervous system. (…) This brings us to the point at which I hinted when I first mentioned perception as an instance of tacit knowing. I said that by elucidating the way our bodily processes participate in our perceptions we will throw light on the bodily roots of all thought, including man's highest creative powers. (Polanyi, 1966: 15)

If any doubt remained, the conclusion of the first chapter could not be clearer. Scientific research is rooted in a process of tacit knowing, just like any other human activity:

thus, when the use of this faculty turns out to be an indispensable element of all knowing, we are forced to conclude that all knowledge is of the same kind as the knowledge of a problem. This is in fact our result. (p. 24, emphasis mine)

After this brief overview of the original formulations, I have no doubt that Polanyi (1966) has been misread by N&W. By reifying knowledge into a dual typology, their version lies at the basis of the flawed discussion currently held on the codification of knowledge, as both Cowan et al. (2000) and Ancori et al. (2000) admit to a greater or lesser extent (Tsoukas, 1996).3

3 At first, Ancori et al. (2000) captured Polanyi's thinking correctly. However, they then stepped back in order to save the knowledge categorisation they were engaged with: "There is thus an indispensable component of all knowledge that has to be taken into account when discussing its formation. (…) Without being so categorical, we want to insist on the strong specific dimension of crude knowledge." (p. 271, emphasis mine)


Finally, the above-mentioned problematic account is not so strange if we accept the incommensurability of Polanyi's paradigm with that of those authors.4 In fact, Polanyi (1958) approached this dimension of knowledge in a chapter on 'Intellectual Passions', which Jacobs (2002) summarises as follows:

'Intellectual passion' is applied by Polanyi to feelings of attractions or of antipathy to beliefs. Polanyi is particularly interested in the composition of the emotions of inquirers and in the part that their emotions play in science. Intellectual passion is essential to making scientific discoveries but as fallible and capable of misdirection. (p. 110)

2.2 The computer metaphor

Another dimension of the current literature on the economics of knowledge is the adoption of the cognitivist point of view, in line with Herbert Simon. This is recognised by both Cowan et al. (2000) and Ancori et al. (2000), whose texts, in more or less explicit form, adopt the framework and the vocabulary of that stream of cognitive science. In the former, current criticism of the cognitivist approach is rejected and identified with those who not only give too much importance to tacit knowledge but also use it to attack the principles of rational behaviour and undermine Artificial Intelligence (AI) projects:5

Quite clearly, the challenge being brought against algorithmic representations of knowledge generation and acquisition goes much deeper than arguments for 'bounded' rationality following the work of Newell and Simon (1972); (…) In those quarters, tacit knowledge has come to stand for the aspects of human intelligence that cannot be mimicked by any (computer) algorithm. (Cowan et al., 2000: 218)

Cowan et al. (2000) acknowledge the important contribution of N&W to the diffusion of Polanyi's writings. However, they recognise that N&W's interpretation was made through the lens of cognitivist psychology and early AI, which were converging at that time under the leading role of Simon and Newell (footnote 7, page 219). In the next paragraphs I attempt to show that it is Simon's cognitivism that explains N&W's misreading of Polanyi's concept of tacit knowledge. Although less explicit, the same influence underlies Ancori et al. (2000), as we can see from their frequent use of a cognitivist vocabulary (for instance, pieces of knowledge, the retrieval property of knowledge, knowledge codification) and from their model of the cognitive process, structured in different layers that interact with each other according to the principles of mechanical cybernetics.

4 Nightingale (2003) does not acknowledge this epistemological incommensurability. Cognition as 'computation' (Simon) is radically distinct from cognition as 'sense making' (Polanyi), and these views do not overlap in the way Nightingale admits on page 150. In fact, the tacit dimension of all knowing, as we find it in Polanyi, is dismissed as a rhetorical extremism 'that cannot be enquired into'. Yet, in the second section Nightingale links the 'phenomenology of Polanyi' to modern research in neurobiology: "Consciousness is a process, and tacit knowledge - the form of unattended neural images - is involved in all our actions (cf. Polanyi, 1967; Edelman, 1989)". For a deep discussion of the problems of scientific inquiry in the phenomenological realm see Roy et al. (1999).

5 Economic rationality has been under much scrutiny by non-orthodox economists, who have found support in cognitive science research for criticising neoclassical formulations. On this point see Rizzello (1999). For an alternative to the cognitivist concept of rationality see Viskovatoff (2001).


Against behaviourist theory, which focused on the individual's observed stimulus-response and ignored what was going on in the mind, cognitivist theory became dominant in the sixties, in close articulation with the development of cybernetics. The cognitive functions of the individual and his interaction with the environment were integrated by recourse to the concept of 'feedback' (Rizzello, 1999). Indeed, cybernetic modelling is the foundational assumption of the beginnings of cognitive science, which considered the brain a symbolic rule-based system (the hardware) operating through a mind (the software) comprising two levels: a) data – symbols that directly represent specific objects or concepts; b) program – a set of logical relationships (rules) between symbols, governed by a centralised control (meta-rules) (Cilliers, 1998). Therefore, if we could transfer the right software to a computer of adequate power, we should obtain results identical to those of the mind.6

Dating from the fifties, this intellectual source still has its supporters. However, it has been under strong attack from research in different sciences over the last decades, mostly because cognitivism is basically analytical and deterministic in its methods and descriptions. As Cilliers (1998: 35) recalls:

Many phenomena, especially in life-sciences, but also in physics and mathematics, simply cannot be understood properly in terms of deterministic, rule-based or statistic processes. (...) In the light of these examples, it is certainly strange that when it comes to descriptions of the functioning of the brain, an obvious relational structure, there is still such a strong adherence to atomic representation and deterministic algorithms.
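To make the cognitivist picture concrete, the following toy sketch (my own illustration in Python, not code from any of the cited authors; the facts and the rule are invented) shows the three ingredients just described: symbols standing for objects, rules relating symbols, and a centralised control loop that applies the rules deterministically:

    # Data level: symbols directly represent objects or concepts.
    facts = {("socrates", "is_a", "human")}

    # Program level: logical rules over symbols.
    # Each rule: if (?x, pred, obj) holds, conclude (?x, new_pred, new_obj).
    rules = [
        (("is_a", "human"), ("is_a", "mortal")),
    ]

    def apply_rules(facts, rules):
        """Centralised control (the 'meta-rules'): apply every rule to every
        fact until no new symbol structure can be derived (a fixed point)."""
        derived = set(facts)
        changed = True
        while changed:
            changed = False
            for (c_pred, c_obj), (n_pred, n_obj) in rules:
                for subj, pred, obj in list(derived):
                    if (pred, obj) == (c_pred, c_obj):
                        new_fact = (subj, n_pred, n_obj)
                        if new_fact not in derived:
                            derived.add(new_fact)
                            changed = True
        return derived

    print(apply_rules(facts, rules))
    # derives ('socrates', 'is_a', 'mortal') from the stored symbols

On this view cognition is exhausted by such symbol manipulation: nothing in the loop depends on a body, an environment, or a history of interactions.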

Representation plays a crucial role in cognitivist theory by making the linkage between system states (brain states) and conceptual meaning (mind states). A theory of representation was put forward by Jerry Fodor (1975), who assumed we are born with an 'inner language' implemented in our neural structure as a language-like symbol system, one sharing a formal syntax with spoken language.7 Although accepted by mainstream psychology, this theory has come under serious criticism, for instance from Hilary Putnam, a former proponent of the functionalist theory of mind who changed his views on this symbolic formal modelling.

6 Describing the computational paradigm, Mingers (2001) states: It is based on four principles – that there is a Cartesian separation between mind and body; that thinking consists of manipulating abstract representations; that these representations can be expressed in a formal language; and that this is deterministic enough to be embodied in a machine. (p. 106-7)

7 This model is associated with Chomsky's linguistics in the following terms: We must, according to him, possess an innate universal grammar, a language module in the brain – a module completely missing in other higher species including apes. (…) the linguistic module corresponds to an innate piece of computer architecture which connects physical symbols in the brain with the surrounding world. (Stjernfelt, 2000: 77)


These criticisms were taken up by the connectionist stream born in the eighties, which addresses the relevance of neural networks to the explanation of complex systems. Against the 'rule-based' model of representation, connectionism argues that:

- Meaning is produced internally, by the relationships established between the structural components of the system following external activation. Meaning therefore comes out of a process involving internal and external as well as historical elements of the system.

- Representation is distributed, since the elements of the system have no meaning in themselves. They only have patterns of relationships with many other elements of the system. Indeed, 'relationship' is the core concept of the paradigm, not the elements (neurones) or their connections (synapses).

- Information is not 'stored' (or represented) in any specific symbol to be recalled when necessary. On the contrary, building on previous patterns of relationships, information is reconstructed each time the network (or part of it) is activated.

A careful reading of these points shows that a connectionist theory of meaning does not deny the articulation between the distributed system and the environment. However, it accepts a naturalistic and non-objectivist perspective on representation, insofar as the external stimulus is of the same kind as the neuronal stimulus. In fact, it is the distributed activity of the brain that turns (in processes not yet fully understood) quantities of neuronal stimulation into the 'qualia' of mind. As Cilliers (1998) explains:

In a representational system, the representation and that which is being represented operate at different logical levels; they belong to different categories. This is not the case with a neural network. There is no difference in kind between the sensory traces entering the network and the traces that interact inside the network. In a certain sense we have the outside repeated, or reiterated, on the inside, thereby deconstructing the distinction between outside and inside. The gap between the two has collapsed. (p. 83)
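The contrast can be illustrated with a toy associative memory (a sketch of my own in the Hopfield style, not code from the connectionist literature; the stored patterns are invented). No single weight 'stores' a pattern: each weight records only a pairwise relationship, and the pattern is reconstructed by the whole network whenever part of it is activated:

    # Two stored activation patterns over six units (+1 / -1).
    patterns = [
        [1, -1, 1, -1, 1, -1],
        [1, 1, -1, -1, 1, 1],
    ]
    n = len(patterns[0])

    # Hebbian learning: each weight records only a pairwise relationship.
    weights = [[0.0] * n for _ in range(n)]
    for p in patterns:
        for i in range(n):
            for j in range(n):
                if i != j:
                    weights[i][j] += p[i] * p[j] / len(patterns)

    def recall(cue, passes=5):
        """Reconstruct a pattern from a partial cue by letting the units
        settle under their mutual relationships (no symbol is looked up)."""
        state = list(cue)
        for _ in range(passes):
            for i in range(n):
                total = sum(weights[i][j] * state[j] for j in range(n))
                if total > 0:
                    state[i] = 1
                elif total < 0:
                    state[i] = -1  # on a tie (total == 0) keep current state
        return state

    # A degraded cue (last unit flipped) settles back into the first pattern:
    print(recall([1, -1, 1, -1, 1, 1]))  # -> [1, -1, 1, -1, 1, -1]

Here the 'memory' exists only as a pattern of relationships distributed over all the weights, which is precisely what the connectionist claim about reconstruction rather than storage amounts to.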

The above considerations give a brief sketch of two opposing models of the brain.8 They show that so far most economists have uncritically taken cognitivist theory as the state of the art in cognitive science, whereas this field of research offers alternative models that (at least in their basic aspects, but see section 3) are currently being used by outstanding neurobiologists. Therefore, I argue that economic science has much to gain by adopting the open-minded attitude of Herbert Simon himself, instead of remaining attached to a problematic model of cognition.9 At least this is the spirit of the present paper.

8 Although Nightingale (2003) attacks 'information-processing theories of knowledge', his arguments are still framed by the cognitivist paradigm. He adopts Newell's concept of 'levels of explanation' and John Searle's account of mind-brain relations. The former is one of the fathers of cognitivism, and his levels of explanation are not underpinned by a credible ontology that could support a non-Cartesian explanation of mind. As for the latter, it must be acknowledged that Searle shares with cognitivists a symbolic understanding of mental representations, which underpins his problematic idea of the Background. For recent critiques of Searle's arguments from a non-cognitivist point of view, see chapter 4 of Cilliers (1998) and Viskovatoff (2002).

9 Trying to save Simon's cognitivism, Nightingale (2003) finally adopts a stance that accepts the computational view of cognition as an operationally useful tool, although it is incompatible with modern neurological research and with Polanyi's thinking. By the end of this paper I hope to have shown that this kind of compromise is scientifically untenable, and ultimately does not help the much needed effort to revise our concepts and models according to a new scientific paradigm that calls for a revision of the basic assumptions of economics.


3. Naturalist view of knowledge

3.1 Philosophical background

The cognitivist paradigm splits cognition from the body and describes it as a computational process that parallels the functioning of the brain. In considering cognition a phenomenon of 'pure consciousness', cognitivism thus rests on the Cartesian dualism that pervades the discourse of both cognitive psychology and mainstream AI.

In the eighties of the last century Heidegger's early philosophy became influential and supported ideas that view human cognition, action, and language as integrated dimensions of the human being (Mingers, 2001). Heidegger stressed that humans are conscious of their living in a world and that this is their way of being (Being-in-the-World). The concept highlights that we live in a multifaceted world comprising things, the others and the self, which make up a medium where we may act in alternative ways according to our consciousness. This medium is a world of possibilities (multiple ways to the future), some of which, once actualised, definitively change the world and ourselves. Further, by affecting our state of mind the new choices also open up new possibilities. It is through this reflexive process that human discourse takes part in the creation of shared states of mind.10

An important development came with the thinking of Merleau-Ponty, who put forward that human subjectivity is essentially an embodied phenomenon.

[Merleau-Ponty] argued that human behaviour could neither be explained in a behaviourist way in terms of external causes, nor internally in terms of conscious intentionality. Rather, it had to be explained structurally in terms of the physical structures of the body and nervous system as they develop in circular interplay within the world. The world does not determine our perception, nor does our perception constitute the world. (Mingers, 2001: 112)

Merleau-Ponty’s analysis of perception certainly influenced Polanyi’s analysis of skills and tacit knowledge. In fact, by reading Mingers (2001: 113-4) we could say the views of both authors are very similar: A movement is learned when the body has understood it, that is, when it has incorporated it into its “world”, and to move one’s body is to aim at things through it; it is to allow oneself to respond to their call, which is made upon it independently of any representation (…) My body has its world, or understands its world, without having to make use of my “symbolic” or “objectifying function”. (Merleau-Ponty, 1962, pp. 139-140).

This view of human cognition, sustained by the interplay between body and environment, rejects the cognitivist split between subject and object and, at the same time, is very close to the concept of 'structural coupling' of living systems.11

10 An important consequence of this Being-in-the-World is explained by Mingers (2001): 'Thinking' is not detached reflection but part of our basic attitude to the world – one of contingent purposeful action. (…) Knowledge does not consist of representations, in individuals' heads, of objective independent entities. Rather, we make distinctions through our language in the course of our interactions with others, continually structuring the world as we co-ordinate our purposeful activities. (p. 110)


concept of ‘structural coupling’ of living systems.11 Merleau-Ponty’s thinking resisted to dualistic reasoning like ‘perception vs. action’ or ‘subject vs. object’; rather he recognised the circular interdependence of both poles even in the more mediated (cultural) actions.12 An important implication of this point of view is Merleau-Ponty’s idea that language is structurally coupled to thinking: “There is not thought and language: upon examination each of the two orders splits in two and puts out a branch into the other (…) Speaking to others (or to myself), I do not speak of my thoughts; I speak them and what is between them (…) (MerleauPonty, 1964, p. 18, original emphasis)” (Mingers, 2001: 115). The same can be said for the learning of language by the child. From birth we learn a language by observing how the others speak, not by studying its rules: “As for the meaning of the word, I learn it as I learn to use a tool, by seeing it used in the context of a certain situation. (Merleau-Ponty, 1962, p. 403)” (Mingers, 2001: 115). This philosophical reflection on the human condition gives economists a background they cannot ignore when speaking about knowledge. Moreover, two crucial points emphasised above – firstly, human subjectivity is essentially an embodied phenomenon; secondly, human cognition is not detached from action and language – have been object along the last decades of intense scientific research whose main results are currently being acknowledged by some philosophers even if it implies to abandon very old assumptions of the discipline (Lakoff and Johnson, 1999).

3.2 The naturalist paradigm

As I have mentioned, a crucial aspect of Polanyi's thinking was not understood by most of his readers: his treatment of knowledge as a 'process'. This is a fundamental point in the current dialogue between contemporary philosophers and natural scientists, namely physicists. Although our common categories still reflect the reductionist way of doing science, largely based on the idea that every entity (or form) is decomposable into hierarchically inferior entities, the fact is that quantum theory opened up a radically new perspective, one that leads to a dynamic view of reality.13

11 'Structural coupling' is part of Maturana and Varela's (1980) autopoiesis theory about how living systems maintain their organisation. It gives an account of the structure-determined (and structure-determining) engagement of a given living system with either its environment or another system (Whitaker, 1995).

12 See on this point Hutchins (1995), who studied navigation skills in the US Navy and identified the importance of the tacit and embodied dimension in the execution of tasks.

13 According to a physicist, "we no longer try, as Bohr did, to use classical physics as our unique reference, as the only domain where logic can be applied and of which we can legitimately speak. On the contrary, it is the quantum world that has its own rules of description and reasoning from which those of the classical world emanate." (Omnès, 1999: 194)


The classical endeavour of physics has been to successively split physical forms into particles so as to arrive at the ultimate bottom of reality. But what are the most basic particles that physics has identified actually made of? According to quantum physics, although reality appears to our senses in discontinuous forms, its inner nature is continuous, made up of field waves. We are therefore invited to abandon the traditional metaphysics of substance implicit in reductionist thinking and to adopt a metaphysics of process (quantum fields). In this sense, everything is organisation out of process:

quantum fields are processes and can only exist in various patterns. Those patterns will be of many different physical and temporal scales, but they are all equally patterns of quantum field process. Therefore, there is no "bottoming out" level in quantum field theory – its patterns of process all the way down, and all the way up. (…) Everything is organizations of quantum field processes – at many different scales and hierarchical complexities. Micro- and macro- alike are such organizations. (Bickhard, 2000)

Therefore: In such a process view, then, everything that has causal power does so in virtue of, among other things, its organization. If a process framework is adopted, then organization is legitimated as a potential locus of causal power – included such macro-level organization as constitutes living entities and as constitutes mental processes. (Bickhard, 2000, emphasis mine)

In these citations, the concept of 'levels of organization' corresponds to the emergent properties of particular levels of quantum field process, which are endowed with genuine causal powers of their own (Emmeche et al., 1997; Campbell and Bickhard, 2002).14 This concept of 'emergence' has its historical background in biology and re-emerged after the Second World War with an important contribution from Michael Polanyi, as is evident in the second part of 'The Tacit Dimension' (Hodgson, 2000). Hence, quantum physics points to the existence of an independent reality that transcends our sensible reach and renders the cognitivist model of representation problematic, as I will argue with the support of contemporary neurobiology.

In the seventies of the last century, the Chilean neurophysiologists Maturana and Varela (1980) developed the concept of 'autopoiesis' as a tool for understanding the general nature of living systems.15 An autopoietic system is a unity that emerges from a network of components which continuously generate and realise the processes that produce them, thereby constituting and maintaining the system's organisation. According to those researchers, the autopoietic organisation also applies to the brain.

14 For a comprehensive overview of organisational complexity in natural systems see Bar-Yam (1997).

15 Whitaker (1995) says that "autopoiesis involves both organizational preservation and componential (re-)production, with the former being the central theme and the latter being the specific means characteristic of the class of autopoietic systems." During the seventies Francisco Varela developed Maturana's model and distinguished the broader set of 'autonomous' systems, of which autopoietic systems are a subset: "The difference between autonomy and autopoiesis is that autopoietic systems must produce their own components in addition to conserving their organization."


Since the seventies the autopoietic model of the brain has been adopted by a few neurobiologists, thereby diffusing a radical shift in the understanding of our relationship with the world. Adopting this way of understanding how the brain works leads us to accept that human knowledge has only a 'weak objectivity', an implication explicitly accepted by the neurobiologist António Damásio:

While there is an external reality, what we know of it would come through the agency of the body proper in action, via representations of its perturbations. We would never know how faithful our knowledge is to "absolute" reality. What we need to have, and I believe we do have, is a remarkable consistency in the constructions of reality that our brains make and share. (Damasio, 1994: 235)

The embodiment of knowledge is a crucial aspect of the fundamental unity of human beings and nature, despite the difficulties that a great number of social scientists have in acknowledging that unity (Capra, 1996). This view is also supported by the recent convergence between physicists and biologists on the compatibility between the self-organised processes of complex systems and the evolutionary mechanisms of life (Corning, 1995). The Darwinian tradition did not address the problem of the 'origin' of life and concentrated on the subsequent dynamics of populations under natural selection. Furthermore, both Darwin's thinking and the so-called Modern Synthesis, focused on gene frequencies in populations, adopt a common auxiliary assumption in their theory of evolution: that dynamic systems converge to equilibrium. This assumption has been an obstacle to integrating into the Darwinist framework the research on thermodynamically open systems, modern developmental biology, and the science of ecological systems (Depew and Weber, 1989). However, the work of Wicken (1998) points to a possible integration of evolutionism and complexity theories in the framework of an 'expanded Darwinism', on condition that equilibrium assumptions and gene-based atomism are abandoned. This stance does not eliminate the essential role of functional adaptation and selection for survival at any level of life's organisation.

Scientists operating in the neo-Darwinian tradition typically talk about the 'origin' of life, not its 'emergence'. Life 'happened' somehow; then variation and natural selection did the rest. This isn't enough. We need to define life so that its emergence can be connected with its operation. (…) If, however, we understand genes as parts of autocatalytic organizations, they don't have to bear the burden of being the 'molecular secrets' of life. They have their generational histories and evolutionary identities as parts of systemic wholes which stretch from organisms to ecosystems to biosphere. (Wicken, 1998: 370)

Therefore, non-equilibrium thermodynamics is compatible with a third wave of the Darwinian tradition, one that needs a reformulated concept of natural selection. The same is argued for developmental biology within a new understanding of the evolutionary process:

Interactions at the emergent level are initially weaker than at the preceding level, and more subject to the effects of natural selection. (…) Newly emergent levels, however, eventually develop more refined capabilities than the simpler forms of self-regulation found at the embedded, lower level, and generate new weak levels of control above them. Thus we would expect natural selection to act, differently at different times, on the various hierarchical levels – as well as to allow drift and neutralism within certain parameters, as has been empirically found at the level of protein evolution. (Depew and Weber, 1989: 262)

This convergence of the natural sciences shows how Darwin's ideas are now being replaced by a larger, unifying scientific framework that points to a naturalist view of man.16 It is a view that accepts reality as something veiled to our senses, whose ontology we foresee as ultimately being quantum field process, and to which human beings fully belong (d'Espagnat, 1995; McFadden, 2001). Therefore, as developed in the next section, I find the naturalist paradigm the adequate framework for an inquiry into human knowledge.

3.3 Knowledge as emergence

In the second part of Polanyi (1966), the act of knowing is structured in two levels of reality, each controlled by distinct principles: the proximal, comprising the particulars of an entity, and the distal, corresponding to its comprehensive meaning. The latter is an upper level that relies on the laws of the former, lower level in order to operate. However, the higher-level operation cannot be understood in terms of lower-level laws, which basically means that Polanyi treated 'tacit knowing' as an emergent reality in the same terms in which 'emergence' was defined in the previous section. Moreover, he applied the concept to other examples in order to picture a holistic view of reality layered by emergent phenomena: "The sequence of these levels is built up by the rise of higher forms of life from lower ones. We can see all the levels of evolution at a glance in an individual human being." (Polanyi, 1966: 36)

Polanyi was aware of the polemical nature of this world-view. Indeed, he not only questioned the standard Darwinian model of the evolutionary process but also claimed that a theory of evolution should make sense of the emergence of moral demands, which are involved, for instance, in the peculiarly human capacity to feel reverence for men greater than oneself. In a comprehensive synthesis of contemporary research in cognitive science, Wilson (1999) adopts a similar stance. He places human intelligence and culture at the summit of the last of four great steps in the overall existence of the universe. Although admitting the possibility of regression, he stresses that evolution shows progress in the sense of the continued emergence of increasingly complex organisms and societies.17 In this sense, it becomes reasonable to consider the mind an emergent level in the evolutionary process, one that appears through the functioning of the human body with a brain that "bears the stamp of 400 million years of trial and error. (…) In the final step the brain was catapulted to a radically new level, equipped for language and culture" (Wilson, 1999: 116).

16 This naturalist view adopts an ontology of levels that entails the autonomy of the psychological system as another level of reality emerging from the biological system. It thus rejects both eliminativism and dualism of any kind. For a similar point of view see Martin (2003) and the anthropology of Entralgo (1992).

17 In the same vein see Bar-Yam (2003).


In my view, Wilson (1999) presents a systematising work of the greatest importance for two main reasons: firstly, because it is a useful tool in the quest for interdisciplinary dialogue with the contemporary cognitive sciences; secondly, because he brings together the elements of a 'gravitational centre of opinion' among neurobiologists that is coherent with the naturalist paradigm sketched above, the same one that inspired Michael Polanyi. Exploring this line of convergence, I summarise in the following points the most relevant aspects that, in my view, inform a naturalist approach to human cognition:

- Human beings are 'far-from-equilibrium' (living) systems, in the vocabulary of the complexity approach to natural phenomena. Along the evolutionary process on earth, by interacting with their environment they developed a structural coupling that enables them to generate conditions favourable to self-maintenance. Mind is the highest function acquired by higher-order 'far-from-equilibrium' systems.

- Mind is a stream of conscious and subconscious experience of human beings. It comes out of an enormous number of interactions of different kinds involving multiple neural networks in the brain and physiological processes relating to the rest of the body.

- A parallel processing of vast numbers of activated networks of neurons, neurotransmitters, and hormones produces an internal mapping of sensory impressions (mental images). Some of these are 'representations' of the outside world, others are recalled from memory; all together, they create a forward and backward flow of multiple scenarios that form consciousness. No single part of the brain is the site of conscious experience.

- Certain subsets of the brain are associated with conscious experience. For a group of neurons to contribute to this experience it must be part of a distributed and highly complex functional cluster that achieves integration. This 'dynamic core' is not made up of an invariant set of neurons or brain areas; rather, neural participation depends on rapidly shifting functional connectivity (Tononi and Edelman, 1998).

- The scenarios that build consciousness are subject to the influence of emotion. This means that the networks originating scenarios are under the influence of the physiological activity of the body (hormones), which favours scenarios that end in certain ways over others. The competitive selection among scenarios is what we usually call decision-making.

- Memory is a consequence of the brain's limited capacity to deal with large numbers of mental images. Therefore, we have a ready state of conscious mind ('short-term memory') occupied with a limited number of scenarios for a short time. Some scenarios may be provided by certain neural groups that bring together 'traces' left by previous experiences ('long-term memory').

- Meaning is formed by the linkage of neural networks, created by spreading excitation that gives rise to mental images engaging emotion. The building of these images involves largely unconscious operations (imagination) even in the simplest construction of meaning. Hence, the concepts we use are semantic entities built up through embodied and creative processes of the mind (Fauconnier and Turner, 2002).

- Categories are products of our mind without which we could not live either in the physical or in the social world. Usually we refer to categories as being made up according to properties that different entities have in common. However, empirical studies have shown that categories have best examples (prototypes) and that their construction also involves peculiarities of the one doing the categorising (capacities to perceive, to learn and remember, to organise things learned, to communicate) (Lakoff, 1987); a toy sketch of this contrast follows the list.
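The prototype point can be made concrete with the following sketch (my own illustration, loosely inspired by Lakoff's discussion; the features and weights are invented), contrasting graded, prototype-based membership with a yes/no checklist of common properties:

    # Hypothetical prototype for the category 'bird': weighted typical features.
    prototype_bird = {"flies": 1.0, "sings": 0.8, "small": 0.7, "lays_eggs": 1.0}

    def typicality(exemplar):
        """Graded membership: similarity to the prototype,
        not a checklist of necessary and sufficient properties."""
        score = sum(weight for feature, weight in prototype_bird.items()
                    if exemplar.get(feature))
        return score / sum(prototype_bird.values())

    robin   = {"flies": True, "sings": True, "small": True, "lays_eggs": True}
    penguin = {"flies": False, "sings": False, "small": False, "lays_eggs": True}

    print(typicality(robin))    # 1.0   -> a 'best example' of the category
    print(typicality(penguin))  # ~0.29 -> still a bird, but a poor example

The graded scores mimic the empirical finding that people judge a robin a better example of 'bird' than a penguin, something a properties-in-common definition cannot express.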

This synthesis of the main results of neurobiology and related disciplines has important implications for economists. One of them is that the human relationship with the environment is not commanded by information given a priori to the brain, as assumed by the idea of an innate cognitive capacity. Human beings are not born with 'predetermined structures' in the higher cerebral centres that put labels on (or encode) the neural signals coming from the periphery; rather, the brain's evolution enabled them to learn a language.18 Another important element of the above description is the distributed functioning of the brain and the non-linear causality that governs interactions between multiple neural networks. To retain a core concept of this model, I should emphasise that mind emerges from interactions involving neurones, the rest of the body, and the medium of the subject.

On the whole, the aspects mentioned above undermine the cognitivist modelling of the brain as an abstract system relatively isolated from the rest of the body, to which the property of 'near decomposability' could apply, facilitating logical computations. Rather, the picture is more compatible with the 'Interactivist-Constructivist' (I-C) model I refer to below, which aims to present an alternative both to the traditional stances of 'Cartesian minded' cognitivism and to the materialist reductionism for which the brain is all we have to study (Christensen and Hooker, 2000). In the context of this model, we need to look more closely at how the mind makes 'representations' of the environment. This issue was touched on above, when discussing the connectionist criticism of cognitivism. The moment has arrived to acknowledge a theoretical improvement that overcomes an important flaw of both of these views: the absence of a normative dimension (Christensen and Bickhard, 2002).

18 Referring to Deacon (1997) and his neurological research, Stjernfelt (2000) states: the processing of language takes place in an intimate connection to sensory input and motoric output on one side – and with large and diffuse parts of the cortex on the other. The human brain's large cortex is thus decisive for the symbolic "jump" into an abstract, symbolic semantics separated from the more sensory-motor close icons and indices. (p. 79)


If representations are internally generated, as neurobiologists argue, how can we say that they are true? At the outset it is necessary to establish that mental representations are not a kind of photocopy of outside reality. On this point, I follow António Damásio in his two books:

When I use the term image, I always mean mental image. A synonym for image is mental pattern. (…) When I refer to the neural aspect of the process I use terms such as neural pattern or map. (…) Images in all modalities "depict" processes and entities of all kinds, concrete as well as abstract. (Damasio, 1999: 317)

These various images – perceptual, recalled from real past, and recalled from plans of the future – are constructions of your organism's brain. All that you can know for certain is that they are real to your self, and that other beings make comparable images. If our organisms were designed differently, the constructions we make of the world around us would be different as well. (Damasio, 1994: 97)

The articulation between mental images and outside reality has long been a problematic issue. The traditional idea understands 'representation as correspondence': the image of an object has a structure or state that corresponds to that of the object in the outside world being represented. A major objection to this concept of representation, however, is that it cannot detect errors. Either the correspondence is effective and there is no error, or the correspondence fails and the mind has not actually built a representation. If we use the concept of 'representation as correspondence', we cannot have a mental representation with the wrong content. This means that the view of representation as correspondence lacks a normative dimension.

In the discussion about mental representations we find different stances. On one side, a sceptic would argue that we are unable to detect error in our representations, since the only access we have to reality is through our mind and its representations.19 An opposing view comes from rationalists like Fodor, who argued that most representations are innate, probably of genomic origin, and that new representations are made out of prior representations. Empiricists, for their part, argue that the environment is the source of the representational content of mental images and see no problem with factual correspondences. Other positions are available, but all share a basic assumption: representations have a content that encodes information about an entity outside the mind with which it is in correspondence.20

“jump” into an abstract, symbolic semantics separated from the more sensory-motor close icons and indices. (p. 79) 19 This stance goes against the sceptical anti-realism of John Searle for whom we could have representations of objects in the world and yet those objects might not exist. As Viskovatoff (2002) states: It is sufficient for us to realize that the reason that Searle makes his curious claim that ‘Background abilities are not dependent on how things in fact work in the world’ is that he is hanging on to the obsolete Cartesian philosophy of mind, the scepticism of which is self-refuting. (p. 70)


As put forward by different authors, this view of representation is a strong inspiration for the mainstream computational discourse:

cognition is taken to consist of various stages of the input and processing, and sometimes the output, of such encodings. The fundamental backbone of cognition is assumed to be the sequence from perception to cognitive processing to the re-encoding into utterances in some language. This view lends itself to, if not forces, a strong modularization of models of the mind. (Bickhard, 1997)

As an alternative, the I-C model of cognition addresses this issue in terms of emergence: along the evolutionary process, representation and its normative dimension emerged to serve a function that contributes to the self-maintenance of human beings. Autonomous systems select an action according to an anticipated internal outcome that is expected to help maintain their living conditions.21 It may happen that the anticipated outcome is not reached; in this case, the representation is in error. However, the living system is able to functionally detect the failure by confronting effective and anticipated outcomes. This indication has 'truth value' for the system and corresponds to the normative aspect of representation. Using its memory, the system learns from this error when selecting further interactions (Bickhard, 1999). Therefore, despite the fact that representations are internal to the subject, the properties of internal outcomes are in some sense present in the environment whenever an anticipation proves true for the system. This truth value is given by the fact that the environment supports the anticipated internal outcome. The environment, however, does not send any information per se. In this sense, an informational approach to representation, as an updated version of the 'correspondence model', suffers from the same weaknesses as encodingism.

This general overview of multidisciplinary research on knowledge is enough for our present purpose.22 As an intermediate result, it seems convenient to sum up three important aspects associated with the I-C model: (i) we need to shift from a substance to a process metaphysics in order to see mind as an emergent pattern specific to autonomous self-directed systems; (ii) human beings have developed a particular type of adaptive capacity that includes mental representations of a complex kind and a smooth higher-order integration of internal with external processes; (iii) learning and intelligence emerged as sophisticated forms of self-directedness, based on anticipation and evaluation, which mutually amplify each other through focused action and improved interactive differentiation (Christensen and Bickhard, 2002).
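The functional detection of error can be made concrete with a small sketch (my own toy illustration of the interactivist idea, not code by Bickhard; the actions, outcomes and environment are invented). The agent's 'representation' is nothing but an anticipation of the internal outcome of an interaction, and error is detected by comparing anticipated with effective outcomes:

    import random

    class InteractiveAgent:
        def __init__(self, actions):
            # Anticipations: action -> expected internal outcome.
            # Initially every action is (wrongly) anticipated to nourish.
            self.anticipations = {action: "fed" for action in actions}

        def select_action(self):
            # Self-directedness: prefer actions whose anticipated outcome
            # contributes to self-maintenance.
            good = [a for a, o in self.anticipations.items() if o == "fed"]
            pool = good or list(self.anticipations)
            return random.choice(pool)

        def interact(self, environment):
            action = self.select_action()
            anticipated = self.anticipations[action]
            effective = environment(action)  # what actually happens
            if effective != anticipated:
                # The representation was in error: the failure is detected
                # functionally, and memory is updated for future selections.
                self.anticipations[action] = effective
            return action, anticipated, effective

    # Hypothetical environment: only 'forage' supports the anticipation.
    def environment(action):
        return "fed" if action == "forage" else "hungry"

    agent = InteractiveAgent(["forage", "sleep"])
    for _ in range(5):
        print(agent.interact(environment))
    # After one failed anticipation for 'sleep', the agent stops selecting it.

Note that nothing here encodes the environment; the agent only anticipates, acts, and checks whether the anticipation was supported, which is where the normative dimension (the 'truth value') comes from.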

20 Codification is certainly possible, but what is at stake here is the nature of mental representations, which is not of that kind.

21 According to Bickhard, the anticipated outcomes are internal to the system and are not necessarily formulated in terms of representations of goal states. This avoids the flaw of circularity in modelling representation.

22 A detailed discussion of how the model deals with specific forms of representation (for instance objects, numbers) is found in Bickhard (1997).


As part of the naturalist paradigm summarily described above, this model of cognition gives a complex unity to concepts we are used to applying to separate categories of phenomena, such as knowledge, reasoning and imagination. It draws on the results of different disciplines that point to an intimate relationship between these embodied phenomena, and it contradicts ideas widely taken for granted, such as:

- Meaning is based on truth reference: it concerns the relationship between symbols and things in the world. (…)
- Emotion has no conceptual content.
- Grammar is a matter of pure form.
- Reason is transcendental, in that it transcends – goes beyond – the way human beings, or any other kinds of beings, happen to think. (Lakoff, 1987: 9)

Indeed, the last decades have produced an important convergence between neurobiology, cybernetics, and embodied semantics, which means a shift not only in our understanding of how we think but also in our global understanding of the world we live in. As I attempt to explain in the next section, this forms a framework that also requires a shift in economists' traditional way of dealing with knowledge and information.

4. Knowledge and information

The codification debate is manifestly based on the objectivist understanding of information, in the sense that information is considered a property of matter existing in the 'world outside', independent of the observer (Brier, 1992). This idea underpins a world-view that sees nature as full of information, which controls the expansion of the universe from the Big Bang on, and the evolutionary process through matter to life and mind. The whole informational project has been summarised by Brier (1992) as follows:

Accepting information as such an objective universal law-determining thing, which humans absorb into their minds and machines from nature, change and multiply by thinking, and bring into society through language, means it must be possible to establish a new unifying science of information. Information science then will also include cognitive science, and all epistemological problems will be solved empirically.
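The objectivist concept is commonly identified with Shannon's statistical measure of information, discussed below. A minimal sketch may help to see how little 'meaning' that measure involves: it depends only on symbol frequencies, so a meaningful sentence and a random scramble of its characters carry exactly the same quantity of 'information'. (The example and the message string are mine, purely for illustration.)

```python
import math
import random
from collections import Counter

def shannon_entropy(text):
    """Entropy in bits per symbol, computed from character frequencies."""
    counts = Counter(text)
    total = len(text)
    return -sum((n / total) * math.log2(n / total) for n in counts.values())

message = "knowledge is an embodied process"
scramble = "".join(random.sample(message, len(message)))

print(shannon_entropy(message))    # same value for the sentence...
print(shannon_entropy(scramble))   # ...and for its meaningless scramble
```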

However, the informational project is flawed at its roots, as it is framed inside the Cartesian dualism underlying the cognitivist paradigm, of which I attempted to give a critical perspective in the previous section. Indeed, mental representations of reality are complex internal processes, and sensorial connections to the external world are not of the figurative or linguistic kind that would mechanically transfer meaningful information to an observer. On the contrary, the observer is the author of his (mental) images, being at the same time part of the world he attempts to transform in order to live in it. As we saw, human cognition is both an embodied and a social experience, and a second-order cybernetics is needed to account for it.23

23 Second-order cybernetics adopts the point of view of a living system, which organises exchanges with the environment in order to preserve its autonomy. First-order cybernetics applies to mechanical input-output systems that passively adapt to changes in the environment. On the evolution of cybernetics see Heylighen and Joslyn (2001).


This is not to say that information has no place in current research on nature and living systems. In fact, building on what biology and neurobiology have established in recent decades, it is possible to integrate a concept of information into the whole model, provided we abandon the objectivist perspective of Shannon. As a point of departure, biologists recall a suggestive formulation given by Bateson (1973, p. 428): "In fact, what we mean by information – the elementary unit of information – is a difference which makes a difference". This statement is to be understood in the sense that information ('a difference') is inseparable from a subject to whom it makes a difference, or to whom it makes sense. The formulation not only excludes a solipsistic view of the subject's experience, insofar as it refers to an outside world that changes in a particular way, but also excludes the objectivist view, insofar as there is no information without a subject that selects a difference and gives meaning to it. This is a non-anthropocentric definition that places our discussion in the context of an enlarged semiotics (biosemiotics), which, besides human semantics, includes meaningful interactions between all living systems and their medium. A description by a biologist may help in understanding this enlarged concept of information:

How could pre-biotic systems acquire the ability of turning differences in their surroundings into distinctions? Even a bacterium is capable of orienting itself (by moving) in a nutritional gradient. The amount of nutrient molecules hitting the receptors of the outer cell membrane changes as the bacterium moves, and this change is registered by the cell, allowing the cell to select the direction in which further movements are done. (…) Semiosis in its most modest form arose in the very process which created the first living systems on earth. From this tender beginning a new evolutionary dynamics was implemented in the world and in the course of time organisms capable of mastering increasingly more sophisticated semiotic interactions developed. (Hoffmeyer, 1997)
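Hoffmeyer's bacterium can be caricatured in a few lines of code (a deliberately crude, one-dimensional sketch of my own; the gradient function and step sizes are arbitrary). The cell holds no map of the gradient; it registers only the difference between successive readings, and that difference selects the next movement:

```python
import random

def nutrient(x):
    # An arbitrary nutrient gradient peaking at x = 10 (illustration only).
    return 1.0 / (1.0 + abs(x - 10.0))

position = 0.0
direction = random.choice([-1.0, 1.0])
previous_reading = nutrient(position)

for _ in range(40):
    position += direction
    reading = nutrient(position)          # molecules hitting the receptors
    difference = reading - previous_reading
    if difference < 0:
        # The registered difference 'makes a difference': the cell
        # re-orients its further movement.
        direction = -direction
    previous_reading = reading

print(f"final position: {position:.1f}, reading: {nutrient(position):.3f}")
```

Note that nothing is 'transferred' into the cell: the difference only becomes information in relation to the cell's own constraint of reaching nutrient.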

So, according to a biosemiotic approach, information is necessarily subjective. It comes out of the process of selecting and interpreting differences and patterns (signs) in relation to the internal constraints of the living system. This act of interpretation corresponds to the attribution of meaning, which derives from the value attributed to the perturbation. In the case of human beings, it is "partly for the sake of survival, partly for the sake of human flourishing beyond mere survival, and partly by chance" (Lakoff and Johnson, 1999: 91). This view is very similar to the concept of autopoiesis, although Maturana and Varela did not use the word information, in order to avoid the ambiguity associated with its common-sense understanding. For these authors an organisationally closed system (a cell or a human being) is 'irritated' by its environment and reacts by making internal changes, which means that there is no such thing as the transfer of (external, objective) information into the living system. However, for the purpose of our discussion we need to add to the biosemiotic model the specifics of human beings, the rich 'inner world' of mind-body interactions reviewed in the previous section. According to Vygotsky (1934), after two years of age human beings connect and merge thought and language in such a way that thought becomes verbal and speech becomes rational. Further, in the process of children's development, interactions with the social environment play a crucial role. Particularly important is the idea that word meanings are part of our sense of the world, carried by all the psychological events aroused in consciousness. This inner life, along with the variable context of words, makes meaning a dynamic reality that is subject to development by conceptual acquisition, resulting from interactions established in school (scientific concepts), in the family and in other contexts (spontaneous concepts). Although these ideas date back to the first decades of the twentieth century, they seem largely compatible with current research on embodied semantics by Lakoff and Johnson (1999). So far we have insisted on the fact that human beings (like all living systems) internally create meaning for the sensorial perturbations they receive, and at the same time communicate meaning by languaging, which is intimately connected to thought. Therefore, as Emmeche and Hoffmeyer (1991) put it, the 'original' information

is not given, there is no stable 'code' that once and for all can translate the 'real' information content in a text to its actual meaning to a person or a cultural epoch; rather, meaning is a still-floating transformative principle, not found in the text or in the mind of the beholder, but in the dynamic process of interaction between communicating persons interpreting signs: other persons' thoughts, their own thoughts and texts. (Emphasis mine)

This naturalist understanding implies that we must see information both in the context of cognition and in the context of communication. It is a perspective that forcefully brings together three dimensions that mutually intersect:
- Biological dimension, which takes account of the embodiment of cognitive processes, a structural coupling between the living system and the environment that is based on second-order cybernetics;
- Psychological dimension, which takes account of the complex autonomy of the mind (conscious and non-conscious) as the ultimate 'interpretant' of signs;
- Semiotic dimension, which takes account of human interactions, and particularly of socio-communicative processes that constitute human beings through 'language games' in the sense of Wittgenstein.
This view is proposed by Brier (2002/03) and forms the core of what he calls the Cybersemiotics model.24 Brier adopts the complex-systems approach to social systems applied by Luhmann (1995), who understood them as 'communication systems' or self-referential sense-making systems. According to Luhmann, human beings are not part of social systems; they remain at their boundaries, because the elements of these systems are the communications themselves that connect the inner worlds of those who communicate. Thus, social systems have an autonomy of their own that emerges through the interactions of human beings, which are made possible through the interplay (specific 'sign games') between the biological and the psychological systems of human beings.

24 Mingers (2001) gives a global framework for a critical understanding of information that has important aspects in common with the Cybersemiotics model of Brier, namely the autopoietic and the phenomenological dimensions of an embodied cognition. The crucial difference seems to reside in the absence of the biosemiotic dimension.

Thus, in the model just described there is no place for the metaphor of the 'codebook' formulated by Cowan et al. (2000), who argue that knowledge, once articulated, could be codified. Underlying this metaphor we find a problematic assumption: that meaning can be objectified and becomes attached to the word as the symbol's meaning. This widely accepted 'encodingism' makes sense in the computational model of the mind (in fact a dualist view) but is undermined in the context of the naturalist paradigm.25 Our understanding also implies that there is no 'codified yet non-articulated' knowledge supposed to exist somewhere when we see members of a group behave and communicate in such a way that they seem to have internalised a codebook's content. According to a naturalist paradigm of knowledge and information, what happens is explained in the following terms: members of the group have acquired personal knowledge (to use Polanyi's terms) through identical learning histories and/or through shared experience and intense interaction.26 This is the reason why messages are endowed with stable (but not fixed) meanings inside that group, in spite of the fact that they were subjectively generated through an embodied process. The distinctions made above also apply to Ancori et al. (2000), insofar as they accept the idea that knowledge may be codified and transmitted. Although they try to innovate in relation to less elaborate formulations adopted by other economists, in their core assumptions they still use a conceptual framework produced inside AI cognitivism. For instance, inside the naturalist paradigm there is no relevance for a concept of "crude knowledge – i.e. information as an event without meaning. More exactly: information before the building of a meaning by a receiver, as seen by Shannon and Weaver (1949)" (Ancori et al., 2000: 266). Before being received by someone, the signs of a message are mere syntactic forms, something we might call 'data'. After the signs are captured by the sensory system, the receiver immediately generates a message, which means he creates images that come inextricably associated with meanings. Before the subject's interpretation of the signs, nothing exists besides crude sensory stimuli. Indeed, there is no place for an objectivist concept of information.
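To make the target of this critique concrete, the following toy sketch (my own; the entries are hypothetical) implements the codebook metaphor literally: meanings are fixed entries attached to symbols, and 'decoding' a message is a table lookup. The naturalist argument above is precisely that the right-hand column of such a table does not exist as a stable, subject-independent object:

```python
# The 'codebook' metaphor taken literally (an illustration of the view
# being criticised here, not an endorsement).
CODEBOOK = {
    "tacit": "knowledge that has not yet been articulated",
    "codified": "knowledge written down in symbolic form",
}

def decode(message):
    # Encodingism in one line: decoding is a lookup in a shared, stable table.
    return [CODEBOOK.get(word, "<no entry: meaning must be created by the receiver>")
            for word in message.split()]

print(decode("tacit codified embodied"))
# On the naturalist view, the 'embodied' case, where the codebook is silent
# and the receiver must generate the meaning, is in fact the general case.
```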

25 As Lakoff and Johnson (1999: 92) argue, this stance does not entail that 'anything goes': "since it brings our understanding of what science is in line with the best neuroscience and cognitive science of our age. It allows us to understand science better. And it allows us to appreciate Kuhn's contributions while recognizing, as he did, the success of science."
26 As argued by Lakoff and Johnson (1999), personal knowledge is deeply moulded by metaphorical thought. Primary metaphors remain stable for a long time, as they arise from the relationship between the shared structure of the human body and the shared natural environment. There are also (less stable) complex metaphors, which are built out of the primary ones and out of multiple forms of cultural manifestation, such as widely accepted beliefs, folk theories, etc. Hence, there are some objective conditions for the relative stability of the concepts we need in order to live.


Therefore, to keep the word 'information', economists need to understand it as outside-induced processes, operating inside both the emitter and the receiver, that attribute meaning to data. These processes are inherently subjective, although the emitter's meaning may seem identical to the receiver's in specific contexts, such as messages conveyed in a common language and addressing primary categories of the world, or messages between members of epistemic communities. Focusing on the relations between knowledge and information, it is useful to remember that adult-child communication gives rise to an information flux that stimulates the child's process of knowing. Research on child development thus supports the idea that knowledge is this ongoing process of (co-determined) relationship between the subject and its environment, plus the traces that it leaves in the subject's physical and phenomenological nature. Thus, we call knowledge both that process and its traces in the long-run memory of the person.27 Therefore, whatever categorisation we make, it must be said that knowledge ultimately refers to the unique personal experience underlying any communicative event. In the words of Brier (2000: 441):

All the different kinds of knowledge such as scientific, phenomenological and practical knowledge are specializations of the basic human ability to make distinctions (with survival value) and communicate them through everyday language.

We cannot escape the radically subjective nature of knowledge and information, a stance supported by a naturalist framework that goes beyond the tacit-codified dichotomy widely adopted in the economics literature. At this point it is important to admit that Western culture emphasises the understanding of knowledge as a thing, in opposition to Eastern philosophy, which sees the world as a process. Perhaps we can still keep that static dimension in our research if we do not forget that it is a kind of artificial freezing, or simple location, of the real dynamics of cognition. Chia (1998: 354) framed this issue by drawing on Bergson's philosophy:

For Bergson, all real movement is indivisible and cannot be treated as a series of distinct states which form, as it were, a line in time. Bergson's main claim is that the temporal structure of our experience does not consist of putting together pre-given discrete items. (…) We are easily misled by language which is biased to static description. For instance, we talk about the 'state of things'. But what we call a state is in effect the appearance which a change assumes in the eyes of an observing being who him/herself is changing according to an identical or analogous rhythm.

27 Lakoff and Johnson (1999) emphasised that the 'cognitive unconscious' is at work all along this process: "This hidden hand gives form to the metaphysics that is built into our ordinary conceptual systems. It creates the entities that inhabit the cognitive unconscious – abstract entities like friendships, bargains, failures, and lies – that we use in ordinary unconscious reasoning. It thus shapes how we automatically and unconsciously comprehend what we experience. It constitutes our unreflective common sense." (p. 13)


A naturalist view of knowledge has important implications for the way we manage new ICT projects in our organisations. Mingers (2001) showed that an embodied view of knowledge questions the primary concerns of so-called 'knowledge management' (KM): identifying, capturing, converting, representing, and disseminating knowledge. Framed by the disembodied cognitivist vision of knowledge, 'Information Systems' (IS) execute (and are good at) tasks that actually consist of data processing rather than information processing. The author also called attention to the current and generalised misunderstanding of the role of IS:

Computers are good at carrying out repetitive operations faultlessly and quickly; at sorting and recalling large amounts of essentially arbitrary data; and reacting extremely quickly to well-specified events. In comparison, humans are relatively poor in these areas, but are excellent at motor and perceptual tasks; synthesising diverse types of information and sensation; interpreting poorly defined situations and events; making complex judgements. Note that most of these skills rely on embodiment. We should accept this situation and, rather than try to force computers to do that which they inherently cannot we should aim to maximize the fruitful co-ordination of the two together. (Mingers, 2001: 122)

This point suggests that we need to reflect on how technological development is addressed in contemporary developed societies. Brier (2001) says we are confronted with a basic choice: either we develop technologies that take over human activities and attempt to replace them with mechanical operations, thereby setting the frame for mechanical interactions; or we develop technologies that support and enhance human capabilities for doing things, learning and making judgements. A few authors who are sensitive to the way new ICT impinge on social organisation have argued for giving priority to projects focused on complex learning processes, which means that ICT should support the improvement of individuals' skills and competencies, promote personal interaction at all levels, and sustain communication processes that lead to 'generative dialogue'. This seems a more interesting perspective than that of a strong codification programme ultimately led by a machine-like understanding of human knowledge (Cilliers, 2000; Stacey, 2000). Most KM projects are based on the assumption that knowledge may be codified (more or less extensively) into messages, which are supposed to transfer 'information' to a receiver who reconstitutes it into knowledge on the condition that he shares the 'codebook'. These projects are deeply rooted in the mechanistic information paradigm that views knowledge as a static, disembodied, and individual phenomenon. However, rather than static (stable meanings attached to symbols), knowledge is the historical process of a subject; rather than disembodied (an abstract algorithmic process of mind), knowledge is bodily determined; rather than individually produced, knowledge is socially produced.28 The failure of a large number of KM projects, and the evidence coming from global networks of product development that cannot dispense with regular face-to-face interaction, strongly suggest that a naturalist paradigm of knowledge should guide organisations' ICT-based projects (Desouza, 2003; Orlikowski, 2002).29 According to the naturalist approach, the effort made by economists to apply some kind of cost-benefit analysis in order to establish rational conditions for knowledge codification is basically misguided. Economic incentives may be effective in producing new documents, video records, and electronic files that are useful to aid memory and mitigate the effects of valuable employees' departures. However, taking into account that knowledge is a personal experience acquired through human interaction, I think it is necessary to keep in mind that, above all the technological tools we may have, the highest priority in management is still people and their communication processes.

28 Characterised in these terms (dynamic, embodied, action-oriented), 'knowledge' does not belong to the realm of economic science. The contrary may be said about the production and use of tools conveying data, which make up the growing market of the so-called 'information economy'.

5. Conclusion and further research

The perspective adopted in this paper is deeply different from the one prevailing in neoclassical economics, and it departs in fundamental respects from assumptions accepted by most of the evolutionary literature. Therefore, some might be tempted to place the naturalist paradigm on the bookshelf of post-modern relativism, insofar as it argues for the subjective nature of both knowledge and information. In fact, the paper argued that:
- Cognition is a highly complex process of internal representation. But, at the same time, the paper pointed to the existence of an outside reality, insofar as human beings share the phenomenological experience of a 'world-out-there' and have developed a normative function to build true cognitive representations.
- Information has a subjective nature. But, at the same time, the paper pointed to the stability and commonality of basic concepts in language, due to the fact that human beings have the same body structure and live in basically the same natural environment.
- Knowing is an embodied experience. But, at the same time, this also means that the inherent inter-subjectivity of knowing may lead to stable meanings and to self-organised communication systems in the sense proposed by Luhmann (1995), which is certainly not a relativistic stance.

29 See also the management experience of Bartlett and Ghoshal (2002): "At the heart of the problem is a widespread failure to recognize that although knowledge management can be supported by an efficient technical infrastructure, it is operated through a social network. Information technologists may help in organizing data and making it accessible, but they must be teamed up with – and operate in support of – those who understand human motivation and social interaction." (p. 38-9)


Supported by the convergence of important streams of research in different fields of science, the aspects mentioned above belong to a paradigm that offers a view of human nature far richer than the traditional 'economic man'. This naturalist view of human beings and their systems is based on the effort of interdisciplinary dialogue, which has the potential to provide economic science with a sound basis on which to develop concepts and theories. By building on the naturalist paradigm, economics becomes more capable of cooperating in the scientific endeavour of understanding social phenomena, which is a more fruitful attitude than persisting in the use of problematic concepts. Finally, this paper corresponds to a first step towards an inquiry into so-called 'collective knowledge', a concept currently much discussed in the economics of the firm and in the management literature. Further research will therefore attempt to reconsider the theory of the firm in line with the naturalist paradigm described above and its understanding of 'knowledge as emergence'. Inside this evolutionary framework, the research is expected to produce useful clues for understanding innovation processes inside and outside firms, a subject far from exhausted even taking into account recent evolutionary contributions in the field (Hodgson, 2002b).

References

Ancori, B., A. Bureth and P. Cohendet (2000), 'The Economics of Knowledge: The Debate about Codification and Tacit Knowledge', Industrial and Corporate Change, 9(2), 255-287.
Arrow, K. (1962a), 'Economic welfare and the allocation of resources for invention', in R. R. Nelson (ed.), The Rate and Direction of Inventive Activity. Princeton, NJ: Princeton University Press, pp. 609-26.
Arrow, K. (1962b), 'Economic Implications of Learning by Doing', Review of Economic Studies, 29, 155-173.
Bar-Yam, Y. (1997), Dynamics of Complex Systems. Reading, MA: Addison-Wesley.
Bar-Yam, Y. (2003), 'Complexity rising: From human beings to human civilization, a complexity profile'.
Bartlett, A. and S. Ghoshal (2002), 'Building Competitive Advantage Through People', MIT Sloan Management Review, Winter 2002, 34-41.
Bateson, G. (1973), Steps to an Ecology of Mind. Frogmore-St. Albans: Paladin.
Bickhard, M. H. (1997), 'Is Cognition an Autonomous Subsystem?', in S. O'Nuallain et al. (eds), Two Sciences of Mind. Amsterdam: John Benjamins.
Bickhard, M. H. (1999), 'Interaction and Representation', Theory and Psychology, 9(4), 435-58.
Bickhard, M. H. (2000), 'Emergence', in P. B. Andersen et al. (eds), Downward Causation. Aarhus: University of Aarhus Press, pp. 322-48.


Brier, S. (1992), 'Information and Consciousness: A Critique of the Mechanistic Concept of Information', Cybernetics and Human Knowing, 1(2/3), 71-94.
Brier, S. (2000), 'Trans-Scientific Frameworks of Knowing: Complementary Views of the Different Types of Human Knowledge', Systems Research and Behavioral Science, 17, 433-58.
Brier, S. (2001), 'The necessity of Trans-Scientific Frameworks for doing Interdisciplinary Research'.
Brier, S. (2002/03), 'Luhmann Semioticised', Journal of Sociocybernetics, 3(2), 13-22.
Campbell, R. J. and M. H. Bickhard (2002), 'Physicalism, Emergence and Downward Causation'.
Cantner, U. and H. Hanusch (2002), 'Evolutionary economics, its basic concepts and methods', in H. Lim et al. (eds), Editing Economics – Essays in honour of Mark Perlman. London and New York: Routledge, pp. 182-207.
Capra, F. (1996), The Web of Life. New York: Doubleday.
Chia, R. (1998), 'From Complexity Science to Complex Thinking: Organization as Simple Location', Organization, 5(3), 341-369.
Chia, R. (2002), 'Entrepreneurial Strategising: The Tacit Mode', Working Paper 02/05, School of Business & Economics, University of Exeter.
Christensen, W. D. and C. A. Hooker (2000), 'An interactivist-constructivist approach to intelligence: self-directed anticipative learning', Philosophical Psychology, 13(1), 5-45.
Christensen, W. D. and M. H. Bickhard (2002), 'The Process Dynamics of Normative Function', The Monist, 85(1), 3-28.
Cilliers, P. (1998), Complexity and Postmodernism. London and New York: Routledge.
Cilliers, P. (2000), 'Knowledge, Complexity, and Understanding', Emergence, 2(4), 7-13.
Cohendet, P. and W. E. Steinmueller (2000), 'The Codification of Knowledge: a Conceptual and Empirical Exploration', Industrial and Corporate Change, 9(2), 195-209.
Corning, P. A. (1995), 'Synergy and Self-organization in the Evolution of Complex Systems', Systems Research, 12(2), 89-121.
Cowan, R., P. A. David and D. Foray (2000), 'The Explicit Economics of Knowledge Codification and Tacitness', Industrial and Corporate Change, 9(2), 211-253.
Damasio, A. (1994), Descartes' Error – Emotion, Reason and the Human Brain. New York: Putnam's Sons.
Damasio, A. (1999), The Feeling of What Happens – Body and Emotion in the Making of Consciousness. New York: Harcourt Brace & Company.
Deacon, T. (1997), The Symbolic Species: The Co-evolution of Language and the Brain. New York: Norton.
Depew, D. J. and B. H. Weber (1989), 'The Evolution of the Darwinian Research Tradition', Systems Research, 6(3), 255-263.
Desouza, K. C. (2003), 'Knowledge management: Why the technology imperative seldom works', Business Horizons, January-February, 25-29.
D'Espagnat, B. (1995), Une incertaine réalité. Paris: Bordas.
Emmeche, C. and J. Hoffmeyer (1991), 'From Language to Nature – the semiotic metaphor in biology', Semiotica, 84(1/2), 1-42.
Emmeche, C., S. Koppe and F. Stjernfelt (1997), 'Explaining Emergence: Toward an Ontology of Levels', Journal for General Philosophy of Science, 28, 83-119.


Entralgo, P. L. (1992/2003), Corpo e Alma. Coimbra: Livraria Almedina; Portuguese edition of Cuerpo y Alma, Madrid: Editorial Espasa Calpe.
Fauconnier, G. and M. Turner (2002), The Way We Think – Conceptual Blending and the Mind's Hidden Complexities. New York: Basic Books.
Fodor, J. A. (1975), The Language of Thought. New York: Thomas Crowell.
Foss, N. (1998), 'The New Growth Theory: Some Intellectual Growth Accounting', Journal of Economic Methodology, 5(2), 223-46.
Heylighen, F. and C. Joslyn (2001), 'Cybernetics and Second-Order Cybernetics', in R. A. Meyers (ed.), Encyclopedia of Physical Science and Technology (3rd ed.). New York: Academic Press.
Hodgson, G. M. (2000), 'The Concept of Emergence in Social Science: Its History and Importance', Emergence, 2(4), 65-77.
Hodgson, G. M. (2002a), 'Darwinism in economics: from analogy to ontology', Journal of Evolutionary Economics, 12, 259-281.
Hodgson, G. M. (2002b), 'The Mystery of the Routine: The Darwinian Destiny of "An Evolutionary Theory of Economic Change"', Paper presented at the EAEPE 2002 Conference, Aix-en-Provence, France.
Hoffmeyer, J. (1997), 'Biosemiotics: Towards a New Synthesis in Biology', European Journal for Semiotic Studies, 9(2), 355-376.
Hutchins, E. (1995), Cognition in the Wild. Cambridge, MA: MIT Press.
Jacobs, S. (2002), 'Polanyi's presagement of the incommensurability concept', Studies in History and Philosophy of Science, 33, 105-120.
Lakoff, G. (1987), Women, Fire, and Dangerous Things. Chicago: University of Chicago Press.
Lakoff, G. and M. Johnson (1999), Philosophy in the Flesh – The Embodied Mind and its Challenge to Western Thought. New York: Basic Books.
Lawson, T. (2002), 'Should Economics Be an Evolutionary Science? Veblen's Concern and Philosophical Legacy', Journal of Economic Issues, XXXVI(2), 279-92.
Lucas, R. (1988), 'On the Mechanics of Economic Development', Journal of Monetary Economics, 22, 3-42.
Luhmann, N. (1995), Social Systems. Stanford: Stanford University Press.
Lundvall, B.-A. and B. Johnson (1994), 'The learning economy', Journal of Industry Studies, 1(2), 23-42.
Martin, J. (2003), 'Emergent Persons', Working Paper presented at the Interactivist Summer Institute, July 22-26, Copenhagen.
Maturana, H. and F. Varela (1980), Autopoiesis and Cognition: The Realization of the Living. Dordrecht: Reidel.
McFadden, J. (2001), Quantum Evolution – The New Science of Life. New York: Norton.
Mingers, J. (2001), 'Embodying information systems: the contribution of phenomenology', Information and Organization, 11, 103-128.
Nelson, R. (1959), 'The Simple Economics of Basic Scientific Research', Journal of Political Economy, 67, 297-306.
Nelson, R. and S. Winter (1982), An Evolutionary Theory of Economic Change. Cambridge, MA: The Belknap Press.
Nightingale, P. (2003), 'If Nelson and Winter are only half right about tacit knowledge, which half? A Searlean critique of "codification"', Industrial and Corporate Change, 12(2), 149-183.


Nonaka, I. and H. Takeuchi (1995), The Knowledge-Creating Company. Oxford: Oxford University Press.
Omnès, R. (1999), Quantum Philosophy – Understanding and Interpreting Contemporary Science. Princeton, NJ: Princeton University Press.
Orlikowski, W. J. (2002), 'Knowing in Practice: Enacting a Collective Capability in Distributed Organizing', Organization Science, 13(3), 249-273.
Polanyi, M. (1958/1962), Personal Knowledge – Towards a Post-Critical Philosophy. London: Routledge & Kegan Paul.
Polanyi, M. (1966/1983), The Tacit Dimension. Gloucester, MA: Peter Smith.
Ramlogan, R. and J. S. Metcalfe (2002), 'Limits to the Economy of Knowledge and Knowledge of the Economy', Discussion Paper for the "Living with Limits to Knowledge" Nexsus Workshop, CRIC, University of Manchester, Manchester.
Rizzello, S. (1999), The Economics of the Mind. Cheltenham, UK: Edward Elgar.
Romer, P. (1986), 'Increasing Returns and Long-Run Growth', Journal of Political Economy, 94(5), 1002-37.
Roy, J.-M., J. Petitot, B. Pachoud and F. J. Varela (1999), 'Beyond the Gap: An Introduction to Naturalizing Phenomenology', in J. Petitot et al. (eds), Naturalizing Phenomenology. Stanford: Stanford University Press, pp. 1-80.
Sent, E.-M. (1999), 'Economics of science: survey and suggestions', Journal of Economic Methodology, 6(1), 95-124.
Smith, K. (2002), 'What is the "Knowledge Economy"? Knowledge Intensity and Distributed Knowledge Bases', Discussion Paper Series #2002-6, INTECH, The United Nations University, Maastricht.
Stacey, R. (2000), 'The Emergence of Knowledge in Organizations', Emergence, 2(4), 23-39.
Stjernfelt, F. (2000), 'The Idea that Changed the World', Cybernetics and Human Knowing, 17(1), 77-82.
Tononi, G. and G. M. Edelman (1998), 'Consciousness and Complexity', Science, 282.
Tsoukas, H. (1996), 'The Firm as a Distributed Knowledge System: A Constructionist Approach', Strategic Management Journal, 17, 11-25.
Viskovatoff, A. (2001), 'Rationality as optimal choice versus rationality as valid inference', Journal of Economic Methodology, 8, 313-37.
Viskovatoff, A. (2002), 'Searle's Background: comments on Runde and Faulkner', Journal of Economic Methodology, 9(1), 65-80.
Vygotsky, L. S. (1934/1962), Thought and Language. Cambridge, MA: MIT Press.
Whitaker, R. (1995), 'Observer Web'.
Wicken, J. S. (1998), 'Evolution and Thermodynamics: The New Paradigm', Systems Research and Behavioral Science, 15, 365-372.
Wilson, E. O. (1999), Consilience – The Unity of Knowledge. London: Abacus.
