"STRUCTURALAND FUNCTIONAL COMPLEXITY: AN INFORMATIONALAPPROACH"

JOSÉ ROBERTO CASTILHO PIQUEIRA

Abstract

The purpose of this brief paper is to establish some definitions about entropy and information in the context of biological complexity. It is an attempt to unify ideas from Shannon's information theory and from thermodynamics, providing a coherent set of definitions in order to help in the understanding of biological complexity. The main contribution is a precise definition of complexity, distinguishing structural complexity from functional complexity.

1. INTRODUCTION

There are a lot of books on information theory applied to communication ([1]; [2]; [3]; [4]), a lot of books on statistical thermodynamics ([5]; [6]), and a lot of articles and books trying to apply both theories to the problem of biological complexity ([7]; [8]; [9]). The problem is very complex and a lot of brilliant scientists have been working on it.

Now consider a third-world telecommunications engineer who has studied information theory for data compression and coding reasons. Give him a taste for studying basic statistical physics and, after that, throw him into a group of biologists at a university, with plenty of desire to understand complexity in the light of mathematical physics theories. As would be expected, he has gone a bit crazy and thought he could write a useful article about the subject, putting all these things together.

This is the main objective of this article: to present a definition of entropy that is useful both for communication theory and for statistical physics and, after that, to try to establish the relation between this definition and biological complexity, distinguishing structural complexity from functional complexity.

2. THE SPACE OF WORK AS AN ABSTRACTION


Complexity is an issue that belongs to a lot of research fields, such as computation, physiology, neurology, physics, chemistry, engineering, the social sciences and so on. Recently it has become fashionable, and now it is even flattering to say that someone is a researcher in the sciences of complexity. But when it comes to being serious, some mathematics is needed and clear concepts are necessary. The diversity of types of applications places the mathematician in a very difficult role when it comes to defining anything in a precise way.

Let us take any natural phenomenon and a set of variables that can represent the phenomenon at a given instant of time. One could say that the phenomenon is so complex that one cannot establish the number of variables needed to study it. That is not a problem: we can wait for improvements in that field of work until this class of description may be applied to the problem.

So we have an integer and finite number of variables x_1, x_2, ..., x_n that describe the state of the system. Each of these variables is called a state variable. The more we know about the behavior of this set as time passes, the more we know about our study subject.

Formally speaking, we have a vector-valued function of time x(t) that associates to each instant a set of n real numbers describing the states of the system. Here we are going to deal with the problem in such a way that each state variable x_i can only assume an integer and finite number of real values, N_i. This assumption will be called the "quantum assumption", and it is in accordance with the work of D. Gabor described in ([10]).

So we are working with a system, mainly a biological system, described by a discrete number of time functions x_1, x_2, ..., x_n, each of them assuming, respectively, N_1, N_2, ..., N_n discrete real values. It is easy to see, in this case, that the number N of possible states that the system can assume is given by:

N = \prod_{i=1}^{n} N_i    (1)

Finally, from a mathematical point of view, we have N discrete possible states and we can assign to each one a real number p_i so that:

i) 0 \le p_i \le 1, for all 1 \le i \le N
ii) \sum_{i=1}^{N} p_i = 1

The number p_i is the probability assigned to the state i, and the criterion used to assign the number to the state is supposed to be known beforehand.
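As a numerical illustration, a minimal Python sketch follows, assuming a hypothetical system of three state variables taking N_1 = 2, N_2 = 3 and N_3 = 4 discrete values; it evaluates (1) and checks conditions i) and ii) for a uniform assignment:

    # Minimal sketch, assuming a hypothetical 3-variable system under the
    # "quantum assumption" with N_1 = 2, N_2 = 3, N_3 = 4 discrete values.
    from math import prod

    N_per_variable = [2, 3, 4]            # N_1, N_2, N_3
    N = prod(N_per_variable)              # equation (1): N = 2 * 3 * 4 = 24

    p = [1.0 / N] * N                     # one admissible assignment: uniform

    assert all(0.0 <= p_i <= 1.0 for p_i in p)    # condition i)
    assert abs(sum(p) - 1.0) < 1e-12              # condition ii)
    print(N)                                      # 24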

3. INFORMATION AND ENTROPY

Relating to the description presented in the previous section, we are going to define a function that will be called information. The definition excludes any kind of human values, such as moral, intellectual or artistic qualities, and this definition of information should never be confused with "science" or "knowledge" ([11]).

Considering the space constituted by the N possible states, each of them assigned a probability p_i, we associate to each state the individual information I_i as:

I_i = k \log_b (1 / p_i)    (2)

The constants k and b are freely chosen positive real numbers and they define the unit of individual information. In Shannon's theory it is usual to choose k = 1 and b = 2, defining the bit (binary digit) as the unit.

Individual information does not say much about the system's behavior. It is better to define the mean value of information, which will be called the entropy E, as:

E = \sum_{i=1}^{N} p_i I_i = \sum_{i=1}^{N} p_i k \log_b (1 / p_i)    (3)

An important thing to say is that the expression given by (3) reaches its maximum value when all the N possible states have the same probability ([4]). As the probability of one state increases, the probability of the others certainly decreases, because the sum needs to be equal to one, but the entropy decreases. So the maximum entropy E_max is given by:

E_max = \sum_{i=1}^{N} (1/N) k \log_b N = k \log_b N    (4)

The expression (4) is compatible with the concept of thermodynamic entropy ([6]). Considering the substitution of (1) into (4), we have:

E_max = k \log_b ( \prod_{j=1}^{n} N_j ) = k \sum_{j=1}^{n} \log_b N_j    (5)
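As an illustrative sketch, the Python fragment below evaluates (2), (3) and (4) with k = 1 and b = 2 (bits), reusing the hypothetical 24-state system assumed above:

    # Minimal sketch: individual information, entropy, and the maximum-entropy
    # case, with k = 1 and b = 2 (units of bits).
    from math import log

    k, b = 1.0, 2.0
    N = 24                                   # 2 * 3 * 4 states, as above

    def information(p_i):                    # equation (2): I_i = k log_b(1/p_i)
        return k * log(1.0 / p_i, b)

    def entropy(p):                          # equation (3): E = sum_i p_i I_i
        return sum(p_i * information(p_i) for p_i in p if p_i > 0.0)

    uniform = [1.0 / N] * N
    skewed = [0.5] + [0.5 / (N - 1)] * (N - 1)

    E_max = k * log(N, b)                    # equation (4): k log_b N
    print(abs(entropy(uniform) - E_max) < 1e-9)   # True: uniform reaches E_max
    print(entropy(skewed) < E_max)                # True: non-uniform stays below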

4. COMPLEXITY: STRUCTURAL AND FUNCTIONAL

Let us take the system that we have already defined, described by n state variables x_1, x_2, ..., x_n, each variable x_j assuming N_j discrete states. Then the number of different possible states N is given by (1). Under these conditions we define the complexity ς of the system as:

ς = a E_max    (6)

with a being an arbitrary positive constant.

The definition (6) is intuitive from a certain point of view, because it is simply a number proportional to the maximum value of the entropy. In order to distinguish structural complexity from functional complexity, we can combine (6) and (5), so:

ς = a k \sum_{j=1}^{n} \log_b N_j    (7)

Observing (7), we note that ς can be increased due to two causes:

i) increasing the number n of state variables, in which case we have structural complexity;
ii) increasing the number N_j of possible values of a single state variable, in which case we have functional complexity.

Obviously the complexity ς can be a function of time, because as time passes n and each N_j can change.

There is some room to define the relative complexity between two systems 1 and 2, through a quotient between their absolute complexities. Calling ς_{2,1} the complexity of system 2 relative to system 1, we have:

ς_{2,1} = ς_2 / ς_1    (8)

If ς_{2,1} > 1 we say that system 2 is more complex than system 1; if ς_{2,1} < 1 we say that system 2 is less complex than system 1. The relative complexity depends on the number of state variables of the systems (structure) and on the number of states of each variable (function).
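As an illustrative sketch, the Python fragment below evaluates (7) and (8) for hypothetical systems, assuming the arbitrary constant a = 1 and k = 1, b = 2:

    # Minimal sketch: complexity from equation (7) and relative complexity
    # from equation (8), with a = 1, k = 1, b = 2 (bits).
    from math import log

    a, k, b = 1.0, 1.0, 2.0

    def complexity(N_per_variable):          # equation (7): a k sum_j log_b N_j
        return a * k * sum(log(N_j, b) for N_j in N_per_variable)

    system_1 = complexity([2, 3, 4])         # reference system
    structural = complexity([2, 3, 4, 4])    # one extra state variable (structure)
    functional = complexity([2, 3, 16])      # more values for one variable (function)

    print(structural / system_1 > 1.0)       # True: equation (8) quotient exceeds 1
    print(functional / system_1 > 1.0)       # True: both routes raise complexity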

5. SECONDARY COMPLEXITY

There is one more point that has been forgotten by complexity researchers: the N possible states of the system may not be equiprobable. In this case we can define the secondary complexity ς_s as:

ς_s = a E    (9)

with E given by (3).
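As an illustrative sketch, the Python fragment below evaluates (9) for a hypothetical non-equiprobable distribution over the 24-state system assumed above, comparing it with the complexity of (6):

    # Minimal sketch: secondary complexity, equation (9), for a non-equiprobable
    # distribution, compared with ς = a E_max from equation (6); a = k = 1, b = 2.
    from math import log

    a, k, b = 1.0, 1.0, 2.0
    N = 24

    def entropy(p):                          # equation (3)
        return sum(p_i * k * log(1.0 / p_i, b) for p_i in p if p_i > 0.0)

    p = [0.5] + [0.5 / (N - 1)] * (N - 1)    # states are not equiprobable

    secondary = a * entropy(p)               # equation (9): ς_s = a E
    absolute = a * k * log(N, b)             # equation (6): ς = a E_max
    print(secondary < absolute)              # True: ς_s cannot exceed ς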

As we have already said, the maximum value for