Informational Tracking

Sylvie Leleu-Merviel

Wiley-ISTE, 2018. ISBN 9781119522577, 278 pages
1. The First Information Theories


1.1. Introduction


Information has been at the heart of scientific considerations for almost 70 years now. The importance given to information in our societies keeps growing, together with the versatility of the concept and the lack of rigor in its definition. From the texts of its forerunners to the most recent publications, all agree on this point. Jean-Louis Le Moigne [LE 73, p. 10] expresses it in the following words: “Information! Is there a more familiar, a more intuitive word? Is there even a more international term? Is it not central to every management conversation? At the heart of the act of decision-making, do we not find the immediate answer: Information? Is it not the source of this mutation that human societies are currently experiencing – not without emotion – under the name of the computer revolution? And yet, is there not a more difficult, a more multifaceted, a more ambiguous word?”

Or Jean-Paul Delahaye [DEL 94, pp. 13–14]: “The word information is used in a variety of phrases and contexts. For example, we usually say things like: ‘the information contained in this book, the available information we have regarding a problem, the information encoded in the genome, the poor information that his long speech provided’. […] We sense that behind this word is hidden something complex, something changeable perhaps, something that, in any case, deserves our reflection. Thus, we are led to wonder: is it possible to make a general scientific theory of information? And if so, how to do it? It is not easy to answer seriously and it is very easy to answer badly, because some mathematical or physical theories already employ the term information, enunciate theorems and give the impression that the problem has been solved and that we can mathematically speak of information”.

The absence of scientific rigor in the characterization of the concept is absolute. We can clearly perceive this by taking a look at the definitions that appear in the dictionary:

  • – the action of informing, of giving information, ACTION;
  • – the news, the information that is communicated about someone or something, STATE;
  • – the body of knowledge acquired regarding someone or something, A SET OF STATES;
  • – the actual contents of transmitted messages, CONTENT;
  • – a signal by which a system transmits a piece of knowledge, CONTAINER.

Unfortunately, there is no more rigor in the senses of the term indexed under “Sciences” (source: Petit Robert):

  • – an element or system capable of being transmitted by a signal or a combination of signals;
  • – what is transmitted, the object of knowledge or memory.

Let us observe that the word “system” appears in most of the scientific definitions of the term.

Our work intends to adopt the perspective of a scientific approach to information. In this sense, we will only take into consideration works devoted to:

  • – the modeling of the information processes;
  • – the study of the operational laws ruling the functioning of these processes;
  • – more or less formalized and more or less quantified proposals of abstract representations, associated with the corresponding phenomena.

We will purposefully exclude any study linked to a specific field of application or to a specific category of practices.

1.2. The mathematical theory of information by Shannon [SHA 48]


Research carried out by the pioneers in the field of “information theory” met with a variety of fates. There is no doubt that the works of Claude Elwood Shannon almost immediately found their application in the field of telecommunications: this contribution arrived at its own pace, particularly after the research conducted by other signal engineers at Bell Telephone Laboratories. Nowadays, these works constitute the core of the results unanimously recognized as scientific (in the Cartesian sense of the term, and even in its strictly mathematical sense) as regards a theory of information.

1.2.1. Beginnings of this theory


The starting point can be presented in almost naïve terms: the more improbable and uncertain an event, the more significant the information concerning its advent. The amount of information in a message depends on the improbability of the event that the message informs us about.

To support this hypothesis, let us illustrate it with a simple, intuitive example. Imagine that, for decades, all your family members have been without any news from uncle Sam, who left a long time ago to lead an adventurous life in unexplored territories. A message announcing uncle Sam’s arrival the next day contains a lot of information because, given his long absence and prolonged silence, the probability of a message announcing this specific event is extremely low; in fact, such an occurrence is close to zero. Moreover, it is precisely because the event itself is perceived as improbable that the probability of the message announcing it is low. We can already perceive that there is a confusion here between the probability of the event taking place and that of the announcement of the event, a matter that we will discuss at further length. On the other hand, let us imagine that uncle Sam sent a letter that the post was unable to deliver promptly (which does sometimes happen). If this same message, by the same means, arrives one day after the reappearance of uncle Sam, it no longer contains any information, because the event has already taken place; it is no longer improbable, it has become certain. As a matter of fact, there was a time when this kind of inconvenience was commonplace; for example, in Tristes Tropiques, Claude Lévi-Strauss observed: “Since the ‘official and urgent telegram’, sent from Lahore on the previous day in order to announce my arrival, reached the director only five days later, due to the floods that raged in the Punjab, I might as well have come impromptu” [LÉV 55, p. 473]. Thus, we see that the informational estimate of the same message, in an identical form, can vary from one extreme to the other within a short time, depending on the probability of occurrence of the event to which the message refers.

Starting from this observation, Shannon’s theory establishes a one-to-one relation between the amount of information I and the probability of occurrence of a message or, more precisely, the number N of states that the expected message can eventually adopt.

This relation takes the following form: I = K log N, where K is a constant.

With p = 1/N being the appearance probability of one of the N possible states, these N states being equally likely, the above relation can also be expressed as:

I = -K log p

Figure 1.1. Relation between the amount of information of a message and the appearance probability of such a message

The curve shows that the amount of information grows without bound as the probability of occurrence of the message tends towards zero, and falls to zero when the message is certain (p = 1).
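As a minimal numeric sketch of this relation (assuming K = 1/log 2, so that information is measured in bits; the helper name self_information is ours, not Shannon's):

```python
import math

def self_information(p: float) -> float:
    """Amount of information I = -log2(p), in bits, carried by a
    message whose probability of occurrence is p (i.e. K = 1/log 2)."""
    if not 0.0 < p <= 1.0:
        raise ValueError("p must lie in (0, 1]")
    return -math.log2(p)

# The curve of Figure 1.1: I grows without bound as p tends to 0
# and vanishes for a certain event (p = 1).
for p in (1.0, 0.5, 0.125, 0.001):
    print(f"p = {p:<6} -> I = {self_information(p):.3f} bits")
# p = 1.0    -> I = 0.000 bits  (the letter arriving after uncle Sam's return)
# p = 0.001  -> I = 9.966 bits  (the highly improbable announcement)
```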

1.2.2. Shannon’s generalization


The preceding definition of the measurement of the amount of information is constrained by the restrictive hypothesis of the equiprobability of the possible states. One of Shannon’s key contributions lies precisely in generalizing beyond this hypothesis: in the general case, the possible states no longer have the same probability of occurrence and, at the limit, the distribution of probabilities assumes a continuous form. By doing so, Shannon associates an amount of information with an amount of entropy.

In this way, we get a measure of the information of a message, considered from the viewpoint of the appearance probabilities of the message, which assumes the very general form:

I = -K Σ pi log pi (the sum running over i = 1, …, n)

pi being the appearance probability of one of the n states N1, N2, …, Ni, …, Nn.
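As a minimal sketch of this general measure (again assuming K = 1/log 2 so the result is in bits; the helper name shannon_entropy is ours), note that it reduces to the earlier log N when the n states are equiprobable:

```python
import math

def shannon_entropy(probs: list[float]) -> float:
    """I = -K * sum(p_i * log(p_i)), with K = 1/log 2 so the result
    is in bits; states with p_i = 0 contribute nothing to the sum."""
    if abs(sum(probs) - 1.0) > 1e-9:
        raise ValueError("probabilities must sum to 1")
    return -sum(p * math.log2(p) for p in probs if p > 0)

print(shannon_entropy([0.25, 0.25, 0.25, 0.25]))  # 2.0 bits = log2(4), equiprobable case
print(shannon_entropy([0.7, 0.1, 0.1, 0.1]))      # ~1.357 bits: skewed, less uncertain
print(shannon_entropy([1.0, 0.0, 0.0, 0.0]))      # 0.0 bits: the outcome is certain
```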

Nevertheless, in its generality, this measure still has some limits, mainly associated with the notion of probability. In fact, for whom are these states likely or unlikely? For an objective, statistical receiver, or for a manager who often estimates a probability through a legitimately subjective judgment? The manager who uses this notion will often stretch it to its limits; if he does so consciously, he will be right to do so.

1.2.3. Information and entropy


The theory of systems has given great importance to the notion of entropy, a measure of uncertainty, of disorder, of diversity. We can briefly summarize the argument for the negentropic equivalence of information as follows:

Given an initial situation about which we know nothing (I0 = 0), characterized a priori by N0 equally probable states, a piece of information I1 (I1 > 0) makes it possible to reduce the number of equally probable alternatives from N0 to N1 (N1 < N0).
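Under these assumptions (equally probable alternatives, K = 1/log 2), the information gained is I1 = K log N0 - K log N1 = log2(N0/N1); a minimal sketch, with the hypothetical helper information_gain:

```python
import math

def information_gain(n0: int, n1: int) -> float:
    """Bits supplied by a piece of information that narrows N0 equally
    likely alternatives down to N1: I1 = log2(N0) - log2(N1)."""
    if not 0 < n1 <= n0:
        raise ValueError("require 0 < N1 <= N0")
    return math.log2(n0 / n1)

# Starting from total ignorance (I0 = 0) over 64 alternatives,
# a message ruling out all but 8 of them supplies 3 bits.
print(information_gain(64, 8))   # 3.0
print(information_gain(64, 1))   # 6.0: the remaining state is fully determined
```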

The evolution of the physical entropy of this...