Measure

The operation of measurement consists in assigning values to phenomena of interest within the framework of a geographical inquiry. On the one hand, it serves to characterise the attributes of the objects under study, and in that case it precedes any processing. On the other hand, it plays a role downstream, to characterise spatial forms, to describe the nature and intensity of relations, to qualify similarities, and so on. In the first case, measurement belongs to the phase of acquiring the information and data needed to address the issue under study. That information comes either from the statistical sources of public or private organisations, or from surveys (field surveys, opinion polls, etc.). In the second case, the aim is to report the results of statistical or other processing by means of appropriate measures.

Developing a system of measures implies first identifying the relevant objects and attributes with respect to the questions posed. Depending on the phenomenon studied, objects may be of quite different natures: concepts, spatial entities, individuals, social groups and maps are examples of observable objects with which attributes may be associated. Depending on the available and/or measurable information, there may be a more or less significant gap between the phenomenon one wishes to measure and what is actually observable. In each case, the point is to have solid hypotheses about the causal chain linking what is measured to the phenomenon under study.

In many cases, the objects of study are spatial entities. They may form a partition of space (all the municipalities of a region, all the regions of a continent, etc.), they may be network segments, or simply localised points (settlements, cities, etc.). The term “geographical information matrix” designates a matrix whose rows are the set of spatial entities under consideration and whose columns are the set of attributes chosen to characterise the phenomenon studied.
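
As an illustration, such a matrix can be represented by a simple table structure, for instance a pandas DataFrame; in the minimal sketch below, the municipalities and attribute values are purely hypothetical.

```python
# A minimal sketch of a geographical information matrix:
# rows = spatial entities (here, hypothetical municipalities),
# columns = attributes chosen to characterise the studied phenomenon.
import pandas as pd

geo_info_matrix = pd.DataFrame(
    {
        "population": [12500, 3400, 87000, 450],           # quantitative attribute
        "median_income": [21300, 19800, 24100, 18200],     # quantitative attribute
        "land_use": ["urban", "rural", "urban", "rural"],   # qualitative attribute
    },
    index=["Municipality A", "Municipality B", "Municipality C", "Municipality D"],
)

print(geo_info_matrix)
```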

In a statistical framework, the terms “features”, “indicators” and “variables” are also used to designate attributes. Depending on the research objectives and the nature of those attributes, several representation and processing tools exist. The main distinction concerns the nature of the variables, either quantitative (counts, ratios, measurements) or qualitative (categories, ranks). The most classical processing relies on statistical methods. When the aim is to build a typology, or to bring out the interrelations within a set of variables, data-analysis methods are used: principal component analysis if the variables are quantitative, correspondence analysis if they are qualitative. In these analyses, the variables play symmetric roles. When the focus is on the variability of a particular phenomenon as a function of other factors, statistical models are used (multiple regression, analysis of variance, analysis of covariance, logit models, etc., depending on the nature of the variable “to be explained” and of the “explanatory” variables).
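
The sketch below illustrates the first family of methods: a principal component analysis applied to the quantitative columns of a geographical information matrix, here with scikit-learn. The figures are hypothetical and carry no empirical meaning.

```python
# A minimal sketch of a typology-oriented analysis: principal component
# analysis on the quantitative attributes of a (hypothetical) geographical
# information matrix, where the variables play symmetric roles.
import numpy as np
from sklearn.decomposition import PCA
from sklearn.preprocessing import StandardScaler

# Rows: spatial entities; columns: quantitative attributes
# (e.g. population, median income, unemployment rate) -- hypothetical values.
X = np.array([
    [12500, 21300, 0.08],
    [ 3400, 19800, 0.11],
    [87000, 24100, 0.06],
    [  450, 18200, 0.13],
])

X_std = StandardScaler().fit_transform(X)   # standardise so no variable dominates
pca = PCA(n_components=2)
components = pca.fit_transform(X_std)

print(pca.explained_variance_ratio_)  # share of variance captured by each axis
print(components)                     # coordinates of the entities on the axes
```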

Statistical processing integrates space to varying degrees, depending on the methods used and the attributes chosen to characterise the objects of study. Space may be present merely through the fact that classical statistical methods are applied to statistical individuals that are spatial entities. At a higher level of integration, attributes carrying a spatial dimension may be taken into account, the most classical being the distance to a structuring object and the characterisation of the neighbourhood. Finally, some processing methods integrate space explicitly (geostatistics, spatial statistics, fractal measures, mathematical morphology).
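
The sketch below illustrates the last two levels with hypothetical data: a distance-based attribute (distance of each entity to a structuring object) and an explicitly spatial measure, Moran's I of spatial autocorrelation, computed from an assumed binary neighbourhood matrix.

```python
# A minimal sketch of two ways to bring space into the analysis
# (coordinates, attribute values and neighbourhood matrix are hypothetical):
#  1. a spatial attribute: distance of each entity to a structuring object;
#  2. an explicitly spatial statistic: Moran's I of an attribute.
import numpy as np

coords = np.array([[0.0, 0.0], [1.0, 0.0], [0.0, 1.0], [2.0, 2.0]])  # entity locations
capital = np.array([0.5, 0.5])                                       # structuring object
x = np.array([10.0, 12.0, 11.0, 4.0])                                # attribute values

# 1. Distance to the structuring object, usable as an additional attribute.
dist_to_capital = np.linalg.norm(coords - capital, axis=1)

# 2. Moran's I: spatial autocorrelation of x given a neighbourhood matrix W.
W = np.array([
    [0, 1, 1, 0],
    [1, 0, 0, 1],
    [1, 0, 0, 1],
    [0, 1, 1, 0],
], dtype=float)

z = x - x.mean()                       # deviations from the mean
n, s0 = len(x), W.sum()
moran_i = (n / s0) * (z @ W @ z) / (z @ z)

print(dist_to_capital)
print(moran_i)   # > 0: similar values cluster; < 0: dissimilar neighbours
```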

Other models also rely on measurement. This is the case of models formalised by means of mathematical equations, of logical rules, or of a combination of both. Such models are used to describe and explain the state of a system, the evolution of that state, or the intensity of the interactions between different objects (for example the gravity model when the objects are spatial entities). Some models make it possible to run simulations and are used to test scenarios and make predictions (dynamic models).
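
As an illustration of such a model, the sketch below computes gravity-model interactions between a few hypothetical spatial entities; the constant k and the distance exponent beta are arbitrary values chosen for the example and would have to be calibrated in any real application.

```python
# A minimal sketch of a gravity model of interaction between spatial
# entities: the flow between i and j is proportional to the product of
# their masses and decreases with distance. All values are hypothetical.
import numpy as np

populations = np.array([87000.0, 12500.0, 3400.0])           # masses
coords = np.array([[0.0, 0.0], [30.0, 10.0], [15.0, 40.0]])  # locations (km)

k, beta = 1.0, 2.0                                    # illustrative parameters
d = np.linalg.norm(coords[:, None, :] - coords[None, :, :], axis=-1)  # distance matrix
np.fill_diagonal(d, np.inf)                           # no self-interaction

# Estimated interaction (e.g. flows) between every pair of entities.
flows = k * np.outer(populations, populations) / d ** beta
print(flows.round(1))
```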