The Weber–Fechner law attempts to describe the relationship between the physical magnitudes of stimuli and the perceived intensity of the stimuli. Ernst Heinrich Weber (1795–1878) was one of the first people to approach the study of the human response to a physical stimulus in a quantitative fashion. Gustav Theodor Fechner (1801–1887) later offered an elaborate theoretical interpretation of Weber's findings, which he called simply Weber's law, though his admirers made the law's name a hyphenate.
Stevens' power law is generally considered to provide a more accurate and/or general description, although both the Weber–Fechner law and Stevens' power law entail implicit assumptions regarding the measurement of the perceived intensity of stimuli. In the case of the Weber–Fechner law, the implicit assumption is that just noticeable differences are additive; i.e., that they can be added in a manner analogous to the addition of units of a physical quantity. Relatedly, L. L. Thurstone made this assumption explicit through the concept of discriminal dispersion inherent in the Law of comparative judgment.
The case of weight
In one of his classic experiments, Weber gradually increased the weight that a blindfolded man was holding and asked him to respond when he first felt the increase. Weber found that the response was proportional to a relative increase in the weight. That is to say, if the weight is 1 kg, an increase of a few grams will not be noticed. Rather, when the mass is increased by a certain factor, an increase in weight is perceived. If the mass is doubled, the threshold is also doubled. This kind of relationship can be described by a differential equation as

dp = k (dS / S),
where dp is the differential change in perception, dS is the differential increase in the stimulus, and S is the stimulus at that instant. The constant of proportionality k is to be determined experimentally.
Integrating the above equation gives

p = k ln S + C,

where C is the constant of integration.
To determine C, put p = 0, i.e. no perception; then

C = −k ln S0,
where S0 is that threshold of stimulus below which it is not perceived at all.
Therefore, our equation becomes

p = k ln (S / S0).
The relationship between stimulus and perception is logarithmic. This logarithmic relationship means that if a stimulus varies as a geometric progression (i.e. multiplied by a fixed factor), the corresponding perception is altered in an arithmetic progression (i.e. in additive constant amounts). For example, if a stimulus is tripled in strength (i.e., 3 x 1), the corresponding perception may be two times as strong as its original value (i.e., 1 + 1). If the stimulus is again tripled in strength (i.e., 3 x 3 x 1), the corresponding perception will be three times as strong as its original value (i.e., 1 + 1 + 1). Hence, for multiplications in stimulus strength, the strength of perception only adds.
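The geometric-to-arithmetic behavior above can be sketched numerically. Here k = 1 and a threshold S0 = 1 are arbitrary illustrative values, not quantities from any particular experiment:

```python
import math

def perception(S, k=1.0, S0=1.0):
    """Weber-Fechner law: perceived intensity p = k * ln(S / S0)."""
    return k * math.log(S / S0)

# Tripling the stimulus repeatedly (geometric progression)...
p1 = perception(3)    # stimulus tripled once
p2 = perception(9)    # tripled twice
p3 = perception(27)   # tripled three times

# ...adds the same fixed amount, k * ln(3), to perception each time
# (arithmetic progression).
step1 = p2 - p1
step2 = p3 - p2
```

Each multiplicative step in the stimulus produces the same additive step in perception, which is exactly the behavior the paragraph describes.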
This logarithmic relationship is valid, not just for the sensation of weight, but for other stimuli and our sensory perceptions as well.
The case of vision
The eye senses brightness logarithmically. Hence stellar magnitude is measured on a logarithmic scale. This magnitude scale was invented by the ancient Greek astronomer Hipparchus in about 150 B.C. He ranked the stars he could see in terms of their brightness, with 1 representing the brightest down to 6 representing the faintest, though the scale has now been extended beyond these limits. An increase of 5 magnitudes corresponds to a decrease in brightness by a factor of 100.
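The modern formalization of this scale (Pogson's relation, which fixes 5 magnitudes as exactly a factor of 100 in brightness) can be sketched as follows:

```python
import math

def magnitude_difference(flux_ratio):
    """Pogson's relation: delta_m = -2.5 * log10(F1 / F2).
    A star 100 times brighter is 5 magnitudes lower (brighter)."""
    return -2.5 * math.log10(flux_ratio)

# Brightness ratio of 100 -> magnitude difference of -5
dm = magnitude_difference(100)
```

One magnitude step thus corresponds to a brightness ratio of 100**(1/5), roughly 2.512.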
The case of sound
Still another logarithmic scale is the decibel scale of sound intensity. And yet another is pitch, which, however, differs from the other cases in that the physical quantity involved is not a "strength".
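The decibel scale makes the same logarithmic compression explicit. A minimal sketch, using the conventional reference intensity of 10^-12 W/m^2 (the approximate threshold of hearing):

```python
import math

def intensity_level_db(I, I0=1e-12):
    """Sound intensity level in decibels: L = 10 * log10(I / I0).
    I0 is the conventional reference intensity (threshold of hearing)."""
    return 10 * math.log10(I / I0)

# Multiplying the physical intensity by 10 adds exactly 10 dB:
quiet = intensity_level_db(1e-6)   # about 60 dB (ordinary conversation)
loud = intensity_level_db(1e-5)    # about 70 dB
```

A tenfold change in physical intensity is heard as one equal-sized step, matching the logarithmic form derived above for weight.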
In the case of perception of pitch, humans hear pitch in a logarithmic or "geometric" ratio-based fashion. For instance, the "pitch distance" between 100 Hz and 150 Hz sounds the same as the "pitch distance" between 1000 Hz and 1500 Hz. The frequencies of corresponding notes in adjacent octaves differ by a factor of 2. For notes spaced equally apart to the human ear, the frequencies are related by a multiplicative factor.
Musical scales, for instance, are always based on geometric relationships for this reason. Interestingly, notation and theory about music in most cases refer to pitch intervals in an additive way, which makes sense because if the perception of pitch is logarithmic, geometric frequency relationships are perceived arithmetically.
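The equal-tempered scale illustrates this: each of the 12 semitone steps in an octave multiplies the frequency by the same factor, 2^(1/12). A short sketch, using the common A4 = 440 Hz reference:

```python
def note_frequency(semitones, f0=440.0):
    """Equal temperament: the note n semitones above reference f0 has
    frequency f0 * 2**(n / 12). Equal perceived pitch steps correspond
    to equal frequency ratios, not equal frequency differences."""
    return f0 * 2 ** (semitones / 12)

octave_up = note_frequency(12)   # one octave up doubles the frequency: 880 Hz
fifth_up = note_frequency(7)     # a perfect fifth up, roughly 659 Hz
```

Adding a fixed interval (in semitones) always multiplies the frequency by a fixed ratio, which is the additive notation over a geometric scale described above.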
In 1889, the Austrian economist Friedrich Wieser, in "Natural Value," coined the phrase marginal utility for a closely related phenomenon: the satiation of human appetite for identical increments of a good.
"Any one who has just taken a certain quantity of food of a certain kind will not immediately have the same strength of desire for another similar quantity," he wrote. "Within any single period of want every additional act of satisfaction will be estimated less highly than a preceding one obtained from a quantity of goods equal in kind and amount."
A non-Fechnerian interpretation of Weber's results
In 1890, the American psychologist William James described Fechner's writings on the subject of Weber's results as "patient whimsies" and said it would be a pity if Fechner should "compel all future students" of psychology "to plough through the difficulties, not only of his own works, but of the still drier ones written in his refutation."
James saw Weber's law as an accurate generalization about friction in the neural machinery.
"If our feelings [of weight, sight, sound, etc.] resulted from a condition of the nerve molecules which it grew ever more difficult for the stimulus to increase, our feelings would naturally grow at a slower rate than the stimulus itself. An ever larger part of the latter's work would go to overcoming the resistances, and an ever smaller part to the realization of the feeling-bringing state."
This page uses Creative Commons Licensed content from Wikipedia.