

In non-scientific parlance, the word random expresses a lack of purpose, cause, order, or predictability.

A random process is a repeating process whose outcomes follow no describable deterministic pattern, but follow a probability distribution.

The term randomness is often used in statistics to signify well-defined statistical properties, such as lack of bias or correlation. Random is different from arbitrary: to say that a variable is random means that the variable follows a probability distribution, whereas arbitrary implies that there is no such determinable probability distribution for the variable.

Randomness has an important place in science and philosophy.

History

Humankind has been concerned with random physical processes since prehistoric times. Examples are divination (cleromancy, reading messages in random patterns) and gambling.

Despite the prevalence of gambling in all times and cultures, for a long time there was little Western inquiry into the subject. Though Gerolamo Cardano and Galileo wrote about games of chance, the first mathematical treatments were given by Blaise Pascal, Pierre de Fermat and Christiaan Huygens. The classical version of probability theory that they developed proceeds from the assumption that outcomes of random processes are equally likely; thus they were among the first to give a definition of randomness in statistical terms. The concept of statistical randomness was later developed into the concept of information entropy in information theory.

In the 1960s Gregory Chaitin, Andrey Kolmogorov and Ray Solomonoff introduced the notion of algorithmic randomness, in which the randomness of a sequence depends on whether it is possible to compress it.

Randomness in science

Many scientific fields are concerned with randomness:

In the physical sciences

In the 19th century scientists used the idea of random motions of molecules in the development of statistical mechanics in order to explain phenomena in thermodynamics and the properties of gases.

According to some standard interpretations of quantum mechanics, microscopic phenomena are objectively random. That is, in an experiment where all causally relevant parameters are controlled, there will still be some aspects of the outcome which vary randomly. An example of such an experiment is placing a single unstable atom in a controlled environment; it cannot be predicted how long it will take for the atom to decay; only the probability of decay within a given time can be calculated. Thus quantum mechanics does not specify the outcome of individual experiments but only the probabilities. Hidden variable theories attempt to escape the view that nature contains irreducible randomness: such theories posit that in the processes that appear random, unobservable (hidden) properties with a certain statistical distribution are somehow at work behind the scenes, determining the outcome in each case.
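As a rough numerical sketch of what "only the probability" means here, the chance that a single atom has decayed by time t follows P(t) = 1 - exp(-λt), where λ is derived from the half-life. The snippet below assumes a purely hypothetical half-life of 10 seconds; it computes that probability and then simulates many atoms to show the statistical regularity that emerges from individually unpredictable decays.

    import math
    import random

    # Hypothetical half-life for illustration only (seconds); not any real isotope's value.
    HALF_LIFE = 10.0
    DECAY_CONSTANT = math.log(2) / HALF_LIFE  # the lambda in the exponential-decay law

    def decay_probability(t):
        """Probability that a single atom has decayed by time t: 1 - exp(-lambda * t)."""
        return 1.0 - math.exp(-DECAY_CONSTANT * t)

    # Quantum mechanics supplies only this probability, never the actual decay time.
    print(decay_probability(10.0))  # ~0.5 after one half-life

    # Simulating many atoms shows the predicted fraction emerging from individually random decays.
    atoms = 100_000
    decayed = sum(random.random() < decay_probability(10.0) for _ in range(atoms))
    print(decayed / atoms)  # close to 0.5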

In biology

The theory of evolution ascribes the observed diversity of life to random mutations which are retained in the gene pool due to the improved chances for survival and reproduction that some mutated genes confer on individuals who possess them.

The characteristics of an organism arise to some extent deterministically (e.g., under the influence of genes and the environment) and to some extent randomly. For example, genes and exposure to light control only the density of freckles that appear on a person's skin; the exact location of individual freckles appears to be determined randomly.

In mathematics

The mathematical theory of probability arose from attempts to formulate mathematical descriptions of chance events, originally in the context of gambling but soon in connection with situations of interest in physics. Statistics is used to infer the underlying probability distribution of a collection of empirical observations. For the purposes of simulation, it is necessary to have a large supply of random numbers, or a means to generate them on demand.
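As a minimal sketch of that relationship (the distribution and sample size below are arbitrary choices for illustration), random draws from an assumed normal distribution can be fed back into statistics, which recovers the underlying parameters from the empirical sample:

    import random
    import statistics

    # Assume, purely for illustration, that observations come from a normal
    # distribution whose parameters are unknown to the analyst.
    true_mean, true_sd = 5.0, 2.0
    observations = [random.gauss(true_mean, true_sd) for _ in range(10_000)]

    # Statistics infers the underlying distribution's parameters from the sample.
    print(statistics.mean(observations))   # close to 5.0
    print(statistics.stdev(observations))  # close to 2.0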

Algorithmic information theory studies, among other topics, what constitutes a random sequence. The central idea is that a string of bits is random if and only if it is shorter than any computer program that can produce that string (Chaitin-Kolmogorov randomness); this means, roughly, that random strings are those that cannot be compressed. Pioneers of this field include Andrey Kolmogorov, Ray Solomonoff, Gregory Chaitin and Per Martin-Löf, among others.
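True Kolmogorov randomness is uncomputable, but a general-purpose compressor gives a rough, informal feel for the idea: a highly patterned string shrinks dramatically, while one drawn from a random source barely shrinks at all. The sketch below uses Python's zlib purely as such a proxy.

    import os
    import zlib

    patterned = b"01" * 5000        # 10,000 bytes with an obvious short description
    random_ish = os.urandom(10000)  # 10,000 bytes from the operating system's entropy source

    # The patterned string compresses to a tiny fraction of its length;
    # the random-looking one stays close to (or slightly above) 10,000 bytes.
    print(len(zlib.compress(patterned)))
    print(len(zlib.compress(random_ish)))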

In communication theory

In communication theory, the random component of a signal's variation is called noise, in contrast to the component that is causally attributable to the source (the signal proper).

In finance

The random walk hypothesis holds that asset prices in an organized market evolve at random.
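A toy simulation of the hypothesis, assuming (purely for illustration) that daily log-returns are independent draws with zero drift and 1% volatility:

    import math
    import random

    # Hypothetical parameters: no average drift, 1% daily volatility.
    price = 100.0
    for day in range(250):  # roughly one trading year
        daily_return = random.gauss(0.0, 0.01)  # random shock, independent of the past
        price *= math.exp(daily_return)

    print(price)  # one possible path; rerunning produces a different, unpredictable one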

Randomness versus unpredictability

Randomness is an objective property. Nevertheless, what appears random to one observer may not appear random to another. Consider two observers of a sequence of bits, only one of whom has the cryptographic key needed to turn the sequence into a readable message. The message is not random, but it is unpredictable to the observer without the key.
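A small sketch of that situation using a one-time-pad-style XOR cipher (a toy construction, not production cryptography): without the key the ciphertext bytes look random, while the key holder recovers the message exactly.

    import os

    message = b"the message is not random"
    key = os.urandom(len(message))  # secret random key, known to only one of the observers

    ciphertext = bytes(m ^ k for m, k in zip(message, key))
    print(ciphertext)  # appears random to the observer without the key

    recovered = bytes(c ^ k for c, k in zip(ciphertext, key))
    print(recovered)   # b'the message is not random' for the key holder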

One of the intriguing aspects of random processes is that it is hard to know whether the process is truly random. The observer can always suspect that there is some "key" that unlocks the message. This is one of the foundations of superstition.

Under the cosmological hypothesis of determinism there is no randomness in the universe, only unpredictability.

Some mathematically defined sequences exhibit some of the same characteristics as random sequences, but because they are generated by a describable mechanism they are called pseudo-random. To an observer who does not know the mechanism, the pseudo-random sequence is unpredictable.
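For example, a linear congruential generator (sketched below with the classic "minimal standard" parameters) produces output that passes casual inspection as random, yet anyone who knows the formula and the seed can reproduce the entire sequence.

    def lcg(seed, n, a=16807, m=2**31 - 1):
        """Linear congruential generator: x_{k+1} = (a * x_k) mod m."""
        x = seed
        values = []
        for _ in range(n):
            x = (a * x) % m
            values.append(x / m)  # scale into [0, 1)
        return values

    print(lcg(seed=42, n=5))  # looks haphazard...
    print(lcg(seed=42, n=5))  # ...but the same seed always yields the identical sequence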

Chaotic systems are unpredictable in practice due to their extreme dependence on initial conditions. Whether or not they are unpredictable in terms of computability theory is a subject of current research. At least in some disciplines of computability theory the notion of randomness turns out to be identified with computational unpredictability.

It is important to remember that phenomena that are random in some respects may be precisely characterizable in other respects. Quantum mechanics allows a very precise calculation of the half-lives of atoms even though the process of atomic decay is a random one. Ohm's law and the kinetic theory of gases are precise characterizations of macroscopic phenomena which are random on the microscopic level.

Randomness and religion

Some theologians have attempted to resolve the apparent contradiction between an omniscient deity, or a first cause, and free will using randomness. Discordianists have a strong belief in randomness and unpredictability.

Applications and use of randomness

Main article: Applications of randomness

Random numbers were first investigated in the context of gambling, and many randomizing devices, such as dice, shuffled playing cards, and roulette wheels, were first developed for use in gambling. The ability to produce random numbers fairly is vital to electronic gambling, and the methods used to create them are usually regulated by government Gaming Control Boards.

Random numbers are also used for non-gambling purposes, both where their use is mathematically important, such as sampling for opinion polls, and in situations where "fairness" is approximated by randomization, such as selecting jurors and military draft lotteries. Computational solutions for some types of problems use random numbers extensively, such as in the Monte Carlo method and in genetic algorithms.
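As an example of the Monte Carlo method mentioned above, the value of pi can be estimated by counting how many uniformly random points in the unit square fall inside the quarter circle (a standard textbook sketch):

    import random

    def estimate_pi(samples=1_000_000):
        """Fraction of random points in the unit square that land inside the quarter circle, times 4."""
        inside = 0
        for _ in range(samples):
            x, y = random.random(), random.random()
            if x * x + y * y <= 1.0:
                inside += 1
        return 4.0 * inside / samples

    print(estimate_pi())  # approximately 3.14159, improving (slowly) with more samples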

Generating randomness

Main article: Random number generation

In his book A New Kind of Science, Stephen Wolfram describes three mechanisms responsible for (apparently) random behavior in systems:

  1. Randomness coming from the environment (for example, Brownian motion, but also hardware random number generators).
  2. Randomness coming from the initial conditions. This aspect is studied by chaos theory, and is observed in systems whose behavior is very sensitive to small variations in initial conditions (such as pachinko machines or dice); a small numerical illustration of this sensitivity follows the list.
  3. Randomness intrinsically generated by the system. This is also called pseudorandomness, and is the kind used in pseudo-random number generators. There are many algorithms (based on arithmetic or cellular automata) for generating pseudorandom numbers. The behavior of the system can be determined by knowing the seed state and the algorithm used. This method is quicker than getting "true" randomness from the environment.
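The following sketch illustrates the second mechanism with the logistic map x -> 4x(1 - x), a standard chaotic system: two starting values differing by one part in a billion end up completely unrelated after a few dozen iterations, so the system is deterministic yet unpredictable in practice.

    def logistic_trajectory(x0, steps, r=4.0):
        """Iterate the logistic map x -> r * x * (1 - x), which is chaotic at r = 4."""
        x = x0
        for _ in range(steps):
            x = r * x * (1.0 - x)
        return x

    # Two initial conditions that differ by one part in a billion...
    print(logistic_trajectory(0.200000000, 50))
    print(logistic_trajectory(0.200000001, 50))
    # ...arrive at entirely unrelated values after 50 steps.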

The many applications of randomness have led to many different methods for generating random data. These methods may vary as to how unpredictable or statistically random they are, and how quickly they can generate random numbers.

Before the advent of computational random number generators, generating large amounts of sufficiently random numbers (important in statistics) required a lot of work. Results would sometimes be collected and distributed as random number tables.



Misconceptions/logical fallacies

Popular perceptions of randomness are frequently wrong, based on logical fallacies. Following is an attempt to identify the source of such fallacies and correct the logical errors. For a more detailed discussion, see Gambler's fallacy.

A number is "due"

This argument says that "since all numbers will eventually come up in a random selection, those that have not come up yet are 'due' and thus more likely to come up soon". This logic is only correct if applied to a system where numbers that come up are removed from the system, such as when playing cards are drawn and not returned to the deck. It's true, for example, that once a jack is removed from the deck, the next draw is less likely to be a jack and more likely to be some other card. However, if the jack is returned to the deck, and the deck is thoroughly reshuffled, there is an equal chance of drawing a jack or any other card the next time. The same truth applies to any other case where objects are selected independently and nothing is removed from the system after each event, such as a die roll, coin toss or most lottery number selection schemes.
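A quick simulation (fair coin flips, so an idealized assumption) makes the independence point concrete: even immediately after three heads in a row, the next flip is still heads about half the time.

    import random

    flips = [random.choice("HT") for _ in range(1_000_000)]

    # Collect every flip that immediately follows a run of three heads.
    after_streak = [flips[i + 3] for i in range(len(flips) - 3)
                    if flips[i] == flips[i + 1] == flips[i + 2] == "H"]

    print(after_streak.count("H") / len(after_streak))  # about 0.5: heads is never "due"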

A number is "cursed"

This argument is almost the reverse of the above, and says that numbers which have come up less often in the past will continue to come up less often in the future. A similar "number is 'blessed'" argument might be made saying that numbers which have come up more often in the past are likely to do so in the future. This logic is only valid if the roll is somehow biased and results don't have equal probabilities - for example, with weighted dice. If we know for certain that the roll is fair, then previous events have no influence over future events.

Note that in nature, unexpected or uncertain events rarely occur with perfectly equal frequencies, so learning which events are likely to have higher probability by observing outcomes makes sense. What is fallacious is to apply this logic to systems which are specially designed so that all outcomes are equally likely - such as dice, roulette wheels, and so on.
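Conversely, when outcomes need not be equally likely, observed frequencies are a reasonable estimate of the underlying probabilities. The sketch below uses a hypothetically weighted die (the weights are made up for illustration):

    import random
    from collections import Counter

    # Hypothetical weighted die: the six faces are deliberately not equally likely.
    faces = [1, 2, 3, 4, 5, 6]
    weights = [0.10, 0.10, 0.10, 0.10, 0.10, 0.50]  # face 6 is favoured

    rolls = random.choices(faces, weights=weights, k=100_000)
    counts = Counter(rolls)

    # Observed frequencies approximate the true, unequal probabilities.
    for face in faces:
        print(face, counts[face] / len(rolls))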

Books

  • Randomness by Deborah J. Bennett. Harvard University Press, 1998. ISBN 0-674-10745-4
  • Random Measures, 4th ed. by Olav Kallenberg. Academic Press, New York, London; Akademie-Verlag, Berlin (1986). MR0854102
  • The Art of Computer Programming. Vol. 2: Seminumerical Algorithms, 3rd ed. by Donald E. Knuth, Reading, MA: Addison-Wesley, 1997. ISBN 0-201-89684-2
  • Fooled by Randomness, 2nd ed. by Nassim Nicholas Taleb. Thomson Texere, 2004. ISBN 1-58799-190-X
  • Exploring Randomness by Gregory Chaitin. Springer-Verlag London, 2001. ISBN 1-85233-417-7


External links

  • Look up randomness in Wiktionary, the free dictionary.


This page uses Creative Commons Licensed content from Wikipedia.