Vapnik–Chervonenkis theory (also known as VC theory) was developed during 1960–1990 by Vladimir Vapnik and Alexey Chervonenkis. The theory is a form of computational learning theory, which attempts to explain the learning process from a statistical point of view.

VC theory is related to statistical learning theory and to the theory of empirical processes. Richard M. Dudley and Vladimir Vapnik, among others, have applied VC theory to empirical processes.

VC theory covers at least four parts (as explained in The Nature of Statistical Learning Theory[1]):

  • Theory of consistency of learning processes
    • What are the (necessary and sufficient) conditions for consistency of a learning process based on the empirical risk minimization principle? (A minimal sketch of empirical risk minimization follows this list.)
  • Nonasymptotic theory of the rate of convergence of learning processes
    • How fast is the rate of convergence of the learning process?
  • Theory of controlling the generalization ability of learning processes
    • How can one control the rate of convergence (the generalization ability) of the learning process?
  • Theory of constructing learning machines
    • How can one construct algorithms that can control the generalization ability?
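
As a concrete illustration of the empirical risk minimization principle, the sketch below (in Python; the sample and the class of one-dimensional threshold classifiers are invented purely for illustration) selects from a small hypothesis class the hypothesis with the lowest average 0–1 loss on the observed sample.

```python
# Minimal sketch of empirical risk minimization (ERM).
# Hypothesis class: 1-D threshold classifiers h_t(x) = 1 if x >= t else 0.
# The sample below is invented purely for illustration.

samples = [(0.5, 0), (1.2, 0), (2.3, 1), (3.1, 1), (1.8, 0), (2.9, 1)]

def empirical_risk(threshold, data):
    """Average 0-1 loss of the threshold classifier on the sample."""
    errors = sum(1 for x, y in data if (1 if x >= threshold else 0) != y)
    return errors / len(data)

# Candidate thresholds: midpoints between consecutive sample points.
xs = sorted(x for x, _ in samples)
candidates = [(a + b) / 2 for a, b in zip(xs, xs[1:])]

# ERM picks the hypothesis with the smallest empirical risk on the sample.
best = min(candidates, key=lambda t: empirical_risk(t, samples))
print(f"ERM threshold: {best:.2f}, empirical risk: {empirical_risk(best, samples):.2f}")
```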

In addition, VC theory and VC dimension are instrumental in the theory of empirical processes, in the case of processes indexed by VC classes.
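
To make the notion of a VC class more tangible, the following sketch (again Python, with invented point sets) checks by brute force whether the class of one-dimensional threshold classifiers shatters a given finite point set; a set is shattered when every labelling of it is realized by some classifier in the class.

```python
from itertools import product

# Brute-force shattering check for the class of 1-D threshold classifiers
#   h_t(x) = 1 if x >= t else 0.
# A point set is shattered if every labelling of it is realized by some h_t.

def is_shattered(points):
    points = sorted(points)
    # Candidate thresholds: below all points, between points, above all points.
    thresholds = ([points[0] - 1.0]
                  + [(a + b) / 2 for a, b in zip(points, points[1:])]
                  + [points[-1] + 1.0])
    realizable = {tuple(1 if x >= t else 0 for x in points) for t in thresholds}
    return all(lab in realizable for lab in product([0, 1], repeat=len(points)))

print(is_shattered([0.3]))        # True: a single point can be labelled either way
print(is_shattered([0.3, 0.7]))   # False: the labelling (1, 0) is not realizable
# Hence the VC dimension of 1-D thresholds is 1.
```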

The last part of VC theory introduced a well-known learning algorithm: the support vector machine.
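
As a purely illustrative usage sketch (not part of the original article), the following code trains a support vector machine with scikit-learn, assuming that library is installed; the toy data are invented.

```python
# Minimal SVM sketch using scikit-learn (assumed to be installed).
# The toy data are invented purely for illustration.
from sklearn.svm import SVC

X = [[0.0, 0.0], [0.2, 0.1], [1.0, 1.0], [0.9, 1.1]]
y = [0, 0, 1, 1]

clf = SVC(kernel="linear", C=1.0)  # linear kernel, soft-margin parameter C
clf.fit(X, y)

print(clf.predict([[0.1, 0.0], [1.0, 0.9]]))  # expected: [0 1]
print(clf.support_vectors_)                   # training points that define the margin
```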

VC theory contains important concepts such as the VC dimension and structural risk minimization, and it is related to mathematical subjects such as statistical learning theory and the theory of empirical processes.
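
To connect the VC dimension to generalization, one commonly quoted form of the VC bound is sketched below in LaTeX; see Vapnik[1] for the exact conditions and constants. Here R(f) is the expected risk, R_emp(f) the empirical risk on n examples, h the VC dimension of the class, and 1 - η the confidence level.

```latex
% One commonly quoted form of the VC generalization bound (0-1 loss),
% stated as a sketch; see Vapnik's books for exact conditions and constants.
R(f) \;\le\; R_{\mathrm{emp}}(f)
  \;+\; \sqrt{\frac{h\left(\ln\frac{2n}{h} + 1\right) - \ln\frac{\eta}{4}}{n}}
```

Structural risk minimization then chooses among a nested sequence of hypothesis classes by minimizing the sum of the empirical risk term and the capacity term on the right-hand side.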

References

  • ^ Vapnik, Vladimir N. (2000). The Nature of Statistical Learning Theory. Springer-Verlag.
  • Vapnik, Vladimir N. (1998). Statistical Learning Theory. Wiley-Interscience.
  • See also the references in the articles on Richard M. Dudley, empirical processes, and shattering.