# D'

*The title of this article should be d'. The initial letter is capitalized due to technical restrictions.*

The sensitivity index or d' (pronounced 'dee-prime') is a statistic used in signal detection theory. It measures the separation between the means of the signal and noise distributions, in units of the standard deviation of the noise distribution (more generally, the root mean square of the two standard deviations when they differ). For normally distributed signal and noise with means and standard deviations $\mu_S$, $\sigma_S$ and $\mu_N$, $\sigma_N$, respectively, d' is defined as:

$d' = \frac{\mu_S - \mu_N}{\sqrt{\frac{1}{2}(\sigma_S^2 + \sigma_N^2)}}$[1]
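When $\sigma_S = \sigma_N = \sigma$, the denominator reduces to $\sigma$, recovering the equal-variance form $d' = (\mu_S - \mu_N)/\sigma$. As a minimal sketch of this computation in Python (the function name and example values are illustrative, not from the source):

```python
import math

def d_prime(mu_s, sigma_s, mu_n, sigma_n):
    """d' from the means and standard deviations of the
    signal and noise distributions (RMS-SD denominator)."""
    return (mu_s - mu_n) / math.sqrt(0.5 * (sigma_s**2 + sigma_n**2))

# Equal-variance example: signal N(1.5, 1), noise N(0, 1) -> d' = 1.5
print(d_prime(1.5, 1.0, 0.0, 1.0))
```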

An estimate of d' can also be obtained from measurements of the hit rate and the false-alarm rate. It is calculated as:

$d' = Z(\text{hit rate}) - Z(\text{false alarm rate})$,[2]

where the function $Z(p)$, $p \in [0,1]$, is the inverse of the cumulative distribution function of the standard normal (Gaussian) distribution.
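In practice, $Z$ is the percent-point function (inverse CDF) of the standard normal distribution, available in SciPy as scipy.stats.norm.ppf. A minimal sketch of the estimator (the function name and example rates are illustrative):

```python
from scipy.stats import norm

def d_prime_from_rates(hit_rate, fa_rate):
    """Estimate d' as Z(hit rate) - Z(false-alarm rate),
    where Z is the inverse CDF of the standard normal."""
    # Note: Z diverges at rates of exactly 0 or 1; applied work
    # typically corrects such extreme rates before this step.
    return norm.ppf(hit_rate) - norm.ppf(fa_rate)

# Example: 80% hits, 20% false alarms -> d' ~ 1.68
print(d_prime_from_rates(0.80, 0.20))
```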

A higher d' indicates that the signal can be more readily detected.