Plot of a Gaussian white noise signal.
In signal processing, white noise is a random signal with a constant power spectral density.^{[1]} The term is used, with this or similar meanings, in many scientific and technical disciplines, including physics, acoustic engineering, telecommunications, statistical forecasting, and many more. White noise refers to a statistical model for signals and signal sources, rather than to any specific signal.
A "white noise" image.
The term is also used for a discrete signal whose samples are regarded as a sequence of serially uncorrelated random variables with zero mean and finite variance. Depending on the context, one may also require that the samples be independent and have the same probability distribution (in other words, an i.i.d. sequence is the simplest representative of white noise). In particular, if each sample has a normal distribution with zero mean, the signal is said to be Gaussian white noise.^{[2]}
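As an illustration of this discrete-time definition, the following sketch draws i.i.d. Gaussian samples and checks the defining statistics: zero mean, finite variance, and no serial correlation. NumPy is assumed here purely for illustration; it is not part of the definition.

```python
import numpy as np

rng = np.random.default_rng(0)
n = 100_000
w = rng.standard_normal(n)  # i.i.d. N(0, 1) samples: Gaussian white noise

mean = w.mean()             # should be close to 0
var = w.var()               # should be close to 1
# sample autocorrelation at lag 1 -- should be close to zero for white noise
lag1 = np.corrcoef(w[:-1], w[1:])[0, 1]
print(mean, var, lag1)
```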
The samples of a white noise signal may be sequential in time, or arranged along one or more spatial dimensions. In digital image processing, the pixels of a white noise image are typically arranged in a rectangular grid, and are assumed to be independent random variables with uniform probability distribution over some interval. The concept can be defined also for signals spread over more complicated domains, such as a sphere or a torus.
Some "white noise" sound.
An infinite-bandwidth white noise signal is a purely theoretical construction. The bandwidth of white noise is limited in practice by the mechanism of noise generation, by the transmission medium and by finite observation capabilities. Thus, a random signal is considered "white noise" if it is observed to have a flat spectrum over the range of frequencies that is relevant to the context. For an audio signal, for example, the relevant range is the band of audible sound frequencies, between 20 and 20,000 Hz. Such a signal is heard as a hissing sound, resembling the /sh/ sound in "ash". In music and acoustics, the term "white noise" may be used for any signal that has a similar hissing sound.
White noise draws its name from white light, although light that appears white generally does not have a flat spectral power density over the visible band.
The term white noise is sometimes used in the context of phylogenetically based statistical methods to refer to a lack of phylogenetic pattern in comparative data.^{[3]} It is sometimes used in non-technical contexts, in the metaphoric sense of "random talk without meaningful content".^{[4]}^{[5]}
Contents
1 Statistical properties
2 Practical applications
2.1 Music
2.2 Electronics engineering
2.3 Acoustics
2.4 Computing
2.5 Tinnitus treatment
2.6 Work environment
3 Mathematical definitions
3.1 White noise vector
3.2 Continuous-time white noise
4 Mathematical applications
4.1 Time series analysis and regression
4.2 Random vector transformations
5 Generation
6 See also
7 References
8 External links
Statistical properties
Being uncorrelated in time does not restrict the values a signal can take. Any distribution of values is possible (although it must have zero DC component). Even a binary signal which can only take on the values −1 or 1 will be white if the sequence is statistically uncorrelated. Noise having a continuous distribution, such as a normal distribution, can of course be white.
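The binary case can be checked numerically. The sketch below (NumPy is an assumed dependency for illustration) draws a fair ±1 sequence, which is white despite being far from Gaussian: its mean is zero and its sample autocorrelation vanishes at every nonzero lag.

```python
import numpy as np

rng = np.random.default_rng(1)
n = 100_000
b = rng.choice([-1.0, 1.0], size=n)  # fair +-1 sequence: white, but not Gaussian

# uncorrelated in time: sample autocorrelation at a few nonzero lags is near zero
lags = [1, 2, 5]
acf = [np.mean(b[:-k] * b[k:]) for k in lags]
print(b.mean(), acf)
```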
It is often incorrectly assumed that Gaussian noise (i.e., noise with a Gaussian amplitude distribution — see normal distribution) necessarily refers to white noise, yet neither property implies the other. Gaussianity refers to the probability distribution with respect to the value, in this context the probability of the signal falling within any particular range of amplitudes, while the term 'white' refers to the way the signal power is distributed (i.e., independently) over time or among frequencies.
We can therefore find Gaussian white noise, but also Poisson, Cauchy, etc. white noises. Thus, the two words "Gaussian" and "white" are often both specified in mathematical models of systems. Gaussian white noise is a good approximation of many real-world situations and generates mathematically tractable models. These models are used so frequently that the term additive white Gaussian noise has a standard abbreviation: AWGN.
White noise is the generalized meansquare derivative of the Wiener process or Brownian motion.
A generalization to random elements on infinite dimensional spaces, such as random fields, is the white noise measure.
Practical applications
Music
White noise is commonly used in the production of electronic music, usually either directly or as an input for a filter to create other types of noise signal. It is used extensively in audio synthesis, typically to recreate percussive instruments such as cymbals or snare drums which have high noise content in their frequency domain.
Electronics engineering
White noise is also used to obtain the impulse response of an electrical circuit, in particular of amplifiers and other audio equipment. It is not used for testing loudspeakers as its spectrum contains too great an amount of high frequency content. Pink noise, which differs from white noise in that it has equal energy in each octave, is used for testing transducers such as loudspeakers and microphones.
Acoustics
To set up the equalization for a concert or other performance in a venue, a short burst of white or pink noise is sent through the PA system and monitored from various points in the venue so that the engineer can tell if the acoustics of the building naturally boost or cut any frequencies. The engineer can then adjust the overall equalization to ensure a balanced mix.
Computing
White noise is used as the basis of some random number generators. For example, Random.org uses a system of atmospheric antennae to generate random digit patterns from white noise.
Tinnitus treatment
White noise is a common synthetic noise source used for sound masking by a tinnitus masker.^{[6]} White noise machines and other white noise sources are sold as privacy enhancers and sleep aids and to mask tinnitus.^{[7]} Alternatively, the use of an FM radio tuned to unused frequencies ("static") is a simpler and more cost-effective source of white noise.^{[8]} However, white noise generated from a common commercial radio receiver tuned to an unused frequency is extremely vulnerable to being contaminated with spurious signals, such as adjacent radio stations, harmonics from non-adjacent radio stations, electrical equipment in the vicinity of the receiving antenna causing interference, or even atmospheric events such as solar flares and especially lightning.
Work environment
The effects of white noise upon cognitive function are mixed. Recently, a small study found that white noise background stimulation improves cognitive functioning among secondary students with attention deficit hyperactivity disorder (ADHD), while decreasing performance of non-ADHD students.^{[9]}^{[10]} Other work indicates it is effective in improving the mood and performance of workers by masking background office noise,^{[11]} but decreases cognitive performance in complex card sorting tasks.^{[12]}
Mathematical definitions
White noise vector
A random vector (that is, a partially indeterminate process that produces vectors of real numbers) is said to be a white noise vector or white random vector if its components each have a probability distribution with zero mean and finite variance, and are statistically independent: that is, their joint probability distribution must be the product of the distributions of the individual components.^{[13]}
A necessary (but, in general, not sufficient) condition for statistical independence of two variables is that they be statistically uncorrelated; that is, their covariance is zero. Therefore, the covariance matrix R of the components of a white noise vector w with n elements must be an n by n diagonal matrix, where each diagonal element R_{ii} is the variance of component w_{i}; and the correlation matrix must be the n by n identity matrix.
In particular, if in addition to being independent every variable in w also has a normal distribution with zero mean and the same variance \sigma^2, w is said to be a Gaussian white noise vector. In that case, the joint distribution of w is a multivariate normal distribution; the independence between the variables then implies that the distribution has spherical symmetry in n-dimensional space. Therefore, any orthogonal transformation of the vector will result in a Gaussian white random vector. In particular, under most types of discrete Fourier transform, such as the FFT and the Hartley transform, the transform W of w will be a Gaussian white noise vector, too; that is, its n Fourier coefficients will be independent Gaussian variables with zero mean and the same variance \sigma^2.
The power spectrum P of a random vector w can be defined as the expected value of the squared modulus of each coefficient of its Fourier transform W, that is, P_{i} = \mathrm{E}(|W_{i}|^{2}). Under that definition, a Gaussian white noise vector will have a perfectly flat power spectrum, with P_{i} = \sigma^2 for all i.
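The flatness of the expected power spectrum can be demonstrated numerically. In the sketch below (NumPy assumed for illustration), the DFT is scaled by 1/\sqrt{n} so that \mathrm{E}(|W_i|^2) = \sigma^2; averaging |W_i|^2 over many independent white noise vectors then yields a spectrum that is flat at \sigma^2 for every i.

```python
import numpy as np

rng = np.random.default_rng(2)
n, trials = 64, 20_000
sigma2 = 1.0

# many independent Gaussian white noise vectors, one per row
w = rng.normal(scale=np.sqrt(sigma2), size=(trials, n))
# unitary normalization (divide by sqrt(n)) keeps E|W_i|^2 equal to sigma^2
W = np.fft.fft(w, axis=1) / np.sqrt(n)
P = np.mean(np.abs(W) ** 2, axis=0)  # empirical power spectrum
print(P.min(), P.max())              # both close to sigma^2 = 1
```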
If w is a white random vector, but not a Gaussian one, its Fourier coefficients W_{i} will not be completely independent of each other; although for large n and common probability distributions the dependencies are very subtle, and their pairwise correlations can be assumed to be zero.
Often the weaker condition "statistically uncorrelated" is used in the definition of white noise, instead of "statistically independent". However some of the commonly expected properties of white noise (such as flat power spectrum) may not hold for this weaker version. Under this assumption, the stricter version can be referred to explicitly as independent white noise vector.^{[14]}^{:p.60} Other authors use strongly white and weakly white instead.^{[15]}
An example of a random vector that is "Gaussian white noise" in the weak but not in the strong sense is x=[x_{1},x_{2}] where x_{1} is a normal random variable with zero mean, and x_{2} is equal to +x_{1} or to −x_{1}, with equal probability. These two variables are uncorrelated and individually normally distributed, but they are not jointly normally distributed and are not independent. If x is rotated by 45 degrees, its two components will still be uncorrelated, but their distribution will no longer be normal.
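This example can be reproduced numerically (NumPy assumed for illustration). After the 45-degree rotation, exactly one of the two components of each sample is zero, which makes the non-normality of the rotated components obvious even though the original pair is uncorrelated with standard normal marginals.

```python
import numpy as np

rng = np.random.default_rng(3)
n = 100_000
x1 = rng.standard_normal(n)
s = rng.choice([-1.0, 1.0], size=n)
x2 = s * x1  # equal to +x1 or -x1 with equal probability

# uncorrelated, each marginal N(0,1) -- yet clearly dependent
corr = np.corrcoef(x1, x2)[0, 1]

# rotate by 45 degrees: one rotated component is always exactly zero,
# so the rotated components cannot be normally distributed
y1 = (x1 + x2) / np.sqrt(2)
y2 = (x1 - x2) / np.sqrt(2)
frac_zero = np.mean((y1 == 0) | (y2 == 0))
print(corr, frac_zero)
```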
In some situations one may relax the definition by allowing each component of a white random vector w to have a nonzero expected value \mu. In image processing especially, where samples are typically restricted to positive values, one often takes \mu to be one half of the maximum sample value. In that case, the Fourier coefficient W_{0} corresponding to the zero-frequency component (essentially, the average of the w_i) will also have a nonzero expected value \mu\sqrt{n}; and the power spectrum P will be flat only over the nonzero frequencies.
Continuous-time white noise
In order to define the notion of "white noise" in the theory of continuous-time signals, one must replace the concept of a "random vector" by a continuous-time random signal; that is, a random process that generates a function w of a real-valued parameter t.
Such a process is said to be white noise in the strongest sense if the value w(t) for any time t is a random variable that is statistically independent of its entire history before t. A weaker definition requires independence only between the values w(t_1) and w(t_2) at every pair of distinct times t_1 and t_2. An even weaker definition requires only that such pairs w(t_1) and w(t_2) be uncorrelated.^{[16]} As in the discrete case, some authors adopt the weaker definition for "white noise", and use the qualifier independent to refer to either of the stronger definitions. Others use weakly white and strongly white to distinguish between them.
However, a precise definition of these concepts is not trivial, because some quantities that are finite sums in the finite discrete case must be replaced by integrals that may not converge. Indeed, the set of all possible instances of a signal w is no longer a finite-dimensional space \mathbb{R}^n, but an infinite-dimensional function space. Moreover, by any definition a white noise signal w would have to be essentially discontinuous at every point; therefore even the simplest operations on w, like integration over a finite interval, require advanced mathematical machinery.
Some authors require each value w(t) to be a real-valued random variable with some finite variance \sigma^2. Then the covariance \mathrm{E}(w(t_1)\cdot w(t_2)) between the values at two times t_1 and t_2 is well-defined: it is zero if the times are distinct, and \sigma^2 if they are equal. However, by this definition, the integral

W_{[a,a+r]} = \int_a^{a+r} w(t)\, dt

over any interval with positive width r would be zero. This property would render the concept inadequate as a model of physical "white noise" signals.
Therefore, most authors define the signal w indirectly by specifying nonzero values for the integrals of w(t) and w(t)^2 over any interval [a,a+r], as a function of its width r. In this approach, however, the value of w(t) at an isolated time cannot be defined as a real-valued random variable. Also the covariance \mathrm{E}(w(t_1)\cdot w(t_2)) becomes infinite when t_1=t_2; and the autocorrelation function \mathrm{R}(t_1,t_2) must be defined as N\,\delta(t_1-t_2), where N is some real constant and \delta is Dirac's "function".
In this approach, one usually specifies that the integral W_I of w(t) over an interval I=[a,b] is a real random variable with normal distribution, zero mean, and variance (b-a)\sigma^2; and also that the covariance \mathrm{E}(W_I\cdot W_J) of the integrals W_I, W_J is r\sigma^2, where r is the width of the intersection I\cap J of the two intervals I,J. This model is called a Gaussian white noise signal (or process).
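The variance scaling (b-a)\sigma^2 can be checked with a discrete approximation (NumPy assumed; the step size and interval width below are arbitrary choices for the sketch): the integral over an interval is approximated by a sum of independent Gaussian increments, each with variance \sigma^2\,dt.

```python
import numpy as np

rng = np.random.default_rng(4)
sigma2 = 1.0
dt = 1e-3        # discretization step (illustrative choice)
width = 0.5      # interval width r = b - a (illustrative choice)
trials = 50_000
steps = int(width / dt)

# discrete approximation: the integral of w(t) over [a, a+r] is a sum of
# independent increments, each distributed N(0, sigma^2 * dt)
W = rng.normal(scale=np.sqrt(sigma2 * dt), size=(trials, steps)).sum(axis=1)
print(W.mean(), W.var())  # mean near 0, variance near r * sigma^2 = 0.5
```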
Mathematical applications
Time series analysis and regression
In statistics and econometrics one often assumes that an observed series of data values is the sum of a series of values generated by a deterministic linear process, depending on certain independent (explanatory) variables, and on a series of random noise values. Then regression analysis is used to infer the parameters of the model process from the observed data, e.g. by ordinary least squares, and to test the null hypothesis that each of the parameters is zero against the alternative hypothesis that it is nonzero. Hypothesis testing typically assumes that the noise values are mutually uncorrelated with zero mean and the same Gaussian probability distribution — in other words, that the noise is white. If there is nonzero correlation between the noise values underlying different observations then the estimated model parameters are still unbiased, but estimates of their uncertainties (such as confidence intervals) will be biased (not accurate on average). This is also true if the noise is heteroskedastic — that is, if it has different variances for different data points.
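A minimal sketch of this setup (NumPy assumed; the model and coefficients below are invented for illustration): data are generated as a linear function of an explanatory variable plus white Gaussian noise, and ordinary least squares recovers the parameters.

```python
import numpy as np

rng = np.random.default_rng(5)
n = 10_000
x = rng.uniform(0, 10, size=n)
noise = rng.normal(scale=2.0, size=n)   # white Gaussian noise term
y = 1.5 + 0.8 * x + noise               # true intercept 1.5, slope 0.8 (invented)

# ordinary least squares on the design matrix [1, x]
X = np.column_stack([np.ones(n), x])
beta, *_ = np.linalg.lstsq(X, y, rcond=None)
print(beta)  # estimates close to the true parameters
```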
Alternatively, in the subset of regression analysis known as time series analysis there are often no explanatory variables other than the past values of the variable being modeled (the dependent variable). In this case the noise process is often modeled as a moving average process, in which the current value of the dependent variable depends on current and past values of a sequential white noise process.
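A first-order moving average process built from white noise can be sketched as follows (NumPy assumed; the coefficient \theta = 0.6 is an illustrative choice). For an MA(1) process x_t = \varepsilon_t + \theta\,\varepsilon_{t-1} with unit-variance white noise, the autocovariances are 1+\theta^2 at lag 0, \theta at lag 1, and zero beyond.

```python
import numpy as np

rng = np.random.default_rng(6)
n = 200_000
theta = 0.6
eps = rng.standard_normal(n + 1)   # underlying white noise
x = eps[1:] + theta * eps[:-1]     # MA(1): x_t = eps_t + theta * eps_{t-1}

# empirical autocovariances: expect 1 + theta^2, theta, and 0
m = x.mean()
g0 = x.var()
g1 = np.mean((x[:-1] - m) * (x[1:] - m))
g2 = np.mean((x[:-2] - m) * (x[2:] - m))
print(g0, g1, g2)
```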
Random vector transformations
These two ideas are crucial in applications such as channel estimation and channel equalization in communications and audio. These concepts are also used in data compression. In particular, by a suitable linear transformation (a coloring transformation), a white random vector can be used to produce a "nonwhite" random vector (that is, a list of random variables) whose elements have a prescribed covariance matrix. Conversely, a random vector with known covariance matrix can be transformed into a white random vector by a suitable whitening transformation.
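One common way to realize this coloring/whitening pair is a Cholesky factorization of the target covariance matrix (the matrix C below is an illustrative choice; NumPy is assumed for the sketch). Multiplying white vectors by the Cholesky factor produces the prescribed covariance, and multiplying by its inverse undoes it.

```python
import numpy as np

rng = np.random.default_rng(7)
n, trials = 3, 200_000

# target covariance for the "colored" vector (illustrative, positive definite)
C = np.array([[2.0, 0.6, 0.2],
              [0.6, 1.0, 0.3],
              [0.2, 0.3, 0.5]])
L = np.linalg.cholesky(C)              # coloring transformation: C = L L^T

w = rng.standard_normal((trials, n))   # white random vectors, one per row
x = w @ L.T                            # colored: cov(x) is approximately C

# whitening: apply the inverse of the coloring transformation
w_back = x @ np.linalg.inv(L).T
cov_x = np.cov(x, rowvar=False)
cov_w = np.cov(w_back, rowvar=False)
print(cov_x, cov_w)
```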
Generation
White noise may be generated digitally with a digital signal processor, microprocessor, or microcontroller. Generating white noise typically entails feeding an appropriate stream of random numbers to a digital-to-analog converter. The quality of the white noise will depend on the quality of the algorithm used.^{[17]}
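A minimal sketch of the digital side of this process (NumPy assumed; the sample rate and 16-bit range are illustrative conventions, not requirements): uniform random samples are scaled to the signed 16-bit integer range typically expected by a DAC or a WAV file.

```python
import numpy as np

rng = np.random.default_rng(8)
fs = 44_100                   # sample rate in Hz (illustrative choice)
duration = 1.0                # seconds of noise to generate
n = int(fs * duration)

# uniform white noise in [-1, 1], scaled to the signed 16-bit PCM range
samples = rng.uniform(-1.0, 1.0, size=n)
pcm = (samples * 32767).astype(np.int16)
print(pcm.dtype, pcm.size)
```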
See also
References

^ Carter, Bruce; Mancini, Ron (2009). Op Amps for Everyone. Texas Instruments. pp. 10–11.

^ Diebold, Frank (2007). Elements of Forecasting (Fourth ed.).

^ Fusco, G.; Garland, T., Jr.; Hunt, G.; Hughes, N. C. (2011). "Developmental trait evolution in trilobites". Evolution 66: 314–329.

^ Claire Shipman (2005), Good Morning America: "The political rhetoric on Social Security is white noise." Said on ABC's Good Morning America TV show, January 11, 2005.

^ Don DeLillo (1985), White Noise

^ Jastreboff, P. J. (2000). "Tinnitus Habituation Therapy (THT) and Tinnitus Retraining Therapy (TRT)". Tinnitus Handbook. San Diego: Singular. pp. 357–376.

^ López, H. H.; Bracha, A. S.; Bracha, H. S. (September 2002). "Evidence based complementary intervention for insomnia". Hawaii Med J 61 (9): 192, 213.

^ Noell, Courtney A.; William L. Meyerhoff (February 2003). "Tinnitus. Diagnosis and treatment of this elusive symptom". Geriatrics 58 (2): 28–34.

^ Söderlund, Göran; Sverker Sikström; Jan Loftesnes; Edmund Sonuga-Barke (2010). "The effects of background white noise on memory performance in inattentive school children". Behavioral and Brain Functions 6 (1): 55.

^ Söderlund, Göran; Sverker Sikström; Andrew Smart (2007). "Listen to the noise: Noise is beneficial for cognitive performance in ADHD". Journal of Child Psychology and Psychiatry 48 (8): 840–847.

^ Loewen, Laura J.; Peter Suedfeld (1992-05-01). "Cognitive and Arousal Effects of Masking Office Noise". Environment and Behavior 24 (3): 381–395.

^ Baker, Mary Anne; Dennis H. Holding (July 1993). "The effects of noise and speech on cognitive task performance". Journal of General Psychology 120 (3): 339–355.

^ Jeffrey A. Fessler (1998), On Transformations of Random Vectors. Technical report 314, Dept. of Electrical Engineering and Computer Science, Univ. of Michigan.

^ Eric Zivot and Jiahui Wang (2006), Modeling Financial Time Series with S-PLUS. Second Edition.

^ Francis X. Diebold (2007), Elements of Forecasting, 4th edition.

^ White noise process. By Econterms via About.com. Accessed on 2013-02-12.

^ Matt Donadio. "How to Generate White Gaussian Noise". Retrieved 2012-09-19.
External links

Meaning of a White-Noise Process: a "proper" definition of the term white noise