3.6. Spectral Estimation
Energy and power signals
The amplitude of a harmonic wave is related to its energy (denoted by \(E\)) and power (denoted by \(P\)). In this chapter we introduce the Power Spectral Density (PSD), which describes how the signal power is distributed over frequency. For later analyses it is relevant to know how much power is contained at which frequencies.
The PSD is used with many different observed signals (e.g. displacement in [m] and acceleration in [m/s²]), while maintaining the notions of power and energy. The classical development, however, originates from electrical engineering, and starts from the fact that an electrical sensor delivers a voltage, which hopefully is proportional to the observed physical phenomenon we are interested in.
We’ll start with a flashback to high-school physics and use Ohm’s law. Suppose \(u(t)\) is the voltage across a resistor \(R\), producing a time-varying current \(i(t)\).
The instantaneous power is defined as \(p(t)=u(t)i(t)\), and \(u(t)=i(t)R\), so \(p(t)=i^2(t)R=\frac{u^2(t)}{R}\). With \(R=1\,\Omega\), the instantaneous power, in Watt, is given as:

$$p(t)=u^2(t)$$
For a 1 Ohm resistor, the power \(P\), in Watt [W], equals the square of the voltage \(u(t)\) [V²]. The total energy \(E\), in Joule [J], is obtained by integrating the power over time (power equals energy per unit time, [W] = [J/s]).
Integrating over \(|t|\leq T\), we define the total energy and the average power as:

$$E=\lim_{T\rightarrow\infty}\int_{-T}^{T}u^2(t)\,dt\qquad\textrm{and}\qquad P=\lim_{T\rightarrow\infty}\frac{1}{2T}\int_{-T}^{T}u^2(t)\,dt$$
For a signal \(x(t)\), the total energy, normalized to a unit resistance, is defined similarly as:

$$E=\lim_{T\rightarrow\infty}\int_{-T}^{T}|x(t)|^2\,dt$$

and the average power, normalized to a unit resistance, as:

$$P=\lim_{T\rightarrow\infty}\frac{1}{2T}\int_{-T}^{T}|x(t)|^2\,dt$$
For real signals, the modulus signs can be removed from the two equations above.
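As a quick numerical illustration, the following is a minimal sketch of how these two quantities are approximated for a sampled, finite-duration record; the synthetic signal, sampling interval and variable names below are assumptions for illustration only.

```python
import numpy as np

# Minimal sketch, assuming a synthetic sampled "voltage" signal x_n with spacing dt.
dt = 0.01                              # sampling interval [s]
t = np.arange(0, 10, dt)               # 10-second record
x = 3.0 * np.cos(2 * np.pi * 2 * t)    # example signal [V], amplitude A = 3

E = np.sum(x**2) * dt                  # energy over the record: sum of |x_n|^2 * dt  [J] (R = 1 Ohm)
P = np.mean(x**2)                      # average power over the record                [W] (R = 1 Ohm)
print(E, P)                            # P comes out as A^2 / 2 = 4.5 W
```

For a finite record, the limits in the definitions above are simply replaced by the record length; the average power of the cosine indeed comes out as \(A^2/2\).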
Parseval’s theorem
Definition
The average (normalized) power of a signal in the time domain is the same as the average (normalized) power of the corresponding signal in the frequency domain; and similarly for the total (normalized) energy.
MUDE Exam Information
This derivation is provided for additional insight and will not be part of the exam.
Fourier Series
The average power of a periodic waveform \(x(t)\) with period \(T_0\) can be written as:

$$P=\frac{1}{T_0}\int_{T_0}|x(t)|^2\,dt$$

Since \(|x(t)|^2=x(t)x^*(t)\), we can replace \(x^*(t)\) with its complex exponential Fourier series:

$$P=\frac{1}{T_0}\int_{T_0}x(t)\sum_{k=-\infty}^{\infty}X_k^*e^{-jk\frac{2\pi}{T_0}t}\,dt=\sum_{k=-\infty}^{\infty}X_k^*\,\frac{1}{T_0}\int_{T_0}x(t)e^{-jk\frac{2\pi}{T_0}t}\,dt$$

where we just interchanged the order of summation and integration. The remaining integral is precisely the Fourier coefficient \(X_k\), so now we may write:

$$P=\sum_{k=-\infty}^{\infty}X_k^*X_k=\sum_{k=-\infty}^{\infty}|X_k|^2=X_0^2+2\sum_{k=1}^{\infty}|X_k|^2$$

because \(|X_k|\) is even.

In other words, the average power of a periodic signal is simply the sum of the powers in the phasors of its Fourier series, or just the sum of the squared moduli of its complex Fourier series coefficients.
Note
\(X_0\) is the signal average and the \(X_k\) are the harmonic components.
We derived that, for periodic signals \(x(t)\), the average power in the time domain equals the average power in the frequency domain:

$$P=\frac{1}{T_0}\int_{T_0}|x(t)|^2\,dt=\sum_{k=-\infty}^{\infty}|X_k|^2$$
This is known as Parseval’s theorem.
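As a simple check of this result (using a cosine as an assumed example signal), consider \(x(t)=A\cos\left(\frac{2\pi}{T_0}t\right)\), whose only non-zero Fourier coefficients are \(X_{\pm 1}=\frac{A}{2}\). The theorem then gives

$$P=\sum_{k=-\infty}^{\infty}|X_k|^2=\left(\frac{A}{2}\right)^2+\left(\frac{A}{2}\right)^2=\frac{A^2}{2}$$

which indeed equals the time-domain average power \(\frac{1}{T_0}\int_{T_0}A^2\cos^2\left(\frac{2\pi}{T_0}t\right)dt=\frac{A^2}{2}\).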
Fourier Transform
Now we can also show that a similar theorem exists to express the total energy in aperiodic signals! We start by expressing the total energy of the aperiodic signal in the time domain, and substituting, for \(x^*(t)\), the (conjugated) inverse Fourier transform:

$$E=\int_{-\infty}^{\infty}|x(t)|^2\,dt=\int_{-\infty}^{\infty}x(t)\left(\int_{-\infty}^{\infty}X^*(f)e^{-j2\pi ft}\,df\right)dt$$

Reversing the integration order:

$$E=\int_{-\infty}^{\infty}X^*(f)\left(\int_{-\infty}^{\infty}x(t)e^{-j2\pi ft}\,dt\right)df=\int_{-\infty}^{\infty}X^*(f)X(f)\,df$$

we obtain Parseval’s theorem for Fourier transforms, which can be expressed as:

$$E=\int_{-\infty}^{\infty}|x(t)|^2\,dt=\int_{-\infty}^{\infty}|X(f)|^2\,df$$

The energy in the time domain equals the energy in the frequency domain.
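As an illustrative example (the rectangular pulse here is an assumed test signal, not part of the derivation), take \(x(t)=A\) for \(|t|\leq\frac{\tau}{2}\) and zero elsewhere. In the time domain \(E=A^2\tau\), while its Fourier transform is \(X(f)=A\tau\,\mathrm{sinc}(f\tau)\), so that

$$\int_{-\infty}^{\infty}|X(f)|^2\,df=A^2\tau^2\int_{-\infty}^{\infty}\mathrm{sinc}^2(f\tau)\,df=A^2\tau$$

confirming that the energy is the same in both domains (here \(\mathrm{sinc}(u)=\frac{\sin(\pi u)}{\pi u}\)).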
Discrete Fourier Transform
Starting from a discrete, finite sequence of samples, \(x_n\), in the time domain, the average power starts as:

$$P=\frac{1}{T}\int_{0}^{T}x^2(t)\,dt\approx\frac{1}{T}\sum_{n=0}^{N-1}x_n^2\,\Delta t$$

where we have discretized the integral over \(x^2(t)\); using the discrete counterpart of Parseval’s theorem, this then becomes:

$$P=\sum_{k=0}^{N-1}\frac{|X_k|^2}{T^2}=\sum_{k=0}^{N-1}\frac{|X_k|^2}{T}\,\Delta f$$

Note that \(X_k\) denotes the DFT coefficients, with \(\Delta t\) included (\(X_k=\Delta t\sum_{n=0}^{N-1}x_ne^{-j\frac{2\pi}{N}kn}\)). The full proof is omitted (if you think this optional derivation box is long, you don’t want to see that one!).

For a sampled signal, with the coefficients \(X_k\) obtained through the DFT (with the \(\Delta t\) included), the power of the signal contained in a frequency band of width \(\Delta f=\frac{1}{T}\), at frequency \(f=k\Delta f\), follows from:

$$S(k\Delta f)=\frac{|X_k|^2}{T}$$

and this is actually the power density [W/Hz].
Returning to a voltage signal \(x(t)\), or its sampled, discrete-time version \(x_n\): as discussed in the previous chapter on the DFT, \(X_k\) is in [Vs], or equivalently in [V/Hz]. The unit of \(S\) then equals [V²s], which, with the above exposition on the 1 Ohm resistor, equals [W/Hz].
Periodogram
This turns out to be an estimate for the power spectral density (PSD), and it is referred to as a periodogram (an estimate may be indicated by a hat symbol, hence \(\hat{S}\)).
The product \(\Delta f\,S(k\Delta f)\) will, therefore, represent the contribution of the frequency band with width \(\Delta f\), at frequency \(f=k\Delta f\), to the power \(P\) of the signal.
The periodogram, \(S(f)\), defined for \(0\leq f<f_s\) or, equivalently, \(-\frac{f_s}{2}<f\leq\frac{f_s}{2}\), with \(f_s=\frac{1}{\Delta t}\), will be given by:

$$S(f)\Big|_{f=k\Delta f}=\frac{|X_k|^2}{T}\qquad k=0,1,\ldots,N-1$$
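Below is a minimal sketch of how such a periodogram can be computed with NumPy. The signal, its sampling settings and the variable names are illustrative assumptions; only the relation \(S(k\Delta f)=|X_k|^2/T\), with the \(\Delta t\) included in the DFT, comes from the text above.

```python
import numpy as np

# Sketch: periodogram S(k*df) = |X_k|^2 / T, with X_k = dt * sum_n x_n exp(-j*2*pi*k*n/N),
# i.e. the DFT with the sampling interval dt included.
dt = 0.01                                   # sampling interval [s]
N = 1000                                    # number of samples
T = N * dt                                  # record length [s]
t = np.arange(N) * dt

# assumed example signal [V]: two harmonics plus a little noise
x = 2.0 * np.sin(2 * np.pi * 5 * t) + 0.5 * np.sin(2 * np.pi * 20 * t) \
    + 0.1 * np.random.randn(N)

X = dt * np.fft.fft(x)                      # DFT coefficients X_k, in [V/Hz]
S = np.abs(X)**2 / T                        # periodogram S(k*df), in [W/Hz] (for R = 1 Ohm)
f = np.arange(N) / T                        # frequencies 0 <= f < fs, with spacing df = 1/T
```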
Power Spectral Density (PSD)
\(S(f)\) represents the power spectral density, with the power \(P\) being given trivially (considering the definition of the power spectral density) by:

$$P=\int_{0}^{f_s}S(f)\,df=\sum_{k=0}^{N-1}S(k\Delta f)\,\Delta f$$
Integrating the PSD over frequency yields the power contained in the signal.
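Continuing the sketch above (same assumed arrays x, S and T), this statement can be verified numerically: summing the periodogram over all frequency bins and multiplying by \(\Delta f\) reproduces the average power of the samples.

```python
df = 1 / T                                  # frequency resolution [Hz]
print(np.mean(x**2))                        # average power in the time domain [W]
print(df * np.sum(S))                       # power obtained by "integrating" the periodogram [W]
# the two values agree up to floating-point round-off (discrete Parseval)
```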
Summary
The energy and the power of a signal are given by:

$$E=\lim_{T\rightarrow\infty}\int_{-T}^{T}|x(t)|^2\,dt\qquad\textrm{and}\qquad P=\lim_{T\rightarrow\infty}\frac{1}{2T}\int_{-T}^{T}|x(t)|^2\,dt$$
Finally, we derived and gave a formal definition of Parseval’s theorem, which reads (for an aperiodic signal) as:

$$E=\int_{-\infty}^{\infty}|x(t)|^2\,dt=\int_{-\infty}^{\infty}|X(f)|^2\,df$$
Attribution
This chapter is written by Christiaan Tiberius. Find out more here.