Phase dispersion minimization (PDM) is a data-analysis technique used in astrophysics to identify periodically repeating components of a signal from measurements gathered over time, e.g., light curves. PDM is used in conjunction with data folding to analyze time series in which each measurement carries some source of randomness (such as variations in the instrument's sensitivity) that obscures any pattern. Data folding tests a candidate period by mapping each measurement to its phase within that period (i.e., the observation time modulo the period length). PDM then assigns a score to each folded candidate period, measuring how dispersed the folded values are; comparing these scores across candidates, the period that minimizes the score is the best fit. The same statistic can also be used to compare a candidate period against a completely random signal, quantifying how far the candidate departs from randomness.
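The folding-and-scoring step can be sketched as follows. This is a minimal illustration, not a reference implementation: it uses Stellingwerf's form of the statistic, where the score is the pooled within-bin variance of the folded data divided by the overall sample variance (near 1 for a random signal, small for a good period). The function name, bin count, and normalization details are choices made here for the example.

```python
import numpy as np

def pdm_theta(times, values, period, n_bins=10):
    """PDM score for one candidate period.

    Folds the data at `period`, bins the points by phase, and returns
    theta = (pooled within-bin variance) / (overall variance).
    Values near 1 suggest the candidate is no better than random;
    small values indicate a good fit.
    """
    # Data folding: map each observation time to its phase in [0, 1)
    phases = (times % period) / period
    overall_var = np.var(values, ddof=1)
    bins = np.floor(phases * n_bins).astype(int)
    num = 0.0
    dof = 0
    for j in range(n_bins):
        v = values[bins == j]
        if v.size > 1:
            num += (v.size - 1) * np.var(v, ddof=1)
            dof += v.size - 1
    # Pooled variance over all populated bins, normalized by the
    # variance of the unfolded sample.
    return (num / dof) / overall_var
```

Scanning `pdm_theta` over a grid of candidate periods and taking the minimum gives the best-fit period.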
The Fourier transform can also be used to find repeating components in time-series data, but it is best suited to signals collected at a fixed cadence, and the workarounds needed to simulate uniform sampling degrade its quality. When the time gaps between measurements vary, data folding with PDM can be more useful.