

Figure: Above, a plot of a series of 100 random numbers concealing a sine function. Below, the sine function revealed in a correlogram produced by autocorrelation.
Figure: Visual comparison of convolution, cross-correlation, and autocorrelation. For the operations involving the function $f$, and assuming the height of $f$ is 1.0, the value of the result at 5 different points is indicated by the shaded area below each point. The symmetry of $f$ is the reason $f * f$ and $f \star f$ are identical in this example.

Autocorrelation, sometimes known as serial correlation in the discrete time case, measures the correlation of a signal with a delayed copy of itself. Essentially, it quantifies the similarity between observations of a random variable at different points in time. The analysis of autocorrelation is a mathematical tool for identifying repeating patterns or hidden periodicities within a signal obscured by noise. Autocorrelation is widely used in signal processing, time domain and time series analysis to understand the behavior of data over time.

Different fields of study define autocorrelation differently, and not all of these definitions are equivalent. In some fields, the term is used interchangeably with autocovariance.

Various time series models incorporate autocorrelation, such as unit root processes, trend-stationary processes, autoregressive processes, and moving average processes.

Autocorrelation of stochastic processes

In statistics, the autocorrelation of a real or complex random process is the Pearson correlation between values of the process at different times, as a function of the two times or of the time lag. Let $\left\{ X_t \right\}$ be a random process, and $t$ be any point in time ($t$ may be an integer for a discrete-time process or a real number for a continuous-time process). Then $X_t$ is the value (or realization) produced by a given run of the process at time $t$. Suppose that the process has mean $\mu_t$ and variance $\sigma_t^2$ at time $t$, for each $t$. Then the definition of the autocorrelation function between times $t_1$ and $t_2$ is[1]: p. 388 [2]: p. 165 

$$\operatorname{R}_{XX}(t_1, t_2) = \operatorname{E}\left[X_{t_1} \overline{X_{t_2}}\right]$$

where $\operatorname{E}$ is the expected value operator and the bar represents complex conjugation. Note that the expectation may not be well defined.

Subtracting the mean before multiplication yields the auto-covariance function between times $t_1$ and $t_2$:[1]: p. 392 [2]: p. 168 

$$\operatorname{K}_{XX}(t_1, t_2) = \operatorname{E}\left[(X_{t_1} - \mu_{t_1})\overline{(X_{t_2} - \mu_{t_2})}\right] = \operatorname{E}\left[X_{t_1}\overline{X_{t_2}}\right] - \mu_{t_1}\overline{\mu_{t_2}}$$

Note that this expression is not well defined for all time series or processes, because the mean may not exist, or the variance may be zero (for a constant process) or infinite (for processes with distributions lacking well-behaved moments, such as certain types of power-law distributions).
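
To make the two definitions concrete, the following minimal sketch (Python with NumPy; the toy process, seed, and all names are our own illustrative choices, not from the cited sources) estimates $\operatorname{R}_{XX}(t_1, t_2)$ and $\operatorname{K}_{XX}(t_1, t_2)$ by Monte Carlo averaging over independent runs of a real-valued random walk with drift, a process for which both quantities genuinely depend on the pair of times:

```python
import numpy as np

# Monte Carlo estimate of R_XX(t1, t2) = E[X_t1 * X_t2] and
# K_XX(t1, t2) = R_XX(t1, t2) - mu_t1 * mu_t2 for a real-valued process
# (so complex conjugation can be ignored). Illustrative sketch only.
rng = np.random.default_rng(0)
n_runs, n_times = 100_000, 4

# Toy non-stationary process: X_t = (t + 1) + sum of the first t + 1 Gaussian
# steps (a random walk with deterministic drift), indexed t = 0, ..., 3.
steps = rng.normal(size=(n_runs, n_times))
X = np.arange(1, n_times + 1) + np.cumsum(steps, axis=1)

t1, t2 = 1, 3
R = np.mean(X[:, t1] * X[:, t2])             # autocorrelation E[X_t1 X_t2]
K = R - X[:, t1].mean() * X[:, t2].mean()    # autocovariance K_XX(t1, t2)
# Exact values for this walk: mu_1 = 2, mu_3 = 4, Cov(X_1, X_3) = 2
# (two shared steps), so R is close to 2*4 + 2 = 10 and K is close to 2.
print(R, K)
```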

Definition for wide-sense stationary stochastic process

If $\left\{ X_t \right\}$ is a wide-sense stationary process, then the mean $\mu$ and the variance $\sigma^2$ are time-independent, and further the autocovariance function depends only on the lag between $t_1$ and $t_2$: the autocovariance depends only on the time-distance between the pair of values but not on their position in time. This further implies that the autocovariance and autocorrelation can be expressed as a function of the time-lag $\tau = t_2 - t_1$, and that this would be an even function of the lag. This gives the more familiar form for the autocorrelation function[1]: p. 395 

$$\operatorname{R}_{XX}(\tau) = \operatorname{E}\left[X_{t+\tau} \overline{X_t}\right]$$

and the auto-covariance function:

$$\operatorname{K}_{XX}(\tau) = \operatorname{E}\left[(X_{t+\tau} - \mu)\overline{(X_t - \mu)}\right] = \operatorname{E}\left[X_{t+\tau}\overline{X_t}\right] - \mu\overline{\mu}$$

In particular, note that $\operatorname{K}_{XX}(0) = \sigma^2$.

Normalization

It is common practice in some disciplines (e.g. statistics and time series analysis) to normalize the autocovariance function to get a time-dependent Pearson correlation coefficient. However, in other disciplines (e.g. engineering) the normalization is usually dropped and the terms "autocorrelation" and "autocovariance" are used interchangeably.

The definition of the autocorrelation coefficient of a stochastic process is[2]: p. 169 

$$\rho_{XX}(t_1, t_2) = \frac{\operatorname{K}_{XX}(t_1, t_2)}{\sigma_{t_1} \sigma_{t_2}} = \frac{\operatorname{E}\left[(X_{t_1} - \mu_{t_1})\overline{(X_{t_2} - \mu_{t_2})}\right]}{\sigma_{t_1} \sigma_{t_2}}$$

If the function $\rho_{XX}$ is well defined, its value must lie in the range $[-1, 1]$, with $1$ indicating perfect correlation and $-1$ indicating perfect anti-correlation.

For a wide-sense stationary (WSS) process, the definition is

$$\rho_{XX}(\tau) = \frac{\operatorname{K}_{XX}(\tau)}{\sigma^2} = \frac{\operatorname{E}\left[(X_{t+\tau} - \mu)\overline{(X_t - \mu)}\right]}{\sigma^2}.$$

The normalization is important both because the interpretation of the autocorrelation as a correlation provides a scale-free measure of the strength of statistical dependence, and because the normalization has an effect on the statistical properties of the estimated autocorrelations.
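
As a concrete illustration, the sketch below (our own AR(1) example and variable names, not taken from the references) estimates $\rho_{XX}(\tau)$ from one long simulated realization and checks it against the known closed form $\rho_{XX}(\tau) = \phi^{|\tau|}$ for an AR(1) process:

```python
import numpy as np

# Sample autocorrelation coefficient rho(tau) of a WSS AR(1) process
# X_t = phi * X_{t-1} + e_t, whose theoretical value is rho(tau) = phi**|tau|.
rng = np.random.default_rng(1)
phi, n = 0.8, 200_000
e = rng.normal(size=n)
x = np.empty(n)
x[0] = e[0]
for t in range(1, n):
    x[t] = phi * x[t - 1] + e[t]

xc = x - x.mean()          # center with the sample mean
var = xc @ xc / n          # sample variance (the normalizing constant)

def rho(tau):
    # Estimate of K_XX(tau) / sigma^2 from a single realization.
    return (xc[: n - tau] @ xc[tau:]) / ((n - tau) * var)

for tau in (1, 2, 5):
    print(tau, round(rho(tau), 3), round(phi ** tau, 3))  # estimate vs. theory
```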

Properties

Symmetry property

The fact that the autocorrelation function $\operatorname{R}_{XX}$ is an even function can be stated as[2]: p. 171 

$$\operatorname{R}_{XX}(t_1, t_2) = \overline{\operatorname{R}_{XX}(t_2, t_1)}$$

respectively for a WSS process:[2]: p. 173 

$$\operatorname{R}_{XX}(\tau) = \overline{\operatorname{R}_{XX}(-\tau)}$$

Maximum at zero

For a WSS process:[2]: p. 174 

$$\left|\operatorname{R}_{XX}(\tau)\right| \leq \operatorname{R}_{XX}(0)$$

Notice that $\operatorname{R}_{XX}(0)$ is always real.

Cauchy–Schwarz inequality

The Cauchy–Schwarz inequality for stochastic processes:[1]: p. 392 

$$\left|\operatorname{R}_{XX}(t_1, t_2)\right|^2 \leq \operatorname{E}\left[|X_{t_1}|^2\right] \operatorname{E}\left[|X_{t_2}|^2\right]$$

Autocorrelation of white noise

The autocorrelation of a continuous-time white noise signal will have a strong peak (represented by a Dirac delta function) at $\tau = 0$ and will be exactly $0$ for all other $\tau$.
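
In discrete time the analogous statement is that the sample autocorrelation of white noise is close to zero at every nonzero lag. A small illustrative check (variable names and seed are our own):

```python
import numpy as np

# Sample autocorrelation of discrete white noise: the lag-0 value estimates
# the noise power (here ~1.0), while all other lags are near zero.
rng = np.random.default_rng(2)
w = rng.normal(size=50_000)
for lag in range(4):
    r = np.dot(w[: len(w) - lag], w[lag:]) / (len(w) - lag)
    print(lag, round(r, 3))
```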

Wiener–Khinchin theorem

The Wiener–Khinchin theorem relates the autocorrelation function $\operatorname{R}_{XX}$ to the power spectral density $S_{XX}$ via the Fourier transform:

$$\operatorname{R}_{XX}(\tau) = \int_{-\infty}^{\infty} S_{XX}(f) e^{i 2 \pi f \tau} \, df$$

$$S_{XX}(f) = \int_{-\infty}^{\infty} \operatorname{R}_{XX}(\tau) e^{-i 2 \pi f \tau} \, d\tau$$

For real-valued functions, the symmetric autocorrelation function has a real symmetric transform, so the Wiener–Khinchin theorem can be re-expressed in terms of real cosines only:

$$\operatorname{R}_{XX}(\tau) = \int_{-\infty}^{\infty} S_{XX}(f) \cos(2 \pi f \tau) \, df$$

$$S_{XX}(f) = \int_{-\infty}^{\infty} \operatorname{R}_{XX}(\tau) \cos(2 \pi f \tau) \, d\tau$$
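
For a finite sequence the theorem has a discrete counterpart: the inverse DFT of the power spectrum $|\operatorname{FFT}(x)|^2$ equals the circular autocorrelation of $x$. A quick numerical check (a sketch under our own naming, not from the sources):

```python
import numpy as np

# Numerical check of the discrete Wiener-Khinchin relation: the inverse FFT
# of the power spectrum |FFT(x)|**2 equals the circular autocorrelation of x.
rng = np.random.default_rng(3)
x = rng.normal(size=1024)

power_spectrum = np.abs(np.fft.fft(x)) ** 2
acf_fft = np.fft.ifft(power_spectrum).real      # via Wiener-Khinchin

# Direct circular autocorrelation: sum_n x[n] * x[(n + k) mod N].
acf_direct = np.array([np.dot(x, np.roll(x, -k)) for k in range(len(x))])
print(np.allclose(acf_fft, acf_direct))         # True
```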

Autocorrelation of random vectors

The (potentially time-dependent) autocorrelation matrix (also called the second moment) of a (potentially time-dependent) random vector $\mathbf{X} = (X_1, \ldots, X_n)^{\rm T}$ is an $n \times n$ matrix containing as elements the autocorrelations of all pairs of elements of the random vector $\mathbf{X}$. The autocorrelation matrix is used in various digital signal processing algorithms.

For a random vector $\mathbf{X} = (X_1, \ldots, X_n)^{\rm T}$ containing random elements whose expected value and variance exist, the autocorrelation matrix is defined by[3]: p. 190 [1]: p. 334 

$$\operatorname{R}_{\mathbf{X}\mathbf{X}} \triangleq \operatorname{E}\left[\mathbf{X} \mathbf{X}^{\rm T}\right]$$

where ${}^{\rm T}$ denotes transposition, giving a matrix of dimensions $n \times n$.

Written component-wise:

$$\operatorname{R}_{\mathbf{X}\mathbf{X}} = \begin{bmatrix} \operatorname{E}[X_1 X_1] & \operatorname{E}[X_1 X_2] & \cdots & \operatorname{E}[X_1 X_n] \\ \operatorname{E}[X_2 X_1] & \operatorname{E}[X_2 X_2] & \cdots & \operatorname{E}[X_2 X_n] \\ \vdots & \vdots & \ddots & \vdots \\ \operatorname{E}[X_n X_1] & \operatorname{E}[X_n X_2] & \cdots & \operatorname{E}[X_n X_n] \end{bmatrix}$$

If $\mathbf{Z}$ is a complex random vector, the autocorrelation matrix is instead defined by

$$\operatorname{R}_{\mathbf{Z}\mathbf{Z}} \triangleq \operatorname{E}\left[\mathbf{Z} \mathbf{Z}^{\rm H}\right]$$

Here ${}^{\rm H}$ denotes Hermitian transposition.

For example, if $\mathbf{X} = (X_1, X_2, X_3)^{\rm T}$ is a random vector, then $\operatorname{R}_{\mathbf{X}\mathbf{X}}$ is a $3 \times 3$ matrix whose $(i,j)$-th entry is $\operatorname{E}[X_i X_j]$.
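
A small numerical sketch (our own toy distribution and names) that estimates $\operatorname{R}_{\mathbf{X}\mathbf{X}} = \operatorname{E}[\mathbf{X}\mathbf{X}^{\rm T}]$ from samples and checks the autocovariance relation $\operatorname{K}_{\mathbf{X}\mathbf{X}} = \operatorname{R}_{\mathbf{X}\mathbf{X}} - \operatorname{E}[\mathbf{X}]\operatorname{E}[\mathbf{X}]^{\rm T}$ listed among the properties below:

```python
import numpy as np

# Estimate the autocorrelation matrix R_XX = E[X X^T] of a real random
# vector from samples, then verify K_XX = R_XX - E[X] E[X]^T numerically.
rng = np.random.default_rng(4)
mean = np.array([1.0, -2.0, 0.5])
cov = np.array([[2.0, 0.3, 0.0],
                [0.3, 1.0, 0.2],
                [0.0, 0.2, 0.5]])
samples = rng.multivariate_normal(mean, cov, size=200_000)  # rows: draws of X

R = samples.T @ samples / len(samples)   # estimate of E[X X^T]
m = samples.mean(axis=0)                 # estimate of E[X]
K = R - np.outer(m, m)                   # should approximate cov
print(np.round(K - cov, 2))              # close to the zero matrix
```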

Properties of the autocorrelation matrix

  • The autocorrelation matrix is a Hermitian matrix for complex random vectors and a symmetric matrix for real random vectors.[3]: p. 190 
  • The autocorrelation matrix is a positive semidefinite matrix,[3]: p. 190  i.e. $\mathbf{a}^{\rm T} \operatorname{R}_{\mathbf{X}\mathbf{X}} \mathbf{a} \geq 0$ for all $\mathbf{a} \in \mathbb{R}^n$ for a real random vector, and respectively $\mathbf{b}^{\rm H} \operatorname{R}_{\mathbf{Z}\mathbf{Z}} \mathbf{b} \geq 0$ for all $\mathbf{b} \in \mathbb{C}^n$ in case of a complex random vector.
  • All eigenvalues of the autocorrelation matrix are real and non-negative.
  • The auto-covariance matrix is related to the autocorrelation matrix as follows:
$$\operatorname{K}_{\mathbf{X}\mathbf{X}} = \operatorname{E}\left[(\mathbf{X} - \operatorname{E}[\mathbf{X}])(\mathbf{X} - \operatorname{E}[\mathbf{X}])^{\rm T}\right] = \operatorname{R}_{\mathbf{X}\mathbf{X}} - \operatorname{E}[\mathbf{X}] \operatorname{E}[\mathbf{X}]^{\rm T}$$
Respectively for complex random vectors:
$$\operatorname{K}_{\mathbf{Z}\mathbf{Z}} = \operatorname{E}\left[(\mathbf{Z} - \operatorname{E}[\mathbf{Z}])(\mathbf{Z} - \operatorname{E}[\mathbf{Z}])^{\rm H}\right] = \operatorname{R}_{\mathbf{Z}\mathbf{Z}} - \operatorname{E}[\mathbf{Z}] \operatorname{E}[\mathbf{Z}]^{\rm H}$$

Autocorrelation of deterministic signals

In signal processing, the above definition is often used without the normalization, that is, without subtracting the mean and dividing by the variance. When the autocorrelation function is normalized by mean and variance, it is sometimes referred to as the autocorrelation coefficient[4] or autocovariance function.

Autocorrelation of continuous-time signal

Given a signal $f(t)$, the continuous autocorrelation $R_{ff}(\tau)$ is most often defined as the continuous cross-correlation integral of $f(t)$ with itself, at lag $\tau$:[1]: p. 411 

$$R_{ff}(\tau) = \int_{-\infty}^{\infty} f(t + \tau) \overline{f(t)} \, dt = \int_{-\infty}^{\infty} f(t) \overline{f(t - \tau)} \, dt$$

where $\overline{f(t)}$ represents the complex conjugate of $f(t)$. Note that the parameter $t$ in the integral is a dummy variable and is only necessary to calculate the integral; it has no specific meaning.

Autocorrelation of discrete-time signal

The discrete autocorrelation $R$ at lag $\ell$ for a discrete-time signal $y(n)$ is

$$R_{yy}(\ell) = \sum_{n \in \mathbb{Z}} y(n) \, \overline{y(n - \ell)}$$
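
A minimal sketch of this sum for a real sequence of finite support (taken to be zero outside its few stored samples; the function and variable names are our own):

```python
import numpy as np

# Direct evaluation of R_yy(l) = sum_n y(n) * conj(y(n - l)) for a real
# finite-length sequence, exploiting R_yy(-l) = R_yy(l) for real signals.
def autocorr(y, lag):
    y = np.asarray(y, dtype=float)
    lag = abs(lag)
    if lag >= len(y):
        return 0.0
    return float(np.dot(y[lag:], y[: len(y) - lag]))

y = [1.0, 2.0, 3.0]
print([autocorr(y, l) for l in (-2, -1, 0, 1, 2)])  # [3.0, 8.0, 14.0, 8.0, 3.0]
```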

The above definitions work for signals that are square integrable, or square summable, that is, of finite energy. Signals that "last forever" are treated instead as random processes, in which case different definitions are needed, based on expected values. For wide-sense-stationary random processes, the autocorrelations are defined as

$$R_{ff}(\tau) = \operatorname{E}\left[f(t + \tau) \overline{f(t)}\right]$$

$$R_{yy}(\ell) = \operatorname{E}\left[y(n) \, \overline{y(n - \ell)}\right]$$

For processes that are not stationary, these will also be functions of $t$, or $n$.

For processes that are also ergodic, the expectation can be replaced by the limit of a time average. The autocorrelation of an ergodic process is sometimes defined as or equated to[4]

$$R_{ff}(\tau) = \lim_{T \to \infty} \frac{1}{T} \int_0^T f(t + \tau) \overline{f(t)} \, dt$$

$$R_{yy}(\ell) = \lim_{N \to \infty} \frac{1}{N} \sum_{n=0}^{N-1} y(n) \, \overline{y(n - \ell)}$$

These definitions have the advantage that they give sensible well-defined single-parameter results for periodic functions, even when those functions are not the output of stationary ergodic processes.

Alternatively, signals that last forever can be treated by a short-time autocorrelation function analysis, using finite time integrals. (See short-time Fourier transform for a related process.)

Definition for periodic signals

If $f$ is a continuous periodic function of period $T$, the integration from $-\infty$ to $\infty$ is replaced by integration over any interval $[t_0, t_0 + T]$ of length $T$:

$$R_{ff}(\tau) \triangleq \int_{t_0}^{t_0 + T} f(t + \tau) \overline{f(t)} \, dt$$

which is equivalent to

$$R_{ff}(\tau) \triangleq \int_{t_0}^{t_0 + T} f(t) \overline{f(t - \tau)} \, dt$$

Properties

In the following, we will describe properties of one-dimensional autocorrelations only, since most properties are easily transferred from the one-dimensional case to the multi-dimensional cases. These properties hold for wide-sense stationary processes.[5]

  • A fundamental property of the autocorrelation is symmetry, $R_{ff}(-\tau) = \overline{R_{ff}(\tau)}$, which is easy to prove from the definition. In the continuous case,
    • the autocorrelation is an even function $R_{ff}(-\tau) = R_{ff}(\tau)$ when $f$ is a real function, and
    • the autocorrelation is a Hermitian function $R_{ff}(-\tau) = \overline{R_{ff}(\tau)}$ when $f$ is a complex function.
  • The continuous autocorrelation function reaches its peak at the origin, where it takes a real value, i.e. for any delay $\tau$, $|R_{ff}(\tau)| \leq R_{ff}(0)$.[1]: p. 410  This is a consequence of the rearrangement inequality. The same result holds in the discrete case.
  • The autocorrelation of a periodic function is, itself, periodic with the same period.
  • The autocorrelation of the sum of two completely uncorrelated functions (the cross-correlation is zero for all $\tau$) is the sum of the autocorrelations of each function separately.
  • Since autocorrelation is a specific type of cross-correlation, it maintains all the properties of cross-correlation.
  • By using the symbol $*$ to represent convolution, and letting $g_{-1}$ be the function which manipulates the function $f$ and is defined as $g_{-1}(f)(t) = f(-t)$, the definition for $R_{ff}(\tau)$ may be written as:
$$R_{ff}(\tau) = \left(f * g_{-1}\left(\overline{f}\right)\right)(\tau)$$

Multi-dimensional autocorrelation

Multi-dimensional autocorrelation is defined similarly. For example, in three dimensions the autocorrelation of a square-summable discrete signal would be

$$R(j, k, \ell) = \sum_{n, q, r} x_{n, q, r} \, \overline{x}_{n - j, q - k, r - \ell}$$

When mean values are subtracted from signals before computing an autocorrelation function, the resulting function is usually called an auto-covariance function.

Efficient computation

For data expressed as a discrete sequence, it is frequently necessary to compute the autocorrelation with high computational efficiency. A brute force method based on the signal processing definition $R_{xx}(j) = \sum_n x_n \, \overline{x}_{n-j}$ can be used when the signal size is small. For example, to calculate the autocorrelation of the real signal sequence $x = (2, 3, -1)$ (i.e. $x_0 = 2$, $x_1 = 3$, $x_2 = -1$, and $x_i = 0$ for all other values of $i$) by hand, we first recognize that the definition just given is the same as the "usual" multiplication, but with right shifts, where each vertical addition gives the autocorrelation for particular lag values:

$$\begin{array}{rrrrr}
 & & 2 & 3 & -1 \\
 & \times & 2 & 3 & -1 \\
\hline
-2 & -3 & 1 & & \\
 & 6 & 9 & -3 & \\
 & & 4 & 6 & -2 \\
\hline
-2 & 3 & 14 & 3 & -2
\end{array}$$

Thus the required autocorrelation sequence is $R_{xx} = (-2, 3, 14, 3, -2)$, where $R_{xx}(0) = 14$, $R_{xx}(-1) = R_{xx}(1) = 3$, and $R_{xx}(-2) = R_{xx}(2) = -2$, the autocorrelation for other lag values being zero. In this calculation we do not perform the carry-over operation during addition, as is usual in normal multiplication. Note that we can halve the number of operations required by exploiting the inherent symmetry of the autocorrelation. If the signal happens to be periodic, i.e. $x = (\ldots, 2, 3, -1, 2, 3, -1, \ldots)$, then we get a circular autocorrelation (similar to circular convolution) where the left and right tails of the previous autocorrelation sequence overlap, giving $R_{xx} = (\ldots, 14, 1, 1, 14, 1, 1, \ldots)$, which has the same period as the signal sequence $x$. The procedure can be regarded as an application of the convolution property of the Z-transform of a discrete signal.
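
The worked example can be reproduced with NumPy (a sketch; for real input, `np.correlate` computes exactly the lag sums above), together with the circular variant for the periodic extension:

```python
import numpy as np

# Linear autocorrelation of x = (2, 3, -1), matching the hand calculation,
# and the circular autocorrelation of its periodic extension.
x = np.array([2.0, 3.0, -1.0])

linear = np.correlate(x, x, mode="full")
print(linear)        # [-2.  3. 14.  3. -2.], lags -2..2

# Circular autocorrelation: the tails wrap around (period 3).
N = len(x)
circular = np.array([np.dot(x, np.roll(x, -k)) for k in range(N)])
print(circular)      # [14.  1.  1.], repeating with the signal's period
```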

While the brute force algorithm is order $n^2$, several efficient algorithms exist which can compute the autocorrelation in order $n \log(n)$. For example, the Wiener–Khinchin theorem allows computing the autocorrelation from the raw data $X(t)$ with two fast Fourier transforms (FFT):[6][page needed]

$$F_R(f) = \operatorname{FFT}[X(t)]$$
$$S(f) = F_R(f) F_R^*(f)$$
$$R(\tau) = \operatorname{IFFT}[S(f)]$$

where IFFT denotes the inverse fast Fourier transform. The asterisk denotes complex conjugate.
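
A sketch of this FFT route in Python (function and variable names are ours). One practical detail not visible in the formulas: zero-padding the data to at least $2n - 1$ points before transforming prevents the circular wrap-around discussed above, so the result matches the linear (non-circular) autocorrelation:

```python
import numpy as np

# O(n log n) autocorrelation via two FFTs, zero-padded to avoid circular
# wrap-around; returns lags 0..n-1 of the linear autocorrelation.
def autocorr_fft(x):
    x = np.asarray(x, dtype=float)
    n = len(x)
    nfft = 1 << (2 * n - 1).bit_length()         # power of two >= 2n - 1
    F = np.fft.rfft(x, nfft)
    return np.fft.irfft(F * np.conj(F), nfft)[:n]

x = np.random.default_rng(5).normal(size=1000)
direct = np.array([np.dot(x[k:], x[: len(x) - k]) for k in range(len(x))])
print(np.allclose(autocorr_fft(x), direct))      # True
```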

Alternatively, a multiple-$\tau$ correlation can be performed by using brute force calculation for low $\tau$ values, and then progressively binning the $X(t)$ data with a logarithmic density to compute the autocorrelation at larger lags, resulting in the same $n \log(n)$ efficiency, but with lower memory requirements.[7][8]

Estimation

For a discrete process with known mean $\mu$ and variance $\sigma^2$ for which we observe $n$ observations $\{X_1, X_2, \ldots, X_n\}$, an estimate of the autocorrelation coefficient may be obtained as

$$\hat{R}(k) = \frac{1}{(n - k)\sigma^2} \sum_{t=1}^{n-k} (X_t - \mu)(X_{t+k} - \mu)$$

for any positive integer $k < n$. When the true mean $\mu$ and variance $\sigma^2$ are known, this estimate is unbiased. If the true mean and variance of the process are not known, there are several possibilities:

  • If $\mu$ and $\sigma^2$ are replaced by the standard formulae for sample mean and sample variance, then this is a biased estimate.
  • A periodogram-based estimate replaces $n - k$ in the above formula with $n$. This estimate is always biased; however, it usually has a smaller mean squared error.[9][10]
  • Other possibilities derive from treating the two portions of data $\{X_1, X_2, \ldots, X_{n-k}\}$ and $\{X_{k+1}, X_{k+2}, \ldots, X_n\}$ separately and calculating separate sample means and/or sample variances for use in defining the estimate.[citation needed]

The advantage of estimates of the last type is that the set of estimated autocorrelations, as a function of $k$, then forms a function which is a valid autocorrelation in the sense that it is possible to define a theoretical process having exactly that autocorrelation. Other estimates can suffer from the problem that, if they are used to calculate the variance of a linear combination of the $X$'s, the variance calculated may turn out to be negative.[11]
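
The first two estimators above can be written compactly as follows (a sketch with our own names; `periodogram=True` selects the divide-by-$n$ variant):

```python
import numpy as np

# Autocorrelation-coefficient estimate using the sample mean and variance.
# periodogram=True divides by n instead of n - k, which biases the estimate
# toward zero but typically lowers its mean squared error.
def acf_estimate(x, k, periodogram=False):
    x = np.asarray(x, dtype=float)
    n = len(x)
    xc = x - x.mean()
    denom = (n if periodogram else n - k) * x.var()
    return np.dot(xc[: n - k], xc[k:]) / denom

x = np.random.default_rng(6).normal(size=500)
for k in (1, 5, 20):
    print(k, acf_estimate(x, k), acf_estimate(x, k, periodogram=True))
```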

Regression analysis

In regression analysis using time series data, autocorrelation in a variable of interest is typically modeled either with an autoregressive model (AR), a moving average model (MA), their combination as an autoregressive-moving-average model (ARMA), or an extension of the latter called an autoregressive integrated moving average model (ARIMA). With multiple interrelated data series, vector autoregression (VAR) or its extensions are used.

In ordinary least squares (OLS), the adequacy of a model specification can be checked in part by establishing whether there is autocorrelation of the regression residuals. Problematic autocorrelation of the errors, which themselves are unobserved, can generally be detected because it produces autocorrelation in the observable residuals. (Errors are also known as "error terms" in econometrics.) Autocorrelation of the errors violates the ordinary least squares assumption that the error terms are uncorrelated, meaning that the Gauss–Markov theorem does not apply, and that OLS estimators are no longer the Best Linear Unbiased Estimators (BLUE). While it does not bias the OLS coefficient estimates, the standard errors tend to be underestimated (and the t-scores overestimated) when the autocorrelations of the errors at low lags are positive.

The traditional test for the presence of first-order autocorrelation is the Durbin–Watson statistic or, if the explanatory variables include a lagged dependent variable, Durbin's h statistic. The Durbin–Watson statistic can, however, be linearly mapped to the Pearson correlation between values and their lags.[12] A more flexible test, covering autocorrelation of higher orders and applicable whether or not the regressors include lags of the dependent variable, is the Breusch–Godfrey test. This involves an auxiliary regression, wherein the residuals obtained from estimating the model of interest are regressed on (a) the original regressors and (b) $k$ lags of the residuals, where $k$ is the order of the test. The simplest version of the test statistic from this auxiliary regression is $TR^2$, where $T$ is the sample size and $R^2$ is the coefficient of determination. Under the null hypothesis of no autocorrelation, this statistic is asymptotically distributed as $\chi^2$ with $k$ degrees of freedom.
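
As a hedged illustration of both tests using statsmodels (the synthetic data, AR(1) error coefficient, and lag order are our own choices, not from the sources):

```python
import numpy as np
import statsmodels.api as sm
from statsmodels.stats.stattools import durbin_watson
from statsmodels.stats.diagnostic import acorr_breusch_godfrey

# Synthetic regression with deliberately autocorrelated AR(1) errors,
# so both residual tests should signal autocorrelation.
rng = np.random.default_rng(7)
n = 500
x = rng.normal(size=n)
e = np.empty(n)
e[0] = rng.normal()
for t in range(1, n):                    # AR(1) errors, rho = 0.6
    e[t] = 0.6 * e[t - 1] + rng.normal()
y = 1.0 + 2.0 * x + e

res = sm.OLS(y, sm.add_constant(x)).fit()
print("Durbin-Watson:", durbin_watson(res.resid))   # well below 2
lm_stat, lm_pval, _, _ = acorr_breusch_godfrey(res, nlags=2)
print("Breusch-Godfrey LM:", lm_stat, "p-value:", lm_pval)  # tiny p-value
```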

Responses to nonzero autocorrelation include generalized least squares and the Newey–West HAC estimator (Heteroskedasticity and Autocorrelation Consistent).[13]

In the estimation of a moving average model (MA), the autocorrelation function is used to determine the appropriate number of lagged error terms to be included. This is based on the fact that for an MA process of order $q$, we have $\operatorname{R}(\tau) \neq 0$ for $\tau \leq q$, and $\operatorname{R}(\tau) = 0$ for $\tau > q$.
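
A short simulation illustrating the cutoff (an MA(2) example of our own construction; the sample ACF falls to essentially zero beyond lag $q = 2$):

```python
import numpy as np

# MA(2) process y_t = e_t + 0.6 e_{t-1} + 0.3 e_{t-2}: the sample ACF is
# clearly nonzero at lags 1-2 and roughly zero beyond lag q = 2.
rng = np.random.default_rng(8)
e = rng.normal(size=100_000)
y = e[2:] + 0.6 * e[1:-1] + 0.3 * e[:-2]

yc = y - y.mean()
acf = [np.dot(yc[: len(yc) - k], yc[k:]) / np.dot(yc, yc) for k in range(6)]
print(np.round(acf, 3))   # lags 0..5: approximately [1, 0.54, 0.21, 0, 0, 0]
```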

Applications

Autocorrelation's ability to find repeating patterns in data yields many applications, including:

  • Autocorrelation analysis is used heavily in fluorescence correlation spectroscopy[14] to provide quantitative insight into molecular-level diffusion and chemical reactions.[15]
  • Another application of autocorrelation is the measurement of optical spectra and the measurement of very-short-duration light pulses produced by lasers, both using optical autocorrelators.
  • Autocorrelation is used to analyze dynamic light scattering data, which notably enables determination of the particle size distributions of nanometer-sized particles or micelles suspended in a fluid. A laser shining into the mixture produces a speckle pattern that results from the motion of the particles. Autocorrelation of the signal can be analyzed in terms of the diffusion of the particles. From this, knowing the viscosity of the fluid, the sizes of the particles can be calculated.
  • In the GPS system, autocorrelation is used to correct for the propagation delay, or time shift, between the point of time at the transmission of the carrier signal at the satellites and the point of time at the receiver on the ground. This is done by the receiver generating a replica signal of the 1,023-bit C/A (Coarse/Acquisition) code, and generating lines of code chips [−1, 1] in packets of ten at a time, or 10,230 chips (1,023 × 10), shifting slightly as it goes along in order to accommodate the Doppler shift in the incoming satellite signal, until the receiver replica signal and the satellite signal codes match up.[16]
  • The small-angle X-ray scattering intensity of a nanostructured system is the Fourier transform of the spatial autocorrelation function of the electron density.
  • In surface science and scanning probe microscopy, autocorrelation is used to establish a link between surface morphology and functional characteristics.[17]
  • In optics, normalized autocorrelations and cross-correlations give the degree of coherence of an electromagnetic field.
  • In astronomy, autocorrelation can determine the frequency of pulsars.
  • In music, autocorrelation (when applied at time scales smaller than a second) is used as a pitch detection algorithm for both instrument tuners and "Auto Tune" (used as a distortion effect or to fix intonation).[18] When applied at time scales larger than a second, autocorrelation can identify the musical beat, for example to determine tempo.
  • Autocorrelation in space rather than time, via the Patterson function, is used by X-ray diffractionists to help recover the "Fourier phase information" on atom positions not available through diffraction alone.
  • In statistics, spatial autocorrelation between sample locations also helps one estimate mean value uncertainties when sampling a heterogeneous population.
  • The SEQUEST algorithm for analyzing mass spectra makes use of autocorrelation in conjunction with cross-correlation to score the similarity of an observed spectrum to an idealized spectrum representing a peptide.
  • In astrophysics, autocorrelation is used to study and characterize the spatial distribution of galaxies in the universe and in multi-wavelength observations of low mass X-ray binaries.
  • In panel data, spatial autocorrelation refers to correlation of a variable with itself through space.
  • In analysis of Markov chain Monte Carlo data, autocorrelation must be taken into account for correct error determination.
  • In geosciences (specifically in geophysics) it can be used to compute an autocorrelation seismic attribute, out of a 3D seismic survey of the underground.
  • In medical ultrasound imaging, autocorrelation is used to visualize blood flow.
  • In intertemporal portfolio choice, the presence or absence of autocorrelation in an asset's rate of return can affect the optimal portion of the portfolio to hold in that asset.
  • In numerical relays, autocorrelation has been used to accurately measure power system frequency.[19]

Serial dependence

Serial dependence is closely linked to the notion of autocorrelation, but represents a distinct concept (see Correlation and dependence). In particular, it is possible to have serial dependence but no (linear) correlation. In some fields, however, the two terms are used as synonyms.

A time series of a random variable has serial dependence if the value at some time $t$ in the series is statistically dependent on the value at another time $s$. A series is serially independent if there is no dependence between any pair.

If a time series $\left\{ X_t \right\}$ is stationary, then statistical dependence between the pair $(X_t, X_s)$ would imply that there is statistical dependence between all pairs of values at the same lag $\tau = s - t$.

References

  1. ^ a b c d e f g Gubner, John A. (2006). Probability and Random Processes for Electrical and Computer Engineers. Cambridge University Press. ISBN 978-0-521-86470-1.
  2. ^ a b c d e f Kun Il Park, Fundamentals of Probability and Stochastic Processes with Applications to Communications, Springer, 2018, ISBN 978-3-319-68074-3
  3. ^ a b c Papoulis, Athanasios (1991). Probability, Random Variables, and Stochastic Processes. McGraw-Hill.
  4. ^ a b Dunn, Patrick F. (2005). Measurement and Data Analysis for Engineering and Science. New York: McGraw–Hill. ISBN 978-0-07-282538-1.
  5. ^ Proakis, John (August 31, 2001). Communication Systems Engineering (2nd Edition) (2 ed.). Pearson. p. 168. ISBN 978-0130617934.
  6. ^ Box, G. E. P.; Jenkins, G. M.; Reinsel, G. C. (1994). Time Series Analysis: Forecasting and Control (3rd ed.). Upper Saddle River, NJ: Prentice–Hall. ISBN 978-0130607744.
  7. ^ Frenkel, D.; Smit, B. (2002). "chap. 4.4.2". Understanding Molecular Simulation (2nd ed.). London: Academic Press. ISBN 978-0122673511.
  8. ^ Colberg, P.; Höfling, F. (2011). "Highly accelerated simulations of glassy dynamics using GPUs: caveats on limited floating-point precision". Comput. Phys. Commun. 182 (5): 1120–1129. arXiv:0912.3824. Bibcode:2011CoPhC.182.1120C. doi:10.1016/j.cpc.2011.01.009. S2CID 7173093.
  9. ^ Priestley, M. B. (1982). Spectral Analysis and Time Series. London, New York: Academic Press. ISBN 978-0125649018.
  10. ^ Percival, Donald B.; Andrew T. Walden (1993). Spectral Analysis for Physical Applications: Multitaper and Conventional Univariate Techniques. Cambridge University Press. pp. 190–195. ISBN 978-0-521-43541-3.
  11. ^ Percival, Donald B. (1993). "Three Curious Properties of the Sample Variance and Autocovariance for Stationary Processes with Unknown Mean". The American Statistician. 47 (4): 274–276. doi:10.1080/00031305.1993.10475997.
  12. ^ "Serial correlation techniques". Statistical Ideas. 26 May 2014.
  13. ^ Baum, Christopher F. (2006). An Introduction to Modern Econometrics Using Stata. Stata Press. ISBN 978-1-59718-013-9.
  14. ^ Elson, Elliot L. (December 2011). "Fluorescence Correlation Spectroscopy: Past, Present, Future". Biophysical Journal. 101 (12): 2855–2870. Bibcode:2011BpJ...101.2855E. doi:10.1016/j.bpj.2011.11.012. PMC 3244056. PMID 22208184.
  15. ^ Hołyst, Robert; Poniewierski, Andrzej; Zhang, Xuzhu (2017). "Analytical form of the autocorrelation function for the fluorescence correlation spectroscopy". Soft Matter. 13 (6): 1267–1275. Bibcode:2017SMat...13.1267H. doi:10.1039/C6SM02643E. ISSN 1744-683X. PMID 28106203.
  16. ^ Van Sickle, Jan (2008). GPS for Land Surveyors (Third ed.). CRC Press. pp. 18–19. ISBN 978-0-8493-9195-8.
  17. ^ Kalvani, Payam Rajabi; Jahangiri, Ali Reza; Shapouri, Samaneh; Sari, Amirhossein; Jalili, Yousef Seyed (August 2019). "Multimode AFM analysis of aluminum-doped zinc oxide thin films sputtered under various substrate temperatures for optoelectronic applications". Superlattices and Microstructures. 132: 106173. doi:10.1016/j.spmi.2019.106173. S2CID 198468676.
  18. ^ Tyrangiel, Josh (2009-02-05). "Auto-Tune: Why Pop Music Sounds Perfect". Time. Archived from the original on February 10, 2009.
  19. ^ Kasztenny, Bogdan (March 2016). "A New Method for Fast Frequency Measurement for Protection Applications" (PDF). Schweitzer Engineering Laboratories. Archived (PDF) from the original on 2025-08-05. Retrieved 28 May 2022.
