Functional regression

Functional regression is a version of regression analysis when responses or covariates include functional data. Functional regression models can be classified into four types depending on whether the responses or covariates are functional or scalar: (i) scalar responses with functional covariates, (ii) functional responses with scalar covariates, (iii) functional responses with functional covariates, and (iv) scalar or functional responses with functional and scalar covariates. In addition, functional regression models can be linear, partially linear, or nonlinear. In particular, functional polynomial models, functional single and multiple index models and functional additive models are three special cases of functional nonlinear models.

Functional linear models (FLMs)

Functional linear models (FLMs) are an extension of linear models (LMs). A linear model with scalar response $Y \in \mathbb{R}$ and scalar covariates $X \in \mathbb{R}^p$ can be written as
\[
Y = \beta_0 + \langle X, \beta \rangle + \varepsilon, \qquad (1)
\]
where $\langle \cdot, \cdot \rangle$ denotes the inner product in Euclidean space, $\beta_0 \in \mathbb{R}$ and $\beta \in \mathbb{R}^p$ denote the regression coefficients, and $\varepsilon$ is a random error with mean zero and finite variance. FLMs can be divided into two types based on the responses.

Functional linear models with scalar responses

Functional linear models with scalar responses can be obtained by replacing the scalar covariates $X$ and the coefficient vector $\beta$ in model (1) by a centered functional covariate $X^c(\cdot) = X(\cdot) - \mathbb{E}(X(\cdot))$ and a coefficient function $\beta = \beta(\cdot)$ with domain $\mathcal{T}$, respectively, and replacing the inner product in Euclidean space by that in the Hilbert space $L^2$,
\[
Y = \beta_0 + \langle X^c, \beta \rangle + \varepsilon = \beta_0 + \int_{\mathcal{T}} X^c(t)\beta(t)\,dt + \varepsilon, \qquad (2)
\]
where $\langle \cdot, \cdot \rangle$ here denotes the inner product in $L^2$. One approach to estimating $\beta_0$ and $\beta(\cdot)$ is to expand the centered covariate $X^c(\cdot)$ and the coefficient function $\beta(\cdot)$ in the same functional basis, for example, a B-spline basis or the eigenbasis used in the Karhunen–Loève expansion. Suppose $\{\phi_k\}_{k=1}^{\infty}$ is an orthonormal basis of $L^2$. Expanding $X^c$ and $\beta$ in this basis, $X^c(\cdot) = \sum_{k=1}^{\infty} x_k \phi_k(\cdot)$ and $\beta(\cdot) = \sum_{k=1}^{\infty} \beta_k \phi_k(\cdot)$, model (2) becomes
\[
Y = \beta_0 + \sum_{k=1}^{\infty} \beta_k x_k + \varepsilon.
\]
For implementation, regularization is needed and can be done through truncation, $L^2$ penalization or $L^1$ penalization.[1] In addition, a reproducing kernel Hilbert space (RKHS) approach can also be used to estimate $\beta_0$ and $\beta(\cdot)$ in model (2).[2]
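The truncation approach can be illustrated numerically. The sketch below assumes curves densely observed on a common grid; the simulated data, the discretized functional principal component step, and the truncation level $K = 3$ are illustrative choices, not part of the cited methods.

```python
# Illustrative sketch: truncated Karhunen-Loeve (eigenbasis) estimation of the
# scalar-response functional linear model (2). The simulated data, grid, and
# truncation level K are assumptions made for this example only.
import numpy as np

rng = np.random.default_rng(0)
n, m = 200, 101                       # n curves observed on an m-point grid
t = np.linspace(0.0, 1.0, m)          # common domain T = [0, 1]
dt = t[1] - t[0]

# Simulate smooth covariate curves X_i(t) and scalar responses Y_i.
scores_true = rng.normal(size=(n, 3)) * np.array([2.0, 1.0, 0.5])
basis_true = np.vstack([np.sin(2*np.pi*t), np.cos(2*np.pi*t), np.sin(4*np.pi*t)])
X = scores_true @ basis_true
beta_true = 1.5*np.sin(2*np.pi*t) - 0.5*np.cos(2*np.pi*t)   # true beta(t)
Y = 1.0 + X @ beta_true * dt + rng.normal(scale=0.1, size=n)

# Center the curves and compute the empirical eigenbasis (functional PCA).
Xc = X - X.mean(axis=0)
cov = Xc.T @ Xc / n * dt                                    # discretized covariance surface
eigvals, eigvecs = np.linalg.eigh(cov)
phi = eigvecs[:, np.argsort(eigvals)[::-1]] / np.sqrt(dt)   # orthonormal in L2

# Truncate to K components (the regularization step) and regress Y on the scores.
K = 3
xi = Xc @ phi[:, :K] * dt                     # estimated scores x_k = <Xc, phi_k>
design = np.column_stack([np.ones(n), xi])
coef, *_ = np.linalg.lstsq(design, Y, rcond=None)
beta0_hat = coef[0]
beta_hat = phi[:, :K] @ coef[1:]              # beta(t) = sum_k beta_k phi_k(t)

print("estimated intercept:", round(beta0_hat, 3))
print("L2 error of estimated beta:", round(np.sqrt(np.sum((beta_hat - beta_true)**2) * dt), 3))
```

Replacing the hard truncation by an $L^2$ (roughness) or $L^1$ penalty on the basis coefficients gives the penalized variants mentioned above.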

Adding multiple functional and scalar covariates, model (2) can be extended to
\[
Y = \sum_{k=1}^{q} Z_k \alpha_k + \sum_{j=1}^{p} \int_{\mathcal{T}_j} X_j^c(t)\beta_j(t)\,dt + \varepsilon, \qquad (3)
\]
where $Z_1, \ldots, Z_q$ are scalar covariates with $Z_1 = 1$, $\alpha_1, \ldots, \alpha_q$ are regression coefficients for $Z_1, \ldots, Z_q$, respectively, $X_j^c$ is a centered functional covariate given by $X_j^c(\cdot) = X_j(\cdot) - \mathbb{E}(X_j(\cdot))$, $\beta_j$ is the regression coefficient function for $X_j^c(\cdot)$, and $\mathcal{T}_j$ is the domain of $X_j$ and $\beta_j$, for $j = 1, \ldots, p$. However, due to the parametric component $\alpha$, the estimation methods for model (2) cannot be used in this case,[3] and alternative estimation methods for model (3) are available.[4][5]

Functional linear models with functional responses

For a functional response $Y(\cdot)$ with domain $\mathcal{T}$ and a functional covariate $X(\cdot)$ with domain $\mathcal{S}$, two FLMs regressing $Y(\cdot)$ on $X(\cdot)$ have been considered.[3][6] One of these two models is of the form
\[
Y(t) = \beta_0(t) + \int_{\mathcal{S}} \beta(s,t) X^c(s)\,ds + \varepsilon(t), \quad \text{for } t \in \mathcal{T}, \qquad (4)
\]
where $X^c(\cdot) = X(\cdot) - \mathbb{E}(X(\cdot))$ is still the centered functional covariate, $\beta_0(\cdot)$ and $\beta(\cdot,\cdot)$ are coefficient functions, and $\varepsilon(\cdot)$ is usually assumed to be a random process with mean zero and finite variance. In this case, at any given time $t \in \mathcal{T}$, the value of $Y$, i.e., $Y(t)$, depends on the entire trajectory of $X$. Model (4), for any given time $t$, is an extension of multivariate linear regression with the inner product in Euclidean space replaced by that in $L^2$. An estimating equation motivated by multivariate linear regression is
\[
r_{XY} = R_{XX}\beta, \quad \text{for } \beta \in L^2(\mathcal{S}\times\mathcal{T}),
\]
where $r_{XY}(s,t) = \operatorname{cov}(X(s), Y(t))$, and $R_{XX}: L^2(\mathcal{S}\times\mathcal{T}) \to L^2(\mathcal{S}\times\mathcal{T})$ is defined as $(R_{XX}\beta)(s,t) = \int_{\mathcal{S}} r_{XX}(s,w)\beta(w,t)\,dw$ with $r_{XX}(s,w) = \operatorname{cov}(X(s), X(w))$ for $s, w \in \mathcal{S}$.[3] Regularization is needed and can be done through truncation, $L^2$ penalization or $L^1$ penalization.[1] Various estimation methods for model (4) are available.[7][8]
When $X$ and $Y$ are concurrently observed, i.e., $\mathcal{S} = \mathcal{T}$,[9] it is reasonable to consider a historical functional linear model, where the current value of $Y$ only depends on the history of $X$, i.e., $\beta(s,t) = 0$ for $s > t$ in model (4).[3][10] A simpler version of the historical functional linear model is the functional concurrent model (see below).
Adding multiple functional covariates, model (4) can be extended to
\[
Y(t) = \beta_0(t) + \sum_{j=1}^{p} \int_{\mathcal{S}_j} \beta_j(s,t) X_j^c(s)\,ds + \varepsilon(t), \quad \text{for } t \in \mathcal{T}, \qquad (5)
\]
where, for $j = 1, \ldots, p$, $X_j^c(\cdot) = X_j(\cdot) - \mathbb{E}(X_j(\cdot))$ is a centered functional covariate with domain $\mathcal{S}_j$ and $\beta_j(\cdot,\cdot)$ is the corresponding coefficient function defined on $\mathcal{S}_j \times \mathcal{T}$.[3] In particular, taking $X_j(\cdot)$ as a constant function yields a special case of model (5),
\[
Y(t) = \sum_{j=1}^{p} X_j \beta_j(t) + \varepsilon(t), \quad \text{for } t \in \mathcal{T},
\]
which is an FLM with functional responses and scalar covariates.
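For densely observed curves, one simple way to regularize the estimation of $\beta(\cdot,\cdot)$ in model (4) is double truncation: expand $X$ and $Y$ in their empirical eigenbases and regress the response scores on the covariate scores. The sketch below only illustrates this idea; the simulated data and the numbers of retained components $K$ and $L$ are assumptions made for the example.

```python
# Illustrative sketch: double-truncation estimation of beta(s, t) in model (4).
# The simulated data and the retained component numbers K and L are assumptions.
import numpy as np

rng = np.random.default_rng(1)
n, m = 300, 81
s = np.linspace(0.0, 1.0, m)
t = np.linspace(0.0, 1.0, m)
ds, dt = s[1] - s[0], t[1] - t[0]

# Simulate X_i(s) and Y_i(t) = integral of beta(s, t) X_i(s) ds + noise.
scores = rng.normal(size=(n, 2)) * np.array([1.5, 0.7])
X = scores @ np.vstack([np.sin(np.pi*s), np.cos(np.pi*s)])
beta = np.outer(np.sin(np.pi*s), np.cos(np.pi*t))            # true beta(s, t)
Y = X @ beta * ds + rng.normal(scale=0.05, size=(n, m))

def eigenbasis(Z, d, K):
    """Top-K empirical eigenfunctions (orthonormal in L2) of the centered curves."""
    Zc = Z - Z.mean(axis=0)
    vals, vecs = np.linalg.eigh(Zc.T @ Zc / len(Z) * d)
    return Zc, vecs[:, np.argsort(vals)[::-1][:K]] / np.sqrt(d)

K, L = 2, 2
Xc, phi = eigenbasis(X, ds, K)                 # basis over s
Yc, psi = eigenbasis(Y, dt, L)                 # basis over t
xi = Xc @ phi * ds                             # X scores, shape (n, K)
eta = Yc @ psi * dt                            # Y scores, shape (n, L)

# Least-squares fit of each Y score on the X scores, then reassemble beta(s, t).
B, *_ = np.linalg.lstsq(xi, eta, rcond=None)   # (K, L) coefficient matrix
beta_hat = phi @ B @ psi.T                     # estimate of beta(s, t) on the grid
beta0_hat = Y.mean(axis=0) - X.mean(axis=0) @ beta_hat * ds   # intercept function

print("max abs error of estimated beta(s, t):", round(np.abs(beta_hat - beta).max(), 3))
```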

Functional concurrent models

Assuming that $\mathcal{S} = \mathcal{T}$, another model, known as the functional concurrent model, sometimes also referred to as the varying-coefficient model, is of the form
\[
Y(t) = \alpha_0(t) + \alpha(t) X(t) + \varepsilon(t), \quad \text{for } t \in \mathcal{T}, \qquad (6)
\]
where $\alpha_0$ and $\alpha$ are coefficient functions. Note that model (6) assumes that the value of $Y$ at time $t$, i.e., $Y(t)$, only depends on that of $X$ at the same time, i.e., $X(t)$. Various estimation methods can be applied to model (6).[11][12][13]
Adding multiple functional covariates, model (6) can also be extended to
\[
Y(t) = \alpha_0(t) + \sum_{j=1}^{p} \alpha_j(t) X_j(t) + \varepsilon(t), \quad \text{for } t \in \mathcal{T},
\]
where $X_1, \ldots, X_p$ are multiple functional covariates with domain $\mathcal{T}$ and $\alpha_0, \alpha_1, \ldots, \alpha_p$ are the coefficient functions with the same domain.[3]
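A basic estimation strategy for model (6) is to fit a separate least-squares regression at each grid point and, in practice, smooth the resulting coefficient estimates over $t$. Only the pointwise step is sketched below; the simulated data, grid, and noise level are illustrative assumptions.

```python
# Illustrative sketch: pointwise least squares for the functional concurrent
# model (6). In practice the pointwise estimates are smoothed over t; the
# simulated data are assumptions made for this example.
import numpy as np

rng = np.random.default_rng(2)
n, m = 150, 61
t = np.linspace(0.0, 1.0, m)

alpha0 = np.sin(2*np.pi*t)                       # true intercept function
alpha1 = 1.0 + t**2                              # true slope function
X = np.cos(2*np.pi*t) + rng.normal(size=(n, m))  # covariate curves
Y = alpha0 + alpha1 * X + rng.normal(scale=0.2, size=(n, m))

# At each grid point t_j, regress {Y_i(t_j)} on {X_i(t_j)} with an intercept.
alpha0_hat = np.empty(m)
alpha1_hat = np.empty(m)
for j in range(m):
    design = np.column_stack([np.ones(n), X[:, j]])
    coef, *_ = np.linalg.lstsq(design, Y[:, j], rcond=None)
    alpha0_hat[j], alpha1_hat[j] = coef

print("max abs error of alpha_0:", round(np.abs(alpha0_hat - alpha0).max(), 3))
print("max abs error of alpha_1:", round(np.abs(alpha1_hat - alpha1).max(), 3))
```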

Functional nonlinear models

Functional polynomial models

Functional polynomial models are an extension of the FLMs with scalar responses, analogous to extending linear regression to polynomial regression. For a scalar response $Y$ and a functional covariate $X(\cdot)$ with domain $\mathcal{T}$, the simplest example of functional polynomial models is functional quadratic regression[14]
\[
Y = \alpha + \int_{\mathcal{T}} \beta(t) X^c(t)\,dt + \int_{\mathcal{T}} \int_{\mathcal{T}} \gamma(s,t) X^c(s) X^c(t)\,ds\,dt + \varepsilon,
\]
where $X^c(\cdot) = X(\cdot) - \mathbb{E}(X(\cdot))$ is the centered functional covariate, $\alpha$ is a scalar coefficient, $\beta(\cdot)$ and $\gamma(\cdot,\cdot)$ are coefficient functions with domains $\mathcal{T}$ and $\mathcal{T}\times\mathcal{T}$, respectively, and $\varepsilon$ is a random error with mean zero and finite variance. By analogy with FLMs with scalar responses, estimation of functional polynomial models can be obtained through expanding both the centered covariate $X^c$ and the coefficient functions $\beta$ and $\gamma$ in an orthonormal basis.[14]
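As a concrete illustration of the basis-expansion approach, the sketch below truncates the eigenbasis expansion of $X^c$ at $K$ components and regresses $Y$ on the scores and their pairwise products, which corresponds to representing $\beta$ and $\gamma$ in the same truncated basis. The simulated data and the choice $K = 2$ are assumptions made for the example.

```python
# Illustrative sketch: functional quadratic regression via a truncated
# eigenbasis expansion of Xc. K and the simulated data are assumptions.
import numpy as np
from itertools import combinations_with_replacement

rng = np.random.default_rng(3)
n, m, K = 400, 101, 2
t = np.linspace(0.0, 1.0, m)
dt = t[1] - t[0]

scores = rng.normal(size=(n, 2)) * np.array([1.5, 0.8])
X = scores @ np.vstack([np.sin(2*np.pi*t), np.cos(2*np.pi*t)])
# Response with a genuine quadratic component in the first score.
Y = 0.5 + scores[:, 0] - 0.7*scores[:, 1] + 0.6*scores[:, 0]**2 + rng.normal(scale=0.1, size=n)

# Eigenbasis of the centered covariate and its Karhunen-Loeve scores.
Xc = X - X.mean(axis=0)
vals, vecs = np.linalg.eigh(Xc.T @ Xc / n * dt)
phi = vecs[:, np.argsort(vals)[::-1][:K]] / np.sqrt(dt)
xi = Xc @ phi * dt                                        # shape (n, K)

# Design: intercept, linear scores, and all pairwise score products (k <= l).
pairs = list(combinations_with_replacement(range(K), 2))
quad = np.column_stack([xi[:, k] * xi[:, l] for k, l in pairs])
design = np.column_stack([np.ones(n), xi, quad])
coef, *_ = np.linalg.lstsq(design, Y, rcond=None)

beta_hat = phi @ coef[1:1+K]                              # estimate of beta(t)
gamma_hat = np.zeros((m, m))                              # estimate of gamma(s, t)
for (k, l), c in zip(pairs, coef[1+K:]):
    outer = np.outer(phi[:, k], phi[:, l])
    gamma_hat += c * (outer + outer.T) / 2 if k != l else c * outer

print("fitted basis coefficients:", np.round(coef, 2))
```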

Functional single and multiple index models

A functional multiple index model is given by
\[
Y = g\!\left(\int_{\mathcal{T}} X^c(t)\beta_1(t)\,dt, \ldots, \int_{\mathcal{T}} X^c(t)\beta_p(t)\,dt\right) + \varepsilon.
\]
Taking $p = 1$ yields a functional single index model. However, for $p > 1$, this model is problematic due to the curse of dimensionality: with $p > 1$ and relatively small sample sizes, the estimator given by this model often has large variance.[15] An alternative $p$-component functional multiple index model can be expressed as
\[
Y = g_1\!\left(\int_{\mathcal{T}} X^c(t)\beta_1(t)\,dt\right) + \cdots + g_p\!\left(\int_{\mathcal{T}} X^c(t)\beta_p(t)\,dt\right) + \varepsilon.
\]
Estimation methods for functional single and multiple index models are available.[15][16]
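The following sketch illustrates a simple two-step fit of a functional single index model ($p = 1$): with Gaussian covariate scores, an ordinary FLM fit recovers the direction of $\beta_1$ up to scale, and the link $g$ is then estimated by smoothing $Y$ against the fitted index. This recipe and the simulated data are illustrative assumptions, not the estimators of the cited references.

```python
# Illustrative sketch: two-step fit of a functional single index model (p = 1).
# Step 1 uses an FLM fit to get the index direction (valid up to scale for
# Gaussian scores); step 2 smooths Y against the estimated index. This recipe
# and the simulated data are assumptions, not the estimators of refs. [15][16].
import numpy as np

rng = np.random.default_rng(4)
n, m, K = 500, 101, 3
t = np.linspace(0.0, 1.0, m)
dt = t[1] - t[0]

basis = np.vstack([np.sin(2*np.pi*t), np.cos(2*np.pi*t), np.sin(4*np.pi*t)])
X = rng.normal(size=(n, 3)) @ basis
beta1 = np.sin(2*np.pi*t) + 0.5*np.cos(2*np.pi*t)
index = (X - X.mean(axis=0)) @ beta1 * dt
Y = np.sin(index) + index + rng.normal(scale=0.1, size=n)    # g(u) = sin(u) + u

# Step 1: index direction from a truncated FLM fit on eigenbasis scores.
Xc = X - X.mean(axis=0)
vals, vecs = np.linalg.eigh(Xc.T @ Xc / n * dt)
phi = vecs[:, np.argsort(vals)[::-1][:K]] / np.sqrt(dt)
xi = Xc @ phi * dt
b, *_ = np.linalg.lstsq(np.column_stack([np.ones(n), xi]), Y, rcond=None)
beta_dir = phi @ b[1:]
beta_dir /= np.sqrt(np.sum(beta_dir**2) * dt)                # normalize in L2

# Step 2: estimate the link g by a simple polynomial smooth of Y on the index.
u = Xc @ beta_dir * dt
g_coef = np.polyfit(u, Y, deg=5)
fitted = np.polyval(g_coef, u)

print("|corr| between estimated and true index:", round(abs(np.corrcoef(u, index)[0, 1]), 3))
print("in-sample RMSE of the single index fit:", round(np.sqrt(np.mean((fitted - Y)**2)), 3))
```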

Functional additive models (FAMs)

Given an expansion of a functional covariate $X$ with domain $\mathcal{T}$ in an orthonormal basis $\{\phi_k\}_{k=1}^{\infty}$, $X(t) = \sum_{k=1}^{\infty} x_k \phi_k(t)$, a functional linear model with scalar responses, shown in model (2), can be written as
\[
\mathbb{E}(Y \mid X) = \mathbb{E}(Y) + \sum_{k=1}^{\infty} \beta_k x_k.
\]
One form of FAMs is obtained by replacing the linear function of $x_k$, i.e., $\beta_k x_k$, by a general smooth function $f_k$,
\[
\mathbb{E}(Y \mid X) = \mathbb{E}(Y) + \sum_{k=1}^{\infty} f_k(x_k),
\]
where $f_k$ satisfies $\mathbb{E}(f_k(x_k)) = 0$ for $k \in \mathbb{N}$.[3][17] Another form of FAMs consists of a sequence of time-additive models:
\[
\mathbb{E}(Y \mid X(t_1), \ldots, X(t_p)) = \sum_{j=1}^{p} f_j(X(t_j)),
\]
where $\{t_1, \ldots, t_p\}$ is a dense grid on $\mathcal{T}$ with increasing size $p \in \mathbb{N}$, and $f_j(x) = g(t_j, x)$ with $g$ a smooth function, for $j = 1, \ldots, p$.[3][18]
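The first form of FAM can be illustrated by computing the leading Karhunen–Loève scores and fitting an additive model in them; in the sketch below, low-degree centered polynomials stand in for the general smoothers $f_k$. The truncation level, the polynomial degree, and the simulated data are assumptions made for the example.

```python
# Illustrative sketch of the first FAM form: functional PCA scores of Xc, then
# an additive fit Y ~ E(Y) + sum_k f_k(x_k) with centered polynomial component
# functions standing in for generic smoothers. K, the degree, and the data are
# assumptions made for this example.
import numpy as np

rng = np.random.default_rng(5)
n, m, K, deg = 400, 101, 2, 3
t = np.linspace(0.0, 1.0, m)
dt = t[1] - t[0]

scores = rng.normal(size=(n, 2)) * np.array([1.5, 0.8])
X = scores @ np.vstack([np.sin(2*np.pi*t), np.cos(2*np.pi*t)])
Y = np.sin(scores[:, 0]) + scores[:, 1]**2 + rng.normal(scale=0.1, size=n)

# Karhunen-Loeve scores of the centered covariate.
Xc = X - X.mean(axis=0)
vals, vecs = np.linalg.eigh(Xc.T @ Xc / n * dt)
phi = vecs[:, np.argsort(vals)[::-1][:K]] / np.sqrt(dt)
xi = Xc @ phi * dt                                   # shape (n, K)

# Polynomial features of each score, centered so that E(f_k(x_k)) ~ 0 in-sample.
F = np.column_stack([xi[:, k]**d for k in range(K) for d in range(1, deg + 1)])
F = F - F.mean(axis=0)
coef, *_ = np.linalg.lstsq(np.column_stack([np.ones(n), F]), Y, rcond=None)
fitted = coef[0] + F @ coef[1:]

print("intercept vs. sample mean of Y:", round(coef[0], 3), round(Y.mean(), 3))
print("in-sample RMSE of the additive fit:", round(np.sqrt(np.mean((fitted - Y)**2)), 3))
```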

Extensions

A direct extension of FLMs with scalar responses shown in model (2) is to add a link function to create a generalized functional linear model (GFLM), by analogy with extending linear regression to the generalized linear model (GLM). The three components of the GFLM are:

  1. Linear predictor $\eta = \beta_0 + \int_{\mathcal{T}} X^c(t)\beta(t)\,dt$;
  2. Variance function $\operatorname{Var}(Y \mid X) = V(\mu)$, where $\mu = \mathbb{E}(Y \mid X)$ is the conditional mean;
  3. Link function $g$ connecting the conditional mean and the linear predictor through $\mu = g(\eta)$.
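As an illustration, the sketch below fits a GFLM with a binary response, logistic $g$, and Bernoulli variance function $V(\mu) = \mu(1-\mu)$ by truncating to $K$ eigenbasis scores and running iteratively reweighted least squares (Newton's method). The simulated data, $K$, and the plain-NumPy fitting loop are assumptions made for the example.

```python
# Illustrative sketch: generalized functional linear model with a binary
# response, logistic g, and V(mu) = mu(1 - mu), fitted on K truncated
# eigenbasis scores by Newton/IRLS. Data, K, and the loop are assumptions.
import numpy as np

rng = np.random.default_rng(6)
n, m, K = 500, 101, 3
t = np.linspace(0.0, 1.0, m)
dt = t[1] - t[0]

X = rng.normal(size=(n, 3)) @ np.vstack([np.sin(2*np.pi*t), np.cos(2*np.pi*t), np.sin(4*np.pi*t)])
beta_true = 2.0*np.sin(2*np.pi*t) - np.cos(2*np.pi*t)
eta_true = -0.3 + (X - X.mean(axis=0)) @ beta_true * dt        # linear predictor
Y = rng.binomial(1, 1.0/(1.0 + np.exp(-eta_true)))             # mu = g(eta), logistic g

# Truncated eigenbasis scores of the centered covariate.
Xc = X - X.mean(axis=0)
vals, vecs = np.linalg.eigh(Xc.T @ Xc / n * dt)
phi = vecs[:, np.argsort(vals)[::-1][:K]] / np.sqrt(dt)
Z = np.column_stack([np.ones(n), Xc @ phi * dt])               # design: intercept + scores

# IRLS (Newton's method) for logistic regression on the scores.
theta = np.zeros(K + 1)
for _ in range(25):
    mu = 1.0/(1.0 + np.exp(-Z @ theta))
    W = mu * (1.0 - mu)                                        # variance function V(mu)
    grad = Z.T @ (Y - mu)
    hess = Z.T @ (Z * W[:, None])
    theta = theta + np.linalg.solve(hess, grad)

beta_hat = phi @ theta[1:]                                     # beta(t) on the grid
print("estimated intercept:", round(theta[0], 2))
print("L2 error of estimated beta:", round(np.sqrt(np.sum((beta_hat - beta_true)**2) * dt), 3))
```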

See also

References