Conjugate gradient squared method



In numerical linear algebra, the conjugate gradient squared method (CGS) is an iterative algorithm for solving systems of linear equations of the form A𝐱 = 𝐛, particularly in cases where computing the transpose A^T is impractical.[1] The CGS method was developed as an improvement to the biconjugate gradient method.[2][3][4]

Background

A system of linear equations A𝐱 = 𝐛 consists of a known matrix A and a known vector 𝐛. To solve the system is to find the value of the unknown vector 𝐱.[3][5] A direct method for solving a system of linear equations is to take the inverse of the matrix A, then calculate 𝐱 = A^(−1)𝐛. However, computing the inverse is computationally expensive. Hence, iterative methods are commonly used. Iterative methods begin with a guess 𝐱(0), and on each iteration the guess is improved. Once the difference between successive guesses is sufficiently small, the method has converged to a solution.[6][7]
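The iterative idea described above can be illustrated with a much simpler scheme than CGS. The following sketch uses Jacobi iteration (not the CGS method itself) purely to show the pattern of improving a guess until successive iterates are close; the function name and tolerance are illustrative choices, not from the source.

```python
import numpy as np

def jacobi(A, b, tol=1e-10, max_iter=500):
    """Illustrative iterative solver (Jacobi iteration, not CGS).

    Splits A into its diagonal D and remainder R, then repeatedly
    updates the guess x until successive guesses are nearly equal.
    """
    D = np.diag(A)          # diagonal entries of A
    R = A - np.diag(D)      # off-diagonal part of A
    x = np.zeros_like(b, dtype=float)  # initial guess x(0) = 0
    for k in range(max_iter):
        x_new = (b - R @ x) / D
        # Converged once successive guesses are sufficiently close
        if np.linalg.norm(x_new - x) < tol:
            return x_new, k + 1
        x = x_new
    return x, max_iter
```

Jacobi iteration only converges for certain matrices (e.g. strictly diagonally dominant ones), which is one motivation for more robust Krylov methods such as CGS.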

As with the conjugate gradient method, biconjugate gradient method, and similar iterative methods for solving systems of linear equations, the CGS method can be used to find solutions to multi-variable optimisation problems, such as power-flow analysis, hyperparameter optimisation, and facial recognition.[8]

Algorithm

The algorithm is as follows:[9]

  1. Choose an initial guess 𝐱(0)
  2. Compute the residual 𝐫(0) = 𝐛 − A𝐱(0)
  3. Choose 𝐫~ = 𝐫(0)
  4. For i = 1, 2, 3, … do:
    1. ρ(i−1) = 𝐫~^T 𝐫(i−1)
    2. If ρ(i−1) = 0, the method fails.
    3. If i = 1:
      1. 𝐩(1) = 𝐮(1) = 𝐫(0)
    4. Else:
      1. β(i−1) = ρ(i−1)/ρ(i−2)
      2. 𝐮(i) = 𝐫(i−1) + β(i−1)𝐪(i−1)
      3. 𝐩(i) = 𝐮(i) + β(i−1)(𝐪(i−1) + β(i−1)𝐩(i−1))
    5. Solve M𝐩^ = 𝐩(i), where M is a pre-conditioner.
    6. 𝐯^ = A𝐩^
    7. α(i) = ρ(i−1)/(𝐫~^T 𝐯^)
    8. 𝐪(i) = 𝐮(i) − α(i)𝐯^
    9. Solve M𝐮^ = 𝐮(i) + 𝐪(i)
    10. 𝐱(i) = 𝐱(i−1) + α(i)𝐮^
    11. 𝐪^ = A𝐮^
    12. 𝐫(i) = 𝐫(i−1) − α(i)𝐪^
    13. Check for convergence: if the method has converged, end the loop and return 𝐱(i)
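The steps above can be sketched in NumPy. This is a minimal unpreconditioned version (taking M as the identity, so the two "solve" steps become copies); the function name, tolerance, and iteration cap are illustrative assumptions, not part of the algorithm as stated.

```python
import numpy as np

def cgs(A, b, x0=None, tol=1e-8, max_iter=1000):
    """Sketch of the conjugate gradient squared method with M = I.

    Variable names mirror the algorithm steps above:
    r is the residual, r_tilde the shadow residual, and u, p, q
    the auxiliary direction vectors.
    """
    n = len(b)
    x = np.zeros(n) if x0 is None else x0.astype(float)
    r = b - A @ x              # step 2: initial residual r(0)
    r_tilde = r.copy()         # step 3: choose r~ = r(0)
    rho_prev = 1.0
    u = p = q = np.zeros(n)
    for i in range(1, max_iter + 1):
        rho = r_tilde @ r                  # step 4.1: rho(i-1)
        if rho == 0:
            raise RuntimeError("CGS breakdown: rho = 0")  # step 4.2
        if i == 1:
            u = r.copy()                   # step 4.3: p(1) = u(1) = r(0)
            p = u.copy()
        else:
            beta = rho / rho_prev          # step 4.4.1
            u = r + beta * q               # step 4.4.2
            p = u + beta * (q + beta * p)  # step 4.4.3
        v = A @ p                          # steps 4.5-4.6 with M = I
        alpha = rho / (r_tilde @ v)        # step 4.7
        q = u - alpha * v                  # step 4.8
        uq = u + q                         # step 4.9 with M = I
        x = x + alpha * uq                 # step 4.10
        r = r - alpha * (A @ uq)           # steps 4.11-4.12
        if np.linalg.norm(r) < tol * np.linalg.norm(b):  # step 4.13
            return x, i
        rho_prev = rho
    return x, max_iter
```

Note that, unlike the plain conjugate gradient method, this routine never forms A^T, which is the property the introduction highlights.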

References

Template:Reflist