Talk:Gram–Schmidt process


Extension to polynomials?

When working with polynomials and an arbitrary weight function, there is a recursive Gram-Schmidt orthonormalization technique. More details are provided below:

http://mathworld.wolfram.com/Gram-SchmidtOrthonormalization.html 70.162.89.24 (talk) 05:51, 30 August 2013 (UTC)[reply]
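For anyone looking for something concrete: below is a rough Python sketch of weighted Gram-Schmidt over the monomials. It is not the exact recurrence from the MathWorld page, and the function names are made up for illustration, but with the weight w(x) = 1 on [-1, 1] it reproduces (normalized) Legendre polynomials.

<syntaxhighlight lang="python">
import numpy as np
from scipy.integrate import quad

def inner(p, q, weight, a, b):
    """Weighted inner product <p, q> = integral over [a, b] of p(x) q(x) w(x) dx."""
    return quad(lambda x: p(x) * q(x) * weight(x), a, b)[0]

def orthonormal_polynomials(n, weight, a, b):
    """First n polynomials, orthonormal with respect to the given weight."""
    monomials = [np.polynomial.Polynomial.basis(k) for k in range(n)]  # 1, x, x^2, ...
    ortho = []
    for v in monomials:
        u = v
        for e in ortho:                                # subtract projections onto the
            u = u - inner(v, e, weight, a, b) * e      # polynomials found so far
        u = u / np.sqrt(inner(u, u, weight, a, b))     # normalize
        ortho.append(u)
    return ortho

# Weight 1 on [-1, 1]: coefficients proportional to the Legendre polynomials.
for p in orthonormal_polynomials(4, lambda x: 1.0, -1.0, 1.0):
    print(np.round(p.coef, 6))
</syntaxhighlight>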

Another example or two is needed, particularly something in a function (i.e. functional analysis) setting: some simple polynomial examples with a suitable inner product, say starting from the monomials 1, x and x^2. Examples like this are covered in introductions to signal processing, so I imagine it's quite important in certain engineering fields. Also, this will produce orthogonal or orthonormal polynomials, making a nice connection to special functions. For example, on the interval <math>[-1, 1]</math> with the inner product <math>\langle f, g\rangle = \int_{-1}^{1} f(x)\,g(x)\,dx</math>, one recovers the Legendre polynomials. Improbable keeler (talk) 09:29, 18 January 2018 (UTC)[reply]
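For concreteness, a sketch of the kind of worked example suggested above (starting from the monomials, which is an assumption on my part): with <math>\langle f, g\rangle = \int_{-1}^{1} f(x)\,g(x)\,dx</math> and <math>v_1 = 1</math>, <math>v_2 = x</math>, <math>v_3 = x^2</math>, the process gives

<math display="block">u_1 = 1, \qquad u_2 = x - \frac{\langle x, 1\rangle}{\langle 1, 1\rangle}\,1 = x, \qquad u_3 = x^2 - \frac{\langle x^2, 1\rangle}{\langle 1, 1\rangle}\,1 - \frac{\langle x^2, x\rangle}{\langle x, x\rangle}\,x = x^2 - \tfrac{1}{3},</math>

which are proportional to the Legendre polynomials <math>P_0</math>, <math>P_1</math>, <math>P_2</math>.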

Determinant formula

I have two complaints about this section!

First of all, the section defines some vectors e_i and then never mentions them again. Frankly, I have no idea what's going on there.

Secondly, the section contains the text: "Note that the expression for uk is a "formal" determinant, i.e. the matrix contains both scalars and vectors; the meaning of this expression is defined to be the result of a cofactor expansion along the row of vectors."

I highly doubt that this explanation will be the least bit enlightening to anyone who didn't already know what was going on, and I also doubt that the link would be very enlightening without a bit of additional context. (For instance, the text should indicate which row the Laplace expansion is being taken over - the last one - and then indicate the formula as a linear combination of the v's with coefficients coming from the corresponding minor determinants.) 2602:30A:C04C:5F30:805:108B:A61A:2175 (talk) 23:37, 3 August 2014 (UTC)[reply]
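To spell out what the cofactor expansion looks like, here is just the k = 3 case, ignoring the normalization constant and the exact ordering of the arguments inside the inner products (a sketch of the mechanics, not proposed article text):

<math display="block">\mathbf u_3 \;\propto\; \begin{vmatrix} \langle \mathbf v_1, \mathbf v_1\rangle & \langle \mathbf v_1, \mathbf v_2\rangle & \langle \mathbf v_1, \mathbf v_3\rangle \\ \langle \mathbf v_2, \mathbf v_1\rangle & \langle \mathbf v_2, \mathbf v_2\rangle & \langle \mathbf v_2, \mathbf v_3\rangle \\ \mathbf v_1 & \mathbf v_2 & \mathbf v_3 \end{vmatrix} = \begin{vmatrix} \langle \mathbf v_1, \mathbf v_2\rangle & \langle \mathbf v_1, \mathbf v_3\rangle \\ \langle \mathbf v_2, \mathbf v_2\rangle & \langle \mathbf v_2, \mathbf v_3\rangle \end{vmatrix}\mathbf v_1 \;-\; \begin{vmatrix} \langle \mathbf v_1, \mathbf v_1\rangle & \langle \mathbf v_1, \mathbf v_3\rangle \\ \langle \mathbf v_2, \mathbf v_1\rangle & \langle \mathbf v_2, \mathbf v_3\rangle \end{vmatrix}\mathbf v_2 \;+\; \begin{vmatrix} \langle \mathbf v_1, \mathbf v_1\rangle & \langle \mathbf v_1, \mathbf v_2\rangle \\ \langle \mathbf v_2, \mathbf v_1\rangle & \langle \mathbf v_2, \mathbf v_2\rangle \end{vmatrix}\mathbf v_3,</math>

i.e. the Laplace expansion along the last row (the row of vectors) writes <math>\mathbf u_3</math> as a linear combination of <math>\mathbf v_1, \mathbf v_2, \mathbf v_3</math> whose coefficients are, up to alternating signs, the 2×2 minors built from the inner products.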

Francesco Caravelli: I have been trying very hard to find the determinant formula in the literature. There is no reference in the wiki page. Very frustrating! Update: I have found a similar formula in Gantmacher: Theory of matrices (1959) Volume 1, Pages 256-258.

— Preceding unsigned comment added by 204.121.137.208 (talk) 17:28, 16 March 2017 (UTC)[reply] 

Untitled

The Matlab implementation of the Gram-Schmidt process is for a specific norm and inner product definition (here the standard Euclidean inner product and, by extension, the 2-norm). It should be updated to reflect that. — Preceding unsigned comment added by BlackMetalStats (talk · contribs) 00:19, 3 April 2017 (UTC)[reply]
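To illustrate the point, here is a minimal sketch (Python rather than Matlab, and the names are mine) of the same process with the inner product passed in as a parameter instead of being hard-coded to the Euclidean one:

<syntaxhighlight lang="python">
import numpy as np

def gram_schmidt(vectors, inner=np.dot):
    """Orthonormalize `vectors` with respect to the inner product `inner`.

    The default is the standard Euclidean inner product, whose induced norm
    is the 2-norm -- the case the current Matlab snippet covers.
    """
    basis = []
    for v in vectors:
        u = np.asarray(v, dtype=float)
        for e in basis:
            u = u - inner(u, e) * e            # subtract the projection onto e
        norm = np.sqrt(inner(u, u))            # norm induced by the inner product
        if norm > 1e-12:                       # drop (numerically) dependent vectors
            basis.append(u / norm)
    return basis

# Example with a non-Euclidean inner product <x, y> = x^T W y, W positive definite.
W = np.diag([1.0, 2.0, 3.0])
weighted = lambda x, y: x @ W @ y
B = gram_schmidt([[1, 1, 0], [1, 0, 1], [0, 1, 1]], inner=weighted)
print(np.round([[weighted(a, b) for b in B] for a in B], 10))  # ~ identity matrix
</syntaxhighlight>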

Historical origins

The article originally said that the method had appeared in the work of both Laplace and Cauchy, citing the Cheney and Kincaid book.[1] But the book only mentions that Laplace was "familiar" with the method. I couldn't see a reference to Cauchy on Google Books, but I don't have a copy of the book. Arguably a better historical reference is needed. Improbable keeler (talk) 06:57, 18 January 2018 (UTC)[reply]

References

  1. ^ Cheney, Ward; Kincaid, David (2009). Linear Algebra: Theory and Applications. Sudbury, MA: Jones and Bartlett. pp. 544, 558. ISBN 978-0-7637-5020-6.

The definition of projection is INCORRECT+Proof

If we use the projection as it is defined here, I can prove that the set of vectors you get is not orthogonal. Proof:

For the field we take the complex numbers. We begin by computing the inner product of two of the vectors produced by the process, expanding one of them using the definition of the projection.

The coefficient of the projection is a number in the field, so we can take it out of the bracket, as defined in [1].

What remains is NOT zero in a complex vector space; but if you write the projection in the correct form, with the arguments of the inner product in the other order, everything makes sense.

Firouzyan (talk) 21:35, 15 May 2019 (UTC)[reply]

References

  1. ^ "Inner product space", Wikipedia, 2019-04-30, retrieved 2019-05-15