On May 1, 11:03 pm, someone <newsbo...@gmail.com> wrote:

> On 05/02/2012 01:38 AM, Russ P. wrote:

> > On May 1, 4:05 pm, Paul Rubin<no.em...@nospam.invalid> wrote:
> >> someone<newsbo...@gmail.com> writes:
> >>> Actually I know some... I just didn't think about this as much as I
> >>> should have before writing the question. I know there's also something
> >>> like singular value decomposition, which I think can help solve
> >>> otherwise ill-posed problems.
> >
> >> You will probably get better advice if you are able to describe what
> >> problem (ill-posed or otherwise) you are actually trying to solve. SVD
> >> just separates out the orthogonal and scaling parts of the
> >> transformation induced by a matrix. Whether that is of any use to you
> >> is unclear since you don't say what you're trying to do.

>
> > I agree with the first sentence, but I take slight issue with the word
> > "just" in the second. The "orthogonal" part of the transformation is
> > non-distorting, but the "scaling" part essentially distorts the space.
> > At least that's how I think about it. The larger the ratio between the
> > largest and smallest singular values, the more distortion there is. SVD
> > may or may not be the best choice for the final algorithm, but it is
> > useful for visualizing the transformation you are applying. It can
> > provide clues about the quality of the selection of independent
> > variables, state variables, or inputs.

>
> I would like to hear more! :-)
>
> I would really appreciate it if anyone could post a simple SVD
> example and explain what the vectors from the SVD represent
> geometrically / visually, because I don't understand it well enough
> and I'm sure it's very important when it comes to solving matrix
> systems...

SVD is perhaps the ultimate matrix decomposition and the ultimate tool
for linear analysis. Google it and take a look at the excellent
Wikipedia page on it. I would be wasting my time if I tried to compete
with that.

To really appreciate the SVD, you need some background in linear
algebra. In particular, you need to understand orthogonal
transformations. Think about a standard 3D Cartesian coordinate frame.
A rotation of the coordinate frame is an orthogonal transformation of
coordinates. The original frame and the new frame are both orthogonal.
A vector in one frame is converted to the other frame by multiplying
by an orthogonal matrix. The main feature of an orthogonal matrix is
that its transpose is its inverse (hence the inverse is trivial to
compute).
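You can check that transpose-is-inverse property yourself. Here's a
quick numpy sketch (the 30-degree angle is just an arbitrary example):

```python
import numpy as np

# A 30-degree rotation about the z axis: an orthogonal matrix.
theta = np.radians(30.0)
c, s = np.cos(theta), np.sin(theta)
R = np.array([[c,  -s,  0.0],
              [s,   c,  0.0],
              [0.0, 0.0, 1.0]])

# The transpose is the inverse: R.T @ R is the identity.
print(np.allclose(R.T @ R, np.eye(3)))   # True

# Converting a vector to the rotated frame and back recovers it.
v = np.array([1.0, 2.0, 3.0])
v_rot = R @ v
print(np.allclose(R.T @ v_rot, v))       # True
```

That is why inverting an orthogonal matrix costs essentially nothing:
you never call a matrix-inversion routine, you just transpose.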

The SVD can be thought of as factoring any linear transformation into
a rotation, then a scaling, followed by another rotation. The scaling
is represented by the middle matrix of the decomposition, which is a
diagonal matrix of the same dimensions as the original matrix. The
singular values can be read off of the diagonal. If any of them are
zero, then the original matrix is singular. If the ratio of the
largest to smallest singular value is large, then the original matrix
is said to be poorly conditioned.
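Here is what that looks like in numpy, using a small symmetric matrix
picked so the singular values come out to round numbers:

```python
import numpy as np

A = np.array([[3.0, 1.0],
              [1.0, 3.0]])

# numpy returns A = U @ diag(s) @ Vt, where U and Vt are orthogonal
# (the two rotations) and s holds the singular values, largest first.
U, s, Vt = np.linalg.svd(A)

print(s)                                    # [4. 2.]
print(np.allclose(U @ np.diag(s) @ Vt, A))  # True: the factorization holds

# Condition number = largest / smallest singular value.
print(s[0] / s[-1])        # 2.0 -- well conditioned
print(np.linalg.cond(A))   # same ratio, computed for you
```

A zero in `s` would mean A is singular; a huge ratio `s[0] / s[-1]`
means it is poorly conditioned.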

Standard Cartesian coordinate frames are orthogonal. Imagine an x-y
coordinate frame in which the axes are not orthogonal. Such a
coordinate frame is possible, but it is rarely used. If the axes are
parallel, the coordinate frame is singular and effectively reduces to
one dimension. If the x and y axes are nearly parallel, the coordinate
frame could still be used in theory, but it will be poorly
conditioned. You will need large coordinate values to represent points
fairly close to the origin, and small deviations in position will
translate into large changes in coordinate values. That can lead to
problems due to numerical roundoff errors and other kinds of errors.
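You can see that effect numerically. In this sketch the columns of B
are the two axis directions of a skewed frame (the 1e-6 skew is an
arbitrary choice to make them nearly parallel):

```python
import numpy as np

# Columns of B are the x and y axes of a skewed frame;
# they are nearly parallel, so the frame is poorly conditioned.
eps = 1e-6
B = np.array([[1.0, 1.0],
              [0.0, eps]])

s = np.linalg.svd(B, compute_uv=False)
print(s[0] / s[-1])            # ~2e6: a huge singular-value ratio

# A tiny change in the point causes a large change in coordinates:
p1 = np.array([1.0, 0.0])
p2 = np.array([1.0, 1e-6])     # p2 differs from p1 by only 1e-6

c1 = np.linalg.solve(B, p1)    # coordinates of p1 in the skewed frame
c2 = np.linalg.solve(B, p2)
print(c1)                      # [1. 0.]
print(c2)                      # [0. 1.] -- a jump of order 1
```

Moving the point by one part in a million swings its coordinates from
(1, 0) to (0, 1), which is exactly the amplification of small errors
that makes ill-conditioned systems numerically treacherous.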

--Russ P.

--
http://mail.python.org/mailman/listinfo/python-list