
28th December 05:28
External User
Posts: 1

Curve Fitting, more points than coefficients needed
In article <d01ebccd.0310290808.70957c2e@posting.google.com>,
MikeTheRed@Hotmail.com (Mike Larson) writes:

Least squares: the points (x(i), y(i)) are given for i = 1, ..., n with n >= 3. Find a, b, c to minimize

sum_i ( y(i) - (a + b*x(i) + c*x(i)^2) )^2

As a rule it is better to replace the variable x here by x - xmid, where xmid is (xmax + xmin)/2.

You can solve this in several ways, depending on your software and hardware possibilities. Usually this is done using a QR decomposition. Let

A = [ 1, x, x.^2 ]

where 1 is a column of n ones, x is the x vector, and x.^2 is x squared componentwise, so A is n by 3. (The norm here is the Euclidean length.) There is software to compute the decomposition

Q*A = [ R ; O ]

with a three-by-three matrix R and Q orthogonal. Then

R*[a,b,c]' = (Q*y)(1:3)

remains to be solved. There are also ready-to-use programs for this; see http://plato.la.asu.edu/topics/problems/nlolsq.html

hth peter 
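The QR recipe above can be sketched in a few lines of Python with NumPy. The sample data here is made up for illustration; the steps (midpoint shift, design matrix, QR, triangular solve) follow the post:

```python
import numpy as np

# Made-up sample data: n = 10 points on an exact quadratic.
x = np.linspace(0.0, 9.0, 10)
y = 1.0 + 2.0 * x + 0.5 * x**2

# Shift by the midpoint for numerical robustness, as suggested above.
xm = (x.max() + x.min()) / 2.0
xs = x - xm

# Build the n-by-3 matrix A = [1, x - xm, (x - xm)^2].
A = np.column_stack([np.ones_like(xs), xs, xs**2])

# Economy-size QR decomposition: A = Q R, Q is n-by-3, R is 3-by-3.
Q, R = np.linalg.qr(A)

# Solve R [a, b, c]' = Q' y (the triangular system from the post).
a, b, c = np.linalg.solve(R, Q.T @ y)
```

Since the data lies exactly on y = 1 + 2x + 0.5x^2, the shifted coefficients come out as a = 20.125, b = 6.5, c = 0.5 (with xm = 4.5), up to roundoff.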
28th December 05:28
External User
Posts: 1

Curve Fitting, more points than coefficients needed
In article <d01ebccd.0310300618.51cedf4e@posting.google.com>,
MikeTheRed@Hotmail.com (Mike Larson) writes:

The good message first: you need not do anything other than call a ready-made code with your numbers. Here is one in C (not the one from Numerical Recipes, which has a minor flaw in the termination criterion that may lead to infinite looping): http://www.netlib.org/c/meschach contains QR and SVD decompositions and solvers.

There were already two explanations, but these may be a bit too general for you, hence I try again: you have n (here 10, but maybe many more) points (x(i), y(i)) and want a, b, c such that y(i) is approximately equal to a + b*(x(i)-xm) + c*(x(i)-xm)^2. The curve then will be y = a + b*(x-xm) + c*(x-xm)^2. (The trick with xm is to make the problem more robust against roundoff effects; xm = (max(x(i)) + min(x(i)))/2 is a good choice.)

The idea is to build the difference (which might be positive or negative), square it, and sum up. This is one possible and often used measure for the goodness of the fit, known as the method of least squares (invented by C. F. Gauss). Its advantage: in order to minimize it you can differentiate with respect to a, b, c, set the derivatives to zero, and get three equations for your three unknowns, which will have a unique solution provided only that you have three different x(i). (This allows you to have repeated measurements, for example (1,2), (1,2.1), (1,1.8), ...)

So we have

F(a,b,c) = sum_{i=1}^n ( y(i) - (a + b*(x(i)-xm) + c*(x(i)-xm)^2) )^2

to minimize with respect to a, b, c. We differentiate and set the result to zero. This gives

sum_{i=1}^n ( y(i) - (a + b*(x(i)-xm) + c*(x(i)-xm)^2) ) * 2*(-1) = 0
sum_{i=1}^n ( y(i) - (a + b*(x(i)-xm) + c*(x(i)-xm)^2) ) * 2*(-(x(i)-xm)) = 0
sum_{i=1}^n ( y(i) - (a + b*(x(i)-xm) + c*(x(i)-xm)^2) ) * 2*(-(x(i)-xm)^2) = 0

We reorder this, cancel the common factor 2, and get a system of three linear equations in a, b and c. These are the so-called normal equations. Obviously you know how to solve three equations in three unknowns. 
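As a sketch of the normal-equations route just described, here is a short Python/NumPy version. The data is invented (including a repeated x value, which the post explicitly allows); setting the gradient of F to zero is equivalent to solving A'A [a,b,c]' = A'y:

```python
import numpy as np

# Made-up data with a repeated measurement at x = 1.
x = np.array([1.0, 1.0, 2.0, 3.0, 4.0, 5.0])
y = np.array([2.0, 2.1, 3.0, 5.5, 9.0, 14.0])

xm = (x.max() + x.min()) / 2.0
xs = x - xm

# Design matrix with columns 1, (x - xm), (x - xm)^2.
A = np.column_stack([np.ones_like(xs), xs, xs**2])

# Setting the gradient of F(a,b,c) to zero gives the normal equations
# A' A [a, b, c]' = A' y -- three linear equations in three unknowns.
coeff = np.linalg.solve(A.T @ A, A.T @ y)
a, b, c = coeff

# The fitted curve is y = a + b*(x - xm) + c*(x - xm)^2.
```

At the solution the gradient vanishes, i.e. A'(A*coeff - y) = 0, which is a handy check on the result.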
Here it is:

[ n   s1  s2 ] [ a ]   [ sy0 ]
[ s1  s2  s3 ] [ b ] = [ sy1 ]
[ s2  s3  s4 ] [ c ]   [ sy2 ]

n is the number of points and

s1  = sum (x(i)-xm)
s2  = sum (x(i)-xm)^2
s3  = sum (x(i)-xm)^3
s4  = sum (x(i)-xm)^4
sy0 = sum y(i)
sy1 = sum y(i)*(x(i)-xm)
sy2 = sum y(i)*(x(i)-xm)^2

Systems of this type are often quite sensitive to roundoff errors: for example, if the numbers x(i)-xm are quite large, say +/- 1000 or more, then the matrix will be almost singular and the rounded computation will spoil a, b, c. Therefore other techniques are preferred for solving the system. A good and rather simple one is the QR decomposition. (The SVD remains for extreme cases.)

If you reconsider the matrix above, you see that this system can also be written

A(transpose)*A*coeff = A(transpose)*y

where coeff is the vector [a,b,c] (as a column) and y is the vector of the y(i). A is an n-by-three matrix with the rows

[ 1, x(1)-xm, (x(1)-xm)^2 ]
........................
[ 1, x(n)-xm, (x(n)-xm)^2 ]

Now if you have an orthogonal matrix Q (n by n) such that

Q*A = [ R ]
      [ O ]

with a three-by-three matrix R and an (n-3)-by-3 zero matrix O, and you insert this, then you see that also

R*coeff = (Q*y), first 3 components,

and this is even easier to solve than the normal equations directly. Q and R are given by a method called QR decomposition. But since the software exists, you can use it by just plugging in the matrix A and the right-hand side y.

hth peter 
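The two routes above (explicit power sums vs. QR on the design matrix) can be compared directly in Python with NumPy. On well-scaled made-up data they agree to roundoff; the point of the post is that for badly scaled x the QR version stays accurate where the sums version degrades:

```python
import numpy as np

# Made-up data.
x = np.array([0.0, 1.0, 2.0, 3.0, 4.0])
y = np.array([1.0, 1.5, 3.0, 5.5, 9.0])

xm = (x.max() + x.min()) / 2.0
xs = x - xm
n = len(x)

# Power sums appearing in the normal-equation matrix.
s1, s2, s3, s4 = (np.sum(xs**k) for k in (1, 2, 3, 4))
sy0, sy1, sy2 = np.sum(y), np.sum(y * xs), np.sum(y * xs**2)

# The 3x3 system from the post.
M = np.array([[n,  s1, s2],
              [s1, s2, s3],
              [s2, s3, s4]])
rhs = np.array([sy0, sy1, sy2])
coeff_normal = np.linalg.solve(M, rhs)

# Same answer via QR on the design matrix A (better conditioned).
A = np.column_stack([np.ones_like(xs), xs, xs**2])
Q, R = np.linalg.qr(A)
coeff_qr = np.linalg.solve(R, Q.T @ y)
```

Note that M is exactly A(transpose)*A and rhs is A(transpose)*y, so both computations solve the same normal equations; they differ only in how the arithmetic is arranged.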