Curve Fitting, more points than coefficients needed

#1 - 28th December 05:27 - mikethered


Hello,

I have a simple question. How do you curve fit a second-order
polynomial when you have more than three points? I have 10.

Would someone have an idea to share?

Thanks.

Mike

#2 - 28th December 05:28 - nospamspellucci


In article <d01ebccd.0310290808.70957c2e@posting.google.com>,
MikeTheRed@Hotmail.com (Mike Larson) writes:

least squares:
(x(i),y(i)) are given, i=1,..,n, n>=3.

find a,b,c to minimize
sum_{i=1}^n ( y(i)-(a+b*x(i)+c*x(i)^2) )^2
as a rule it is better to replace the variable "x" here by x-mid,
where mid is (xmax+xmin)/2.
you can solve this in several ways, depending on your software and
hardware possibilities. usually this is done using a qr decomposition of

A = [ 1, x, x.^2 ]

where 1 is a column of n ones, x the vector of the x(i), and x.^2 is
x componentwise squared, so A is n times 3. the problem is then to
minimize ||A*[a,b,c]' - y||, where y is the column of the y(i) and
the norm is the euclidean length.
then there is software to compute the decomposition
Q*A = [ R ; O ] with a three by three matrix R, O a zero block, and Q orthogonal.
then only
R*[a,b,c]' = (Q*y)(1:3)
remains to be solved.
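
to spell out why that small triangular system is all that remains (just
restating the above in the same notation, nothing new): since Q is orthogonal
it preserves euclidean length, so for any coefficient vector v = [a,b,c]'

||A*v - y||^2 = ||Q*A*v - Q*y||^2 = ||R*v - (Q*y)(1:3)||^2 + ||(Q*y)(4:n)||^2 ,

and the last term does not depend on v, so the minimum is reached exactly
where R*v = (Q*y)(1:3).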
there are also ready-to-use programs for this.
see
http://plato.la.asu.edu/topics/problems/nlolsq.html
hth
peter

#3 - 28th December 05:28 - arto huttunen


I assume that you are not a mathematician.

Use Excel:
make a table,
each row is x, x^2, y,
run the regression analysis to get the (least squares) coefficients.

Arto Huttunen

#4 - 28th December 05:28 - google


If you only need fitting results and not source code, try
my hobby curve and surface fitting web site, http://zunzun.com

James

#5 - 28th December 05:28 - mikethered


Thanks all for the ideas.

I won't claim to be a professional mathematician, but I did get a minor
in mathematics in college, 10 years ago.

I do need to understand how the analysis is performed because I need
to code it in C.

Peter Spellucci's explanation is the most in-depth, but it is just a
bit beyond what I remember. Can someone expand on it? I could solve
the curve fit using least squares for three unknowns and three points,
but I don't get how to use more than three points for the three unknowns.
How do you get to the equations to solve?

Thanks.

Mike

#6 - 28th December 05:28 - alexandru.lupas


Suppose the points are (x_i, y(i)), i=1,2,..,n, where
-infty < a <= x_i <= b < infty for 1 <= i <= n.
For f,g : [a,b] --> R denote A(f) = (1/n)*SUM_{k=1 to n} f(x_k) and
[f,g] = A(f*g) , ||f|| = sqrt([f,f]).
Consider two functions h_1, h_2 : [a,b] --> R such that:
i) h_1 is strictly monotonic on [a,b],
ii) for any three distinct points t_1, t_2, t_3 from [a,b] the
determinant

| 1  h_1(t_1)  h_2(t_1) |
| 1  h_1(t_2)  h_2(t_2) |
| 1  h_1(t_3)  h_2(t_3) |

is non-zero.
Denote by P the set of all linear combinations h of the form
h(x) = A + B*h_1(x) + C*h_2(x) with A, B, C real
numbers. For instance you can take h_1(x)=x, h_2(x)=x^2.
If h is in P, let us say that h is a "generalized polynomial"
of "degree <= 2".
Further consider that y(i)=f(x_i) , i=1,2,...,n , i.e. y(i) are values
of a certain function f:[a,b]-->R.

The question is to find, if it exists, the element H = Hf in P such that
||f - Hf|| = min_{h in P} ||f - h||. It can be shown that Hf exists and is
unique.
Because h_1 is strictly monotonic, the inverse h_1^{-1}
: h_1([a,b]) --> [a,b] exists.
The number m := h_1^{-1}(A(h_1)) is in [a,b]. Let us say that m is the
generalized mean of the points x(1),...,x(n). Further denote

A(f) = (1/n)*SUM_{k=1 to n} y(k)
a = A(h_1^2) - (h_1(m))^2
b = A(h_2)*h_1(m) - A(h_1*h_2)
c = A(h_1*h_2)*h_1(m) - A(h_2)*A(h_1^2)
p_1(x) = h_1(x) - h_1(m)
p_2(x) = a*h_2(x) + b*h_1(x) + c
w_j = 1/[p_j,p_j] = 1/A(p_j^2) = n/(SUM_{k=1 to n} p_j(x_k)^2)
Then

(*)   (Hf)(x) = A(f) + w_1*A(f*p_1)*p_1(x) + w_2*A(f*p_2)*p_2(x) .

With A(f) = (y(1)+y(2)+...+y(n))/n = (f(x_1)+f(x_2)+...+f(x_n))/n,
in the case h_1(x) = x, h_2(x) = x^2 you obtain

m = (x_1+x_2+...+x_n)/n
a = ((x_1^2-m^2)+...+(x_n^2-m^2))/n
b = (1/n)*SUM_{k=1 to n} (m-x_k)*x_k^2
c = (m/n)*SUM_{k=1 to n} x_k^3 - (1/n^2)*(SUM_{k=1 to n} x_k^2)^2
p_1(x) = x - m
p_2(x) = a*x^2 + b*x + c
w_1 = n/((x_1-m)^2+(x_2-m)^2+...+(x_n-m)^2)
w_2 = n/(p_2(x_1)^2+...+p_2(x_n)^2)
A(f*p_1) = [f,p_1] = (1/n)*SUM_{k=1 to n} y(k)*(x_k-m)
A(f*p_2) = [f,p_2] = (1/n)*SUM_{k=1 to n} y(k)*p_2(x_k)

and from (*) you find the desired polynomial of degree <= 2.
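
In case it helps to see that special case as code, here is a minimal C sketch
of the formulas above (the names are mine, not from any library; it assumes at
least three distinct x values, so that w_1 and w_2 are defined):

/* Evaluate the least-squares quadratic (Hf)(t) at a point t, using the
   closed-form construction above with h_1(x)=x, h_2(x)=x^2.
   x[], y[] hold the n data points. */
double quad_fit_eval(const double *x, const double *y, int n, double t)
{
    double Ax = 0, Ax2 = 0, Ax3 = 0, Af = 0;
    int k;

    for (k = 0; k < n; k++) {
        Ax  += x[k];
        Ax2 += x[k] * x[k];
        Ax3 += x[k] * x[k] * x[k];
        Af  += y[k];
    }
    Ax /= n; Ax2 /= n; Ax3 /= n; Af /= n;

    double m = Ax;                        /* the (generalized) mean */
    double a = Ax2 - m * m;
    double b = m * Ax2 - Ax3;
    double c = m * Ax3 - Ax2 * Ax2;

    double sp1 = 0, sp2 = 0, Afp1 = 0, Afp2 = 0;
    for (k = 0; k < n; k++) {
        double p1 = x[k] - m;
        double p2 = a * x[k] * x[k] + b * x[k] + c;
        sp1  += p1 * p1;
        sp2  += p2 * p2;
        Afp1 += y[k] * p1;
        Afp2 += y[k] * p2;
    }
    double w1 = n / sp1, w2 = n / sp2;    /* needs >= 3 distinct x values */
    Afp1 /= n; Afp2 /= n;                 /* A(f*p_1), A(f*p_2) */

    /* formula (*): (Hf)(t) = A(f) + w_1*A(f*p_1)*p_1(t) + w_2*A(f*p_2)*p_2(t) */
    return Af + w1 * Afp1 * (t - m) + w2 * Afp2 * (a * t * t + b * t + c);
}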
However, it is possible to select other curves: represent the points
(x_i, y(i)) by a polygonal line and look at the resulting graph.
If the arc of the curve you obtain looks similar to the graph of an
exponential, you can instead take
h_1(x) = e^{q*x} and h_2(x) = e^{r*x} with 0 < q < r.

Regards, Alex

#7 - 28th December 05:28 - axel hutt


say you have x(i), y(i) for i=1,...,T.

Then the least squares fit means you minimize

V = \sum_{i=1}^T ( \sum_{l=0}^{N-1} a(l)*z(l,i) - y(i) )^2

with respect to the a(l). Here you assume a polynomial of order N-1,

a(0) + a(1)*x(i) + a(2)*x^2(i) + ... + a(N-1)*x^{N-1}(i) =

a(0)*z(0,i) + a(1)*z(1,i) + a(2)*z(2,i) + ... + a(N-1)*z(N-1,i) ,

i.e. z(l,i) = x^l(i) .

After some analysis, you get your coefficients a(l):

a(l) = \sum_k Minv(l,k)*r(k) ,   (*)

where Minv is the inverse of the square matrix

M(l,k) = \sum_{i=1}^T z(l,i)*z(k,i) ,   l = 0...N-1 , k = 0...N-1 ,

i.e. the sum over the T values of z(l,i)*z(k,i),

and r(k) is

r(k) = \sum_{i=1}^T z(k,i)*y(i) ,

i.e. the sum over the T values of z(k,i)*y(i) .

Equation (*) gives the coefficients of the polynomial directly.
For a parabolic function you need a polynomial of 2nd order, i.e.
N=3. Hence

z(0,i) = 1 , z(1,i) = x(i) and z(2,i) = x(i)*x(i) .
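
Written out for N=3 (just a transcription of the above with the z's
substituted in; the same system, shifted by xm, appears again in
P. Spellucci's follow-up below), the system M*a = r reads

[ \sum 1        \sum x(i)      \sum x(i)^2 ] [ a(0) ]   [ \sum y(i)        ]
[ \sum x(i)     \sum x(i)^2    \sum x(i)^3 ] [ a(1) ] = [ \sum x(i)*y(i)   ]
[ \sum x(i)^2   \sum x(i)^3    \sum x(i)^4 ] [ a(2) ]   [ \sum x(i)^2*y(i) ]

with all sums over i=1..T, so in practice you solve this 3x3 linear
system rather than forming Minv explicitly.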

The latter calculations correspond to most parts of the posting
of P.Spellucci.

One remark: be happy that you have more data points than parameters,
as with fewer data points than parameters your coefficients would not be unique.
The number of degrees of freedom, i.e. the number of data points minus the
number of parameters, should not be negative if you want a unique solution.

For general polynomial fitting, I suggest computing the inverse
by SVD (see 'Numerical Recipes in C'), as it is the most robust method
for nearly singular matrices M, i.e. det(M) << 1.


Axel


--
Dr. rer.nat. Axel Hutt
Weierstraß-Institute for Applied Analysis and Stochastics
Mohrenstr. 39, 10117 Berlin, Germany
Tel.: +49 - (0)30 - 20372 564
Fax.: +49 - (0)30 - 2044 975
http://www.wias-berlin.de/people/hutt

#8 - 28th December 05:28 - nospamspellucci


In article <d01ebccd.0310300618.51cedf4e@posting.google.com>,
MikeTheRed@Hotmail.com (Mike Larson) writes:


the good message first: you need not do anything more than call a ready-made
code with your numbers. here is one in c
(not the one from numerical recipes: that one has a minor flaw in the
termination criterion which may lead to infinite looping):
http://www.netlib.org/c/meschach
contains QR and SVD decompositions and solvers.

There were already two explanations, but these may be a bit too general for
you, hence I try again:

you have n (here 10, but maybe much more) points (x(i),y(i)) and want
a,b,c such that

y(i) approximately equal to a+b*(x(i)-xm)+c*(x(i)-xm)^2 .
the curve then will be
y = a+b*(x-xm)+c*(x-xm)^2 .
(The trick with xm is to make the problem more robust against roundoff effects.
xm = (max(x(i))+min(x(i)))/2 is a good choice.)

The idea is to take the differences (which might be positive or negative),
square them, and sum them up. this is one possible and often used measure
for the goodness of the fit,
known as the method of least squares (invented by C.F. Gauss).

its advantage: in order to minimize it you
can differentiate with respect to a,b,c , set the derivatives to zero and get
three equations for your three unknowns, which will have a unique solution
provided only that you have at least three different x(i). (this allows you to
have repeated measurements: for example (1,2), (1,2.1), (1,1.8), ...)
so we have

F(a,b,c) = sum_{i=1}^n (y(i)-(a+b*(x(i)-xm)+c*(x(i)-xm)^2) )^2

to minimize with respect to a,b,c.
we differentiate and set the result to zero. this gives

sum_{i=1}^n (y(i)-(a+b*(x(i)-xm)+c*(x(i)-xm)^2) )*2*(-1) = 0
sum_{i=1}^n (y(i)-(a+b*(x(i)-xm)+c*(x(i)-xm)^2) )*2*(-(x(i)-xm)) = 0
sum_{i=1}^n (y(i)-(a+b*(x(i)-xm)+c*(x(i)-xm)^2) )*2*(-(x(i)-xm)^2) = 0

we reorder this, cancel the common factor 2 and get
a system of three linear equations in a,
b and c. these are the so-called normal equations.
obviously you know how to solve three equations in three unknowns. here
it is:


[ n s1 s2 ] [ a ] [ sy0 ]
[ s1 s2 s3 ] [ b ] = [ sy1 ]
[ s2 s3 s4 ] [ c ] [ sy2 ]

n is the number of points
s1 = sum (x(i)-xm)
s2 = sum (x(i)-xm)^2
s3 = sum (x(i)-xm)^3
s4 = sum (x(i)-xm)^4
sy0 = sum y(i)
sy1 = sum y(i)*(x(i)-xm)
sy2 = sum y(i)*(x(i)-xm)^2
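
since you mentioned you need this in C: here is a minimal, self-contained
sketch of exactly this recipe (my own sketch, not taken from any of the
libraries mentioned; it forms the sums above and then solves the 3x3 system
by Gaussian elimination with partial pivoting):

#include <math.h>

/* Fit y ~ a + b*(x-xm) + c*(x-xm)^2 by least squares via the normal
   equations.  Returns 0 on success, -1 if the 3x3 system is numerically
   singular (e.g. fewer than three distinct x values). */
int quadfit(const double *x, const double *y, int n,
            double *a, double *b, double *c, double *xm)
{
    double xmin = x[0], xmax = x[0];
    double s1 = 0, s2 = 0, s3 = 0, s4 = 0, sy0 = 0, sy1 = 0, sy2 = 0;
    double A[3][4];                     /* augmented matrix [ M | rhs ] */
    int i, j, k;

    for (i = 1; i < n; i++) {
        if (x[i] < xmin) xmin = x[i];
        if (x[i] > xmax) xmax = x[i];
    }
    *xm = 0.5 * (xmin + xmax);          /* the centering trick from above */

    for (i = 0; i < n; i++) {
        double t = x[i] - *xm;
        s1 += t; s2 += t*t; s3 += t*t*t; s4 += t*t*t*t;
        sy0 += y[i]; sy1 += y[i]*t; sy2 += y[i]*t*t;
    }

    A[0][0] = n;  A[0][1] = s1; A[0][2] = s2; A[0][3] = sy0;
    A[1][0] = s1; A[1][1] = s2; A[1][2] = s3; A[1][3] = sy1;
    A[2][0] = s2; A[2][1] = s3; A[2][2] = s4; A[2][3] = sy2;

    /* Gaussian elimination with partial pivoting, then back substitution */
    for (k = 0; k < 3; k++) {
        int p = k;
        for (i = k + 1; i < 3; i++)
            if (fabs(A[i][k]) > fabs(A[p][k])) p = i;
        if (fabs(A[p][k]) == 0.0) return -1;
        for (j = k; j < 4; j++) {       /* swap rows k and p */
            double tmp = A[k][j]; A[k][j] = A[p][j]; A[p][j] = tmp;
        }
        for (i = k + 1; i < 3; i++) {
            double f = A[i][k] / A[k][k];
            for (j = k; j < 4; j++) A[i][j] -= f * A[k][j];
        }
    }
    *c = A[2][3] / A[2][2];
    *b = (A[1][3] - A[1][2] * (*c)) / A[1][1];
    *a = (A[0][3] - A[0][1] * (*b) - A[0][2] * (*c)) / A[0][0];
    return 0;
}

the fitted curve is then y = a + b*(x-xm) + c*(x-xm)^2; if you want the
coefficients of y = A + B*x + C*x^2 you can expand the square, but
evaluating in the shifted variable is the numerically safer habit.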


systems of this type are often quite sensitive to roundoff errors:
for example, if the numbers x(i)-xm are quite large, say +/- 1000 or more,
then the matrix will be almost singular and the rounded computation will spoil
a,b,c. therefore other techniques are preferred for solving the system.
a good and rather simple one is the QR decomposition. (SVD remains for
extreme cases.)
if you reconsider the matrix above then you see that this system can also be
written as

A(transpose)*A*coeff = A(transpose)*y

where coeff is the vector [a,b,c] (as a column) and y is the vector of
the y(i). A is an n times three matrix with the rows
[1, x(1)-xm, (x(1)-xm)^2]
........................
[1, x(n)-xm, (x(n)-xm)^2]

now if you have an orthogonal matrix Q (n times n) such that

Q*A = [ R ]
      [ O ]

with a three by three matrix R and an (n-3) times 3 zero matrix O, and you
insert this, then you see that only

R*coeff = (Q*y), first 3 components,

remains to be solved, and this is even easier than solving the normal
equations directly. Q and R are given by a method called QR decomposition,
but since the software exists you can use it by just plugging in the matrix
A and the right-hand side y.
hth
peter

#9 - 28th December 05:28 - mikethered


Gentlemen,

Thank you for the information. I believe I have a handle on it.

Mike