
ECE 204: Numerical Methods

Linear regression

Given a regression $y = f(x)$ and a data set $(x_1, y_1), \dots, (x_n, y_n)$, the residual $E_i$ is the difference between the actual and regressed data:

$$E_i = y_i - f(x_i)$$

Method of least squares

This method minimises the sum of the squares of the residuals:

$$S_r = \sum_{i=1}^{n} E_i^2 = \sum_{i=1}^{n} \left(y_i - (a_0 + a_1 x_i)\right)^2$$

$a_0$ and $a_1$ can be found by taking the partial derivatives of $S_r$, setting them to zero, and solving for them:

$$\frac{\partial S_r}{\partial a_0} = -2\sum_{i=1}^{n}\left(y_i - a_0 - a_1 x_i\right) = 0 \qquad \frac{\partial S_r}{\partial a_1} = -2\sum_{i=1}^{n} x_i\left(y_i - a_0 - a_1 x_i\right) = 0$$

This returns, where $\bar{x}$ and $\bar{y}$ are the means of the actual $x$- and $y$-values:

$$a_1 = \frac{n\sum x_i y_i - \sum x_i \sum y_i}{n\sum x_i^2 - \left(\sum x_i\right)^2} \qquad a_0 = \bar{y} - a_1\bar{x}$$

The total sum of squares about the mean, $S_t$, is based on the actual data:

$$S_t = \sum_{i=1}^{n}\left(y_i - \bar{y}\right)^2$$

Error is measured with the coefficient of determination $r^2$; the closer the value is to 1, the lower the error.

$$r^2 = \frac{S_t - S_r}{S_t}$$

If the intercept is the origin, $a_1$ reduces down to a simpler form:

$$a_1 = \frac{\sum x_i y_i}{\sum x_i^2}$$
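
A minimal NumPy sketch of these formulas; the function name and the sample data are mine, for illustration only:

```python
import numpy as np

def linear_least_squares(x, y):
    """Fit y ≈ a0 + a1*x by least squares and return (a0, a1, r2)."""
    x, y = np.asarray(x, float), np.asarray(y, float)
    n = len(x)
    a1 = (n * np.sum(x * y) - np.sum(x) * np.sum(y)) / (n * np.sum(x**2) - np.sum(x)**2)
    a0 = np.mean(y) - a1 * np.mean(x)
    Sr = np.sum((y - (a0 + a1 * x))**2)   # sum of squared residuals
    St = np.sum((y - np.mean(y))**2)      # total sum of squares about the mean
    return a0, a1, (St - Sr) / St

# Made-up, roughly linear data
a0, a1, r2 = linear_least_squares([1, 2, 3, 4, 5], [2.1, 4.3, 5.9, 8.2, 9.8])
print(a0, a1, r2)
```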

Non-linear regression

Exponential regression

For an exponential regression of the form $y = ae^{bx}$, solving the same partial derivatives returns the coefficients, although a root-finding method such as bisection may be required for the exponent coefficient $b$. Instead, linearising may make things easier, by taking the natural logarithm of both sides:

$$\ln y = \ln a + bx$$

Afterward, solving as if it were in the linear form $y = a_0 + a_1 x$ (with $a_0 = \ln a$ and $a_1 = b$) returns correct results.
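
A possible sketch of the linearised fit, assuming the model $y = ae^{bx}$ and that all $y$-values are positive; the data points are made up:

```python
import numpy as np

def exponential_fit(x, y):
    """Fit y ≈ a*exp(b*x) by linearising: ln(y) = ln(a) + b*x, then linear least squares."""
    x, z = np.asarray(x, float), np.log(np.asarray(y, float))  # requires y > 0
    n = len(x)
    b = (n * np.sum(x * z) - np.sum(x) * np.sum(z)) / (n * np.sum(x**2) - np.sum(x)**2)
    ln_a = np.mean(z) - b * np.mean(x)
    return np.exp(ln_a), b

a, b = exponential_fit([0, 1, 2, 3], [1.0, 2.7, 7.3, 20.1])  # roughly e^x
print(a, b)  # close to a ≈ 1, b ≈ 1
```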


Polynomial regression

For an $m$-th order polynomial, the residual is the offset between each data point and the polynomial:

$$E_i = y_i - \left(a_0 + a_1 x_i + a_2 x_i^2 + \dots + a_m x_i^m\right)$$

Taking the relevant partial derivatives of $S_r$ with respect to each coefficient returns a system of linear equations, which can be solved as a matrix.
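
One way this could look in NumPy, assuming the normal-equation form $(A^TA)\,a = A^Ty$ with a Vandermonde matrix $A$; the construction and sample data are mine:

```python
import numpy as np

def poly_regression(x, y, m):
    """Least-squares fit of an m-th order polynomial by solving the normal equations."""
    x, y = np.asarray(x, float), np.asarray(y, float)
    A = np.vander(x, m + 1, increasing=True)   # columns: 1, x, x^2, ..., x^m
    # Normal equations (A^T A) a = A^T y, solved as a matrix system
    return np.linalg.solve(A.T @ A, A.T @ y)   # [a0, a1, ..., am]

coeffs = poly_regression([0, 1, 2, 3, 4], [1.1, 1.9, 5.2, 9.8, 17.1], m=2)
print(coeffs)  # roughly y ≈ 1 + x^2
```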

Interpolation

Interpolation ensures that the fitted curve passes through every data point.

Direct method

To interpolate $n$ data points, you need a polynomial of degree up to $n - 1$, and the chosen points should enclose the desired value. Substituting the $x$ and $y$ values forms a system of equations for a polynomial of degree equal to the number of points chosen minus one.
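
A rough sketch of the direct method: write one equation per point and solve the resulting system. The three points are taken from the velocity table further down; the evaluation point 16 is an arbitrary choice of mine:

```python
import numpy as np

def direct_interpolate(xs, ys, x):
    """Interpolate at x with the unique polynomial through the chosen points."""
    xs, ys = np.asarray(xs, float), np.asarray(ys, float)
    # One equation per point: a0 + a1*x_i + ... + a_{n-1}*x_i^(n-1) = y_i
    A = np.vander(xs, increasing=True)
    a = np.linalg.solve(A, ys)
    return sum(c * x**k for k, c in enumerate(a))

# Three points -> second-order polynomial
print(direct_interpolate([10, 15, 20], [227.04, 362.78, 517.35], 16))
```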

Newton's divided difference method

This method guesses the slope to interpolate. Where $x_0$ is an existing point:

$$f(x) = b_0 + b_1\left(x - x_0\right)$$

The constant $b_0$ is an existing $y$-value and the slope $b_1$ is an average rate of change between two points:

$$b_0 = f(x_0) \qquad b_1 = \frac{f(x_1) - f(x_0)}{x_1 - x_0}$$

This extends to a quadratic, where the second coefficient $b_2$ is the divided difference of the first two slopes:

$$f(x) = b_0 + b_1\left(x - x_0\right) + b_2\left(x - x_0\right)\left(x - x_1\right) \qquad b_2 = \frac{\dfrac{f(x_2) - f(x_1)}{x_2 - x_1} - \dfrac{f(x_1) - f(x_0)}{x_1 - x_0}}{x_2 - x_0}$$
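
A sketch of the same idea for an arbitrary number of points, building the divided-difference table and then evaluating the nested Newton form; the implementation details and test values are mine:

```python
import numpy as np

def newton_divided_difference(xs, ys, x):
    """Evaluate the Newton divided-difference polynomial through (xs, ys) at x."""
    xs = np.asarray(xs, float)
    coef = np.array(ys, dtype=float)
    n = len(xs)
    # Build b0, b1, ..., b_{n-1} in place, one column of the table per pass
    for j in range(1, n):
        coef[j:] = (coef[j:] - coef[j - 1:-1]) / (xs[j:] - xs[:n - j])
    # Evaluate b0 + b1(x - x0) + b2(x - x0)(x - x1) + ... from the back
    result = coef[-1]
    for k in range(n - 2, -1, -1):
        result = result * (x - xs[k]) + coef[k]
    return result

print(newton_divided_difference([10, 15, 20], [227.04, 362.78, 517.35], 16))
```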

Derivatives

Derivatives are estimated based on first principles:

$$f'(x) \approx \frac{f(x + \Delta x) - f(x)}{\Delta x}$$

Derivatives of continuous functions

At a desired $x$ for $f'(x)$ (a sketch of this loop follows the list):

  1. Choose an arbitrary $\Delta x$
  2. Calculate the derivative via first principles
  3. Shrink $\Delta x$ and recalculate the derivative
  4. If the answer is drastically different, repeat step 3
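
A minimal sketch of this procedure, assuming a simple forward difference and a tolerance for "drastically different"; both choices and the test function are mine:

```python
import math

def derivative(f, x, dx=1e-2, tol=1e-6, max_halvings=40):
    """Estimate f'(x) by first principles, shrinking dx until the answer settles."""
    prev = (f(x + dx) - f(x)) / dx
    for _ in range(max_halvings):
        dx /= 2                               # shrink Δx and recalculate
        current = (f(x + dx) - f(x)) / dx
        if abs(current - prev) < tol:         # consecutive answers agree -> accept
            return current
        prev = current
    return current

print(derivative(math.sin, 1.0))   # ≈ cos(1) ≈ 0.5403
```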

Derivatives of discrete functions

Guesses are made based on the average slope between two points.

Divided differences

  • Using the next term, or a $+\Delta x$, indicates a forward divided difference (FDD): $f'(x_i) \approx \dfrac{f(x_{i+1}) - f(x_i)}{\Delta x}$
  • Using the previous term, or a $-\Delta x$, indicates a backward divided difference (BDD): $f'(x_i) \approx \dfrac{f(x_i) - f(x_{i-1})}{\Delta x}$

The central divided difference averages both, provided the $\Delta x$ of the forward and backward DDs are equal:

$$f'(x_i) \approx \frac{f(x_{i+1}) - f(x_{i-1})}{2\Delta x}$$
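
A possible sketch of the three rules for a callable $f$; the test function and step size are arbitrary choices of mine:

```python
import math

def fdd(f, x, dx):
    """Forward divided difference: uses the next point."""
    return (f(x + dx) - f(x)) / dx

def bdd(f, x, dx):
    """Backward divided difference: uses the previous point."""
    return (f(x) - f(x - dx)) / dx

def cdd(f, x, dx):
    """Central divided difference: average of FDD and BDD with the same dx."""
    return (f(x + dx) - f(x - dx)) / (2 * dx)

for rule in (fdd, bdd, cdd):
    print(rule.__name__, rule(math.exp, 0.0, 0.1))   # exact answer is 1
```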

Higher order derivatives

Taking the Taylor expansion of the function or discrete set and combining the expansions as necessary can return any order of derivative. This also applies to central differences if positive and negative $\Delta x$ terms are alternated.

Example

To find second order derivatives:

Example

if and :

For discrete data:

  • If the desired point does not exist in the data, differentiating at the surrounding points and then interpolating those derivative values with a polynomial may be close enough.

Example

| $t$    | 0 | 10     | 15     | 20     | 22.5   | 30     |
| ------ | - | ------ | ------ | ------ | ------ | ------ |
| $v(t)$ | 0 | 227.04 | 362.78 | 517.35 | 602.47 | 901.67 |

with FDD:

Using points :

with Newton's first-order interpolation:

  • If the spacing is not equal (making the usual divided differences unusable), again creating an interpolation may be close enough.
  • If the data is noisy, regressing first and then differentiating the regressed curve reduces random error (see the sketch below).
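
A sketch of the regress-then-differentiate idea using the velocity table above; the quadratic order and the evaluation point $t = 16$ are arbitrary choices of mine:

```python
import numpy as np

# Unequally spaced data (the velocity table from the example above)
t = np.array([0, 10, 15, 20, 22.5, 30])
v = np.array([0, 227.04, 362.78, 517.35, 602.47, 901.67])

coeffs = np.polyfit(t, v, 2)       # quadratic least-squares fit
dv_dt = np.polyder(coeffs)         # differentiate the fitted polynomial instead of the data
print(np.polyval(dv_dt, 16.0))     # acceleration estimate at t = 16
```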

Integrals

If you can represent a function as an $n$-th order polynomial, you can approximate its integral with the integral of that polynomial.

Trapezoidal rule

The trapezoidal rule approximates the function with a first-order polynomial and integrates that polynomial exactly over the segment:

$$\int_a^b f(x)\,dx \approx (b - a)\,\frac{f(a) + f(b)}{2}$$

From $a$ to $b$, if there are $n$ trapezoidal segments, where $h = \frac{b - a}{n}$ is the width of each segment:

$$\int_a^b f(x)\,dx \approx \frac{h}{2}\left[f(a) + 2\sum_{i=1}^{n-1} f(a + ih) + f(b)\right]$$

The error for the $i$-th trapezoidal segment is $E_i = -\frac{h^3}{12}f''(\xi_i)$, where $\xi_i$ lies somewhere in that segment. The total error can be approximated with an average or maximum value of $f''$:

$$E_t \approx -\frac{(b - a)^3}{12n^2}\,\overline{f''}$$
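
A short sketch of the composite trapezoidal rule; the test integral is my own choice:

```python
import numpy as np

def trapezoidal(f, a, b, n):
    """Composite trapezoidal rule with n equal segments of width h."""
    h = (b - a) / n
    x = a + h * np.arange(1, n)                  # interior points
    return (h / 2) * (f(a) + 2 * np.sum(f(x)) + f(b))

print(trapezoidal(np.sin, 0, np.pi, 100))        # exact value is 2
```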

Simpson's 1/3 rule

This uses a second-order polynomial over two segments. Three points are used: $x_0$, $x_1$, and $x_2$, where $x_1$ is the midpoint of the interval. Thus for two segments of width $h$:

$$\int_{x_0}^{x_2} f(x)\,dx \approx \frac{h}{3}\left[f(x_0) + 4f(x_1) + f(x_2)\right]$$

For an arbitrary number of segments, as long as there is an even number of equal segments:

$$\int_a^b f(x)\,dx \approx \frac{h}{3}\left[f(x_0) + 4\sum_{i\ \text{odd}} f(x_i) + 2\sum_{\substack{i\ \text{even} \\ i \ne 0,\,n}} f(x_i) + f(x_n)\right]$$

The error is, where $\overline{f^{(4)}}$ is the average fourth derivative:

$$E_t = -\frac{(b - a)^5}{180n^4}\,\overline{f^{(4)}}$$
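
A matching sketch of the composite Simpson's 1/3 rule, again with a made-up test integral:

```python
import numpy as np

def simpson_13(f, a, b, n):
    """Composite Simpson's 1/3 rule; n must be an even number of equal segments."""
    if n % 2:
        raise ValueError("n must be even")
    h = (b - a) / n
    x = a + h * np.arange(n + 1)
    return (h / 3) * (f(x[0]) + 4 * np.sum(f(x[1:-1:2])) + 2 * np.sum(f(x[2:-1:2])) + f(x[-1]))

print(simpson_13(np.sin, 0, np.pi, 10))          # exact value is 2
```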

Ordinary differential equations

Initial value problems

These problems only have known results (initial conditions) at one value of $x$.

Euler's method converts the function to the form $\frac{dy}{dx} = f(x, y)$, where $y(x_0) = y_0$.

Example

Where $h$ is the width of each estimation step (lower is better):

$$y_{i+1} = y_i + f(x_i, y_i)\,h$$
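
A minimal sketch of Euler's method for a callable $f(x, y)$; the test problem $\frac{dy}{dx} = y$, $y(0) = 1$ is my own:

```python
def euler(f, x0, y0, h, x_end):
    """Euler's method: step forward with y_{i+1} = y_i + f(x_i, y_i) * h."""
    x, y = x0, y0
    while x < x_end - 1e-12:
        y += f(x, y) * h
        x += h
    return y

# dy/dx = y, y(0) = 1  ->  y(1) = e ≈ 2.718
print(euler(lambda x, y: y, 0.0, 1.0, h=0.01, x_end=1.0))
```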

Example

If ,

Heun's method uses Euler's formula as a predictor. Where $y_{i+1}^{0}$ is the Euler (predictor) solution:

$$y_{i+1} = y_i + \frac{h}{2}\left[f(x_i, y_i) + f(x_{i+1}, y_{i+1}^{0})\right]$$
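
A sketch of the predictor-corrector loop, on the same made-up test problem as the Euler sketch above:

```python
def heun(f, x0, y0, h, x_end):
    """Heun's method: Euler predictor, then average the slopes at both ends."""
    x, y = x0, y0
    while x < x_end - 1e-12:
        y_pred = y + f(x, y) * h                       # Euler predictor
        y += (h / 2) * (f(x, y) + f(x + h, y_pred))    # corrector
        x += h
    return y

print(heun(lambda x, y: y, 0.0, 1.0, h=0.01, x_end=1.0))   # ≈ e
```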

Example

For :

Euler's formula returns .

Applying Heun's correction:

The Runge-Kutta fourth-order method is the most accurate of the three methods:

$$y_{i+1} = y_i + \frac{h}{6}\left(k_1 + 2k_2 + 2k_3 + k_4\right)$$

$$k_1 = f(x_i, y_i) \qquad k_2 = f\!\left(x_i + \tfrac{h}{2},\ y_i + \tfrac{h}{2}k_1\right) \qquad k_3 = f\!\left(x_i + \tfrac{h}{2},\ y_i + \tfrac{h}{2}k_2\right) \qquad k_4 = f\!\left(x_i + h,\ y_i + hk_3\right)$$
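
A sketch of the classical RK4 update, again on a made-up test problem:

```python
def rk4(f, x0, y0, h, x_end):
    """Classical fourth-order Runge-Kutta method."""
    x, y = x0, y0
    while x < x_end - 1e-12:
        k1 = f(x, y)
        k2 = f(x + h / 2, y + h / 2 * k1)
        k3 = f(x + h / 2, y + h / 2 * k2)
        k4 = f(x + h, y + h * k3)
        y += (h / 6) * (k1 + 2 * k2 + 2 * k3 + k4)
        x += h
    return y

print(rk4(lambda x, y: y, 0.0, 1.0, h=0.1, x_end=1.0))   # ≈ e
```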

Higher order ODEs

Higher order ODEs can be solved by reducing them to first-order ODEs, creating a system of equations. For a second-order ODE in $y$: let $u = \frac{dy}{dx}$, so that $\frac{du}{dx} = \frac{d^2y}{dx^2}$.

For each ODE in the system, any of the above methods can be used:
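
A sketch of the reduction, applying Euler's method componentwise to the system; the test ODE $y'' + y = 0$ and its initial conditions are my own:

```python
def euler_system(f, x0, state0, h, x_end):
    """Euler's method applied componentwise to a system of first-order ODEs."""
    x, state = x0, list(state0)
    while x < x_end - 1e-12:
        derivs = f(x, state)
        state = [s + d * h for s, d in zip(state, derivs)]
        x += h
    return state

# y'' + y = 0, y(0) = 0, y'(0) = 1  ->  let u = y', giving y' = u and u' = -y
f = lambda x, s: [s[1], -s[0]]          # s = [y, u]
y, u = euler_system(f, 0.0, [0.0, 1.0], h=0.001, x_end=1.0)
print(y)                                 # ≈ sin(1) ≈ 0.841
```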

Example

For :

Boundary value problems

The finite difference method divides the interval between the boundaries into $n$ sub-intervals, replacing the derivatives with their first-principles (finite difference) representations at each interior point. Writing one such equation per point returns a proper system of equations, which can then be solved simultaneously.
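
A sketch of the finite difference method on a made-up boundary value problem ($y'' = -1$ with $y(0) = y(1) = 0$, not the course example), using the central difference for $y''$:

```python
import numpy as np

def finite_difference_bvp(n):
    """Solve y'' = -1 with y(0) = y(1) = 0 using central differences on n sub-intervals."""
    h = 1.0 / n
    x = np.linspace(0, 1, n + 1)
    # At each interior node: (y_{i-1} - 2 y_i + y_{i+1}) / h^2 = -1
    A = np.zeros((n - 1, n - 1))
    b = np.full(n - 1, -h**2)
    for i in range(n - 1):
        A[i, i] = -2.0
        if i > 0:
            A[i, i - 1] = 1.0
        if i < n - 2:
            A[i, i + 1] = 1.0
    y_interior = np.linalg.solve(A, b)        # boundary values are zero, so nothing to move to b
    return x, np.concatenate(([0.0], y_interior, [0.0]))

x, y = finite_difference_bvp(10)
print(y[5])        # exact solution y = x(1 - x)/2 gives y(0.5) = 0.125
```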

Example

For :

Replace with first principles: