Category Archives: Math

Non-linear ODEs

Came across this in my applied math grad course. The correct answer is incredibly simple: the equation of a circle. But arriving at that solution requires a fair bit of geometry, algebra, and the solution of a non-linear ODE.

A curve passing through (1,2) has the property that the length of the line drawn from the origin to intersect perpendicularly with the normal extending from any point along the curve is always numerically equal to the ordinate of that point on the curve. Find the equation of the curve.

Put another way: “Any curve has a normal and a tangent at each point. Draw the normal out, and draw a line from the origin that is perpendicular to it (this line will be parallel to the tangent). The length of this line is equal to the ‘y’ of the point from which the normal emanates.”

Here’s a crude diagram:

The solution strategy is as follows:

We have some function y = f(x). We pick any point on f, call that point (x,y), and draw a normal to the function at that point. We then draw a line from the origin such that it intersects the normal perpendicularly at a point (a,b). The length of the vector (a,b) is equal to the ordinate y of the point from which the normal emanates, so that |(a,b)| = y.

What function satisfies this condition, and also passes through the point (1,2)?

Here’s the completed solution: BLAM!
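In case the scan of the worked solution is ever lost, here is a reconstruction of the main steps (my own write-up following the strategy above; the book's algebra may differ):

```latex
% The normal at (x, y) has slope -1/y', so the normal line is
%   (X - x) + y'(Y - y) = 0.
% The perpendicular distance from the origin to this line must equal y:
\frac{|x + y\,y'|}{\sqrt{1 + (y')^2}} = y
% Squaring and cancelling the y^2 (y')^2 terms on both sides:
x^2 + 2xy\,y' = y^2
\quad\Longrightarrow\quad
\frac{dy}{dx} = \frac{y^2 - x^2}{2xy}
% This is a Bernoulli ODE with n = -1:
\frac{dy}{dx} - \frac{1}{2x}\,y = -\frac{x}{2}\,y^{-1}
% The substitution v = y^2 makes it linear:
\frac{dv}{dx} - \frac{v}{x} = -x
\quad\Longrightarrow\quad
\frac{d}{dx}\!\left(\frac{v}{x}\right) = -1
\quad\Longrightarrow\quad
y^2 = Cx - x^2
% Imposing y(1) = 2 gives C = 5:
x^2 + y^2 = 5x
% a circle of radius 5/2 centered at (5/2, 0).
```

Quick check: at x = 1, y² = 5·1 − 1 = 4, so the curve does pass through (1, 2).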

Note the necessity of solving a non-linear (!) first-order ODE to get the function! This is a special class of ODE that has a ready analytic solution and is referred to as a Bernoulli ODE. It takes the general form dy/dx + P(x)y = Q(x)y^n.
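As a sanity check (a sketch using sympy, not part of the original post), we can verify symbolically that the circle through (1, 2) satisfies the ODE the geometry produces. The specific ODE written below, dy/dx = (y² − x²)/(2xy), is my reconstruction of that reduction; it is Bernoulli with n = −1 once rearranged into the general form above.

```python
import sympy as sp

x = sp.symbols("x", positive=True)
y = sp.Function("y")

# Reconstructed ODE from the geometric condition (distance from the
# origin to the normal equals the ordinate); rearranged it reads
# dy/dx - y/(2x) = -(x/2)*y**(-1), a Bernoulli ODE with n = -1.
rhs = (y(x)**2 - x**2) / (2*x*y(x))

# Candidate curve through (1, 2): y^2 = 5x - x^2, i.e. x^2 + y^2 = 5x.
candidate = sp.sqrt(5*x - x**2)

residual = sp.simplify(rhs.subs(y(x), candidate) - sp.diff(candidate, x))
print(residual)              # 0 -> the circle satisfies the ODE
print(candidate.subs(x, 1))  # 2 -> it passes through (1, 2)
```

A zero residual confirms the circle solves the equation exactly, not just numerically.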
Here’s a plot of the solution function:
Neato!

Source: Schaum’s Advanced Mathematics for Engineers and Scientists, Ch2 Prob 83

Orthogonal Functions

After consulting numerous sources I finally found something that clearly and satisfactorily explains the details of orthogonal functions. The concept is the easy part: Two vectors are considered orthogonal if their dot product is zero.

The tough part is the details: This can be generalized to functions, with an inner product of a vector space taking the place of the dot product. How you define this inner product defines the orthogonality conditions and is dependent on the vector space. This is where the useful source comes in:

I have also attached the PowerPoint file should it be lost to the sands of time.
What made it click for me is him drawing direct analogues between the various pieces of the dot product and the inner product. Take Legendre polynomials as an example: instead of the Cartesian 3-space directions (i, j, k), your basis is the monomial powers (1, x, x²) in polynomial “space”. In general, for real functions f and g over the domain [a,b] the inner product is:

⟨f, g⟩ = ∫_a^b f(x) g(x) dx

Restricting the domain to [-1,1] (and orthogonalizing the monomials) yields the Legendre polynomials.
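To make this inner product concrete, here is a small numerical check (my own sketch, using NumPy's built-in Legendre helpers) that distinct Legendre polynomials really are orthogonal under exactly this integral on [-1,1]:

```python
import numpy as np
from numpy.polynomial.legendre import Legendre, leggauss

# <f, g> = integral of f(x)*g(x) over [-1, 1], computed with
# Gauss-Legendre quadrature (exact for polynomials of degree <= 39 here).
nodes, weights = leggauss(20)

def inner(f, g):
    return float(np.sum(weights * f(nodes) * g(nodes)))

P2 = Legendre.basis(2)  # P_2(x) = (3x^2 - 1)/2
P3 = Legendre.basis(3)  # P_3(x) = (5x^3 - 3x)/2

print(inner(P2, P3))  # ~0: distinct Legendre polynomials are orthogonal
print(inner(P2, P2))  # 2/(2n+1) = 0.4: the squared norm of P_2
```

The nonzero ⟨P₂, P₂⟩ is the analogue of a vector's squared length; dividing by its square root normalizes the basis, just as with unit vectors.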
These ruminations stem from attempting to solve problem 7.42 in Schaum’s Advanced Mathematics.
Given the functions a_0, a_1 + a_2 x, and a_3 + a_4 x + a_5 x², where a_0, …, a_5 are constants, determine the constants so that these functions are mutually orthogonal in (-1,1), and thus obtain the functions.
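The orthogonality conditions can be ground out with sympy (a sketch of one way to do it; the variable names are mine). Each pairwise inner product on (-1,1) must vanish, and each condition pins down one constant:

```python
import sympy as sp

x, a0, a1, a2, a3, a4, a5 = sp.symbols("x a0 a1 a2 a3 a4 a5")

f0 = a0
f1 = a1 + a2*x
f2 = a3 + a4*x + a5*x**2

# Inner product of real functions on (-1, 1).
def inner(f, g):
    return sp.integrate(f*g, (x, -1, 1))

# <f0, f1> = 2*a0*a1 = 0  =>  a1 = 0
a1_sol = sp.solve(inner(f0, f1), a1)[0]
f1 = f1.subs(a1, a1_sol)

# <f0, f2> = 0  =>  a3 = -a5/3
a3_sol = sp.solve(inner(f0, f2), a3)[0]
f2 = f2.subs(a3, a3_sol)

# <f1, f2> = 0  =>  a4 = 0
a4_sol = sp.solve(inner(f1, f2), a4)[0]
f2 = f2.subs(a4, a4_sol)

# The mutually orthogonal family: a0, a2*x, a5*(x**2 - 1/3)
print(f1)
print(f2)
```

Up to the free scale factors a_0, a_2, a_5, these are constant multiples of the first three Legendre polynomials 1, x, (3x² − 1)/2, which is the punchline of the exercise.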

Attached: Orthogonal