- f is a linear-function (-map, -transformation, ...) iff
  - f(v + w) = f(v) + f(w), and
  - f(k v) = k f(v)
  - (consequently f(0) = 0.)
- f is often called a linear-transformation
if the input and output spaces of f are the same.
- A linear transformation over a finite-dimensional vector-space
can be represented by a matrix.
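The definition above can be checked numerically. The sketch below (the matrix and vectors are illustrative values, not from the notes) represents a linear map on R^2 by a 2×2 matrix and verifies the two linearity properties:

```python
# Minimal sketch: a linear map f on R^2 represented by a 2x2 matrix M,
# with numeric checks of the linearity properties.

def mat_vec(M, v):
    """Multiply a 2x2 matrix M by a 2-vector v."""
    return [M[0][0]*v[0] + M[0][1]*v[1],
            M[1][0]*v[0] + M[1][1]*v[1]]

M = [[2.0, 1.0],
     [0.0, 3.0]]                 # f(v) = M v  (illustrative entries)

v, w, k = [1.0, 2.0], [3.0, -1.0], 5.0

# f(v + w) = f(v) + f(w)
lhs = mat_vec(M, [v[0] + w[0], v[1] + w[1]])
rhs = [a + b for a, b in zip(mat_vec(M, v), mat_vec(M, w))]
assert lhs == rhs

# f(k v) = k f(v)
assert mat_vec(M, [k*v[0], k*v[1]]) == [k*c for c in mat_vec(M, v)]

# and consequently f(0) = 0
assert mat_vec(M, [0.0, 0.0]) == [0.0, 0.0]
```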
Examples
- Identity, I:
  | 1  0 |
  | 0  1 |
- Projection of <x, y>,
  onto the line through the origin whose unit normal
  is n = <nx, ny>:
  | 1 - nx^2    -nx ny |
  | -nx ny    1 - ny^2 |
  (<x, y>
  → <x, y> - (<x, y> . n) n
  = <x, y>
  - (x nx + y ny) n
  = <x - (x nx + y ny) nx,
  y - (x nx + y ny) ny>.)
-
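The projection derivation can be sketched in code; the unit normal and test vector below are illustrative assumptions, projecting <3, 1> onto the line y = x:

```python
import math

# Sketch: project v = <x, y> onto the line through the origin whose
# unit normal is n = <nx, ny>, via v -> v - (v . n) n.
def project(v, n):
    d = v[0]*n[0] + v[1]*n[1]            # v . n
    return [v[0] - d*n[0], v[1] - d*n[1]]

n = [1/math.sqrt(2), -1/math.sqrt(2)]    # unit normal to the line y = x

p = project([3.0, 1.0], n)               # ~ <2, 2>, the foot on y = x

# The same map written as a matrix:
M = [[1 - n[0]*n[0],    -n[0]*n[1]],
     [   -n[0]*n[1], 1 - n[1]*n[1]]]
Mp = [M[0][0]*3.0 + M[0][1]*1.0,
      M[1][0]*3.0 + M[1][1]*1.0]         # agrees with p (up to rounding)
```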
- Reflection of <x, y>,
in the line through the origin whose unit normal
is n = <nx, ny>:
  | 1 - 2 nx^2    -2 nx ny |
  | -2 nx ny    1 - 2 ny^2 |
-
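Similarly for reflection; a minimal sketch, reflecting <3, 1> in the line y = x (the normal and test vector are illustrative assumptions):

```python
import math

# Sketch: reflect v = <x, y> in the line through the origin whose unit
# normal is n = <nx, ny>, via v -> v - 2 (v . n) n.
def reflect(v, n):
    d = v[0]*n[0] + v[1]*n[1]            # v . n
    return [v[0] - 2*d*n[0], v[1] - 2*d*n[1]]

n = [1/math.sqrt(2), -1/math.sqrt(2)]    # unit normal to the line y = x

r = reflect([3.0, 1.0], n)               # ~ <1, 3>
# Reflecting twice gives the original vector back.
```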
- Anti-clockwise rotation of <x, y>,
by angle θ, about the origin:
  | cos θ   -sin θ |
  | sin θ    cos θ |
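The rotation matrix can be exercised the same way; a minimal sketch assuming an illustrative angle of 90°:

```python
import math

# Sketch: anticlockwise rotation of <x, y> by angle theta about the
# origin, using the matrix | cos t  -sin t |
#                          | sin t   cos t |.
def rotate(v, theta):
    c, s = math.cos(theta), math.sin(theta)
    return [c*v[0] - s*v[1], s*v[0] + c*v[1]]

# Rotating <1, 0> anticlockwise by 90 degrees gives (approximately) <0, 1>.
r = rotate([1.0, 0.0], math.pi / 2)
```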
Equations
- Given constants a, b, c, d, p, q, r, and s, and
  variables w, x, y, and z, in matrices
  | a  b | | w  x |   | p  q |
  | c  d | | y  z | = | r  s |
- i.e.,
- a w + b y = p,
- a x + b z = q,
- c w + d y = r,
- c x + d z = s.
-
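A quick check that the matrix product expands to the four scalar equations above (the numeric values are illustrative):

```python
# Sketch: the 2x2 matrix product expands, entry by entry, to the four
# scalar equations for p, q, r, and s.
def mat_mul(A, B):
    """2x2 matrix product."""
    return [[A[0][0]*B[0][0] + A[0][1]*B[1][0],
             A[0][0]*B[0][1] + A[0][1]*B[1][1]],
            [A[1][0]*B[0][0] + A[1][1]*B[1][0],
             A[1][0]*B[0][1] + A[1][1]*B[1][1]]]

a, b, c, d = 1.0, 2.0, 3.0, 4.0      # illustrative constants
w, x, y, z = 5.0, 6.0, 7.0, 8.0      # illustrative "variables"

P = mat_mul([[a, b], [c, d]], [[w, x], [y, z]])

assert P[0][0] == a*w + b*y          # = p
assert P[0][1] == a*x + b*z          # = q
assert P[1][0] == c*w + d*y          # = r
assert P[1][1] == c*x + d*z          # = s
```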
- The following are all 2×2 but
generalize ...
- | a  b | | w  x |   | p  q |
  | c  d | | y  z | = | r  s |
  is equivalent (wrt solving for w, x, y, and z) to
  | c  d | | w  x |   | r  s |
  | a  b | | y  z | = | p  q |
  row swap,
  to
  | b  a | | y  z |   | p  q |
  | d  c | | w  x | = | r  s |
  column swap (in M, with the corresponding row swap in X),
  to
  | ka  kb | | w  x |   | kp  kq |
  |  c   d | | y  z | = |  r   s |
  row multiplication by constant k (k ≠ 0),
  and to
  | a+c  b+d | | w  x |   | p+r  q+s |
  |  c    d  | | y  z | = |  r    s  |
  row addition (or subtraction).
-
- If we can work on M and P to reduce
- M X = P
- to an equivalent
- I X = P',
- using the relations above,
then we can just read off the solution, P', for X.
- X and P can be n×1 column vectors, or n×n matrices, etc.
- For example, let P=I, the identity, and reduce
- M X = I
- to
- I X = M^-1
- giving the matrix inverse of M.
- (Note that any column swaps cause row swaps, in X, which must
be undone to get the final answer.)
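The reduction of M X = I to I X = M^-1 can be sketched as follows. This version assumes the pivots encountered are nonzero, so no row or column swaps are needed (the general algorithm would pivot); the example matrix is illustrative:

```python
# Sketch: Gauss-Jordan reduction of [M | I] to [I | M^-1] for a 2x2 M.
# Assumes nonzero pivots, so no row/column swaps are needed.
def inverse_2x2_gauss_jordan(M):
    n = 2
    # Augment M with the identity: [M | I].
    A = [row[:] + [1.0 if i == j else 0.0 for j in range(n)]
         for i, row in enumerate(M)]
    for i in range(n):
        piv = A[i][i]
        A[i] = [e / piv for e in A[i]]        # row multiplication by 1/pivot
        for r in range(n):
            if r != i:
                f = A[r][i]
                A[r] = [e - f*p for e, p in zip(A[r], A[i])]  # row subtraction
    return [row[n:] for row in A]             # right half is now M^-1

Minv = inverse_2x2_gauss_jordan([[4.0, 7.0],
                                 [2.0, 6.0]])
# For M = |4 7; 2 6|, det = 10, so M^-1 = | 0.6  -0.7 |
#                                         |-0.2   0.4 |
```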