Matrix multiplication
The last entry looked at Gaussian elimination, modeling matrices and vectors as tuples and using some idioms of functional programming. The linear algebra theme continues here with programs to compute matrix/vector and matrix/matrix products.
First, some utilities to warm up on: functions to project the rows or columns of a matrix.
def proj_r (A, i):
    return A[i]

def proj_c (A, j):
    return tuple (map (lambda t: (t[j],), A))
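For example, with these definitions loaded in the Python top-level, proj_r selects a row and proj_c a column (represented as a column vector of 1-tuples):

>>> A = ((1, 2), (3, 4))
>>> proj_r (A, 0)
(1, 2)
>>> proj_c (A, 1)
((2,), (4,))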
The function that scales a row vector by a constant is here again, along with its symmetric partner for scaling a column vector.

def scale_r (u, lam):
    return tuple (map (lambda ui: lam*ui, u))

def scale_c (c, k):
    def f (e):
        (xi,) = e
        return (k*xi,)
    return tuple (map (f, c))

For example, if c = ((1,), (2,)), then scale_c (c, 2) is ((2,), (4,)).
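A quick top-level check of both scaling functions, again assuming the definitions above are in scope:

>>> scale_r ((1, 2), 3)
(3, 6)
>>> scale_c (((1,), (2,)), 2)
((2,), (4,))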
The utility flatten reinterprets a sequence of column vectors as a matrix of row vectors.
import functools

def flatten (t):
    def row (i):
        def f (acc, y):
            return acc + y[i]
        return functools.reduce (f, t, ())
    return tuple (map (row, range (0, len (t[0]))))

Here is an example. If t1 = ((1,), (2,)) and t2 = ((3,), (4,)), then s = (t1, t2) is a sequence of column vectors and flatten (s) is the matrix ((1, 3), (2, 4)).
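One way to sanity-check flatten, assuming the definitions above are in scope, is the round trip through proj_c: splitting a matrix into its columns and flattening them again recovers the matrix.

>>> A = ((1, 3), (2, 4))
>>> flatten (tuple (map (lambda j: proj_c (A, j), range (0, len (A[0]))))) == A
True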
Matrix/vector multiplication comes next.
def mat_vec_mul (A, x):
    def f (i):
        (xi,) = x[i]
        return scale_c (proj_c (A, i), xi)
    M = tuple (map (f, range (0, len (x))))
    return tuple (map (lambda r: (sum (r),), flatten (M)))

This computation uses the interpretation of the product as the sum of the column vectors of A weighted by the components of x.
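As a small worked example (assuming the definitions above are in scope), take A = ((2, 3), (4, 0)) and the column vector x = ((1,), (5,)):

>>> A = ((2, 3), (4, 0))
>>> x = ((1,), (5,))
>>> mat_vec_mul (A, x)
((17,), (4,))

Here 17 = 2*1 + 3*5 and 4 = 4*1 + 0*5, that is, the columns (2, 4) and (3, 0) of A weighted by 1 and 5 respectively.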
Matrix multiplication follows immediately by viewing it as the flattening of a sequence of matrix/vector multiplications (that is, of A applied to the individual columns of B).

def mat_mat_mul (A, B):
    f = lambda j: mat_vec_mul (A, proj_c (B, j))
    return flatten (tuple (map (f, range (0, len (B[0])))))
The transcript below shows the function at work in the Python top-level.
>>> A = ((2, 3), (4, 0))
>>> B = ((1, 2, 0), (5, -1, 0))
>>> print (str (mat_mat_mul (A, B)))
((17, 1, 0), (4, 8, 0))
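As a cross-check, a conventional index-based product, written directly from the definition of matrix multiplication, should agree with mat_mat_mul on well-formed inputs. The name mat_mat_mul_naive below is introduced here only for this comparison; it is not part of the column-oriented development above.

def mat_mat_mul_naive (A, B):
    # Entry (i, j) of the product is the dot product of row i of A
    # with column j of B, computed by direct summation over k.
    return tuple (tuple (sum (A[i][k] * B[k][j] for k in range (0, len (B)))
                         for j in range (0, len (B[0])))
                  for i in range (0, len (A)))

Continuing the session above:

>>> mat_mat_mul (A, B) == mat_mat_mul_naive (A, B)
True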