Just writing down the connections here for myself and maybe others:
A matrix represents a linear function taking a vector and returning a vector, written as
w = M v
Matrix multiplication corresponds to function composition.
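A quick NumPy sketch of this correspondence (the matrices and vector here are arbitrary example values): applying N and then M to a vector gives the same result as applying the single matrix M N.

```python
import numpy as np

# Two linear maps as matrices, plus an input vector (arbitrary example values).
M = np.array([[1.0, 2.0],
              [3.0, 4.0]])
N = np.array([[0.0, 1.0],
              [1.0, 0.0]])
v = np.array([5.0, 6.0])

# Composition of the functions v -> N v and w -> M w
# equals the single function v -> (M N) v.
assert np.allclose(M @ (N @ v), (M @ N) @ v)
```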
Vectors can be indexed, so we can view a vector v as a function i -> v[i] defined on the indexing set. We can also define basis vectors b_i such that b_i[j] is 1 when j = i and 0 otherwise. Any vector can be written as a weighted sum of basis vectors, with the vector's components as coefficients:
v = Σ_i v[i] b_i
where Σ_i represents summation over the index i.
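The decomposition into basis vectors can be checked directly; here is a small sketch with an arbitrary example vector:

```python
import numpy as np

v = np.array([2.0, -1.0, 3.0])
n = len(v)

def b(i, n):
    # Basis vector b_i: 1 at index i, 0 everywhere else.
    e = np.zeros(n)
    e[i] = 1.0
    return e

# v = Σ_i v[i] b_i
reconstructed = sum(v[i] * b(i, n) for i in range(n))
assert np.allclose(reconstructed, v)
```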
Matrices can be indexed with two indices, and this is closely related to vector indexing: For a matrix M, we have
M[i, j] = (M b_j)[i]
In other words, the j-th column of the matrix is the output of M when the basis vector b_j is the input.
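This column view can also be verified numerically; a small sketch with an arbitrary example matrix:

```python
import numpy as np

M = np.array([[1.0, 2.0],
              [3.0, 4.0]])

def b(j, n):
    # Basis vector b_j: 1 at index j, 0 everywhere else.
    e = np.zeros(n)
    e[j] = 1.0
    return e

# M b_j is exactly column j of M, i.e. M[i, j] == (M b_j)[i] for all i.
for j in range(M.shape[1]):
    assert np.allclose(M @ b(j, M.shape[1]), M[:, j])
```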
By writing a vector as a sum of basis vectors and using linearity, we get the well-known matrix-vector multiplication formula:
(M v)[i] = (M Σ_j v[j] b_j)[i] = Σ_j v[j] (M b_j)[i] = Σ_j M[i, j] v[j]
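As a sanity check, the summation formula written as explicit loops agrees with NumPy's built-in matrix-vector product (arbitrary example values):

```python
import numpy as np

M = np.array([[1.0, 2.0],
              [3.0, 4.0]])
v = np.array([5.0, 6.0])

# (M v)[i] = Σ_j M[i, j] v[j], spelled out index by index.
w = np.array([sum(M[i, j] * v[j] for j in range(len(v)))
              for i in range(M.shape[0])])
assert np.allclose(w, M @ v)
```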