# Dot product

If v = [v_{1}, ... , v_{n}]^{T} and w = [w_{1}, ... , w_{n}]^{T} are n-dimensional vectors, the dot product of v and w, denoted v ∙ w, is the number defined by the formula:

v ∙ w = v_{1}w_{1} + ... + v_{n}w_{n}

For example, the dot product of v = [-1, 3, 2]^{T} with w = [5, 1, -2]^{T} is:

v ∙ w = (-1 × 5) + (3 × 1) + (2 × -2) = -6
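The computation above can be checked with a few lines of Python, summing the products of corresponding entries:

```python
# Dot product of v = [-1, 3, 2]^T and w = [5, 1, -2]^T:
# multiply corresponding entries and sum the results.
v = [-1, 3, 2]
w = [5, 1, -2]

dot = sum(vi * wi for vi, wi in zip(v, w))
print(dot)  # -6
```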

The following properties can be proven using the definition of a dot product and algebra.

### Dot product properties

- Commutative: v ∙ w = w ∙ v. This is seen by expanding the dot product:

  v ∙ w = v_{1}w_{1} + ... + v_{n}w_{n} = w_{1}v_{1} + ... + w_{n}v_{n} = w ∙ v

- Distributive: For n-dimensional vectors v, w, and x,

  v ∙ (w + x) = v ∙ w + v ∙ x

- Compatible with scalars: For a scalar c and vectors v and w,

  v ∙ (cw) = c(v ∙ w) = (cv) ∙ w

- Induced Euclidean norm: A vector dotted with itself is the square of its magnitude:

  v ∙ v = v_{1}^{2} + ... + v_{n}^{2} = ||v||^{2}
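As a sanity check, each of these properties can be verified numerically on concrete vectors (the vectors and scalar below are arbitrary examples):

```python
def dot(v, w):
    """Dot product: sum of products of corresponding entries."""
    return sum(vi * wi for vi, wi in zip(v, w))

v = [-1, 3, 2]
w = [5, 1, -2]
x = [2, 0, 7]
c = 4

# Commutative: v . w = w . v
assert dot(v, w) == dot(w, v)

# Distributive: v . (w + x) = v . w + v . x
w_plus_x = [wi + xi for wi, xi in zip(w, x)]
assert dot(v, w_plus_x) == dot(v, w) + dot(v, x)

# Compatible with scalars: v . (cw) = c(v . w) = (cv) . w
cw = [c * wi for wi in w]
cv = [c * vi for vi in v]
assert dot(v, cw) == c * dot(v, w) == dot(cv, w)

# Induced Euclidean norm: v . v = v_1^2 + ... + v_n^2
assert dot(v, v) == sum(vi ** 2 for vi in v)
```

These checks pass for any choice of vectors and scalar, as the algebraic proofs guarantee.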

### Theorem

If θ is the angle between vectors v and w, then:

v ∙ w = ||v|| ||w||cos(θ)

Note: We define the angle θ between 2 vectors to be in the range 0 ≤ θ ≤ π (that is, between 0° and 180°).

Proof:

Consider the triangle with sides v, w, and v - w. By the law of cosines, the length of v - w satisfies:

1. ||v - w||^{2} = ||v||^{2} + ||w||^{2} - 2||v|| ||w||cos(θ)

By the induced Euclidean norm,

2. ||v - w||^{2} = (v - w) ∙ (v - w)

Using the distributive and commutative properties, we see that:

3. (v - w) ∙ (v - w) = (v ∙ v) - (w ∙ v) - (v ∙ w) + (w ∙ w) = (v ∙ v) + (w ∙ w) - 2(v ∙ w)

Using the induced Euclidean norm again, v ∙ v = ||v||^{2} and w ∙ w = ||w||^{2}, so plugging this into (3) gives us

4. (v ∙ v) + (w ∙ w) - 2(v ∙ w) = ||v||^{2} + ||w||^{2} - 2 (v ∙ w)

Therefore, (2), (3), and (4) give us

5. ||v - w||^{2} = ||v||^{2} + ||w||^{2} - 2(v ∙ w)
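Identity (5) can be verified numerically for the example vectors used earlier (any vectors would do):

```python
def dot(v, w):
    return sum(vi * wi for vi, wi in zip(v, w))

def norm_sq(v):
    """Squared magnitude ||v||^2 = v . v."""
    return dot(v, v)

v = [-1, 3, 2]
w = [5, 1, -2]
diff = [vi - wi for vi, wi in zip(v, w)]

lhs = norm_sq(diff)                            # ||v - w||^2
rhs = norm_sq(v) + norm_sq(w) - 2 * dot(v, w)  # ||v||^2 + ||w||^2 - 2(v . w)
print(lhs == rhs)  # True
```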

Plugging (5) into (1) and canceling out terms gives us the theorem:

6. ||v||^{2} + ||w||^{2} - 2(v ∙ w) = ||v||^{2} + ||w||^{2} - 2||v|| ||w|| cos(θ), or equivalently:

v ∙ w = ||v|| ||w|| cos(θ)

This theorem is arguably the most important fact about dot products because it connects the algebraic definition with the geometric concepts of lengths and angles.
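Solving the theorem for θ gives a practical recipe for computing angles. A short sketch, reusing the vectors from the opening example:

```python
import math

def dot(v, w):
    return sum(vi * wi for vi, wi in zip(v, w))

def norm(v):
    """Euclidean magnitude ||v|| = sqrt(v . v)."""
    return math.sqrt(dot(v, v))

def angle(v, w):
    """Angle between v and w in radians, from v . w = ||v|| ||w|| cos(theta)."""
    return math.acos(dot(v, w) / (norm(v) * norm(w)))

v = [-1, 3, 2]
w = [5, 1, -2]
theta = angle(v, w)
print(math.degrees(theta))  # roughly 107 degrees
```

Note that `math.acos` returns a value in [0, π], which matches the convention for θ stated above.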

### Orthogonal vectors

A special case of the theorem described above is that if

v ∙ w = 0

then v and w are perpendicular. For example, the 3-dimensional vectors v = [4, 0, -3]^{T} and w = [6, -1, 8]^{T} are perpendicular because

v ∙ w = (4 × 6) + (0 × -1) + (-3 × 8) = 24 + 0 - 24 = 0

In general, if v ∙ w = 0 then by the theorem above,

v ∙ w = ||v|| ||w||cos(θ) = 0

If both v and w are of non-zero length, then we can cancel out ||v|| and ||w|| to get

cos(θ) = 0

Since 0 ≤ θ ≤ π, it must be that θ = π/2, so v and w are perpendicular. Whenever v ∙ w = 0, v and w are said to be orthogonal. The vector with all entries equal to 0, called the 0-vector and denoted 0, satisfies 0 ∙ v = 0 for every vector v. Therefore, we say that 0 is orthogonal to every vector.

Note: It is important to distinguish between 0, the 0-vector, and 0, the number. We can add, subtract, and take dot products with 0 and other vectors, while 0 is simply a scalar that we multiply vectors with.
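Checking orthogonality is therefore a one-line test: compute the dot product and compare it with 0. Using the vectors from this section:

```python
def dot(v, w):
    return sum(vi * wi for vi, wi in zip(v, w))

v = [4, 0, -3]
w = [6, -1, 8]
print(dot(v, w) == 0)  # True: v and w are orthogonal

# The 0-vector is orthogonal to every vector.
zero = [0, 0, 0]
print(dot(zero, v) == 0)  # True
```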

### Using the theorem

Find the angle between v = [-1, 1, 1]^{T} and w = [1, 1, -1]^{T}.

By the theorem,

v ∙ w = ||v|| ||w||cos(θ)

Thus:

cos(θ) = (v ∙ w) / (||v|| ||w||)

Plugging in

v ∙ w = (-1 × 1) + (1 × 1) + (1 × -1) = -1 and ||v|| = ||w|| = √3

gives us

cos(θ) = -1/3

so

θ = cos^{-1}(-1/3) ≈ 109.5°
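The same computation in Python, following the steps above:

```python
import math

def dot(v, w):
    return sum(vi * wi for vi, wi in zip(v, w))

v = [-1, 1, 1]
w = [1, 1, -1]

# cos(theta) = (v . w) / (||v|| ||w||)
cos_theta = dot(v, w) / (math.sqrt(dot(v, v)) * math.sqrt(dot(w, w)))
theta = math.acos(cos_theta)

print(cos_theta)            # approximately -1/3
print(math.degrees(theta))  # approximately 109.47 degrees
```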