# Algebraic aspects of adjacency matrices

In our last encounter, we saw that there is a matrix naturally associated with every graph, and that it can be used to represent the graph on a computer. While this alone merits giving adjacency matrices our attention, we shall now see that there is a whole lot more to them than just a data structure.

I’d like to introduce some motivation for the study of this and related structures.

## Graph Isomorphism

Given two graphs $$G = (V, E), G' = (V', E')$$, we say that a bijective function $$f: V \to V'$$ is a graph isomorphism if $$(u, v) \in E \iff (f(u), f(v)) \in E'$$ $$\forall\ u, v \in V$$. If such a function exists, we say that $$G$$ and $$G'$$ are isomorphic, and we write $$G \simeq G'$$. The idea is that two isomorphic graphs are really the same graph, just under a relabeling of the nodes. Most of the properties we are interested in when studying graphs are preserved under graph isomorphism. For instance, knowing that $$G \simeq G'$$:

- $$|V| = |V'|$$ and $$|E| = |E'|$$.
- $$G$$ is connected $$\iff$$ $$G'$$ is connected.
- Every cycle $$C$$ in $$G$$ has a unique image $$f(C)$$, which is a cycle in $$G'$$ with $$|C| = |f(C)|$$.
- $$\chi(G) = \chi(G')$$, with $$\chi$$ the chromatic number. More generally, $$G$$ and $$G'$$ have the same chromatic polynomial, which we will discuss in future posts.
- $$G$$ and $$G'$$ have the same degree sequence.
- $$\omega(G) = \omega(G')$$, with $$\omega$$ the clique number (the size of a largest clique, i.e. a largest complete subgraph).
- $$G$$ and $$G'$$ have the same vertex cover and edge cover numbers.

As you can see, many interesting properties of graphs are preserved by graph isomorphisms. Properties that depend not solely on the structure of the graph, but on its representation, need not be preserved under isomorphism. A common example: if we represent our nodes as numbers from $$1$$ to $$n$$, graph isomorphism does not preserve the quantity $$\sum_{i=1}^n i \cdot \deg(v_i)$$.
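To make this concrete, here is a small sketch (the graphs and the helper name are made up for illustration) contrasting a genuine invariant with this label-dependent quantity, on two labelings of the path on three nodes.

```python
# Hypothetical example: two labelings of the path on 3 nodes, stored as
# adjacency lists. In G the center is node 2; in H the center is node 1.
G = {1: [2], 2: [1, 3], 3: [2]}
H = {1: [2, 3], 2: [1], 3: [1]}

def label_weighted_degree_sum(graph):
    """Compute sum_i i * deg(v_i), a quantity that depends on the labeling."""
    return sum(i * len(neighbors) for i, neighbors in graph.items())

# The degree sequence, a genuine invariant, agrees on both labelings...
assert sorted(len(ns) for ns in G.values()) == sorted(len(ns) for ns in H.values())
# ...but the label-weighted sum does not.
print(label_weighted_degree_sum(G), label_weighted_degree_sum(H))  # 8 7
```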

Now, here is how this relates to adjacency matrices.

Lemma. Let $$G = (V, E)$$ and $$G' = (V', E')$$ be graphs with adjacency matrices $$A$$ and $$A'$$ respectively, and suppose $$|V| = |V'| = n$$. Then $$G \simeq G'$$ $$\iff$$ there exists a permutation matrix $$P \in \{0, 1\}^{n \times n}$$ such that

$P^t A P = A'$

where $$P^t$$ denotes the transpose of $$P$$.

$$\Rightarrow )$$ Suppose $$G \simeq G'$$, with $$n = |V|$$ and $$V = \{v_1, ..., v_n\}, V' = \{v'_1, ..., v'_n\}$$. Then there exists a graph isomorphism $$f: V \to V'$$. This bijection gives rise to a permutation $$\sigma$$ of $$\{1, ..., n\}$$, defined by $$\sigma(i) = j \iff f(v_i) = v'_j$$. In other words, $$f(v_i) = v'_{\sigma(i)}$$.

Let $$P \in \{0, 1\}^{n \times n}$$ be defined as:

$P_{i, j} = \begin{cases} 1, & \text{if }\sigma(i) = j\\ 0, & \text{otherwise} \end{cases}$

It is clear that $$P$$ has exactly one $$1$$ in each row and each column, since $$\sigma$$ is a bijection. Then $$P$$ is a permutation matrix. Note also that, since $$P$$ is orthogonal ($$P^{-1} = P^t$$), what we need to prove is equivalent to $$P^t A = A' P^t$$: just multiply both sides of $$P^t A P = A'$$ by $$P^t$$ on the right.

Let us now consider a given entry $$(i, j)$$ in $$P^t A$$:

$\displaystyle (P^t A)_{i, j} = \sum_{k = 1}^n P^t_{i, k} A_{k, j} = \sum_{k = 1}^n P_{k, i} A_{k, j}$

However, we know that $$P_{k, i}$$ will only be $$1$$ when $$\sigma(k) = i$$, and be $$0$$ elsewhere. This is the same as saying that it will be $$1$$ only when $$k = \sigma^{-1}(i)$$.

$\displaystyle (P^t A)_{i, j} = \sum_{k = 1}^n P_{k, i} A_{k, j} = A_{\sigma^{-1}(i), j}$

What we can see here is that multiplying to the left by $$P^t$$ has the effect of permuting the rows according to $$\sigma$$. In effect, what used to be the $$i$$th row of $$A$$ is now the $$\sigma(i)$$th row of $$P^t A$$.

Now let’s look at what happens with $$A' P^t$$.

$\displaystyle (A' P^t)_{i, j} = \sum_{k = 1}^n A'_{i, k} P^t_{k, j} = \sum_{k = 1}^n A'_{i, k} P_{j, k} = A'_{i, \sigma(j)}$

So we must prove that $$A_{\sigma^{-1}(i), j} = A'_{i, \sigma(j)}$$. Before we do that, we note that if $$f(v_i) = v'_j$$, then $$v_i = f^{-1}(v'_j)$$, and thus $$\sigma^{-1}(j) = i$$. Also, since $$f$$ is a graph isomorphism, $$f^{-1}: V' \to V$$ is too.

$\begin{aligned} & (A' P^t)_{i, j} = 1 \\ & \iff A'_{i, \sigma(j)} = 1 \\ & \iff (v'_i, f(v_j)) \in E' \\ & \iff (f^{-1}(v'_i), v_j) \in E \\ & \iff (v_{\sigma^{-1}(i)}, v_j) \in E \\ & \iff A_{\sigma^{-1}(i), j} = 1 \\ & \iff (P^t A)_{i, j} = 1 \end{aligned}$

$$\Leftarrow )$$ This direction is clear if we define $$f: V \to V'$$ by $$f(v_i) = v'_j \iff P_{i, j} = 1$$ and run the same argument as above in reverse.

Thus, the problem of deciding whether two graphs are isomorphic is equivalent to deciding whether their adjacency matrices are permutations of each other. This makes sense, since the intuitive idea of a graph isomorphism is that it doesn't care whether nodes have been relabeled or reordered.
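The lemma immediately suggests an (exponentially slow) isomorphism test: search over all $$n!$$ permutations for one that maps one adjacency matrix onto the other. A minimal sketch, with made-up example graphs:

```python
from itertools import permutations

def are_isomorphic(A, B):
    """Brute-force isomorphism test: look for a permutation sigma such that
    A[i][j] == B[sigma[i]][sigma[j]] for all i, j, which is exactly the
    condition P^t A P = B. Runs in O(n! * n^2) time, so small graphs only."""
    n = len(A)
    if len(B) != n:
        return False
    for sigma in permutations(range(n)):
        if all(A[i][j] == B[sigma[i]][sigma[j]]
               for i in range(n) for j in range(n)):
            return True
    return False

# Made-up examples: a 4-cycle, the same 4-cycle relabeled, and a path on
# 4 vertices (which has only 3 edges, so it cannot match the cycle).
C4 = [[0,1,0,1],[1,0,1,0],[0,1,0,1],[1,0,1,0]]
C4_relabeled = [[0,1,1,0],[1,0,0,1],[1,0,0,1],[0,1,1,0]]
P4 = [[0,1,0,0],[1,0,1,0],[0,1,0,1],[0,0,1,0]]
print(are_isomorphic(C4, C4_relabeled), are_isomorphic(C4, P4))  # True False
```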

An interesting offshoot of this is that the problem of deciding whether two graphs are isomorphic (called GRAPH-ISOMORPHISM in complexity theory) has unknown complexity. It is known to be in NP, but it is not known whether it is NP-complete or in P. It has its own complexity class, called GI. The algebraic approach lets us attack the problem with the tools of linear algebra, and it is generalized by the mathematical branch of algebraic graph theory.

## Walks

Another interesting property of adjacency matrices is that they allow us to compute the number of walks from one node to another purely algebraically. A walk of length $$k$$ from $$v_i$$ to $$v_j$$ in a graph is an ordered list of $$k+1$$ nodes in which each node shares an edge with the next one, the first node is $$v_i$$, and the last node is $$v_j$$. This assertion is formalized as follows:

Lemma. Let $$G = (V, E)$$ be a graph, $$A$$ its adjacency matrix. Let $$n = |V|$$, and let $$\delta_k(i, j)$$ be the number of walks of length $$k$$ from node $$v_i$$ to node $$v_j$$. Then $$\delta_k(i, j) = A^k_{i, j}$$.

This is an easy induction proof. First, note that the claim holds for $$k = 1$$: there is a walk of length $$1$$ from $$v_i$$ to $$v_j$$ if and only if $$v_i$$ and $$v_j$$ are neighbors, meaning they share an edge. There is exactly one such walk if the edge exists, and none if it does not. Hence $$\delta_1(i, j) = A_{i, j}$$.

Now let us assume that this works for $$k$$, and attempt to prove it for $$k+1$$.

$\displaystyle A^{k+1}_{i, j} = (A^k \cdot A)_{i, j} = \sum_{r = 1}^n A^k_{i, r} \cdot A_{r, j}$

By induction, $$A^k_{i, r} = \delta_k(i, r)$$. Thus

$\displaystyle A^{k+1}_{i, j} = \sum_{r = 1}^n \delta_k(i, r) \cdot \delta_1(r, j)$

Any walk of length $$k+1$$ from $$v_i$$ to $$v_j$$ can be decomposed as a walk of length $$k$$ from $$v_i$$ to some neighbor $$v_r$$ of $$v_j$$, followed by the edge from $$v_r$$ to $$v_j$$. Counting the walks of length $$k+1$$ from $$v_i$$ to $$v_j$$ therefore amounts to summing, over all neighbors $$v_r$$ of $$v_j$$, the number of walks of length $$k$$ from $$v_i$$ to $$v_r$$. Thus

$\displaystyle \delta_{k+1}(i, j) = \sum_{r = 1}^n \delta_k(i, r) \cdot \delta_1(r, j) = A^{k+1}_{i, j}$
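The lemma is easy to check computationally by comparing the entries of $$A^k$$ against a direct enumeration of walks. The sketch below uses plain lists of lists for matrices; the example graph is made up.

```python
def mat_mul(A, B):
    """Multiply two square integer matrices given as lists of lists."""
    n = len(A)
    return [[sum(A[i][k] * B[k][j] for k in range(n)) for j in range(n)]
            for i in range(n)]

def mat_pow(A, k):
    """Compute A^k by repeated multiplication (k >= 0)."""
    n = len(A)
    R = [[int(i == j) for j in range(n)] for i in range(n)]  # identity matrix
    for _ in range(k):
        R = mat_mul(R, A)
    return R

def count_walks(A, i, j, k):
    """Count length-k walks from i to j by direct enumeration (exponential)."""
    if k == 0:
        return int(i == j)
    return sum(count_walks(A, r, j, k - 1) for r in range(len(A)) if A[i][r])

# Made-up example: a triangle 0-1-2 with a pendant edge 2-3.
A = [[0,1,1,0],[1,0,1,0],[1,1,0,1],[0,0,1,0]]
Ak = mat_pow(A, 4)
assert all(Ak[i][j] == count_walks(A, i, j, 4)
           for i in range(4) for j in range(4))
```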

Knowing this, we can derive some other information about a graph. For instance, the number of induced $$K_3$$ subgraphs that $$G$$ has, call it $$G_{K_3}$$, is exactly

$G_{K_3} = \displaystyle \frac{tr(A^3)}{6} = \frac{\displaystyle \sum_{i = 1}^n A^3_{i, i}}{6}$

We also have that

$|E| = \frac{tr(A^2)}{2} = \frac{\displaystyle \sum_{i = 1}^n A^2_{i, i}}{2}$

The proofs of these facts are left as an exercise; they are not hard.
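If you want to check your answers, both trace formulas can be verified on a small example such as $$K_4$$, which has $$6$$ edges and $$4$$ triangles:

```python
def trace_of_power(A, k):
    """Return tr(A^k) for a square matrix A given as lists of lists, k >= 1."""
    n = len(A)
    M = A
    for _ in range(k - 1):
        M = [[sum(M[i][r] * A[r][j] for r in range(n)) for j in range(n)]
             for i in range(n)]
    return sum(M[i][i] for i in range(n))

# K_4, the complete graph on 4 vertices: 6 edges and 4 triangles.
K4 = [[0 if i == j else 1 for j in range(4)] for i in range(4)]

edges = trace_of_power(K4, 2) // 2      # each edge gives two closed 2-walks
triangles = trace_of_power(K4, 3) // 6  # each triangle gives six closed 3-walks
print(edges, triangles)  # 6 4
```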

## Regularity

An interesting class of graphs consists of those in which $$deg(v_i) = d$$ for every vertex $$v_i$$, for some constant $$d$$. These are called the $$d$$-regular graphs. We shall now prove the following result about the adjacency matrix of a $$d$$-regular graph:

Lemma. Let $$G$$ be a $$d$$-regular graph, with $$A$$ its adjacency matrix. Then $$\rho(A) = d$$, where $$\rho(A)$$ is the matrix’s spectral radius, $$\max \{|\lambda| : \lambda \text{ eigenvalue of }A \}$$.

We will first prove that $$d$$ is an eigenvalue of $$A$$, and then that it is the largest one.

Let $$n = |V|$$, and take $$v = (1, \cdots, 1) \in \mathbb{R}^n$$. Since

$deg(v_i) = d = \displaystyle \sum_{j=1}^n A_{i, j} \ \forall\ 1 \le i \le n$

we have

$\displaystyle (Av)_i = \sum_{j = 1}^n A_{i, j} \cdot v_j = \sum_{j = 1}^n A_{i, j} \cdot 1 = deg(v_i) = d$

Thus $$Av = (d, \cdots, d) = dv$$, so $$d$$ is an eigenvalue with eigenvector $$v$$.

To see that it is the largest one, we note that $$||A||_{\infty} = d$$, since $$||A||_\infty = \max_i \sum_{j = 1}^n |A_{i, j}|$$ and every row of $$A$$ sums to $$d$$. Since the infinity norm is consistent, we have $$||Ax||_\infty \le ||A||_\infty ||x||_\infty$$ for every $$x$$. Now let $$\lambda$$ be an eigenvalue of $$A$$, with an eigenvector $$x$$, which by definition is nonzero. Then

$\begin{aligned} ||Ax||_\infty & \le ||A||_\infty ||x||_\infty \\ ||\lambda x||_\infty & \le d ||x||_\infty\\ |\lambda| ||x||_\infty & \le d ||x||_\infty\\ |\lambda| &\le d \end{aligned}$

This completes the proof.
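To make the first half of the argument concrete, here is a small check on an assumed example, the cycle $$C_6$$ (which is $$2$$-regular): the all-ones vector is indeed an eigenvector with eigenvalue $$d$$, and the infinity norm of $$A$$ equals $$d$$.

```python
# Assumed example: C_6, the cycle on 6 vertices, which is 2-regular.
n, d = 6, 2
A = [[1 if (i - j) % n in (1, n - 1) else 0 for j in range(n)] for i in range(n)]

ones = [1] * n
Av = [sum(A[i][j] * ones[j] for j in range(n)) for i in range(n)]
assert Av == [d] * n  # A * (1, ..., 1) = d * (1, ..., 1), so d is an eigenvalue

# The infinity norm (maximum absolute row sum) equals d, bounding every |lambda|.
inf_norm = max(sum(abs(x) for x in row) for row in A)
assert inf_norm == d
```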

An easy property related to regularity is that there are no $$d$$-regular graphs $$G = (V, E)$$ with $$n = |V|$$ such that $$n$$ and $$d$$ are both odd. The proof is a straightforward application of the handshake lemma. If we wanted to prove this with the adjacency matrix instead, the argument revolves around the fact that an adjacency matrix always has an even number of $$1$$s, by virtue of being symmetric with a zero diagonal: if we had an odd number $$n$$ of columns, each with an odd number $$d$$ of $$1$$s, the number of $$1$$s in the matrix would be odd, which we know can't happen.
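For a quick empirical confirmation of this parity fact, the brute-force sketch below enumerates all $$2^{10}$$ graphs on $$5$$ labeled vertices and checks that none of them is $$3$$-regular.

```python
from itertools import combinations

# Check exhaustively that no 3-regular graph exists on 5 vertices
# (n = 5 and d = 3 are both odd).
n, d = 5, 3
pairs = list(combinations(range(n), 2))  # the 10 possible edges

found = False
for mask in range(1 << len(pairs)):      # every subset of edges is a graph
    deg = [0] * n
    for b, (u, v) in enumerate(pairs):
        if mask >> b & 1:
            deg[u] += 1
            deg[v] += 1
    if all(x == d for x in deg):
        found = True
print(found)  # False
```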

## Graph properties

When a property of a graph is preserved under graph isomorphism, we call it a graph property, or a graph invariant. Since the adjacency matrices of two isomorphic graphs differ only by a simultaneous permutation of rows and columns, some properties of the adjacency matrix are graph invariants. For instance

- Since $$X_A = X_{P^{-1} A P}$$ for any invertible matrix $$P$$, where $$X_A$$ is the characteristic polynomial of $$A$$, and since $$P^{-1} = P^t$$ for a permutation matrix $$P$$, the characteristic polynomial is a graph invariant.
- The same holds for the minimal polynomial $$m_A$$ of an adjacency matrix.
- The set of eigenvalues of a matrix, together with their multiplicities, is called the spectrum of the matrix. Since isomorphic graphs have similar adjacency matrices (this kind of similarity is called permutation similarity), they share a spectrum.
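The invariance of the characteristic polynomial can be verified numerically. The sketch below (`char_poly` is a helper written for this post, not a library function) computes the coefficients with the Faddeev-LeVerrier recurrence in exact rational arithmetic, and confirms that two labelings of the path on three vertices produce the same polynomial.

```python
from fractions import Fraction

def char_poly(A):
    """Coefficients [c_n, ..., c_0] of det(xI - A), with c_n = 1,
    computed by the Faddeev-LeVerrier recurrence (exact arithmetic)."""
    n = len(A)
    I = [[Fraction(int(i == j)) for j in range(n)] for i in range(n)]

    def mul(X, Y):
        return [[sum(X[i][r] * Y[r][j] for r in range(n)) for j in range(n)]
                for i in range(n)]

    M = [row[:] for row in I]          # M_1 = I
    coeffs = [Fraction(1)]             # c_n = 1
    for k in range(1, n + 1):
        if k > 1:
            # M_k = A * M_{k-1} + c_{n-k+1} * I
            AM = mul(A, M)
            M = [[AM[i][j] + coeffs[-1] * I[i][j] for j in range(n)]
                 for i in range(n)]
        AMk = mul(A, M)
        trace = sum(AMk[i][i] for i in range(n))
        coeffs.append(-trace / k)      # c_{n-k} = -tr(A * M_k) / k
    return coeffs

A = [[0,1,0],[1,0,1],[0,1,0]]  # path 0-1-2, center at vertex 1
B = [[0,1,1],[1,0,0],[1,0,0]]  # same path relabeled, center at vertex 0
print(char_poly(A) == char_poly(B))  # True: both give x^3 - 2x
```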

One usually talks about the spectrum of the graph, and the characteristic or minimal polynomial of the graph, referring to its adjacency matrix. Note, however, that while being isomorphic implies sharing these quantities, the converse does not hold. For example, the following two graphs (taken from Godsil & Royle's Algebraic Graph Theory) have the same characteristic polynomial, and thus share a spectrum:

They both have adjacency matrices with characteristic polynomial $$(x+2)(x+1)^2(x-1)^2(x^2-2x-6)$$, yet they are clearly not isomorphic. The problem is that their adjacency matrices are similar, but not permutation similar. In particular, we see that neither planarity nor the degrees of the vertices are encoded in the graph's spectrum.

## Conclusion

I hope you got a taste of how we can extract information about a graph from its adjacency matrix, and why it matters beyond its use as a data structure. The fields of algebraic graph theory and, in particular, spectral graph theory study this structure and a related one, the incidence matrix, which we will see in future discussions.