Federico Lebrón

Algebraic aspects of adjacency matrices

Categories: Computer Science, Mathematics

In our last encounter, we saw how there is a matrix naturally associated with every graph, and it can be used for representing the graph on a computer. While this already merits giving adjacency matrices our attention as useful constructs, we shall now see that there is a whole lot more to them than just a data structure.

I'd like to introduce some motivation for the study of this and related structures.

Graph Isomorphism

Given two graphs $G = (V, E)$ and $G' = (V', E')$, we say that a bijective function $f: V \to V'$ is a graph isomorphism if $(u, v) \in E \iff (f(u), f(v)) \in E'$ for all $u, v \in V$. In that case we say that $G$ and $G'$ are isomorphic, and we denote this $G \simeq G'$. The idea is that two isomorphic graphs are really the same graph, just under a relabeling of the nodes. Most of the properties we are interested in when studying graphs are preserved under graph isomorphism. For instance, knowing that $G \simeq G'$:

• $|V| = |V'|, |E| = |E'|$.
• $G$ is connected $\iff$ $G'$ is connected.
• Every cycle $C$ in $G$ has a unique cycle image $f(C)$ in $G'$, and obviously $|C| = |f(C)|$.
• $\chi(G) = \chi(G')$, $\chi$ the chromatic number. In general, $G$ and $G'$ have the same chromatic polynomial. We will discuss this polynomial in further posts.
• $G$ and $G'$ have the same degree sequence.
• $\omega(G) = \omega(G')$, with $\omega$ the clique number (the size of a largest clique, that is, a largest complete subgraph).
• $G$ and $G'$ have the same vertex cover and edge cover numbers.

As you can see, many interesting properties of graphs are preserved by graph isomorphisms. Properties that depend not solely on the structure of the graph, but on its representation, need not be preserved under isomorphism. A common example: if we've labeled our nodes with the numbers $1$ to $n$, graph isomorphism does not preserve the quantity $\sum_{i=1}^n i \cdot \deg(v_i)$.
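To make this concrete, here is a small Python sketch (the path graph and the two labelings are my own illustration): relabeling the vertices of the same path changes the value of $\sum_i i \cdot \deg(v_i)$.

```python
def weighted_degree_sum(adj):
    """Compute sum over i of i * deg(v_i), for a graph given as {label: neighbors}."""
    return sum(i * len(neighbors) for i, neighbors in adj.items())

# The path on 3 vertices, under two different labelings of the same graph.
path_a = {1: [2], 2: [1, 3], 3: [2]}  # center vertex labeled 2
path_b = {1: [3], 3: [1, 2], 2: [3]}  # isomorphic: center relabeled to 3

sum_a = weighted_degree_sum(path_a)   # 1*1 + 2*2 + 3*1 = 8
sum_b = weighted_degree_sum(path_b)   # 1*1 + 3*2 + 2*1 = 9
```

The two graphs are isomorphic, yet the quantity differs, so it is not a graph invariant.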

Now, here is how this relates to adjacency matrices.

Lemma. Given two graphs $G = (V, E)$, $G' = (V', E')$ with adjacency matrices $A$ and $A'$ respectively, and calling $n = |V|$, then $G \simeq G'$ $\iff$ there exists a permutation matrix $P \in \{0, 1\}^{n \times n}$ such that

$$A' = P A P^t,$$

where $P^t$ denotes the transpose of $P$.

Thus, the problem of deciding whether two graphs are isomorphic is equivalent to deciding whether their adjacency matrices are simply permutations of each other. This makes sense, since the intuitive idea of a graph isomorphism is that it doesn't care if nodes have been relabeled or reordered.
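This characterization can be sketched as a brute-force test in Python. Rather than building permutation matrices explicitly, the code below tries every vertex permutation, which amounts to the same condition; it is exponential in $n$, so it is only usable on tiny graphs (the function and variable names are mine):

```python
from itertools import permutations

def are_isomorphic(A, B):
    """Brute-force isomorphism test on adjacency matrices (lists of lists).

    Searches for a vertex permutation p with A[i][j] == B[p[i]][p[j]]
    for all i, j -- the condition A' = P A P^t in disguise.
    Runs in O(n! * n^2) time; only sensible for very small graphs.
    """
    n = len(A)
    if len(B) != n:
        return False
    return any(
        all(A[i][j] == B[p[i]][p[j]] for i in range(n) for j in range(n))
        for p in permutations(range(n))
    )

triangle        = [[0, 1, 1], [1, 0, 1], [1, 1, 0]]
path3           = [[0, 1, 0], [1, 0, 1], [0, 1, 0]]  # path, center vertex second
path3_relabeled = [[0, 0, 1], [0, 0, 1], [1, 1, 0]]  # same path, center vertex last
```

Here `are_isomorphic(path3, path3_relabeled)` holds, while `are_isomorphic(triangle, path3)` does not.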

An interesting offshoot of this is that the problem of deciding whether two graphs are isomorphic (called GRAPH-ISOMORPHISM in complexity theory) has unknown complexity. It is known to be in NP, but it is not known whether it is NP-complete, whether it is in P, or where exactly it lies; it even has its own complexity class, called GI. This algebraic approach lets us attack the problem using the tools of linear algebra, and it is generalized by the mathematical branch of algebraic graph theory.

Walks

Another interesting property of adjacency matrices is that they allow us to compute the number of walks from one node to another purely algebraically. A walk of length $k$ from $v_i$ to $v_j$ in a graph is an ordered list of $k+1$ nodes in which each node shares an edge with the next one, the first node is $v_i$, and the last node is $v_j$. This assertion is formalized as follows:

Lemma. Let $G = (V, E)$ be a graph, $A$ its adjacency matrix. Let $n = |V|$, and let $\delta_k(i, j)$ be the number of walks of length $k$ from node $v_i$ to node $v_j$. Then $\delta_k(i, j) = (A^k)_{i, j}$.
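As a small sketch of the lemma in action (plain Python, helper names mine), here is walk counting on the cycle $C_4$; every closed walk of length $2$ goes out along one of the two incident edges and straight back, and since $C_4$ is bipartite there are no closed walks of odd length:

```python
def mat_mul(X, Y):
    """Multiply two square matrices given as lists of lists."""
    n = len(X)
    return [[sum(X[i][k] * Y[k][j] for k in range(n)) for j in range(n)]
            for i in range(n)]

def count_walks(A, k, i, j):
    """Number of walks of length k from v_i to v_j: the (i, j) entry of A^k."""
    n = len(A)
    power = [[1 if r == c else 0 for c in range(n)] for r in range(n)]  # identity
    for _ in range(k):
        power = mat_mul(power, A)
    return power[i][j]

# Adjacency matrix of the 4-cycle C_4: 0 - 1 - 2 - 3 - 0.
C4 = [[0, 1, 0, 1],
      [1, 0, 1, 0],
      [0, 1, 0, 1],
      [1, 0, 1, 0]]
```

For instance, `count_walks(C4, 2, 0, 0)` is $2$, while `count_walks(C4, 3, 0, 0)` is $0$.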

Knowing this, we can derive some other information about a graph. For instance, the number of $K_3$ subgraphs (triangles) that $G$ has, call it $G_{K_3}$, is exactly

$$G_{K_3} = \frac{\operatorname{tr}(A^3)}{6},$$

since each triangle gives rise to six closed walks of length $3$. We also have that

$$|E| = \frac{\operatorname{tr}(A^2)}{2},$$

since $(A^2)_{i,i} = \deg(v_i)$ and the degrees sum to $2|E|$.
The proofs of these facts are left as an exercise; they are not hard.
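As a quick numerical sanity check of the standard trace formulas, $\operatorname{tr}(A^3)/6$ for triangles and $\operatorname{tr}(A^2)/2$ for edges, consider $K_4$, which has $6$ edges and $4$ triangles (plain Python, helper name mine):

```python
def trace_of_power(A, k):
    """Trace of A^k for a square matrix given as lists of lists."""
    n = len(A)
    power = [[1 if r == c else 0 for c in range(n)] for r in range(n)]  # identity
    for _ in range(k):
        power = [[sum(power[i][m] * A[m][j] for m in range(n)) for j in range(n)]
                 for i in range(n)]
    return sum(power[i][i] for i in range(n))

# The complete graph K_4.
K4 = [[0, 1, 1, 1],
      [1, 0, 1, 1],
      [1, 1, 0, 1],
      [1, 1, 1, 0]]

num_edges     = trace_of_power(K4, 2) // 2   # tr(A^2) / 2 = 12 / 2 = 6
num_triangles = trace_of_power(K4, 3) // 6   # tr(A^3) / 6 = 24 / 6 = 4
```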

Regularity

An interesting class of graphs is that of graphs where every node has the same degree: $\deg(v_i) = d$ for all $i$, for some constant $d$. These are called the $d$-regular graphs. We shall now prove the following result about the adjacency matrix of a $d$-regular graph:

Lemma. Let $G$ be a $d$-regular graph, with $A$ its adjacency matrix. Then $\rho(A) = d$, where $\rho(A)$ is the matrix's spectral radius, $\max \{|\lambda| : \lambda \text{ eigenvalue of }A \}$.
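One way to see this is the following standard argument. Since every row of $A$ sums to $d$, the all-ones vector $\mathbf{1}$ satisfies $A\mathbf{1} = d\mathbf{1}$, so $d$ is an eigenvalue of $A$. Conversely, suppose $Ax = \lambda x$ with $x \neq 0$, and let $i$ be an index maximizing $|x_i|$. Then

$$|\lambda|\,|x_i| = \left|\sum_j A_{ij} x_j\right| \le \sum_j A_{ij} |x_j| \le d\,|x_i|,$$

so $|\lambda| \le d$. Together, these give $\rho(A) = d$.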

An easy property related to regularity is that there are no $d$-regular graphs $G = (V, E)$ with $n = |V|$ such that $n$ and $d$ are both odd. The proof is a straightforward application of the handshake lemma. If we wanted to prove it with the adjacency matrix instead, the argument revolves around the fact that the adjacency matrix of a simple graph always has an even number of $1$s, by virtue of being symmetric with a zero diagonal; if we had an odd number $n$ of columns, each with an odd number $d$ of $1$s, then the number of $1$s in the matrix would be odd, which we know can't happen.

Graph properties

When a property of a graph is preserved under graph isomorphism, we call it a graph property, or a graph invariant. Since the adjacency matrices of two isomorphic graphs are the same up to a permutation of the rows and columns, some properties of the adjacency matrix are graph invariants. For instance:

• Since $X_A = X_{P^{-1} A P}$ for any invertible matrix $P$, where $X_A$ is the characteristic polynomial of $A$, and since for a permutation matrix $P$ we have $P^{-1} = P^t$, the characteristic polynomial is a graph invariant.
• The same as above holds for the minimal polynomial, $m_A$ of an adjacency matrix.
• The set of eigenvalues of a matrix, together with their multiplicities, is called the spectrum of the matrix. Since isomorphic graphs have similar adjacency matrices (in particular, this kind of similarity is called permutation-similarity), they share eigenvalues.
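As a quick numerical illustration of the first point, using NumPy (the graph, a path on four vertices, and the permutation are arbitrary choices of mine):

```python
import numpy as np

# Adjacency matrix of the path graph P_4: 0 - 1 - 2 - 3.
A = np.array([[0, 1, 0, 0],
              [1, 0, 1, 0],
              [0, 1, 0, 1],
              [0, 0, 1, 0]])

# Build a permutation matrix P from a permutation sigma: row i has a 1 in column sigma(i).
sigma = [2, 0, 3, 1]
P = np.zeros((4, 4), dtype=int)
for i, j in enumerate(sigma):
    P[i, j] = 1

# Adjacency matrix of the relabeled (isomorphic) graph.
B = P @ A @ P.T

# np.poly returns the coefficients of the characteristic polynomial.
char_A = np.poly(A)
char_B = np.poly(B)
same_char_poly = np.allclose(char_A, char_B)
```

The two characteristic polynomials agree, as the invariance argument predicts.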

One usually talks about the spectrum of the graph, and the characteristic or minimal polynomial of the graph, referring to those of its adjacency matrix. Note, however, that while being isomorphic implies sharing these invariants, the converse does not hold. For example, the following two graphs (taken from Godsil & Royle's Algebraic Graph Theory) have the same characteristic polynomial, and thus share a spectrum:

They both have adjacency matrices with the characteristic polynomial $(x+2)(x+1)^2(x-1)^2(x^2-2x-6)$, yet they are clearly not isomorphic. The problem here is that their adjacency matrices are similar, but not permutation similar. In particular, we see how planarity is not encoded in the graph's spectrum, nor is the degree of its vertices.
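The Godsil & Royle pair isn't reproduced here, but the phenomenon is easy to check on the smallest cospectral pair, which I use instead: $K_{1,4}$ (the star on five vertices) and $C_4 \cup K_1$ (a four-cycle plus an isolated vertex). Both have spectrum $\{2, 0, 0, 0, -2\}$, yet their degree sequences already differ. A quick check with NumPy:

```python
import numpy as np

# K_{1,4}: vertex 0 joined to vertices 1..4.
star = np.array([[0, 1, 1, 1, 1],
                 [1, 0, 0, 0, 0],
                 [1, 0, 0, 0, 0],
                 [1, 0, 0, 0, 0],
                 [1, 0, 0, 0, 0]])

# C_4 on vertices 0..3, plus the isolated vertex 4.
cycle_plus_point = np.array([[0, 1, 0, 1, 0],
                             [1, 0, 1, 0, 0],
                             [0, 1, 0, 1, 0],
                             [1, 0, 1, 0, 0],
                             [0, 0, 0, 0, 0]])

# Same (sorted) eigenvalues -> cospectral graphs...
spec_star  = np.sort(np.linalg.eigvalsh(star))
spec_cycle = np.sort(np.linalg.eigvalsh(cycle_plus_point))
same_spectrum = np.allclose(spec_star, spec_cycle)

# ...but different degree sequences, so they cannot be isomorphic.
deg_star  = sorted(star.sum(axis=0))               # [1, 1, 1, 1, 4]
deg_cycle = sorted(cycle_plus_point.sum(axis=0))   # [0, 2, 2, 2, 2]
```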

Conclusion

I hope you got a taste of how we can extract information about a graph using its adjacency matrix, and why it deserves attention beyond its use as a data structure. The fields of algebraic graph theory and, in particular, spectral graph theory study this structure and a related one, the incidence matrix, which we will see in future discussions.