As the title states, this is a computationally expensive way to get all rings in a graph, but it's fairly simple, and it illustrates some nice principles. For a better way to do things, Rich Apodaca's description of the Hanser, Jauffret, and Kaufmann algorithm would be a good place to start.

Anyway, back to the expensive way. The set of cycles in a graph forms what is called a 'cycle space' - which I didn't understand at all for a while, but which is not actually that hard. For example, here is a basis set for the cycle space of a 3x3 hexagonal lattice:
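As an aside on where such a basis comes from: one standard construction (a sketch of the textbook spanning-tree method, not necessarily how the lattice basis above was made) takes a spanning tree of the graph; each non-tree edge closes exactly one "fundamental cycle" with the tree path between its endpoints.

```python
from collections import defaultdict, deque

def fundamental_cycle_basis(edges):
    """Cycle basis from a BFS spanning tree: one cycle per non-tree edge.
    Each cycle is returned as a set of frozenset edges."""
    adj = defaultdict(list)
    for u, v in edges:
        adj[u].append(v)
        adj[v].append(u)
    parent, depth, tree = {}, {}, set()
    # build a BFS spanning forest, recording tree edges
    for root in adj:
        if root in parent:
            continue
        parent[root], depth[root] = None, 0
        q = deque([root])
        while q:
            x = q.popleft()
            for y in adj[x]:
                if y not in parent:
                    parent[y], depth[y] = x, depth[x] + 1
                    tree.add(frozenset((x, y)))
                    q.append(y)
    basis = []
    for u, v in edges:
        if frozenset((u, v)) in tree:
            continue
        # non-tree edge: climb both endpoints to their common ancestor
        cycle = {frozenset((u, v))}
        a, b = u, v
        while a != b:
            if depth[a] < depth[b]:
                a, b = b, a
            cycle.add(frozenset((a, parent[a])))
            a = parent[a]
        basis.append(cycle)
    return basis

# a hexagon with one chord: 7 edges, 6 vertices, so 2 basis cycles
hexagon = [(0, 1), (1, 2), (2, 3), (3, 4), (4, 5), (5, 0), (0, 3)]
print(len(fundamental_cycle_basis(hexagon)))  # 2
```

The number of basis cycles is always |E| - |V| + (number of components), the 'circuit rank' of the graph.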

Looks like a bunch of cycles, really. The important thing is that it is possible to combine any subset of these cycles to get another cycle (or one of the other cycles in the basis). By 'combine' we mean XOR - the symmetric difference of the edge vectors. This sounds more complicated than necessary, so it's useful to consider a simple example. Well, the example is simple - the picture is not:
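In code, the ring sum is just Python's set XOR. A minimal sketch, using the edge labels from the example below (the labels are illustrative, nothing more):

```python
# Two cycles represented as edge sets, as in the A-C example.
A = {"a", "b", "d", "j"}
B = {"c", "d", "f", "i"}

# The ^ operator on sets is the symmetric difference:
# edges in either cycle but not in both, so the shared edge 'd' cancels.
D = A ^ B
print(sorted(D))  # ['a', 'b', 'c', 'f', 'i', 'j']
```

Equivalently, if each cycle is a bit vector over the edges, this is bitwise XOR of the two vectors.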

On the left here is a graph (top left) and a cycle basis (lower left; A-C). Each cycle is just a bit set (vector) of edges. So cycle **A** is the set of edges {a, b, d, j} and cycle **C** is {e, f, g, h}. On the right is one example of a combination of two cycles - known as a ring sum - to produce a new cycle **D**.

It should be clear that **A** + **B** is equal to {a, b, d, j} + {c, d, f, i} - {d}. In words, this is the union of the edge sets minus their intersection. One important point is that the ring sum of two disconnected cycles is just those two cycles, so **A** + **C** is simply **A** and **C** together...

All the cycles in the graph can be generated in this way - for example, **A** + **B** + **C** = {a, b, c, e, g, h, i, j}, the outer cycle of the graph. I think I am right in saying that the order of combination does not matter.

So why is this expensive? Well, the problem lies in the fact that there are a large number of subsets of the basis set - 2 to the power |S|, for a basis set S. Many of these will produce the same cycle, so there are undoubtedly ways to choose subsets more cleverly; I'm not sure what these are.
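The brute-force idea can be sketched as follows (my own illustration, not the post's actual code): enumerate all 2^|S| subsets of the basis, ring-sum each subset, and keep the results that form a single connected cycle - discarding disconnected unions and duplicates.

```python
from collections import defaultdict
from itertools import combinations

def ring_sum(cycles):
    """XOR together a collection of edge sets."""
    result = set()
    for c in cycles:
        result ^= c
    return result

def is_single_cycle(edges):
    """An edge set is one cycle iff every vertex has degree 2
    and the edges form a single connected component."""
    if not edges:
        return False
    adj = defaultdict(list)
    for u, v in edges:
        adj[u].append(v)
        adj[v].append(u)
    if any(len(ns) != 2 for ns in adj.values()):
        return False
    # flood-fill from an arbitrary vertex; must reach every vertex
    seen, stack = set(), [next(iter(adj))]
    while stack:
        x = stack.pop()
        if x in seen:
            continue
        seen.add(x)
        stack.extend(adj[x])
    return len(seen) == len(adj)

def all_cycles(basis):
    """Every cycle generated by the basis: tries all 2^|S| - 1
    non-empty subsets, so this is exponential in the basis size."""
    found = set()
    for r in range(1, len(basis) + 1):
        for subset in combinations(basis, r):
            edges = ring_sum(subset)
            if is_single_cycle(edges):
                found.add(frozenset(edges))
    return found

# Tiny example: two triangles sharing the edge (1, 3).
# Basis = the two triangles; their ring sum is the outer 4-cycle.
t1 = {(1, 2), (2, 3), (1, 3)}
t2 = {(1, 3), (3, 4), (1, 4)}
cycles = all_cycles([t1, t2])
print(len(cycles))  # 3: the two triangles plus the outer 4-cycle
```

Note the connectivity test is what rejects subsets like **A** + **C** above: the ring sum of two disjoint cycles has all degrees equal to 2 but two components, so it is not a single cycle.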

Finally, here is an example of a set of (unique) cycles on a lattice. The procedure to get these was ridiculous: a) find all rings as above, b) lay them out to get the faces, c) construct the inner dual, d) make a signature of the dual to filter duplicates.

Obviously, from a drawing point of view, the top row shows nicely symmetric ways to lay out 18-rings. Probably the top-center and top-right are the best, as they are also more convex.

## Comments