Continuous embedding of graphs. Network geometry is a related field.
For an overview, see the Graph Representation Learning book by William L. Hamilton, and surveys such as “Graph Embedding Techniques, Applications, and Performance: A Survey.”
A common technique uses (random) walks on graphs. Each walk can be treated as a “sentence” with nodes as “words,” so the walks can be fed directly into word embedding methods like word2vec. The walks can be biased in many ways, for instance against already-visited nodes (e.g., node2vec). The other paradigm is graph neural networks: akin to message passing, information is aggregated iteratively around each node to obtain that node’s representation.
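A minimal sketch of the walk-as-sentence idea, in pure Python. The toy graph, walk counts, and walk length are made up for illustration:

```python
import random

def random_walks(adj, walks_per_node=10, walk_length=5, seed=0):
    """Generate uniform random walks; each walk is a 'sentence' of node 'words'."""
    rng = random.Random(seed)
    walks = []
    for start in adj:
        for _ in range(walks_per_node):
            walk = [start]
            while len(walk) < walk_length:
                neighbors = adj[walk[-1]]
                if not neighbors:
                    break  # dead end: stop the walk early
                walk.append(rng.choice(neighbors))
            walks.append(walk)
    return walks

# A toy undirected graph as an adjacency list (hypothetical example data).
adj = {
    "a": ["b", "c"],
    "b": ["a", "c"],
    "c": ["a", "b", "d"],
    "d": ["c"],
}
walks = random_walks(adj)
# These "sentences" could then be passed to a word embedding model, e.g.:
#   from gensim.models import Word2Vec
#   model = Word2Vec(walks, vector_size=16, window=3, min_count=1)
```

node2vec additionally biases the neighbor choice with return and in-out parameters; the uniform choice above corresponds to the unbiased DeepWalk-style walk.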
One can also think about representing objects other than nodes. For instance, can we obtain embeddings of edges or of relationships?
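One common recipe (used, e.g., in node2vec’s link-prediction experiments) derives an edge embedding from the two endpoint embeddings with a symmetric binary operator such as the Hadamard product or the average. A sketch with made-up vectors:

```python
def hadamard(u, v):
    """Edge embedding as the element-wise product of the endpoint embeddings."""
    return [a * b for a, b in zip(u, v)]

def average(u, v):
    """Edge embedding as the element-wise mean of the endpoint embeddings."""
    return [(a + b) / 2 for a, b in zip(u, v)]

# Hypothetical 4-dimensional node embeddings.
emb = {"a": [1.0, 0.0, 2.0, -1.0], "b": [0.5, 1.0, 0.0, 2.0]}
edge_ab = hadamard(emb["a"], emb["b"])  # [0.5, 0.0, 0.0, -2.0]
```

Symmetric operators are chosen so that the embedding of edge (a, b) equals that of (b, a) in undirected graphs.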
Can graph embedding capture hierarchical structure? One idea is to coarse-grain a network into increasingly smaller, higher-level networks, and then use that hierarchy to learn fine-grained embeddings (Chen2017HARP, Hu2019HGCN).
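The coarse-graining step can be illustrated with a simplified edge-collapsing pass, in the spirit of HARP but not the papers’ exact algorithms; the graph data is made up:

```python
def coarsen(adj):
    """One level of coarsening: greedily collapse matched edges into supernodes."""
    merged, mapping = set(), {}
    for u in adj:
        if u in merged:
            continue
        # Pick the first still-unmatched neighbor as u's merge partner.
        partner = next((v for v in adj[u] if v not in merged), None)
        if partner is not None:
            super_node = (u, partner)
            merged.update(super_node)
            mapping[u] = mapping[partner] = super_node
        else:
            merged.add(u)
            mapping[u] = (u,)  # unmatched node becomes a singleton supernode
    # Build the coarsened adjacency between supernodes.
    coarse = {}
    for u, nbrs in adj.items():
        su = mapping[u]
        coarse.setdefault(su, set())
        for v in nbrs:
            if mapping[v] != su:
                coarse[su].add(mapping[v])
    return mapping, coarse

# Toy path graph a-b, a-c, c-d (hypothetical example data).
adj = {"a": ["b", "c"], "b": ["a"], "c": ["a", "d"], "d": ["c"]}
mapping, coarse = coarsen(adj)
```

Repeating this pass yields the sequence of increasingly smaller graphs; embeddings learned on a coarse level can then initialize the next, finer level.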
Methods and libraries
- It’s Who You Know: Graph Mining Using Recursive Structural Features