Algorithmic advances for the adjacency spectral embedding
Abstract:
The Random Dot Product Graph (RDPG) is a popular generative graph model for relational data. The RDPG postulates that each node has a latent position, and specifies the edge-formation probabilities via the inner products of the corresponding latent vectors. The embedding task of estimating these latent positions from observed graphs is usually posed as a non-convex matrix factorization problem. The workhorse Adjacency Spectral Embedding offers an approximate solution obtained via the eigendecomposition of the adjacency matrix, which enjoys solid statistical guarantees but can be computationally intensive and formally solves a surrogate problem. In this paper, we bring to bear recent non-convex optimization advances and demonstrate their impact on RDPG inference. We develop first-order gradient descent methods to better solve the original optimization problem, and to accommodate broader network embedding applications in an organic way. The effectiveness of the resulting graph representation learning framework is demonstrated on both synthetic and real data. We show the algorithms are scalable, robust to missing network data, and can track the latent positions over time when the graphs are acquired in a streaming fashion.
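The two estimators the abstract contrasts can be sketched in a few lines of NumPy. Below, a minimal illustration (not the paper's implementation): `ase` computes the Adjacency Spectral Embedding by scaling the top-d eigenvectors of the adjacency matrix, and `gd_embed` refines it with plain gradient descent on the original factorization objective, masking the diagonal since the RDPG does not model self-loops. The graph sizes, step size, and iteration count are illustrative assumptions.

```python
import numpy as np

def ase(A, d):
    """Adjacency Spectral Embedding: top-d eigenvectors of A, scaled by
    the square roots of the corresponding eigenvalue magnitudes."""
    vals, vecs = np.linalg.eigh(A)
    top = np.argsort(np.abs(vals))[::-1][:d]
    return vecs[:, top] * np.sqrt(np.abs(vals[top]))

def masked_loss(A, X):
    """Off-diagonal squared error ||M o (A - X X^T)||_F^2, where the
    mask M zeros the diagonal (self-loops are not modeled)."""
    R = A - X @ X.T
    np.fill_diagonal(R, 0.0)
    return np.linalg.norm(R) ** 2

def gd_embed(A, X0, step=5e-4, iters=500):
    """First-order refinement: gradient descent on the masked
    factorization objective, warm-started at the spectral solution."""
    X = X0.copy()
    for _ in range(iters):
        R = X @ X.T - A
        np.fill_diagonal(R, 0.0)        # ignore the unobserved diagonal
        X = X - step * 4.0 * R @ X      # gradient of the masked objective
    return X

# Simulate a small RDPG (hypothetical sizes n=60, d=2, for illustration).
rng = np.random.default_rng(0)
n, d = 60, 2
Z = rng.uniform(0.2, 0.8, size=(n, d)) / np.sqrt(d)  # latent positions
P = Z @ Z.T                                          # edge probabilities
A = (rng.uniform(size=(n, n)) < P).astype(float)
A = np.triu(A, 1)
A = A + A.T                                          # symmetric, zero diagonal

X0 = ase(A, d)       # spectral (surrogate) solution
X = gd_embed(A, X0)  # gradient-descent refinement
print(masked_loss(A, X0), masked_loss(A, X))
```

On this toy instance, the gradient refinement lowers the masked objective below its value at the spectral initialization, illustrating the abstract's point that the eigendecomposition solves a surrogate of the actual problem.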
2022
This work was partially funded by the NSF (grants CCF-1750428 and ECCS-1809356)
Representation learning; Signal processing algorithms; Europe; Signal processing; Probability; Solids; Inference algorithms; Graph Representation Learning; Gradient Descent; Non-convex Optimization; Random Dot Product Graphs
English
Universidad de la República
COLIBRI | |
https://ieeexplore.ieee.org/document/9909610
https://hdl.handle.net/20.500.12008/35237
Open access
Creative Commons Attribution - NonCommercial - NoDerivatives license (CC BY-NC-ND 4.0)