Graph Embeddings, Symmetric Real Matrices, and Generalized Inverses
Title | Graph Embeddings, Symmetric Real Matrices, and Generalized Inverses
Author | Stephen Guattery
Publisher |
Pages | 18
Release | 1998
Genre | Eigenvalues
ISBN |
Graph embedding techniques for bounding the eigenvalues of associated matrices have a wide range of applications. The bounds produced by these techniques are not tight in general, however, and can be off by a factor of log² n for some graphs. Guattery and Miller showed that, by adding edge directions to the graph representation, they could construct an embedding called the current flow embedding, which embeds each edge of the guest graph as an electrical current flow in the host graph. They also showed how this embedding can be used to construct matrices whose nonzero eigenvalues have a one-to-one correspondence with the reciprocals of the eigenvalues of the generalized Laplacians. For the Laplacians of graphs with zero Dirichlet boundary conditions, they showed that the current flow embedding can be used to generate the inverse of the matrix. In this paper, we generalize the definition of graph embeddings to cover all symmetric matrices, and we show a way of computing a generalized current flow embedding. We prove that, for any symmetric matrix A, the generalized current flow embedding of the orthogonal projector for the column space of A into A can be used to construct the generalized inverse, or pseudoinverse, of A. We also show how these results can be extended to cover Hermitian matrices.
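As a quick numerical illustration of the relationship the abstract refers to (a minimal numpy sketch, not the current flow construction from the report; the 4-cycle Laplacian below is an assumed example): for a symmetric matrix such as a graph Laplacian L, the nonzero eigenvalues of the pseudoinverse are the reciprocals of the nonzero eigenvalues of L, and L times its pseudoinverse is the orthogonal projector onto the column space of L.

    import numpy as np

    # Laplacian of the 4-cycle: degree matrix minus adjacency matrix.
    A = np.array([[0, 1, 0, 1],
                  [1, 0, 1, 0],
                  [0, 1, 0, 1],
                  [1, 0, 1, 0]], dtype=float)
    L = np.diag(A.sum(axis=1)) - A

    L_pinv = np.linalg.pinv(L)  # Moore-Penrose pseudoinverse

    # Nonzero eigenvalues of pinv(L) are the reciprocals of those of L.
    eig_L = np.linalg.eigvalsh(L)
    eig_P = np.linalg.eigvalsh(L_pinv)
    assert np.allclose(np.sort(1.0 / eig_L[eig_L > 1e-9]),
                       np.sort(eig_P[eig_P > 1e-9]))

    # L @ pinv(L) is the orthogonal projector onto col(L):
    # symmetric and idempotent.
    P = L @ L_pinv
    assert np.allclose(P, P.T) and np.allclose(P @ P, P)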
Graph Separators, with Applications
Title | Graph Separators, with Applications
Author | Arnold L. Rosenberg
Publisher | Springer Science & Business Media
Pages | 267
Release | 2005-12-21
Genre | Computers
ISBN | 0306469774
Graph Separators, with Applications is devoted to techniques for obtaining upper and lower bounds on the sizes of graph separators, with upper bounds obtained via decomposition algorithms. The book surveys the main approaches to obtaining good graph separations, but its main focus is on techniques for deriving lower bounds on the sizes of graph separators (a toy example of a decomposition-based upper bound is sketched after this description). This asymmetry in focus reflects our perception that the work on upper bounds, or algorithms, for graph separation is much better represented in the standard theory literature than the work on lower bounds, which we perceive as being scattered throughout the literature on application areas. Given the multitude of notions of graph separator that have been developed and studied over roughly the past three decades, there is a need for a central, theory-oriented repository for this mass of results. The need is especially acute for lower-bound techniques, since these techniques have virtually never appeared in articles having the word 'separator' or any of its near-synonyms in the title. Graph Separators, with Applications fills this need.
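As a small, assumed illustration of the kind of separator upper bound discussed above (not an example taken from the book): removing the middle column of an n-by-n grid graph deletes only n of its n² vertices yet disconnects it into two pieces of roughly equal size, which a breadth-first search confirms.

    from collections import deque

    def grid_graph(n):
        """Adjacency lists of the n x n grid graph on vertices (i, j)."""
        adj = {(i, j): [] for i in range(n) for j in range(n)}
        for i in range(n):
            for j in range(n):
                for di, dj in ((1, 0), (0, 1)):
                    if i + di < n and j + dj < n:
                        adj[(i, j)].append((i + di, j + dj))
                        adj[(i + di, j + dj)].append((i, j))
        return adj

    def component_sizes(adj, removed):
        """Sizes of connected components after deleting 'removed' vertices."""
        seen, sizes = set(removed), []
        for s in adj:
            if s in seen:
                continue
            count, queue = 0, deque([s])
            seen.add(s)
            while queue:
                u = queue.popleft()
                count += 1
                for v in adj[u]:
                    if v not in seen:
                        seen.add(v)
                        queue.append(v)
            sizes.append(count)
        return sorted(sizes)

    n = 8
    separator = {(i, n // 2) for i in range(n)}  # one column: n vertices
    print(len(separator), component_sizes(grid_graph(n), separator))  # 8 [24, 32]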
ICASE Semiannual Report
Title | ICASE Semiannual Report
Author |
Publisher |
Pages | 80
Release | 1998
Genre |
ISBN |
NASA Langley Scientific and Technical Information Output: 1998
Title | NASA Langley Scientific and Technical Information Output: 1998
Author |
Publisher |
Pages | 166
Release | 1999
Genre |
ISBN |
Graph Representation Learning
Title | Graph Representation Learning
Author | William L. Hamilton
Publisher | Springer Nature
Pages | 141
Release | 2022-06-01
Genre | Computers
ISBN | 3031015886
Graph-structured data is ubiquitous throughout the natural and social sciences, from telecommunication networks to quantum chemistry. Building relational inductive biases into deep learning architectures is crucial for creating systems that can learn, reason, and generalize from this kind of data. Recent years have seen a surge in research on graph representation learning, including techniques for deep graph embeddings, generalizations of convolutional neural networks to graph-structured data, and neural message-passing approaches inspired by belief propagation. These advances in graph representation learning have led to new state-of-the-art results in numerous domains, including chemical synthesis, 3D vision, recommender systems, question answering, and social network analysis. This book provides a synthesis and overview of graph representation learning. It begins with a discussion of the goals of graph representation learning as well as key methodological foundations in graph theory and network analysis. Following this, the book introduces and reviews methods for learning node embeddings, including random-walk-based methods and applications to knowledge graphs. It then provides a technical synthesis and introduction to the highly successful graph neural network (GNN) formalism, which has become a dominant and fast-growing paradigm for deep learning with graph data. The book concludes with a synthesis of recent advancements in deep generative models for graphs—a nascent but quickly growing subset of graph representation learning.
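To make the message-passing idea concrete (a toy numpy sketch under assumed names, not code from the book): in a GCN-style layer, each node aggregates its neighbours' feature vectors through a normalized adjacency matrix, applies a learned weight matrix, and passes the result through a nonlinearity.

    import numpy as np

    def gcn_layer(A, H, W):
        """One message-passing step: H' = ReLU(D^{-1/2}(A + I)D^{-1/2} H W)."""
        A_hat = A + np.eye(A.shape[0])                           # add self-loops
        d_inv_sqrt = np.diag(1.0 / np.sqrt(A_hat.sum(axis=1)))   # normalization
        return np.maximum(d_inv_sqrt @ A_hat @ d_inv_sqrt @ H @ W, 0.0)

    # Toy graph: a triangle with a pendant vertex, random 8-dim node features.
    rng = np.random.default_rng(0)
    A = np.array([[0, 1, 1, 0],
                  [1, 0, 1, 0],
                  [1, 1, 0, 1],
                  [0, 0, 1, 0]], dtype=float)
    H = rng.normal(size=(4, 8))      # initial node features
    W = rng.normal(size=(8, 16))     # weight matrix (random here, learned in practice)
    print(gcn_layer(A, H, W).shape)  # (4, 16): updated node embeddings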
Development of a Mixed Shell Element for 7-parameter Formulation and Identification Methods of Lowest Eigenvalues
Title | Development of a Mixed Shell Element for 7-parameter Formulation and Identification Methods of Lowest Eigenvalues
Author | Youngwon Hahn
Publisher |
Pages | 624
Release | 2005
Genre |
ISBN |
Monthly Catalog of United States Government Publications
Title | Monthly Catalog of United States Government Publications
Author |
Publisher |
Pages | 1154
Release | 1999-07
Genre | Government publications
ISBN |