Matrix Decompositions In Graph Theory: A Comprehensive Study

Vol. 7, Jan-Dec 2021 | Page: 112-119

Dr. Rajendra Singh
Associate Professor and Head, Department of Mathematics; Off. Principal, M.B.P.G. College, Dadri, G.B. Nagar

Received: 15-07-2021, Accepted: 27-09-2021, Published Online: 17-11-2021


. Download Full Paper

Abstract

Matrix decompositions are powerful tools in graph theory, providing critical insights into the structure and properties of graphs. Key decompositions built on the adjacency matrix, the Laplacian matrix, and their variants facilitate a wide range of graph-related computations and analyses. More advanced decompositions, such as those involving a graph's incidence matrices and their factorizations, further deepen our understanding of graph properties and optimization problems. These techniques underpin algorithms for graph partitioning, network analysis, and dynamic graph models, making matrix decompositions indispensable for both theoretical and applied graph theory. We advocate a unified view of matrix decomposition problems on graphs, encompassing geometric matrix completion and graph-regularized dimensionality reduction. Our unified framework rests on a key insight: representing a function on the product space of graphs in a reduced basis suffices to recover a low-rank matrix approximation, even from a sparse signal.
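The two ingredients the abstract relies on, the Laplacian eigendecomposition (used for partitioning) and a reduced eigenbasis for representing functions on a graph, can be illustrated with a small numerical sketch. The toy graph below (two triangles joined by a bridge edge) and the sample signal are hypothetical, chosen only to make the Fiedler-vector partition and the reduced-basis projection visible; this is an illustration of the general technique, not the paper's specific algorithm.

```python
import numpy as np

# Hypothetical toy graph: two 3-node cliques (triangles) joined by
# a single bridge edge (2, 3). Nodes are labeled 0..5.
edges = [(0, 1), (0, 2), (1, 2), (3, 4), (3, 5), (4, 5), (2, 3)]
n = 6

# Adjacency matrix A and combinatorial Laplacian L = D - A.
A = np.zeros((n, n))
for i, j in edges:
    A[i, j] = A[j, i] = 1.0
L = np.diag(A.sum(axis=1)) - A

# Spectral decomposition L = U diag(w) U^T (L is symmetric PSD,
# so eigh returns real eigenvalues in ascending order).
w, U = np.linalg.eigh(L)

# Spectral partitioning: the Fiedler vector (eigenvector of the
# second-smallest eigenvalue) splits the graph by its sign pattern,
# separating the two triangles across the bridge edge.
fiedler = U[:, 1]
part = fiedler < 0

# Reduced-basis idea from the abstract: a signal on the nodes is
# approximated in the span of the first k Laplacian eigenvectors,
# giving a smooth low-dimensional representation.
k = 3
signal = np.array([1.0, 1.1, 0.9, -1.0, -0.9, -1.1])
coeffs = U[:, :k].T @ signal   # project onto the reduced basis
approx = U[:, :k] @ coeffs     # reconstruct from k coefficients
```

Because the smallest Laplacian eigenvalue of a connected graph is zero with a constant eigenvector, the first few eigenvectors capture the slowest-varying structure; truncating to `k` of them is exactly the kind of reduced basis the abstract invokes for low-rank recovery.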
