Big data plays an increasingly central role in many areas of research, including optimization and network modeling. We consider problems applicable to large datasets within these two branches of research. We begin by presenting a nonlinearly preconditioned nonlinear conjugate gradient (PNCG) algorithm to increase the convergence speed of iterative unconstrained optimization methods. We demonstrate numerically that PNCG often leads to important pay-offs for computing difficult rank-R canonical tensor decompositions, with convergence that is significantly faster and more robust than for the stand-alone nonlinear conjugate gradient (NCG) algorithm or the alternating least squares (ALS) algorithm. As well, we show numerically that the PNCG algorithm requires fewer iterations and less time than stand-alone ALS to reach desired ranking accuracies when solving latent factor models, which serve as important building blocks in many practical recommendation systems.
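To make the ALS baseline concrete, here is a minimal NumPy sketch of alternating least squares for a rank-R CP decomposition of a 3-way tensor. This is an illustrative textbook version, not the implementation studied in the talk, and the function names are our own.

```python
import numpy as np

def unfold(T, mode):
    # Mode-n unfolding: move the chosen mode to the front and flatten
    # the remaining modes (in their original order) into columns.
    return np.moveaxis(T, mode, 0).reshape(T.shape[mode], -1)

def khatri_rao(A, B):
    # Column-wise Kronecker product: row i*J + j holds A[i, :] * B[j, :].
    R = A.shape[1]
    return np.einsum('ir,jr->ijr', A, B).reshape(-1, R)

def cp_als(T, R, n_iter=100, seed=0):
    # ALS for a rank-R CP decomposition of a 3-way tensor T:
    # each factor matrix is updated in turn by solving a linear
    # least-squares problem with the other two factors held fixed.
    rng = np.random.default_rng(seed)
    factors = [rng.standard_normal((s, R)) for s in T.shape]
    for _ in range(n_iter):
        for mode in range(3):
            others = [factors[m] for m in range(3) if m != mode]
            # Khatri-Rao of the other factors, ordered to match the
            # column ordering produced by unfold() above.
            KR = khatri_rao(others[0], others[1])
            factors[mode] = np.linalg.lstsq(
                KR, unfold(T, mode).T, rcond=None)[0].T
    return factors
```

Each inner update is a cheap linear solve, which is why ALS is a popular baseline; its weakness, as the abstract notes, is slow or stalled convergence on difficult tensors, which is what nonlinear preconditioning targets.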
We next turn to problems within the field of network or graph modeling. Many large real-world networks across different fields share similar properties, which has generated considerable interest in developing models that can replicate these properties. We begin our discussion of graph models by closely examining the Chung-Lu model. We explore, both theoretically and numerically, what happens when simple changes are made to the model and when the model assumptions are violated. As well, we examine the properties of an approximate Chung-Lu model under a variety of conditions and explore several ways of improving this approximation. We then design a new graph generator to match the community structure found in real-world networks, as measured by the clustering coefficient and the assortativity coefficient. Using several real-world networks, we test our algorithm numerically by generating synthetic networks and comparing their properties both with those of the real networks and with those produced by another popular graph generator, BTER, developed by Seshadhri, Kolda, and Pinar.
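For readers unfamiliar with the Chung-Lu model: given a weight sequence w, each edge {i, j} is included independently with probability min(w_i w_j / Σw, 1), so each node's expected degree is approximately its weight when no probability is capped. The following is a minimal sketch of the standard model only, not the approximate or modified variants examined in the talk.

```python
import numpy as np

def chung_lu(weights, seed=0):
    # Chung-Lu random graph: edge {i, j} appears independently with
    # probability min(w_i * w_j / sum(w), 1). Returns an edge list
    # with i < j (simple, undirected, no self-loops).
    rng = np.random.default_rng(seed)
    w = np.asarray(weights, dtype=float)
    S = w.sum()
    n = len(w)
    edges = []
    for i in range(n):
        for j in range(i + 1, n):
            if rng.random() < min(w[i] * w[j] / S, 1.0):
                edges.append((i, j))
    return edges
```

This direct formulation costs O(n^2) coin flips; fast samplers and the approximate variants mentioned in the abstract exist precisely to avoid that cost on large networks, at the price of slightly distorting the edge probabilities.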
Bio: Manda Winlaw recently completed her Ph.D. in Applied Mathematics at the University of Waterloo with a dissertation titled “Algorithms and Models for Tensors and Networks with Applications in Data Science.” Her work has been presented at several conferences, and she has won best paper awards at ICPADS 2015 and the Thirteenth Copper Mountain Conference on Iterative Methods. She interned at Lawrence Livermore National Laboratory for several summers, and her previous experience includes time as an economist at the Bank of Canada.