Spectral Clustering and Biclustering

Learning Large Graphs and Contingency Tables
1st edition

By: Marianna Bolla

69,99 €

Publisher: Wiley
Format: PDF
Published: 19 June 2013
ISBN/EAN: 9781118650707
Language: English
Number of pages: 296

DRM-protected eBook; to read it you need, for example, Adobe Digital Editions and an Adobe ID.

Description

Explores regular structures in graphs and contingency tables by spectral theory and statistical methods.

This book bridges the gap between graph theory and statistics by answering the demanding questions that arise when statisticians are confronted with large weighted graphs or rectangular arrays. Classical and modern statistical methods applicable to biological, social, and communication networks, or to microarrays, are presented together with the theoretical background and proofs.

The book is suitable for a one-semester course for graduate students in data mining, multivariate statistics, or applied graph theory; by skipping the proofs, the algorithms can also be used by specialists who simply want to retrieve information from their data when analysing communication, social, or biological networks.

Spectral Clustering and Biclustering:

- Provides a unified treatment of edge-weighted graphs and contingency tables via methods of multivariate statistical analysis (factoring, clustering, and biclustering).
- Uses spectral embedding and relaxation to estimate multiway cuts of edge-weighted graphs and bicuts of contingency tables (see the illustrative sketch after this description).
- Goes beyond expanders by describing the structure of dense graphs with a small spectral gap via the structural eigenvalues and eigen-subspaces of the normalized modularity matrix.
- Treats graphs as statistical data by combining methods of graph theory and statistics.
- Establishes a common outline structure for the contents of each algorithm, applicable to networks and microarrays, with unified notions and principles.
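Purely as an illustration (not code from the book), the following minimal Python sketch shows the standard normalized-Laplacian variant of the spectral-embedding-plus-clustering idea the bullet list refers to: vertices of a small edge-weighted graph are embedded using eigenvectors of the normalized Laplacian and then grouped with k-means. The function name spectral_clusters and the toy weight matrix are assumptions made for this example; the book's own algorithms (e.g. those based on the normalized modularity matrix) differ in detail.

# Illustrative sketch only: standard normalized-Laplacian spectral clustering,
# not the exact algorithm of the book. Assumes numpy and scipy are installed.
import numpy as np
from scipy.cluster.vq import kmeans2

def spectral_clusters(W, k, seed=0):
    """Partition the vertices of an edge-weighted graph into k clusters.

    W : symmetric nonnegative weight (adjacency) matrix
    k : number of clusters
    Returns an array of cluster labels, one per vertex.
    """
    d = W.sum(axis=1)                                     # vertex degrees (volumes)
    D_inv_sqrt = np.diag(1.0 / np.sqrt(d))
    L_sym = np.eye(len(d)) - D_inv_sqrt @ W @ D_inv_sqrt  # normalized Laplacian
    eigvals, eigvecs = np.linalg.eigh(L_sym)              # eigenvalues in ascending order
    X = eigvecs[:, :k]                                    # spectral embedding in R^k
    X = X / np.linalg.norm(X, axis=1, keepdims=True)      # row-normalize the embedding
    _, labels = kmeans2(X, k, minit='++', seed=seed)      # k-means on the embedded vertices
    return labels

# Toy example (assumed data): two triangles joined by one weak edge.
W = np.array([
    [0, 1, 1, 0.0, 0, 0],
    [1, 0, 1, 0.0, 0, 0],
    [1, 1, 0, 0.1, 0, 0],
    [0, 0, 0.1, 0, 1, 1],
    [0, 0, 0.0, 1, 0, 1],
    [0, 0, 0.0, 1, 1, 0],
], dtype=float)
print(spectral_clusters(W, 2))   # e.g. [0 0 0 1 1 1] (label order may swap)

The weak edge of weight 0.1 is the multiway (here: two-way) cut the spectral relaxation is expected to recover.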
Table of contents

Preface xi
Acknowledgements xiii
List of abbreviations xv
Introduction xix
References xxii

1 Multivariate analysis techniques for representing graphs and contingency tables 1
1.1 Quadratic placement problems for weighted graphs and hypergraphs 1
1.1.1 Representation of edge-weighted graphs 2
1.1.2 Representation of hypergraphs 5
1.1.3 Examples for spectra and representation of simple graphs 8
1.2 SVD of contingency tables and correspondence matrices 12
1.3 Normalized Laplacian and modularity spectra 16
1.4 Representation of joint distributions 21
1.4.1 General setup 21
1.4.2 Integral operators between L2 spaces 22
1.4.3 When the kernel is the joint distribution itself 23
1.4.4 Maximal correlation and optimal representations 25
1.5 Treating nonlinearities via reproducing kernel Hilbert spaces 28
1.5.1 Notion of the reproducing kernel 29
1.5.2 RKHS corresponding to a kernel 32
1.5.3 Two examples of an RKHS 33
1.5.4 Kernel – based on a sample – and the empirical feature map 37
References 40

2 Multiway cuts and spectra 44
2.1 Estimating multiway cuts via spectral relaxation 44
2.1.1 Maximum, minimum, and ratio cuts of edge-weighted graphs 45
2.1.2 Multiway cuts of hypergraphs 54
2.2 Normalized cuts 57
2.3 The isoperimetric number and sparse cuts 64
2.4 The Newman–Girvan modularity 76
2.4.1 Maximizing the balanced Newman–Girvan modularity 78
2.4.2 Maximizing the normalized Newman–Girvan modularity 81
2.4.3 Anti-community structure and some examples 84
2.5 Normalized bicuts of contingency tables 88
References 91

3 Large networks, perturbation of block structures 96
3.1 Symmetric block structures burdened with random noise 96
3.1.1 General blown-up structures 99
3.1.2 Blown-up multipartite structures 109
3.1.3 Weak links between disjoint components 112
3.1.4 Recognizing the structure 114
3.1.5 Random power law graphs and the extended planted partition model 121
3.2 Noisy contingency tables 124
3.2.1 Singular values of a noisy contingency table 127
3.2.2 Clustering the rows and columns via singular vector pairs 129
3.2.3 Perturbation results for correspondence matrices 132
3.2.4 Finding the blown-up skeleton 138
3.3 Regular cluster pairs 142
3.3.1 Normalized modularity and volume regularity of edge-weighted graphs 142
3.3.2 Correspondence matrices and volume regularity of contingency tables 150
3.3.3 Directed graphs 156
References 157

4 Testable graph and contingency table parameters 161
4.1 Convergent graph sequences 161
4.2 Testability of weighted graph parameters 164
4.3 Testability of minimum balanced multiway cuts 166
4.4 Balanced cuts and fuzzy clustering 172
4.5 Noisy graph sequences 175
4.6 Convergence of the spectra and spectral subspaces 177
4.7 Convergence of contingency tables 182
References 187

5 Statistical learning of networks 189
5.1 Parameter estimation in random graph models 189
5.1.1 EM algorithm for estimating the parameters of the block-model 189
5.1.2 Parameter estimation in the α and β models 192
5.2 Nonparametric methods for clustering networks 197
5.2.1 Spectral clustering of graphs and biclustering of contingency tables 199
5.2.2 Clustering of hypergraphs 201
5.3 Supervised learning 203
References 205

Appendix A Linear algebra and some functional analysis 207
A.1 Metric, normed vector, and Euclidean spaces 207
A.2 Hilbert spaces 209
A.3 Matrices 217
References 233

Appendix B Random vectors and matrices 235
B.1 Random vectors 235
B.2 Random matrices 239
References 245

Appendix C Multivariate statistical methods 246
C.1 Principal component analysis 246
C.2 Canonical correlation analysis 248
C.3 Correspondence analysis 250
C.4 Multivariate regression and analysis of variance 252
C.5 The k-means clustering 255
C.6 Multidimensional scaling 257
C.7 Discriminant analysis 258
References 261

Index 263
About the author

Marianna Bolla graduated from Eötvös University, Budapest, and holds a PhD (1984) and a CSc degree (1993) from the Hungarian Academy of Sciences. She is currently a professor at the Institute of Mathematics, Budapest University of Technology and Economics, and an adjunct professor at the Central European University, Budapest. She also leads an undergraduate research course on spectral clustering in the Budapest Semesters in Mathematics program.

Her fields of expertise are multivariate statistics, applied graph theory, and data mining of social, biological, and communication networks. She has worked on various national and European research projects related to networks and data analysis.

She has published research papers in the Journal of Multivariate Analysis, Linear Algebra and Its Applications, Discrete Mathematics, Discrete Applied Mathematics, European Journal of Combinatorics, and Physical Review E, among others.

She is the coauthor of a Hungarian textbook (Bolla, M. and Krámli, A., Theory of Statistical Inference, Typotex, Budapest; first ed. 2005, second ed. 2012) and of another Hungarian book on multivariate statistical analysis. She was the managing editor of the book Contests in Higher Mathematics (ed. G. J. Székely), Springer, 1996.

You might also be interested in these products:

Modeling Uncertainty
By: Moshe Dror, Pierre L'Ecuyer, Ferenc Szidarovszky
PDF ebook
236,81 €

Level Crossing Methods in Stochastic Models
By: Percy H. Brill
PDF ebook
203,29 €

Continuous Bivariate Distributions
By: N. Balakrishnan, Chin Diew Lai
PDF ebook
128,39 €