National Science Library of Georgia

High-dimensional probability : an introduction with applications in data science / Roman Vershynin.

By: Vershynin, Roman
Material type: Text
Series: Cambridge series in statistical and probabilistic mathematics ; 47
Publisher: Cambridge : Cambridge University Press, 2018
Description: 1 online resource (xiv, 284 pages) : digital, PDF file(s)
Content type:
  • text
Media type:
  • computer
Carrier type:
  • online resource
ISBN:
  • 9781108231596 (ebook)
Subject(s):
Additional physical formats: Print version: No title
DDC classification:
  • 519.2 23
LOC classification:
  • QA273 .V4485 2018
Online resources:
Contents:
Preliminaries on random variables -- Concentration of sums of independent random variables -- Random vectors in high dimensions -- Random matrices -- Concentration without independence -- Quadratic forms, symmetrization and contraction -- Random processes -- Chaining -- Deviations of random matrices and geometric consequences -- Sparse recovery -- Dvoretzky-Milman's theorem.
Summary: High-dimensional probability offers insight into the behavior of random vectors, random matrices, random subspaces, and objects used to quantify uncertainty in high dimensions. Drawing on ideas from probability, analysis, and geometry, it lends itself to applications in mathematics, statistics, theoretical computer science, signal processing, optimization, and more. This book is the first to integrate the theory, key tools, and modern applications of high-dimensional probability. Concentration inequalities form the core, and the book covers both classical results, such as Hoeffding's and Chernoff's inequalities, and modern developments, such as the matrix Bernstein inequality. It then introduces powerful methods based on stochastic processes, including tools such as Slepian's, Sudakov's, and Dudley's inequalities, as well as generic chaining and bounds based on VC dimension. A broad range of illustrations is embedded throughout, including classical and modern results for covariance estimation, clustering, networks, semidefinite programming, coding, dimension reduction, matrix completion, machine learning, compressed sensing, and sparse regression.
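As a small illustration of the concentration phenomenon highlighted in the summary, the following sketch (not taken from the book; a hypothetical NumPy example) compares the simulated tail probability of the mean of n independent Rademacher (±1) variables with the bound given by Hoeffding's inequality, P(|(X_1 + ... + X_n)/n| ≥ t) ≤ 2 exp(-n t²/2) for ±1-valued variables.

    # Minimal sketch (illustration only, not the book's code): Monte Carlo
    # check of Hoeffding's inequality for sums of independent Rademacher
    # (+/-1) random variables.
    import numpy as np

    rng = np.random.default_rng(0)
    n, trials, t = 200, 100_000, 0.2

    # Empirical means of n Rademacher variables, one mean per trial.
    means = rng.choice([-1.0, 1.0], size=(trials, n)).mean(axis=1)

    empirical = np.mean(np.abs(means) >= t)    # observed tail frequency
    hoeffding = 2 * np.exp(-n * t**2 / 2)      # Hoeffding upper bound

    print(f"empirical P(|mean| >= {t}): {empirical:.4f}")
    print(f"Hoeffding bound:           {hoeffding:.4f}")

The observed frequency should fall well below the bound, illustrating the "concentration of sums of independent random variables" theme of the book's second chapter.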
No physical items for this record

Title from publisher's bibliographic system (viewed on 28 Sep 2018).
