Machine learning under a modern optimization lens / Dimitris Bertsimas, Jack Dunn.
Material type: Text
Publisher: Belmont, Massachusetts : Dynamic Ideas LLC, [2019]
Copyright date: ©2019
Description: xviii, 589 pages : color illustrations ; 24 cm
Content type: text
Media type: unmediated
Carrier type: volume
ISBN: 9781733788502; 1733788506
Call number: 519.72 B551m
Item type | Current library | Collection | Call number | Copy number | Status | Date due | Barcode
---|---|---|---|---|---|---|---
Books | Castorina Estantes Abertas (Open Shelves) | Livros (Books) | 519.72 B551i 1997 IMPA | 2 | Available | | 39063000135247
Browsing Castorina shelves (Shelving location: Estantes Abertas (Open Shelves); Collection: Livros (Books)):
- 519.72 A546l 1987 IMPA: Linear programming in infinite-dimensional spaces: theory and applications
- 519.72 B282q 1977 IMPA: Que es la programacion lineal [What is linear programming]
- 519.72 B362l 1977 IMPA: Linear programming and network flows
- 519.72 B551i 1997 IMPA: Machine learning under a modern optimization lens
- 519.72 B551i 1997 IMPA: Introduction to linear optimization
- 519.72 B742o 2019 IMPA: Opt art: from mathematical optimization to visual design
- 519.72 B788n 1969 IMPA: Non-linear optimization techniques
Includes bibliographical references and index.
"The book provides an original treatment of machine learning (ML) using convex, robust and mixed integer optimization that leads to solutions to central ML problems at large scale that can be found in seconds/minutes, can be certified to be optimal in minutes/hours, and outperform classical heuristic approaches in out-of-sample experiments.

Structure of the book:
- Part I covers robust, sparse, nonlinear, holistic regression and extensions.
- Part II contains optimal classification and regression trees.
- Part III outlines prescriptive ML methods.
- Part IV shows the power of optimization over randomization in design of experiments, exceptional responders, stable regression and the bootstrap.
- Part V describes unsupervised methods in ML: optimal missing data imputation and interpretable clustering.
- Part VI develops matrix ML methods: sparse PCA, sparse inverse covariance estimation, factor analysis, matrix and tensor completion.
- Part VII demonstrates how ML leads to interpretable optimization.

Philosophical principles of the book:
- Interpretability in ML is materially important in real world applications.
- Practical tractability not polynomial solvability leads to real world impact.
- NP-hardness is an opportunity not an obstacle.
- ML is inherently linked to optimization not probability theory.
- Data represents an objective reality; models only exist in our imagination.
- Optimization has a significant edge over randomization.
- The ultimate objective in the real world is prescription, not prediction." --Cover.