Showing 1 - 4 of 4 matches in All Departments

Sparse Modeling for Image and Vision Processing (Paperback)
Julien Mairal, Francis Bach, Jean Ponce
R2,232 | Discovery Miles 22 320 | Ships in 10 - 15 working days

In recent years, a large amount of multi-disciplinary research has been conducted on sparse models and their applications. In statistics and machine learning, the sparsity principle is used to perform model selection; that is, automatically selecting a simple model among a large collection of them. In signal processing, sparse coding consists of representing data with linear combinations of a few dictionary elements. Subsequently, the corresponding tools have been widely adopted by several scientific communities such as neuroscience, bioinformatics, and computer vision. Sparse Modeling for Image and Vision Processing provides the reader with a self-contained view of sparse modeling for visual recognition and image processing. More specifically, the work focuses on applications where the dictionary is learned and adapted to data, yielding a compact representation that has been successful in various contexts. It reviews a large number of applications of dictionary learning in image processing and computer vision and presents basic sparse estimation tools. It starts with a historical tour of sparse estimation in signal processing and statistics, before moving to more recent concepts such as sparse recovery and dictionary learning. Subsequently, it shows that dictionary learning is related to matrix factorization techniques, and that it is particularly effective for modeling natural image patches. As a consequence, it has been used to tackle several image processing problems and is a key component of many state-of-the-art methods in visual recognition. The book concludes with a presentation of optimization techniques that should make dictionary learning easy to use for researchers who are not experts in the field.
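To make the sparse coding idea above concrete, here is a minimal sketch, assuming scikit-learn is available; the signal dimension, number of atoms, and penalty weight are illustrative choices, not values from the book:

```python
# Minimal dictionary-learning sketch on synthetic data (illustrative only).
import numpy as np
from sklearn.decomposition import DictionaryLearning

rng = np.random.default_rng(0)
X = rng.standard_normal((200, 64))   # 200 signals of dimension 64 (e.g. 8x8 patches)

# Learn a dictionary D and sparse codes A so that X ~ A @ D,
# with an l1 penalty (weight alpha) enforcing sparsity of the codes.
dl = DictionaryLearning(n_components=32, alpha=1.0, max_iter=100, random_state=0)
codes = dl.fit_transform(X)          # sparse coefficients, shape (200, 32)
D = dl.components_                   # learned dictionary atoms, shape (32, 64)

print("fraction of nonzero coefficients:", np.mean(codes != 0))
print("relative reconstruction error:",
      np.linalg.norm(X - codes @ D) / np.linalg.norm(X))
```

Each row of X is approximated by a row of codes multiplied by D, with most coefficients zero: exactly the "linear combinations of a few dictionary elements" representation described above.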

Learning with Submodular Functions - A Convex Optimization Perspective (Paperback)
Francis Bach
R2,244 | Discovery Miles 22 440 | Ships in 10 - 15 working days

Submodular functions are relevant to machine learning for at least two reasons: (1) some problems may be expressed directly as the optimization of submodular functions, and (2) the Lovász extension of submodular functions provides a useful set of regularization functions for supervised and unsupervised learning. In Learning with Submodular Functions, the theory of submodular functions is presented in a self-contained way from a convex analysis perspective, establishing tight links between certain polyhedra, combinatorial optimization, and convex optimization problems. In particular, it describes how submodular function minimization is equivalent to solving a wide variety of convex optimization problems. This allows the derivation of new efficient algorithms for approximate and exact submodular function minimization with theoretical guarantees and good practical performance. By listing many examples of submodular functions, it reviews various applications to machine learning, such as clustering, experimental design, sensor placement, graphical model structure learning, and subset selection, as well as a family of structured sparsity-inducing norms that can be derived from submodular functions. This is an ideal reference for researchers, scientists, or engineers with an interest in applying submodular functions to machine learning problems.
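As an illustration of the Lovász extension mentioned above, the following minimal sketch evaluates it with the standard sorting (greedy) formula; the set function in the example, F(S) = sqrt(|S|), is a simple submodular function chosen purely for illustration:

```python
# Evaluate the Lovász extension of a submodular set function F at a point w.
import numpy as np

def lovasz_extension(F, w):
    # Sorting formula: with coordinates ordered so that w[j1] >= ... >= w[jn],
    # the extension equals sum_k w[jk] * (F({j1..jk}) - F({j1..j_{k-1}})).
    order = np.argsort(-w)
    value, prev = 0.0, F(frozenset())        # conventionally F(empty set) = 0
    for k in range(len(w)):
        cur = F(frozenset(order[: k + 1]))   # value on the first k+1 sorted indices
        value += w[order[k]] * (cur - prev)
        prev = cur
    return value

# Example: F(S) = sqrt(|S|), a concave function of cardinality, is submodular.
F = lambda S: np.sqrt(len(S))
w = np.array([0.5, -1.0, 2.0])
print(lovasz_extension(F, w))
```

On indicator vectors of sets, the extension agrees with F itself; its convexity for submodular F is what links combinatorial and convex optimization in the book.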

Optimization with Sparsity-Inducing Penalties (Paperback)
Francis Bach, Rodolphe Jenatton, Julien Mairal, Guillaume Obozinski
R1,803 | Discovery Miles 18 030 | Ships in 10 - 15 working days

Sparse estimation methods are aimed at using or obtaining parsimonious representations of data or models. They were first dedicated to linear variable selection, but numerous extensions have now emerged, such as structured sparsity and kernel selection. It turns out that many of the related estimation problems can be cast as convex optimization problems by regularizing the empirical risk with appropriate nonsmooth norms. Optimization with Sparsity-Inducing Penalties presents optimization tools and techniques dedicated to such sparsity-inducing penalties from a general perspective. It covers proximal methods, block-coordinate descent, reweighted l2-penalized techniques, working-set and homotopy methods, as well as non-convex formulations and extensions, and provides an extensive set of experiments to compare various algorithms from a computational point of view. The presentation is essentially based on existing literature, but the process of constructing a general framework leads naturally to new results, connections, and points of view. It is an ideal reference on the topic for anyone working in machine learning and related areas.
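As a sketch of one of the proximal methods the monograph covers, the following implements ISTA (iterative soft-thresholding) for the l1-penalized least-squares problem min_w 0.5*||Xw - y||^2 + lam*||w||_1; the data, penalty weight, and iteration count are illustrative assumptions:

```python
# ISTA: gradient step on the smooth part, then the proximal operator of the l1 norm.
import numpy as np

def soft_threshold(v, t):
    # Proximal operator of t*||.||_1: shrink each coordinate toward zero by t.
    return np.sign(v) * np.maximum(np.abs(v) - t, 0.0)

def ista(X, y, lam, n_iter=500):
    step = 1.0 / np.linalg.norm(X, ord=2) ** 2   # 1/L, L = Lipschitz constant of the gradient
    w = np.zeros(X.shape[1])
    for _ in range(n_iter):
        grad = X.T @ (X @ w - y)                 # gradient of 0.5*||Xw - y||^2
        w = soft_threshold(w - step * grad, step * lam)
    return w

rng = np.random.default_rng(0)
X = rng.standard_normal((100, 20))
w_true = np.zeros(20)
w_true[:3] = [2.0, -1.5, 1.0]                    # sparse ground truth
y = X @ w_true + 0.01 * rng.standard_normal(100)
print(np.round(ista(X, y, lam=1.0), 2))          # most coordinates end up exactly zero
```

The nonsmooth norm is handled entirely through its proximal operator, which is the pattern the book generalizes to structured sparsity-inducing penalties.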

Optimization for Machine Learning (Paperback)
Suvrit Sra, Sebastian Nowozin, Stephen J. Wright; Contributions by Suvrit Sra, Sebastian Nowozin, …
R1,862 | Discovery Miles 18 620 | Ships in 10 - 15 working days

The interplay between optimization and machine learning is one of the most important developments in modern computational science. Optimization formulations and methods are proving to be vital in designing algorithms to extract essential knowledge from huge volumes of data. Machine learning, however, is not simply a consumer of optimization technology but a rapidly evolving field that is itself generating new optimization ideas. This book captures the state of the art of the interaction between optimization and machine learning in a way that is accessible to students and researchers in both fields. Optimization approaches have enjoyed prominence in machine learning because of their wide applicability and attractive theoretical properties. The increasing complexity, size, and variety of today's machine learning models call for the reassessment of existing assumptions, and this book starts that process. It describes the resurgence in novel contexts of established frameworks such as first-order methods, stochastic approximations, convex relaxations, interior-point methods, and proximal methods. It also devotes attention to newer themes such as regularized optimization, robust optimization, gradient and subgradient methods, splitting techniques, and second-order methods. Many of these techniques draw inspiration from other fields, including operations research, theoretical computer science, and subfields of optimization. The book will enrich the ongoing cross-fertilization between the machine learning community and these other fields, and within the broader optimization community.
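As a small illustration of the first-order, stochastic-approximation methods the book surveys, here is a minimal stochastic gradient descent sketch on a least-squares objective; the step-size schedule and synthetic data are illustrative assumptions, not taken from the book:

```python
# Stochastic gradient descent on 0.5 * average of (x_i . w - y_i)^2.
import numpy as np

rng = np.random.default_rng(0)
X = rng.standard_normal((1000, 10))
w_true = rng.standard_normal(10)
y = X @ w_true + 0.1 * rng.standard_normal(1000)

w = np.zeros(10)
for t in range(1, 5001):
    i = rng.integers(len(y))            # sample one example uniformly at random
    grad = (X[i] @ w - y[i]) * X[i]     # stochastic gradient of 0.5*(x_i . w - y_i)^2
    w -= (1.0 / (t + 100)) * grad       # decaying O(1/t) step size
print("estimation error:", np.linalg.norm(w - w_true))
```

Each update touches a single example, which is what makes such methods attractive for the huge data volumes discussed above.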
