Showing 1 - 4 of 4 matches in All Departments

Sparse Modeling for Image and Vision Processing (Paperback)
Julien Mairal, Francis Bach, Jean Ponce
R2,366 · Discovery Miles 23 660 · Ships in 10 - 15 working days

In recent years, a large amount of multi-disciplinary research has been conducted on sparse models and their applications. In statistics and machine learning, the sparsity principle is used to perform model selection, that is, automatically selecting a simple model among a large collection of them. In signal processing, sparse coding consists of representing data with linear combinations of a few dictionary elements. The corresponding tools have since been widely adopted by several scientific communities, such as neuroscience, bioinformatics, and computer vision. Sparse Modeling for Image and Vision Processing provides the reader with a self-contained view of sparse modeling for visual recognition and image processing. More specifically, the work focuses on applications where the dictionary is learned and adapted to data, yielding a compact representation that has been successful in various contexts. It presents basic sparse estimation tools and reviews a large number of applications of dictionary learning in image processing and computer vision. It starts with a historical tour of sparse estimation in signal processing and statistics before moving to more recent concepts such as sparse recovery and dictionary learning. It then shows that dictionary learning is related to matrix factorization techniques, and that it is particularly effective for modeling natural image patches; as a consequence, it has been used to tackle several image processing problems and is a key component of many state-of-the-art methods in visual recognition. The book concludes with a presentation of optimization techniques that should make dictionary learning easy to use for researchers who are not experts in the field.
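To make the book's two central ideas concrete, here is a minimal Python sketch (using scikit-learn) that learns a dictionary from synthetic patch data and then sparse-codes each patch as a combination of a few atoms; the patch size, number of atoms, penalty, and sparsity level are illustrative assumptions, not values from the book.

```python
# A minimal sketch of dictionary learning followed by sparse coding on
# synthetic "image patch" data. All sizes and parameters are illustrative.
import numpy as np
from sklearn.decomposition import MiniBatchDictionaryLearning, sparse_encode

rng = np.random.default_rng(0)
patches = rng.standard_normal((500, 64))        # 500 flattened 8x8 patches (stand-in for real image data)
patches -= patches.mean(axis=1, keepdims=True)  # remove each patch's mean (DC component)

learner = MiniBatchDictionaryLearning(n_components=100, alpha=1.0, random_state=0)
D = learner.fit(patches).components_            # learned dictionary: 100 atoms of dimension 64 (overcomplete)

# Sparse coding: represent each patch with at most 5 dictionary atoms (via OMP).
codes = sparse_encode(patches, D, algorithm="omp", n_nonzero_coefs=5)
print(codes.shape, (codes != 0).sum(axis=1).mean())  # (500, 100), about 5 nonzeros per patch
```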

Learning with Submodular Functions - A Convex Optimization Perspective (Paperback)
Francis Bach
R2,379 · Discovery Miles 23 790 · Ships in 10 - 15 working days

Submodular functions are relevant to machine learning for at least two reasons: (1) some problems may be expressed directly as the optimization of submodular functions, and (2) the Lovász extension of submodular functions provides a useful set of regularization functions for supervised and unsupervised learning. In Learning with Submodular Functions, the theory of submodular functions is presented in a self-contained way from a convex analysis perspective, establishing tight links between certain polyhedra, combinatorial optimization, and convex optimization problems. In particular, it describes how submodular function minimization is equivalent to solving a wide variety of convex optimization problems. This allows the derivation of new efficient algorithms for approximate and exact submodular function minimization with theoretical guarantees and good practical performance. Through many examples of submodular functions, it reviews various applications to machine learning, such as clustering, experimental design, sensor placement, graphical model structure learning, and subset selection, as well as a family of structured sparsity-inducing norms that can be derived from submodular functions and used as regularizers. This is an ideal reference for researchers, scientists, or engineers with an interest in applying submodular functions to machine learning problems.
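As a concrete illustration of the book's central object, the sketch below evaluates the Lovász extension of a submodular set function using the standard sorting formula: sort the components of w in decreasing order, then accumulate the marginal gains of the growing level sets. The graph-cut function used as the example is a classic submodular function chosen here for illustration; it is not an example taken from the text.

```python
# A minimal sketch of the Lovász extension, evaluated by the sorting formula.
import numpy as np

def lovasz_extension(F, w):
    """Evaluate the Lovász extension of a set function F (with F(empty set) = 0) at w."""
    order = np.argsort(-w)            # indices of w in decreasing order
    value, prev, chosen = 0.0, 0.0, []
    for k in order:
        chosen.append(k)
        Fk = F(set(chosen))           # F evaluated on the growing "level set"
        value += w[k] * (Fk - prev)   # marginal gain of adding k, weighted by w[k]
        prev = Fk
    return value

# Example: the cut function of a small undirected graph, a classic submodular function.
edges = [(0, 1), (1, 2), (0, 2), (2, 3)]
def cut_value(S):
    return sum(1 for (u, v) in edges if (u in S) != (v in S))

w = np.array([0.9, 0.1, 0.4, -0.2])
# For cut functions the Lovász extension equals the total variation sum(|w[u] - w[v]|), here 2.2.
print(lovasz_extension(cut_value, w))
```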

Optimization with Sparsity-Inducing Penalties (Paperback)
Francis Bach, Rodolphe Jenatton, Julien Mairal, Guillaume Obozinski
R1,910 · Discovery Miles 19 100 · Ships in 10 - 15 working days

Sparse estimation methods are aimed at using or obtaining parsimonious representations of data or models. They were first dedicated to linear variable selection but numerous extensions have now emerged such as structured sparsity or kernel selection. It turns out that many of the related estimation problems can be cast as convex optimization problems by regularizing the empirical risk with appropriate nonsmooth norms. Optimization with Sparsity-Inducing Penalties presents optimization tools and techniques dedicated to such sparsity-inducing penalties from a general perspective. It covers proximal methods, block-coordinate descent, reweighted l2-penalized techniques, working-set and homotopy methods, as well as non-convex formulations and extensions, and provides an extensive set of experiments to compare various algorithms from a computational point of view. The presentation of Optimization with Sparsity-Inducing Penalties is essentially based on existing literature, but the process of constructing a general framework leads naturally to new results, connections and points of view. It is an ideal reference on the topic for anyone working in machine learning and related areas.
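To give a concrete flavor of the proximal methods the monograph covers, here is a minimal Python sketch of ISTA (proximal gradient descent) applied to the l1-penalized least-squares (lasso) problem, where the proximal operator of the l1 norm is coordinate-wise soft-thresholding; the synthetic data, penalty, and step-size choice are illustrative assumptions, not values from the monograph.

```python
# A minimal ISTA sketch for min_w 0.5*||Xw - y||^2 + lam*||w||_1.
import numpy as np

def soft_threshold(v, t):
    """Proximal operator of t*||.||_1: shrink each coordinate toward zero by t."""
    return np.sign(v) * np.maximum(np.abs(v) - t, 0.0)

def ista(X, y, lam, n_iters=500):
    step = 1.0 / np.linalg.norm(X, 2) ** 2               # 1/L, with L the gradient's Lipschitz constant
    w = np.zeros(X.shape[1])
    for _ in range(n_iters):
        grad = X.T @ (X @ w - y)                         # gradient of the smooth least-squares term
        w = soft_threshold(w - step * grad, step * lam)  # proximal (soft-thresholding) step
    return w

rng = np.random.default_rng(0)
X = rng.standard_normal((100, 30))
w_true = np.zeros(30)
w_true[:3] = [2.0, -1.5, 1.0]                            # sparse ground truth
y = X @ w_true + 0.01 * rng.standard_normal(100)
w_hat = ista(X, y, lam=1.0)
print(np.nonzero(w_hat)[0])                              # should (roughly) recover coordinates 0, 1, 2
```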

Optimization for Machine Learning (Paperback)
Suvrit Sra, Sebastian Nowozin, Stephen J. Wright; Contributions by Suvrit Sra, Sebastian Nowozin, …
R1,976 · Discovery Miles 19 760 · Ships in 10 - 15 working days

The interplay between optimization and machine learning is one of the most important developments in modern computational science. Optimization formulations and methods are proving to be vital in designing algorithms to extract essential knowledge from huge volumes of data. Machine learning, however, is not simply a consumer of optimization technology but a rapidly evolving field that is itself generating new optimization ideas. This book captures the state of the art of the interaction between optimization and machine learning in a way that is accessible to students and researchers in both communities. Optimization approaches have enjoyed prominence in machine learning because of their wide applicability and attractive theoretical properties. The increasing complexity, size, and variety of today's machine learning models call for the reassessment of existing assumptions; this book starts that process of reassessment. It describes the resurgence in novel contexts of established frameworks such as first-order methods, stochastic approximations, convex relaxations, interior-point methods, and proximal methods. It also devotes attention to newer themes such as regularized optimization, robust optimization, gradient and subgradient methods, splitting techniques, and second-order methods. Many of these techniques draw inspiration from other fields, including operations research, theoretical computer science, and subfields of optimization. The book will enrich the ongoing cross-fertilization between the machine learning community and these other fields, and within the broader optimization community.
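As a small taste of the stochastic-approximation theme among the first-order methods the book surveys, here is a minimal sketch of stochastic gradient descent on a least-squares objective; the synthetic data, constant step size, and epoch count are illustrative choices, not taken from the book.

```python
# A minimal SGD sketch on a least-squares objective with synthetic data.
import numpy as np

rng = np.random.default_rng(0)
X = rng.standard_normal((1000, 10))
w_true = rng.standard_normal(10)
y = X @ w_true + 0.1 * rng.standard_normal(1000)

w = np.zeros(10)
for epoch in range(20):
    for i in rng.permutation(len(y)):          # one pass over the shuffled examples
        grad_i = (X[i] @ w - y[i]) * X[i]      # gradient of 0.5 * (x_i . w - y_i)^2
        w -= 0.01 * grad_i                     # constant-step SGD update
print(np.linalg.norm(w - w_true))              # small: w ends up close to the true weights
```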
