Optimization with Sparsity-Inducing Penalties
| AUTHOR | Jenatton, Rodolphe et al. |
| PUBLISHER | Now Publishers (01/04/2012) |
| PRODUCT TYPE | Paperback |
Description
Sparse estimation methods aim at using or obtaining parsimonious representations of data or models. They were first dedicated to linear variable selection, but numerous extensions have since emerged, such as structured sparsity or kernel selection. It turns out that many of the related estimation problems can be cast as convex optimization problems by regularizing the empirical risk with appropriate nonsmooth norms. Optimization with Sparsity-Inducing Penalties presents optimization tools and techniques dedicated to such sparsity-inducing penalties from a general perspective. It covers proximal methods, block-coordinate descent, reweighted ℓ2-penalized techniques, working-set and homotopy methods, as well as non-convex formulations and extensions, and provides an extensive set of experiments to compare various algorithms from a computational point of view. The presentation of Optimization with Sparsity-Inducing Penalties is essentially based on existing literature, but the process of constructing a general framework leads naturally to new results, connections and points of view. It is an ideal reference on the topic for anyone working in machine learning and related areas.
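As a flavor of the proximal methods the description mentions, here is a minimal sketch (not taken from the book itself) of the proximal operator of the ℓ1 norm, the soft-thresholding step at the heart of many sparse estimation algorithms; the function name and example values are illustrative assumptions.

```python
import numpy as np

def soft_threshold(v, lam):
    # Proximal operator of lam * ||x||_1:
    # shrinks each coordinate of v toward zero by lam,
    # setting small coordinates exactly to zero (sparsity).
    return np.sign(v) * np.maximum(np.abs(v) - lam, 0.0)

v = np.array([3.0, -0.5, 1.2])
x = soft_threshold(v, 1.0)
# → array([2. , 0. , 0.2]); the -0.5 entry is zeroed out
```

Iterating a gradient step on the smooth loss followed by this operator yields the classic ISTA scheme for ℓ1-regularized problems.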
Product Details
ISBN-13:
9781601985101
ISBN-10:
160198510X
Binding:
Trade Paperback (US)
Content Language:
English
More Product Details
Page Count:
124
Carton Quantity:
72
Product Dimensions:
6.14 x 0.26 x 9.21 inches
Weight:
0.41 pound(s)
Country of Origin:
US
Subject Information
BISAC Categories
Computers | Artificial Intelligence - General
Computers | Computer Science
Computers | Machine Theory
Dewey Decimal:
658
List Price $80.00
