.. default-domain:: cpp

.. cpp:namespace:: ceres

.. _chapter-features:

====
Why?
====

* **Code Quality** - Ceres Solver has been used in production at
  Google since 2011. It is clean, extensively tested, and
  well-documented code that is actively developed and supported.

* **Modeling API** - It is rarely the case that one starts with the
  exact and complete formulation of the problem one is trying to
  solve. Ceres's modeling API has been designed so that the user can
  easily build and modify the objective function, one term at a time,
  without worrying about how the solver is going to deal with the
  resulting changes in the sparsity/structure of the underlying
  problem. A minimal sketch of this API appears after this list.

  - **Derivatives** Supplying derivatives is perhaps the most tedious
    and error-prone part of using an optimization library. Ceres
    ships with `automatic`_ and `numeric`_ differentiation, so you
    never have to compute derivatives by hand (unless you really want
    to). Moreover, Ceres allows you to mix automatic, numeric, and
    analytical derivatives in any combination you want.

  - **Robust Loss Functions** Most non-linear least squares problems
    involve data. If there is data, there will be outliers. Ceres
    allows the user to *shape* their residuals using a
    :class:`LossFunction` to reduce the influence of outliers.

  - **Manifolds** In many cases, some parameters lie on a manifold
    other than Euclidean space, e.g., rotation matrices. In such
    cases, the user can describe the geometry of the local tangent
    space by providing a :class:`Manifold` object.

* **Solver Choice** - Depending on the size, sparsity structure, time &
  memory budgets, and solution quality requirements, different
  optimization algorithms will suit different needs. To this end, Ceres
  Solver comes with a variety of optimization algorithms (their
  configuration is shown in the second sketch after this list):

  - **Trust Region Solvers** - Ceres supports Levenberg-Marquardt,
    Powell's dogleg, and subspace dogleg methods. The key
    computational cost in all of these methods is the solution of a
    linear system. To this end Ceres ships with a variety of linear
    solvers - dense QR and dense Cholesky factorization (using
    `Eigen`_, `LAPACK`_ or `CUDA`_) for dense problems, sparse
    Cholesky factorization (`SuiteSparse`_, `Apple's Accelerate`_,
    `Eigen`_) for large sparse problems, and custom Schur complement
    based dense, sparse, and iterative linear solvers for `bundle
    adjustment`_ problems.

  - **Line Search Solvers** - When the problem size is so large that
    storing and factoring the Jacobian is not feasible, or when a low
    accuracy solution is required cheaply, Ceres offers a number of
    line search based algorithms, including several variants of
    non-linear conjugate gradients, BFGS, and L-BFGS.

* **Speed** - Ceres Solver has been extensively optimized, with C++
  templating, hand-written linear algebra routines, and modern C++
  threads based multithreading of Jacobian evaluation and the linear
  solvers.

* **GPU Acceleration** - If your system supports `CUDA`_, Ceres Solver
  can use your NVIDIA GPU to speed up the solver.

* **Solution Quality** - Ceres is the `best performing`_ solver on the
  NIST problem set used by Mondragon and Borchers for benchmarking
  non-linear least squares solvers.

* **Covariance estimation** - Assess the sensitivity/uncertainty of
  the solution by computing all or part of the covariance matrix.
  Ceres is one of the few solvers that allows you to do this analysis
  at scale (see the second sketch after this list).

* **Community** - Since its release as open source software, Ceres has
  developed an active developer community that contributes new
  features, bug fixes, and support.

* **Portability** - Runs on *Linux*, *Windows*, *Mac OS X*, *Android*,
  and *iOS*.

* **BSD Licensed** - The BSD license offers the flexibility to ship
  your application.
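
To make the modeling API concrete, here is a minimal sketch in the
style of the robust curve fitting tutorial. A templated functor
supplies the residual, Ceres differentiates it automatically, and a
:class:`LossFunction` down-weights outliers. The data layout, the
``BuildProblem`` helper, and the loss scale are illustrative
placeholders, not a fixed recipe.

.. code-block:: c++

   #include "ceres/ceres.h"

   // One residual per observation: r = y - exp(m * x + c).
   struct ExponentialResidual {
     ExponentialResidual(double x, double y) : x_(x), y_(y) {}

     template <typename T>
     bool operator()(const T* const m, const T* const c, T* residual) const {
       residual[0] = y_ - exp(m[0] * x_ + c[0]);
       return true;
     }

    private:
     const double x_;
     const double y_;
   };

   // Build the objective one term at a time; Ceres tracks the resulting
   // sparsity as residual blocks are added or removed.
   void BuildProblem(const double* data, int num_observations,
                     double* m, double* c, ceres::Problem* problem) {
     for (int i = 0; i < num_observations; ++i) {
       problem->AddResidualBlock(
           // Automatic differentiation of the functor above:
           // 1 residual, two parameter blocks of size 1.
           new ceres::AutoDiffCostFunction<ExponentialResidual, 1, 1, 1>(
               new ExponentialResidual(data[2 * i], data[2 * i + 1])),
           // Robust loss to reduce the influence of outlying observations.
           new ceres::CauchyLoss(0.5),
           m, c);
     }
   }

A parameter block that lives on a manifold, e.g. a unit quaternion
``q``, would additionally be registered with ``problem->SetManifold(q,
new ceres::QuaternionManifold())``.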

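The second sketch continues the example above: a trust region solve is
configured through ``Solver::Options``, and the covariance of ``m`` is
then extracted with the :class:`Covariance` API. The choice of
``DENSE_QR`` and the ``SolveAndReport`` helper are illustrative; the
linear solver would normally be picked to match the problem structure.

.. code-block:: c++

   #include <iostream>
   #include <utility>
   #include <vector>

   #include "ceres/ceres.h"
   #include "ceres/covariance.h"

   void SolveAndReport(ceres::Problem* problem, double* m) {
     ceres::Solver::Options options;
     // Trust region Levenberg-Marquardt is the default strategy. Other
     // linear solvers, e.g. SPARSE_NORMAL_CHOLESKY or DENSE_SCHUR, suit
     // large sparse and bundle adjustment problems respectively.
     options.linear_solver_type = ceres::DENSE_QR;
     options.minimizer_progress_to_stdout = true;

     ceres::Solver::Summary summary;
     ceres::Solve(options, problem, &summary);
     std::cout << summary.BriefReport() << "\n";

     // Estimate the variance of m at the solution.
     ceres::Covariance::Options covariance_options;
     ceres::Covariance covariance(covariance_options);
     std::vector<std::pair<const double*, const double*>> blocks;
     blocks.emplace_back(m, m);
     if (covariance.Compute(blocks, problem)) {
       double variance_m = 0.0;
       covariance.GetCovarianceBlock(m, m, &variance_m);
       std::cout << "var(m) = " << variance_m << "\n";
     }
   }
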
.. _best performing: https://groups.google.com/forum/#!topic/ceres-solver/UcicgMPgbXw
.. _bundle adjustment: http://en.wikipedia.org/wiki/Bundle_adjustment
.. _SuiteSparse: http://www.cise.ufl.edu/research/sparse/SuiteSparse/
.. _Eigen: http://eigen.tuxfamily.org/
.. _LAPACK: http://www.netlib.org/lapack/
.. _automatic: http://en.wikipedia.org/wiki/Automatic_differentiation
.. _numeric: http://en.wikipedia.org/wiki/Numerical_differentiation
.. _CUDA: https://developer.nvidia.com/cuda-toolkit
.. _Apple's Accelerate: https://developer.apple.com/documentation/accelerate/sparse_solvers