To support faster development *and* better performance, Richard and I pushed through a large refactoring of NonlinearFactors.
The following are the biggest changes:
1) NonLinearFactor1 and NonLinearFactor2 are now templated on Config, Key type, and X type, where X is the argument to the measurement function. A combined sketch of points 1) and 2) follows the snippet under point 2).
2) The measurement itself is no longer kept in the nonlinear factor. Instead, a derived class (see testVSLAMFactor, testNonlinearEquality, testPose3Factor, etc.) has to implement a function to compute the errors, "evaluateErrors". Instead of (h(x)-z), it needs to return (z-h(x)), so that Ax-b is an approximation of the error. IMPORTANT: evaluateErrors needs to - if asked - *combine* the calculation of the function value h(x) and the derivatives dh(x)/dx; computing them separately was a major performance issue. To do this, boost::optional<Matrix&> arguments are provided, and in evaluateErrors you just say something like
if (H) *H = Matrix_(3,6,....);
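For illustration, here is a rough sketch combining points 1) and 2). Point3Prior, PointKey, Point3Config, the vector() accessor, and the base-class constructor arguments are made up for this example and may not match the actual code:

  // (plus the relevant gtsam headers and boost/optional.hpp)
  // A unary factor: the measurement z is stored in the derived class,
  // and evaluateErrors returns z - h(x), computing dh(x)/dx only on demand.
  typedef Symbol<Point3,'p'> PointKey;
  class Point3Prior : public NonlinearFactor1<Point3Config, PointKey, Point3> {
    Point3 z_; // the measurement now lives in the derived class
  public:
    Point3Prior(const Point3& z, double sigma, const PointKey& key) :
      NonlinearFactor1<Point3Config, PointKey, Point3>(sigma, key), z_(z) {}
    Vector evaluateErrors(const Point3& x,
        boost::optional<Matrix&> H = boost::none) const {
      // fill in the Jacobian dh(x)/dx only if the caller asked for it
      if (H) *H = Matrix_(3,3, 1.0,0.0,0.0, 0.0,1.0,0.0, 0.0,0.0,1.0);
      return z_.vector() - x.vector(); // z - h(x), assuming a vector() accessor
    }
  };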
3) We are no longer using ints or strings as keys for nonlinear factors. Instead, the preferred key type is now Symbol, defined in Key.h. This is both fast and cool: you can construct it from an int, and cast it to a string. It also does type checking: a Symbol<Pose3,'x'> will not match a Symbol<Pose2,'x'>.
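For example, something along these lines (the exact constructor and conversion syntax here is a sketch, not verified against the code):

  Symbol<Pose3,'x'> x1(1);          // construct from an int
  std::string s = (std::string)x1;  // cast to a string, e.g. "x1"
  Symbol<Pose2,'x'> p1(1);          // a different C++ type altogether
  // x1 == p1;                      // mismatched key types: would not compile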
4) minor: take a look at LieConfig.h: it helps you avoid writing a lot of code by automatically creating configs for a given type. See e.g. Pose3Config.h. A "double" LieConfig is on the way - Thanks Richard and Manohar!
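Roughly, defining a config for a new type could look like this (the typedef names and the insert call are illustrative; see Pose3Config.h for the real thing):

  typedef Symbol<Pose3,'x'> Pose3Key;
  typedef LieConfig<Pose3Key, Pose3> Pose3Config;

  Pose3Config config;
  config.insert(Pose3Key(0), Pose3()); // assuming the usual insert(key, value)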
1) eliminate methods no longer return a shared pointer. Shared pointers are good for Factors and Conditionals (which are also non-copyable), because these are often passed around under the hood. However, a BayesNet is simply a list of shared pointers and hence does not cost much to return by value (and the copy is compiler-optimized away anyway). So, the signature of all eliminate methods changed to simply return a BayesNet<> object, not a shared pointer.
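In other words (the argument list here is only indicative):

  // old: boost::shared_ptr<GaussianBayesNet> bn = factorGraph.eliminate(ordering);
  // new:
  GaussianBayesNet bn = factorGraph.eliminate(ordering); // returned by value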
2) GaussianBayesNet::optimize is now replaced by a free function optimize(GaussianBayesNet), which returns a VectorConfig, not a shared pointer.
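A typical call now looks like this (sketch only; exact signatures may differ):

  GaussianBayesNet bn = factorGraph.eliminate(ordering);
  VectorConfig solution = optimize(bn); // a VectorConfig by value, no shared pointer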
3) GaussianBayesNet and SymbolicBayesNet are now simply typedefs, not derived classes. This is desirable because the BayesTree class uses templated methods that return BayesNet<Conditional>, not a specific BayesNet derived class.
This simplifies Bayes nets quite a bit. Also created a Conditional base class, with derived classes ConditionalGaussian and SymbolicConditional.
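So the typedefs are essentially (modulo the exact template arguments):

  typedef BayesNet<ConditionalGaussian> GaussianBayesNet;
  typedef BayesNet<SymbolicConditional> SymbolicBayesNet;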
Finally, some changes were needed because I moved some code from header files into .cpp files.