Gradient optimizers #1328
base: develop
Conversation
Codecov Report: ✅ All modified and coverable lines are covered by tests.

```
@@            Coverage Diff            @@
##           develop    #1328    +/-  ##
==========================================
+ Coverage    95.24%   95.32%   +0.07%
==========================================
  Files          814      825      +11
  Lines        69309    67994    -1315
==========================================
- Hits         66015    64812    -1203
+ Misses        3294     3182     -112
```

See 509 files with indirect coverage changes. Full report in Codecov by Sentry.
At this point the code is something I'm relatively happy with, and I also updated the docs. It would be great if this could get reviewed. cc @mborland @NAThompson
[Three resolved review threads on include/boost/math/optimization/detail/differentiable_opt_utilties.hpp (outdated)]
@demroz in general this looks very good! I left some comments and opinions that you can feel free to push back on. I pushed the CI and spelling changes myself rather than leaving those as TODOs for you.
I updated the PR according to your suggestions. A few points:
- I think adding support for C++17 execution policies should be done in a separate PR with dedicated benchmarks.
- On using `<random>` instead of `<boost/random.hpp>`: I'm not sure what should be done here. I agree with your point that this breaks the ability of this to be a standalone package. However, `<random>` doesn't support multiprecision types. We could `static_cast` to `double` for multiprecision types and `static_cast` back to multiprecision, but that's not behavior I would expect as a user. Practically, I doubt it makes much of a difference, as this is just an initialization policy. Any thoughts?
I mentioned it above, but I think generating a random `double` and then casting it to a multiprecision type should be fine. I would not expect the lost digits to make much of a difference.
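A minimal sketch of that approach, assuming a Boost.Multiprecision target type; the helper name, the seed, and the choice of `cpp_bin_float_50` are illustrative and not taken from the PR:

```cpp
#include <iomanip>
#include <iostream>
#include <random>
#include <boost/multiprecision/cpp_bin_float.hpp>

using mp_t = boost::multiprecision::cpp_bin_float_50;

// Draw a uniform double and widen it to the target real type.
// Only ~53 bits of randomness survive, but since this merely seeds
// the optimizer's starting point, the lost digits should not matter.
template <typename Real>
Real random_initial_point(std::mt19937_64& gen, double lo, double hi) {
    std::uniform_real_distribution<double> dist(lo, hi);
    return static_cast<Real>(dist(gen));
}

int main() {
    std::mt19937_64 gen{42};
    mp_t x0 = random_initial_point<mp_t>(gen, -1.0, 1.0);
    std::cout << std::setprecision(50) << x0 << '\n';
}
```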
I updated the docs and removed the […]
Adds a few gradient optimizers.

Everything is policy based, so these should be quite extensible. I may add a few more optimizers in the future, but these are the main ones, I think. If there are any specific ones that you think should be added, I'd be happy to do so. Also, although everything is reverse-mode-autodiff centric, as long as you provide the objective function, a way to evaluate it, and a way to evaluate the gradient, everything should work correctly.

For examples of how to use the optimizers, test_gradient_descent_optimizer.cpp, test_nesterov_optimizer.cpp, and test_lbfgs.cpp should be good starting points. I'm working on the documentation currently; I wanted to hold off to see whether any major revisions are necessary.
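To make that contract concrete without guessing at the PR's actual headers or signatures, here is a self-contained, hedged illustration: a plain gradient-descent loop where the caller supplies both the objective and its gradient. The objective, step size, and iteration count are all invented for the example; the real entry points are the test files named above.

```cpp
#include <array>
#include <iostream>

// Toy objective f(x, y) = (x - 1)^2 + 2*(y + 3)^2, minimized at (1, -3).
double f(std::array<double, 2> const& x) {
    return (x[0] - 1) * (x[0] - 1) + 2 * (x[1] + 3) * (x[1] + 3);
}

// Analytic gradient of f; a reverse-mode autodiff tape could supply
// this instead, since the optimizer only needs a callable.
std::array<double, 2> grad_f(std::array<double, 2> const& x) {
    return {2 * (x[0] - 1), 4 * (x[1] + 3)};
}

int main() {
    std::array<double, 2> x{0.0, 0.0}; // starting point
    double const step = 0.1;           // fixed learning rate
    for (int i = 0; i < 200; ++i) {    // plain steepest descent
        auto g = grad_f(x);
        x[0] -= step * g[0];
        x[1] -= step * g[1];
    }
    std::cout << "minimum near (" << x[0] << ", " << x[1]
              << "), f = " << f(x) << '\n';
}
```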