3 results

2018


L4: Practical loss-based stepsize adaptation for deep learning

Rolinek, M., Martius, G.

In Proc. Neural Information Processing Systems (NIPS 2018), 2018, to appear, arXiv preprint: https://arxiv.org/abs/1802.05074 (inproceedings)

Github Project Page [BibTex]



Learning equations for extrapolation and control

Sahoo, S. S., Lampert, C. H., Martius, G.

In Proc. 35th International Conference on Machine Learning (ICML 2018), Stockholm, Sweden, PMLR volume 80, pages 4442-4450, 2018, http://proceedings.mlr.press/v80/sahoo18a/sahoo18a.pdf (Editors: Dy, Jennifer and Krause, Andreas) (inproceedings)

Abstract
We present an approach to identify concise equations from data using a shallow neural network. In contrast to ordinary black-box regression, this approach allows understanding functional relations and generalizing them from observed data to unseen parts of the parameter space. We show how to extend the class of learnable equations for a recently proposed equation learning network to include divisions, and we improve the learning and model selection strategy to be useful for challenging real-world data. For systems governed by analytical expressions, our method can in many cases identify the true underlying equation and extrapolate to unseen domains. We demonstrate its effectiveness by experiments on a cart-pendulum system, where only 2 random rollouts are required to learn the forward dynamics and successfully achieve the swing-up task.

Code arXiv Project Page [BibTex]
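The abstract above describes the equation-learner network only at a high level. The sketch below is a minimal, illustrative NumPy version of such a layer, with algebraic base units (identity, sine, cosine), pairwise multiplication units, and a guarded division output; the unit mix, layer names, and the threshold rule for division are assumptions made purely for illustration and do not reproduce the paper's exact architecture, regularization, or model-selection procedure.

```python
# Minimal, illustrative sketch of an EQL-style layer in plain NumPy.
# All shapes, unit counts, and the division threshold are assumptions
# for illustration only, not the paper's actual implementation.
import numpy as np

rng = np.random.default_rng(0)

class EqlLayer:
    """One hidden layer: a linear map whose outputs feed a mix of
    identity, sine, and cosine units plus pairwise multiplication units."""
    def __init__(self, n_in, n_unary, n_mult):
        n_lin = 3 * n_unary + 2 * n_mult   # columns consumed by the units
        self.W = rng.normal(0.0, 0.1, size=(n_in, n_lin))
        self.b = np.zeros(n_lin)
        self.n_unary = n_unary
        self.n_mult = n_mult

    def forward(self, x):
        z = x @ self.W + self.b
        u = self.n_unary
        ident = z[:, :u]
        sin = np.sin(z[:, u:2 * u])
        cos = np.cos(z[:, 2 * u:3 * u])
        a = z[:, 3 * u:3 * u + self.n_mult]
        b = z[:, 3 * u + self.n_mult:]
        prod = a * b                        # multiplication units
        return np.concatenate([ident, sin, cos, prod], axis=1)

class DivisionOutput:
    """Output layer with division units: each output is num/den, with the
    denominator kept away from zero so the forward pass stays defined
    (a simple guard chosen here as an assumption, not the paper's rule)."""
    def __init__(self, n_in, n_out, theta=1e-2):
        self.Wa = rng.normal(0.0, 0.1, size=(n_in, n_out))
        self.Wb = rng.normal(0.0, 0.1, size=(n_in, n_out))
        self.theta = theta

    def forward(self, h):
        num = h @ self.Wa
        den = h @ self.Wb
        safe = np.abs(den) > self.theta
        return np.where(safe, num / np.where(safe, den, 1.0), 0.0)

# Tiny forward pass on random data: 2 input variables -> 1 predicted output.
x = rng.normal(size=(5, 2))
hidden = EqlLayer(n_in=2, n_unary=2, n_mult=2)
out = DivisionOutput(n_in=3 * 2 + 2, n_out=1)
print(out.forward(hidden.forward(x)).shape)   # (5, 1)
```

In the paper's setting, sparsity-inducing training would prune most of these weights so that the surviving units can be read off as a concise symbolic equation; the sketch only shows the forward structure that makes that readout possible.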


2016


Extrapolation and learning equations

Martius, G., Lampert, C. H.

arXiv preprint: https://arxiv.org/abs/1610.02995, 2016 (misc)

Project Page [BibTex]
