💾 Archived View for dioskouroi.xyz › thread › 24997736 captured on 2020-11-07 at 00:57:25. Gemini links have been rewritten to link to archived content
-=-=-=-=-=-=-
________________________________________________________________________________
Hello, PyMC developer here. We're excited to give Theano a second life, and hope that this work will lend some staying power to the PyMC project in the probabilistic programming world.
As always, we're happy to accept contributions! If you're looking to get involved, now is a great time. Please don't hesitate to speak up or reach out, either on the Theano-PyMC GitHub repo (
https://github.com/pymc-devs/Theano-PyMC
) or some other way (my website's in my bio).
Good to hear Theano has come back to life, it was my first deep learning library :). These days I do PyTorch, and while I really appreciate its debuggability and flexibility, I definitely can see in retrospect some of the advantages of Theano's declarative approach.
In that regard, I am curious about why TensorFlow didn't work out. I understand TensorFlow 1.x implements a declarative graph mode that is, I guess, in many ways similar to Theano's. I'm assuming v2 still supports that mode on top of the new eager mode -- is that the case? If so, was there some aspect of its implementation that made it unsuitable for PyMC?
I'm only one person involved, but my primary reason for choosing Theano over TensorFlow has to do with the ability to manipulate and reason symbolically about models/graphs.
In order to improve the performance and usability of PyMC, I believe we need to automate things at the graph level, and Theano is by far the most suitable for this--between the two, at least.
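The kind of graph-level automation described above can be illustrated with a toy sketch. This is not Theano's actual API, just a minimal stand-in showing what "manipulating a model symbolically" means: the whole computation is held as a data structure, so a rewrite like log(exp(x)) -> x can be applied before anything runs.

```python
# Illustrative sketch only (hypothetical classes, not Theano's API):
# a declarative library holds the computation as a graph it can rewrite.
class Var:
    def __init__(self, name):
        self.name = name

class Apply:
    def __init__(self, op, *inputs):
        self.op = op
        self.inputs = inputs

def simplify(node):
    """Recursively apply the rewrite log(exp(x)) -> x."""
    if not isinstance(node, Apply):
        return node
    inputs = [simplify(i) for i in node.inputs]
    if node.op == "log" and isinstance(inputs[0], Apply) and inputs[0].op == "exp":
        return inputs[0].inputs[0]
    return Apply(node.op, *inputs)

x = Var("x")
graph = Apply("log", Apply("exp", x))
print(simplify(graph) is x)  # True: the whole graph collapses to its input
```

An eager framework computes log(exp(x)) numerically as it goes; a symbolic one can notice the pattern and delete it from the graph entirely, which is the kind of optimization PyMC wants to automate.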
You can find some work along these lines in the Symbolic PyMC project (
https://github.com/pymc-devs/symbolic-pymc);
it contains symbolic work done in both Theano and TensorFlow.
Do you think you can change the title to something less juvenile? "So-and-so is dead, long live so-and-so" is so last decade, dude.
Remarkable bird, the Norwegian Blue, idn'it, aye?
Beautiful plumage!
I have a JAX question here - could someone explain exactly what the PyMC developers did to move their C/C++ backend to JAX?
Because this sounds very Numba-esque. I always thought JAX was just a math library that was slightly more usable than NumPy. This article makes it seem that code written in JAX ends up being significantly faster than NumPy... almost close to C.
That's Numba territory.
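On the speed question: JAX is more than a NumPy clone because `jax.jit` traces a function into an XLA graph and compiles it to native code, which is why jitted JAX can land in Numba/C territory. A minimal sketch:

```python
import jax
import jax.numpy as jnp

@jax.jit
def sumsq(x):
    # Traced once per input shape/dtype, then compiled by XLA to native code;
    # later calls skip Python and run the compiled kernel directly.
    return jnp.sum(x * x)

x = jnp.arange(4.0)
print(sumsq(x))  # 0^2 + 1^2 + 2^2 + 3^2 = 14.0
```

Like Numba, the first call pays a compilation cost and subsequent calls amortize it; unlike plain NumPy, intermediate arrays can be fused away by XLA.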
Also, TensorFlow Probability is moving to JAX.
I wonder if this was borne out of the success of NumPyro. Been playing around with PyStan and NumPyro recently, mostly because pymc3 is going the way of the dodo.
NumPyro / JAX / PyTorch just seems like the most versatile offering out there right now
I'm the author of the JAX backend in Theano, and, no, this didn't have anything to do with NumPyro--especially since I neither use nor know very much about NumPyro.
It was just a quick demonstration of how easily one can use Theano as a generalized graph "front-end", while also preserving its more unique and programmable symbolic optimization capabilities. JAX was one of a few "backends" I considered, and, due to the JAX Python library, it also looked like the most straightforward one to implement first.
> pymc3 is going the way of the dodo
on the contrary, i was recently looking for python libraries for some greenfield bayesian work and pymc3 was at the top of the list. with statistical libraries, i prioritize a well-tested codebase, a large community, and extensive documentation. if others share those priorities, there's a long future for pymc3.
God I really hate this title meme.