Auto-differentiable numerical inverse & efficient materialization of BNAF masks#234
noahewolfe wants to merge 9 commits into danielward27:main
Conversation
…o turn off custom jvp warning preventing autodiff of inverter
… inverse (numerical) transformation
Nice! Cheers. I'm a bit busy at the moment but will try to check it out before the end of the week.
danielward27 left a comment
Thank you so much again for the contribution! It looks good to me, I've just left some comments with some minor suggestions, let me know what you think.
    (1, block_dim),
]

def make_layer(inp):
This change seems to be unused?
bijection: AbstractBijection,
inverter: Callable[[AbstractBijection, Array, Array | None], Array],
diffable_inverter: bool = False,
raise_old_error: bool = False,
I think we can remove raise_old_error. If we aren't going to keep the legacy behavior around for a deprecation cycle, then we should probably just not include it at all, to simplify the code a bit.
self,
bijection: AbstractBijection,
inverter: Callable[[AbstractBijection, Array, Array | None], Array],
diffable_inverter: bool = False,
To me it would feel less confusing to replace the diffable_inverter argument with use_implicit_diff or use_implicit_differentiation, as a keyword-only argument defaulting to True.
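For context, here is a minimal numpy sketch of the implicit differentiation such a flag would toggle: rather than differentiating through the iterations of the numerical inverter, the gradient of the inverse is computed from the inverse function theorem, (f⁻¹)'(y) = 1 / f'(f⁻¹(y)). The function, solver, and names below are illustrative, not flowjax's implementation.

```python
import numpy as np

def f(x):
    # Strictly increasing, hence invertible; stands in for a bijection's forward map.
    return x**3 + x

def f_prime(x):
    return 3 * x**2 + 1

def inverse_bisection(y, lo=-10.0, hi=10.0, n_iter=80):
    """Invert f numerically by bisection (stands in for any black-box inverter)."""
    for _ in range(n_iter):
        mid = 0.5 * (lo + hi)
        lo, hi = (mid, hi) if f(mid) < y else (lo, mid)
    return 0.5 * (lo + hi)

def inverse_grad(y):
    """Gradient of the inverse via the inverse function theorem,
    independent of how many solver iterations were needed."""
    x = inverse_bisection(y)
    return 1.0 / f_prime(x)
```

For example, f(2) = 10, so inverse_bisection(10.0) recovers 2 and inverse_grad(10.0) gives 1/13, matching a finite-difference estimate without ever backpropagating through the bisection loop.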
Here, we make two updates which are particularly relevant for block-neural autoregressive flows:

1. Made the NumericalInverse transform auto-differentiable, adding lineax as a dependency for memory-efficient computation of the JVP. This supports block_neural_autoregressive_flow and the default greedy bisection search (in a publication to be on the arxiv in the next week or two).
2. Removed the explicit materialization of the masks block_diag_mask and block_tril_mask (whose size scales with nn_block_dim), and reimplemented these with Kronecker products. Tests are added in test_masks.py.

Let me know what you think, happy to answer any questions and take any feedback!
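To illustrate the mask change, here is a minimal numpy sketch (not flowjax's exact code) of building block-diagonal and block-lower-triangular masks as Kronecker products instead of filling them in entry-by-entry; block_shape and n_blocks are illustrative parameter names.

```python
import numpy as np

def block_diag_mask(block_shape, n_blocks):
    """Ones on the block diagonal, zeros elsewhere."""
    return np.kron(np.eye(n_blocks, dtype=int), np.ones(block_shape, dtype=int))

def block_tril_mask(block_shape, n_blocks):
    """Ones strictly below the block diagonal."""
    tril = np.tril(np.ones((n_blocks, n_blocks), dtype=int), k=-1)
    return np.kron(tril, np.ones(block_shape, dtype=int))
```

For instance, block_diag_mask((1, 2), 2) gives [[1, 1, 0, 0], [0, 0, 1, 1]], and the two masks never overlap, so they can partition a weight matrix into its constrained and free parts.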
NumericalInversetransforms.lineaxas a dependency for memory-efficient computation of the JVP.block_neural_autoregressive_flowand the default greedy bisection search (in a publication to be on the arxiv in the next week or two).nn_block_dim.block_diag_maskandblock_tril_mask, and reimplemented these with kronecker products.test_masks.pyLet me know what you think, happy to answer any questions and take any feedback!