Proposal: Add a Tutorial/Documentation Example on Differentiable Decision Forests
Overview
This is a proposal to add a well-documented example or tutorial demonstrating a Differentiable Decision Forest model in PyTorch — inspired by the Deep Neural Decision Forests paper (Kontschieder et al., ICCV 2015).
The goal is not to introduce a new `torch.nn` module, but rather to show how such a model can be implemented using native PyTorch operations in a transparent and educational way.
Why This?
- Combines the interpretability of decision trees with the feature-learning power of neural networks.
- Uses soft routing (sigmoid decisions) and learnable leaf distributions (softmax) to allow end-to-end backpropagation.
- Offers an alternative to traditional ensembles or black-box classifiers, especially for tabular and hybrid domains.
What the Tutorial Would Include
- Overview of the model structure (CNN → decision trees)
- How to implement soft decisions and routing probabilities (μ) with PyTorch ops like `sigmoid`, `softmax`, `einsum`, `gather`, etc.
- Joint optimization of routing and leaf distributions
- Training on MNIST or tabular datasets
- Emphasis on "Simple over Easy": no custom abstractions
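To make the soft routing idea concrete, here is a minimal, framework-free sketch of the forward computation described in the Kontschieder et al. paper, for a single depth-2 tree. The weights and leaf scores below are illustrative placeholders, not learned values; the actual tutorial would use `torch` tensors and autograd so the same quantities (split weights and leaf distributions) are trained end-to-end.

```python
# Sketch of differentiable decision-tree routing (soft decisions + leaf
# distributions). Values are illustrative; a real implementation would use
# torch tensors so gradients flow through the whole computation.
import math

def sigmoid(z):
    return 1.0 / (1.0 + math.exp(-z))

def softmax(logits):
    m = max(logits)
    exps = [math.exp(v - m) for v in logits]
    total = sum(exps)
    return [e / total for e in exps]

# Depth-2 binary tree: 3 internal nodes (0 = root, 1 = left, 2 = right)
# and 4 leaves. Each internal node holds a linear split over the features.
node_weights = [
    [0.5, -1.0],   # root
    [1.2, 0.3],    # left child
    [-0.7, 0.8],   # right child
]
# Unnormalized per-leaf class scores; softmax turns them into distributions.
leaf_logits = [
    [2.0, 0.1], [0.3, 1.5], [1.0, 1.0], [0.2, 2.2],
]

def predict(x):
    # Soft decision d_n = sigmoid(w_n . x): probability of going LEFT at node n.
    d = [sigmoid(sum(w * xi for w, xi in zip(wn, x))) for wn in node_weights]
    # Routing probability mu_l: product of decisions along the root-to-leaf path.
    mu = [
        d[0] * d[1],               # left, left
        d[0] * (1 - d[1]),         # left, right
        (1 - d[0]) * d[2],         # right, left
        (1 - d[0]) * (1 - d[2]),   # right, right
    ]
    pi = [softmax(l) for l in leaf_logits]
    # Final prediction: mixture of leaf distributions weighted by mu.
    n_classes = len(pi[0])
    return [sum(mu[l] * pi[l][c] for l in range(len(mu)))
            for c in range(n_classes)]

print(predict([0.4, -0.2]))
```

Because every step (sigmoid, product, softmax, weighted sum) is differentiable, replacing the lists with tensors is all that is needed for end-to-end training; the μ products for all leaves can then be expressed compactly with ops like `einsum` or `gather` as listed above.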
Final Note
This is not a request to add this as a built-in PyTorch module; in fact, that might go against PyTorch's Simple over Easy philosophy.
Instead, this would be best suited as a community-contributed tutorial or example in the official PyTorch Tutorials repository or documentation site.
Extended Note
I'm currently in the middle of university exams and may not be able to actively contribute for a few weeks — but I’d be very interested in helping develop the tutorial afterwards.
Existing tutorials on this topic
No response
Additional context
No response
Hey!
That sounds like a good tutorial to have on less mainstream ML architectures. I'm not sure we'll have someone well versed enough in this to review it, though, so it should most likely follow the paper closely.
Thanks @albanD
I’d be happy to work on this tutorial! While my technical expertise is still developing, I plan to closely follow the original paper and build on existing GitHub repositories.
I’ll also reach out to the right people if I need help or clarification during the process.
My exams are wrapping up soon, so I’ll be able to start working on this in a few weeks.
I'm open to feedback and will do my best to make it clear, accurate, and useful!