This package contains a Keras 3 implementation of the minGRU layer, a minimal and parallelizable version of the gated recurrent unit (GRU).
Updated May 23, 2025 - Python
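The minGRU recurrence from "Were RNNs All We Needed" drops the hidden-state dependence from the gates, which is what makes it minimal. As a rough sketch (not the package's actual API; the weight names and plain NumPy setup here are illustrative assumptions), the sequential form looks like:

```python
import numpy as np

def sigmoid(x):
    return 1.0 / (1.0 + np.exp(-x))

def min_gru(x, W_z, W_h, h0):
    """Sequential minGRU sketch:
    z_t      = sigmoid(x_t @ W_z)   # gate depends only on the input
    h~_t     = x_t @ W_h            # candidate state, no recurrence inside
    h_t      = (1 - z_t) * h_{t-1} + z_t * h~_t
    """
    h = h0
    hs = []
    for x_t in x:
        z = sigmoid(x_t @ W_z)
        h_tilde = x_t @ W_h
        h = (1.0 - z) * h + z * h_tilde
        hs.append(h)
    return np.stack(hs)

# toy usage: 5 timesteps, input dim 3, hidden dim 4
rng = np.random.default_rng(0)
x = rng.normal(size=(5, 3))
W_z = rng.normal(size=(3, 4))
W_h = rng.normal(size=(3, 4))
out = min_gru(x, W_z, W_h, np.zeros(4))
print(out.shape)  # (5, 4)
```

Because neither `z_t` nor `h~_t` depends on `h_{t-1}`, the update is a linear recurrence in `h`, which is what allows the parallel (scan-based) training mode the description mentions.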
Implements the "Were RNNs All We Needed" paper and compares the minimal models with their original counterparts.
xLSTM (sLSTM/mLSTM) models plus minGRU and minLSTM, exploring parallel concurrency with threads over stacked layers or columns, loosely modeled on the cerebral cortex; parallel or distributed concurrent networks. Let's try everything except attention :)
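The parallelism these repos rely on comes from the fact that minGRU/minLSTM updates have the form `h_t = a_t * h_{t-1} + b_t`, which can be solved for all timesteps at once with a prefix scan instead of a step-by-step loop. A naive NumPy sketch (real implementations use a numerically safer log-space scan; this direct cumulative-product version is an illustrative assumption):

```python
import numpy as np

def parallel_linear_scan(a, b, h0):
    """Solve h_t = a_t * h_{t-1} + b_t for all t without a sequential loop.

    Closed form: h_t = A_t * h0 + sum_{k<=t} (A_t / A_k) * b_k,
    where A_t = prod_{j<=t} a_j.
    """
    A = np.cumprod(a, axis=0)          # prefix products A_t
    terms = np.cumsum(b / A, axis=0)   # sum_{k<=t} b_k / A_k
    return A * (h0 + terms)

# verify against the plain sequential recurrence
rng = np.random.default_rng(1)
a = rng.uniform(0.1, 0.9, size=(6, 4))   # e.g. a_t = 1 - z_t for minGRU
b = rng.normal(size=(6, 4))              # e.g. b_t = z_t * h~_t
h0 = rng.normal(size=4)

h_par = parallel_linear_scan(a, b, h0)
h = h0
for t in range(6):
    h = a[t] * h + b[t]
assert np.allclose(h_par[-1], h)
```

The `cumprod`/`cumsum` pair here stands in for a true parallel scan primitive (e.g. `jax.lax.associative_scan` on an accelerator); the point is only that no timestep waits on the previous one.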