This is the official implementation of our ICML 2024 paper "MultiMax: Sparse and Multi-Modal Attention Learning"
Updated Feb 9, 2026 · Python
This repository features hands-on Jupyter Notebooks, covering everything from fundamental concepts to advanced neural network architectures.
Interactive deep learning project to predict IPL innings scores from match context such as teams, venue, batsman, bowler, runs, wickets, and overs.
We introduce two novel hybrid activation functions: S3 (Sigmoid-Softsign) and its improved version, S4 (Smoothed S3).
Open ideas. Code less. Publish more.
An Artificial Neural Network (ANN) built using PyTorch to classify images in the Fashion MNIST dataset, utilizing GPU support for improved training performance.
This is a custom-built neural network that detects handwritten digits from image inputs. It uses ReLU activations in the hidden layers and a softmax activation in the output layer for classification. The model is trained via backpropagation to minimize a loss function, achieving over 99% accuracy when predicting handwritten digits.
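As a rough sketch of the training objective such a classifier typically minimizes: cross-entropy is the negative log-probability the softmax output assigns to the true class, and backpropagation drives it toward zero. The probability values below are illustrative, not taken from the repository.

```python
import math

def cross_entropy(probs, target_index):
    """Negative log-probability of the true class; the quantity
    backpropagation drives toward zero during training."""
    return -math.log(probs[target_index])

# Hypothetical softmax output for a 10-class digit problem:
# the network assigns probability 0.91 to the true digit.
probs = [0.91] + [0.01] * 9
loss = cross_entropy(probs, target_index=0)
# A confident correct prediction yields a small loss;
# a confident wrong one yields a much larger loss.
```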
Fake and AI-generated image detection using transfer learning
Linear and non-linear activation functions: Linear, ReLU, Sigmoid, Softmax, Tanh
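The functions listed above can each be written in a line or two; note that softmax differs from the rest in operating on a whole vector of scores rather than element-wise.

```python
import math

def linear(x):
    # Identity: passes its input through unchanged.
    return x

def relu(x):
    # Rectified linear unit: zero for negative inputs, identity otherwise.
    return max(0.0, x)

def sigmoid(x):
    # Squashes any real input into the open interval (0, 1).
    return 1.0 / (1.0 + math.exp(-x))

def tanh(x):
    # Squashes any real input into (-1, 1), centered at zero.
    return math.tanh(x)

def softmax(xs):
    # Vector-valued: normalizes raw scores into probabilities summing to 1.
    # Subtracting the max is a standard numerical-stability trick.
    m = max(xs)
    exps = [math.exp(x - m) for x in xs]
    total = sum(exps)
    return [e / total for e in exps]
```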
A Benchmark for Activation Function Exploration for Neural Architecture Search (NAS)
Vanishing Gradient and Activation Functions
Graphs plotted with Matplotlib
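The vanishing-gradient effect mentioned above can be demonstrated numerically: the sigmoid's derivative never exceeds 0.25, so in a simplified model that ignores the weights, the gradient signal shrinks at least geometrically as it is backpropagated through a stack of sigmoid layers.

```python
import math

def sigmoid(x):
    return 1.0 / (1.0 + math.exp(-x))

def sigmoid_grad(x):
    # Derivative of the sigmoid: s(x) * (1 - s(x)), peaking at 0.25 when x = 0.
    s = sigmoid(x)
    return s * (1.0 - s)

# Best case: every pre-activation sits at 0, where the derivative peaks.
# Even then, the chained gradient factor decays as 0.25 ** depth.
for depth in (1, 5, 10, 20):
    factor = sigmoid_grad(0.0) ** depth
    print(f"depth {depth:2d}: gradient factor {factor:.2e}")
```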
ZiLU Activation Function for neural networks