This document acknowledges external resources, tutorials, and documentation that inspired or informed the examples in this repository.
The MLP implementation using `max.nn.Module` was inspired by the official MAX documentation:
- Source: Build an MLP block as a module
- Concepts adapted:
  - Using the `Module` class for layer composition
  - Using the `__call__` method for the forward pass
  - Layer stacking with ReLU activations
- Our additions:
- Applied to regression task (California housing dataset)
- Pre-trained weights included
- Complete end-to-end inference example
- Benchmarking against PyTorch
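The layer-composition pattern described above can be sketched in plain NumPy. The `Linear` and `MLPBlock` names here are illustrative stand-ins, not the repository's actual classes; the real example subclasses `max.nn.Module` and uses its `__call__`-based forward pass.

```python
import numpy as np

class Linear:
    """Minimal stand-in for a linear layer (illustrative, not the MAX API)."""
    def __init__(self, in_features, out_features, seed=0):
        rng = np.random.default_rng(seed)
        self.w = rng.standard_normal((in_features, out_features)) * 0.1
        self.b = np.zeros(out_features)

    def __call__(self, x):
        return x @ self.w + self.b

class MLPBlock:
    """Stack of linear layers with ReLU activations between them."""
    def __init__(self, sizes):
        self.layers = [
            Linear(i, o, seed=k)
            for k, (i, o) in enumerate(zip(sizes, sizes[1:]))
        ]

    def __call__(self, x):
        for layer in self.layers[:-1]:
            x = np.maximum(layer(x), 0.0)  # ReLU between hidden layers
        return self.layers[-1](x)  # no activation on the output layer

# e.g. 8 housing features in, 1 regression target out
mlp = MLPBlock([8, 64, 64, 1])
out = mlp(np.zeros((4, 8)))
print(out.shape)  # (4, 1)
```

The design mirrors the tutorial's idea: each layer is a callable object, and the block composes them by calling each in turn inside its own `__call__`.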
General MAX Graph concepts learned from:
- Get started with MAX graphs (Python tutorial)
- Introduction to MAX Graph
- MAX Graph Operations Reference
- Device Management
- Source: scikit-learn's `fetch_california_housing()`
- Usage: MLP regression example
- License: BSD-3-Clause (scikit-learn)
- Source: torchvision's MNIST dataset
- Usage: CNN classifier example
- License: Creative Commons Attribution-Share Alike 3.0
- Source: Hugging Face model hub
- Model: `distilbert-base-uncased-finetuned-sst-2-english`
- Original paper: "DistilBERT, a distilled version of BERT" (Sanh et al., 2019)
- License: Apache 2.0
This repository has been improved through:
- Feedback from the Modular Discord community
- Early testing and bug reports from users
- Suggestions for additional examples and clarifications
- MAX Engine: Modular's high-performance inference framework
- Pixi: Package manager used for project dependencies and environments
- PyTorch: Used for benchmarking comparisons and dataset loading
- Hugging Face Transformers: Model and tokeniser loading
Key resources that informed our understanding:
- Modular's official documentation and tutorials
- MAX GitHub repository examples
- Community discussions on Modular forums and Discord
If you notice missing attributions or have suggestions for acknowledgements:
- Open an issue on GitHub
- Provide the source/resource details
- Describe how it relates to our examples
We strive to properly acknowledge all inspirations and sources.