Organizations: @xv6-universe

Radioheading/README.md

A brief introduction to me

📚 I am a senior undergraduate in the ACM Class of 2026 at Shanghai Jiao Tong University, currently a research intern at MIT HAN Lab. My research focuses on efficient and scalable generative models.

👋 Let's rejoice in rhythm and code!


Pinned Loading

  1. mit-han-lab/radial-attention (Public)

     [NeurIPS 2025] Radial Attention: O(n log n) Sparse Attention with Energy Decay for Long Video Generation

     Python · 586 stars · 32 forks

  2. thu-ml/SpargeAttn (Public)

     [ICML 2025] SpargeAttention: a training-free sparse attention that accelerates any model's inference.

     Cuda · 953 stars · 87 forks

  3. PolarisDane/KETA (Public)

     Submitted to IJCNN 2025

     3 stars

  4. GFS-Palhinia (Public)

     A toy Google File System (GFS) implemented in Go

     Go · 1 star