Shunmei Cho (趙 俊茗) — M.Sc. Candidate in Computer Science @ Tokyo University of Technology
- NVIDIA Student Ambassador (AI / LLM Team)
- Kaggle Bronze Medal — March ML Mania 2025 (150/1727, Top 8.7%)
- Languages: Chinese (Native) · Japanese (Business) · English (Fluent)
"Explainable, Reproducible, Deployable."
| Area | Focus |
|---|---|
| Research | Hierarchical Reasoning Models for inference and structured knowledge under small-data constraints |
| Systems | LLM x Robotics (Unitree GO2) — Voice → ASR → LLM → Robot Control → Feedback Loop |
| Engineering | Reproducible ML experiments — Config-driven, W&B tracked, Dockerized |
**Research: Small-Data Inference under Constraints**

Methodology: config-driven experiments | clear baselines | honest failure analysis | W&B tracking
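As a minimal sketch of the config-driven setup (class and field names here are illustrative assumptions, not the actual repo layout):

```python
from dataclasses import dataclass

# Hypothetical experiment config; field names are illustrative only.
@dataclass(frozen=True)
class ExperimentConfig:
    seed: int = 42
    learning_rate: float = 1e-3
    n_estimators: int = 500

def run_experiment(cfg: ExperimentConfig) -> dict:
    """Every run is fully determined by its config, so results are reproducible."""
    # In a real pipeline this would train a model and log metrics to W&B, e.g.
    #   wandb.init(project="...", config=vars(cfg))
    # Here we just return the resolved config as the run record.
    return {"seed": cfg.seed, "lr": cfg.learning_rate, "trees": cfg.n_estimators}

record = run_experiment(ExperimentConfig(seed=7))
```

A frozen dataclass keeps the run spec immutable, so the logged config always matches what actually ran.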
**Systems: Unitree GO2 End-to-End Prototype**

Key concerns: interface contracts | latency budgets | safety filters | fallback behaviors
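The Voice → ASR → LLM → Robot Control loop with its safety filter and fallback can be sketched as follows; the stubs, command names, and budget value are assumptions for illustration, not the real ASR/LLM/Unitree SDK interfaces:

```python
import time

LATENCY_BUDGET_S = 2.0                                   # assumed end-to-end budget
ALLOWED_COMMANDS = {"stand", "sit", "walk_forward", "stop"}  # safety whitelist

def asr(audio: bytes) -> str:
    return "walk forward"  # stub transcription

def llm_to_command(text: str) -> str:
    return text.replace(" ", "_")  # stub utterance-to-command mapping

def safety_filter(command: str) -> str:
    # Unknown or unsafe commands fall back to a safe stop.
    return command if command in ALLOWED_COMMANDS else "stop"

def handle_utterance(audio: bytes) -> str:
    start = time.monotonic()
    command = safety_filter(llm_to_command(asr(audio)))
    if time.monotonic() - start > LATENCY_BUDGET_S:
        return "stop"  # fallback when the latency budget is blown
    return command
```

The filter-then-budget ordering guarantees the robot never receives a command outside the whitelist, regardless of what the LLM emits.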
| Project | Description | Tech | Highlight |
|---|---|---|---|
| Claudia | Intelligent quadruped robot with LLM for dialogue, voice and navigation | ROS2 / Unitree SDK / ASR | End-to-end LLM x Robotics |
| March ML Mania 2025 | Feature engineering + ensemble pipeline for NCAA predictions | LightGBM / XGBoost / CatBoost | Top 8.7% (150/1727) |
| JetLLM | LLM deployment and inference optimization | CUDA / Python | GPU-accelerated inference |
| PEFT-SFT-Curator | Parameter-efficient fine-tuning and data curation pipeline | PyTorch / PEFT | Efficient LLM training |



