fix(testing/bench.py): make_param_key use _format_value for non-width keys#9

Open
pandacooming wants to merge 2 commits into deepseek-ai:main from pandacooming:fix/make-param-key-dtype

Conversation

@pandacooming

Summary

Apply `_format_value()` to non-`_WIDTH` values in `make_param_key()`, ensuring `torch.dtype` and other complex types are formatted consistently with `make_param_id()`, which already uses `_format_value`.

Fixes #8

Problem

| Function | Input | Output |
| --- | --- | --- |
| `make_param_id` | `{'dtype': torch.float16}` | `dtype=fp16` |
| `make_param_key` | `{'dtype': torch.float16}` | `dtype=torch.float16` |

Changes

```diff
 def make_param_key(params: dict) -> str:
     param_str = ','.join(
-        f'{_SHORT_NAME.get(k, k)}={format(v, f">{_WIDTH.get(k)}") if k in _WIDTH else v}'
+        f'{_SHORT_NAME.get(k, k)}={_format_value(v) if k not in _WIDTH else format(v, f">{_WIDTH.get(k)}")}'
         for k, v in params.items() if v is not None
     )
     return f'{param_str}'
```

Also changes `v != None` to `v is not None` (PEP 8 style).
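Beyond style, `is not None` is also more robust: `!= None` goes through a value's `__eq__`/`__ne__`, which a class can override. A minimal illustration (the `Weird` class is hypothetical, not from bench.py):

```python
class Weird:
    """Hypothetical value type that claims equality with everything."""
    def __eq__(self, other):
        return True

w = Weird()

# `!=` is derived from the overridden __eq__, so the comparison lies:
print(w != None)      # False — w would be wrongly filtered out of the key
# The identity check is immune to operator overloading:
print(w is not None)  # True
```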

Testing

```python
make_param_key({'dtype': torch.float16, 'num_tokens': 1024, 'hidden': 256})
# → dtype=fp16,num_tokens= 1024,hidden= 256

make_param_id({'dtype': torch.float16, 'num_tokens': 1024, 'hidden': 256})
# → dtype=fp16-num_tokens=1024-hidden=256
```

The `dtype` part now matches between the two functions (`dtype=fp16` vs `dtype=fp16`) ✅
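For readers without the repo checked out, the fixed behavior can be reproduced with a self-contained sketch. The `_SHORT_NAME`, `_WIDTH`, and `_format_value` definitions below are stand-ins guessed from this PR's output, not the real bench.py internals, and torch dtypes are replaced by strings so the snippet runs without torch installed:

```python
# Stand-ins for bench.py internals (assumed shapes, not the real definitions).
_SHORT_NAME: dict = {}                   # long param name -> short alias
_WIDTH = {'num_tokens': 5, 'hidden': 4}  # params right-padded to a fixed width

def _format_value(v) -> str:
    # The real helper shortens torch dtypes (torch.float16 -> 'fp16');
    # here we emulate it on plain strings.
    short = {'torch.float16': 'fp16', 'torch.bfloat16': 'bf16'}
    return short.get(str(v), str(v))

def make_param_key(params: dict) -> str:
    # Fixed version: non-_WIDTH values go through _format_value, and
    # None filtering uses an identity check (`is not None`).
    return ','.join(
        f'{_SHORT_NAME.get(k, k)}='
        f'{_format_value(v) if k not in _WIDTH else format(v, f">{_WIDTH.get(k)}")}'
        for k, v in params.items() if v is not None
    )

print(make_param_key({'dtype': 'torch.float16', 'num_tokens': 1024, 'hidden': 256}))
# → dtype=fp16,num_tokens= 1024,hidden= 256
```

With the assumed widths, the output matches the PR's test run, and `None`-valued params are dropped entirely rather than rendered as `=None`.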

pandacooming added 2 commits April 24, 2026 00:53
- Add .github/workflows/format.yml: GitHub Actions workflow that runs
  yapf + ruff on every push to main and on every PR.
- Add format.sh: local formatting script (used by CI and for
  contributors to run locally before pushing).

Both files follow the same conventions as deepseek-ai/DeepEP.
… keys

Ensures torch.dtype and other complex types are formatted consistently
with make_param_id, which already uses _format_value.

Fixes deepseek-ai#8


Development

Successfully merging this pull request may close these issues.

testing/bench.py: make_param_key does not handle torch.dtype consistently with make_param_id
