Using the benchmarking code and requirements specification from the official GitHub repo fails with an error saying Qwen3.5 is unsupported on transformers v4.57.3. Forcing mainline transformers to be installed instead, as stated by the Qwen3.5 codebase, then raises an AttributeError because DynamicCache does not have an attribute the model code expects:
File ".venv/lib/python3.12/site-packages/transformers/models/qwen3_5/modeling_qwen3_5.py", line 1387, in _update_linear_attn_mask
if (past_key_values is not None and past_key_values.has_previous_state) or (
^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^
AttributeError: 'DynamicCache' object has no attribute 'has_previous_state'
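As a possible stopgap (this is an assumption on my part, not an official fix), patching the missing attribute onto DynamicCache before loading the model should let the `past_key_values.has_previous_state` check pass, treating a plain DynamicCache as having no previous state. A stand-in class is used below because the real import (`from transformers.cache_utils import DynamicCache`) requires a transformers build with Qwen3.5 support:

```python
class DynamicCache:
    """Stand-in for transformers.cache_utils.DynamicCache (illustration only)."""
    pass

# Default to False so the first forward pass treats the cache as empty,
# matching what the check in _update_linear_attn_mask appears to expect.
if not hasattr(DynamicCache, "has_previous_state"):
    DynamicCache.has_previous_state = False

cache = DynamicCache()
print(cache.has_previous_state)  # False
```

Whether defaulting to False is actually correct for the linear-attention mask logic is something only the maintainers can confirm.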
Is it planned to add transformers support for this model, at least once transformers releases a stable version with Qwen3.5 MoE model family support?