SFF (Lost in the Non-convex Loss Landscape: How to Fine-tune the Large Time Series Model? Published in ICLR 2026)

PyTorch implementation of SFF. The paper is available at: Paper (PDF).

Overview

(Overview figure)

Datasets

The public datasets can be downloaded from https://drive.google.com/drive/folders/1PPLsAoDbv4WcoXDp-mm4LFxoKwewnKxX. Place them in the datasets folder.

Usage

Timer's pre-trained weights can be downloaded from https://drive.google.com/drive/folders/15oaiAl4OO5gFqZMJD2lOtX2fxHbpgcU8.

In the run.py script, different evaluation modes are enabled by setting training_from_scratch (TFS), LP (linear probing), LPFF (linear probing first, then full fine-tuning), or smoothed_full_finetuning (SFF). If all of these are set to False, the original full fine-tuning (FF) strategy is adopted.
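The flag-to-mode mapping described above can be sketched as simple selection logic. This is a hypothetical illustration of how the switches interact; the actual argument handling in run.py may differ:

```python
def select_mode(training_from_scratch=False, LP=False, LPFF=False,
                smoothed_full_finetuning=False):
    """Map the boolean switches to an evaluation mode name.

    Only one switch should be True at a time; with all switches
    False, the original full fine-tuning (FF) strategy is used.
    """
    if training_from_scratch:
        return "TFS"   # training from scratch
    if LP:
        return "LP"    # linear probing
    if LPFF:
        return "LPFF"  # linear probing first, then full fine-tuning
    if smoothed_full_finetuning:
        return "SFF"   # smoothed full fine-tuning
    return "FF"        # default: original full fine-tuning
```

For example, `select_mode(smoothed_full_finetuning=True)` returns `"SFF"`, while `select_mode()` falls back to `"FF"`.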

Reference

If this repository or the paper is helpful to you, please consider citing it:

@inproceedings{zhanglost,
  title={Lost in the Non-convex Loss Landscape: How to Fine-tune the Large Time Series Model?},
  author={Zhang, Xu and Wang, Peng and Wang, Wei},
  booktitle={The Fourteenth International Conference on Learning Representations},
  year={2026}
}

About

This repository contains the official code of Smoothed Full Fine-tuning (SFF) from Lost in the Non-convex Loss Landscape: How to Fine-tune the Large Time Series Model? (ICLR 2026). SFF enables large time series foundation models to achieve better fine-tuning by smoothing the steep and highly non-convex loss landscape induced by pre-training.
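SFF's exact smoothing procedure is defined in the paper and code. As a generic illustration of the underlying idea of optimizing against a flattened loss surface, here is a minimal sharpness-aware-style update on a toy non-convex objective. This is a common smoothing technique sketched for intuition only, not the SFF algorithm itself:

```python
import numpy as np

def smoothed_step(w, grad_fn, lr=0.1, rho=0.05):
    """One sharpness-aware-style step: perturb the weights toward the
    gradient direction, then descend using the gradient evaluated at
    the perturbed point. This discourages sharp minima."""
    g = grad_fn(w)
    eps = rho * g / (np.linalg.norm(g) + 1e-12)  # ascent to a nearby point
    return w - lr * grad_fn(w + eps)             # descend on the "worst-case" gradient

# Toy non-convex objective f(w) = sum(w^4 - w^2), minima at |w_i| = 1/sqrt(2)
f = lambda w: np.sum(w**4 - w**2)
grad = lambda w: 4 * w**3 - 2 * w

w = np.array([1.5, -1.2])
for _ in range(100):
    w = smoothed_step(w, grad)
```

After 100 steps, `w` settles near the flat basins around ±1/√2; plain gradient steps on a steeper, sharper landscape would follow the raw gradient instead of the perturbed one.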
