lishanlu136/knowledge_distillation
Knowledge distillation

Description

Training

  • main.py and model.py use inception_resnet_v1 as the big (teacher) model and mobile_v2 as the small (student) model.
    The small model's loss is softmax_loss + center_loss + distillation_loss.

    distillation_loss can be either:

    1. soft_logits_loss
    2. soft_embedding_regression_loss (a loss built between the big model's embedding and the small model's embedding, used to constrain the small model's embedding)
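The loss combination above can be sketched as follows. This is a minimal NumPy illustration, not the repo's TensorFlow code: the function names, loss weights (`beta`, `gamma`), and temperature `T` are assumptions chosen for clarity.

```python
import numpy as np

def softmax(z, axis=-1):
    # Numerically stable softmax.
    z = z - z.max(axis=axis, keepdims=True)
    e = np.exp(z)
    return e / e.sum(axis=axis, keepdims=True)

def soft_logits_loss(student_logits, teacher_logits, T=4.0):
    # KL divergence between temperature-softened teacher and student
    # distributions, scaled by T^2 (the usual soft-logits KD loss).
    p = softmax(teacher_logits / T)
    q = softmax(student_logits / T)
    kl = np.sum(p * (np.log(p + 1e-12) - np.log(q + 1e-12)), axis=-1)
    return float(np.mean(kl) * T * T)

def soft_embedding_regression_loss(student_emb, teacher_emb):
    # MSE between the teacher's and student's embeddings,
    # constraining the student embedding toward the teacher's.
    return float(np.mean(np.sum((student_emb - teacher_emb) ** 2, axis=-1)))

def total_student_loss(student_logits, teacher_logits, labels,
                       student_emb, centers, beta=0.003, gamma=1.0, T=4.0):
    # softmax_loss + beta * center_loss + gamma * distillation_loss
    q = softmax(student_logits)
    ce = -np.mean(np.log(q[np.arange(len(labels)), labels] + 1e-12))
    center = np.mean(np.sum((student_emb - centers[labels]) ** 2, axis=-1)) / 2.0
    kd = soft_logits_loss(student_logits, teacher_logits, T)
    return ce + beta * center + gamma * kd
```

When the student's logits match the teacher's, `soft_logits_loss` is zero and only the supervised terms remain; the `beta`/`gamma` weights trade off how strongly the student imitates the teacher versus fitting the labels.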

Freeze model

  • Run freeze_student_model.py to generate the pb file.
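The freezing step typically follows the standard TF 1.x recipe: replace the trained variables with constants and serialize the resulting GraphDef to a .pb file. A sketch of that recipe is below; the toy graph, the `embeddings` output node name, and the output path are assumptions, not taken from freeze_student_model.py (the real script would restore the student checkpoint instead of building a toy graph).

```python
import tensorflow.compat.v1 as tf

tf.disable_eager_execution()

# Toy stand-in for the trained student graph; the real script would
# rebuild the mobile_v2 student and restore its checkpoint here.
x = tf.placeholder(tf.float32, [None, 3], name="input")
w = tf.Variable(tf.ones([3, 2]), name="w")
tf.identity(tf.matmul(x, w), name="embeddings")

with tf.Session() as sess:
    sess.run(tf.global_variables_initializer())
    # Fold variables into constants so the graph is self-contained.
    frozen = tf.graph_util.convert_variables_to_constants(
        sess, sess.graph_def, output_node_names=["embeddings"])
    with tf.gfile.GFile("student_model.pb", "wb") as f:
        f.write(frozen.SerializeToString())
```

The resulting student_model.pb can then be loaded with `tf.import_graph_def` for inference without the original checkpoint files.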
