Xjh0327/MoME


Introduction

Code for "MoME: Mixture-of-Masked-Experts for Efficient Multi-Task Recommendation" by Jiahui Xu, Lu Sun, and Dengji Zhao.

References

Some of the implementations refer to the following repositories:

  • L0_regularization: the implementation of "Learning Sparse Neural Networks through L0 Regularization" by Christos Louizos, Max Welling, and Diederik P. Kingma.
  • stg: the implementation of "Feature Selection using Stochastic Gates" by Yutaro Yamada, Ofir Lindenbaum, Sahand Negahban, and Yuval Kluger.
  • MTReclib: a PyTorch library of multi-task recommendation models and common datasets.
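The L0_regularization reference above revolves around the hard-concrete gate: a stochastic relaxation of a binary mask that can produce exact zeros, so a differentiable sparsity penalty can prune expert weights. Below is a minimal NumPy sketch of that gate and its expected-L0 penalty; it is illustrative only (function names, hyperparameter defaults, and the toy weight matrix are assumptions, not code from this repository).

```python
import numpy as np

def hard_concrete_gate(log_alpha, beta=2/3, gamma=-0.1, zeta=1.1, seed=None):
    """Sample a hard-concrete gate z in [0, 1] per weight (illustrative sketch)."""
    rng = np.random.default_rng(seed)
    u = rng.uniform(1e-6, 1 - 1e-6, size=np.shape(log_alpha))
    # Concrete (Gumbel-sigmoid) relaxation of a Bernoulli sample.
    s = 1.0 / (1.0 + np.exp(-(np.log(u) - np.log(1 - u) + log_alpha) / beta))
    # Stretch to (gamma, zeta), then clip to [0, 1] so exact zeros can occur.
    return np.clip(s * (zeta - gamma) + gamma, 0.0, 1.0)

def expected_l0_penalty(log_alpha, beta=2/3, gamma=-0.1, zeta=1.1):
    """Probability each gate is non-zero; summing gives the expected L0 norm."""
    return 1.0 / (1.0 + np.exp(-(log_alpha - beta * np.log(-gamma / zeta))))

# Toy example: mask one expert's 4x3 weight matrix with sampled gates.
log_alpha = np.zeros((4, 3))           # one learnable logit per weight
weights = np.ones((4, 3))
z = hard_concrete_gate(log_alpha, seed=0)
masked_weights = weights * z           # sparsely masked expert weights
sparsity_loss = expected_l0_penalty(log_alpha).sum()
```

In training, `sparsity_loss` would be added to the task losses with a trade-off coefficient, pushing the `log_alpha` logits negative so unneeded expert weights are gated off.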
