πŸ—οΈ Loom-Train

A Simple & Efficient Training Framework for Long-Context LLMs

Optimized for scalability, memory efficiency, and seamless integration: built to unlock the full potential of long-context large language models.


📅 Update Log


✨ Key Features

  • 🔧 Plug-and-Play: Drop-in replacement for the HF Trainer; no major code changes needed.
  • 🚀 Memory-Efficient: Leverages Ring-Flash-Attention to reduce the GPU memory footprint by up to 50%.
  • 📈 Scalable: Seamlessly scales to 100K+ context lengths without sacrificing speed.
  • ⚡ Fast Setup: Minimal dependencies, easy installation via pip install loom-train.

💻 Environment & Installation

To install the loomtrain package from the GitHub repository, run:

git clone https://github.com/LCM-Lab/LOOM-Train.git
conda create -n loom_train python=3.10 -y
conda activate loom_train
cd LOOM-Train/loomtrain
pip install -e .
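
As a quick sanity check (a minimal sketch, nothing project-specific), confirm the editable install is importable:

# Sanity check: the package should import cleanly after `pip install -e .`
import loomtrain

print("loomtrain imported successfully")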

To install flash attention, first run the command below to obtain the required flash-attn version:

loomtrain-required-flash-attn

Then download the matching flash_attn wheel from https://github.com/Dao-AILab/flash-attention/releases and install it, along with ring_flash_attn:

pip install <path_to_flash_attn_whl_file>
pip install ring_flash_attn
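
To confirm both attention backends installed correctly, a quick import check helps; flash_attn exposes a version string that should match the wheel you downloaded:

# Verify that flash-attn and ring-flash-attn both import cleanly.
import flash_attn
import ring_flash_attn

print("flash-attn version:", flash_attn.__version__)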

πŸ› οΈ Getting Started

Once installed, simply swap your HF Trainer for LoomTrainer:

from loomtrain import LoomTrainer

trainer = LoomTrainer(
    model=model,
    args=training_args,
    train_dataset=train_dataset,
    # ... rest unchanged!
)
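
For context, a full fine-tuning script follows the standard Transformers recipe. The sketch below is illustrative rather than an official example: the checkpoint name and hyperparameters are placeholders, and it assumes LoomTrainer mirrors the constructor and train() entry point of transformers.Trainer, per the drop-in claim above.

from transformers import AutoModelForCausalLM, AutoTokenizer, TrainingArguments
from loomtrain import LoomTrainer

model_name = "meta-llama/Llama-2-7b-hf"  # placeholder: any causal LM checkpoint
model = AutoModelForCausalLM.from_pretrained(model_name)
tokenizer = AutoTokenizer.from_pretrained(model_name)  # used when tokenizing your corpus

training_args = TrainingArguments(
    output_dir="./loom_ckpts",
    per_device_train_batch_size=1,   # long-context samples are memory-heavy
    gradient_accumulation_steps=8,
    learning_rate=2e-5,
    bf16=True,
)

# Prepare your tokenized long-context dataset exactly as you would
# for transformers.Trainer; it is a placeholder here.
train_dataset = ...

trainer = LoomTrainer(
    model=model,
    args=training_args,
    train_dataset=train_dataset,
)
trainer.train()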

🤝 Contributing

We welcome contributions! Whether it's bug fixes, new features, or documentation improvements, feel free to open an issue or PR.
Let's build the future of long-context training, together. 💪


📬 Contact

Questions? Suggestions? Reach out at: iiiigray19@gmail.com and zctang2000@gmail.com
