314 Billion Parameter Grok-1 Inference Accelerated by 3.8x: Efficient and Easy-to-Use PyTorch+HuggingFace Version Is Here! March 25, 2024
Open-Sora: Revealing Complete Model Parameters, Training Details, and Everything for Sora-like Video Generation Models March 17, 2024
Open-Sora: Sora Replication Solution with 46% Cost Reduction, Sequence Length Expanded to Nearly a Million Tokens March 4, 2024
Inference Performance Improved by 46%: Open-Source Solution Breaks the LLM Length Limit for Multi-Round Conversations January 8, 2024
Build a Refined 13B Private Model for Just $5,000: Colossal-AI LLaMA-2 Open-Source Solution Upgraded January 7, 2024
Unveiling the Colossal-AI Booth and Exciting Hiring Opportunities at EMNLP 2023! December 7, 2023
Enhanced MoE Parallelism: Open-Source MoE Model Training Can Be 9 Times More Efficient November 9, 2023
Half a Day of Training for a Few Hundred Dollars Yields Results Comparable to Mainstream Large Models: An Open-Source, Commercially Usable Domain-Specific LLM Solution September 25, 2023
LuxProvide and HPC-AI Tech Join Forces to Revolutionize AI with Supercomputer and Colossal-AI Solutions September 6, 2023
70-Billion-Parameter LLaMA-2 Model Training Accelerated by 195%, with Upgraded Foundation Model Best Practices September 4, 2023