Enhanced MoE Parallelism: Open-Source MoE Model Training Can Be 9 Times More Efficient (November 9, 2023)
Half a Day of Training for a Few Hundred Dollars Yields Results Similar to Mainstream Large Models: An Open-Source Domain-Specific LLM Solution, Free for Commercial Use (September 25, 2023)
LuxProvide and HPC-AI Tech Join Forces to Revolutionize AI with Supercomputer and Colossal-AI Solutions (September 6, 2023)
70-Billion-Parameter LLaMA2 Model Training Accelerated by 195%, with Upgraded Best Practices for Foundation Models (September 4, 2023)
Colossal-AI Platform Made Its Debut at ICML, Ushering in a New Era of Large-Scale Model Training (August 8, 2023)
Join Colossal-AI at ICML 2023 and Try Our Newly Released Colossal-AI Platform (July 22, 2023)
65-Billion-Parameter Large Model Pretraining Accelerated by 38%; Best Practices for Building LLaMA-Like Base Models Open-Sourced (July 17, 2023)
Collaborative Innovation: Building an Arabic Chat-Based System in Partnership with Watad and HPC-AI Tech (June 23, 2023)
Colossal-AI and Intel® Partner to Deliver Cost-Efficient Open-Source Solution for Protein Folding Structure Prediction with Habana Gaudi Processors (March 29, 2023)