Sparse Mixture-of-Experts
What is a sparse mixture-of-experts?
A sparse mixture-of-experts (MoE) is a neural-network architecture in which a layer is made up of many "expert" subnetworks and a learned routing function sends each input token to only a small subset of them (often just the top 2 of dozens or hundreds of experts). Because only the selected experts are actually computed, the model can hold a very large total number of parameters while using only a fraction of them on any given token, keeping training and inference costs well below those of a comparably sized dense model. DeepSeek used a sparse MoE design in its models, including DeepSeek-V3, which is one reason it was able to train competitive large language models at comparatively low compute cost.
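For a more concrete picture, the sketch below shows the core routing idea in plain Python/NumPy: a router scores each token against every expert, only the top-k experts run for that token, and their outputs are combined using the router's weights. The class name, layer sizes, and single-matrix "experts" are illustrative assumptions for this sketch, not DeepSeek's actual implementation.

```python
import numpy as np

def softmax(x):
    e = np.exp(x - x.max(axis=-1, keepdims=True))
    return e / e.sum(axis=-1, keepdims=True)

class SparseMoELayer:
    """Toy sparse mixture-of-experts layer: a router picks the top-k
    experts per token, and only those experts are computed."""
    def __init__(self, d_model=16, n_experts=8, top_k=2, seed=0):
        rng = np.random.default_rng(seed)
        self.top_k = top_k
        # Router: produces one score per expert for each token.
        self.router_w = rng.normal(size=(d_model, n_experts)) * 0.02
        # Each expert is a small feed-forward block (here just one matrix).
        self.experts = [rng.normal(size=(d_model, d_model)) * 0.02
                        for _ in range(n_experts)]

    def forward(self, x):
        # x: (n_tokens, d_model)
        scores = x @ self.router_w                    # (n_tokens, n_experts)
        probs = softmax(scores)
        # Keep only the top-k experts per token; the rest are skipped entirely.
        topk_idx = np.argsort(probs, axis=-1)[:, -self.top_k:]
        out = np.zeros_like(x)
        for t in range(x.shape[0]):
            chosen = topk_idx[t]
            weights = probs[t, chosen]
            weights = weights / weights.sum()          # renormalize over chosen experts
            for w, e in zip(weights, chosen):
                out[t] += w * (x[t] @ self.experts[e])  # only chosen experts run
        return out

layer = SparseMoELayer()
tokens = np.random.default_rng(1).normal(size=(4, 16))
print(layer.forward(tokens).shape)  # (4, 16)
```

In real systems the experts are full feed-forward blocks sharded across devices, routing is batched, and training typically adds an auxiliary load-balancing objective so tokens spread evenly across experts; the sketch above only illustrates the top-k selection that makes the layer "sparse."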
Related Briefs
Jan 17 2025 · Defense · Insurance · AI
3 Shifts Edition (Jan 17 2025): What happens to US home insurance next, China’s notable progress in AI models, NATO countries’ defense spending will rise
Aug 2 2024 · Diagnostics · Creator Economy · AI
3 Shifts Edition (Aug 2 2024): The professionalization of influencer marketing, Open AI models are here to stay, Small-sample blood tests have arrived
Feb 7 2025 · Economy · Regulation · AI
3 Shifts Edition (Feb 7 2025): Tariffs and the de minimis closure, The end of the CFPB as we know it, Distillation and AI economics
Oct 25 2024 · Pharma · Generative AI · Life Sciences
3 Shifts Edition (Oct 25 2024): Retail giants’ same-day drug delivery, The 1st successful legal psychedelic, The recent slate of capable LLMs