Mixture Of Experts
What is a mixture of experts?
An AI model architecture that activates only a subset of the model’s parameters for any given input, making computation more efficient than running the full model. It is used by models such as DeepSeek-V3.
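The core mechanism can be sketched in a few lines: a small "router" scores every expert for an input token, and only the top-k experts actually run. The dimensions, expert count, and random weights below are illustrative assumptions, not taken from any real model.

```python
import numpy as np

rng = np.random.default_rng(0)

# Hypothetical dimensions for illustration only.
d_model, n_experts, top_k = 8, 4, 2

# Each "expert" is a small feed-forward weight matrix.
experts = [rng.standard_normal((d_model, d_model)) for _ in range(n_experts)]
# The router assigns each expert a score for a given input token.
router_w = rng.standard_normal((d_model, n_experts))

def moe_forward(x):
    """Route a token vector x through only its top-k experts."""
    logits = x @ router_w                      # one score per expert
    top = np.argsort(logits)[-top_k:]          # indices of the k best-scoring experts
    # Softmax over just the selected experts' scores.
    w = np.exp(logits[top] - logits[top].max())
    w /= w.sum()
    # Only top_k of n_experts experts compute; the rest stay idle,
    # which is where the efficiency gain comes from.
    return sum(wi * (x @ experts[i]) for wi, i in zip(w, top))

token = rng.standard_normal(d_model)
out = moe_forward(token)
print(out.shape)
```

Even though all experts' parameters exist in memory, each token's forward pass touches only 2 of the 4 expert matrices here, so the compute per token scales with top_k rather than n_experts.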
Related Briefs
Apr 12 2024
Education
Regulation
AI
3 Shifts Edition (Apr 12 2024): Countries race to become an AI power, AI’s inroads into education, SEC cracks down on “shadow trading”
Aug 16 2024
Defensetech
Electric Vehicles
Social Commerce
3 Shifts Edition (Aug 16 2024): Drone fleets and the next stage of warfare, Buying in-app from social ads, The resurgence in plug-in hybrids
Sep 6 2024
AI
Healthcare
Startups
3 Shifts Edition (Sep 6 2024): The growth of “hospital at home”, Startups are getting smaller, China's investment push in AI
Jan 17 2025
Defense
Insurance
AI
3 Shifts Edition (Jan 17 2025): What happens to US home insurance next, China’s notable progress in AI models, NATO countries’ defense spending will rise
Jan 10 2025
Social Media
AI
Autonomous Vehicles
3 Shifts Edition (Jan 10 2025): Social media’s shift away from moderation, GPT-5’s troubles and AI data, Uber steers toward autonomous driving
Nov 1 2024
Generative AI
Coding
Space
3 Shifts Edition (Nov 1 2024): Defining open-source AI, The changing US space industry, The expansions of GitHub’s Copilot