Multi-Head Latent Attention (MLA)
What is multi-head latent attention?
An attention mechanism introduced by DeepSeek (first in its DeepSeek-V2 model) that compresses the keys and values of multi-head attention into a shared low-rank latent vector. Reconstructing keys and values from this compact latent shrinks the KV cache, cutting memory costs at inference time while preserving model quality.
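In standard multi-head attention, the inference-time KV cache stores full per-head keys and values for every past token; MLA caches only one small latent vector per token and reconstructs keys and values from it on the fly. Below is a minimal PyTorch sketch of that idea. The class and layer names (w_dkv, w_uk, w_uv) and the dimensions are illustrative assumptions, not DeepSeek's actual code; the production design adds further details such as decoupled rotary-embedding heads and query compression.

```python
import torch
import torch.nn as nn
import torch.nn.functional as F

class MultiHeadLatentAttention(nn.Module):
    """Minimal sketch of multi-head latent attention (illustrative only)."""

    def __init__(self, d_model=512, n_heads=8, d_latent=64):
        super().__init__()
        self.n_heads = n_heads
        self.d_head = d_model // n_heads
        self.w_q = nn.Linear(d_model, d_model, bias=False)
        # Down-project hidden states to a small shared latent; at inference,
        # this latent is cached per token instead of full keys and values.
        self.w_dkv = nn.Linear(d_model, d_latent, bias=False)
        # Up-project the cached latent back to per-head keys and values.
        self.w_uk = nn.Linear(d_latent, d_model, bias=False)
        self.w_uv = nn.Linear(d_latent, d_model, bias=False)
        self.w_o = nn.Linear(d_model, d_model, bias=False)

    def forward(self, x):
        b, t, _ = x.shape
        q = self.w_q(x).view(b, t, self.n_heads, self.d_head).transpose(1, 2)
        c_kv = self.w_dkv(x)  # (b, t, d_latent): the compressed KV cache
        k = self.w_uk(c_kv).view(b, t, self.n_heads, self.d_head).transpose(1, 2)
        v = self.w_uv(c_kv).view(b, t, self.n_heads, self.d_head).transpose(1, 2)
        out = F.scaled_dot_product_attention(q, k, v, is_causal=True)
        return self.w_o(out.transpose(1, 2).reshape(b, t, -1))
```

With these example sizes, the cache holds 64 values per token per layer instead of the 1,024 (keys plus values at d_model=512) a standard attention layer would store, a 16x reduction.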
Related Briefs
Jan 17 2025 · Defense, Insurance, AI
3 Shifts Edition (Jan 17 2025): What happens to US home insurance next, China’s notable progress in AI models, NATO countries’ defense spending will rise
Feb 7 2025 · Economy, Regulation, AI
3 Shifts Edition (Feb 7 2025): Tariffs and the de minimis closure, The end of the CFPB as we know it, Distillation and AI economics