🚀 LatentMoE
LatentMoE is a neural-network architecture that improves Mixture-of-Experts (MoE) models by projecting token activations into a latent space, where routing and expert computation take place, reducing routing overhead while allowing expert capacity to scale more cheaply.
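To make the idea concrete, here is a minimal NumPy sketch of latent-space MoE routing. All names (`W_down`, `W_up`, `W_gate`), the top-k gating scheme, and the latent dimension are illustrative assumptions, not the actual LatentMoE implementation: tokens are projected down to a latent space, a router picks top-k experts there, the experts operate on the latent vectors, and the result is projected back up.

```python
import numpy as np

rng = np.random.default_rng(0)

d_model, d_latent, n_experts, top_k = 16, 4, 4, 2
n_tokens = 8

# Hypothetical parameters, randomly initialized for illustration.
W_down = rng.normal(size=(d_model, d_latent)) / np.sqrt(d_model)   # project to latent space
W_up = rng.normal(size=(d_latent, d_model)) / np.sqrt(d_latent)    # project back to model dim
W_gate = rng.normal(size=(d_latent, n_experts))                    # router acts on latents
experts = rng.normal(size=(n_experts, d_latent, d_latent)) / np.sqrt(d_latent)

def softmax(x, axis=-1):
    e = np.exp(x - x.max(axis=axis, keepdims=True))
    return e / e.sum(axis=axis, keepdims=True)

def latent_moe(x):
    z = x @ W_down                         # (n_tokens, d_latent): compute happens here
    logits = z @ W_gate                    # router scores per expert, from latents
    top = np.argsort(logits, axis=-1)[:, -top_k:]          # top-k expert ids per token
    gates = softmax(np.take_along_axis(logits, top, -1))   # renormalized gate weights
    out = np.zeros_like(z)
    for t in range(z.shape[0]):            # dispatch each token to its chosen experts
        for slot in range(top_k):
            e = top[t, slot]
            out[t] += gates[t, slot] * (z[t] @ experts[e])
    return out @ W_up                      # project back to the model dimension

x = rng.normal(size=(n_tokens, d_model))
y = latent_moe(x)
print(y.shape)  # (8, 16)
```

Because both the router and the experts see only `d_latent`-dimensional vectors, per-token routing and expert FLOPs shrink relative to operating directly in `d_model`, which is the overhead/capacity trade the description refers to.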
Infrastructure & MLOps