Ask HN: Is anybody building an alternative transformer?
Various alternative architectures to the Transformer are being explored. Mamba shows particular promise, offering faster inference and lower compute cost while reportedly performing on par with Transformers at scales up to 7B parameters. Researchers are also investigating recurrent architectures, state-space models, and efficient attention mechanisms, while debating the future direction of foundation models.
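For context on why state-space models like Mamba promise cheaper inference, here is a minimal toy sketch of a linear state-space recurrence: each step updates a fixed-size hidden state, so generation costs O(sequence length) with constant memory per token, versus self-attention's quadratic cost. This is not Mamba's actual implementation (which adds input-dependent "selective" parameters and a hardware-aware parallel scan); the matrices and shapes below are illustrative assumptions.

```python
import numpy as np

def ssm_scan(x, A, B, C):
    """Toy linear state-space recurrence:
    h_t = A @ h_{t-1} + B @ x_t,  y_t = C @ h_t.
    One fixed-size state update per token; no attention over past tokens.
    """
    h = np.zeros(A.shape[0])
    ys = []
    for x_t in x:                 # one step per token
        h = A @ h + B @ x_t       # update hidden state
        ys.append(C @ h)          # read out this step's output
    return np.stack(ys)

# Example: 10 steps of 4-dim input, 8-dim state, 2-dim output (arbitrary sizes).
rng = np.random.default_rng(0)
x = rng.normal(size=(10, 4))
A = 0.9 * np.eye(8)               # stable state transition
B = rng.normal(size=(8, 4))
C = rng.normal(size=(2, 8))
print(ssm_scan(x, A, B, C).shape)  # (10, 2)
```

The key design point is that the per-step cost and memory do not grow with context length, which is where the inference savings over attention come from.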