Google proposes Titans: breaking through compute limits to expand context windows
Golden Finance reports that Google Research has released a new study, Titans. By introducing a new neural long-term memory module, a three-head collaborative architecture, and hardware-aware optimizations, Titans expands a large model's context window to 2 million tokens at only 1.8 times the compute. The work not only addresses the Transformer's compute bottleneck in long-context processing, but also uses a bionic design that mimics the hierarchical structure of human memory, reportedly achieving accurate inference over 2 million tokens for the first time.
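The article does not include code, but the kind of mechanism it describes — a neural memory module that keeps learning at test time, writing in new information when it is "surprised" and slowly forgetting old content — can be sketched in a few lines. The following NumPy toy is an illustration only, not Google's implementation; the memory form (a linear map), the hyperparameter names (`eta`, `theta`, `alpha`), and the helper `update_memory` are all assumptions.

```python
import numpy as np

# Toy sketch (assumption, not Google's code): a linear "long-term memory" M
# that maps keys to values and is updated online by a surprise-driven
# gradient step with momentum (S) and a forgetting/decay factor (alpha).

rng = np.random.default_rng(0)
d = 8                                  # embedding dimension
M = np.zeros((d, d))                   # memory: predicts value from key
S = np.zeros((d, d))                   # momentum over past "surprise"
eta, theta, alpha = 0.5, 0.05, 0.01    # momentum, step size, forgetting

def update_memory(M, S, k, v):
    # "surprise" = gradient of the recall loss ||M k - v||^2 w.r.t. M
    err = M @ k - v
    grad = np.outer(err, k)
    S = eta * S - theta * grad         # accumulate surprise with momentum
    M = (1 - alpha) * M + S            # decay (forget) old memory, then write
    return M, S

# a stream of (key, value) pairs stands in for a long token sequence;
# the toy target relation is v = reversed(k)
for _ in range(200):
    k = rng.normal(size=d)
    M, S = update_memory(M, S, k, k[::-1].copy())

# recall error on a fresh key shrinks as the memory adapts
k_test = rng.normal(size=d)
err_final = np.linalg.norm(M @ k_test - k_test[::-1])
print(err_final)
```

The decay term is what lets a fixed-size memory keep serving an effectively unbounded context: old associations fade unless the input stream keeps refreshing them.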