The Impact of Positional Encoding on Length Generalization in Transformers • arXiv:2305.19466 • Published May 31, 2023
Transformers Can Do Arithmetic with the Right Embeddings • arXiv:2405.17399 • Published May 27, 2024
Teaching Transformers Causal Reasoning through Axiomatic Training • arXiv:2407.07612 • Published Jul 10, 2024
Round and Round We Go! What Makes Rotary Positional Encodings Useful? • arXiv:2410.06205 • Published Oct 8, 2024
Byte Latent Transformer: Patches Scale Better Than Tokens • arXiv:2412.09871 • Published Dec 2024