Tag: long-context LLMs
Rotary Position Embeddings (RoPE) vs ALiBi: Which LLM Positioning Method Wins?
Compare the RoPE and ALiBi positional encoding methods in LLMs. Learn how rotation matrices and linear attention biases address the context-window problem in models like Llama.