Daily Shaarli

All links of one day in a single page.

March 22, 2026

[2603.13875] GradMem: Learning to Write Context into Memory with Test-Time Gradient Descent

A solid, well-executed paper with a clean idea and good ablations, though limited in ambition by its small scale and synthetic-heavy evaluation. The core insight — that gradient-based memory writing with a meta-learned initialization beats forward-only writing — is believable and likely to hold at larger scale, though the computational tradeoff gets harder.
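To make the contrast concrete, here is a minimal sketch of the two write strategies on a toy linear associative memory. This is not the paper's method, just an illustration of the general idea: a "forward-only" write does a one-shot Hebbian outer-product update, while a "gradient" write starts from an initialization (meta-learned in the paper; a plain zero matrix here as a stand-in) and takes a few test-time gradient steps on a reconstruction loss. All names and shapes are my own assumptions.

```python
import numpy as np

rng = np.random.default_rng(0)
d, n, steps, lr = 32, 8, 50, 1.0

# Context tokens to write into memory (unit-norm columns of X).
X = rng.standard_normal((d, n))
X /= np.linalg.norm(X, axis=0)

def recon_loss(M):
    # Mean squared error of reading each written token back out: ||M x - x||^2.
    return float(np.mean(np.sum((M @ X - X) ** 2, axis=0)))

# Forward-only write: one-shot Hebbian outer-product association, no optimization.
M_hebb = X @ X.T / n
loss_hebb = recon_loss(M_hebb)

# Gradient write: test-time gradient descent on the reconstruction loss,
# starting from an init (meta-learned in the paper; zeros here as a stand-in).
M = np.zeros((d, d))
loss_init = recon_loss(M)
for _ in range(steps):
    grad = (2.0 / n) * (M @ X - X) @ X.T  # d/dM of the mean squared error
    M -= lr * grad
loss_gd = recon_loss(M)
```

On this toy problem the gradient write reconstructs the context far more accurately than the one-shot Hebbian write, at the cost of `steps` extra matrix multiplies per write — the same compute-versus-quality tradeoff the note flags at scale.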