This AI Was Trained Only on Pre-1930 Text. We Asked It About Hitler, Stocks, and the Future
By Jose Antonio Lanz
Published on April 29, 2026.
A research team has developed Talkie-1930, a 13-billion-parameter open-weight model trained only on text published before January 1, 1931. The model has no knowledge of the internet, the civil rights movement, or the Cold War, and no concept of modern technology, medicine, crypto, AI, memes, or internet culture.

The project was led by a non-profit team headed by Nick Levine, David Duvenaud, and Alec Radford, with compute support from Anthropic. The team is now using the model to assess how it responds to historical events that occurred after its cutoff, a period for which available text peaks around the 1950s–60s. They aim to scale the corpus to over a trillion tokens by summer 2026, which they estimate could eventually yield a model similar in capability to the original ChatGPT.