Well, okay, but you're running into the same problem with memory, and I could invoke the space variant of the theorem.
You could say that an infinite family of GPT-like models with increasing context size collectively form a Turing-complete computational model and I would have no objections, but you're stretching the definition a bit...
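To make the "infinite family" framing concrete, here's a minimal sketch (everything in it is hypothetical, not from the discussion above): each member of the family is a finite-memory machine, analogous to a model with a fixed context size, and no single member is Turing complete, but every halting computation fits inside *some* member of the family, much like non-uniform circuit families.

```python
# Sketch: each "model" in the family has a fixed memory bound n
# (analogous to a fixed context size). Any single member is just a
# finite-state machine, but the family {M_n : n = 1, 2, ...}
# collectively covers every halting computation, since each one
# only needs some finite amount of tape.

def run_bounded(program, tape, n, max_steps=10_000):
    """Run a simple tape machine with memory capped at n cells.

    Returns the final tape on success, or None if the computation
    needs more than n cells (i.e. this family member is too small).
    """
    tape = list(tape)[:n] + [0] * (n - len(tape))
    head, state = 0, "start"
    for _ in range(max_steps):
        if state == "halt":
            return tape
        state, write, move = program[(state, tape[head])]
        tape[head] = write
        head += move
        if not 0 <= head < n:       # ran off the fixed "context"
            return None
    return None

# A toy program: walk right, rewriting 1s as 2s until a 0 is found.
program = {
    ("start", 1): ("start", 2, +1),
    ("start", 0): ("halt", 0, 0),
}

# No single bound n works for all inputs, but for each input some
# member of the family succeeds:
assert run_bounded(program, [1, 1, 1], n=2) is None               # too small
assert run_bounded(program, [1, 1, 1], n=8) == [2, 2, 2, 0, 0, 0, 0, 0]
```

The point of the sketch is exactly the stretch being objected to: Turing completeness is a property of the whole family, not of any one bounded machine you could actually instantiate.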