
Well, okay, but you're running into the same problem with memory, and I could invoke the space variant of the theorem.

You could say that an infinite family of GPT-like models with increasing context size collectively form a Turing-complete computational model and I would have no objections, but you're stretching the definition a bit...



In the real world nothing has infinite memory, so no physical computer would be Turing complete. That requirement is therefore conventionally ignored: what matters is that memory is unbounded in the model, not that any real machine actually has infinitely much of it.
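The point about unboundedness rather than actual infinity can be made concrete with a toy Turing-machine simulator whose tape grows only as cells are touched. The machine below is the standard 2-state busy beaver; the simulator itself is just an illustrative sketch, not anything from the thread:

```python
from collections import defaultdict

# 2-state busy beaver: (state, symbol) -> (write, move, next_state)
RULES = {
    ("A", 0): (1, +1, "B"),
    ("A", 1): (1, -1, "B"),
    ("B", 0): (1, -1, "A"),
    ("B", 1): (1, +1, "HALT"),
}

def run(rules, start="A"):
    # The "unbounded" tape: a defaultdict that allocates cells lazily,
    # so only finitely much memory is ever used on any halting run.
    tape = defaultdict(int)
    head, state, steps = 0, start, 0
    while state != "HALT":
        write, move, state = rules[(state, tape[head])]
        tape[head] = write
        head += move
        steps += 1
    return steps, sum(tape.values())

steps, ones = run(RULES)
print(steps, ones)  # 6 steps, 4 ones written
```

Any halting computation touches only a finite tape prefix, so a real machine with finite but extensible memory simulates it exactly; this is the sense in which the infinite-memory requirement is waived, and it is also what a fixed, hard-capped context window fails to provide.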



