
Theory to show LEAP is Turing Complete? #13

Open
mtanghu opened this issue Sep 3, 2022 · 0 comments
Labels
enhancement New feature or request

Comments

mtanghu (Owner) commented Sep 3, 2022

An attention mechanism is considered Turing complete if it can model any kind of sequence computation. Global LEAP, with its $O(1)$ path length and multiple heads, should be enough to prove that LEAP can replicate any Turing-computable function simply by performing the same steps as a Turing machine (using ideas and assumptions similar to those of Pérez, J., Marinković, J., & Barceló, P. (2019), namely arbitrary precision, unbounded recursive steps, and hard attention). The local/windowed attention would then simply allow for more parallel computation when only local computations are needed.
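To make the hard-attention assumption concrete, here is a minimal numpy sketch contrasting ordinary soft (softmax) attention with the hard attention used in Pérez et al. (2019)-style constructions, which attends uniformly to only the highest-scoring positions. The function names are hypothetical illustrations, not LEAP's actual implementation.

```python
import numpy as np

def soft_attention(scores, values):
    # Standard attention: softmax-weighted average of all value vectors.
    weights = np.exp(scores - scores.max())
    weights /= weights.sum()
    return weights @ values

def hard_attention(scores, values):
    # Hard attention (as assumed in the theoretical proof): average
    # only the values at argmax-scoring positions, ignoring the rest.
    mask = scores == scores.max()
    return values[mask].mean(axis=0)

scores = np.array([0.1, 2.0, 2.0, -1.0])
values = np.array([[1.0, 0.0], [0.0, 1.0], [2.0, 1.0], [5.0, 5.0]])
print(hard_attention(scores, values))  # → [1. 1.] (mean of rows 1 and 2)
```

Hard attention lets the construction read out exact symbols from selected positions (as a Turing machine head would), whereas softmax always mixes in every position with nonzero weight.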

If this can be shown, one-to-one comparisons with GPT2 may be of less importance, since there would be theory backing up the expressiveness of the architecture/attention mechanism.

@mtanghu mtanghu added enhancement New feature or request help wanted Extra attention is needed labels Sep 3, 2022
@mtanghu mtanghu removed the help wanted Extra attention is needed label Dec 31, 2022