Hacker News
pests | 4 months ago | on: Facts will not save you – AI, history and Soviet s...
Technically, modern LLMs are handicapped on translation tasks compared to the original transformer architecture. The original transformer's encoder got to see future context as well as past tokens.
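The masking difference the comment alludes to can be sketched as follows. This is a toy NumPy illustration under my own assumptions, not any model's actual implementation: a decoder-only (causal) model masks out future positions before the softmax, while a bidirectional encoder attends over the whole sequence. The helper name `attention_weights` is hypothetical.

```python
import numpy as np

def attention_weights(scores, causal):
    """Toy single-head attention softmax over raw scores.

    scores: (seq_len, seq_len) array of raw attention scores.
    causal=True  -> decoder-style: position i only sees tokens <= i.
    causal=False -> encoder-style: every position sees the full sequence.
    """
    if causal:
        # Mask the strict upper triangle (future tokens) with -inf.
        future = np.triu(np.ones_like(scores, dtype=bool), k=1)
        scores = np.where(future, -np.inf, scores)
    # Numerically stable softmax over each row.
    e = np.exp(scores - scores.max(axis=-1, keepdims=True))
    return e / e.sum(axis=-1, keepdims=True)

scores = np.zeros((4, 4))  # uniform scores, just to show the mask's effect
causal = attention_weights(scores, causal=True)
full = attention_weights(scores, causal=False)
print(causal[0])  # first token attends only to itself: [1. 0. 0. 0.]
print(full[0])    # first token attends everywhere: [0.25 0.25 0.25 0.25]
```

With uniform scores, the causal rows show the triangular pattern (token 0 sees only itself, token 1 splits attention over two tokens, and so on), while the bidirectional rows are uniform over all four positions — which is why an encoder can use a word's right-hand context when translating it.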
vouaobrasil | 4 months ago
Okay, but I'm not really concerned with the current state of the art of any specific technology; I'm concerned with what the state of the art will be in 20 years.