Hacker News
jjcm | 27 days ago | on: Qwen3-Next
As plenty of others have mentioned here, if inference were 100x cheaper, I would run 200x inference.
There are so many things you can do with long running, continuous inference.
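The claim above is a demand-elasticity argument: if price falls 100x but usage rises 200x, total spend goes up, not down. A minimal sketch of that arithmetic, using arbitrary illustrative numbers (none of these figures come from the thread):

```python
# Illustrative elasticity arithmetic: a 100x price drop met by 200x more usage.
# All numbers are hypothetical, chosen only to make the ratio visible.
price_per_call = 1.00        # current cost per inference call (arbitrary units)
calls_today = 1_000          # current usage

new_price = price_per_call / 100  # "inference 100x cheaper"
new_calls = calls_today * 200     # "I would run 200x inference"

spend_today = price_per_call * calls_today
spend_later = new_price * new_calls

print(spend_today)  # 1000.0
print(spend_later)  # 2000.0 -- total spend doubles despite the price collapse
```

So under these assumptions the provider's revenue per user doubles even as per-call price craters, which is why cheaper inference need not shrink the market.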
sipjca | 27 days ago
But what if you don't need to run it in the cloud?
ukuina | 27 days ago
You will ALWAYS want to use the absolute best model, because your time is more valuable than the machine's. If the machine gets faster or more capable, the value of your time jumps proportionally.
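The "your time is more valuable than the machine's" argument is a cost comparison: savings from a cheaper model are dwarfed by the value of the extra human time spent fixing its output. A sketch with hypothetical numbers (hourly rate, per-task costs, and cleanup time are all assumptions, not figures from the thread):

```python
# Hypothetical cost comparison: best model vs. cheaper-but-weaker model.
# Every number below is an assumption for illustration only.
hourly_rate = 100.0        # value of an hour of your time (arbitrary units)
best_model_cost = 0.50     # machine cost per task with the best model
cheap_model_cost = 0.05    # machine cost per task with a weaker model
extra_minutes_fixing = 15  # extra human time correcting the weaker model's output

machine_savings = best_model_cost - cheap_model_cost     # saved per task
human_cost = hourly_rate * extra_minutes_fixing / 60.0   # lost per task

print(human_cost > machine_savings)  # True: the "cheap" model is more expensive overall
```

Under these assumptions the weaker model saves cents of compute while burning tens of units of human time per task, which is the core of the comment's claim.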