Hacker News
tkiolp4 | 31 days ago | on: Claude 4
But the LLM is going to do what its prompt (system prompt + user prompts) says. A human being can reject a task (even if that means losing their life).
LLMs cannot do anything other than follow the combination of prompts they are given.