> For every line of code they would have you predict all the socio-economic consequences
This is like saying that for every step you take, you would have to predict whether it will kill you. That is not how humans generally function. We make simplifying assumptions at a higher level, asking something like "if I go for a walk at night in the jungle, could that kill me?", and then act on the overall planned trip rather than evaluating each individual step.
The same logic applies to shon's argument: if you write code for a tactical nuke manufacturer, ask whether that work contributes to harm in the world, and decide whether to work at that company at all, rather than analyzing whether the "int count = 0" you just wrote brings the company closer to its (likely) destructive goals.