Hacker News

This is a logical fallacy. A human is not an algorithm. We do not have to extend rights regarding novel invention to an algorithm to protect them for people.



Differentiating between a human and a machine simply because one "is not an algorithm" doesn't make a lot of sense. If it were true, people would very easily game it, by using algorithms to automate the most trivial parts of copying someone's work.

Ultimately the algorithm is automating something a human could do. There is a lot of gray area to copyright law, but you can't get around that simply by offloading to an algorithm.


> Differentiating between a human and a machine simply because one "is not an algorithm" doesn't make a lot of sense.

Uh? So if I design a self driving car which kills someone, it's the car that goes to jail?

Legal precedent seems to indicate this is not the case at all. Humans and machines are treated differently, simply because humans aren't machines and vice versa.


"So if I design a self driving car which kills someone, it's the car that goes to jail?"

No, but the manufacturer will typically be held responsible. If the manufacturer intentionally designed it to kill people, someone could certainly be charged with murder. More likely it was a software defect, and then it is a matter of financial liability. (In between is a software defect that someone knew about and chose not to fix.)

This isn't a new issue. If you design a car and the brakes fail due to a design issue, and that issue could have been prevented by more competent design... someone might indeed go to jail, but more likely it would be the corporation paying out a large amount of money.

It could even be a mixture of the manufacturer's fault and the driver. Maybe the brakes failed but the driver was speeding and being reckless and slammed on the brakes with no time to spare. Had it not been a faulty design, no one would have gotten hurt, but also if the driver had been competent and responsible, no one would have gotten hurt.

But with self-driving cars, once they no longer need a "safety driver", it certainly won't typically be the fault of the car's human occupant to any degree, since they are simply a passenger.


Last I checked this was very much a gray area. I’d expect at least a long investigation into the amount of work and testing put into validating that the self-driving algorithm operates inside reasonable standards of safe driving. In fact, I expect that, as the industry progresses, the tests for minimal acceptable self-driving safety will become more and more standardised.

That doesn’t answer the question of who’s responsible when an accident happens and someone gets hurt or dies - but then, there was a time when animals would be judged and sentenced if they committed a crime under human law. That practice is no longer deemed valid, maybe we need to agree that, if the self-driving car was built with reasonable care, accidents can still happen and it’s no one’s fault.


This makes no sense.

If you build a machine and sell it, and this machine kills someone even when operated correctly, you'll have a problem. A big problem…

AI is a machine.

So the case is actually quite simple.

Regarding the sibling's Uber example: there the argument was that the machine was not operated correctly, so it is not a comparable case.


https://usa.streetsblog.org/2019/03/08/uber-got-off-the-hook...

Well the "long" investigation let uber off the hook despite disabling emergency breaking and put the driver in jail.

Which seems to put all the blame on the user and nothing on the makers of the AI.


>by using algorithms to automate the most trivial parts of copying someone's work

That's basically what copilot is...?


Ethical or not, Copilot is sophisticated software, which doesn't qualify as "trivial" by my definition.


> you can't get around that simply by offloading to an algorithm.

You can ...?

By simply saying that existing fair-use rights are limited to humans, and not extended to for-profit companies building for-profit products.


First of all, that isn't simple. How do you determine what is done by humans? If the human is using a computer and copying and pasting, does that still qualify?

No matter where you draw the line between "done by computers" and "done by a human simply using a computer as a tool," there will always be a lot of gray area.

Also, if I spend a year creating my masterpiece, and some kid releases a copy of it for free and claims that that's ok just because it's "not for profit," there is still a problem.


> Differentiating between a human and a machine simply because one "is not an algorithm" doesn't make a lot of sense.

it makes a lot of sense, for that reason and a lot of others

people can create algorithms that do whatever they want, including copyright infringement and outright criminality, but algorithms can't create people or want anything for themselves



