… But when Moffatt later attempted to receive the discount, he learned that the chatbot had been wrong. Air Canada only awarded bereavement fares if the request had been submitted before a flight. The airline later argued the chatbot was a separate legal entity “responsible for its own actions” ….
How exactly do you go about making a chatbot a legal entity?
Yeah, but wouldn't it have been cool if this ruling had set a precedent for AI personhood? That's something you'd read in a humorously written sci-fi story.
This gets close to the law of principal and agent. Are "intelligent agents" agents in the legal sense? That is, is the principal responsible for their actions?
That's usually the case for employees, unless the employee clearly acted outside the scope of their employment. AI systems operated on behalf of a business should be held to the same standard.
There's an economic theory of accounting for mistakes of agents.[1] There's a cost of mistakes, and a cost of decreasing the error rate. So it's something that can be priced into the cost of running the business.
I was thinking about this. Surely it must be made to pay back the money lost.
One option would be to always have it include advertisements for competitors, or, say, for porn or dating sites, in each discussion or with each message. Seems reasonable enough to implement and deliver.
Always deliver some message critical of Air Canada...
There are lots of creative ways it could communicate with other customers, thus paying for the wrongs committed...