At Eastercon, one of the few items I managed to get to, between wrangling offspring, was the "Ethics of AI" panel.
It was an interesting item, if a little "bitty" – I get the impression that there are so many unresolved issues that a single hour’s discussion couldn’t devote any significant period to any of them, so it mostly just bounced from one issue to the next. However, I was struck by how many of the issues are applicable, right now, to "dumb" software, never mind anything approaching an AI.
One of the topics (briefly) discussed was the issue of legal liability for the actions of a piece of software. I mentioned the very common software licence clause denying all liability for anything a program does.
majorclanger quickly pointed out that such clauses are unlikely to survive any significant contact with the UK legal system (I don't recall the details he gave of the act in question – something about unenforceable/unreasonable contracts?). There are presumably similar laws in other jurisdictions.
In some ways (i.e. as a user of software) I think that is a good thing. If a company releases software that does damage somewhere, then there should be consequences.
On the other hand, as a professional programmer, I'm a little more uneasy. IIRC, one of Alan Turing's great contributions to computer science was a proof (the halting problem) that it's impossible, in general, to determine exactly what an arbitrary lump of software is going to actually do before you run it. For trivial programs, you can make deductions via human inspection, but that fails utterly for even relatively small lumps of code.
For any real-world-useful software, it's basically impossible to prove that it is bug free. With care, you can probably assert that it probably has no major bugs. For huge software projects (say, an operating system[1]) even getting that far can require carefully-co-ordinated person-centuries or person-millennia of effort, backed up by even larger quantities of automated computational grunt work.
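To make the "impossible to prove" point concrete, here's a tiny illustrative Python loop (my example, not from the panel). Whether it terminates for *every* positive input is the Collatz conjecture, which is still an open problem despite the loop body being three lines long:

```python
def collatz_steps(n):
    """Count iterations of the 3n+1 map until n reaches 1.

    Nobody has proven this loop terminates for all positive n --
    that's the Collatz conjecture, unresolved since 1937.
    """
    steps = 0
    while n != 1:
        # Halve even numbers; map odd n to 3n + 1.
        n = 3 * n + 1 if n % 2 else n // 2
        steps += 1
    return steps

collatz_steps(27)  # 111 steps before this innocuous-looking loop exits
```

If three lines of arithmetic can defeat eighty years of mathematicians, "prove what this million-line system does" is clearly a non-starter.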
(Things get murkier still if the software in question has eleventy-billion little config switches that the user can fiddle with, some of which are labelled "if you get this wrong, very bad things will happen")
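A rough back-of-envelope sketch of why those switches hurt (illustrative numbers, not any particular program's): independent boolean options multiply, so the configuration space explodes long before "eleventy-billion".

```python
# Each independent on/off switch doubles the number of distinct
# configurations, so exhaustive testing dies at 2**n very quickly.
def config_count(n_switches):
    return 2 ** n_switches

config_count(40)  # 1,099,511,627,776 -- over a trillion configurations
```

At one test per millisecond, forty boolean switches is already ~35 years of testing, and real software has far more than forty.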
Surely there has to be some sort of cut-off, where a software company can say "look, we did everything reasonably possible to ensure that the software was good, we can’t be held liable for a one-in-a-trillion bug that only kicks in when you make a typo at 12:42pm on a Tuesday in March when the wind is blowing from the south-east"? There are industry standards and quality standards and acceptance testing and so on. Presumably some of those things are actually recognised in law as a defence for the software producer?
So, how many liability issues have actually made it to court? Certainly in my professional experience, screw-ups with major real-world consequences have mostly been resolved via negotiated financial settlements. Has anyone ever tried to seriously lean on a "no liability" licence clause, and if so, what happened?
[1] Scientific American once printed an article (probably about a decade or so back) which argued, totally seriously and very persuasively (yeah, I'm biased) that Windows 2000 was one of the most complex artifacts ever built. Yes, they included things like Airliners and Moon Rockets. Big software is complicated.
no subject
Date: 2012-04-14 21:09 (UTC)

Most of what gets presented as AI in SF really ought to just drop the "A" bit, as it's blatantly just intelligence.
Edit: Oh, and I thoroughly second the "big software is complicated" thing.
no subject
Date: 2012-04-15 15:02 (UTC)

no subject
Date: 2012-04-15 21:00 (UTC)

no subject
Date: 2012-04-14 21:34 (UTC)

Assuming that all that is competently done, the plaintiff's best approach should be testing the definition of reasonable precautions, and whether the company took those precautions. Who wins the case will help define where the reasonable line is. Sucks to be the guy who has to defend against that; the best way to avoid being that guy is to make sure your software safety auditing is better than everyone else's.
no subject
Date: 2012-04-15 05:21 (UTC)

no subject
Date: 2012-04-15 15:34 (UTC)

I suspect that a more popular answer is "make sure your software is used in an environment where the worst-case failure mode isn't that bad."
Which sorta loops back to the licencing issue again - I'm pretty sure that a lot of software has licence clauses like "don't use this software for any medical purpose" too...
Edited to Add: I mean "more popular" in the sense that "more people do this". Which is bleedin' obvious, now that I think about it, because most software probably exists in a reasonably "safe" environment.
no subject
Date: 2012-04-14 23:09 (UTC)

Or worse, when they are NOT so labeled.
Surely there has to be some sort of cut-off, where a software company can say "look, we did everything reasonably possible to ensure that the software was good, we can’t be held liable for a one-in-a-trillion bug that only kicks in when you make a typo at 12:42pm on a Tuesday in March when the wind is blowing from the south-east"?
But the Therac-25 bug was exactly this sort of bug, and it killed people. So what is the appropriate cut-off?
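For reference, here's a hugely simplified Python sketch of one of the documented Therac-25 failure modes (the one-byte shared flag that was incremented rather than set, as described in the published postmortem). The names and structure are mine and purely illustrative, not the actual machine code:

```python
# Sketch of a Therac-25-style bug: a one-byte flag meant "check the
# collimator position if nonzero", but the code *incremented* it on
# every pass instead of setting it to 1. Every 256th pass the byte
# wrapped to zero and the safety check was silently skipped.
def run_passes(n_passes):
    class3 = 0          # one-byte flag (hence the modulo-256 wrap)
    skipped_checks = 0
    for _ in range(n_passes):
        class3 = (class3 + 1) % 256   # the bug: increment, not assign
        if class3 == 0:               # zero meant "no check needed"
            skipped_checks += 1
    return skipped_checks

run_passes(256)  # 1 -- one pass in every 256 skips the interlock
```

A genuinely one-in-256 bug, invisible in ordinary testing, with lethal consequences. Which is exactly the poster's question: where does "we did everything reasonable" end?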
no subject
Date: 2012-04-15 10:47 (UTC)

no subject
Date: 2012-04-15 15:23 (UTC)

Indeed, that's kinda what I was asking. As
For instance, in my day job, I have to worry about PCI-DSS and UK insurance regulation (and, of course, customers so clueless that they give us requirements that breach those regs...). So, if one of our systems gets hacked and loses $LoadsaMoney, does "well, we passed our PCI-DSS audits, we were just unlucky enough to get ownz0red by a zero-day exploit in $PopularHTTPServer" count as any sort of defence?
no subject
Date: 2012-04-15 14:44 (UTC)

no subject
Date: 2012-04-15 15:28 (UTC)

I suspect that, at this point (given that such cars now pretty much exist, albeit probably not in an economically viable form), it's going to take longer to work out exactly what the legal/liability issues are than to actually make them into mass-market products.
no subject
Date: 2012-04-16 19:21 (UTC)

What I was citing was the Unfair Contract Terms Act 1977, which says that under English law you cannot contractually exclude liability for causing personal injury or death. You may be able to exclude liability for damage or monetary losses, but this will depend on the circumstances. There are entire chapters of books on IT law covering this, so it would be hard for me to provide a brief summary. However, some of the important factors will include:
- whether the party the exclusion clause operates against is a consumer rather than acting in the course of business;
- whether the exclusion clause was negotiated or was imposed as a standard condition;
- whether the clause excludes liability or just limits it;
- how closely connected to any fault the losses are that are excluded.
no subject
Date: 2012-04-16 19:52 (UTC)

(Sorry we didn't get to discuss this at the time - between me chasing offspring and you helping to run a con bid, I think we only came within 10 feet of each other for about 2 minutes in the whole con...)