Thursday, February 27, 2025

Software Liability: US vs. EU

I have written before about the double-edged sword of software vendors' ability to disclaim liability for the performance of their products. Six years ago I wrote The Internet of Torts about software embedded in the physical objects of the Internet of Things. Four years ago I wrote about Liability In The Software Supply Chain.

Last October, Tom Uren wrote The EU Throws a Hand Grenade on Software Liability:
The EU and U.S. are taking very different approaches to the introduction of liability for software products. While the U.S. kicks the can down the road, the EU is rolling a hand grenade down it to see what happens.
It is past time to catch up on this issue, so follow me below the fold.

USA

In March 2020 the Cyberspace Solarium Commission, a "bipartisan team of lawmakers and outside experts", launched their report. Among its 82 recommendations were several related to liability:
  • 3.3.2: Clarify Liability for Federally Directed Mitigation, Response and Recovery Efforts
  • 4.2: Congress should pass a law establishing that final goods assemblers of software, hardware, and firmware are liable for damages from incidents that exploit known and unpatched vulnerabilities
  • 4.3: Congress should establish a Bureau of Cyber Statistics charged with collecting and providing statistical data on cybersecurity and the cyber ecosystem to inform policymaking and government programs
In March 2023 the Biden administration announced their National Cybersecurity Strategy. As regards liability, it stated:
Markets impose inadequate costs on — and often reward — those entities that introduce vulnerable products or services into our digital ecosystem. Too many vendors ignore best practices for secure development, ship products with insecure default configurations or known vulnerabilities, and integrate third-party software of unvetted or unknown provenance. Software makers are able to leverage their market position to fully disclaim liability by contract, further reducing their incentive to follow secure-by-design principles or perform pre-release testing. Poor software security greatly increases systemic risk across the digital ecosystem and leave[s] American citizens bearing the ultimate cost.
What did the administration propose to do about this?
The Administration will work with Congress and the private sector to develop legislation establishing liability for software products and services. Any such legislation should prevent manufacturers and software publishers with market power from fully disclaiming liability by contract, and establish higher standards of care for software in specific high-risk scenarios. To begin to shape standards of care for secure software development, the Administration will drive the development of an adaptable safe harbor framework to shield from liability companies that securely develop and maintain their software products and services. This safe harbor will draw from current best practices for secure software development, such as the NIST Secure Software Development Framework. It also must evolve over time, incorporating new tools for secure software development, software transparency, and vulnerability discovery.
In October 2024 Eric Geller looked at what happened to the Cyberspace Solarium Commission's recommendations in The struggle for software liability: Inside a ‘very, very, very hard problem’:
Six years after Congress tasked a group of cybersecurity experts with reimagining America’s approach to digital security, virtually all of that group’s proposals have been implemented. But there’s one glaring exception that has especially bedeviled policymakers and advocates: a proposal to make software companies legally liable for major failures caused by flawed code.
It wasn't like the Commission invented the liability problem:
Since the 1980s, legal scholars have discussed how liability should apply to flawed software. The fact that there still isn’t a consensus about the right approach underscores how complicated the issue is.

One of the biggest hurdles is establishing a “standard of care,” a minimum security threshold that companies could meet to avoid lawsuits. There’s disagreement about “how to define a reasonably secure software product,” Dempsey said, and technology evolves so quickly that it might not be wise to codify one specific standard.

Various solutions have been proposed, including letting juries decide if software is safe enough — like they do with other products — and letting companies qualify for “safe harbor” from lawsuits through existing programs like a government attestation process.

The Solarium Commission proposed safe harbor for companies that patch known vulnerabilities. But that would only address part of the problem.
Even a very weak "duty of care" would be a big improvement. It would, for example, outlaw hard-wired passwords, require two-factor authentication via FIDO or passkeys rather than SMS, require mailers to display the actual target of links, and so on.
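To illustrate how simple the last of these is, here is a minimal sketch (the class and threshold logic are my own, not from any cited mailer) of how a mail client could flag links whose visible text names a different host than the link's real destination:

```python
from html.parser import HTMLParser
from urllib.parse import urlparse

class LinkAuditor(HTMLParser):
    """Flag anchors whose visible text looks like a URL for a
    different host than the actual href target -- the kind of
    mismatch a duty-of-care mailer would have to surface."""
    def __init__(self):
        super().__init__()
        self._href = None
        self._text = []
        self.mismatches = []

    def handle_starttag(self, tag, attrs):
        if tag == "a":
            self._href = dict(attrs).get("href", "")
            self._text = []

    def handle_data(self, data):
        if self._href is not None:
            self._text.append(data)

    def handle_endtag(self, tag):
        if tag == "a" and self._href is not None:
            text = "".join(self._text).strip()
            shown = urlparse(text).netloc
            actual = urlparse(self._href).netloc
            # Only flag when the visible text itself names a host
            # and that host differs from the real destination.
            if shown and shown != actual:
                self.mismatches.append((text, self._href))
            self._href = None

auditor = LinkAuditor()
auditor.feed('<a href="https://evil.example/login">https://bank.example</a>')
print(auditor.mismatches)
# [('https://bank.example', 'https://evil.example/login')]
```

A real mailer would of course need to handle redirectors, punycode look-alikes, and nested markup, but the core check is this trivial.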

The industry pushback has been ferocious:
One of the industry’s chief arguments is that liability would distract companies from improving security and overburden them with compliance costs. “The more companies are spending their time on thinking about liability, the less they might be spending their time on higher-value activities,” said Henry Young, senior director of policy at the software trade group BSA.
The problem is that the "higher-value activities" typically result in adding vulnerabilities to their products. I doubt that management at victim companies like Equifax or SolarWinds' customers would think that adding flashy new features was "higher-value" than fixing vulnerabilities.

You can't say the industry's flacks aren't creative:
Liability opponents say insecure software isn’t the biggest cybersecurity problem, pointing to widespread and devastating phishing attacks.
Riiiiight! Phishing is a force of nature, not a tactic enabled by the industry's insecure products.

Some of the industry's arguments are laughable:
They argue that even if policymakers want to focus on software security, there are better ways to prod vendors forward, such as encouraging corporate-board oversight.
Board members are insulated from liability by D&O insurance, so "encouraging" them will have precisely zero effect.

But this is my favorite:
And they warn that focusing on liability will distract the government from pursuing better policies with its limited resources.
Exactly what are the "better policies" they desire? To be left alone to ship more buggy products.

The industry whines that they would be treated differently from others, conveniently ignoring that the others can't disclaim liability:
Critics also contend that it’s unfair to punish companies for digital flaws that are deliberately exploited by malicious actors, a scenario that’s rare in most industries with liability, such as food and automobiles.
In 2023 nearly 41,000 people were killed by automobiles in the US. Many, probably most, of those deaths were caused by "malicious actors" exploiting flaws such as cars that know what the speed limit is but don't enforce it, or could but don't detect that the driver is drunk or sleepy. Liability hasn't caused the auto industry to get real about safety. Instead we have Fake Self-Driving killing people. And:
Up to 48 million people get sick from a foodborne illness every year, and up to 3,000 are estimated to die from them.
Finally, the industry claims that everything is just fine:
Industry leaders say liability is unnecessary because there’s already a working alternative: the marketplace, where businesses are accountable to their customers and invest in security to avoid financial and reputational punishment. As for contracts disclaiming liability, the industry says customers can negotiate security expectations with their vendors.

“We're open to conversations about any way to improve software security,” Young said. “Our customers care about it, and we want to deliver for them.”
Have you tried to "negotiate security expectations" with Microsoft, or have a conversation with Oracle about a "way to improve software security"? How did it go? I guess it didn't go well:
“Just telling organizations that not fixing security bugs will impact their business is not enough of an incentive,” a group of tech experts warned the Cybersecurity and Infrastructure Security Agency in a report approved this month.

Experts also rejected the idea that most customers could negotiate liability into their contracts. Few companies have leverage in negotiations with software giants, and few customers know enough about software security to make any demands of their vendors.
Despite its laughable nature, the industry's pushback ensured that nothing happened until it was too late:
Senior administration officials haven’t lived up to their lofty rhetoric about shifting the burden of cybersecurity from customers to suppliers, Herr said. “There is an attitude in this White House of a willingness to defer to industry in operational questions in a lot of cases.”
In the end it may come down to judges to determine whether the disclaimers of liability are effective, and there may be a slight ray of hope. Last December Sean Lyngaas reported that Judge rules Israeli firm NSO Group liable for damages in WhatsApp hacking case:
Messaging service WhatsApp claimed a major legal victory over Israeli spyware firm NSO Group on Friday after a federal judge ruled that NSO was liable under federal and California law for a 2019 hacking spree that breached over 1,000 WhatsApp users.

It’s a rare legal win for activists who have sought to rein in companies that make powerful spyware, or software capable of surveilling calls and texts, that has reportedly been used on journalists, human rights advocates and political dissidents worldwide.
So in the US the disclaimers of liability in the end user license agreement are, and will presumably continue to be, valid unless your product is intended to commit crimes such as violations of the Computer Fraud and Abuse Act. The bar for a victim to prove liability is impossibly high.

Color me skeptical, but Jim Dempsey writes in The MAGA Case for Software Liability:
Under the current administration, the instinctive inclination of post-Reagan Republicans to rely only on market forces to hold businesses responsible for the consequences of their actions would seem to preclude the use of government policy to improve the security of software vital to business and government operations. Indeed, the Trump team has promised wholesale repudiation of regulations adopted in the past four years, so new limits on industry would seem especially unlikely.

There are, however, good reasons why the new administration should not default to repealing the cybersecurity actions of the past four years and passively accepting severe cyber vulnerabilities in critical infrastructure. In fact, as I explained in a series last year on initiatives aimed at infrastructure and data, much of the Biden administration’s cybersecurity agenda was built on projects launched by President Trump in his first term. The Trump administration would do well to remember the underlying principles that spurred it to initiate these actions the first time around.

EU

The EU seems to be taking an opposite approach. Tom Uren wrote:
Earlier this month, the EU Council issued a directive updating the EU’s product liability law to treat software in the same way as any other product. Under this law, consumers can claim compensation for damages caused by defective products without having to prove the vendor was negligent or irresponsible. In addition to personal injury or property damages, for software products, damages may be awarded for the loss or destruction of data.
In the EU companies are presumed to be liable for a defective product unless they can qualify for a "safe harbor":
Rather than define a minimum software development standard, the directive sets what we regard as the highest possible bar. Software makers can avoid liability if they prove a defect was not discoverable given the “objective state of scientific and technical knowledge” at the time the product was put on the market.
The directive is based on two main principles:
  • the manufacturer has to compensate the damage caused by a defective product of theirs
  • the victim has to prove the product’s defectiveness, the damage that was caused and establish that this defectiveness was the cause of the damage
And the products are defined thus:
  • Digital economy: The new law extends the definition of “product” to digital manufacturing files and software. Also online platforms can be held liable for a defective product sold on their platform just like any other economic operators if they act like one.
  • Circular economy: When a product is repaired and upgraded outside the original manufacturer’s control, the company or person that modified the product should be held liable.
Unlike the current and proposed US approaches, the EU's approach imposes victim-driven consequences for:
  • Failure to use current software tools and development practices to prevent defects. Note that the directive gives victims the right to obtain evidence of this kind.
  • Failure to acknowledge and respond to defect reports from customers and third parties, because they were clearly discoverable via the "objective state of scientific and technical knowledge". But note the caveat "at the time the product was put on the market".
  • Failure to issue timely fixes for defects.
Uren concludes:
Major software vendors used by the world’s most important enterprises and governments are publishing comically vulnerable code without fear of any blowback whatsoever. So yes, the status quo needs change. Whether it needs a hand grenade lobbed at it is an open question. We’ll have our answer soon.

Open Source

Neither the US nor the EU approach seems to take account of the fact that many of the "products" they propose to regulate are based upon open source code. An under-appreciated feature of the rise of IT has been the extraordinary productivity unleashed by the 1988 Berkeley Software Distribution license and Richard Stallman's 1989 GNU General Public License. Version 3 of the GNU General Public License, Section 15, states:
THERE IS NO WARRANTY FOR THE PROGRAM, TO THE EXTENT PERMITTED BY APPLICABLE LAW. EXCEPT WHEN OTHERWISE STATED IN WRITING THE COPYRIGHT HOLDERS AND/OR OTHER PARTIES PROVIDE THE PROGRAM “AS IS” WITHOUT WARRANTY OF ANY KIND, EITHER EXPRESSED OR IMPLIED, INCLUDING, BUT NOT LIMITED TO, THE IMPLIED WARRANTIES OF MERCHANTABILITY AND FITNESS FOR A PARTICULAR PURPOSE. THE ENTIRE RISK AS TO THE QUALITY AND PERFORMANCE OF THE PROGRAM IS WITH YOU. SHOULD THE PROGRAM PROVE DEFECTIVE, YOU ASSUME THE COST OF ALL NECESSARY SERVICING, REPAIR OR CORRECTION.
And Section 16 states:
IN NO EVENT UNLESS REQUIRED BY APPLICABLE LAW OR AGREED TO IN WRITING WILL ANY COPYRIGHT HOLDER, OR ANY OTHER PARTY WHO MODIFIES AND/OR CONVEYS THE PROGRAM AS PERMITTED ABOVE, BE LIABLE TO YOU FOR DAMAGES, INCLUDING ANY GENERAL, SPECIAL, INCIDENTAL OR CONSEQUENTIAL DAMAGES ARISING OUT OF THE USE OR INABILITY TO USE THE PROGRAM (INCLUDING BUT NOT LIMITED TO LOSS OF DATA OR DATA BEING RENDERED INACCURATE OR LOSSES SUSTAINED BY YOU OR THIRD PARTIES OR A FAILURE OF THE PROGRAM TO OPERATE WITH ANY OTHER PROGRAMS), EVEN IF SUCH HOLDER OR OTHER PARTY HAS BEEN ADVISED OF THE POSSIBILITY OF SUCH DAMAGES.
That these licenses disclaimed warranties and liabilities was the key to the open source revolution because it enabled individuals, rather than developers protected by their employer's lawyers, to contribute. Without the disclaimers individual developers would face unacceptable legal risk. On the other hand, open source is also a channel for malware. Shaurya Malwa reports on an example in Hackers Are Using Fake GitHub Code to Steal Your Bitcoin: Kaspersky:
The report warned users of a “GitVenom” campaign that’s been active for at least two years but is steadily on the rise, involving planting malicious code in fake projects on the popular code repository platform.

The attack starts with seemingly legitimate GitHub projects — like making Telegram bots for managing bitcoin wallets or tools for computer games.

Each comes with a polished README file, often AI-generated, to build trust. But the code itself is a Trojan horse: For Python-based projects, attackers hide nefarious script after a bizarre string of 2,000 tabs, which decrypts and executes a malicious payload.

For JavaScript, a rogue function is embedded in the main file, triggering the launch attack. Once activated, the malware pulls additional tools from a separate hacker-controlled GitHub repository.
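A crude scanner for the Python-flavored trick described above is easy to sketch. This is my own illustration, not Kaspersky's tooling; the 100-tab threshold is an assumption (the report describes payloads hidden after roughly 2,000 tabs):

```python
import re

# Flag source lines that hide code behind an enormous run of tabs,
# pushing it far off-screen where a casual reviewer will never scroll.
TAB_RUN = re.compile(r"\t{100,}\S")

def suspicious_lines(source: str):
    """Return (line_number, preview) pairs for lines containing a
    long tab run followed by non-whitespace content."""
    hits = []
    for n, line in enumerate(source.splitlines(), start=1):
        if TAB_RUN.search(line):
            visible = line.lstrip("\t")[:40]
            hits.append((n, visible))
    return hits

benign = "x = 1\ny = 2\n"
planted = "x = 1\n" + "\t" * 2000 + "exec(decrypt(payload))\n"
print(suspicious_lines(benign))   # []
print(suspicious_lines(planted))  # [(2, 'exec(decrypt(payload))')]
```

Of course, attackers would trivially adapt to any single heuristic like this; the point is only that the specific GitVenom technique is detectable with a one-line regular expression, which says something about how little scrutiny the poisoned repositories received.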

Balancing the need to stop vendors "publishing comically vulnerable code" with the need to nurture the open source ecosystem is a difficult problem.

1 comment:

David. said...

Thomas Claburn reports on a potential problem for open source in An appeals court may kill a GNU GPL software license:

"At some point in the months ahead, the United States Court of Appeals for the Ninth Circuit will consider an effort to reverse a California federal district court's decision in Neo4j v. PureThink.

If the appellate court upholds that decision, which endorsed database maker Neo4j's right to amend the GNU Affero General Public License, version 3, governing the use of its software with new binding terms, current assumptions about the enforceability of copyleft licenses will no longer apply."