Should the tech industry be the arbiters of morality?

The DarkMatter debate raging in the Mozilla root forum has major implications

As much as a free and open internet sounds appealing, most sane people will admit we do need some degree of regulation. Without it — with no restrictions at all — the internet begins to look a lot more like the dark web with sexual exploitation and illicit goods out in the open. With too many restrictions things can easily become repressive.

You can see how both sides play out in various locales around the world, from countries with no regulations that are anarchic havens for cybercriminals to strict dictatorial regimes that control every aspect of their citizens’ online lives.

But the global nature of the worldwide web makes the mish-mash of national laws attempting to regulate the internet in their respective countries ineffectual, which creates a need for some supranational entity to make some of these regulatory decisions.

Who that entity should be is not a question easily answered — if it can be at all. But while there are no right answers, there are definitely some wrong ones.

One of those would be turning it over to the tech industry.

Admittedly, on its face, doing so would make some sense. Some companies, by nature of their ubiquity and positioning, certainly have the power to effect change on the internet. But allowing that power to go unchecked is a very slippery slope.

Deep down, one’s sense of morality is one of the things that informs regulation.

And the tech industry should not be the arbiters of morality.

Let’s use an example from our own industry (SSL/TLS & PKI), where Certificate Authorities, Browser vendors and Root Programs collaborate (via the CA/B forum) on a set of standards that the entire industry must abide by.

Right now, there’s a fascinating debate unfolding in the Mozilla root forum, where decisions are made on Certificate Authority inclusions to the Mozilla foundation’s root program (we’ll explain in a second). It serves as an excellent microcosm for the larger debate.

Mozilla’s root program is unique, owing to Mozilla’s position in the industry. Unlike Google, Apple and Microsoft, which run the other major root programs of record, Mozilla is a non-profit organization that isn’t viewing a lot of these decisions through a financial lens. Mozilla really does try to act in a way that’s aligned with its ethos; it legitimately wants to make the internet a better place.

Of the major root programs, Mozilla’s intentions are the noblest.

That’s why its root forum is currently ground zero for a debate over the potential inclusion of DarkMatter’s roots.

There’s a lot to unpack here, and I’m going to try to be as comprehensive as I can be, but before we go any further I want to make it absolutely clear that this article (and by extension, Hashed Out or The SSL Store™) is not advocating for DarkMatter’s inclusion, nor is it arguing that its application should be rejected — it’s just raising some serious questions that need to be considered alongside this decision and have ramifications for the question we posed earlier. After all, nothing in this industry happens in a vacuum.

Let’s hash it out.

The Mozilla Root Program

Let’s start with what a root program is. A lot of you may already know this, but bear with me anyway. Every computer system, whether it’s a desktop or a mobile device, uses a root store. A root store is a collection of Certificate Authority root certificates that live on the device and anchor Public Key Infrastructure (PKI). Digital certificates and PKI are what facilitate secure connections on the internet (HTTPS), email signing, document signing, code signing, etc.

When a device is presented with a digital certificate or a signed file, it uses the digital signature that’s been affixed to authenticate it. To do this, it looks at the signature and follows it back to the certificate whose private key created it. It continues following the signatures until it reaches one of the roots in its store. As long as a digital certificate can be chained back to one of those roots, the device will trust it. If it can’t, the device issues an error.
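
To make that concrete, here’s a minimal sketch of chain validation in Python using the standard ssl module, which delegates the actual signature-following to the underlying TLS library and the local root store. The hostname is just an example:

```python
import socket
import ssl

hostname = "example.com"  # any HTTPS site will do

# create_default_context() loads the trusted roots from the local root store
context = ssl.create_default_context()

with socket.create_connection((hostname, 443)) as sock:
    # During the handshake, the library follows each signature up the chain;
    # wrap_socket() only succeeds if the chain terminates at a trusted root.
    with context.wrap_socket(sock, server_hostname=hostname) as tls:
        peer = tls.getpeercert()
        print("Chain verified; issued by:", dict(p[0] for p in peer["issuer"]))
```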

This is why self-signed certificates aren’t trusted. It’s also what makes root certificates so valuable. A Certificate Authority (DigiCert, Sectigo, Entrust, etc.) is responsible for issuing digital certificates, and is required to abide by a strict set of standards to prevent mis-issuance of those certificates.
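
You can see that rejection in action against a deliberately self-signed endpoint. badssl.com hosts test endpoints for exactly this purpose (a sketch, assuming that service is still up):

```python
import socket
import ssl

hostname = "self-signed.badssl.com"  # test endpoint with a self-signed cert
context = ssl.create_default_context()

try:
    with socket.create_connection((hostname, 443)) as sock:
        with context.wrap_socket(sock, server_hostname=hostname):
            pass
except ssl.SSLCertVerificationError as err:
    # The chain terminates at a certificate that isn't in the root store,
    # so validation fails before any application data is exchanged.
    print("Rejected:", err.verify_message)  # e.g. "self signed certificate"
```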

In order to issue digital certificates you need to have a trusted root. That means you need to have your root accepted by the various root programs.

Once your root is included in the root programs, ANY digital certificate you issue will be automatically trusted by any system using that root program. Obviously, that can be dangerous: improperly issued certificates, known as rogue certificates, can do serious damage. Ergo, these root programs are closely guarded.

There are four major programs of note:

  • Mozilla
  • Apple
  • Microsoft
  • Google

Microsoft systems use the Microsoft root store; macOS and iOS systems use Apple’s. Android devices use Google’s, and the Mozilla Firefox web browser and Thunderbird mail client, as well as many open source OS distributions, use Mozilla’s root store.
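
As a concrete illustration, the Python certifi package ships a snapshot of Mozilla’s root store as a PEM bundle, so you can inspect what one of these stores actually contains. A sketch, assuming certifi and cryptography are installed (load_pem_x509_certificates needs cryptography 39+):

```python
import certifi
from cryptography import x509

# certifi.where() points at a PEM file containing Mozilla's trusted roots
with open(certifi.where(), "rb") as f:
    roots = x509.load_pem_x509_certificates(f.read())

print(f"{len(roots)} root certificates in this snapshot of Mozilla's store")
for root in roots[:3]:  # peek at a few familiar names
    print(" -", root.subject.rfc4514_string())
```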

While Mozilla’s root program isn’t openly democratic, in the interest of transparency a lot of the conversations regarding its business are housed on a Google forum that can be accessed by the public. It’s on that forum that other industry experts and stakeholders can weigh in.

And it’s because of said transparency — and because Mozilla as a 20-year-old non-profit has a well-deserved reputation as a conscientious and benevolent actor on the internet — that this decision is likely to serve as a bellwether.

Google, Apple and Microsoft are far more opaque about their programs, though for the sake of interoperability the major root programs typically fall in line (see: Symantec distrust).

Suffice it to say the debate currently unfolding will carry weight in the decisions of the other root programs.

Enter DarkMatter Group

The issue at hand is the CA root application of the DarkMatter Group. Previously, DarkMatter had acted as a sub-CA, using an intermediate certificate issued by QuoVadis to sign and issue certificates. However, for the past two years DarkMatter has been attempting to become a proper CA with its own roots included in the various root programs.

Here’s where it all comes to a head.

The DarkMatter Group is a cybersecurity firm located in the United Arab Emirates. Lately it’s been in the news quite a bit as a result of a Reuters article outlining how the organization manages “Project Raven,” a team staffed partly by former NSA operatives that has helped the UAE’s rulers (it’s a monarchy) spy on dissidents, rival leaders and journalists.

The story of Project Raven reveals how former U.S. government hackers have employed state-of-the-art cyber-espionage tools on behalf of a foreign intelligence service that spies on human rights activists, journalists and political rivals.

Interviews with nine former Raven operatives, along with a review of thousands of pages of project documents and emails, show that surveillance techniques taught by the NSA were central to the UAE’s efforts to monitor opponents.

I recommend reading the entire article; it spins an interesting tale.

One that DarkMatter wholly rejects as fiction.

Per a February 25th release from DarkMatter CEO Karim Sabbagh:

I am writing to provide DarkMatter Group’s position on a recent media article about the UAE that referenced security and intelligence matters. It also mentioned DarkMatter with misleading information about who we are and what we do. Similar claims have been raised in the Mozilla Policy Group where Forum members have been urged to take action against DarkMatter Group, seeing to derail a two-year process to have our Roots included in their Trust Store. We believe this is grossly unfair and has no basis in fact.

I want to assure you that DarkMatter’s work is solely focused on defensive cyber security, secure communications and digital transformation. We have never, nor will we ever, operate or manage non-defensive cyber activities against any nationality.

Far from being a black-and-white debate, this conversation skews in a number of troubling directions and could potentially open several cans of worms. It starts to get into the structure and ownership of various organizations, the subjective nature of root programs in general, future precedents, ethics and the wisdom of Ex Post Facto action. It’s also a great referendum on whether or not the tech industry should be the arbiters of morality.

We’re going to go through each one, and as we said from the top: we’re just posing these questions. We don’t purport to have the answers.

But someone had better have them, because these are critical decisions.

Does DarkMatter qualify for inclusion in the Mozilla root program?

Welcome to the grey area. The answer is yes, technically. But also, no. That’s because Mozilla, perhaps wisely, has given itself some wiggle room with regard to making subjective decisions on including a CA’s roots or not.

With the exception of some ongoing debate about serial numbers, DarkMatter has all of the technical qualifications required for a CA’s inclusion in the Mozilla root program.
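
For context, the serial number debate centers on the Baseline Requirements’ rule that certificate serials must contain at least 64 bits of output from a CSPRNG. Bit length alone doesn’t prove entropy, but checking it is the quick first pass observers run; here’s a sketch using the Python cryptography library, with cert.pem as a placeholder path:

```python
from cryptography import x509

# cert.pem is a placeholder; point it at any certificate you want to inspect
with open("cert.pem", "rb") as f:
    cert = x509.load_pem_x509_certificate(f.read())

serial = cert.serial_number
# An N-bit serial can carry at most N bits of entropy; a run of certificates
# whose serials are all exactly 64 bits long suggests the issuing software is
# drawing exactly 64 random bits rather than comfortably more.
print(f"serial: {serial:x} ({serial.bit_length()} bits)")
```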

Per DarkMatter General Counsel Benjamin Gabriel, who raised concerns over bias in the process:

As you are fully aware, DarkMatter has spent considerable effort over the past three (3) years to establish its commercial CA and Trust related business. A key milestone has been the successful completion of two (2) Web Trust public audits verifying that DarkMatter’s CA business is operating in accordance with the standards stipulated within Mozilla Root Store Policy and the latest version of the CA/Browser Forum (“CABForum”) Requirements for the Issuance and Management of Publicly-Trusted Certificates. We have publicly disclosed our Certificate Policy and Certification Practice Statements showing how we comply with the above noted requirements.

And bear in mind, it’s already been operating as a sub-CA under QuoVadis. So, if you were to take the names off the application, DarkMatter would likely have no problem being included.

Unfortunately for DarkMatter, though, these applications are far from anonymous. They require extensive vetting and audits. Mozilla’s policies also allow it to reject an application without even providing a reason for the decision (though taking that route would considerably undermine the organization’s attempts to be transparent). Here’s the relevant passage from Mozilla’s root program policies:

We reserve the right to not include certificates from a particular CA in our root program. This includes (but is not limited to) cases where we believe that a CA has caused undue risks to users’ security, e.g. by knowingly issuing certificates without the knowledge of the entities whose information is referenced in those certificates (‘MITM certificates’). Mozilla is under no obligation to explain the reasoning behind any inclusion decision.

Obviously, the knock against DarkMatter is the news reports — which it denies — claiming it’s helped the UAE to spy on its own citizens (and potentially some international citizens, too). Despite those reports though, Mozilla has not received any hard evidence of mis-issuance or untoward behavior in DarkMatter’s CA capacity.

Per Mozilla’s Wayne Thayer:

We are not aware of direct evidence of misused certificates in this case. However, the evidence does strongly suggest that misuse is likely to occur, if it has not already.

Complicating this issue even more is the fact that, like most organizations, DarkMatter Group has different business units that carry out different functions. As DarkMatter’s Senior VP, Scott Rea, explains:

DarkMatter has several business units that focus on a broad range of cyber security activities. The Trust Services BU is responsible for the DarkMatter CA and primarily focused on enabling secure communications and digital transformation. We utilize the services of other DM BU’s who are primarily focused on defensive cyber security activities e.g. Cyber Network Defense and Managed Security Services to protect and ensure the integrity of the CA operations.

DarkMatter is required to disclose a certain amount of information about its control and ownership, which it has. That hasn’t satisfied the more stringent voices in the forum, though, which has led to proposals that would require CAs to share far more proprietary information than they are likely comfortable with.

And that’s really the largest problem inherent in all of this: the subjectivity is a double-edged sword. While it provides Mozilla with a degree of flexibility in this decision, it also opens the process up to bias.

The definition of a double standard?

Regardless of your position on DarkMatter Group, it is, objectively, being held to a much different standard than other organizations. Not only is it being held to account for the previous sins of now-distrusted CAs like CNNIC, Procert, WoSign and StartCom, it’s also being held to an untenable standard: what COULD happen in the future.

And that is another very slippery slope — a standard that I’m not sure other organizations would want to be held to.

This isn’t the Tom Cruise movie Minority Report; there’s no legal basis for penalizing a person or organization for crimes they MIGHT commit in the future. To my knowledge, there’s not a legal system on Earth that operates that way.

Granted, this is a decision being made by a private entity, so it’s not beholden to the same restrictions and parameters as a government. But that’s also another argument for why the tech industry is ill-suited to make these kinds of decisions. And Mozilla is an outlier: it’s guided by a core philosophy that is far better-intentioned than the financial motivations that drive other companies in the industry.

And speaking of legality, it’s also worth noting that despite the accusations leveled against DarkMatter Group, it’s never been charged with breaking the law. A big part of that is because it’s (allegedly) been conducting business under the banner of a nation state, which affords it a slightly different set of rules. And that is important, because while it would be easy to dismiss an organization that has run afoul of various national laws, it’s a little bit different when the organization has operated within the parameters of the law in its jurisdiction.

And by most accounts DarkMatter has.

The issue isn’t so much DarkMatter’s conduct as it is that we (and by we, I mean the consensus of the industry as a whole) are projecting our own Western ethics and morality onto a company and a country halfway around the world. There is no universal standard of ethics; it differs from place to place. Whether or not we agree with the rule of law there, DarkMatter Group has a completely legal, law-abiding public-private partnership with the UAE’s government.

Now let me just pose a rhetorical question: if DarkMatter Group is structured with distinct business units handling distinct roles, meaning that the CA operation likely doesn’t have much (if any) overlap with the Project Raven business units working in conjunction with the UAE government, how is that any different from the work Google does with the US Department of Defense (drone targeting under Project Maven) or the Chinese government (a censored search engine)?

Seriously, what’s the difference? One letter? Google is a CA. Google has trusted roots. Google also works on clandestine projects with various governments and government agencies. Many companies have these kinds of government partnerships; many American tech companies work closely with the NSA, which infamously keeps tabs on American citizens, oftentimes never disclosing their collaboration publicly. It might seem like apples and oranges, but take the drone targeting systems that Google’s own employees protested against working on. They’re used to conduct what some countries view as illegal cross-border operations that target foreign nationals. The program is even controversial domestically. And that’s just the tip of the iceberg. The US government does a lot of the same things we’re criticizing the UAE for.

Just yesterday news broke that the US had been surveilling journalists and immigration activists near the border with Mexico.

The point isn’t to vilify Google or the US, it’s to point out how blatantly subjective this all is.

And it’s subjectivity borne out of our Western worldview. We’re projecting our own morality and ethics, which are not universally shared. So, what happens in 20 years if the balance of economic power has swung dramatically and the companies of import possess a different moral compass?

How do the precedents being set today play out tomorrow?

Ex Post Facto

Back to the arguments against DarkMatter: getting into what a law-abiding company MIGHT do, or COULD do, is a really murky place to be. That’s definitely not a rule on the books, even if it does roll up under the discretionary clause. And where do you draw the line? You’re potentially about to establish a very important precedent, and an extremely impactful one at that.

Because now we’re in the territory of Ex Post Facto legislation, where you’re creating rules after the fact and applying them retroactively. There is such a strong consensus on this principle in the legal community that something as seemingly just and obvious as the Nuremberg Trials, which prosecuted Nazi war criminals after World War II, has been called illegal by some legal scholars because there weren’t actually laws on the books for many of the crimes being charged.

Despite the absence of any universal ethical consensus, one principle pretty much every legal system in every developed country agrees on is that Ex Post Facto rulings are unjust.

And that actually serves as an interesting parallel, because morally, ethically, from a common sense standpoint, there’s broad agreement that punishing Nazis was absolutely the correct course of action. Just as, from a common sense standpoint, it’s probably not a good idea to give an organization like DarkMatter that much power. But much as the American Civil Liberties Union has defended the KKK’s First Amendment right to organize, there are also principles and precedents that undergird the developed world’s systems of laws and justice.

The precedents set by this decision will continue to have a bearing on other, more pressing discussions if they’re not established carefully now.

Objectively, it’s easy to look at an organization like DarkMatter Group and say, for the sake of security on the internet we cannot accept this root application. It’s just too risky. And it’s OK to feel that way. But that opinion ignores the larger implications. This is not really about DarkMatter at all, so much as it is about the set of standards we’re going to apply categorically moving forward. As well as who should be setting those standards.

If we make a point of rejecting DarkMatter because of what it MIGHT do, on suspicion, how do we apply that standard moving forward?

And what if, by applying that standard to other established CAs, it forces us to re-evaluate their participation, too? Or are we simply condoning applying different standards to different organizations based on their location and other partnerships? The ones we deem, according to our own worldview, to be bad even if they’re perfectly legal where they’re occurring. Again, that’s a slippery slope. One that can also lead to accusations of regional bias or prejudice.

And while it’s fine to hold those opinions personally, this industry probably should not be the arbiter of ethics and morality.

Because at the end of the day that would constitute arbitrating morality. Despite the fact DarkMatter has followed the local rule of law, we don’t like those laws and the alleged activities that have been carried out under those laws — so there’s a problem.

That’s bias by definition. Something DarkMatter’s General Counsel, Benjamin Gabriel, called out on Tuesday morning:

While we welcome the public discussion as a vital component in the maintenance of trust and transparency in Mozilla’s Root Store, we wish to bring to your attention, and to other esteemed CABForum members, DarkMatter’s reasonable apprehension of bias and conflict of interest in how the Mozilla organization has framed and conducted the discussion at hand. Notwithstanding the stated goal of transparency in the public discussion, recent public comments by Mozilla employees (including your opening statement in the discussion), indicate a hidden organizational animus that is fatal to the idea of “due process” and “fundamental fairness” being accorded to any CA applicant to the Mozilla Root Store.

Nothing happens in a vacuum

It’s fair to have questions about DarkMatter and whether or not its inclusion in the root programs is wise. This is an important debate. But this is not a one-off decision, so care needs to be taken. There are precedents being set, there will be far-reaching ramifications regardless of what is decided, and, most importantly, once you set a new standard you need to apply it consistently across the board. Not selectively, when it suits you.

But none of that answers the question of whether the tech industry is best suited to handle these kinds of decisions in the first place. Because, as mentioned earlier, Mozilla is an outlier. Most of the companies of record are not non-profits operating on a set of principles. They’re for-profit businesses. And when it comes to profitability and running a company, what’s good for the goose isn’t always good for the gander. A company isn’t worried about doing good in the world; that comes secondary to growth and profit.

Even disputing that would be naïve. Time and time again, businesses the world over have proven they will do what’s best for themselves and their shareholders, including working on controversial government contracts, often at the expense of the people around them. That’s a conflict of interest where ethics and moral standards are concerned.

Just look at the fervor over Google and Facebook and their alleged censorship of various groups in the US and Europe. A major debate rages over whether or not the companies have the right to hide or delete someone’s speech on their own platforms. Asking them to make decisions that have far more profound effects than whether or not Alex Jones should be de-platformed is inviting disaster.

The unfortunate thing is that there isn’t a good answer to who should be making these kinds of decisions. And if there is one, it’s likely to be unsatisfying to a good portion of people anyway.

But letting the tech industry make these kinds of decisions, or rather trusting it to do what is best for the internet — or what is just — is unwise at best and downright irresponsible at worst.

Because 20, 30, 40 years from now, when the economic landscape has changed and the power dynamics have shifted, the precedents we set today are going to either help us or hamstring us.

Here’s hoping it’s the former.

This post originally appeared on Medium.

Author

Patrick Nohe

Patrick started his career as a beat reporter and columnist for the Miami Herald before moving into the cybersecurity industry a few years ago. Patrick covers encryption, hashing, browser UI/UX and general cyber security in a way that’s relatable for everyone.