[Servercert-wg] [cabf_validation] Underscores, DNSNames, and SRVNames

Ryan Sleevi sleevi at google.com
Mon Oct 22 18:47:12 MST 2018


On Mon, Oct 22, 2018 at 7:36 PM Wayne Thayer <wthayer at mozilla.com> wrote:

> On Tue, Oct 23, 2018 at 6:33 AM Ryan Sleevi <sleevi at google.com> wrote:
>
>> Given the data I've shared, do you feel that it's still an accurate
>> statement to suggest there's any "pain" at all, or that there are any
>> bureaucratic change management processes involved?
>>
>>
> Given the data that you shared, please explain how you reach the
> conclusion that there is zero "pain" for Subscribers?
>

Sure. When a mosquito bites me, I don't treat it like kidney stones. And I
make a distinction between the tooth sensitivity of eating ice cream too
fast and having my appendix removed. It's not that there's "nothing" felt;
it's that there's no pain. To suggest that the inconvenience, at an
ecosystem level, represents 'pain' is to undermine much of the real pain
that exists in the ecosystem.

While I know CAs have been very attuned to "no customer left behind", every
ecosystem maintainer in the Forum has generally acknowledged that "no"
impact is not a reasonable goal, and that "minimal" impact is acceptable. My
point in sharing those numbers was to show that calling this painful is a
bit like a professional soccer player taking a dive [1][2]. It's
entertaining, but it's not exactly furthering the industry.

I think the question here, at the core of this discussion, is how much
'pain' is acceptable. Is it 3955 certificates? Is it the 3238 distinct
hostnames represented in those certificates? Is it the 216 certificates,
representing 166 domain names, that can't be transitioned to wildcards
tomorrow? Does factoring in revocations and distrust matter? What's the
threshold at which we say it's more important to deliberately introduce
incompatibility into the ecosystem? And who should decide? The browsers?
The CAs? The public? In those same logs, we see 252,675,924 distinct DNS
names and 1,187,938,671 certificates. Those numbers are inflated by
revocations, distrust, and the inclusion of non-publicly-trusted CAs - but
since the same applies to the previous numbers, it's at least a consistent
measure, and a hard upper bound.

My view should be clear: standards exist to provide a reliable,
interoperable system - one that allows competitors to work together and
that provides a consistent baseline for those seeking to understand and
address security concerns. The Forum should not be in the business of
circumventing multi-stakeholder models, whether that model is the Mozilla
Dev.Security.Policy community, which allows far greater representation than
the Forum, or the IETF consensus-building process. The moment we, as a
Forum, start pursuing that path, we sacrifice the legitimacy of the Forum
and the safety afforded to our collaborations.

[1]
https://www.buzzfeed.com/mjkiebus/ridiculous-soccer-dives-guaranteed-to-make-you-angry
[2]
https://www.theatlantic.com/entertainment/archive/2014/06/dissecting-american-soccers-hatred-of-the-flop-is-a-world-cup-tradition/372839/


>> I think we disagree, then. By failing to adopt an immediate sunset - which
>> I believe the data does not support - this encourages CAs to reissue
>> certificates with the prolonged period sooner for their customers, "just in
>> case". This then creates more migration pain, as customers will have 2+
>> year certificates, and fail to actually transition until their current
>> certificate expires.
>>
> I am explicitly attempting to prevent this type of behavior by requiring
> that all certificates with a lifetime > 30 days be revoked prior to 1-June.
> Are you dismissing revocation as an effective tool for blocking the use of
> these certificates (presumably they would need to be added to a CRLSet), or
> is there some other problem with this approach?
>

I believe this approach is fundamentally flawed, and that we have ample
evidence to know that it does not lead to the desired result. The desired
result of any later sunset is to say "Organizations need time to adapt, and
by delaying impact, we give them time to move." In every single attempt the
Forum, or its Browser members, has made, it has not worked out like that.
It's foolish to think this will be different.

What happens - and we see this consistently, whether we're talking
RSA-1024, SHA-1, internal server names, or multi-staged CA deprecations -
is that until the impact is actually experienced, no preparations are made.
Further, we see disingenuous behaviour from both CAs and other browsers:
suggesting that since the sunset hasn't "yet" arrived, any premature action
(such as adding to OneCRL or changing UI) is somehow going against the
"consensus" of the industry - as if the Forum sets acceptable standards,
rather than minimal ones.

The benefit of an immediate reduction in validity, followed shortly
thereafter by revocation-and-replacement, is that it actually sets up a
sustainable way of alerting certificate holders to the need to replace,
while allowing them time to transition. There's no ambiguity in the
ecosystem if you have to replace your existing certificate (hypothetically
valid until 2020, but in practice far shorter) with something limited to 30
days, and then replace it periodically. There's no question: there's a
frequent reminder that change is coming and is necessary, and organizations
can make their own cost/benefit analysis. The approach proposed - whether
for this or for any other deprecation - is to create an expense that goes
from 0 (nothing wrong) to total (time to replace). No amount of preparation
- years, even - has been shown to work in the ecosystem, while expiration
dates have rather consistently proven an effective means of moving the
ecosystem forward.

We'd be fooling ourselves to approach deprecations in the future as
anything other than "immediate" reduction in lifetime to address the issue
going forward, and "soon" thereafter, bringing the existing body of
certificates (and their holders) into that same cost/benefit analysis
discussion through revocation. If there's something new to try here, I'm
all ears, but let's not think our past failures will somehow be fixed by
doing the exact same thing.


>> Can you explain how we're discussing anything but that? We've expended
>> significant energy for what accounts for a dozen or so organizations. We
>> have data to know the scope of the problem and the impact, and its
>> practical reality.
>>
> Given that the impact is so small, perhaps a better solution is to stop
> debating this and for you to file a bug to block these certificates in
> Chromium. Seriously - that would satisfy my primary goal here.
>

Because that approach fundamentally makes the CA/Browser Forum useless,
both philosophically and practically. If the goal is to serve as a Forum
that avoids conflicting program requirements by establishing common
Baselines, then it provides no value if the Forum is going to retroactively
grant indulgences depending on how widespread the issue is. It's actively
hostile to our European CAs bound by Regulation (EU) No 910/2014 - they
don't have the luxury of getting to pick and choose which parts of RFC 5280
to ignore like our WebTrust-audited friends do. It suggests that the way to
deal with interoperability is to regulate externally - that the Forum
itself is not capable of defining or implementing interoperable standards.
As an approach to risk, it shows an extreme aversion - at best, setting a
goal of "zero" impact, a level that is impossible to achieve and harmful to
the ecosystem. And if that's not the level we're setting, then we're no
longer acting on principle, but arbitrarily.

To all the CAs not participating in the Forum, who don't get to vote to
retroactively bless themselves, and to all the CAs who do participate in
the Forum and have stopped the practice, it suggests that the Baseline
Requirements are a way to keep CAs out (since to vote, they have to comply,
while existing voting members are allowed to remain non-compliant). It
suggests the Baseline Requirements aren't really Requirements; they're the
"Baseline hints for things we'd really like you to follow, but if you have
customers who want certificates that don't comply, I guess that's OK too".

For the browsers who would vote in favor of this, it looks like a way to
add more barriers to new competition. The point of standards is to make it
easier to build interoperable implementations. Want to see how that bit
Mozilla? https://bugzilla.mozilla.org/show_bug.cgi?id=1136616 or
https://bugzilla.mozilla.org/show_bug.cgi?id=479520#c45 or
https://bugzilla.mozilla.org/show_bug.cgi?id=354493#c198 . The ecosystem
"works" when it's not a giant ball of hacks, which is why we browsers have
such productive relationships trying to actually align on standards
behaviour (and define standards where appropriate) - so that developers can
have a reliable and consistent experience, whether writing a browser or
writing for the Web.


>> Perhaps you can help me understand how this is different than, for
>> example, Symantec's issuance of multiple BR violating certificates -
>> https://bugzilla.mozilla.org/show_bug.cgi?id=966350 - and a process of
>> retroactive indulgences?
>>
>>
> It sounds like you are focused on the behavior of the CAs, and I'm focused
> on the behavior of the Subscribers. Thinking about the SHA-1 sunset, I
> don't want Subscribers to have any hope of getting an exception.
>

But isn't that exactly what's being proposed? Automatic exceptions before
May, and after May, exceptions so long as the validity is less than 45 days
(as proposed). It doesn't actually define any sunset - and it's not about
changing requirements, because these are requirements that have always
existed.


>> If the goal is to find compromise, then it seems more reasonable to focus
>> on "Which organizations cannot switch to a wildcard certificate tomorrow"
>>
>
> MY goal is to find a compromise - Your goal is not clear to me, and you
> didn't answer my request for specific guidance. Would a ballot that
> "immediately" forbids issuance of these certificates with validity periods
> of greater than 30 days be the basis for an acceptable compromise? If not,
> is that because you don't believe that any acceptable compromise exists?
>

I had thought we walked out of Shanghai with a reasonable path forward
that meaningfully avoids throwing the ecosystem - and the Forum - under the
bus:
1) Immediate reduction in lifetime
2) Complete disclosure of any existing certificates not yet disclosed, to
measure compliance with #3 and to assist #4.
3) "Soon" a complete revocation of existing certificates. The suggestion
was "3 months".
4) A "whitelist" for any domains that cannot be met by existing, permitted,
compatible practice (underscores other than in the left-most label). As the
data shows, the practical impact of such a whitelist is, at an upper-bound,
166 certificates based on the currently disclosed practice, but as a
practical matter, is actually much smaller. Just from looking at the list I
provided, it appears that up to a third of those may have been automated
systems and/or mistakes.
5) Issuance is *only* permitted for #4 above, and consistent with #1. These
are the only certificates that can't be simply replaced with wildcard
certificates.
6) Complete sunset <= 1y.
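
To make the criterion in #4 concrete, here is a minimal sketch of the
check - purely illustrative, not normative, and the hostnames in it are
hypothetical rather than drawn from the disclosed data. A name is
compatible with existing, permitted practice only if any underscores it
contains sit in the left-most label, because only then does the covering
wildcard avoid an underscore of its own:

    # Illustrative sketch: can this dNSName be replaced tomorrow by an
    # already-permitted wildcard certificate, i.e. do underscores appear
    # only in the left-most label?
    def replaceable_by_wildcard(hostname: str) -> bool:
        labels = hostname.rstrip(".").split(".")
        # If any label other than the first contains '_', the covering
        # wildcard ('*.' + parent domain) would itself contain one.
        return not any("_" in label for label in labels[1:])

    # Hypothetical examples:
    assert replaceable_by_wildcard("_dmarc.example.com")
    assert not replaceable_by_wildcard("_sip._tcp.example.com")
    assert replaceable_by_wildcard("www.example.com")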

The choice of #3 is when the ecosystem will finally "start" to move.
Delaying #1 or #4 just makes #3 worse for the ecosystem, and doesn't really
move the needle on reducing impact or risk.

Breaking this down by DV/OV/EV and distinct names, we see the following:
  count  subject_organization
   1251  vIPtela Inc
    282  null
    151  CVS / Pharmacy
    122  CVS Pharmacy Inc
    104  Northern Trust Company
     80  verizon wireless
     69  Citigroup Inc.
     50  The Northern Trust Company
     46  INTUIT INC.
     42  Vodafone Group Services Limited

Breaking it down by names that have an underscore somewhere other than the
leftmost label, we see something more mundane:
  count  subject_organization
     59  null
      7  TOYOTA Connected Corporation
      6  Aviva PLC
      5  Daum Assecuranz KG Versicherungsmakler
      4  Citigroup Inc.
      4  Microsoft Corporation
      4  verizon wireless
      3  Omgeo
      3  PTFS, Inc.
      3  HP Inc
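
For what it's worth, tables like the two above need nothing more than a
group-by over the log data. A rough sketch follows, assuming the
(dns_name, subject_organization) pairs have already been extracted from
the CT log entries - the field names and helper are my own, not the query
actually used to produce these numbers:

    from collections import Counter

    def count_by_organization(records, only_non_leftmost_underscore=False):
        # records: iterable of (dns_name, subject_organization) pairs.
        # Optionally restrict to names with an underscore outside the
        # left-most label - the set that can't simply move to a wildcard.
        counts = Counter()
        for dns_name, org in records:
            labels = dns_name.rstrip(".").split(".")
            deep_underscore = any("_" in l for l in labels[1:])
            if only_non_leftmost_underscore and not deep_underscore:
                continue
            counts[org or "null"] += 1
        return counts.most_common(10)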

If we assume each of those DV certificates (the "null" rows) represents a
totally different account (and I can keep digging into the data to show
that such an assumption is not correct, because of shared ADNs), that's
still a "solvable" level of human contact, and the impact on the actual
names - and the effort each Subscriber needs to make - is bounded.

>> That subset shows it's a far smaller set of names, many of which are
>> no longer valid domains (demonstrably not in use, nor possible to use) and
>> makes it clear that we're spending all this time for a few dozen
>> organizations. Which is to say it's a whitelist, it's an exception process,
>> and it's one that directly benefits a few CAs and a few organizations, yet
>> with existential risk to the legitimacy of the Forum, the Baseline
>> Requirements, and the neutrality of the Forum.
>>
>
> To the extent that a whitelist benefits CAs who have continued to - or
> recently discontinued - issuance of these certificates, I agree that a
> whitelist is biased and thus shouldn't be pursued.
>

I think it's rather the inverse. Anything other than a whitelist suggests
that if you do enough of something, you can keep doing it, while anyone who
stops is penalized for stopping early. The whitelist prevents further
profiteering by those negligent and non-compliant CAs.