[cabfpub] FW: Android app signing problems -- it's even worse than I imagined

Ben Wilson ben at digicert.com
Fri Sep 28 13:33:48 MST 2012


Forwarding this from the ABA InfoSec listserv.

 

From: Information Security Committee Discussion
[mailto:ST-ISC at MAIL.AMERICANBAR.ORG] On Behalf Of Robert R. Jueneman
Sent: Friday, September 28, 2012 12:55 PM
To: ST-ISC at MAIL.AMERICANBAR.ORG
Subject: FW: Android app signing problems -- it's even worse than I imagined

 

First of all, I guess I should apologize for explicitly mentioning my
company’s products in my last e-mail.  I did so only to be specific, and not
leave people guessing.  Mea culpa.

I would like to think that I had a pretty good imagination, but the
code-signing process that is documented for Android apps at
http://developer.android.com/tools/publishing/app-signing.html#cert
boggles my mind.

Chief among my complaints:

1.  “You can use self-signed certificates to sign your applications. No
certificate authority is needed.”  That means you can sign your app as
google.com, microsoft.com, dod.gov, fbi.gov, or whatever you choose, and no
one will ever be the wiser.

2.  “The system tests a signer certificate's expiration date only at
install time. If an application's signer certificate expires after the
application is installed, the application will continue to function
normally.”  In other words, once an app is installed its signature is never
examined again; if malware later alters one of your apps, you will have no
way of knowing about it.

3.  Since self-signed certificates are allowed, it is not surprising that
support for CRLs is not required.  So if an incident such as the recent
Adobe compromise occurs, the user will have no automatic way of learning
about it.

4.  If you use the JDK keytool to generate the signing key, the default key
size is 1024 bits, which is clearly inadequate.  Only DSA and RSA are
supported, not ECDSA.  Worse yet, only MD5 seems to be supported for the
signing algorithm, although SHA-1 is supported for the digest algorithm.

5.  Keys are generated in software and stored under password protection.
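
Taken together, points 1 through 3 describe a trust model that can be sketched
in a few lines.  The following is a toy simulation, not Android code (the
`Cert`, `install`, and `run` names are hypothetical): the only check the
platform ever performs happens at install time, the subject name is taken on
faith, and nothing is consulted afterwards.

```python
from datetime import date

class Cert:
    """Toy stand-in for a self-signed signer certificate (not an Android API)."""
    def __init__(self, subject, not_after):
        self.subject = subject        # unverified: any string is accepted
        self.not_after = not_after

def install(app_cert, today):
    # The ONLY check the platform makes, and only at install time:
    # no CA chain, no CRL lookup, just the expiration date.
    return today <= app_cert.not_after

def run(app_cert, today):
    # After installation the certificate is never examined again,
    # so an expired (or revoked) signer keeps working indefinitely.
    return True

cert = Cert(subject="google.com", not_after=date(2012, 12, 31))  # any name passes
assert install(cert, date(2012, 9, 28))        # installs while valid
assert not install(cert, date(2013, 6, 1))     # rejected only if already expired
assert run(cert, date(2020, 1, 1))             # still runs years after expiry
```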

This entire process seems to have been dreamed up around 1990 or earlier.
It certainly doesn’t reflect current best practice.  And given the extent to
which hostile governments and other adversaries seem to be attacking our
cyber security, I think this is regrettable, to say the least – at least if
you are developing anything more important than Angry Birds.

Bob

______________________________________

From: Information Security Committee Discussion
[mailto:ST-ISC at MAIL.AMERICANBAR.ORG] On Behalf Of Robert R. Jueneman
Sent: Friday, September 28, 2012 11:13 AM
To: ST-ISC at MAIL.AMERICANBAR.ORG
Subject: Code signing certificate problems and issues.

 

This is a very timely reminder of how important it is to do this right.
Adobe has repeatedly been shown to be either uncaring or uninformed about
security, and this is just the latest incident.  But the fact that allegedly
external attackers were able to penetrate their server should be a wake-up
call for any of us who are generating code that needs to be signed.

Here are my recommendations for how to do this using my company’s (SPYRUS)
components, in order to minimize the risk: 
 
1.  Build an operational key-signing application from scratch (so as to be
sure it isn’t infected).  This needs to be done in a secure area, using
components loaded from DVDs, not from the Internet.

2.  That key-signing application should then be installed on a SPYRUS Secure
Pocket Drive hardware device in read-only mode, so that you will always have
a fresh, clean image whenever you need to sign something.  You can then plug
the Secure Pocket Drive into any available laptop or desktop machine; it
won’t even access the computer’s hard drive, so no viruses can be loaded or
propagated.  Nonetheless, as the Adobe example shows, that machine should
not be connected to the Internet, or even to the company’s internal network.

3.  The private signature key or keys should be generated on a SPYRUS
Digital Attaché, so that they can never be extracted.  The public keys can
be exported and sent to Symantec or another signing service to obtain
certificates.

4.  Three trusted employees should get together in a secure room to sign any
code.  One person should hold the logon PIN for the Secure Pocket Drive, one
the Digital Attaché PIN, and one the Windows application logon PIN.

5.  For backup purposes, these PINs should be recorded, sealed in three
separate envelopes, and kept in a bank vault to which only the corporate
officers have access – preferably under at least two-person control.

6.  The Secure Pocket Drive and Digital Attaché hardware devices should be
kept in a locked container when not in use.
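
The split-knowledge arrangement in steps 4 and 5 can be illustrated with a
simple XOR secret-sharing sketch.  This is a generic technique, not a SPYRUS
feature, and all names below are hypothetical: each custodian holds one
share, fewer than all three shares reveal nothing, and only combining every
share recovers the PIN.

```python
import secrets
from functools import reduce

def _xor(a: bytes, b: bytes) -> bytes:
    return bytes(x ^ y for x, y in zip(a, b))

def split_pin(pin: bytes, holders: int = 3) -> list:
    """Split pin into `holders` XOR shares; ALL shares are needed to rebuild."""
    shares = [secrets.token_bytes(len(pin)) for _ in range(holders - 1)]
    # The last share is the PIN XORed with every random share.
    shares.append(reduce(_xor, shares, pin))
    return shares

def combine(shares: list) -> bytes:
    """XOR all shares together to recover the original PIN."""
    return reduce(_xor, shares)

shares = split_pin(b"483921")
assert combine(shares) == b"483921"       # all three custodians recover the PIN
assert combine(shares[:2]) != b"483921"   # any two shares are just random bytes
```

Note that this scheme is all-or-nothing: losing any one envelope makes the
PIN unrecoverable, which is exactly why step 5 keeps sealed backups in a
vault.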
 
Now, Adobe is apparently in the process of revoking their compromised code
signing key, but based on my internal investigations it isn’t at all clear
just how that will solve the problem if a bogus app has already been
installed.

I would love to have someone either confirm or deny my present
understanding, but on Windows XP it doesn’t appear that ANY checking of the
signing certificate is done after the application is first installed.  Even
then, it is not obvious that lower-level DLLs are ever validated.

On Vista and newer platforms, once an application has been installed, system
services and drivers, especially on 32-bit versions, are apparently never
checked again.  On 64-bit versions, drivers must be signed by Microsoft (not
VeriSign or Symantec) for most applications to be installed.  But once they
are installed, it is again not at all clear that platform system services
are checked for signatures.

And if all of that seems pretty fuzzy, the situation with respect to
certificates that simply expire, or are revoked, is even less clear.  

From what we have been able to determine, the inclusion of a CRL
distribution point in the certificate is optional, and without CRLs the user
or application would have no way of knowing whether the certificate had been
revoked.  More seriously, if the certificate simply expires – perhaps
because the code-signing organization only purchased a 1-, 2-, or 3-year
certificate and didn’t renew it (which raises yet another set of questions)
– then presumably no CRLs would be available in any case.

To make matters worse, the code-signing scripts have an option for
generating a secure time-stamp, and as long as that time-stamp predates the
revocation, the application is still considered “good”.  So that provides
protection against installing applications signed after the certificate was
revoked – which may take a week or more – but it does nothing at all to
protect against attacks that were only recently discovered.  In other words,
once the horse is out of the barn, the user is on their own.
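
The time-stamp behavior described above can be sketched as follows.  This is
a deliberately simplified model of Authenticode-style semantics with
hypothetical names and illustrative dates (real validation involves the full
certificate chain): a countersigned signature stays “good” as long as the
time-stamp predates both expiry and revocation, which is precisely why
revocation cannot retroactively invalidate code signed before the compromise
was noticed.

```python
from datetime import date

def signature_status(signed_on, cert_expires, revoked_on=None):
    """Toy model: a time-stamped signature is judged as of signing time."""
    if signed_on > cert_expires:
        return "invalid"           # signed after the certificate expired
    if revoked_on is not None and signed_on >= revoked_on:
        return "invalid"           # signed after the key was revoked
    return "good"                  # time-stamp predates expiry and revocation

# Illustrative dates only: a key is compromised and abused, then revoked later.
revoked = date(2012, 10, 4)
expiry = date(2013, 12, 31)
assert signature_status(date(2012, 7, 1), expiry, revoked) == "good"    # pre-revocation malware stays "good"
assert signature_status(date(2012, 11, 1), expiry, revoked) == "invalid"
```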

Unfortunately, in my opinion, the situation with Google and Android is even
worse.  Google REQUIRES that apps be signed with a certificate with a
25-year validity period in order to be eligible for distribution via Google
Play.  In my judgment that is just a ridiculously long time.  Not only does
it exceed the current NIST recommendations for RSA-2048 keys, which are to
be deprecated after 2030, but it doesn’t take into account the life
expectancy of most applications, particularly on devices as volatile as a
cell phone or tablet.  Think about it: going back 25 years would take us to
Windows 3.0 or earlier.  Who in their right mind would still be using apps
that came out that long ago?
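
The arithmetic behind the NIST objection is straightforward; as a worked
example (assuming a certificate issued today, at this writing):

```python
from datetime import date

NIST_RSA2048_CUTOFF = 2030        # SP 800-57: 112-bit strength acceptable through 2030
issued = date(2012, 9, 28)        # assumed issuance date
expires = issued.replace(year=issued.year + 25)

assert expires == date(2037, 9, 28)
assert expires.year - NIST_RSA2048_CUTOFF == 7   # outlives the cutoff by 7 years
```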

More importantly, 25 years is simply too long a period from a
risk-management and corporate-turnover perspective.  People will come and
go, and entire companies may turn over in that amount of time.  So from my
perspective, apps and the certificates used to sign them should have a life
expectancy of about 5 years, and after about 3 years the organization should
obtain a new certificate and start signing with the new key.

I think it is time for the ISC to rethink the PKI model as it applies to
code signing, as opposed to the traditional nonrepudiation of documents.

Bob

  _____  

From: Hoyt L Kesterson II <hoyt.kesterson at TERRAVERDESERVICES.COM>
Reply-To: Hoyt L Kesterson II <hoyt.kesterson at TERRAVERDESERVICES.COM>
Date: Thu, 27 Sep 2012 22:01:28 -0700
To: ABA <ST-ISC at MAIL.AMERICANBAR.ORG>
Subject: Re: Move along, nothing to see,

One doesn't have to break into an HSM to get to the keys (I'm assuming a
function this important is done with a hardware security module doing the
signing); one only has to fool the HSM into thinking you're an authorized
user of its services.

Ars Technica also has an article on this:
http://arstechnica.com/security/2012/09/adobe-to-revoke-crypto-key-abused-to-sign-5000-malware-apps/

   hoyt


On 27 Sep 2012, at 9:55 PM, Steven W. Teppler wrote:

Adobe signing cert problems...

http://www.theregister.co.uk/2012/09/27/adobe_cert_revoked/

Sent from my iPad




 
