Australian government encryption folly

In IDG’s CIO magazine (17 July 2018): Wickr, Linux Australia, Twilio sign open letter against govt’s encryption crackdown ‘mistake’.  Not just those few, though: 76 companies and organisations signed that letter.

Learning Lessons

Encryption is critical to the whole “online thing” working, for individuals as well as companies.  Let’s look back at history:

  • In most countries’ postal legislation, there’s a law against the Post Office opening letters and packages.
  • Similarly, a bit later, telephone lines couldn’t be tapped unless there was a very specific court order.

This was not just a nice gesture or convenience.  It was critical for

  1. trusting these new communication systems and their facilitating companies, and
  2. enabling people to communicate at greater distances and do business that way, or arrange other matters.

Those things don’t work if there are third parties looking or listening in, and it really doesn’t matter who or what the third party is. Today’s online environment is really not that different.

Various governments have tried to nobble encryption at various points over the past few decades: from trying to ban encryption outright, to requiring master keys to be held in escrow, to exploiting (and possibly creating) weaknesses in encryption mechanisms.  The latter in particular is very damaging, because it’s so arrogant: the whole premise becomes that “your people” are the only bright ones who can figure out and exploit the weakness. Such presumptions are always wrong, even if there is no public proof. A business or criminal organisation that figures it out can make good use of it for its own purposes, provided it keeps quiet about it.  And companies are, ironically, much better than governments at keeping secrets.

Best Practice

Apps and web sites, and facilitating infrastructures, should live by the following basic rules:

  • First and foremost, get people on staff or externally who really understand encryption and privacy. Yes, geeks.
  • Use proper and current encryption and signature mechanisms, without shortcuts.  A good algorithm incorrectly utilised is not secure.
  • Only ask for and store data you really need, nothing else.
  • Be selective with collecting metadata (including logging, third party site plugins, advertising, etc). It can easily be a privacy intrusion and also have security consequences.
  • Only retain data as per your actual needs (and legal requirements), no longer.
  • When acting as an intermediary, design for end-to-end encryption with servers merely passing on the encrypted data.

Most of these aspects interact.  For example: if you just pass on encrypted data, but meanwhile collect and store an excess of metadata, you’re not actually delivering a secure environment. On the other hand, by not having data, or keys, or metadata, you’ll neither be a target for criminals nor have anything to hand over to a government.
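
As a minimal sketch of that end-to-end approach, assuming PyNaCl (the libsodium bindings for Python) is available: each client generates its own key pair, only public keys are exchanged, and the server merely stores and forwards ciphertext.

    # Minimal end-to-end encryption sketch using PyNaCl (libsodium bindings).
    # Each client keeps its own private key; the server only ever sees ciphertext.
    from nacl.public import PrivateKey, Box

    # Key pairs are generated on each client device, never on the server.
    alice_private = PrivateKey.generate()
    bob_private = PrivateKey.generate()

    # Only the public keys are exchanged (via your server or a key directory).
    alice_public = alice_private.public_key
    bob_public = bob_private.public_key

    # Alice encrypts for Bob; the Box construction also authenticates the sender.
    sending_box = Box(alice_private, bob_public)
    ciphertext = sending_box.encrypt(b"meet at 10:00")

    # The server merely stores or forwards the ciphertext; it cannot read it.

    # Bob decrypts using his private key and Alice's public key.
    receiving_box = Box(bob_private, alice_public)
    plaintext = receiving_box.decrypt(ciphertext)
    print(plaintext)  # b'meet at 10:00'

The point of such a design is in what the server does not have: no private keys, no plaintext, and ideally very little metadata.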

See also our earlier post on Keeping Data Secure.

But what about criminals?

Would you, when following these guidelines, enable criminals? Hardly. The proper technology is available anyway, and criminal elements who are not technically capable can hire that knowledge and skill to assist them. Fact is, some smart people “go to the dark side” (for reasons of money, ego, or whatever else).  You don’t have to presume that your service or app is the one thing that enables these people. It’s just not.  Which is another reason why these government “initiatives” are folly: they’ll fail at their intended objective, while at the same time impairing and limiting general use of essential security mechanisms.  Governments themselves could do much better by hiring and listening to people who understand these matters.


How not to respect your users’ privacy

You just run the usual online frameworks, with their extensive plugin range, CDN, Google Analytics, NewRelic, Twitter, Facebook and LinkedIn widgets, and the rest.  Then, you display a notice to your users that your site uses cookies and passes some data to third parties (such as Google Analytics and NewRelic) “to enhance the user experience”.

There. Easy, right? You probably didn’t need to change anything at all. Most companies, sites and applications do this.  Now tell me: given that at least some of the above probably applies to you, how come you also display a notice to your users explaining how you respect their privacy?  They can’t both be true.

So yes, this was a test.  And most of us fail, including us.  Why is this?

  1. Are you asking for and storing more data than you actually require for delivering the product or service that you provide?  You can probably only test this by working out the minimum data requirements, questioning each item, and then comparing that list with what you currently actually collect.  There’s likely to be a (large) discrepancy.
  2. Are you using multiple analytics and trackers?  Why?  It does in fact affect the user experience of your site, both in terms of speed and privacy.  And you probably don’t actually use all that data.  So think about what you actually use, and get rid of the rest.  That’s a good exercise and an excellent step.
  3. Does your site deliver pixel images for Facebook and others?  If so, why?
  4. Does your site show a “site seal” advertising your SSL certificate’s vendor?  If so, why?
  5. Does your site set one or more cookies for every user, rather than only logged-in users?  If so, why?
  6. Most CMS and frameworks actually make it difficult not to flood users with cookies and third-party tracking. They have become the new bloat.  Example: you use a component that includes a piece of javascript or css from a vendor-provided CDN. Very convenient, but you’ve just handed that vendor your site-usage data as well as your users’ IP addresses.
  7. Respecting privacy is not “business as usual” + a notice. It’s just not.

So, privacy is actually really hard, in large part because our tools make it so.  They make it so not for your users’ convenience, or even your convenience, but for the vendors of said tools/components. You get some benefit, which in turn could benefit your users, but I think it’s worthwhile to really review what’s actually necessary and what’s not.

A marketing or sales person might easily say “more data is better”, but is it, really?  It affects site speed and user experience. And unless you’ve got your analytics tools really well organised, you’re actually going to find that all that extra data is overhead you don’t need in your company.  If you just collect and use what you really need, you’ll do well. Additionally, it’ll enable you to tell your users/clients honestly about what you do and why, rather than deliver a generic fudge-text as described in the first paragraph of this post.

Here are a few quick hints to check your users’ privacy experience, without relying on third-party sites (there’s also a small script sketch after this list):

  • Install EFF’s Privacy Badger plugin.  It uses heuristics (rather than a fixed list) to identify suspected trackers and deal with them appropriately (allow, block cookies, block completely).  Privacy Badger provides you with an icon on the right of your location bar, showing a number indicating how many trackers the current page has.  If you click on the icon, you can see details and adjust.  And as a site-owner, you’ll want to adjust the site rather than the badger!
  • If you click on the left hand side of your location bar, on the secure icon (because you are already offering https, right?), you can also see details on cookies: both how many and to which domains. If you see any domains which are not yours, they’re caused by components (images, javascript, css) on your page that retrieve bits from elsewhere. Prepare to be shocked.
  • To see in more detail what bits an individual page uses, you can right-click on a page and select “Inspect” then go to the “Sources” tab.  Again, prepare to be shocked.
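
For a rough self-check from the command line, the sketch below (assuming Python with the requests and beautifulsoup4 packages installed) fetches one of your pages and lists the third-party hosts it pulls scripts, images, stylesheets and iframes from. It only looks at the static HTML, so trackers injected later by javascript won’t show up, but it’s a useful first pass.

    # List third-party hosts referenced by a page (static HTML only).
    # Assumes the requests and beautifulsoup4 packages are installed.
    from urllib.parse import urlparse

    import requests
    from bs4 import BeautifulSoup

    page_url = "https://www.example.com/"   # replace with a page on your own site
    own_host = urlparse(page_url).hostname

    html = requests.get(page_url, timeout=10).text
    soup = BeautifulSoup(html, "html.parser")

    third_party_hosts = set()
    for tag, attr in (("script", "src"), ("img", "src"),
                      ("link", "href"), ("iframe", "src")):
        for element in soup.find_all(tag):
            url = element.get(attr)
            if not url:
                continue
            host = urlparse(url).hostname
            # Relative URLs have no hostname and are served by your own site.
            if host and host != own_host:
                third_party_hosts.add(host)

    for host in sorted(third_party_hosts):
        print(host)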

Use that shock well, to genuinely improve privacy – and thereby respect your users.

Aside from the ethics, I expect that these indicators (cookies, third-party resource requests, trackers, etc) will get used to rank sites and identify bad players. So there’ll be a business benefit in being ahead of this predictable trend.  And again, doing a clean-up will also make your site faster, as well as easier to use.


Keeping Data Secure

We often get asked about data security (how to keep things safe) and about local regulations and certifications regarding the same. Our general thoughts on this are as follows:

  1. Government regulations tend to end up becoming part of the risk/cost/benefit equations in a business, which is not particularly comforting for customers.
    • Example: some years ago an Australian bank had a mail server mis-configured to allow relaying (i.e., people could send phishing emails pretending to legitimately originate from that bank).  A caring tech citizen reported the issue to the bank.  Somehow, it ended up with the legal department rather than a system/network administrator.  The legal eagles decided that the risk to the organisation was fairly low, and didn’t forward it for action at that time.  Mind you, the network admin would’ve been able to fix up the configuration within minutes.
  2. Appreciate that certifications mainly give you a label to wave in front of a business partner that requires one; they do not make your business more secure.
    • Data leaves footprints.  For instance, some people use a separate email address for each website they interact with.  Thus, when a list of email addresses leaks, saying “it didn’t come from us” won’t hold.  That’s only a simple example, but it illustrates the point.  Blatant denial was never a good policy, but these days it’ll backfire even faster.
  3. Recent legislation around mandatory data retention only makes things worse, as
    • companies tend to already store much more detail about their clients and web visitors than is warranted, and
    • storing more activity data for longer just increases the already enlarged footprint.

business advice personSo what do we recommend?

  1. Working within the current legal requirements, we still advise keeping as little data as possible.
    • More data does not intrinsically mean more value – while it’s cheap and easy to gather and store more data, you’ll find much more value in being strategic about what you collect and store.
  2. Fundamentally, data that you don’t have can’t be leaked/stolen/accessed through you.  That’s obvious, but still worth noting.
    • Our most critical example of this is credit card details.  You do not want to store credit card details, ever.  Not for any perceived reason.  There are sensible alternatives using tokens provided by your credit card gateway, so that clients’ credit cards never touch your system (see the sketch after this list).  We wrote about this (again) in our post “Your Ecommerce Site and Credit Cards” last year.
      Why?  It’s fairly easy to work out from a site’s frontend behaviour whether it stores credit cards locally, and if it does, you’re much more of a target.  Credit card details provide instant anonymous access to financial resources.  Respect your clients.
  3. More secure online architecture.
    • We’ll do a separate post on this.
  4. If you have a data breach, be sensible and honest about it.
    • If your organisation operates in Australia and is covered by the Privacy Act (businesses “with an annual turnover of $3 million or more, credit reporting bodies, health service providers, and TFN recipients, among others”), the Notifiable Data Breaches scheme, which came into force in February 2018, applies to you.
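
As an illustration of the token approach from point 2, here’s a minimal sketch. PaymentGateway is a made-up stand-in for whatever tokenising gateway you use, not a real SDK: the customer’s browser sends card details directly to the gateway, which hands back an opaque token, and your server only ever handles that token.

    # Illustrative only: PaymentGateway is a hypothetical stand-in, not a real SDK.
    from dataclasses import dataclass

    @dataclass
    class ChargeResult:
        transaction_id: str

    class PaymentGateway:
        def charge(self, amount_cents: int, currency: str, token: str) -> ChargeResult:
            # In reality this would call the gateway's API, passing the token that
            # the customer's browser obtained directly from the gateway.
            return ChargeResult(transaction_id="txn_demo_123")

    def checkout(gateway: PaymentGateway, amount_cents: int, card_token: str) -> str:
        # The card number never touches our server: we only handle the token,
        # and store the gateway's transaction id for refunds and reconciliation.
        result = gateway.charge(amount_cents=amount_cents, currency="AUD", token=card_token)
        return result.transaction_id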

We’re happy to advise and assist.  Ideally, before trouble occurs.  For any online system, that’s a matter of when, not if.
(And, of course, we’re not lawyers.  We’re techies.  You may need both, but never confuse the two!)


Web Security: SHA1 SSL Deprecated

You may not be aware that the mechanism used to sign the SSL certificates that keep your access to websites encrypted and secure is changing. The old method, known as SHA1, is being deprecated – meaning it will no longer be supported. As of January 2016, various vendors will no longer support creating certificates with SHA1, and browsers show warnings when they encounter an old SHA1 certificate. From January 2017, browsers will reject old SHA1 certificates.

The new signing method, known as SHA2, has been available for some time. Users have had a choice of signing methods up until now, but there are still many sites using old certificates out there. You may want to check the security on any SSL websites you own or run!

To ensure your users’ security and privacy, force https across your entire website, not just e-commerce or other sections. You may have noticed this move on major websites over the last few years.
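
The redirect is usually best done in your web server or load balancer configuration, but if it has to live in the application, a minimal sketch using Flask (an assumption – substitute your own stack) could look like this:

    # Application-level HTTP-to-HTTPS redirect sketch (Flask assumed).
    # Behind a proxy or load balancer, check the X-Forwarded-Proto header instead.
    from flask import Flask, redirect, request

    app = Flask(__name__)

    @app.before_request
    def force_https():
        # Permanently redirect any plain-HTTP request to its HTTPS equivalent.
        if not request.is_secure:
            return redirect(request.url.replace("http://", "https://", 1), code=301)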

For more information on the change from SHA1 to SHA2, see the announcements from your certificate vendor and the major browser makers.

To test whether your website is using a SHA1 or SHA2 certificate, you can use an online SSL checker, or check directly with a short script like the sketch below.
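
As a quick direct check (a sketch assuming Python with the cryptography package installed), you can fetch a site’s certificate and print which hash its signature uses:

    # Print the signature hash algorithm of a site's certificate.
    # 'sha256', 'sha384' etc. are in the SHA2 family; 'sha1' means it needs replacing.
    import socket
    import ssl

    from cryptography import x509
    from cryptography.hazmat.backends import default_backend

    hostname = "www.example.com"    # replace with your own site

    context = ssl.create_default_context()
    with socket.create_connection((hostname, 443)) as sock:
        with context.wrap_socket(sock, server_hostname=hostname) as tls:
            der_cert = tls.getpeercert(binary_form=True)

    certificate = x509.load_der_x509_certificate(der_cert, default_backend())
    print(certificate.signature_hash_algorithm.name)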

Open Query also offers a Security Review package, in which we check on a broad range of issues in your system’s front-end and back-end and provide you with an assessment and recommendations. This is most useful if you are looking at a form of security certification.
