nettime's_certificate_authority on Mon, 20 Apr 2015 22:01:28 +0200 (CEST)



<nettime> FWD: Please consider the impacts of banning HTTP #107


<https://github.com/WhiteHouse/https/issues/107>

Please consider the impacts of banning HTTP #107

On the website for the HTTPS-Only "Standard", there is a statement that
'there is no such thing as insensitive web traffic' -- yet there is.
Because many institutions have policies against FTP and peer-to-peer
protocols, HTTP has become the de facto standard for sharing scientific
data.

Much of this traffic consists of regularly scheduled bulk downloads
through wget and other automated retrieval tools. Forcing these
transfers over HTTPS would put an undue strain on limited resources that
have become even more constrained over the past few years.
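
As a rough illustration, here is a minimal sketch of one such scheduled
retrieval; the host name, path, and schedule are hypothetical, not any
particular archive's:

     # crontab entry: mirror the data directory at 02:00 local time
     # -q quiet, -r recurse, -N only fetch files newer than the local
     # copy, -np never ascend above the starting directory
     0 2 * * * wget -q -r -N -np http://data.example.gov/archive/daily/

Pointing the same job at an https:// URL adds a TLS handshake and
encryption overhead to every connection, and the transfer can no longer
be offloaded to a shared cache.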

Some data transfers are handled through rsync or scp with the High
Performance (HPN) extensions to SSH, but this requires coordination
between each pair of institutions and additional paperwork to track
accounts. That paperwork involves collecting Personally Identifiable
Information (PII) that is not necessary when distributing data over
protocols that do not require authentication. This paperwork
has restricted our ability to openly share scientific data with
international partners, and has impacted our sharing with an NSF partner
that had a foreign-born system administrator.

In 2013, the White House Office of Science & Technology Policy issued a
memo on "Increasing Access to the Results of Federally Funded Scientific
Research". This document requires that many agencies must create plans
to make their scientific data available to the public in digital
formats. Even if agencies were to lift their restrictions on
peer-to-peer protocols, they don't scale to track the 100M+ files that
larger projects manage.

The qualities that improve the privacy of HTTPS connections are a
hindrance to bandwidth management. Caching proxies can be used with HTTP
so that multiple users at a given site do not each have to download a
file directly from the source. This is especially important for Near
Real Time (NRT) data, where many sites poll for data, or when remote
websites directly embed image links.
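
As a sketch of how such caching is usually arranged (the host names here
are hypothetical): clients at a site point their retrieval tools at a
local caching proxy, which fetches each file from the source once and
answers later requests from its cache. Over HTTPS the proxy can only
tunnel the encrypted connection and cannot cache, unless it terminates
the TLS session itself.

     # hypothetical site-local caching proxy; wget and most other HTTP
     # tools honor this environment variable
     export http_proxy=http://webcache.example.edu:3128/
     wget -q -N http://data.example.gov/nrt/latest_image.png
     # the first request fills the proxy's cache; later requests from
     # other machines at the site are served locally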

This is also important when something is announced in the news. There
have been times when we have had to remove 'branding' images from our
webservers to reduce bandwidth. Even with the crush of international
traffic, we have been able to withstand torrents of requests that were
multiple orders of magnitude higher than our typical traffic. It is rare
for us to know in advance when our data will be newsworthy, or which
data specifically will be of interest.

The effect on those without computers at home

This increased bandwidth is not only problematic for those serving the
data, but may also be an issue for those trying to use the data. Schools
and libraries frequently install proxy servers both to minimize their
bandwidth consumption through caching and to prevent students and
patrons from going to inappropriate sites. In some cases, this filtering
is mandated by state or local laws. To comply with these laws, some
institutions block HTTPS entirely.

Larger libraries may have a means to request unrestricted access, but
typically require a separate request each time, to prevent patrons from
viewing such materials within eyeshot of children.

As such, requiring HTTPS may make it harder for such institutions to
provide the same level of service while complying with their local laws.
To enable scanning of HTTPS, they could configure their proxies to
perform a man-in-the-middle interception, removing the very privacy that
citizens would otherwise have expected.

Restricting the use of HTTP will require changing a number of scientific
analysis tools.

The overall budget cutbacks of the past few years have led to decreased
funding for maintaining scientific analysis software. A good portion of
that software is stable and not under active development.

Many of these packages retrieve data using HTTP. Should that access be
removed, someone will have to adjust the packages to retrieve data using
HTTPS or some other protocol. Additional work may be required later to
deal with future patches to the SSL libraries, whereas older versions of
HTTP are still supported by modern web servers.

This may be even more of a problem for software used on currently
running missions. In one case that I was involved with, one of our
sources for schedule information changed over to require SSL. We
attempted to get the necessary software running on the machine that was
used for commanding, but after days of effort, we gave up. Instead, I
have a cron job running on another system that retrieves the schedules
over HTTPS, and the commanding machine then picks up the file from our
local server using HTTP.
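
A minimal sketch of that workaround, with hypothetical host names and
paths rather than the actual mission systems:

     # on the intermediate system, a crontab entry fetches the schedule
     # over HTTPS every 15 minutes into the local web root
     */15 * * * * wget -q -O /var/www/html/sched.txt https://example.gov/sched.txt

     # the commanding machine then picks the file up over plain HTTP
     # from the internal server, which its older software can still use
     wget -q -O sched.txt http://internal.example.gov/sched.txt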

For other missions that have to go through change control, re-certifying
their workflows to use HTTPS could be a rather significant cost, both now
and again in the future as SSL patches are applied.

HTTPS is not a security improvement for the hosts.

Although HTTPS makes it more difficult for an ISP to sniff someone's web
browsing, this feature should be properly weighed against all of the
security issues, as it can actually increase the attack surface and
cause other problems:

     * Flaws in SSL/TLS implementations have made it possible for third
       parties to dump information about activity (Heartbleed).

     * Flaws in SSL/TLS can create a false sense of security (POODLE,
       FREAK).

     * HTTPS would hide attacks from existing intrusion detection systems.

Many of the issues are specifically related to SSLv3, but most servers
still enable it by default. It is quite possible that, if forced to meet
a deadline, sysadmins will be rushed and use the defaults, making their
systems less safe.
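
As one illustration of the follow-up work this implies (the host name is
hypothetical, and the -ssl3 option is only present in OpenSSL builds
that still include SSLv3 support), an administrator has to check each
server and explicitly turn the legacy protocol off rather than accept
the defaults:

     # test whether a server still accepts SSLv3
     openssl s_client -connect www.example.gov:443 -ssl3 < /dev/null
     # a completed handshake means SSLv3 is still enabled, and the
     # server configuration (e.g. Apache's "SSLProtocol all -SSLv3")
     # needs to be tightened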

Should there be a remote exploit, this could invalidate any of the
privacy improvements that may have been gained from switching to HTTPS.

In the past, we have taken servers offline as a preventive measure
against zero-day exploits that we could not conclusively prove we were
immune to or able to mitigate. The alternative, should an exploit occur
or be suspected of having occurred, is multiple weeks of rebuilding the
server from the ground up and re-certifying it for use. As such,
anything that increases the attack surface can decrease the availability
of the services that we provide to the public.

HTTPS is frequently implemented improperly

We also have the case of poorly implemented HTTPS systems in the US
federal government. I have stopped keeping track of the number of
certificates that I have encountered that are self-signed, signed by an
unknown CA, or expired. If certificates are not maintained, we risk
desensitizing the public to the issue and training them to automatically
trust the certificates without the appropriate level of scrutiny.

If webservers are not properly managed and insecure ciphers removed as
exploits are found against them, HTTPS may only offer the illusion of
privacy while leaving citizens vulnerable to monitoring from ISPs or
from other participants on unswitched networks such as wifi.

Just yesterday, I was given instructions for our new voicemail system,
which stated after giving the HTTPS URL to access it:

     Note: You may receive a certificate error. Select "continue" to
     proceed to the site.

With the relaxing of rules regarding disclosure of information to third
parties, government websites were allowed to use services hosted
externally. If these services don't support HTTPS, we risk serving
'mixed' content back to the user -- which means we train them to ignore
the warning if they want access to our content.

In conclusion:

Websites that use authentication or hold personally identifiable
information about the users of their systems should use HTTPS. There may
be other sites for which HTTPS would be appropriate, but there are still
situations for which HTTP is a better choice.

In summary:

     1. Non-sensitive web traffic does exist.

     2. Moving to HTTPS has a non-trivial cost.

     3. HTTPS will reduce the availability of government information and
        services.

     4. HTTPS increases the risk for the maintainers of the servers.

     5. HTTPS is already implemented and broken on many federal
        websites.

     6. HTTPS may only offer an illusion of privacy while still being
        insecure.

     7. This proposal risks increasing the 'digital divide' for citizens
        who access the internet through schools, libraries or other filtered
        access points.  

     8. This proposal risks training citizens to ignore security warnings 
        from badly configured or maintained websites.

Although HTTPS may be better for citizen privacy, it actually increases
the risks for the maintainers of the servers and can require significant
changes in network architecture to mitigate those risks. There is a
non-trivial cost in banning HTTP, which could adversely affect the
distribution of information to the public.

Joe Hourclé (joseph.a.hourcle@nasa.gov)
Webserver & Database Administrator
Solar Data Analysis Center
Goddard Space Flight Center

#  distributed via <nettime>: no commercial use without permission
#  <nettime>  is a moderated mailing list for net criticism,
#  collaborative text filtering and cultural politics of the nets
#  more info: http://mx.kein.org/mailman/listinfo/nettime-l
#  archive: http://www.nettime.org contact: nettime@kein.org