Five years ago, Conficker/DOWNAD was first seen and quickly became notorious due to how quickly it spread and how much damage it caused.
Remarkably, after all that time, it’s still alive. It can still pose a serious problem, as it can propagate to other systems on the same network as an infected machine – a factor that may explain its high rate of infection to this day.
Based on feedback from the Smart Protection Network, DOWNAD has been a leading threat for years. It has been the most prolific threat – as measured by the number of infections seen in the wild – since 2011. It has beaten out a wide variety of threats – from crack key generators to ZeroAccess – for this dubious distinction.
It also popularized the use of domain generation algorithms. This technique generates multiple (hundreds, in the case of DOWNAD) domains on a daily basis. It uses these domains to connect to its command-and-control servers. The sheer number of generated domains makes blocking this C&C much more difficult. Since then, it has been adopted by other malware families as well.
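The idea behind a DGA can be shown in a short sketch. This is not DOWNAD's actual algorithm – the seed, hash function, and TLD list below are illustrative assumptions – but it captures the mechanic: both the malware and its operators can compute the same date-dependent list of candidate domains, so the C&C rendezvous point changes every day and defenders must block hundreds of names instead of one.

```python
import hashlib
from datetime import date

def generate_domains(seed, day, count=5):
    """Toy domain generation algorithm: derive pseudo-random
    domain names from a shared seed and the current date."""
    tlds = [".com", ".net", ".org", ".info", ".biz"]
    domains = []
    for i in range(count):
        data = "{}-{}-{}".format(seed, day.isoformat(), i).encode()
        digest = hashlib.md5(data).hexdigest()
        # Use 10 hex characters as the label; rotate through TLDs.
        domains.append(digest[:10] + tlds[i % len(tlds)])
    return domains

# The same inputs always yield the same list, which is how the
# operators know which domains to register on a given day.
todays_domains = generate_domains("example-seed", date(2013, 12, 9))
```

Real DGAs generate far more candidates per day (hundreds for DOWNAD) and contact each one until a live C&C server responds.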
In order to propagate across networks, it used a zero-day vulnerability, which was later patched by Microsoft in security bulletin MS08-067. Despite the availability of a patch, many users remain vulnerable due to negligent patching practices as well as piracy. Pirated versions of Microsoft Windows are often unable to download and install security patches.
In the long term, as Windows XP machines are retired following the end of extended support next year, DOWNAD is destined to recede into the background. However, some systems may still be at risk. The best defense is simple: ensure that the software you run – particularly your operating system – has the latest security updates. You should also check out our tips on how to see if your system is in fact infected.
We have prepared a full malware profile which describes the capabilities, the spread, and the risks of DOWNAD/Conficker.
Throughout all of 2013, there have been numerous revelations about how the NSA conducts mass surveillance on the Internet. These have sent the Internet Engineering community reeling. Protocols that have been in use for decades and based heavily on intrinsic trust have had that trust violated.
This has caused the Internet standards community to take a look at the need for encryption. Specifically, it’s been discussed whether HTTP/2.0 – the upcoming version of the protocol that powers much of the Internet – should be encrypted by default. Overall, this is a positive trend, but there are some challenges that should be considered.
First, encryption without pre-existing trust adds little value. Casual eavesdropping can be prevented, but it is ineffective against a sophisticated operator. Consider, for example, self-signed certificates (often found in small or local web applications). An attacker could easily impersonate the server with a key and certificate that they create and proxy your traffic unencrypted to the real web server, giving them access to read, modify, and inject traffic within your session.
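This trust gap can be illustrated with Python's standard `ssl` module. The first context below enforces chain validation and hostname checking, so a self-signed or forged certificate is rejected; the second accepts any certificate at all, which is encryption without authentication – exactly the condition an active attacker needs to terminate the connection with a key pair they generated themselves.

```python
import ssl

# A properly verifying client context: the peer's certificate must
# chain to a trusted root, and the hostname must match.
verifying = ssl.create_default_context()

# A context with verification disabled: the handshake succeeds no
# matter what certificate the server (or an impersonator) presents.
# check_hostname must be cleared before verify_mode can be relaxed.
unverified = ssl.SSLContext(ssl.PROTOCOL_TLS_CLIENT)
unverified.check_hostname = False
unverified.verify_mode = ssl.CERT_NONE
```

Traffic through the second context is still encrypted on the wire, but the client has no idea who holds the other end of the tunnel.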
Second, certificate authorities are not always reliable or secure either. Various CAs – including Comodo, DigiNotar, GlobalSign, and StartCom – have all suffered some kind of security incident. One can argue (for a very long time) whether non-trusted CAs or having no encryption is “better”. DANE (specified in RFC 6698) allows service operators to publish keys and certificates within DNSSEC, which means that certificates can be verified without a CA being involved. Challenges like how to deal with typo-squatting domains and compromised DNS infrastructure remain, but it’s technically possible to establish publicly trusted encryption without the involvement of a CA. Whether it will be put into wide use is unclear.
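A TLSA record of the kind DANE uses is simple to construct. The sketch below builds a “3 0 1” record – usage 3 (trust this end-entity certificate directly), selector 0 (match the full certificate), matching type 1 (compare by SHA-256 digest), per RFC 6698. The DER bytes here are a stand-in; a real deployment would hash the server's actual DER-encoded certificate and publish the record at a name like `_443._tcp.example.com`, protected by DNSSEC.

```python
import hashlib

def tlsa_3_0_1(cert_der):
    """Build the data portion of a DANE TLSA '3 0 1' record:
    the SHA-256 digest of the DER-encoded certificate."""
    digest = hashlib.sha256(cert_der).hexdigest()
    return "3 0 1 {}".format(digest)

# Placeholder bytes standing in for a real DER-encoded certificate.
record = tlsa_3_0_1(b"illustrative-der-bytes")
```

A DANE-aware client resolves the TLSA record over DNSSEC and compares the digest to the certificate presented in the TLS handshake – no CA signature is consulted at any point.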
Third, what percentage of the traffic needs to be encrypted? In the past, encryption was used sparingly due to the cost and the increased resources necessary. Banking-related pages and transactions were the most frequent cases where this was done. However, improvements by CDNs and gains in processing power have reduced the relative costs to the point where it is feasible to encrypt all traffic. Many sites are doing just that today.
Finally, we have to look at encryption primitives themselves. The security of some of these critical cryptographic building blocks has been called into question. Some are worried that these algorithms have been weakened in such a way that government agencies can decrypt otherwise secure traffic. For example, it has been alleged that the Dual_EC_DRBG random number generator (RNG) has been compromised by the NSA by specifying insecure constants. While by no means a master key, a cryptanalyst would have an enormous head-start if they had any insight into the next number that is likely to come out of an RNG.
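Why insight into an RNG is so damaging can be shown with a toy example. The linear congruential generator below stands in for Dual_EC_DRBG (the constants are glibc-style, not from any real cryptographic standard): an observer who knows the generator's internal parameters needs only one observed output to reproduce every subsequent one.

```python
# Toy linear congruential generator. These parameters are
# illustrative; real back-doored constructions are far subtler.
A, C, M = 1103515245, 12345, 2**31

def lcg(state):
    """Advance the generator: each output is also the next state."""
    return (A * state + C) % M

# The victim draws a stream of "random" numbers...
state = 20131209
outputs = []
for _ in range(5):
    state = lcg(state)
    outputs.append(state)

# ...but anyone who knows A, C, and M can predict the rest of the
# stream from a single observed value.
predicted = lcg(outputs[0])
```

Here `predicted` equals `outputs[1]` exactly. Cryptographic RNGs are designed so that outputs reveal nothing about internal state; the Dual_EC_DRBG allegation is that its published constants quietly break that property for whoever chose them.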
Of course, some would say that wide-scale HTTP encryption is not necessary: “If I haven’t done anything wrong, I have nothing to hide.” Such arguments fall rather flat in the face of bulk, large-scale data collection by governments. Could simply making the same set of search-engine queries as a terrorist put you on a watch list?
This may seem like hyperbole, but in the world of big data, small similarities often trigger associations that may or may not exist in reality. Did you sell your couch to the brother-in-law of a terrorist three years ago on Craigslist? Connections that we would consider insignificant in person can take on new meaning as part of data correlation.
We have come a long way from the early days of the World Wide Web, where everything was in plain text and images were a novelty. Now that so much of our lives exist online, it is increasingly important to have trustworthy infrastructure behind the services we use. Changes in the threat landscape mean that our infrastructure has to change too. HTTP/2.0 won’t solve all of the problems facing the Internet. However, it is a step in the right direction.