A Bit of Personal News

November 28th, 2012

I will shortly be getting back to more regular blogging on security issues, but I thought I would inform the readers of my blog that recently I was awarded an MSc in Information Security from Royal Holloway, University of London. I had been working on it for the past two years, and I am happy to announce that I received a “Pass with Distinction”, the highest grade possible.

Last night I was also informed that my MSc project (on fuzz testing web applications) received a SearchSecurity.co.UK award for being of “outstanding quality”.

That’s all the personal news for now. I’ve been very busy at work over the past two months, so I have had less time to do personal projects like updating this blog. As Christmas nears, I’ll have more time for these sorts of things. Until then, thanks for reading!

Ten Tips For Securing Your Web Applications

September 30th, 2012

Web applications are notoriously insecure. With more of us migrating to web-based technologies, ridding the web of these insecurities becomes a top priority. Here are ten tips that should help you secure your web applications.

1. Send all confidential data over a secure connection.

At the very least, send user credentials (i.e. username and password) over HTTPS. Ideally, send all data over HTTPS, especially when your apps are dealing with large amounts of personal information. There are almost no excuses for not using HTTPS these days, especially when buying an SSL certificate is so cheap. Be aware that if you choose to only send credentials over HTTPS, your web application will still be susceptible to session hijacking attacks, since an attacker can sniff the session cookie from the remaining unencrypted traffic.

Never send any confidential data in an email, especially password confirmation emails. Email is not a secure method of communication, and it probably never will be (PGP is not widely used at all). When dealing with passwords, always let the user set their own, as opposed to generating one for them. That way, you never need to send their password in an email, since they already know what it is.
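
As a minimal sketch of enforcing this in PHP (assuming a typical setup where the web server populates $_SERVER['HTTPS']), you can redirect any plain-HTTP request to HTTPS before doing anything else:

    <?php
    // Minimal sketch: force HTTPS by redirecting plain-HTTP requests.
    // Assumes the web server sets $_SERVER['HTTPS'] ('off' on some IIS setups).
    if (empty($_SERVER['HTTPS']) || $_SERVER['HTTPS'] === 'off') {
        $target = 'https://' . $_SERVER['HTTP_HOST'] . $_SERVER['REQUEST_URI'];
        header('Location: ' . $target, true, 301); // permanent redirect
        exit; // stop processing so nothing is served over plain HTTP
    }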

2. Encrypt confidential data before storing it.

If your web application stores credit card numbers of users or other confidential data, make sure that this data is encrypted in whatever storage medium you are using. If your web application needs to access this data, it should decrypt a copy in memory, use it, and then discard it. At no point should the unencrypted data be written to any permanent location.

Additionally, the key(s) used for encryption/decryption should not be stored in the same location as the encrypted data. This is to minimize damage if the storage medium is compromised (for instance, if hackers gain access to a database containing encrypted data, the decryption key should not also be compromised).
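
A rough PHP illustration of this tip follows; it is a sketch under stated assumptions, not a definitive implementation. The key file path is hypothetical, the key itself should be 32 random bytes for AES-256, and the OpenSSL functions used require PHP 5.3 or later:

    <?php
    // Load the key from somewhere separate from the database.
    // The path is hypothetical; the key should be 32 random bytes for AES-256.
    $key = file_get_contents('/etc/myapp/secret.key');

    function encrypt_for_storage($plaintext, $key) {
        $iv = openssl_random_pseudo_bytes(16);             // fresh IV per record
        $ciphertext = openssl_encrypt($plaintext, 'aes-256-cbc', $key, 0, $iv);
        return base64_encode($iv) . ':' . $ciphertext;     // store the IV alongside
    }

    function decrypt_from_storage($stored, $key) {
        list($iv, $ciphertext) = explode(':', $stored, 2);
        return openssl_decrypt($ciphertext, 'aes-256-cbc', $key, 0, base64_decode($iv));
    }

Note that the decrypted value only ever exists in memory; only the IV and ciphertext are written to the database.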

3. Salt and hash all passwords in the database.

This is possibly one of the most important things a web application designer should implement in terms of user security, but again and again we see large organizations and companies either ignoring or misunderstanding the importance of salting and hashing passwords.

There are absolutely no excuses for not salting and hashing passwords. Your web application should never be able to retrieve a user’s password, either for a comparison or for sending to the user in case they forget it. When the user first registers, their password should be concatenated with a salt (a unique random string of characters) and then hashed with a strong hashing algorithm, ideally a deliberately slow one designed for passwords, such as bcrypt (fast general-purpose hashes like SHA-256 can be brute-forced far more quickly on modern hardware). PHP has a built-in function called crypt() that supports numerous hashing methods, including bcrypt.
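
For example, a minimal sketch using crypt() with the Blowfish (bcrypt) scheme might look like the following. The “$2y$” prefix requires PHP 5.3.7 or later (older versions use “$2a$”), and the cost parameter of 10 is just a reasonable starting point:

    <?php
    // Hash a new password with a random bcrypt salt.
    function hash_password($password) {
        // bcrypt salts are 22 characters from the alphabet [./0-9A-Za-z];
        // base64 output is close enough once '+' is mapped to '.'.
        $salt = substr(str_replace('+', '.',
            base64_encode(openssl_random_pseudo_bytes(16))), 0, 22);
        return crypt($password, '$2y$10$' . $salt);
    }

    // Verify a login attempt: crypt() re-uses the salt embedded in the stored hash.
    function check_password($password, $stored_hash) {
        return crypt($password, $stored_hash) === $stored_hash;
    }

Only the resulting hash is ever stored; the application never needs to recover the original password.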

Deep Web, Dark Web, Darknet, and Dark Internet

August 4th, 2012

The terms Deep Web, Dark Web, Darknet, and Dark Internet are ones I see confused and misused on a regular basis on the Internet and in the media. This is my attempt to rectify this confusion and misuse by explaining what each of these terms means and when you should use them.

Deep Web

The Deep Web is quite simply any content on the Web which is not accessible to or indexed by standard search engine spiders. A search engine spider will typically crawl a website by visiting it and then visiting all the pages it links to, which includes pages local to the site and pages on other sites. Whilst this gives the search engine a pretty good view of the web, it misses out on a lot of other resources for various reasons:

  • Standard search engine spiders do not try to log into any websites, so any resources protected by a login are not accessible to them.
  • Content which explicitly denies access to search engine spiders (e.g. using a robots.txt file, as shown in the example after this list) is also left off the search engine index.
  • A web server may host a file or directory of files that isn’t linked to anywhere on the web. These files and directories would be missed by search engines, just as they would (most likely) be missed by humans.
  • Content that must be generated by user input (e.g. search results) may also be effectively invisible to search engine spiders.
  • Some websites may require a special browser configuration to gain access.
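
For example, a site owner who wants to keep all well-behaved spiders away from an entire site can place a robots.txt file at the root of the site containing just:

    User-agent: *
    Disallow: /

Bear in mind that robots.txt is a request, not an access control: compliant spiders will stay out (leaving the content in the Deep Web), but nothing physically stops other visitors.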

You can think of the web as an ocean of content. Anything on the surface of this ocean is content that is being linked to openly. A search engine spider can only look at the content on the surface of the ocean, and any content in the deeper parts of the ocean (whether protected by a login, or just hidden from view) is inaccessible to it.

What is important to remember is that the Deep Web has nothing necessarily to do with illegal activity, nor is it about being anonymous or hiding your identity. Most of us access the Deep Web on a regular basis, whenever we check webmail or log in to a social networking site. If a search engine can’t see it, for whatever reason, it’s part of the Deep Web.

Dark Web

Conversely, the Dark Web does have numerous links to illegal activity and hiding one’s identity. It is a collection of websites that are only accessible over the Tor network, which hides your IP address and gives you a strong degree of anonymity. Not every website accessed over Tor is part of the Dark Web, since Tor allows you to browse anonymously on the regular web as well. However, the Tor network has a special pseudo-top-level domain suffix called “.onion” which is used to reach websites that are hosted within Tor itself, and are therefore only accessible via Tor.

Reaching these websites without a browser configured to use Tor is impossible, and since standard search engine spiders do not use Tor, .onion sites go unindexed; the Dark Web is therefore a subset of the Deep Web. Whilst there are many websites on the Dark Web which do not promote illegal activity, there are plenty that do, including sites that sell drugs and weapons. A BBC report earlier this year highlighted the Dark Web quite well, and the hacktivist group Anonymous have attacked paedophilia-related websites on the Dark Web before.

Darknet

Wikipedia asserts that a darknet is a “private, distributed P2P filesharing network, where connections are either made only between trusted peers using non-standard protocols and ports or using onion routing.” Limiting the term to certain types of filesharing network is unhelpful in my opinion, and I see no reason a darknet cannot simply be any network whose connections are restricted in this way. This would make the onion-routed part of the Tor network itself a darknet, and indeed it is often called “The Darknet” (though there is more than one darknet, the onion-routed part of the Tor network is still the most well known).

This too would make the Dark Web a part of the Darknet. However, it is important to point out that the Dark Web and the Darknet are not synonymous. Many other services can run on the Darknet, such as email, IRC, etc. The Dark Web is just one of these services, contributing a subset of traffic over the Darknet.

So a darknet (no capitalisation) is any network where connections are made only between trusted peers using non-standard protocols and ports or using onion routing. The Darknet (capitalised) is the onion-routed part of the Tor network. This means that the Darknet is a darknet, in the same way as the Internet is an internet.

To make matters slightly more confusing, Project Meshnet used to be known as the “Darknet Plan”, though luckily the name was changed to more accurately reflect the nature of their project (and possibly to alleviate confusion).

Dark Internet

Finally, we end with a term which is completely unrelated to the three above, yet still manages to get confused with them. The Dark Internet refers to the unreachable network hosts on the Internet. They could be unreachable because a machine is turned off, or a network cable is damaged, or even because routing tables have become corrupted somewhere. Nobody, not even regular Internet users, can reach them. The Dark Internet is constantly changing; machines get taken offline, and some get put back online, but whilst they are offline, they are part of the Dark Internet.

Analysis of 400,000+ Stolen Yahoo! Passwords

July 13th, 2012

On 12th July 2012, more than 400,000 email addresses and passwords for Yahoo! Voices were stolen via an SQL injection attack and published online. The passwords were reportedly stored in plaintext, making this security breach even more serious. If you are a member of Yahoo! Voices, change your password immediately, and if you use the same password on other sites, change it there as well.

I performed the following password analysis with the help of pipal, a very popular and powerful password analyzing tool. The full pipal report is located here, with a longer report (showing the top 100 of each category) here.

10 Most Popular Passwords

123456 = 1667 (0.38%)
password = 780 (0.18%)
welcome = 437 (0.1%)
ninja = 333 (0.08%)
abc123 = 250 (0.06%)
123456789 = 222 (0.05%)
12345678 = 208 (0.05%)
sunshine = 205 (0.05%)
princess = 202 (0.05%)
qwerty = 172 (0.04%)

Despite numerous warnings by security professionals, the most popular password is still “123456”, followed by “password” in second place. These are highly insecure passwords, not just because of their very low length and complexity, but because they sit at the top of most password lists that attackers use to try to compromise an account. Remember, brute-forcing a password is always a last-ditch attempt at gaining access to an account; a clever attacker will always try common passwords first, and if your password appears in a password list online, you should never use it!

The fact that these passwords were even allowed reveals substandard practices in Yahoo’s password policy. To boost security, a user should be required to have a password that contains both upper and lowercase letters, as well as numbers and symbols. For additional security, the chosen password should be rejected if it matches one found in common password lists.
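
A minimal PHP sketch of such a policy check follows (the common-passwords filename is hypothetical, and the minimum length of 12 anticipates the recommendation later in this post):

    <?php
    function password_acceptable($password, $common_list) {
        if (strlen($password) < 12) return false;                    // minimum length
        if (!preg_match('/[a-z]/', $password)) return false;         // lowercase letter
        if (!preg_match('/[A-Z]/', $password)) return false;         // uppercase letter
        if (!preg_match('/[0-9]/', $password)) return false;         // digit
        if (!preg_match('/[^a-zA-Z0-9]/', $password)) return false;  // symbol
        return !in_array($password, $common_list);                   // not a known-common password
    }

    // Hypothetical list file, one password per line (e.g. the top 10 above).
    $common_list = file('common-passwords.txt', FILE_IGNORE_NEW_LINES);
    var_dump(password_acceptable('123456', $common_list)); // bool(false)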

Password Length

8 = 119214 (26.92%)
6 = 79650 (17.99%)
9 = 66058 (14.92%)
7 = 65654 (14.83%)
10 = 54815 (12.38%)
12 = 21785 (4.92%)
11 = 21261 (4.8%)
5 = 5325 (1.2%)
4 = 2748 (0.62%)
13 = 2585 (0.58%)
14 = 1433 (0.32%)
15 = 773 (0.17%)
16 = 442 (0.1%)
3 = 303 (0.07%)
17 = 252 (0.06%)
20 = 169 (0.04%)
18 = 116 (0.03%)
1 = 116 (0.03%)
19 = 78 (0.02%)
2 = 67 (0.02%)
21 = 6 (0.0%)
22 = 4 (0.0%)
29 = 3 (0.0%)
30 = 2 (0.0%)
24 = 2 (0.0%)
28 = 2 (0.0%)

As you can see, most people are still using short passwords. Indeed, a whopping 61.66% of people are using a password of 8 characters or fewer. Including passwords of length 9 or 10 brings that figure to 88.96%. When a dictionary attack fails, the main thing stopping a brute-force attack from succeeding within a given amount of time is the length of the password. Each additional character multiplies the time needed to brute-force a password by a factor of 95 (assuming the attacker tries all 95 printable ASCII characters). Even if the password only contains lowercase letters, an additional letter increases the time required by a factor of 26.
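
A quick back-of-the-envelope calculation in PHP shows how the keyspace grows with length and character set (assuming the 95 printable ASCII characters for the full set):

    <?php
    // Number of candidate passwords an exhaustive brute-force must consider.
    foreach (array(8, 10, 12) as $length) {
        printf("Length %d, full 95-character set: %.2e candidates\n",
            $length, pow(95, $length));
    }
    // Lowercase-only passwords shrink the keyspace dramatically:
    printf("Length 8, lowercase only: %.2e candidates\n", pow(26, 8));

Going from 8 to 12 characters multiplies the search space by 95^4, a factor of roughly 81 million.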

A length of 8 characters or more is the usual recommendation for passwords, but with cracking speeds increasing due to improvements in processing power, that number should probably be closer to 12, if not more. Remember, a long, complex password need not be hard to remember.

Complexity

Only lowercase alpha = 146512 (33.09%)

This statistic alone shows a staggering lack of password complexity: almost a third of the passwords contained only lowercase letters, making the task of brute-forcing them much easier.

loweralphanum: 224085 (50.6%)
loweralpha: 146512 (33.09%)
numeric: 26080 (5.89%)
mixedalphanum: 23233 (5.25%)
loweralphaspecialnum: 6053 (1.37%)
mixedalpha: 5122 (1.16%)
upperalphanum: 3416 (0.77%)
mixedalphaspecialnum: 3327 (0.75%)
loweralphaspecial: 2103 (0.47%)
upperalpha: 1776 (0.4%)
mixedalphaspecial: 489 (0.11%)
upperalphaspecialnum: 233 (0.05%)
specialnum: 189 (0.04%)
upperalphaspecial: 51 (0.01%)
special: 20 (0.0%)

As these additional statistics show, more than half the passwords contained only lowercase letters and numbers (adding digits only grows the per-character alphabet from 26 to 36 symbols, so they add surprisingly little brute-force resistance). Barely one percent of the passwords could be considered “complex”, containing upper and lowercase letters, numbers, and symbols.

Conclusions

Yahoo! is of course to blame for the passwords being accessible to hackers, as well as for storing them in such an insecure way. Their password policy, which apparently lets users choose single-character passwords, is absurd, and a full investigation should be carried out to find out how on earth the users were left this vulnerable. There were some decent passwords in the list, and those were made completely useless through Yahoo’s ineptitude.

That said, it should be noted that regardless of Yahoo’s ineffective defences and security policies, a great many of these user-chosen passwords were highly insecure. It is up to the user to choose a decent password, rather than relying on a system they should not really trust (as users, we do not know what security weaknesses a system has, or how it stores important data). It is best, therefore, to create a unique, complex password (or passphrase) for each account you have online, and to use a good password manager to help you keep track of them.

Two New Security Articles for Yahoo!

June 20th, 2012

I’ve written and published two new security articles as part of the Yahoo! Contributor Network. The first is about reducing your digital footprint, which is something I’ve been interested in for a while now. If you aren’t careful, a lot of information about yourself can be found online. Some of it might be true, some of it might be false, but most of it you probably don’t want lingering in search engine results. My article will tell you how to best map your digital footprint, and then how to go about reducing it.

The second article is on the top 5 online password managers, something every sensible person on the Internet should consider using. With so many different websites requiring accounts, you can either reuse the same password everywhere (highly insecure) or generate a unique password for each. Online password managers mean you don’t have to remember all your passwords, though as I’ve pointed out before, you can generate highly secure and easy-to-remember passphrases for the most important sites you visit.