UPDATE: A couple of days after this interview, HackersBlog released the details of their latest successful compromise, Tiscali UK. Once again they gained access to user data, including username, firstname, surname, company, telephone, regdate, lastlogin, email and hashed password.
After many high-profile compromises over the past few months, the Romanian hacking project HackersBlog United is rapidly gaining visibility on the web security scene. The recent website compromises that HackersBlog lay claim to include: Kaspersky, F-Secure, Symantec, Bitdefender, Second Life, Facebook, Hi5, StayFriends, International Herald Tribune, Yahoo!, The UK National Lottery, UK newspaper The Telegraph and, most recently, British Telecom.
HackersBlog operate under their own code of ethics, which means they will not publicly expose website problems that carry a high risk of exploitation, they will not save or distribute private data from compromised websites, and they will contact the website owner with details of the vulnerabilities exploited to allow them to carry out the necessary remediation (full code of ethics here).
I decided to contact the group to find out a little more about how they operate, why they do what they do, and importantly to ask them for any general advice that can help everyone provide a more secure online experience for their customers.
I have left the answers below essentially as they were received, with only light edits for readability. I think you'll agree that even the most high-profile website can learn from the compromises detailed on HackersBlog. Perhaps the biggest lesson to keep in mind, though, is that without proper regard for security as an integral part of the design process, we are all potential victims.
How long has your group existed, why did it come into being and what motivates you to continue?
We come from Romanian "blackhat" teams that used to compete against each other. We united for a better purpose: informing the public of the dangers on the internet.
Is anonymity necessary for conversation or are you safe from prosecution simply because of a lack of international co-operation around cybercrime?
We have seen you target security vendors recently, a newspaper, and now telecoms companies. Is there a method behind your choice of targets?
We don't have an agenda. Usually, when we find a vulnerability in a website, we try to show that its competitors can face the same problems. We don't like to spend too much time digging for vulnerabilities in only one type of website, but rather try to diversify and enlarge the spectrum of our research.
On average what ratio of “successes” do you have when attempting to compromise professional enterprise level web sites?
Let's look at it from a different perspective. We use only very well-known methods, and therefore the success rate is somewhere around 15-20%. If someone were using blackhat techniques, the results could grow exponentially, since ethics would not stop that person in their doings.
What are the top 5 “schoolboy errors” made by the professionals when designing or securing their sites, errors that you really shouldn’t be seeing?
When the attack is manual (without using software to scan for and verify vulnerabilities), the error messages generated by the site are of crucial importance to the attacker. One of the main issues here is that coders leave error reporting activated in production.
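To illustrate the point about error messages (this is my own sketch in Python, not HackersBlog's code, and the handler and its failure mode are hypothetical): a detailed error helps the developer but hands an attacker internal details such as table or column names, so production responses should stay generic.

```python
import traceback

def handle_request(query, debug=False):
    """Simulated page handler: detailed errors only in debug mode."""
    try:
        # Hypothetical back-end failure whose message leaks schema details
        raise RuntimeError("table 'users' has no column 'pass_word'")
    except RuntimeError as exc:
        if debug:
            # Development only: full detail aids the developer
            return "500: {}\n{}".format(exc, traceback.format_exc())
        # Production: a generic message leaks nothing useful to an attacker
        return "500: Internal Server Error"

print(handle_request("id=1"))  # generic message only
```

In PHP terms, the equivalent discipline is switching `display_errors` off on live servers and logging the detail server-side instead.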
Another serious mistake is "trusting" the data coming from the user (forms and the like) as genuine, without further verification.
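The standard defence against this mistake, for database-backed pages, is to never splice user input into a query string. A minimal sketch (my own illustration, using Python's sqlite3 module and an invented `users` table) shows how a bound parameter neutralises a classic injection payload:

```python
import sqlite3

conn = sqlite3.connect(":memory:")
conn.execute("CREATE TABLE users (id INTEGER, name TEXT)")
conn.execute("INSERT INTO users VALUES (1, 'alice')")

def find_user(user_supplied_id):
    # Parameterized query: the driver treats the value as data, never as SQL
    cur = conn.execute("SELECT name FROM users WHERE id = ?",
                       (user_supplied_id,))
    return cur.fetchall()

print(find_user(1))             # [('alice',)]
print(find_user("1 OR 1=1"))    # [] -- the payload stays inert data
```

Had the query been built with string concatenation, the second call would have returned every row in the table.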
Another factor that cannot necessarily be taken as a mistake, but which we believe can generate problems for the website or the server where the site is hosted, is the presence in the links of parameters in their "normal" form, for instance: .php?parameter1=val1&parameter2=val2. A whole lot of "vulnerability scanners" search the web for sites with this kind of parameter, because they are easily identifiable and can then be tested in the hope of finding security holes. Instead, if the parameters were included in a "SEO friendly URL", such as /articol-23.html, those scanners would fire in the dark, because the link would no longer have the standard structure .php?p1=v1&p2=v2.
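As a sketch of what such "SEO friendly" routing looks like (my own illustration; the route pattern is the one from the interview, the router itself is hypothetical), a front controller maps the clean path back to the parameters it hides:

```python
import re

# Map the "SEO friendly" path to the internal parameters it conceals
ROUTE = re.compile(r"^/articol-(\d+)\.html$")

def resolve(path):
    """Return the hidden parameters, or None for unrecognised paths."""
    m = ROUTE.match(path)
    if m is None:
        return None
    return {"page": "article", "id": int(m.group(1))}

print(resolve("/articol-23.html"))       # {'page': 'article', 'id': 23}
print(resolve("/page.php?p1=v1&p2=v2"))  # None -- raw query strings not exposed
```

Worth noting: this only obscures the parameters from naive scanners. As the interview itself implies, the back-end must still validate the extracted values; clean URLs are camouflage, not a security control.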
Based on these "mishaps", along with many others, we can outline the most common vulnerabilities found on the web: Cross-Site Scripting, SQL Injection, Local Path Disclosure, Local File Inclusion and Remote File Inclusion, Remote Code Execution... Of course, this is just a short list, and there is more information out there, available to anyone.
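Taking the first item on that list, Cross-Site Scripting is another direct consequence of trusting user input. A minimal sketch (again my own Python illustration, not from the interview; the `render_comment` helper is invented) shows how escaping on output turns injected markup into inert text:

```python
from html import escape

def render_comment(user_text):
    # Escape user-supplied text before embedding it in HTML output,
    # so any injected markup is displayed rather than executed
    return "<p>{}</p>".format(escape(user_text))

payload = "<script>alert('xss')</script>"
print(render_comment(payload))  # the <script> tag is rendered harmless
```

The same rule applies in any templating system: escape at the point of output, for the context (HTML body, attribute, URL) the data lands in.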
Do you think that companies are getting smarter about securing their online assets as time goes on or have no lessons been learned in the time that you have been active?
It is too early for us to offer an opinion on this, since our presence online in this format (whitehat) is not very old. However, anyone who deals with online security can confirm that sites are safer and better protected now than they were a few years ago, partly because there were people and companies out there who pointed out the problems they found.