January was a dark month for GitHub. The collaborative source code management site was found to be exposing the private SSH keys of many of its members via its public search function.
The website, which helps programmers in far-flung locations collaborate with each other, had just upgraded its search function with many new features. The upgrade prompted several enterprising hackers to take another look at what its search could turn up.
GitHub works using a series of repositories: folders that hold the source code for software a developer is working on as part of a collaborative project. A repository on a developer's own machine is synchronised with a copy hosted on the GitHub site, letting the developer work on their own version of the source code before it is merged into the main codebase along with everyone else's changes.
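That workflow can be sketched with ordinary git commands, using a local bare repository to stand in for the copy hosted on GitHub (all paths and names here are illustrative):

```shell
# A minimal sketch of the clone/commit/push cycle described above.
rm -rf /tmp/remote.git /tmp/work
git init -q --bare /tmp/remote.git       # stands in for the hosted GitHub copy
git clone -q /tmp/remote.git /tmp/work   # the developer's own working copy
cd /tmp/work
git config user.email dev@example.com
git config user.name dev
printf '%s\n' 'print("hello")' > app.py
git add app.py
git commit -qm "add app"
git push -q origin HEAD                  # changes flow back to the shared copy
```

On a real project the clone URL would point at GitHub rather than a local path, but the shape of the cycle is the same.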
Unfortunately, it turns out that many developers are not very conscientious when it comes to security. Some copied the entire contents of their UNIX machines' home directories into their local repositories, which were then pushed up to public repositories on GitHub. By default, UNIX systems store SSH keys in a hidden .ssh directory inside the home directory.
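The mistake is easy to reproduce, because git happily stages hidden directories along with everything else. A minimal sketch, using an illustrative scratch path and a fake key, shows the problem and one way to head it off:

```shell
# Reconstruct the mistake in a scratch directory (/tmp/homedemo is illustrative).
rm -rf /tmp/homedemo && mkdir -p /tmp/homedemo/.ssh
printf '%s\n' '-----BEGIN RSA PRIVATE KEY-----' > /tmp/homedemo/.ssh/id_rsa
cd /tmp/homedemo
git init -q
git add -A                          # stages everything, dotfiles included
git ls-files                        # .ssh/id_rsa is now tracked
git rm -q --cached .ssh/id_rsa      # unstage the key...
printf '%s\n' '.ssh/' > .gitignore  # ...and stop it being re-added
git add -A
git ls-files                        # now only .gitignore is tracked
```

Note that an ignore rule only helps before the key is committed; once a key has been committed and pushed, it has to be treated as compromised and regenerated.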
SSH is a public-key authentication system designed to make it easier to access remote computing services without continually re-entering passwords. When a user generates an SSH key pair on their computer (which can be done with a single command), it creates a private key and a public key. The public key can be given to servers that the user wants to access transparently via different tools on their computer. The private key is supposed to stay with the user, and never be distributed.
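That single command is ssh-keygen. The sketch below uses standard flags but an illustrative output path; by default the pair lands in ~/.ssh/id_rsa and ~/.ssh/id_rsa.pub, which is exactly why a committed home directory gives the keys away:

```shell
# Generate an RSA key pair non-interactively (-N "" sets an empty passphrase).
rm -f /tmp/demo_key /tmp/demo_key.pub
ssh-keygen -q -t rsa -b 2048 -N "" -C "demo@example" -f /tmp/demo_key
ls -l /tmp/demo_key /tmp/demo_key.pub   # the private key and its public half
```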
If these private keys are made public, then an attacker has the keys to the kingdom, because they can log into any service that trusts the corresponding public key. What makes it worse is that the user's computer also keeps a list of the servers they have connected to in a 'known hosts' file (~/.ssh/known_hosts), handing an attacker a ready-made target list.
So, until GitHub recognised what was happening, links to people's private SSH keys were popping up in its search results. This could have had far-reaching ramifications. Developers' machines may already have been compromised without their knowledge, their GitHub accounts could have been accessed, and malicious backdoor code could even have been inserted into their project code.
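The searchers were essentially just looking for the conventional PEM header that every unencrypted private key file begins with. A developer can run the same kind of check over a repository's entire history before pushing; the sketch below builds a scratch repository containing a fake key purely so the scan has something to find:

```shell
# Build an illustrative scratch repo with an accidentally committed key.
rm -rf /tmp/scan_demo && mkdir -p /tmp/scan_demo/.ssh
cd /tmp/scan_demo
git init -q
git config user.email dev@example.com
git config user.name dev
printf '%s\n' '-----BEGIN RSA PRIVATE KEY-----' > .ssh/id_rsa
git add -A
git commit -qm "accidentally committed home directory"
# The check itself: search every commit reachable from any ref
# for the standard private-key header.
git grep -l "BEGIN RSA PRIVATE KEY" $(git rev-list --all)
```

If the scan prints anything, the key is already baked into history, and simply deleting the file in a later commit will not remove it.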
But who was at fault here? Was it GitHub, for making the search results available, or was it the developers themselves, for not understanding security well enough to protect their own private keys?
And if developers are making rookie security mistakes such as this, how much should we be trusting them to produce secure software?