Tuesday, February 27, 2007

Honeypots' coming of age

At November's Grand Rapids ISSA meeting, Tim Crothers gave an awesome preso on malware analysis. In it, he laid out a value proposition for using honeypots in corporate network security. Until then, I had always considered honeypots a tool for AV and IDS R&D folks, but Tim demonstrated how to work Nepenthes into network security ops. Inspired by his talk, I promptly took the idea home, set up Nepenthes, built a VMware image for malware analysis, and started collecting and reversing malware being passed around in the wild. More joy!
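For the Nepenthes-to-VMware hand-off, something like the little Python sketch below is the kind of triage I have in mind: hash whatever Nepenthes has captured and stage anything new for the analysis image. The capture path and staging layout are assumptions about a stock install, so adjust them to wherever your Nepenthes drops its binaries.

    #!/usr/bin/env python
    # Rough triage sketch: hash Nepenthes captures and stage new samples
    # for the VMware analysis image. All paths are assumptions/placeholders.
    import hashlib, os, shutil, time

    CAPTURE_DIR = "/var/lib/nepenthes/binaries"   # assumed capture directory
    STAGE_DIR = os.path.join("samples", time.strftime("%Y-%m-%d"))
    SEEN_FILE = "samples/seen.md5"

    if not os.path.isdir(STAGE_DIR):
        os.makedirs(STAGE_DIR)

    seen = set()
    if os.path.exists(SEEN_FILE):
        seen = set(open(SEEN_FILE).read().split())

    new = 0
    for name in os.listdir(CAPTURE_DIR):
        path = os.path.join(CAPTURE_DIR, name)
        if not os.path.isfile(path):
            continue
        md5 = hashlib.md5(open(path, "rb").read()).hexdigest()
        if md5 not in seen:        # only stage samples we haven't already seen
            shutil.copy(path, os.path.join(STAGE_DIR, md5 + ".bin"))
            seen.add(md5)
            new += 1

    open(SEEN_FILE, "w").write("\n".join(sorted(seen)) + "\n")
    print("%d new sample(s) staged in %s" % (new, STAGE_DIR))

From there it's the usual drill: copy the staged samples into the VM, snapshot, detonate, revert.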

As much fun as I was having at home, I ran into a problem when it came to operationalizing Nepenthes at work. Collecting malware that spreads via network services might not be all that useful, since every workstation runs a client firewall and every laptop runs a second one bundled with the VPN client. Malware that Nepenthes collects off the network would never affect these machines, because the exploit traffic would never reach them, regardless of whether the workstation was vulnerable. Consistent with that, all of the malware I have chased down over the past couple of years came in via an IE/Outlook exploit, an e-mail attachment, or an infected installer/program that a user downloaded.

That got me thinking about ways to collect malware that relies on browser exploits. This is much harder to do and requires a number of things that have security implications of their own, like parsing or executing JavaScript to find the downloader URL. And then there's the question of what to crawl, and when, to find these pages. Should you look for "fertile ground" like MySpace, replicate real web traffic from proxy logs, use Google, or what? Of course, I have no working code to offer. But like most good ideas, it's hard to have an exclusive or original one, and if you said, "I bet Lance Spitzner already thought of this," you'd be totally right. As it turns out, honeyclient is old news and seems to be dead in the water anyway. But closer to what I was thinking, and more similar to Nepenthes, there are Capture and HoneyC, self-described low-interaction honeyclients.
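Just to make the idea concrete (and in the spirit of the no-working-code disclaimer above), the naive version would look something like this Python sketch: feed it candidate URLs, fetch each page, and statically scan the HTML for iframes or links pointing at executables. It doesn't parse or execute JavaScript at all, which is exactly why the interesting obfuscated stuff would sail right past it, and the input file and patterns are placeholders, not a real signature set.

    #!/usr/bin/env python
    # Naive sketch: fetch candidate pages and statically scan the HTML for
    # iframes or links that point at executables. No JavaScript is executed,
    # so obfuscated downloaders will be missed. Inputs/patterns are placeholders.
    import re
    import urllib2

    IFRAME_RE = re.compile(r'<iframe[^>]*\bsrc=["\']?([^"\'\s>]+)', re.I)
    EXE_RE = re.compile(r'https?://[^"\'\s>]+\.(?:exe|scr|cab|pif)', re.I)

    for url in open("candidate_urls.txt"):       # e.g. culled from proxy logs
        url = url.strip()
        if not url:
            continue
        try:
            html = urllib2.urlopen(url).read()
        except Exception:
            continue                             # dead or hostile sites: skip quietly
        for hit in IFRAME_RE.findall(html) + EXE_RE.findall(html):
            print("%s -> %s" % (url, hit))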

Of the two, HoneyC takes less effort to get up and running. It's a web honeyclient written entirely in Ruby, and a lot more like what I had in mind. It uses the Yahoo API to search for and crawl suspicious pages. It's too early for me to say how accurate it is, but it was definitely painless to configure it with my own search criteria and turn it loose. It will take some more work to build intelligence into my searches (more proxy logs!) and some effort to develop useful (Snort-like) signatures, but this looks very promising. I'll report back by the end of March with more on how this is going.
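On the "more proxy logs" front, the sketch below is one way I could see seeding the searches: mine a Squid access.log for the external hosts users actually visit and feed the busiest ones in as search/crawl candidates. The log path and field layout assume Squid's native log format (URL in the seventh field), so treat it as an assumption about the environment rather than a recipe.

    #!/usr/bin/env python
    # Sketch: mine a Squid access.log for the hosts users actually visit,
    # to seed HoneyC's search/crawl queue with realistic targets.
    # Log path and field layout (Squid native format) are assumptions.
    from urlparse import urlparse

    counts = {}
    for line in open("/var/log/squid/access.log"):
        fields = line.split()
        if len(fields) < 7:
            continue
        host = urlparse(fields[6]).netloc or fields[6]   # CONNECT lines are bare host:port
        counts[host] = counts.get(host, 0) + 1

    # The 20 busiest hosts become candidate seeds
    for host, hits in sorted(counts.items(), key=lambda kv: kv[1], reverse=True)[:20]:
        print("%6d  %s" % (hits, host))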
