No, really, he said it. He would've been on the short list of people I assumed would never say it. But there it is.
Here's the thing: I think this is a lot like Gartner's IDS declaration (which he cites). IDS went through some product positioning changes (IPS, UTM, DLP, etc.), but the core idea and technology are still there, and guess what? The original IDS use case is still viable. Sure, the attacks have changed, but having a sniffer that can search for known-bad and known-strange traffic on the wire is very, very useful.
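Even a vanilla sniffer gets you a long way on the "known-strange" front. A throwaway illustration (the interface name is whatever yours happens to be), using tcpdump to pull Xmas-flagged TCP packets off the wire:

tcpdump -nn -i eth0 'tcp[13] & 0x29 = 0x29'    # FIN+PSH+URG set together - the classic Xmas-scan flag combo

Nobody legitimate sends that, and it takes one line to spot it. That's the use case in miniature.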
So I assume that we are in the midst of a product positioning shift around SIM. Raffy's point that SIM schemas are IP-centric and that rules are built around correlating firewall and IDS events is true. But most of the vendors have already acknowledged this and are developing content to focus on other log sources. Either way, the use case is here to stay - the ability to search and correlate log events is highly useful, and will continue to be. You may call it "SIEM" or "IT Search" or "log management," but it's the same core concept, repurposed to address the constantly changing security environment.
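To make "search and correlate" concrete: the poor-man's version of firewall/IDS correlation is just an intersection of source IPs. A rough sketch - the log paths and field positions below are invented, and your formats will differ:

awk '{print $5}' /var/log/fw/denies.log | sort -u > /tmp/fw_ips      # source IPs from today's firewall denies
awk '{print $7}' /var/log/ids/alerts.log | sort -u > /tmp/ids_ips    # source IPs from today's IDS alerts
comm -12 /tmp/fw_ips /tmp/ids_ips                                    # IPs in both lists: blocked at the edge AND tripped a signature

A SIM does this across dozens of log sources, with time windows and priorities and asset context layered on top, but that's the core idea.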
One final note for vendors from the SecOps trenches: I am not open to a replace/resell on the basis that SIM is old and whatever-you-call-it-now is new and better. My SIM, like my IDS, contains custom content that our team has developed to keep on top of changing threats, including application attacks. SIM, like IDS, succeeds when you put talented security professionals in front of it and let them tune it and manage it like a tool. But it will fail miserably if you are hands-off with it.
Monday, June 23, 2008
Useless Statistics: Nate McFeters vs. Verizon
You know how much I love to tear into vendors whose studies and data analysis wouldn't pass muster in a high school statistics course. It's nice to see someone else go off for a change. Nate McFeters, Ernst & Young security dude, ZDNet blogger, con regular, and fellow Michigan native, has taken issue with the results of a study on data breaches that Verizon published earlier this year.
So let's just get this out of the way:
Nate writes: "The first thing you're thinking is, 'Wow, my consultant has been lying to me about internal threats!', the thing is, that's not necessarily true."

Yes it is. "Insider threat" is a red herring throughout security, but especially where data breaches are concerned. There's no breach notification law out there that defines a breach as data ending up in the hands of someone who already works for you. Since there's no external force requiring companies to track these incidents, it's probably very safe to assume that tracking and detection of them is low, except within a handful of specific verticals.
To Nate's point about the wording of the survey and the study, I agree - it is dangerously ambiguous. However, it's probably not the cause of the improbable skew toward external attackers in the survey data.
I think I know what is. See, Nate's thinking about the Verizon study like a pen-tester, and forgetting that most data breaches and security compromises don't involve vulns and sploits - only the interesting ones do. Sometimes they involve phishing, but most of the time they involve simple impersonation (the FBI calls it 'identity theft').
The thing is, if you follow basic authentication principles and practices around your self-service web apps, this stuff is hard to prevent but easy to detect and resolve. And that's where the statistical disparity between the number of breaches and the amount of data breached comes from. Most of those "external attacker" scenarios were someone's kid, deadbeat brother, or ex-wife impersonating them to get at their information. Not good. But not interesting.
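To put a little meat on the "easy to detect" claim: a dumb new-source-for-this-account check catches a surprising amount of simple impersonation. A sketch, assuming a web-app auth log where field 3 is the user and field 4 is the source IP (the paths and field positions are made up):

awk '{print $3, $4}' /var/log/webapp/auth-lastmonth.log | sort -u > /tmp/baseline   # user/source-IP pairs seen last month
awk '{print $3, $4}' /var/log/webapp/auth-today.log | sort -u > /tmp/today          # same pairs from today
comm -13 /tmp/baseline /tmp/today                                                   # pairs seen today but never in the baseline - candidates for a phone call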
Seriously, I don't know what it is, but it's almost always divorced/divorcing couples involved in these impersonation breaches. Nate, if you want to interview me about this some day, I've got some great stories.
Anyway, use Verizon's survey for what it's good for - getting more security funding. Because bottom line, that's a lot of breaches, no matter the circumstances.
Monday, June 16, 2008
ArcSight Logger Face-plate-lift
Not only did ArcSight deliver on the improved UI and feature set for Logger 2.5, but like any good appliance vendor, they've popped their collar. And by collar I mean front bezel.
The red with blue LED scroller is definitely a better look than the previous model. Still no backlit logo, though. :-)
Friday, June 13, 2008
ArcSight's new Logger Apps
ArcSight is releasing the Logger 2.5 software here soon, and along with it new appliances with some interesting variations. You can check out the vitals on the ArcSight website here.
Prior versions of Logger were available in small, large, and SuperSized, where the SuperSized box was the same spec as the large box with artificial limitations removed via license key. So really only two boxes, all self-contained, all CentOS, all MySQL.
Now, there's a whole new batch. It would appear by the naming designations that they are going after PCI compliance heavily with the L3K-PCI, which must have retention policies and capabilities that make it easier to comply with PCI-DSS requirement 10.7. Another model supports SAN-attached storage and Oracle, so you can grow your Logger with SAN instead of NAS. And finally, there are two new L7100 models with 6x750GB drives. If I'm doing my math right, that works out, after compression, to about 36TB of log storage. That's a significant increase over the 12TB that the large/SuperSized L5K boxes shipped with.
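For anyone checking my math, it's just raw capacity times an assumed compression ratio - I'm using 8:1 here, which is a guess on my part, and I'm ignoring RAID and filesystem overhead:

echo "raw:        $((6 * 750)) GB"        # 6 x 750GB drives = 4500 GB
echo "compressed: $((6 * 750 * 8)) GB"    # ~36000 GB, i.e. roughly 36TB of stored logs at 8:1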
Update: Talked with Ansh at ArcSight today, and apparently the 2.5 software adds columns to the CEF event view. That's a big deal for folks using CEF events in Logger, and may make CEF-only the preferred format for most Logger users. The new software also includes real-time alert views (like Active Channels in ESM), as well as a number of other enhancements to alerts, search filters, and more. Current customers can download 2.5 from the software site.
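If you haven't looked at CEF before, each event is a single syslog-style line: a pipe-delimited header (version, vendor, product, device version, signature ID, name, severity) followed by key=value extensions. Something like this, with made-up values:

CEF:0|Some Vendor|Some Product|1.0|100|Login succeeded|3|src=10.1.1.1 dst=10.2.2.2 suser=jdoe

I'd assume the new columns in the 2.5 event view are pulling from those extension fields, which is exactly what was missing before.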
DefCon 2008 Quals
The lineup for this year's DefCon CTF competition has been decided.
And, better still, Doc Brown has once again posted the challenges (with answers) for everyone to try on their own. Brain calisthenics, to be sure. Get your mental leg warmers on and jazzercise Kenshoto-style.
Sunday, June 1, 2008
From My Inbox... (ArcSight Connectors & Logger)
SC left a comment on an earlier post on ArcSight Logger and CEF vs. Raw formats...
Hi Paul,
Do you know if it's possible to "insert" logs into the Logger or SmartConnector if the logs are on physical storage, e.g. DVD or external storage?
Thanks.
Kind Regards,
SC
There are probably a number of ways to do this, but I've only tested one. In earlier configurations of our syslog infrastructure, there were a couple of single points of failure. In order to meet log analysis commitments, we would reload lost syslog data from file.
Start by configuring a 'Syslog Pipe' Connector. Since the Connector only has to be online with the Manager when you're manually inserting logs from file, you have greater flexibility about where this Connector will live. It lived on my laptop for a while. When you set it up, point it to a pipe path that isn't already used for anything else. Then you can simply start the Connector and pipe the raw log file(s) to the named pipe:
# cat oldsyslogs.txt >> /var/spool/my_arcsight_pipe
Depending on how far the date/time stamps of the events in those files are from $Now, ArcSight will probably throw some errors. It will maintain the "End Time" from the raw log events, and apply "Manager Receipt Time" as the time the manager collects the parsed events from the Connector. This will absolutely screw up any correlation rules you wanted these events to be subject to. Sorry, no easy way around that.
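For the DVD/external-storage case specifically, the whole thing boils down to a few lines once the Syslog Pipe Connector is configured and running. A rough sketch using the pipe path from above - the mount point is just an example, use wherever your media shows up:

PIPE=/var/spool/my_arcsight_pipe
[ -p "$PIPE" ] || mkfifo "$PIPE"      # create the named pipe if the Connector setup didn't already
for f in /mnt/dvd/syslog/*.log; do    # /mnt/dvd is a hypothetical mount point
    cat "$f" >> "$PIPE"               # the Connector reads, parses, and forwards each file's events
done

Same timestamp caveat as above applies: End Time comes from the old events, Manager Receipt Time is now, and your correlation rules won't like it.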