Knowing is half the battle.
Izzy Navarro

This one is for new and veteran users alike who can benefit from the intricacies of Splunk. Whether or not data models exist for your data, you may have tried to coalesce data across sources and failed.

The reason is that matching values must exist in both indexes, e.g., index=fwall and index=awakesecurity. If host is an RFC 1918 address in fwall but a machine name in awakesecurity, and those values are unique to each index, you will run into this problem every time.
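To make the failure mode concrete, here is a minimal Python sketch of the same situation. The field names and values are hypothetical, not from the original searches: the point is only that when the identifying values never overlap, there is nothing for a join or coalesce to latch onto.

```python
# Two result sets exported from hypothetical indexes. In fwall the
# host field holds RFC 1918 addresses; in awakesecurity it holds
# machine names. The values never match.
fwall = [
    {"host": "10.0.0.5", "action": "blocked"},
    {"host": "192.168.1.9", "action": "allowed"},
]
awake = [
    {"host": "WKSTN-042", "severity": "high"},
    {"host": "DC-01", "severity": "low"},
]

# A join keyed on host finds nothing to correlate:
awake_by_host = {e["host"]: e for e in awake}
joined = [dict(f, **awake_by_host[f["host"]])
          for f in fwall if f["host"] in awake_by_host]
print(joined)  # [] because no value exists in both sets
```

No search syntax fixes this; the data itself carries no shared key.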

For the veterans thinking, "OK Izzy, I can just run transaction on my data sets and append with an eventstats command": yes, I agree. But you had better have one hell of a Splunk instance, and your Ops team had better be primo. transaction will tip the server over or keep you busy until next Christmas.
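For readers unfamiliar with transaction: conceptually it stitches events that share a field into one combined event, and it has to hold every open group in memory until that group closes, which is exactly why it falls over at scale. A rough Python sketch of the grouping idea (the sample events are invented for illustration):

```python
from collections import defaultdict

# Invented sample events sharing a correlation field.
events = [
    {"session": "a1", "msg": "login"},
    {"session": "b2", "msg": "login"},
    {"session": "a1", "msg": "download"},
    {"session": "a1", "msg": "logout"},
]

# transaction-style grouping: every open group must be kept in
# memory until it completes, which is what makes the command so
# expensive on large data sets.
groups = defaultdict(list)
for e in events:
    groups[e["session"]].append(e["msg"])

print(dict(groups))  # {'a1': ['login', 'download', 'logout'], 'b2': ['login']}
```

On millions of events with long-lived sessions, that in-memory state is the part that tips the server over.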

I say this because I recently encountered an organization where I could not get a lookup table to load to save anyone's life. In situations like these you need to improvise, especially when what should be a 20-second query eats 8 hours of your work day and the dreaded Job Manager keeps popping up. (Literally an 8-hour search.)

I know you're thinking one of three things at this point: "inclusivity, field, or error delay label." For those who did not read my previous post: inclusivity means everything left of the first pipe. And yes, my searches do not use "*". Why would they, when you already know your index, sourcetype, pattern, lookup, etc., and have already run tstats and metadata beforehand?

The point of all this is knowing the battle. Take a shift-left approach in a horrible situation like this: when you have finally gone mad and had enough, export your two result sets to .csv, coalesce the data there, and move along. That was my lesson learned from this experience, and no, field extraction is a completely separate issue.
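The offline coalesce itself is trivial once the data is out in .csv. Here is a sketch in Python, assuming you maintain a small hand-made mapping between the firewall IPs and the Awake machine names; the filenames, fields, and mapping below are all hypothetical stand-ins for whatever your exports actually contain:

```python
import csv
import io

# Stand-ins for the two exported files: fwall.csv keyed by IP,
# awake.csv keyed by hostname, plus a small hand-maintained
# mapping between the two key spaces.
fwall_csv = "host,action\n10.0.0.5,blocked\n192.168.1.9,allowed\n"
awake_csv = "host,severity\nWKSTN-042,high\nDC-01,low\n"
ip_to_name = {"10.0.0.5": "WKSTN-042", "192.168.1.9": "DC-01"}

fwall = list(csv.DictReader(io.StringIO(fwall_csv)))
awake = {row["host"]: row for row in csv.DictReader(io.StringIO(awake_csv))}

# Coalesce offline: translate each firewall IP to its Awake
# machine name, then stitch the two rows together.
merged = []
for row in fwall:
    name = ip_to_name.get(row["host"])
    if name and name in awake:
        merged.append({"ip": row["host"], "host": name,
                       "action": row["action"],
                       "severity": awake[name]["severity"]})

print(merged[0])  # first stitched row
```

With real exports you would read the files with csv.DictReader directly; the stitching logic stays the same, and it finishes in seconds instead of hours.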

REACH OUT IF YOU WOULD LIKE A SPLUNK COLLABORATION.




Darren Humphries

Group CISO & CTO-Partner at Acora; Driving Experience Led Agreements/ ROI in Cyber Security

1y

I agree - you can optimise your search syntax, but you have to start somewhere, which is often "*" and guesswork. So yes, having a data model, correct labelling and knowledge management is key. I have never seen a Splunk instance I haven't had to make improvements on, meaning out of the box it doesn't just engineer itself for optimisation. So yeah, "primo" guys with wisdom on how to best engineer the product are needed. Everyone thinks Splunk is easy to use. Well heck, different people get different results from Google or ChatGPT. Better searches, or better questions to ask, come with experience; optimisation comes with wisdom.


