Revolutionizing Cybersecurity: Merging Generative AI with SOAR for Enhanced Automation and Intelligence
Update: If you like this content, consider subscribing to updates via my newsletter. I'll publish through this method from now on. https://applied-gai-in-security.ghost.io/
Ever since I got access to ChatGPT, I've tried to bend and shape the model to meet my needs. I use system prompts and other scaffolding methods to build application proof-of-concepts and workflow outcomes for discrete tasks. My use of the model always gravitates towards orchestrating a process to achieve an outcome that later becomes the foundation of my code. While I enjoy the flexibility of chat, I don't always think it's the best experience for achieving a specific task. In my opinion, the scaffolding approach is a necessary step in getting to functional agents.
When looking across the security industry, it's hard not to draw parallels to the Security Orchestration, Automation, and Response (SOAR) movement that took place many years ago. In fact, when I describe my model interactions to others less familiar with GAI, they often respond by saying it sounds like a "natural language SOAR". In this post, I briefly outline my thoughts on how GAI augments SOAR and how we could be entering a new era of automation in our industry.
Augmenting SOAR with GAI
Natural Language Interface. GAI provides natural language inputs and outputs for questions and instructions. This new interface reduces the need to understand the underlying tools or data sets commonly leveraged within SOAR playbooks. While you still need a connection using traditional APIs, you aren't relying solely on the API responses to inform your decision making or next steps. In a previous SOAR approach, you were beholden to rigid, structured inputs and outputs. Some SOAR solutions leveraged NLP, but the experience was clunky and still required "coded language" to align the request to the appropriate data source.
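To make this concrete, here is a minimal sketch of a natural-language layer over a traditional security API. The `ask_model` function is a placeholder for whatever GAI provider call you use (it is not a real library function), and `siem_search_results` is a hypothetical payload from your SIEM.

```python
import json

def ask_model(prompt: str) -> str:
    """Stand-in for a call to your GAI provider; returns the model's text reply."""
    raise NotImplementedError("wire this to your model of choice")

def answer_question(question: str, api_response: dict) -> str:
    """Let the model interpret a raw API payload instead of parsing rigid fields."""
    prompt = (
        "You are assisting a security analyst. Using only the JSON data below, "
        "answer the question.\n\n"
        f"Question: {question}\n\n"
        f"Data:\n{json.dumps(api_response, indent=2)}"
    )
    return ask_model(prompt)

# Example: the analyst asks in plain language; the playbook still calls a
# traditional API, but the model does the interpretation of the response.
# answer_question("Has this IP appeared in any alerts this week?", siem_search_results)
```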
Playbooks Create Business Knowledge. Assuming your GAI solution has some notion of a "session" in which the prompts and responses are preserved together, you get documentation and the underpinnings of a knowledge base. For example, if you have a GAI-driven playbook that investigates a phishing email through a series of prompts, you will get back natural language responses that document each step. At the end of the playbook, you could summarize the session into a report. If all your phishing emails call this playbook, you begin to amass sessions describing the types of phishing emails seen across your organization, which serve as valuable knowledge to inform other defensive practices. In previous SOAR solutions, playbooks largely chained different code solutions together, but lacked significant documented knowledge.
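A minimal sketch of what such a session could look like, assuming the `ask_model` placeholder from the first sketch and illustrative step prompts rather than a real playbook schema:

```python
from dataclasses import dataclass, field

def ask_model(prompt: str) -> str:
    raise NotImplementedError  # placeholder for your GAI provider call

@dataclass
class PlaybookSession:
    case_id: str
    steps: list[tuple[str, str]] = field(default_factory=list)  # (prompt, response) pairs

    def run_step(self, prompt: str) -> str:
        response = ask_model(prompt)
        self.steps.append((prompt, response))  # every step documents itself as it runs
        return response

    def summarize(self) -> str:
        transcript = "\n\n".join(f"Q: {p}\nA: {r}" for p, r in self.steps)
        return ask_model(
            "Summarize this phishing investigation into a short report an analyst "
            f"could attach to the case:\n\n{transcript}"
        )

# session = PlaybookSession(case_id="PHISH-1042")
# session.run_step("Assess the reputation of the sender alice@example.com")
# session.run_step("List the URLs and attachments in the message and rate their risk")
# report = session.summarize()  # the report and the raw steps both become knowledge
```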
Dynamic Content. GAI offers the ability to "understand" data. You can ask for results to be summarized and explained, or derive new results such as tags or labels. This content can then be used to inform additional steps in the orchestration effort to achieve far more specific workflows without always needing a human in the loop. For example, you could derive a series of tags to describe the type of phishing email based on the body and attachment content and, based on those tags, perform additional quarantine steps. Previous SOAR solutions often relied on the response codes returned from a tool and limited decision making to static pathways.
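Here is a minimal sketch of model-derived tags driving the next step of a workflow. The tag vocabulary and the `quarantine_message` helper are hypothetical, and `ask_model` is the same placeholder GAI call as in the earlier sketches.

```python
import json

def ask_model(prompt: str) -> str:
    raise NotImplementedError  # placeholder for your GAI provider call

def quarantine_message(message_id: str) -> None:
    print(f"quarantining {message_id}")  # stand-in for your mail gateway's quarantine API

ALLOWED_TAGS = {"credential_harvesting", "malware_attachment", "invoice_fraud", "benign"}

def tag_email(body: str) -> set:
    raw = ask_model(
        "Return a JSON array of tags, chosen only from "
        f"{sorted(ALLOWED_TAGS)}, that describe this email:\n\n{body}"
    )
    return set(json.loads(raw)) & ALLOWED_TAGS  # keep only tags we recognize

def handle_phish(message_id: str, body: str) -> None:
    tags = tag_email(body)
    # The derived tags, not a hard-coded pathway, decide the next step.
    if {"credential_harvesting", "malware_attachment"} & tags:
        quarantine_message(message_id)
```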
Influenced Decision Making. Similar to dynamic content, GAI systems can be asked questions about security events, including how the model would classify a given result. Imagine a GAI-driven playbook that attempts to classify an event based on the evidence it was able to collect. The model opines on its decision, assigns a confidence, and includes evidence to support that choice. An analyst now has a greater foundation to operate from when performing their triage for the day. Previous SOAR solutions attempted to derive decisions, but were limited to the static responses of the systems they integrated with.
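A minimal sketch of that kind of classification prompt. The JSON shape (verdict, confidence, evidence) is an assumption about how you might want the answer back, not a standard, and `ask_model` is the same placeholder GAI call as above.

```python
import json

def ask_model(prompt: str) -> str:
    raise NotImplementedError  # placeholder for your GAI provider call

def classify_event(event: dict) -> dict:
    prompt = (
        "Classify this security event as 'malicious', 'suspicious', or 'benign'. "
        "Respond with JSON containing 'verdict', 'confidence' (0 to 1), and "
        "'evidence' (a list of observations that support the verdict).\n\n"
        f"{json.dumps(event, indent=2)}"
    )
    return json.loads(ask_model(prompt))

# result = classify_event(collected_evidence)
# An analyst triaging the queue now starts from the model's verdict, its stated
# confidence, and the evidence it relied on, rather than a bare alert.
```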
Playbook Creation. GAI models that focus on code can take a natural language description of a workflow and produce a first pass at a playbook. This new ability reduces the need for a user to understand a no-code interface or graphical editor in full detail. While initial results may not be perfect, they can save significant time and allow for rapid experimentation. Previous SOAR solutions began as low-code, then shifted to no-code, but still relied on graphical interfaces to build the workflow.
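A minimal sketch of that first pass. The YAML-style output format is an assumption about the target SOAR, the example workflow text is illustrative, and `ask_model` is the same placeholder GAI call as above.

```python
def ask_model(prompt: str) -> str:
    raise NotImplementedError  # placeholder for your GAI provider call

def draft_playbook(description: str) -> str:
    return ask_model(
        "Draft a SOAR playbook as YAML with named steps, the tool or API each "
        "step calls, and the conditions that connect the steps, for this "
        f"workflow:\n\n{description}"
    )

# draft = draft_playbook(
#     "When a phishing email is reported, pull the headers, check the sender "
#     "against threat intel, detonate any attachments in a sandbox, and open a "
#     "ticket if anything looks malicious."
# )
# The draft won't be perfect, but it is a faster starting point than an empty editor.
```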
Better Human-in-the-Loop. For cases where a workflow requires input from a human, GAI is positioned to deliver much better direction and context. Imagine that the model reaches an ambiguous point in the workflow where clarity from an expert is required. GAI can help draft an email or message to the user outlining the decision that needs to be made along with the context. And the decision doesn't need to occur immediately: the model could facilitate actual dialog and reasoning around the decision and its impact. Previous SOAR solutions forced decisions into pass or fail gates, offering little context and no way to express the impact of the decision.
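A minimal sketch of drafting such a request. The notification channel (`send_to_analyst`) is a hypothetical helper, and `ask_model` is the same placeholder GAI call as above.

```python
def ask_model(prompt: str) -> str:
    raise NotImplementedError  # placeholder for your GAI provider call

def send_to_analyst(message: str) -> None:
    print(message)  # stand-in for email, chat, or a ticket comment

def request_decision(decision: str, context: str) -> None:
    message = ask_model(
        "Draft a short message to a security analyst asking them to make this "
        f"decision: {decision}. Explain the context, the available options, and "
        f"the likely impact of each option.\n\nContext:\n{context}"
    )
    send_to_analyst(message)

# request_decision(
#     "Should we disable the user's account while the investigation continues?",
#     "The account sent 40 similar emails in the last hour but belongs to an executive.",
# )
```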
Conclusion
GAI offers significant new capabilities to support automation and orchestration efforts across the wider security, identity, compliance and management ecosystem. In future posts, I will outline example workflows, incorporating the above value points and demonstrating what automation can help organizations achieve. I am extremely encouraged by what I have seen so far and feel we are small steps away from achieving significant new ways of working in our industry.