Building an AI product? Here’s how to talk to your customers about security

At the end of last year, I published a post about the future of cybersecurity for enterprise SaaS. As important and relevant as enterprise SaaS remains in our everyday work environments, however, the next and most critical frontier for cybersecurity exists in one space: artificial intelligence.

Cybersecurity for AI products is an entirely different animal than cybersecurity for enterprise SaaS, starting at a foundational and emotional level. AI still conjures mistrustful images for many, both inside and outside the tech industry. We’ve talked at length about the importance of trust and safety as we integrate more and more AI into our professional and personal lives. The bare minimum for establishing trust and keeping people safe is building thoughtful, strategic cybersecurity capabilities into the products we, as an industry, create.

Today, AI companies fall into two primary buckets: companies building AI solutions and companies using AI as part of their solutions. In either case, there will ultimately be an added layer of questioning from the cybersecurity team or CISO organization during your sales cycle, particularly if you want to sell into the enterprise. If you’re not prepared to talk about this in strategic detail, you’re going to have immediate and potentially insurmountable problems.

Why are these conversations so important? Security teams, as action-oriented as they are, are researchers at heart. Research is a core piece of the way they think, how they approach problems, and how they ultimately land on a solution. When I was leading a security team, I learned quickly that cybersecurity isn’t just about putting secure infrastructure in place; it’s about constantly identifying weaknesses and having a contingency plan if those weaknesses are exploited.

An analogy I often use to explain the different thought process cybersecurity teams bring to evaluating a solution is that of an electric vs. gas-powered vehicle. If your car takes gas, you fill up and go. When you need a gas station, you’ll find one because there are so many. If you drive an EV, taking a road trip isn’t quite so simple, although it is getting better. You need to know where charging stations are available, then plan your route and timing accordingly. You’ll have some range anxiety, and you won’t be able to take impromptu detours. You’ll need to have a plan for maintenance, because you can’t just walk into a Jiffy Lube for a basic fix. These two cars might be the same make and model, but ownership and use require entirely different mindsets.

Demonstrating that you understand your customers’ concerns, and that you’ve taken steps to address them, is a prerequisite for success. Let’s talk about a few different ways to do just that.

  1. Start at the start. Begin your conversation by clearly articulating that your approach to security for cloud, SaaS or software is different from your approach to security for AI, because the requirements are different. Making this statement up front will open ears and assuage fears. Then explain how the company came to build AI into the product, who led the effort and what the core goal has been.
  2. Be prepared to talk about the specific data used as part of your AI stack, and what you’re doing process-wise to protect that data. Protection isn’t just about putting a perimeter in place; it requires encryption and anonymization to ensure integrity and protect against data poisoning (the first sketch after this list shows one way to approach this).
  3. Demonstrate how you’re thinking about the audit trail for all of this data so that it is properly protected. Your logs need to be comprehensive, and you should be able to clearly articulate what tooling produces and protects them (see the second sketch after this list).
  4. Refer back to known frameworks. For example, NIST guidance such as the NIST AI Risk Management Framework references specific security threats that can help you prioritize what CISOs and security teams care most about relating to AI. Referring back to these frameworks further demonstrates that you’ve done your research into their world, their concerns and their needs. Make sure you educate yourself on the security frameworks around AI just as you would those for SaaS or cloud.
  5. Be prepared to discuss the security awareness and training protocols for your own organization; these are different with AI. How have you trained people internally on AI? What have you done internally to ensure the data is properly looked after? How well have you communicated the impact of getting this wrong?
  6. Be transparent and upfront about any unknowns. Acknowledge the gaps, and show an enthusiasm for filling those gaps together. Create a feedback loop based in reality, so that you can drive innovation forward together. Customers expect some hallucinations and expect quality to improve over time; recognize that expectation and speak to it directly.
  7. If you are going down the POC or trial route, be exceptionally clear about what success looks like, not just for the solution but on the security side as well. Be able to use aspects of the POC to demonstrate good security practices. For example, if nobody accessed the data being used in the model, explain how you achieved that and how you monitored it. Show them how you anonymize the data used to train the model. Give people a look under the hood so they can understand how you did what you did (the third sketch after this list shows one way to report this).
  8. Have a statement around threat mitigation that acknowledges the current threat landscape and articulates how you plan to address each threat. Talk about the specific solutions you’re looking at, or, better yet, ask the customer’s security team for advice and make it a true dialogue that shows a willingness to learn and customize.
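
To make item 2 concrete, here is a minimal sketch of keyed pseudonymization applied to records before they reach a training pipeline. The field names, the PII list, and the PSEUDONYM_KEY environment variable are illustrative assumptions, not a description of any particular product’s pipeline.

```python
# Minimal sketch: pseudonymize direct identifiers before a record ever
# reaches the training pipeline. Field names and the PII list are assumed.
import hashlib
import hmac
import os

# In practice this key lives in a secrets manager, never in code.
PSEUDONYM_KEY = os.environ.get("PSEUDONYM_KEY", "dev-only-key").encode()

PII_FIELDS = {"email", "full_name", "phone"}  # assumed schema


def pseudonymize(value: str) -> str:
    """Replace an identifier with a keyed, non-reversible token."""
    return hmac.new(PSEUDONYM_KEY, value.encode(), hashlib.sha256).hexdigest()[:16]


def scrub_record(record: dict) -> dict:
    """Return a copy of the record that is safe to hand to the training pipeline."""
    return {
        k: (pseudonymize(v) if k in PII_FIELDS and isinstance(v, str) else v)
        for k, v in record.items()
    }


if __name__ == "__main__":
    raw = {"email": "jane@example.com", "full_name": "Jane Doe", "plan": "enterprise"}
    print(scrub_record(raw))  # identifiers replaced, business fields untouched
```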
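For item 3, one lightweight way to make an audit trail tamper-evident is a hash-chained, append-only log of data access events. This is a sketch under assumed field names and a local file layout; a real deployment would typically feed these events into your logging or SIEM tooling rather than a flat file.

```python
# Minimal sketch: append-only, hash-chained audit log of data access events.
# Each entry commits to the previous entry's hash, so edits are detectable.
import hashlib
import json
import time
from pathlib import Path

AUDIT_LOG = Path("audit_log.jsonl")  # assumed location


def _last_hash() -> str:
    """Return the hash of the most recent entry, or a genesis value."""
    if not AUDIT_LOG.exists():
        return "0" * 64
    lines = AUDIT_LOG.read_text().strip().splitlines()
    return json.loads(lines[-1])["entry_hash"] if lines else "0" * 64


def record_access(actor: str, dataset: str, action: str) -> dict:
    """Append one tamper-evident entry describing who touched what, and when."""
    entry = {
        "ts": time.time(),
        "actor": actor,        # service account or human
        "dataset": dataset,    # e.g. "training_corpus_v3"
        "action": action,      # e.g. "read", "export", "delete"
        "prev_hash": _last_hash(),
    }
    payload = json.dumps(entry, sort_keys=True).encode()
    entry["entry_hash"] = hashlib.sha256(payload).hexdigest()
    with AUDIT_LOG.open("a") as f:
        f.write(json.dumps(entry) + "\n")
    return entry


if __name__ == "__main__":
    record_access("training-pipeline", "training_corpus_v3", "read")
```

The chaining is the point of the design: because every entry includes the previous entry’s hash, you can later show a security team that the trail has not been quietly rewritten.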
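And for item 7, the same log can be summarized into a POC-facing report showing exactly who, if anyone, touched the customer’s data. This sketch assumes the JSON-lines format from the previous example and a hypothetical convention of tagging human actors with a "user:" prefix.

```python
# Minimal sketch: summarize the audit log into a POC-facing access report.
# Assumes the JSON-lines format written by the previous sketch.
import json
from collections import Counter
from pathlib import Path

AUDIT_LOG = Path("audit_log.jsonl")
HUMAN_PREFIX = "user:"  # assumed convention: human actors are tagged "user:<name>"


def poc_access_summary() -> dict:
    """Report total events, events per actor, and any human access to the data."""
    if not AUDIT_LOG.exists():
        return {"total_events": 0, "events_by_actor": {}, "human_access_events": []}
    entries = [json.loads(line) for line in AUDIT_LOG.read_text().splitlines() if line.strip()]
    by_actor = Counter(e["actor"] for e in entries)
    human_touches = [e for e in entries if e["actor"].startswith(HUMAN_PREFIX)]
    return {
        "total_events": len(entries),
        "events_by_actor": dict(by_actor),
        "human_access_events": human_touches,  # ideally an empty list
    }


if __name__ == "__main__":
    print(json.dumps(poc_access_summary(), indent=2))
```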

Taking this approach will lead to a noticeably faster sales cycle with considerably less friction between you and the security team. If you’re open, honest and upfront about what you know and what you don’t, security teams will listen. Many of them may want to learn from you if you’re well educated, which can create a collaborative relationship that elevates your ability to innovate and sell. This symbiotic approach can also help you identify and address holes in your offering, because at the end of the day, security is still everyone’s problem.


Tia Marciel

Passionate Technical Writer | Expert in Crafting Clear User-Friendly Documentation for Fortune 500 Companies | Skilled in Collaboration and Strategic Communication | Transforming Information into Actionable Insights

3 months ago

Thank you for this insightful article, Yousuf! I learned something new today! I appreciate the practical advice you give to be transparent and establish trust up front, to shorten the sales cycle and promote trust downstream: articulate your security approach, be prepared to talk about how you protect your customer's data, and be open about security gaps and unknowns; their security teams will value the honesty. And love your sweater-vest logo in the article, BTW.

Alex Hesterberg

Chief Executive Officer - Superna (superna.io)

3 months ago

Fantastic points, Yousuf Khan (as always)! Data security and chain of custody take on a new definition with AI.

MD Deluwar Hussen

SaaS Product Designer | Head of Design and Founder at AntDesk | WordPress Developer

3 months ago

This is exactly what I needed to see today!

Tiffany Dreyer

Senior Brand & Creative Services leader

3 months ago

I love that the bot is wearing a vest too

Wendy Phillips

Software entrepreneur and fan of software entrepreneurs

3 months ago

What an on-point post. Love the EV analogy for how security for AI must be approached differently. Also appreciate that you are addressing #founders selling and building. Yes! Get in front of these AI security conversations with your prospects!
