Using the Power of Cyberpsychology to Keep Your Organisation Secure
Introduction:
There's a new 'ology' in town - cyberpsychology. It might sound like a gimmick, but cyber security experts are taking it seriously. Of course, there are legitimate concerns about the latest theories distracting focus and investment from the urgent cybersecurity business of fighting tech with tech to keep hackers at bay. But it's unwise to dismiss cyberpsychology when there's evidence that the hackers themselves are making full use of its knowledge and principles to perpetrate cybercrime.
What do we mean by cyberpsychology? It's actually two or even three 'ologies' - incorporating the well-established academic disciplines of psychology, behavioural science and cognitive science. All three are directly relevant to modern cybersecurity knowledge and strategy, with a potentially crucial impact on the operational effectiveness of your cybersecurity measures. Business leaders and cyber security specialists need to understand the reasons for adopting psychological approaches and how to apply them practically for maximum defensive impact in their organisations.
Cybersecurity isn't a pure technology battle
Successful cybersecurity is about more than pure technology. Three main factors - people, processes and technology - together contribute to robust, responsive system and data security in a modern organisation.
Cyberpsychology plays a part in them all. AI is advancing all the time, but almost without exception, there is a human component to security breaches. Hackers can automate attempts to break firewalls and technical security protocols, but they're going head-to-head with expert peers when they tackle the technology. It's much easier for them to trick a fallible human, or to influence them using persuasive techniques or fake information. Human responses vary in any given situation: tapping into an unguarded reaction can give criminals a quick way into the digital information and system insights that they're seeking.
Cyber criminals and hackers aren't squeamish about psychological cyberwarfare
Cyber criminals play on fear and a sense of urgency to pressure their victims to act without thinking too hard. In the consumer media, we hear a lot about ordinary people being tricked into sharing financial details, allowing fake technical support experts access to their systems or revealing sensitive information. It's not only people who lack digital awareness who succumb to these tactics. Phishing, vishing and smishing attacks evolve constantly and can be sophisticated and convincing.
Whether at home or work, hackers know the power of appealing to our emotions and kindness. Cyberpsychology is at work in every common scenario, from pretending to be a family member or colleague needing help, to flattering us by suggesting we are the only people who can help, or embedding fear that we've failed to complete a form or that we urgently need to update our devices or computers.
In everyday life, most people have a good sense of common dangers and how to avoid them. We have centuries of survival experience to draw on in the physical world. But in the modern digital age, we haven't yet developed a consistent collective or individual fight or flight response to online danger. That's why taught and learned behaviour and robust processes are so important when it comes to cyber security. One individual's actions can put an organisation at risk, so every human reaction to a process, requirement or anomaly matters.
Humans in the workplace can be caught off guard
In a large organisation it can be even easier for bad actors to pose as legitimate auditors, partners, suppliers or officials. Users can be more careless with company technologies than with their own, because they believe that company data security is strong compared to their own personal measures.
Human vulnerability and instinctive reactions are the cause of many data breaches. People usually want to help, so they will try to do what they're asked at work, including responding to requests in business emails to share information.
On top of this, when strong emotions are invoked, people don't tend to respond as rationally as usual. When fraudulent communications are smart enough to emphasise deadlines and drop in the name of a senior company executive, people can easily miss red flags in their haste to avoid criticism. And when people are working in stressed or rushed environments, they are more likely to give hackers an advantage, because urgency leaves them no time to consider requests or situations properly. This can apply to workers at all levels: the more responsibility and power they have in the organisation, the more sensitive, confidential and strategic the information they may risk revealing to prying eyes.
Even tech security and cyber experts can fall victim to scams, or narrowly avoid them. Take a look at Dr Erik J. Huffman's TED talk to see how it can happen even to the best-informed IT experts.
Monitor, command and control: a classic response
When you look at the high stakes, it's not surprising that traditional approaches to cybersecurity tend to focus on enforcement, monitoring and limiting people's technology and connected autonomy. However, it's becoming increasingly difficult to use this model effectively.
The world of work has changed hugely in the last few years. People use all kinds of platforms, devices and technologies to work and communicate. It's very difficult for an employer or security team to fully monitor and control all of these, particularly when they need to make sure they comply with human rights and privacy regulations. Monitoring, commanding and controlling for cybersecurity cannot manage out every possibility or option in our distributed working environments. Unless everyone is office-based, uses only corporate devices and accesses only a defined number of assets and resources, it's extremely difficult to ensure full oversight.
There's another problem with this approach. Treating employees (or end users, as IT departments often name them) as zero trust liabilities leads to limiting and restricting their behaviour and often undermines their sense of accountability. It's completely counter to modern HR (or People Science) best practices of motivating, informing, empowering and trusting people to do the right things.
Today, we need cybersecurity models that are built for people, not just to enclose and shore up the core of systems and technologies.
A psychologically informed approach to better cybersecurity behaviours
A more productive mindset for employers and security experts could be to consider that most people almost always seek to do the right thing and to make good and secure choices at work. They don't want to create security breaches, risk operational or financial consequences for their colleagues or customers, or damage the reputation of their employer, who provides their employment and livelihood.
But it's not always easy for people to behave in this way. Humans can be put off by onerous processes if they don't understand what they're for or find them too difficult to navigate correctly. Most of us have had the experience of abandoning a website if the verification and sign-up process is tortuous. The same applies to office systems and protocols. Password compliance rules can be so stringent that people can't remember what they chose, so they use weak or repeated passwords or store them insecurely. System access permissions can be painful to request and processes can be difficult to use, so people borrow logins and systems or take data outside them. It's hard to maintain oversight.
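To make the design point concrete, here is a minimal sketch (in Python) of what a people-friendly password check might look like: it favours length and a denylist of known-weak choices over arbitrary composition rules that push people towards passwords they can't remember. The function name, minimum length and denylist are illustrative assumptions, not a prescription from this article.

```python
# Illustrative sketch only: a low-friction password policy check.
# The specific rules here (length-first, small denylist) are assumptions
# chosen to show the "design for people" principle, not a full policy.

COMMON_PASSWORDS = {"password", "123456", "qwerty", "letmein"}

def check_password(candidate: str, min_length: int = 12) -> list:
    """Return a list of human-readable problems; an empty list means acceptable."""
    problems = []
    if len(candidate) < min_length:
        # Encourage memorable passphrases instead of demanding symbols.
        problems.append(
            f"Use at least {min_length} characters (a passphrase is fine)."
        )
    if candidate.lower() in COMMON_PASSWORDS:
        problems.append("This password is too common; choose something more personal.")
    return problems

# A long, memorable passphrase passes without arbitrary symbol rules:
print(check_password("correct horse battery staple"))  # []
```

The design choice being illustrated is that feedback is phrased as guidance rather than rejection, and the rules are few enough that people can actually comply with them.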
People generally understand and support the aims of protecting data, stopping spam and preventing malicious incursions, but they are also sensitive to excessive security red tape, as they may perceive it. Cybersecurity decision-makers need to harness the potential of employees and their human intelligence and insight so that it's straightforward for them to take the best security decisions and actions. We must design cybersecurity processes and protocols with people in mind, not just technology.
Trusting employees on the front line of defence
It's useful to reframe a commonly used cybersecurity phrase. People are not 'weak links'. They are certainly potential conduits for attackers to exploit, but they can also form a powerful human defence force.
Users are more likely to adopt security measures like MFA and strong passwords when they understand the risks and consequences. They need to trust that the extra actions or considerations they're being asked to make will be effective.
Treating them as intelligent and trusted peers rather than inevitable violators of the rules means asking workers about their experiences and giving them the opportunity to explain honestly what they struggle with in security controls. This can help us understand what people need in order to be more heedful of security. Then we can act on it in all three key areas - cybersecurity system design, training and communication. Security strategy and training need to shift the focus from compliance and completion alone to active engagement and practical scenarios.
The cybersecurity team is not the enemy
When organisations and cybersecurity experts use the same language as cybercriminals, it can seem like the IT team is escalating the threat just as aggressively as hackers are creating it. We educate users to avoid being rushed or panicked into action by criminals. But in the same breath, the cybersecurity industry uses highly emotive language ('catastrophe', 'devastating', 'critical risk') that can be intimidating or alienating. People may either feel it's an exaggerated threat, or that the challenge is so big that individual actions can't have any effect. Even worse, all the hyperbole can be desensitising, so people filter out warnings and exhortations.
To counter this, cyberpsychologists suggest fostering a sense of community and civic responsibility amongst employees to support cybersecurity. The mentality is that together we can stand up to cybercrime - the cybersecurity team is here to help you, not punish you.
Psychological safety matters too. In some organisations, a culture of toughness means individuals feel they must handle a situation without help. There may be fear of repercussions, so employees don't want to get into trouble for acknowledging a problem or a lack of knowledge. Actively creating a psychologically safe organisational and security culture means people can be open, experiment, fail in safe environments (for example, through practice or simulation exercises), ask questions, report suspicions and not feel anxious about being blamed. Employees will feel free to discuss risks, speak up about processes that aren't working and engage constructively with cybersecurity colleagues.
Conclusion:
Cybersecurity professionals usually have a technology background, rather than a people focus. The emergence of the science of cyberpsychology presents an opportunity for them to acquire new skills and perspectives. They may need training and practice in understanding and influencing employee behaviours, so they can apply cyberpsychology to user-friendly security system design, to engaging and active training and to positive communication, including incentives to do the right thing.
Cybercriminals are experts in abusing our personal vulnerabilities and biases to achieve their aims. They know how to breach our defences by exploiting our worries, doubts, inattention and curiosity. They also know how to persuade us into unsafe courses of action, using plausible social media content and direct, often urgent communications. Some crime groups even directly encourage employees to breach security for political or ethical reasons.
Security vendors and consultants cannot promise to solve these problems purely through technology precautions and restrictive monitoring. People are inherently fallible, and potentially vulnerable to cyberattack. But collectively, they can also form a powerful defence. That's why cyberpsychology is ultimately so important: it empowers you to use your workforce's human strengths to activate and add to your technology and process measures.
That's all...for now.
Want help setting your cyber security strategy? Speak to the CNS at Six Degrees team, or join the next Cyber University Session on 13th June in London.