Business and Technical Requirements of the Security Awareness Program

Collecting data for a security awareness program is much like collecting data for any other business project. Nowadays, data is a precious resource that can define the difference between success and failure. Data is the main tool a security awareness project has for detecting security breaches and assessing the efficiency of awareness campaigns, security training, and so on. Data collection refers to a systematic approach to collecting, analyzing, and storing information that reveals the progress of the security program and serves the needs of the IT Security team. [Open Text Corporation (n.d.). What is Data Collection? Opentext.com. Retrieved May 23, 2024, from https://www.opentext.com/what-is/data-collection] Like any other business project, it begins with the stakeholders. Who are the stakeholders? “In business, a stakeholder is any individual, group, or party interested in an organization and the outcomes of its actions.” [CFI Education Inc. (n.d.). Stakeholder. Corporatefinanceinstitute.com. Retrieved May 23, 2024, from https://corporatefinanceinstitute.com/resources/accounting/stakeholder/] That includes customers, employees, investors, suppliers, communities, governments, and others.

Customers: The customers’ stake is the quality and value of the product or service. Businesses, in turn, use business analytics to enhance customer support while preserving privacy and confidentiality. Many tools apply machine learning algorithms and natural language processing techniques, which reduces the time the customer service staff needs to process support tickets. Other solutions apply predictive analytics to anticipate security events before they occur and to adapt dynamically to the changing security environment.

Employees: A different group of stakeholders, the employees, have a stake in their income and safety. Many enterprises have started to pay special attention to their most valued asset: their employees.

My experience: In my line of duty, creating and analyzing visual representations of massive data arrays is essential. This is an everyday process, and the types of dashboards I use are Pareto charts, column charts, and various combined charts. When they are done correctly, my audience clearly understands the actual state of the company’s business. What would help me do my job better is clean data and automated algorithms to sort and filter all the information the dashboards need. At the moment, that is a tedious activity that takes up most of the day, because the methods for collecting and processing the data follow no clear idea or objectives. Before I started in this position, different people from all departments had created Excel charts, and other employees had been trying to extract data from those spreadsheets. As a result, the workflow became complicated and, in places, mysterious. Some employees are no longer with the company, and the newcomers are neck-deep in all this chaos, creating even more misunderstanding. The problem is that in their attempts to automate the process, each person implemented their own methods: macros, pivot tables, and so on. When someone changes one or two of the spreadsheets, whether because the job requires it or because it makes their life easier, the connections between the tabs and tables break or stop working correctly. This is the primary source of errors and misleading graphs in the final result.

So my job is to clean up this logistical nightmare and create a sound organization for how the data are processed and analyzed. First, I must discover and repair all broken links while checking the integrity and consistency of the information. I use visualizations, such as the Pareto chart sketched below, to find discrepancies between the original input and the final graphs. Then I start fixing the datasets, the broken macros, the obsolete update links, and the false visual information.
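
A minimal sketch of such a Pareto chart, assuming pandas and Matplotlib and using purely hypothetical issue categories and counts (the real figures come from the cleaned spreadsheets, not from this hard-coded stand-in), could look like this:

```python
import pandas as pd
import matplotlib.pyplot as plt

# Hypothetical error categories found during the spreadsheet cleanup.
df = pd.DataFrame({
    "issue": ["Broken links", "Stale macros", "Manual edits", "Duplicate rows", "Other"],
    "count": [48, 31, 17, 9, 5],
})

# Pareto logic: rank the bars in descending order, then add a cumulative-percentage line.
df = df.sort_values("count", ascending=False).reset_index(drop=True)
df["cum_pct"] = df["count"].cumsum() / df["count"].sum() * 100

fig, ax = plt.subplots()
ax.bar(df["issue"], df["count"])
ax.set_ylabel("Occurrences")
ax.tick_params(axis="x", labelrotation=30)

ax2 = ax.twinx()  # secondary y-axis for the cumulative line
ax2.plot(df["issue"], df["cum_pct"], marker="o", color="tab:red")
ax2.set_ylabel("Cumulative %")
ax2.set_ylim(0, 110)

fig.tight_layout()
plt.show()
```

The bars rank the sources of bad data, and the cumulative line shows how few of them account for most of the errors, which is exactly what the audience needs to see at a glance.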

How did I collect it? Simple: data mining. SQL Server’s built-in tools make it possible to scale and manage servers, instances, database applications, and resource utilization more efficiently through a single console. SQL Server also has built-in security and compliance capabilities for managing data access and auditing information, which helps meet regulatory compliance needs (e.g., HIPAA, the PCI Data Security Standard, etc.). The next step is data wrangling. Data wrangling, sometimes referred to as data munging, is the process of transforming and mapping data from one "raw" form into another format to make it more appropriate and valuable for various downstream purposes such as analytics. The goal of data wrangling is to ensure quality, useful data. Data analysts typically spend the majority of their time wrangling data rather than actually analyzing it. Data wrangling may involve further munging, data visualization, data aggregation, training a statistical model, and many other potential uses. It typically follows a set of general steps: extracting the data in raw form from the data source, "munging" the raw data (e.g., sorting) or parsing it into predefined data structures, and finally depositing the resulting content into a data sink for storage and future use. [Wikimedia Foundation, Inc. (n.d.). Data wrangling. Wikipedia.org. Retrieved May 23, 2024, from https://en.wikipedia.org/wiki/Data_wrangling]
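
Those three steps translate almost directly into code. The sketch below is a simplified illustration assuming pandas; the ticket columns and values are hypothetical stand-ins (in practice the raw extract would come from SQL Server), but the extract, munge, and deposit flow is the same:

```python
import pandas as pd

# 1. Extract: raw records as they arrive from the source. In my case the source
#    is SQL Server; this in-memory frame is a hypothetical stand-in so the
#    example stays self-contained.
raw = pd.DataFrame({
    "ticket_id": ["1001", "1002", "1003", "1003", None],
    "opened": ["2024-05-03", "2024-05-01", "2024-05-02", "2024-05-02", "2024-05-04"],
    "hours_to_close": ["4", "12", "n/a", "n/a", "7"],
})

# 2. Munge: drop unusable rows and duplicates, parse fields into predefined types, sort.
clean = (
    raw.dropna(subset=["ticket_id"])
       .drop_duplicates(subset=["ticket_id"])
       .assign(
           opened=lambda d: pd.to_datetime(d["opened"]),
           hours_to_close=lambda d: pd.to_numeric(d["hours_to_close"], errors="coerce"),
       )
       .sort_values("opened")
)

# 3. Deposit: write the result into a data sink for storage and future use.
clean.to_csv("clean_tickets.csv", index=False)
print(clean)
```

The point is that the parsing, deduplication, and sorting happen in one reproducible place instead of being scattered across dozens of spreadsheets.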

Data may come from different sources, but without proper processing, they don’t mean much. Every dataset should be molded to some extent so the applied statistical algorithms can produce viable and accurate results. Automating this processing makes it less of a struggle and leaves more time to address the tasks that depend on the results of the analysis.
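
As a closing illustration, automation here can be as modest as wrapping the molding steps in a single routine that every raw extract passes through. This is a hypothetical sketch assuming pandas and made-up folder and column names, not a description of any particular production setup:

```python
from pathlib import Path

import pandas as pd

# Hypothetical layout: raw extracts land in raw_extracts/, cleaned files go to clean/,
# so every downstream analysis sees data of the same shape.
RAW_DIR = Path("raw_extracts")
CLEAN_DIR = Path("clean")
REQUIRED_COLUMNS = ["ticket_id", "opened", "hours_to_close"]


def mold(path: Path) -> pd.DataFrame:
    """Apply the same molding steps to a single raw extract."""
    df = pd.read_csv(path)
    missing = [c for c in REQUIRED_COLUMNS if c not in df.columns]
    if missing:
        raise ValueError(f"{path.name}: missing columns {missing}")
    return (
        df.dropna(subset=["ticket_id"])
          .drop_duplicates(subset=["ticket_id"])
          .sort_values("opened")
    )


if __name__ == "__main__":
    CLEAN_DIR.mkdir(exist_ok=True)
    for raw_file in sorted(RAW_DIR.glob("*.csv")):
        mold(raw_file).to_csv(CLEAN_DIR / raw_file.name, index=False)
```

Once a step like this runs on a schedule, the dashboards always start from data of a known, consistent shape.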
