Business and Technical Requirements of the Security Awareness Program
Javor Mladenoff
AI/ML Specialist | Data Analytics Expert | Cloud Systems Architect | Risk Management Consultant | R&D Chemistry Scientist
Collecting data for a security awareness program is similar to any other business project. Nowadays, data is a precious resource that can make the difference between success and failure. Data is the main tool a security awareness project has for detecting security breaches and assessing the effectiveness of awareness programs, security training, and related activities. Data collection refers to a systematic approach to gathering, analyzing, and storing information that reveals the progress of the security program and serves the needs of the IT Security team. [Open Text Corporation (n.d.). What is Data Collection? Opentext.com. Retrieved May 23, 2024, from https://www.opentext.com/what-is/data-collection] Like any other business project, it begins with the stakeholders. Who are the stakeholders? “In business, a stakeholder is any individual, group, or party interested in an organization and the outcomes of its actions.” [CFI Education Inc. (n.d.). Stakeholder. Corporatefinanceinstitute.com. Retrieved May 23, 2024, from https://corporatefinanceinstitute.com/resources/accounting/stakeholder/] That includes customers, employees, investors, suppliers, communities, governments, and others.
Customers: Customers have a stake in the quality and value of the product or service. For example, businesses use the means of business analytics to enhance their customer support while preserving privacy and confidentiality. Many tools apply machine learning algorithms and natural language processing techniques, which reduce the time customer service staff need to process support tickets. Other solutions apply predictive analytics to prevent security events before they occur and to adapt dynamically to a changing security environment.
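As an illustration, here is a minimal sketch of how such ticket triage can work. It uses scikit-learn as an assumed dependency, and the tickets, queue labels, and model choice are hypothetical: a baseline TF-IDF representation with a linear classifier, not the method of any particular product.

```python
# Hedged sketch: ML/NLP-assisted support-ticket triage.
# Assumes scikit-learn is installed; all data below is hypothetical.
from sklearn.feature_extraction.text import TfidfVectorizer
from sklearn.linear_model import LogisticRegression
from sklearn.pipeline import make_pipeline

# Hypothetical historical tickets and the queues that resolved them.
tickets = [
    "Cannot reset my account password",
    "Suspicious login attempt from an unknown device",
    "Invoice total does not match the order",
    "Phishing email asking for my credentials",
]
queues = ["account", "security", "billing", "security"]

# TF-IDF features plus a linear classifier: a common baseline for
# routing tickets before a human reviews them.
model = make_pipeline(TfidfVectorizer(), LogisticRegression())
model.fit(tickets, queues)

# A new ticket is routed automatically, cutting staff processing time.
print(model.predict(["Strange email asks me to confirm my password"]))
```

In practice such a classifier would be trained on thousands of labeled tickets and would only pre-sort the queue; a human still resolves the ticket itself.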
Employees: A different group of stakeholders, the employees, have a stake based on their income and safety. Many enterprises have started to pay special attention to their most valued asset: the employees.
My experience: In my line of duty, creating and analyzing visual representations of massive data arrays is essential. This is an everyday process, and the chart types I use on my dashboards are Pareto charts, column charts, and various combined charts. When they are done correctly, my audience gets a clear picture of the actual state of the company’s business. What would help me do my job better is clean data and automated routines that sort and filter the information feeding the dashboards. At the moment, that is a tedious activity that takes up most of the day, because the methods for collecting and processing the data follow no clear idea or objectives. Before I started in this position, different people from all departments had created Excel charts, and other employees had been trying to extract the data from those spreadsheets. As a result, the workflow became complicated and, in places, mysterious. Some employees are no longer with the company, and the newcomers are neck-deep in all this chaos, creating even more misunderstanding.

The root problem is that in their attempts to automate the process, each person implemented their own methods, such as macros, pivot tables, and so on. When someone changes one or two of the spreadsheets, whether because the job requires it or because it makes life easier, the connections between tabs and tables break or stop working correctly; this is the primary source of errors and misleading graphs in the final result. So, my job is to clean up this logistical nightmare and create a sound organization for how data is processed and analyzed. First, I must discover and repair all broken links while checking the integrity and consistency of the information. I use visualizations to find discrepancies between the original input and the final graphs. Then I fix the datasets, the broken macros, the obsolete update links, and the false visual information.
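The first pass, finding broken links and references, can be partly automated. The sketch below scans a folder of workbooks for formulas containing #REF! errors or references into other workbooks; the folder name is hypothetical, openpyxl is an assumed dependency, and a real audit would also need to cover macros and pivot caches.

```python
# Hedged sketch: flag suspicious formulas in legacy Excel workbooks.
# Assumes openpyxl is installed; the "reports" folder is hypothetical.
from pathlib import Path
import openpyxl

REPORT_DIR = Path("reports")

def scan_workbook(path):
    """Yield (sheet, cell, formula) for formulas that look broken."""
    wb = openpyxl.load_workbook(path, data_only=False)
    for name in wb.sheetnames:
        for row in wb[name].iter_rows():
            for cell in row:
                value = cell.value
                if isinstance(value, str) and value.startswith("="):
                    # "#REF!" marks a broken reference; "[" usually marks
                    # a link into another workbook, a frequent failure point.
                    if "#REF!" in value or "[" in value:
                        yield name, cell.coordinate, value

for xlsx in sorted(REPORT_DIR.glob("*.xlsx")):
    for sheet, coord, formula in scan_workbook(xlsx):
        print(f"{xlsx.name}: {sheet}!{coord} -> {formula}")
```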
How did I collect it? Simple: data mining. SQL Server’s built-in tools make it possible to scale and manage servers, instances, database applications, and resource utilization more efficiently through a single console. SQL Server also has built-in security and compliance capabilities for managing data access and auditing information, which helps meet regulatory compliance needs (e.g., HIPAA, the PCI Data Security Standard, etc.). The next step is data wrangling. Data wrangling, sometimes referred to as data munging, is the process of transforming and mapping data from one "raw" form into another format to make it more appropriate and valuable for various downstream purposes such as analytics. The goal of data wrangling is to ensure quality, useful data. Data analysts typically spend the majority of their time on data wrangling rather than on the actual analysis of the data. Wrangling may include further munging, data visualization, data aggregation, training a statistical model, and many other potential uses. Data wrangling typically follows a set of general steps: extracting the data in raw form from the data source, "munging" the raw data (e.g., sorting) or parsing it into predefined data structures, and finally depositing the resulting content into a data sink for storage and future use. [Wikimedia Foundation, Inc. (n.d.). Data wrangling. Wikipedia.org. Retrieved May 23, 2024, from https://en.wikipedia.org/wiki/Data_wrangling]
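Those three steps, extract, munge, and deposit, can be expressed in a few lines. The sketch below pulls rows from SQL Server with pyodbc, cleans them with pandas, and writes the result to a CSV sink; the server, database, table, and column names are all hypothetical, and both packages are assumed dependencies.

```python
# Hedged sketch of the extract -> munge -> deposit wrangling steps.
# Assumes pyodbc and pandas are installed; all names are hypothetical.
import pandas as pd
import pyodbc

conn = pyodbc.connect(
    "DRIVER={ODBC Driver 17 for SQL Server};"
    "SERVER=sql01;DATABASE=Awareness;Trusted_Connection=yes;"
)

# Step 1: extract the raw data from the source.
raw = pd.read_sql(
    "SELECT user_id, event_type, event_time FROM awareness_events", conn
)

# Step 2: munge - normalize types, drop bad rows and duplicates, sort.
raw["event_time"] = pd.to_datetime(raw["event_time"], errors="coerce")
clean = (
    raw.dropna(subset=["event_time"])
       .drop_duplicates()
       .sort_values("event_time")
)

# Step 3: deposit the cleaned content into a data sink for future use.
clean.to_csv("awareness_events_clean.csv", index=False)
```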
Data may come from different sources, but without proper processing, it doesn’t mean much. Every dataset should be molded to some extent so that the applied statistical algorithms can produce viable and accurate results. Automation makes this process less of a struggle, leaving more time to address the tasks that depend on the results of the analysis.
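One way to automate that molding is a small, repeatable validation step that runs before any analysis. The checks and the 5% missing-value tolerance below are hypothetical, the input file is the sink written in the previous sketch, and pandas is an assumed dependency.

```python
# Hedged sketch: automated data-quality checks before analysis.
# Assumes pandas is installed; thresholds and file names are hypothetical.
import pandas as pd

def validate(df: pd.DataFrame) -> list[str]:
    """Return human-readable data-quality problems, empty if none."""
    problems = []
    if df.empty:
        problems.append("dataset is empty")
        return problems
    for column, share in df.isna().mean().items():
        if share > 0.05:  # hypothetical tolerance: 5% missing values
            problems.append(f"{column}: {share:.0%} missing values")
    duplicates = df.duplicated().sum()
    if duplicates:
        problems.append(f"{duplicates} duplicate rows")
    return problems

df = pd.read_csv("awareness_events_clean.csv")  # sink from the prior step
for problem in validate(df):
    print("WARN:", problem)
```

Running such checks on every refresh catches broken inputs before they reach the dashboards, instead of after a misleading graph has already shipped.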