October 28, 2020
Kannan Subbiah
FCA | CISA | CGEIT | CCISO | GRC Consulting | Independent Director | Enterprise & Solution Architecture | Former Sr. VP & CTO of MF Utilities | BU Soft Tech | itTrident
IT leaders adjusting to expanded role and importance since coronavirus pandemic
"IT had to ensure that their technical environment could handle the increased online demand, as well as any downstream impacts to the supply chain, logistics and payment applications all connected to the online engine keeping the company operating and in business. IT had to refocus efforts to enable more robust customer engagements remotely via applications and web portals." She said the best examples of this are insurance claims and government services, most of which were not submitted or enabled via an application or web portal before the COVID-19 pandemic. Despite the surge in importance during the pandemic, IT has been gaining prominence within enterprises for years, Doebel said. IT has been moving toward a business-critical role for years as technology and innovation have become synonymous with business growth and improved customer experiences. IT teams rose to the occasion during the COVID-19 outbreak and continue to drive innovation and transformation in these challenging times, she added. Important business decisions are now being put in the hands of IT workers, who have to think of ways to future-proof their organizations.
5 famous analytics and AI disasters
In October 2020, Public Health England (PHE), the UK government body responsible for tallying new COVID-19 infections, revealed that nearly 16,000 coronavirus cases went unreported between Sept. 25 and Oct. 2. The culprit? Data limitations in Microsoft Excel. PHE uses an automated process to transfer COVID-19 positive lab results as a CSV file into Excel templates used by reporting dashboards and for contact tracing. Unfortunately, Excel worksheets are capped at 1,048,576 rows and 16,384 columns. Compounding the problem, PHE was listing cases in columns rather than rows. ... The "glitch" didn't prevent individuals who got tested from receiving their results, but it did stymie contact tracing efforts, making it harder for the UK National Health Service (NHS) to identify and notify individuals who were in close contact with infected patients. In a statement on Oct. 4, Michael Brodie, interim chief executive of PHE, said NHS Test and Trace and PHE resolved the issue quickly and transferred all outstanding cases immediately into the NHS Test and Trace contact tracing system. PHE put in place a "rapid mitigation" that splits large files and has conducted a full end-to-end review of all systems to prevent similar incidents in the future.
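A simple guard illustrates the failure mode: checking a CSV export against Excel's worksheet limits before loading it into a template. This is a hypothetical sketch, not PHE's actual pipeline; the limits shown are for the modern .xlsx format, and data stored sideways (one case per column, as described above) runs into the much smaller column cap far sooner.

```python
import csv

# Per-worksheet limits of the modern .xlsx format; the older .xls
# format allows far fewer rows, so checking against the wrong limit
# still truncates data silently.
EXCEL_MAX_ROWS = 1_048_576
EXCEL_MAX_COLS = 16_384

def fits_in_excel(path):
    """Return (rows, cols, ok): the CSV's dimensions and whether it
    fits in a single Excel worksheet without silent truncation."""
    rows, cols = 0, 0
    with open(path, newline="") as f:
        for record in csv.reader(f):
            rows += 1
            cols = max(cols, len(record))
    ok = rows <= EXCEL_MAX_ROWS and cols <= EXCEL_MAX_COLS
    return rows, cols, ok
```

A pipeline could call this before conversion and split or reject files that exceed the limits, instead of discovering missing cases downstream.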
Legal and security risks for businesses unaware of open source implications
The sobering reality is that compliance is not keeping up with usage of open source codebases. In view of this, businesses have to consider the impact of open source software on their operations as they move forward in a digitally connected world. Whether they are developing a product using open source components or involved in mergers and acquisitions activity, they have to conduct due diligence on the security and legal risks involved. One approach that has been proposed is to have a Bill of Materials (BOM) for software. Just like the BOMs commonly used by manufacturers of hardware such as smartphones, a BOM for software lists the components and dependencies of each application and offers more visibility. In particular, a BOM generated by an independent software composition analysis (SCA) gives businesses a deeper understanding of the foundation on which so many of their applications are built. Awareness is key to improvement. For starters, businesses cannot patch what they don't know they have. Patches must match the source, so businesses need to know the origin of their code. And open source is not only about source code, either.
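As a toy illustration of the idea, Python's standard library can enumerate the packages installed in an environment, which is the starting point for a minimal software BOM. This sketch is hypothetical and falls far short of a real SCA tool, which would also resolve transitive dependencies, licenses, and known-vulnerability data:

```python
from importlib.metadata import distributions

def python_sbom():
    """Return a minimal software BOM for the current Python
    environment: one {name, version} entry per installed
    distribution, sorted by name."""
    entries = [
        {"name": dist.metadata["Name"], "version": dist.version}
        for dist in distributions()
    ]
    return sorted(entries, key=lambda e: (e["name"] or "").lower())
```

Even an inventory this crude answers the first due-diligence question raised above: what do we actually have?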
Building a hybrid SQL Server infrastructure
The solution to this challenge is to build a SANless failover cluster using SIOS DataKeeper. SIOS DataKeeper performs block-level replication of all the data on your on-prem storage to the local storage attached to your cloud-based VM. If disaster strikes your on-prem infrastructure and the WSFC fails SQL Server over to the cloud-based cluster node, that cloud-based node can access its own copy of your SQL Server databases and can fill in for your on-prem infrastructure for as long as you need it to. One other advantage afforded by the SANless failover cluster approach is that there is no limit on the number of databases you can replicate. Where you would need to upgrade to SQL Server Enterprise Edition to replicate your user databases to a third node in the cloud, the SANless clustering approach works with both the SQL Server Standard and Enterprise editions. While SQL Server Standard Edition is limited to two nodes in the cluster, DataKeeper allows you to replicate to a third node in the cloud with a manual recovery process. With Enterprise Edition the third node in the cloud can simply be part of the same cluster.
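Block-level replication itself is easy to sketch: compare source and replica storage block by block and copy only the blocks that differ. The toy below scans two files; a real product such as DataKeeper intercepts writes at the volume level in real time rather than rescanning, so this is only an illustration of the concept, with hypothetical names throughout:

```python
import hashlib

BLOCK_SIZE = 4096  # bytes per block; real replication tracks writes, not scans

def replicate_changed_blocks(src_path, dst_path, block_size=BLOCK_SIZE):
    """Bring dst into sync with src by rewriting only the blocks
    whose contents differ. Returns the number of blocks copied."""
    copied = 0
    with open(src_path, "rb") as src, open(dst_path, "r+b") as dst:
        index = 0
        while True:
            s_block = src.read(block_size)
            if not s_block:
                break
            dst.seek(index * block_size)
            d_block = dst.read(block_size)
            # Compare digests and rewrite only mismatched blocks.
            if hashlib.sha256(s_block).digest() != hashlib.sha256(d_block).digest():
                dst.seek(index * block_size)
                dst.write(s_block)
                copied += 1
            index += 1
        dst.truncate(src.tell())  # trim any stale tail on the replica
    return copied
```

The payoff of the block-level approach is visible even here: after an initial sync, a small change to a large database file touches only the affected blocks, not the whole file.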
Why Enterprises Struggle with Cloud Data Lakes
The success of any cloud data lake project hinges on continual changes to maximize performance, reliability and cost efficiency. Each of these variables requires constant, detailed monitoring and management of end-to-end workloads. Consider the evolution of data processing engines and the importance of seizing the most advantageous opportunities around price and performance. Managing workload price-performance and cloud cost optimization is just as crucial to cloud data lake implementations, where costs can and will quickly get out of hand if proper monitoring and management aren't in place. ... Public cloud resources aren't private by default. Securing a production cloud data lake requires extensive configuration and customization, especially for enterprises that must comply with specific regulatory and governance mandates (HIPAA, PCI DSS, GDPR, etc.). Achieving the requisite data safeguards often means enlisting experienced, dedicated teams equipped to lock down cloud resources and restrict access to authorized, credentialed users only.
The No-Code Generation is arriving
Of course, no-code tools often require code, or at least the sort of deductive logic that is intrinsic to coding. You have to know how to design a pivot table, or understand what a machine learning capability is and what it might be useful for. You have to think in terms of data, and about inputs, transformations and outputs. The key here is that no-code tools aren't successful just because they are easier to use; they are successful because they are connecting with a new generation that understands precisely the sort of logic these platforms require to function. Today's students don't see their computers and mobile devices merely as consumption screens to be switched on; they are widely using them as tools of self-expression, research and analysis. Take the popularity of platforms like Roblox and Minecraft. Easily derided as just a generation's obsession with gaming, both platforms teach kids how to build entire worlds using their devices. Even better, as kids push the frontiers of the toolsets offered by these games, they are inspired to build their own tools. A proliferation of guides and online communities has sprung up to teach kids how to build their own games and plugins for these platforms (Lua has never been so popular).
Read more here ...