September 18, 2020
Kannan Subbiah
FCA | CISA | CGEIT | CCISO | GRC Consulting | Independent Director | Enterprise & Solution Architecture | Former Sr. VP & CTO of MF Utilities | BU Soft Tech | itTrident
Windows 10 upgrades are rarely useful, say IT admins
There is a disconnect between Microsoft's efforts and expectations – months of development time and testing to produce features and functionality that customers will clamor for – and the reaction by, in electioneering terms, a landslide-sized majority of those customers. In many cases, IT admins simply shrug at what Microsoft trumpets. "I understand the concept of WaaS, and the ability to upgrade the OS without a wipe/re-install is a good concept," one of those polled said. "But let's concentrate more on useful features, like an upgraded File Explorer, a Start menu that always works, and context-sensitive (and useful) help, and less on, 'It's time to release a new feature update, whether it has any useful new features or not.'" Some were considerably harsher in taking feature upgrades to task. "Don't have a clue why they think some of the new features might be worth our time, or even theirs," said another of those polled. And others decried what they saw as wasted opportunities. "It's mostly bells, whistles and window-dressing," one IT admin said. "It seems like no fundamental problems are tackled. Although updates DO every now and then cause new problems in fundamental functionality. Looks like there's at least some scratching done on the fundamental surface – but without explanation."
Adaptive Architecture: A Bridge between Fashion and Technology
Conceptually, IT borrowed a lot of themes from Civil Engineering, one being Architecture. Despite the 3,000 years that separate the two fields, Architecture and Software Architecture share similar vocabulary across their many definitions, with words such as "structure", "components", and "environment". At first, that relationship was very strong because the technology was "more concrete", heavier, and, obviously, slower. Everything was extremely difficult to change, and applications used to survive without an update for quite a long time. But as computers advanced, the world became submerged in a massive flow of information on digital platforms, and customers could connect directly to businesses through these channels, conditions that demand companies be able to push reliable modifications to their websites or applications every day, or even multiple times throughout the day. This progress didn't happen overnight, and as digital evolved, the technical landscape started to change, reflecting new requirements and problems. In 2001, in an initiative to understand these obstacles to developing software, obstacles still relevant to this day, seventeen people gathered in the Wasatch mountains of Utah. From that gathering came "The Agile Manifesto", a declaration based on four key values and 12 principles, establishing a mindset called "Agile".
Deep Dive into OWIN Katana
OWIN stands for Open Web Interface for .NET. OWIN is an open standard specification that defines a standard interface between .NET web servers and web applications. The aim is to provide a standard interface that is simple, pluggable and lightweight. OWIN is motivated by the development of web frameworks in other coding languages, such as Node.js for JavaScript, Rack for Ruby, and WSGI for Python. All these web frameworks are designed to be fast and simple, and they enable the development of web applications in a modular way. In contrast, prior to OWIN, every .NET web application required a dependency on System.Web.dll, which is tightly coupled to Microsoft's IIS (Internet Information Services). This meant that .NET web applications came with a number of application component stacks from IIS, whether they were actually required or not. This made .NET web applications, as a whole, heavier, and they performed slower than their counterparts in other coding languages in many benchmarks. OWIN was initiated by members of Microsoft's developer communities, such as the C#, F# and dynamic programming communities. Thus, the specification is largely influenced by the programming paradigm of those communities.
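To make the idea of a standard, server-agnostic interface concrete, here is a minimal sketch in Python using WSGI, which the article cites as one of OWIN's inspirations. This is an illustrative analogy, not OWIN itself; the handler name and port below are assumptions, not part of either specification.

```python
# A minimal WSGI sketch (Python's analogue of the OWIN idea). The application
# is just a callable taking the request environment and a response-starting
# callback, with no dependency on any particular web server.
from wsgiref.simple_server import make_server

def application(environ, start_response):
    # 'environ' is a plain dict describing the request; OWIN's environment
    # dictionary plays the same role on the .NET side.
    body = ("Hello from " + environ["PATH_INFO"]).encode("utf-8")
    start_response("200 OK", [("Content-Type", "text/plain"),
                              ("Content-Length", str(len(body)))])
    return [body]

if __name__ == "__main__":
    # wsgiref's reference server is enough to show pluggability: swapping in
    # gunicorn or uWSGI requires no changes to the application itself.
    make_server("localhost", 8000, application).serve_forever()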
Banking on digitalisation: A transformation journey enabled by technology, powered by humans
Banks are now staring at the massive challenge of continuing their digital investments in a cost-constrained environment. Getting their workforce ready to develop the technologies while continuing to deliver value to their customers is another issue. At the same time, they are competing with new digital banks that will undoubtedly come in with newer technology built on modern architecture, without the legacy debt. However, there are industry players that may have cracked the code to successful digitalisation. I know of incumbent banks as well as digital banks developing world-class digital capabilities at lower costs, while training their people to make full use of their new digital investments. Recently, the finance function of a leading global universal bank adopted a “citizen-led” digital transformation, training 300+ “citizen” developers who identified 200+ new use cases, resulting in an annual run-rate cost reduction of $15 million. This case study highlights the importance of engaging and upskilling your workforce while contributing to bottom-line benefits. Over the last two decades, technology itself has evolved and now has the ability to transform whole businesses in the financial services sector, similar to its impact on other industries such as retail and media. Traditionally, for banks, technology was a support function enabling product and customer strategies.
Google details RigL algorithm for building more efficient neural networks
Google researchers put RigL to the test in an experiment involving an image processing model. It was given the task of analyzing images containing different characters. During the model training phase, RigL determined that the AI only needs to analyze the character in the foreground of each image and can skip processing the background pixels, which don’t contain any useful information. The algorithm then removed connections used for processing background pixels and added new, more efficient ones in their place. “The algorithm identifies which neurons should be active during training, which helps the optimization process to utilize the most relevant connections and results in better sparse solutions,” Google research engineers Utku Evci and Pablo Samuel Castro explained in a blog post. “At regularly spaced intervals we remove a fraction of the connections.” There are other methods besides RigL that attempt to compress neural networks by removing redundant connections. However, those methods have the downside of significantly reducing the compressed model’s accuracy, which limits their practical application. Google says RigL achieves higher accuracy than three of the most sophisticated alternative techniques while also “consistently requiring fewer FLOPs (and memory footprint) than the other methods.”
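The article describes RigL only at a high level. As a rough illustration of the drop-and-grow cycle described in the RigL paper ("Rigging the Lottery", Evci et al.), the NumPy sketch below removes the weakest active connections by weight magnitude and regrows the same number where the gradient magnitude is largest. This is a simplified sketch under stated assumptions, not Google's implementation; the function name and the drop_fraction parameter are hypothetical.

```python
import numpy as np

def rigl_update(weights, grads, mask, drop_fraction=0.3):
    """One simplified RigL-style connectivity update for a weight matrix.

    Drops the active connections with the smallest weight magnitude, then
    regrows the same number of inactive connections where the dense
    gradient magnitude is largest, keeping overall sparsity constant.
    """
    n_update = int(drop_fraction * mask.sum())

    # Drop: among currently active weights, deactivate the smallest magnitudes.
    drop_scores = np.where(mask, np.abs(weights), np.inf)
    drop_idx = np.argsort(drop_scores, axis=None)[:n_update]
    mask.flat[drop_idx] = False

    # Grow: among inactive positions, activate where |gradient| is largest.
    # (The full algorithm also avoids immediately regrowing just-dropped
    # connections; that bookkeeping is omitted here for brevity.)
    grow_scores = np.where(mask, -np.inf, np.abs(grads))
    grow_idx = np.argsort(grow_scores, axis=None)[::-1][:n_update]
    mask.flat[grow_idx] = True
    weights.flat[grow_idx] = 0.0  # newly grown connections start at zero

    return weights * mask, mask
```

In the full algorithm, an update like this runs at regularly spaced training intervals, with the drop fraction decayed over time; between updates the network trains normally under the fixed sparsity mask.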
IBM, AI And The Battle For Cybersecurity
While older adversarial attack patterns were algorithmic and easier to detect, new attacks add AI features such as natural language processing and more natural human-computer interaction to make malware more evasive, pervasive and scalable. The malware will use AI to keep changing form in order to be more evasive and fool common detection techniques and rules. Automated techniques can make the malware more scalable, and combined with AI it can move laterally through an enterprise and attack targets without human intervention. The use of AI in cybersecurity attacks will likely become more pervasive. Better spam can be crafted that avoids detection or is personalized to a specific target as a form of spear-phishing attack, using natural language processing to craft more human-like messages. In addition, malware can be smart enough to understand when it is in a honeypot or sandbox and will avoid malicious execution to look more benign and not tip off security defenses. Adversarial AI attacks the human element with the use of AI-augmented chatbots to disguise the attack with human-like emulation. This can escalate to the point where AI-powered voice synthesis can fool people into believing that they’re dealing with a real human within their organization.
Read more here ...