July 28, 2021
Kannan Subbiah
FCA | CISA | CGEIT | CCISO | GRC Consulting | Independent Director | Enterprise & Solution Architecture | Former Sr. VP & CTO of MF Utilities | BU Soft Tech | itTrident
The NoOps trend aims to remove the friction between development and operations by, as the name suggests, simply removing operations. This may seem a drastic solution, but we do not have to take it literally. The right interpretation, the feasible one, is to remove as much of the human component as possible from the deployment and delivery phases. That approach is naturally supported by the cloud, which helps things work by themselves. ... One of the most evident scenarios that shows the benefit of AppOps is any application based on Kubernetes. If you open any cluster, you will find a lot of pod/service/deployment settings that are mostly the same. In fact, every PHP application has the same configuration except for its parameters; the same goes for Java, .NET, or other applications. The trouble is that Kubernetes is agnostic to the content of the applications it hosts, so it has to be told every detail. We have to start from scratch for every new application, even when the technology is the same. Why? I should have to explain only once how a PHP application is composed.
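To make the repetition concrete, here is a minimal Python sketch of that idea: describe a technology stack once, then stamp out each application's Kubernetes manifest from parameters alone. The function name, image names, and manifest defaults are illustrative assumptions, not part of any real AppOps tool.

```python
# Hypothetical sketch: one shared description per technology,
# per-app differences reduced to a handful of parameters.
import json

def php_app_manifest(name: str, image: str, replicas: int = 2) -> dict:
    """Every PHP app shares this shape; only the parameters change."""
    return {
        "apiVersion": "apps/v1",
        "kind": "Deployment",
        "metadata": {"name": name},
        "spec": {
            "replicas": replicas,
            "selector": {"matchLabels": {"app": name}},
            "template": {
                "metadata": {"labels": {"app": name}},
                "spec": {
                    "containers": [{
                        "name": name,
                        "image": image,
                        "ports": [{"containerPort": 80}],  # same for every PHP app
                    }]
                },
            },
        },
    }

# Two different applications, one shared description
# (Kubernetes accepts JSON manifests as well as YAML):
print(json.dumps(php_app_manifest("blog", "registry.example.com/blog:1.4"), indent=2))
print(json.dumps(php_app_manifest("shop", "registry.example.com/shop:2.0"), indent=2))
```

The boilerplate lives in one place; onboarding another PHP application reduces to supplying its parameters.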
Living organisms and computer systems alike must have instantaneous knowledge to allow for rapid response to external events. This knowledge represents a direct input-to-output function that reacts to events or sequences within a well-mastered domain. In addition, humans and advanced intelligent machines accrue and utilize broader knowledge that requires some additional processing. I refer to this second level as standby knowledge. Actions or outcomes based on standby knowledge require processing and internal resolution, which makes it slower than instantaneous knowledge; however, it applies to a wider range of situations. Humans and intelligent machines also need to interact with vast amounts of world knowledge so that they can retrieve the information required to solve new tasks or increase standby knowledge. Whatever the scope of knowledge within the human brain or the boundaries of an AI system, there is substantially more relevant information outside it that warrants retrieval. I refer to this third level as retrieved external knowledge.
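As a loose computational analogy (mine, not the author's), the three levels can be pictured as a reflex table, a slower reasoning routine, and an external lookup. All names below are invented for illustration.

```python
# Illustrative analogy only: three levels of knowledge as three code paths.

REFLEXES = {"red_light": "brake"}        # instantaneous: direct input-to-output

LEARNED_PROCEDURES = {"route_blocked"}   # situations standby knowledge can resolve

def deliberate(situation: str) -> str:
    """Standby knowledge: needs internal processing, so it is slower,
    but it covers a wider range of situations than reflexes."""
    return f"work out a plan for: {situation}"

def retrieve(query: str) -> str:
    """Retrieved external knowledge: fetched from outside the system."""
    external_store = {"unfamiliar_symbol": "look up its meaning"}  # stand-in for a search index
    return external_store.get(query, "no answer found")

def respond(event: str) -> str:
    if event in REFLEXES:                # level 1: instantaneous
        return REFLEXES[event]
    if event in LEARNED_PROCEDURES:      # level 2: standby, slower internal resolution
        return deliberate(event)
    return retrieve(event)               # level 3: go outside for what is missing

print(respond("red_light"))             # -> 'brake'
print(respond("route_blocked"))         # -> 'work out a plan for: route_blocked'
print(respond("unfamiliar_symbol"))     # -> 'look up its meaning'
```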
Good architecture starts with modularity. The first step towards breaking up a monolith is to think about the separation of code and data based on feature functionality. This can be done within the monolith before physically separating them into a microservices environment, and it is generally good architectural practice to make the code base more manageable. Start with the data and pay close attention to how it is being accessed. Make sure each service owns and controls access to its own data, and that data access only happens through clearly defined API contracts. I've seen a lot of cases where people start by pulling out the code logic but still rely on calls into a shared database inside the monolith. This often leads to a distributed-monolith scenario that ends up being the worst of both worlds: having to manage the complexities of microservices without any of the benefits, such as being able to quickly and independently deploy a subset of features into production. Getting data separation right is a cornerstone of migrating from a monolithic architecture to microservices.
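A minimal sketch of the "each service owns its data" rule, assuming an invented OrderService backed by an in-memory SQLite store: other services go through the public methods, never through a shared database.

```python
import sqlite3

class OrderService:
    """Owns its database; other services may only use the public API."""

    def __init__(self) -> None:
        # Private store: nothing outside this class touches the connection.
        self._db = sqlite3.connect(":memory:")
        self._db.execute("CREATE TABLE orders (id INTEGER PRIMARY KEY, status TEXT)")

    # The API contract other services depend on:
    def create_order(self) -> int:
        cur = self._db.execute("INSERT INTO orders (status) VALUES ('new')")
        self._db.commit()
        return cur.lastrowid

    def get_status(self, order_id: int) -> str | None:
        row = self._db.execute(
            "SELECT status FROM orders WHERE id = ?", (order_id,)
        ).fetchone()
        return row[0] if row else None

# A shipping service asks through the contract, not via a shared database:
orders = OrderService()
oid = orders.create_order()
print(orders.get_status(oid))  # -> 'new'
```

If the shipping code queried the orders table directly, both services would be coupled to one schema again, which is exactly the distributed-monolith trap described above.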
By being abstracted from the problem solving and planning process, enterprise architects became unresponsive, he said, and “buried in the catacombs” of IT. Data Architecture needs to look at finding and putting the right mechanisms in place to support business outcomes, which could be everything from data systems and data warehouses to visualization tools. Data architects who see themselves as empowered to facilitate the practical implementation of the Business Strategy by offering whatever tools are needed will make decisions that create data value. “So now you see the data architect holding the keys to a lot of what’s happening in our organizations, because all roads lead through data.” Algmin thinks of data as energy, because stored data by itself can’t accomplish anything, and, like energy, it comes with significant risks. “Data only has value when you put it to use, and if you put it to use inappropriately, you can create a huge mess,” such as a privacy breach. As with energy, it is important to focus on how data is being used and to have the right controls in place.
In the new advisory, CISA warns that the attackers will also compromise email and social media accounts to conduct social engineering attacks. A person is much more likely to click on an email and download software if it comes from a trusted source. If the attacker has access to an employee's mailbox and can read previous messages, they can tailor their phishing email to be particularly appealing – and even make it look like a response to a previous message. Unlike “private sector” criminals, state-sponsored actors are more willing to use convoluted paths to get to their final targets, said Patricia Muoio, former chief of the NSA’s Trusted System Research Group, who is now general partner at SineWave Ventures. ... Private cybercriminals look for financial gain. They steal credit card information and health care data to sell on the black market, hijack machines to mine cryptocurrencies, and deploy ransomware. State-sponsored attackers are after different things. If they plan to use your company as an attack vector to go after another target, they'll want to compromise user accounts to get at their communications.
Organizations commonly view data-architecture transformations as “waterfall” projects. They map out every distinct phase—from building a data lake and data pipelines up to implementing data-consumption tools—and then tackle each only after completing the previous ones. In fact, in our latest global survey on data transformation, we found that nearly three-quarters of global banks are knee-deep in such an approach. However, organizations can realize results faster by taking a use-case approach. Here, leaders build and deploy a minimum viable product that delivers the specific data components required for each desired use case (Exhibit 2). They then make adjustments as needed based on user feedback. ... Legitimate business concerns over the impact any changes might have on traditional workloads can slow modernization efforts to a crawl. Companies often spend significant time comparing the risks, trade-offs, and business outputs of new and legacy technologies to prove out the new technology. However, we find that legacy solutions cannot match the business performance, cost savings, or reduced risks of modern technology, such as data lakes.