Jun 10, 2022

Everything You Need to Know About Enterprise Architecture vs. Project Management

Even though both disciplines have their own sets of specialized skills, they correlate in certain areas. Different teams often work on separate initiatives or parts of a landscape, only to discover in the middle of a project that each team needs to work on the same bit of the software or service ... Executing such a situation without any mishap requires coordination and a good system in place to foresee these dependencies, since it is hard to keep track of them all and an overlooked one can come back to bite you later. This is where enterprise architecture is needed. Enterprise architects are usually well aware of these relationships, and with their expertise in architecture models they can uncover these dependencies better; such dependencies are usually unknown to project or program managers. This is where enterprise architecture and project management correlate: enterprise architecture is about managing the coherence of your business, whereas project management is responsible for planning and managing, usually from a financial and resource perspective.
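To make the idea concrete, here is a minimal sketch (ours, not the article's) of how even a trivial architecture model can surface shared dependencies that individual project plans miss; all team and component names are hypothetical:

```python
# Minimal sketch: invert a simple architecture model to find components
# that more than one team's initiative touches. Names are hypothetical.
from collections import defaultdict

# Each team's initiative touches a set of components in the landscape.
touches = {
    "team-payments": {"billing-service", "customer-db"},
    "team-onboarding": {"signup-ui", "customer-db"},
    "team-reporting": {"warehouse", "billing-service"},
}

# Invert the model: which teams depend on each component?
dependents = defaultdict(set)
for team, components in touches.items():
    for component in components:
        dependents[component].add(team)

# Any component touched by more than one team is a coordination point
# that a project plan may not show but an architecture model makes explicit.
for component, teams in sorted(dependents.items()):
    if len(teams) > 1:
        print(f"{component}: shared by {', '.join(sorted(teams))}")
```

Even at this toy scale, the inverted view reveals that customer-db and billing-service are coordination points, which is exactly the kind of relationship an enterprise architect tracks across the whole landscape.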


A Minimum Viable Product Needs a Minimum Viable Architecture

In short, as the team learns more about what the product needs to be, they build only as much of the product, and make only as few architectural decisions, as is absolutely essential to meet the needs they know about now; the product continues to be an MVP, and the architecture continues to be an MVA supporting the MVP. The reason for both of these actions is simple: teams can spend a lot of time and effort implementing features and quality attribute requirements (QARs) in products, only to find that customers don’t share their opinion of their value; beliefs about what is valuable are merely assumptions until they are validated by customers. This is where hypotheses and experiments are useful. In simplified terms, a hypothesis is a proposed explanation for some observation that has not yet been proven (or disproven). In the context of requirements, it is a belief that doing something will lead to something else, such as delivering feature X leading to outcome Y. An experiment is a test designed to prove or reject that hypothesis.
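As an illustration, a hypothesis can be written down as data and checked by an experiment. The sketch below is ours, not the article's; the feature, metric, and target numbers are all hypothetical:

```python
# Minimal sketch: a "delivering X leads to outcome Y" belief recorded as
# data, then confirmed or rejected by a measured result.
from dataclasses import dataclass

@dataclass
class Hypothesis:
    action: str      # what we will deliver, e.g. "feature X"
    outcome: str     # what we believe it will cause, e.g. "outcome Y"
    metric: str      # how the outcome is measured
    target: float    # the result that would validate the belief

def run_experiment(h: Hypothesis, measured: float) -> bool:
    """An experiment either supports or rejects the hypothesis."""
    supported = measured >= h.target
    verdict = "supported" if supported else "rejected"
    print(f"'{h.action} -> {h.outcome}' {verdict}: "
          f"{h.metric} = {measured} (target {h.target})")
    return supported

# Hypothetical example: one-click checkout should lift conversion.
h = Hypothesis(action="one-click checkout", outcome="higher conversion",
               metric="checkout conversion rate", target=0.35)
run_experiment(h, measured=0.31)  # the belief stays an assumption until validated
```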


In Search of Coding Quality

The major difference between good- and poor-quality code is maintainability, states Kulbir Raina, Agile and DevOps leader at enterprise advisory firm Capgemini. Therefore, the best direct measurement indicator is operational expense (OPEX). “The lower the OPEX, the better the code,” he says. Other variables that can be used to differentiate code quality are scalability, readability, reusability, extensibility, refactorability, and simplicity. “Code quality can also be effectively measured by identifying technical debt (non-functional requirements) and defects (how well the code aligns to the laid specifications and functional requirements),” Raina says. “Software documentation and continuous testing provide other ways to continuously measure and improve the quality of code using faster feedback loops,” he adds. ... The impact development speed has on quality is a question that has been hotly debated for many years. “It really depends on the context in which your software is running,” Bruhmuller says, adding that his organization constantly deploys to production, relying on testing and monitoring to ensure quality.
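One narrow but concrete way to automate part of such measurement is static checking of the code itself. The sketch below is ours, not the article's method; it flags two crude readability and maintainability proxies in a Python source file, and the 50-line threshold is an assumption:

```python
# Minimal sketch: flag overlong functions and functions without docstrings
# as rough readability/maintainability proxies. Threshold is hypothetical.
import ast
import sys

def readability_report(path: str, max_lines: int = 50) -> None:
    """Print simple readability findings for one Python source file."""
    with open(path) as f:
        tree = ast.parse(f.read(), filename=path)
    for node in ast.walk(tree):
        if isinstance(node, (ast.FunctionDef, ast.AsyncFunctionDef)):
            length = node.end_lineno - node.lineno + 1
            if length > max_lines:
                print(f"{path}:{node.lineno} {node.name} is {length} lines long")
            if ast.get_docstring(node) is None:
                print(f"{path}:{node.lineno} {node.name} has no docstring")

if __name__ == "__main__":
    readability_report(sys.argv[1])
```

Checks like these are only proxies; as Raina notes, documentation and continuous testing with fast feedback loops are what keep quality measurable and improvable over time.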


A chip that can classify nearly 2 billion images per second

While current, consumer-grade image classification technology on a digital chip can perform billions of computations per second, making it fast enough for most applications, more sophisticated image-classification tasks, such as identifying moving objects, 3D object identification, or classification of microscopic cells in the body, are pushing the computational limits of even the most powerful technology. The current speed limit of these technologies is set by the clock-based schedule of computation steps in a computer processor, where computations occur one after another on a linear schedule. To address this limitation, Penn Engineers have created the first scalable chip that classifies and recognizes images almost instantaneously. Firooz Aflatouni, Associate Professor in Electrical and Systems Engineering, along with postdoctoral fellow Farshid Ashtiani and graduate student Alexander J. Geers, has removed the four main time-consuming culprits in the traditional computer chip: the conversion of optical to electrical signals, the need to convert the input data to binary format, a large memory module, and clock-based computations.


Scrum, Remote Teams, & Success: Five Ways to Have All Three

Agile teams have long made use of team agreements (or team working agreements). These set ground rules for the team, created by the team and enforced by the team. When our working environment shifts as much as it has recently, consider establishing some new team agreements specifically designed to address remote work. Examples? On-camera expectations, team core working hours (especially if you’re spread across multiple time zones), and setting aside focus time during which interruptions are kept to a minimum. ... One of the huge disadvantages of a remote team is the lack of personal connections that are made just grabbing a cup of coffee or standing around the water cooler. Remote teams need to be deliberate about counteracting isolation. Consider taking the first few minutes of a meeting to talk about anything non-work related. Set up a time for a team show-and-tell in which each team member can share something that matters to them from their home or their home-office background. Find excuses for the team to share anything that helps teammates get to know each other more as human beings, not just as co-workers.


Cisco introduces innovations driving new security cloud strategy

Ushering in the next generation of zero trust, Cisco is building solutions that enable true continuous trusted access by constantly verifying user and device identity, device posture, vulnerabilities, and indicators of compromise. These intelligent checks take place in the background, leaving the user to work without security getting in the way. Cisco is introducing less intrusive methods for risk-based authentication, including a patent-pending Wi-Fi fingerprint that serves as an effective location proxy without compromising user privacy. To evaluate risk after a user logs in, Cisco is building session trust analysis using the open Shared Signals and Events standards to share information between vendors. Cisco unveiled the first integration of this technology with a demo of Cisco Secure Access by Duo and Box. “The threat landscape today is evolving faster than ever before,” said Aaron Levie, CEO and Co-founder of Box. “We are excited to strengthen our relationship with Cisco and deliver customers a powerful new tool that enables them to act on changes in risk dynamically and in near real-time.”
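For a sense of what sharing signals between vendors looks like, here is an illustrative sketch of a Security Event Token (SET, RFC 8417) carrying a CAEP-style session-revoked signal, the kind of event the Shared Signals and Events framework transmits. All field values below are hypothetical, and the authoritative schema lives in the OpenID SSE/CAEP specifications:

```python
# Illustrative sketch only: the rough shape of a Security Event Token (SET)
# for a CAEP-style "session revoked" signal. Values are hypothetical; see
# the OpenID Shared Signals and Events specs for the authoritative schema.
import json
import time

set_payload = {
    "iss": "https://issuer.example.com",    # transmitter of the signal
    "jti": "example-set-id-0001",           # unique token identifier
    "iat": int(time.time()),                # issued-at timestamp
    "aud": "https://receiver.example.com",  # intended receiver
    "events": {
        # The event-type URI keys the event payload; a receiver reacting
        # to this signal could terminate the session in near real time.
        "https://schemas.openid.net/secevent/caep/event-type/session-revoked": {
            "subject": {"format": "email", "email": "user@example.com"},
            "event_timestamp": int(time.time()),
        }
    },
}

# In practice the payload is signed as a JWT and delivered over a push or
# poll endpoint; this sketch only shows the structure being exchanged.
print(json.dumps(set_payload, indent=2))
```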

Read more here ...
