October 23, 2021
Kannan Subbiah
FCA | CISA | CGEIT | CCISO | GRC Consulting | Independent Director | Enterprise & Solution Architecture | Former Sr. VP & CTO of MF Utilities | BU Soft Tech | itTrident
AI-powered test automation tools can generate tests automatically, with little to no code, so developers don’t have to worry about writing test scripts themselves. The AI has evolved enough to generate tests by learning an application’s flows, screens, and elements, requiring little to no human involvement. These tools run automated audits and checks frequently, capture feedback continuously, and analyze the input to identify errors in real time, leaving very few errors uncaught. That intelligence lets developers and other team members scale back their participation in test-creation activities and free up their time for more important and urgent tasks, eventually building a more productive system for the organization. AI is also proficient at handling big data with minimal human involvement. For DevOps, this means huge data sets can now be managed with minimal effort; since DevOps involves and affects three functions of an organization simultaneously, it also has a great deal of data to manage and maintain on an everyday basis.
We are continuing to see AI getting better but it needs to be applied in the right places. For example, teams can automate more of their processes and manual tasks, improving workflows and reducing busywork for agents. Costs are reduced, and customer demand is more easily met, which has proven to lead to happier, more productive support teams. “Solving customer problems and leaving a customer happy is what gets an agent excited about getting up every day and going into the job,” Wolverton says. “We strongly believe getting all of that repetitive work and processes automated so that they can focus on the rewarding work is what’s going to keep their motivation high and keep them in your organization.” It’s also about letting customers help themselves, she adds — more and more, customers want their answer fast, and waiting on the phone for the next available agent isn’t going to cut it. If you can get people their answer quickly and accurately through a search or through a bot, and then only escalate when the issue becomes more complex and a human is uniquely qualified to handle the issue, you’re going to have far more satisfied customers.
“I’d never considered cyber or even information technology as a career growing up. My interests always piqued around history and physics. I in fact failed first-year engineering for having written an essay on David Hume when asked to discuss induction in engineering. I have an undergraduate degree with a double major in history & philosophy of science and quantum physics. I continued down this path, working in the university’s quantum computing department on the development of quantum circuitry. My work centered on the development of superconducting diamond[s], looking to test and establish the reality of theoretical models predicting room-temperature superconductivity. I believed in making Marty McFly’s future a reality; I was on the path to making superconducting circuitry with the sci-fi application of a hoverboard — although I still don’t believe it’d be able to hover across water. “One day while taking adult skiing lessons with an instructor (now my fiancé), I realized my skillsets weren’t technically focused but operational. I’d spent my theses developing, constructing and rebuilding processes.
When you cut through it, making the move to agile means breaking the company down into self-sufficient, multidisciplinary, multidimensional teams. That is the very essence of agile. It’s not all about structure, though. There are many barriers that must be removed to let those teams really work: some you don’t quite realize are there, and others that don’t look like barriers today will turn into barriers as you go forward. So if you do move the organization to agile, be prepared to drive through a number of them, because you only get the true benefit that lies in agile if you’re prepared to knock those barriers down. I have talked to many organizations interested in the transition to agile, and in the early conversations the focus is understandably always on the organization’s structure. Having “seen the movie,” and having helped many companies make their own, if I had $100 to spend on agile I’d put only $10 to $15 against organizational structure. All the rest I would invest in agile ceremonies and processes, particularly the people processes.
Low-code/no-code platforms and capabilities are now offered by a wide range of providers, from startups filling niches in the technology up to the large enterprise products and services companies. We have previously covered the low-code/no-code options available from Microsoft, Google and Amazon. While there is plenty of crossover ability to connect to the other companies’ products and services, Amazon is the only one of the three that cannot tie into data hosted on the other two low-code/no-code platforms. Choosing a low-code/no-code platform will therefore likely be influenced by where an organization keeps its data. Just as with other services offered by these big three companies, it is much easier to work within a single ecosystem than to mix and match low-code/no-code tools across them. Once that decision is made, the work of building out those first low-code tools for an organization should be fairly straightforward. Low-code/no-code development intentionally targets knowledge workers who are familiar with the processes and workflows of their business unit, department or division but do not necessarily have any coding experience.
This release also brings more features to parallel query execution, in which PostgreSQL can devise query plans that leverage multiple CPUs to answer queries faster. Queries can now be executed in parallel for RETURN QUERY and REFRESH MATERIALIZED VIEW. Among the more prominent updates is pipeline mode for libpq, the interface developers use to connect their applications to the database. Previously, libpq worked synchronously, waiting for one query to complete execution before sending the next one to the database. Now developers can feed multiple queries into the pipeline and libpq will execute them in turn, feeding results back into the application; the application no longer has to wait for the first query to complete before issuing the next one. This was one of the updates about which Shahid commented, “Why did we not think about this earlier? This is such a no-brainer! But that’s how technology progresses.” Another potential no-brainer-in-hindsight is an upgrade to TOAST, which now allows LZ4 compression. TOAST is the system PostgreSQL uses to store values that are too large to fit in a regular table page.
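To make pipeline mode concrete, here is a minimal sketch against the libpq C API as described in the PostgreSQL 14 documentation. The connection string and the events table are assumptions made up for illustration; PQenterPipelineMode, PQsendQueryParams, PQpipelineSync, PQgetResult and PQexitPipelineMode are the documented calls involved.

/*
 * Minimal sketch of libpq pipeline mode (PostgreSQL 14+).
 * The connection string and the "events" table are illustrative
 * assumptions; compile with: cc pipeline.c -lpq
 */
#include <stdio.h>
#include <libpq-fe.h>

int main(void)
{
    PGconn *conn = PQconnectdb("dbname=test");   /* illustrative DSN */
    if (PQstatus(conn) != CONNECTION_OK) {
        fprintf(stderr, "connection failed: %s", PQerrorMessage(conn));
        return 1;
    }

    /* Switch the connection into pipeline mode: queries are queued and
     * sent without waiting for earlier results to come back. */
    if (!PQenterPipelineMode(conn)) {
        fprintf(stderr, "could not enter pipeline mode\n");
        PQfinish(conn);
        return 1;
    }

    /* Queue two independent statements back to back. */
    PQsendQueryParams(conn, "INSERT INTO events (id) VALUES (1)",
                      0, NULL, NULL, NULL, NULL, 0);
    PQsendQueryParams(conn, "INSERT INTO events (id) VALUES (2)",
                      0, NULL, NULL, NULL, NULL, 0);

    /* Mark the end of the batch and flush it to the server. */
    PQpipelineSync(conn);

    /* Results come back in the order the queries were queued. Each
     * query's results end with a NULL from PQgetResult; the batch ends
     * with a PGRES_PIPELINE_SYNC result for the sync point above. */
    for (int i = 0; i < 2; i++) {
        PGresult *res = PQgetResult(conn);
        if (PQresultStatus(res) != PGRES_COMMAND_OK)
            fprintf(stderr, "query %d failed: %s", i,
                    PQresultErrorMessage(res));
        PQclear(res);
        PQclear(PQgetResult(conn));   /* consume the per-query NULL */
    }
    PQclear(PQgetResult(conn));       /* PGRES_PIPELINE_SYNC marker */

    PQexitPipelineMode(conn);
    PQfinish(conn);
    return 0;
}

For the TOAST change, PostgreSQL 14 exposes LZ4 through column-level syntax such as CREATE TABLE docs (body text COMPRESSION lz4), or through the default_toast_compression setting, provided the server was built with LZ4 support.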