October 12, 2021
Kannan Subbiah
FCA | CISA | CGEIT | CCISO | GRC Consulting | Independent Director | Enterprise & Solution Architecture | Former Sr. VP & CTO of MF Utilities | BU Soft Tech | itTrident
Las Vegas began deploying edge computing technology in 2018 while working on smart traffic solutions. A key driver for analyzing data at the network edge came from working with autonomous vehicle companies that needed near real-time data, Sherwood says. “Edge computing allowed for data to be analyzed and provided to the recipient in a manner which provided the best in speed,” he says, adding that visualizing data in a real-time format “allows for decision-makers to make more informed decisions.” The addition of predictive analytics and artificial intelligence (AI) is helping with decisions that are improving traffic flows, “and in the near future will have dramatic impacts on reducing traffic congestion and improving transit times and outcomes,” Sherwood says. To bolster its data analytics operations overall and at the edge, the city government is developing a data analytics group as an offshoot of the IT department. The Office of Data and Analytics will drive how data is governed and used within the organization, Sherwood says. “We see lots of opportunities with many new technologies coming onto the market,” he says.
To learn how to test with databases, one must first ‘unlearn’ a few things, starting with the concepts of unit tests and integration tests. To put it bluntly, the modern definitions of these terms are so far removed from their original meanings that they are no longer useful for conversation. So, for the remainder of this article, we aren’t going to use either of them. The fundamental goal of testing is to produce information. A test should tell you something about the thing being tested that you may not have known before. The more information you get, the better. So, we are going to ignore anyone who says, “A test should only have one assertion” and replace it with, “A test should have as many assertions as needed to prove a fact”. The next problematic expression we need to deal with is, “All tests should be isolated”. This is often misunderstood to mean each test should be full of mocks so the function you’re testing is segregated from its dependencies. This is nonsense, as that function won’t be segregated from its dependencies in production.
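To make the point concrete, here is a minimal sketch of that testing style in Python: the test talks to a real (in-memory) SQLite database rather than a mock, and makes as many assertions as it takes to prove one fact. The `insert_user`/`fetch_user` helpers and the `users` schema are hypothetical, invented for illustration.

```python
import sqlite3

def insert_user(conn, name):
    """Insert a user and return the generated row id."""
    cur = conn.execute("INSERT INTO users (name) VALUES (?)", (name,))
    conn.commit()
    return cur.lastrowid

def fetch_user(conn, user_id):
    """Return the (id, name) tuple for a user, or None if absent."""
    return conn.execute(
        "SELECT id, name FROM users WHERE id = ?", (user_id,)
    ).fetchone()

def test_insert_and_fetch_user():
    # A real in-memory SQLite database instead of a mock: the functions
    # under test talk to the same kind of dependency they would in production.
    conn = sqlite3.connect(":memory:")
    conn.execute("CREATE TABLE users (id INTEGER PRIMARY KEY, name TEXT NOT NULL)")

    user_id = insert_user(conn, "Ada")
    row = fetch_user(conn, user_id)

    # As many assertions as needed to prove the fact: the row round-trips intact.
    assert row is not None
    assert row[0] == user_id
    assert row[1] == "Ada"

    # ...and that an absent id yields no row.
    assert fetch_user(conn, 999) is None
    conn.close()

test_insert_and_fetch_user()
```

Nothing here is “isolated” from the database, yet each run starts from a fresh in-memory instance, so the tests remain repeatable and independent of one another, which is what isolation originally meant.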
Is the Great Resignation a temporary trend or a long-term structural change? There’s no way to know, but my money is on the latter. Life-changing events change lives, whether or not we realize it as it is occurring. An individual crisis changes individual behavior; worldwide crises cause lasting social and cultural consequences. The pandemic completely upended the employee experience, and while many employers continued to monitor productivity, most didn’t devote nearly the same amount of effort to soliciting real-time, real-world feedback from remote workers about the challenges, struggles, and stresses they were facing. McKinsey found that “employees prioritize relational factors, whereas employers focus on transactional ones”. By neglecting to engage with remote employees, and by neither listening to nor addressing their issues and concerns, employers missed a once-in-a-lifetime opportunity to build trust within the organization and loyalty from workers. As the Great Resignation plays out and the workforce reshuffles, it will be interesting to see if employers and workers can engage, listen, and trust each other enough to find common ground.
Ransomware offers a low-investment, high-profit business model that’s irresistible to criminals. What began with single-PC attacks now includes crippling network-wide attacks using multiple extortion methods to target both your data and reputation, all enabled by human intelligence. Through this combination of real-time intelligence and broader criminal tactics, ransomware operators have driven their profits to unprecedented levels. This human-operated ransomware, also known as “big game ransomware,” involves criminals hunting for large targets that will provide a substantial payday through syndicates and affiliates. Ransomware is becoming a modular system like any other big business, including ransomware as a service (RaaS). With RaaS there isn’t a single individual behind a ransomware attack; rather, there are multiple groups. For example, one threat actor may develop malware that gives an attacker access to a certain category of victims, whereas a different actor may merely deploy that malware.
Simply put, the phishing “game” only has two moves: the scammers always play first, trying to trick you, and you always get to play second, after they’ve sent out their fake message. There’s little or no time limit for your move; you can ask for as much help as you like; you’ve probably got years of experience playing this game already; the crooks often make really silly mistakes that are easy to spot… and if you aren’t sure, you can simply ignore the message that the crooks just sent, which means you win anyway! How hard can it be to beat the criminals every time? Of course, as with many things in life, the moment you take it for granted that you will win every time is often the very same moment that you stop being careful, and that’s when accidents happen. Don’t forget that phishing scammers get to try over and over again. They can use email attachments one day, dodgy web links the next, rogue SMSes the day after that, and if none of those work, they can send you fraudulent messages on a social network. The crooks can try threatening you with closing your account, warning you of an invoice you need to pay, flattering you with false praise, offering you a new job, or announcing that you’ve won a fake prize.
As technology extends deeper into every aspect of business, the tip of the spear is often some device at the outer edge of the network, whether a connected industrial controller, a soil moisture sensor, a smartphone, or a security cam. This ballooning internet of things is already collecting petabytes of data, some of it processed for analysis and some of it immediately actionable. So an architectural problem arises: You don’t want to connect all those devices and stream all that data directly to some centralized cloud or company data center. The latency and data transfer costs are too high. That’s where edge computing comes in. It provides the “intermediating infrastructure and critical services between core datacenters and intelligent endpoints,” as the research firm IDC puts it. In other words, edge computing provides a vital layer of compute and storage physically close to IoT endpoints, so that control devices can respond with low latency – and edge analytics processing can reduce the amount of data that needs to be transferred to the core.
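As a rough illustration of that pattern, here is a minimal Python sketch of edge-side aggregation: raw sensor readings are summarized and filtered locally, so only a compact summary and any immediately actionable alerts travel to the core. The `Reading` type, threshold, and sensor names are hypothetical, invented for the example.

```python
import statistics
from dataclasses import dataclass

@dataclass
class Reading:
    sensor_id: str
    value: float

def summarize_at_edge(readings, alert_threshold):
    """Aggregate raw readings locally; forward only a compact summary.

    Instead of streaming every reading to a centralized cloud, the edge
    node produces one summary record plus the readings that demand
    immediate attention, cutting data transfer to the core.
    """
    values = [r.value for r in readings]
    summary = {
        "count": len(values),
        "mean": statistics.fmean(values),
        "max": max(values),
    }
    # Immediately actionable data is flagged at the edge, with low
    # latency, rather than waiting on a cloud round trip.
    alerts = [r for r in readings if r.value > alert_threshold]
    return summary, alerts

# Four raw soil-moisture readings in; one summary dict and one alert out.
readings = [Reading("soil-7", v) for v in (0.21, 0.24, 0.91, 0.23)]
summary, alerts = summarize_at_edge(readings, alert_threshold=0.8)
```

The design choice is the one IDC’s definition implies: the edge layer sits between endpoints and the core, responding locally where latency matters and reducing what must be shipped upstream.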