April 12, 2021
Kannan Subbiah
FCA | CISA | CGEIT | CCISO | GRC Consulting | Independent Director | Enterprise & Solution Architecture | Former Sr. VP & CTO of MF Utilities | BU Soft Tech | itTrident
Coding interviews are terrible. Can we make them better?
A typical coding interview involves presenting a candidate with a technical problem, which they have to solve in real time in front of the interviewing panel. While formats vary from one company to another, one common approach is whiteboard coding, whereby a candidate might be asked to provide a solution to a problem involving a binary tree. It was a binary tree task that drew the ire of Howell in his now-famous tweet. These exercises are a fairly typical part of technical interviews, designed to assess a candidate's ability to solve a programming problem and show their thinking 'out loud'. Still, most programmers say this isn't representative of anything they'd have to do in their day-to-day job, and that it's an outdated means of assessing candidates that doesn't reflect their skill level. "These little challenges don't show the greater skill sets, which for me are the ability to construct large programs," says Howell. "It's not about small algorithms. It's about the design of larger systems, and that's way more important." Howell also sees traditional coding interviews as reflective of an industry that focuses too much on building at speed. "It's partly because the software industry moves so fast," he says.
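For readers unfamiliar with the format, the sketch below shows the kind of binary-tree exercise alluded to above; inverting (mirroring) a tree is the task popularly associated with Howell's tweet, though the article does not spell out the exact problem, and the node structure and function names here are illustrative only.

```python
from dataclasses import dataclass
from typing import Optional

@dataclass
class Node:
    val: int
    left: Optional["Node"] = None
    right: Optional["Node"] = None

def invert(root: Optional[Node]) -> Optional[Node]:
    """Mirror the tree by recursively swapping each node's children."""
    if root is None:
        return None
    root.left, root.right = invert(root.right), invert(root.left)
    return root

# A three-node tree: after inverting, the children swap places.
tree = Node(1, Node(2), Node(3))
inverted = invert(tree)
print(inverted.left.val, inverted.right.val)  # 3 2
```

Exercises like this are quick to state and quick to grade, which is part of their appeal to interviewers, and also part of why critics such as Howell argue they reveal little about a candidate's ability to design larger systems.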
How Augmented Reality Strengthens Biotech Manufacturing
Factories where engineers and scientists use smart glasses to obtain virtual guidance, where operators work with remote vendors to detect equipment failures in real time, and where interactive training sessions are planned by directors on another continent are already here. “The barriers to adoption are decreasing as the AR industry becomes more robust,” notes Stracquatanio. Probably the biggest advantage of AR is that it lets people see the production process virtually, without needing to be on site. “It’s a game-changer for the industry. Individuals can have eyes and ears on site at a moment’s notice to address an emerging issue, or to host routine remote collaboration sessions,” Stracquatanio highlights. AR can also increase control over the manufacturing process. Pharma and biotech companies cannot afford mistakes during the production phase. A small oversight can lead to serious consequences, such as having to start from scratch, which is both expensive and time-consuming. A recent example is Johnson & Johnson’s manufacturing partner Emergent BioSolutions, whose workers erroneously mixed ingredients from two different Covid-19 vaccines, ruining around 15 million vaccine doses.
Fileless Malware, Endpoint Attacks on the Rise
Cybercriminals are increasingly leveraging fileless malware, cryptominers and encrypted attacks, targeting both users at remote locations and corporate assets behind the traditional network perimeter. These were among the findings of WatchGuard Technologies’ Internet Security Report for Q4 2020, which found that fileless malware and cryptominer attack rates grew by nearly 900% and 25%, respectively, while unique ransomware payloads plummeted by 48% in 2020 compared to 2019. The report also found that botnet malware targeting IoT devices and routers became a top strain, among them the Linux.Generic virus (also known as “The Moon”), malware that is part of a network of servers directly targeting IoT devices and consumer-grade network devices, such as routers, to exploit any open vulnerabilities. Total network attack detections grew by 5% in Q4, reaching their highest level in more than two years, while total unique network attack signatures showed steady growth as well, with a 4% increase compared with the third quarter of 2020. “We believe the increase in endpoint attacks between 2019 and 2020 is largely due to the widespread rise of remote work in response to the global pandemic,” explained Corey Nachreiner, WatchGuard CTO.
Could social media networks pave the way towards stronger authentication?
Passwords are still the most common form of user authentication, “protecting” accounts, devices and systems, but on their own they don’t provide strong security. Nor do they offer the best user experience. Many passwords don’t even meet the minimum criteria of being unique and complex. People reuse passwords across accounts because they simply can’t keep track of all the logins they have. They choose passwords that are easy to remember to ease the burden, but that makes them easy to guess too. In fact, our research shows that people reuse their passwords across an average of ten personal accounts, while ‘123456’ still topped the list of most common passwords in 2020. Even when users have chosen well, a unique and complex password can still fall victim to a modern phishing attack. After all, even an exemplary password can’t protect an account if the holder has been tricked into handing it over. From a user experience perspective, there is the stress and strain of choosing, each time, a unique, complex password that also meets the criteria demanded by the platform or service provider.
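The weaknesses described above can be checked programmatically. As an illustration (not something proposed in the article), here is a minimal Python sketch that tests whether a candidate password has already appeared in known breach data via the public Have I Been Pwned range API; the function name is hypothetical, and a production system would add error handling and rate-limit awareness.

```python
import hashlib
import urllib.request

def times_pwned(password: str) -> int:
    """Return how often a password appears in known breach corpora,
    using the Have I Been Pwned range API. Only the first five
    characters of the SHA-1 hash ever leave the machine (k-anonymity)."""
    digest = hashlib.sha1(password.encode("utf-8")).hexdigest().upper()
    prefix, suffix = digest[:5], digest[5:]
    url = f"https://api.pwnedpasswords.com/range/{prefix}"
    with urllib.request.urlopen(url) as resp:
        for line in resp.read().decode().splitlines():
            candidate, _, count = line.partition(":")
            if candidate == suffix:
                return int(count)
    return 0

print(times_pwned("123456"))  # '123456' shows up tens of millions of times
```

Because only a short hash prefix is transmitted, the check itself does not create a new place for the password to leak, which is the main design point of the k-anonymity scheme.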
Nation-state cyber attacks double in three years
“Cyber crime economies are shaping the character of nation-state conflicts,” said McGuire. “There is also a ‘second generation’ of cyber weaponry in development that draws upon enhanced capabilities in computing power, AI [artificial intelligence] and cyber/physical integrations. One such example is ‘Boomerang’ malware, which is ‘captured’ malware that can be turned inward to operate against its owners. “Nation states are also developing weaponised chatbots to deliver more persuasive phishing messages, react to new events and send messages via social media sites. In the future, we can also expect to see the use of deepfakes on the digital battlefield, drone swarms capable of disrupting communications or engaging in surveillance, and quantum computing devices with the ability to break almost any encrypted system.” To ease rising tensions and prevent nation states from being drawn into more hostile cyber attacks, 70% of the expert panel said they thought some kind of international treaty would ultimately be necessary, though this is by no means a new idea. However, just 15% of them thought a cyber convention would be agreed this decade, 37% said it was more likely to come in the 2030s, and 30% said it would probably never happen.
Quantum computer based on shuttling ions is built by Honeywell
Trapped-ion qubits were used to implement the first quantum logic gates in 1995, and the proposal for a quantum charge-coupled device (QCCD) – a type of quantum computer whose operations are controlled by shuttling the ions around – was first made in 2002 by researchers led by David Wineland of the US National Institute of Standards and Technology, who went on to win the 2012 Nobel Prize for Physics for his work. Quantum gates have subsequently been demonstrated in multiple platforms, from Rydberg atoms to defects in diamond. The quantum computing technology first adopted by the IT giants, however, was solid-state qubits. In these, the qubits are superconducting circuits, which can be mounted directly onto a chip. These rapidly surpassed the benchmarks set by trapped ions, and are used in the record-breaking machines from IBM and Google. “Working with trapped ions, I would be asked by people, ‘Why aren’t you working with superconducting qubits? Isn’t that race pretty much already settled?’,” says Winfried Hensinger of the UK’s University of Sussex. Recently, however, the progress made using superconducting circuits appears to be slowing as quantum computers integrate more and more qubits.
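To make the idea of a quantum logic gate concrete, independently of whether the qubits are trapped ions or superconducting circuits, here is a minimal NumPy sketch (an illustration, not taken from the article) that applies a Hadamard gate and a CNOT gate to two qubits, producing an entangled Bell state – the sort of elementary two-qubit operation the article dates to 1995.

```python
import numpy as np

# Single-qubit basis states |0> and |1>
ket0 = np.array([1, 0], dtype=complex)
ket1 = np.array([0, 1], dtype=complex)

# Hadamard gate: puts a qubit into an equal superposition of |0> and |1>
H = np.array([[1, 1], [1, -1]], dtype=complex) / np.sqrt(2)

# CNOT gate: flips the target qubit when the control qubit is |1>
CNOT = np.array([[1, 0, 0, 0],
                 [0, 1, 0, 0],
                 [0, 0, 0, 1],
                 [0, 0, 1, 0]], dtype=complex)

# Start with both qubits in |0>, apply H to the first, then entangle with CNOT
state = np.kron(H @ ket0, ket0)   # product state (|00> + |10>) / sqrt(2)
state = CNOT @ state              # Bell state (|00> + |11>) / sqrt(2)

print(np.round(state, 3))  # [0.707, 0, 0, 0.707] -> maximally entangled pair
```

On real hardware the same abstract gates are realised very differently – laser pulses acting on shuttled ions in a QCCD, microwave pulses on superconducting circuits – which is precisely the platform competition the article describes.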
Read more here ...