July 25, 2021

Discord CDN and API Abuses Drive Wave of Malware Detections

Discord’s CDN is being abused to host malware, while its API is being leveraged to exfiltrate stolen data and facilitate hacker command-and-control channels, Sophos added. Because Discord is heavily trafficked by younger gamers playing Fortnite, Minecraft and Roblox, a lot of the malware floating around amounts to little more than pranking, such as the use of code to crash an opponent’s game, Sophos explained. But the spike in info stealers and remote access trojans is more alarming, it added. “But the greatest percentage of the malware we found have a focus on credential and personal information theft, a wide variety of stealer malware as well as more versatile RATs,” the report said. “The threat actors behind these operations employed social engineering to spread credential-stealing malware, then use the victims’ harvested Discord credentials to target additional Discord users.” The team also found outdated malware, including spyware and fake-app info stealers, being hosted on the Discord CDN.
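One practical consequence of this abuse pattern is that malicious payloads arrive as ordinary cdn.discordapp.com attachment URLs, which defenders can flag in logs or documents with a simple pattern match. The sketch below is illustrative only and is not from the Sophos report; the regex and the command-line file handling are assumptions:

```python
import re
import sys

# Matches Discord CDN attachment URLs of the form
# https://cdn.discordapp.com/attachments/<channel>/<message>/<filename>
DISCORD_CDN_RE = re.compile(
    r"https?://cdn\.discordapp\.com/attachments/\d+/\d+/\S+",
    re.IGNORECASE,
)

def find_discord_cdn_links(text: str) -> list[str]:
    """Return all Discord CDN attachment URLs found in the given text."""
    return DISCORD_CDN_RE.findall(text)

if __name__ == "__main__":
    # Scan a log or document (path passed on the command line) for CDN links.
    with open(sys.argv[1], encoding="utf-8", errors="replace") as fh:
        for url in find_discord_cdn_links(fh.read()):
            print(url)
```

A hit is not proof of malice, since the same URLs carry legitimate attachments; it is merely a starting point for triage.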


The sixth sense of a successful leader

A Leader endowed with the Sixth Sense has to possess a highly developed awareness of what needs to be done, how it needs to be done and when it needs to be done, while simultaneously anticipating the needs of the people involved in the task and continuously visualising the anticipated outcome. To employ the sixth sense successfully, the Leader needs to work on the plane of the Higher Intellect. This does not preclude the Leader from seeking material gains, for that is the ultimate aim of any business. However, the Leader needs to weigh the anticipated gains against the likely social and environmental degradation. Similarly, the Leader needs to be steeped in definable values and ethics, which in turn act as the Sixth Sense Pillar. This Pillar will be the fulcrum enabling the Leader to leverage gains beyond cognitive reasoning, and to attain the status of a Karma Yogi. The Sixth Sense Leader, a true Karma Yogi, empowers the self to develop the vision to create rather than await opportunity, by tapping a dimensional awareness of the future, and the capability to analyse and accept risk, through the capacity to subtly induce change in the energy fields impacting the mission.


Why Data Management Needs An Aggregator Model

As enterprises shift to a hybrid multicloud architecture, they can no longer manage data within each storage silo, search for data within each storage silo and pay a heavy cost to move data from one silo to another. As GigaOm analyst Enrico Signoretti pointed out: "The trend is clear: The future of IT infrastructures is hybrid ... [and] it requires a different and modern approach to data management." Another key reason an aggregator model for data management is needed is that customers want to extract value from their data. To make unstructured data analyzable and searchable, vital information is stored in what is called "metadata": information about the data itself. Metadata is like an electronic fingerprint of the data. For example, a photo on your phone might record when and where it was taken, as well as who was in it. Metadata is very valuable, as it is used to search, find and index different types of unstructured data. And since storage business models are built on owning the data, storage vendors tend to move only some blocks to the cloud rather than move all of the data.
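To make the photo example concrete, here is a minimal sketch of reading a photo's embedded metadata. It assumes the Pillow imaging library and a local file named photo.jpg, neither of which comes from the article:

```python
from PIL import Image, ExifTags  # pip install Pillow

# Open a local photo; "photo.jpg" is a placeholder path.
img = Image.open("photo.jpg")

# getexif() returns a mapping of numeric EXIF tag IDs to values.
exif = img.getexif()

for tag_id, value in exif.items():
    # Translate numeric tag IDs into human-readable names.
    name = ExifTags.TAGS.get(tag_id, tag_id)
    print(f"{name}: {value}")  # e.g. DateTime, Model, GPSInfo
```

Fields like these, extracted across millions of files, are exactly what an aggregator-style data management layer indexes so data can be found without touching the underlying storage silos.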


Next-Gen Data Pipes With Spark, Kafka and k8s

In Lambda Architecture, there are two main layers – Batch and Speed. The first transforms data in scheduled batches, whereas the second is responsible for near real-time data processing. The batch layer is typically used when the source system sends the data in batches, when access to the entire dataset is needed for the required processing, or when the dataset is too large to be handled as a stream. By contrast, stream processing is needed for small packets of high-velocity data, where the packets are either mutually independent or packets in close vicinity form a context. Naturally, both types of data processing are computation-intensive, though the memory requirement of the batch layer is higher than that of the speed layer. Architects look for solution patterns that are elastic, fault-tolerant, performant, cost-effective, flexible, and, last but not least – distributed. ... Lambda architecture is complex because it has two separate components for handling batch and stream processing of data. The complexity can be reduced if one single technology component can serve both purposes.
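Spark Structured Streaming is one such unifying component: the same DataFrame transformation can back both layers, since Spark exposes batch and streaming reads through a near-identical API. In this sketch the broker address, topic name, and output paths are placeholder assumptions, so treat it as an illustration rather than a production pipeline:

```python
from pyspark.sql import DataFrame, SparkSession
from pyspark.sql.functions import col, window

spark = SparkSession.builder.appName("unified-pipeline").getOrCreate()

def transform(events: DataFrame) -> DataFrame:
    # One transformation shared by the batch and speed layers:
    # count events per message body over 5-minute windows.
    return (events
            .selectExpr("CAST(value AS STRING) AS body", "timestamp")
            .groupBy(window(col("timestamp"), "5 minutes"), col("body"))
            .count())

kafka_opts = {
    "kafka.bootstrap.servers": "broker:9092",  # placeholder broker
    "subscribe": "events",                     # placeholder topic
}

# Speed layer: continuous micro-batch processing of the Kafka topic.
stream = transform(spark.readStream.format("kafka").options(**kafka_opts).load())
(stream.writeStream
       .outputMode("complete")
       .format("console")
       .option("checkpointLocation", "/tmp/ckpt")  # placeholder path
       .start())

# Batch layer: the same logic over the topic's retained history.
batch = transform(spark.read.format("kafka").options(**kafka_opts).load())
batch.write.mode("overwrite").parquet("/tmp/batch-out")  # placeholder path
```

Because transform() is defined once and reused, the two code paths cannot drift apart, which is precisely the maintenance burden the dual-component Lambda design creates.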


Moving fast and breaking things cost us our privacy and security

Tokenized identification puts the power in the user’s hands. This is crucial not just for workplace access and identity, but for a host of other, even more important reasons. Tokenized digital IDs are encrypted and can only be used once, making it nearly impossible for anyone to view the data included in the digital ID should the system be breached. It’s like Signal, but for your digital IDs. As even more sophisticated technologies roll out, more personal data will be produced (and that means more data is vulnerable). It’s not just our driver’s licenses, credit cards or Social Security numbers we must worry about. Our biometrics and personal health-related data, like our medical records, are increasingly online and accessed for verification purposes. Encrypted digital IDs are incredibly important because of the prevalence of hacking and identity theft. Without tokenized digital IDs, we are all vulnerable. We saw what happened with the Colonial Pipeline ransomware attack recently. It crippled a large portion of the U.S. pipeline system for weeks, showing that critical parts of our infrastructure are extremely vulnerable to breaches.
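The article does not specify a scheme, but the single-use property can be sketched with a server-side token store: issue an opaque random token standing in for an ID attribute, and invalidate it the first time it is redeemed. Everything here (the class, the store, the attribute names) is an illustrative assumption, not a description of any real digital-ID product:

```python
import secrets

class TokenVault:
    """Toy single-use token issuer: tokens are opaque, random,
    and invalidated the first time they are redeemed."""

    def __init__(self) -> None:
        self._tokens: dict[str, dict] = {}  # token -> ID attributes

    def issue(self, id_attributes: dict) -> str:
        # An opaque, URL-safe random token; the raw ID data never
        # leaves the vault, so intercepting the token reveals nothing.
        token = secrets.token_urlsafe(32)
        self._tokens[token] = id_attributes
        return token

    def redeem(self, token: str) -> dict | None:
        # pop() deletes on read, enforcing the single-use property.
        return self._tokens.pop(token, None)

vault = TokenVault()
t = vault.issue({"over_21": True})  # e.g. prove age without showing a license
print(vault.redeem(t))  # {'over_21': True}
print(vault.redeem(t))  # None -- the token cannot be replayed
```

The design choice worth noting is minimal disclosure: the verifier learns only the single attribute it asked about, and a stolen token is worthless after first use.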


Agile at 20: The Failed Rebellion

In some ways, Agile was a grassroots labor movement. It certainly started with the practitioners on the ground and got pushed upwards into management. How did this ever succeed? It’s partially due to developers growing in number and value to their businesses, gaining clout. But the biggest factor, in my view, is that the traditional waterfall approach simply wasn’t working. As software got more complicated and the pace of business accelerated and the sophistication of users rose, trying to plan everything up front became impossible. Embracing iterative development was logical, if a bit scary for managers used to planning everything. I remember meetings in the mid-2000s where you could tell management wasn’t really buying it, but they were out of ideas. What the hell, let’s try this crazy idea the engineers keep talking about. We’re not hitting deadlines now. How much worse can it get? Then to their surprise, it started working, kind of, in fits and starts. Teams would thrash for a while and then slowly gain their legs, discovering what patterns worked for that individual team, picking up momentum.

Read more here ...
