August 16, 2024

W3C issues new technical draft for verifiable credentials standards

Part of the promise of the W3C standards is the ability to share only the data that’s necessary for completing a secure digital transaction, Goodwin explained, noting that DHS’s Privacy Office is charged with “embedding and enforcing privacy protections and transparency in all DHS activities.” DHS was brought into the process to review the W3C Verifiable Credentials Data Model and Decentralized Identifiers framework and to advise on potential issues. DHS S&T said in a statement last month that “part of the promise of the W3C standards is the ability to share only the data required for a transaction,” which it sees as “an important step towards putting privacy back in the hands of the people.” “Beyond ensuring global interoperability, standards developed by the W3C undergo wide reviews that ensure that they incorporate security, privacy, accessibility, and internationalization,” said DHS Silicon Valley Innovation Program Managing Director Melissa Oh. “By helping implement these standards in our digital credentialing efforts, S&T, through SVIP, is helping to ensure that the technologies we use make a difference for people in how they secure their digital transactions and protect their privacy.”
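As a rough illustration of the selective-disclosure idea the W3C Verifiable Credentials Data Model enables, the Python sketch below builds a minimal credential-shaped document and then presents only the claims a verifier actually needs. The issuer DID, holder DID, credential type, and claim values are hypothetical placeholders, not DHS or SVIP credentials, and a real credential would also carry a cryptographic proof added by the issuer.

```python
import json
from datetime import datetime, timezone

# Minimal credential-shaped document loosely following the W3C VC Data Model.
# Issuer DID, holder DID, type, and claim values are hypothetical placeholders.
credential = {
    "@context": ["https://www.w3.org/ns/credentials/v2"],
    "type": ["VerifiableCredential", "ExampleResidencyCredential"],
    "issuer": "did:example:issuer-agency",
    "validFrom": datetime.now(timezone.utc).isoformat(),
    "credentialSubject": {
        "id": "did:example:holder-1234",
        "birthCountry": "Canada",
        "residentSince": "2015-01-01",
    },
    # A real credential would also include a proof (e.g. a signature) here.
}

# Selective disclosure: the holder presents only the claim the verifier needs,
# not the entire credential.
presented_claims = {
    "id": credential["credentialSubject"]["id"],
    "residentSince": credential["credentialSubject"]["residentSince"],
}
print(json.dumps(presented_claims, indent=2))
```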


Managing Technical Debt in the Midst of Modernization

Rather than delivering a product and then worrying about technical debt, it is more prudent to measure and address it continuously from the early stages of a project, including requirements and design, not just the coding phase. Project teams should be incentivized to identify improvement areas as part of their day-to-day work and implement fixes whenever possible. Early detection and remediation can help streamline IT operations, improve efficiency, and optimize cost. ... Inadequate technical knowledge or limited experience with current skills is itself a source of technical debt. Enterprises must invest in and prioritize continuous learning to keep their talent pool up to date with the latest technologies. A skill-gap analysis helps forecast the skills needed for future initiatives. Teams should be encouraged to upskill in AI, cloud, and other emerging technologies, as well as modern design and security standards. This will help enterprises address the technical-debt skill gap effectively. Enterprises can also employ a hub-and-spoke model, in which a central team offers automation and expert guidance while each development team maintains its own applications, systems, and related technical debt.


Generative AI Adoption: What’s Fueling the Growth?

The banking, financial services, and insurance (BFSI) sector is another area where generative AI is making a significant impact. In this industry, generative AI enhances customer service, risk management, fraud detection, and regulatory compliance. By automating routine tasks and providing more accurate and timely insights, generative AI helps financial institutions improve efficiency and deliver better services to their customers. For instance, generative AI can be used to create personalized customer experiences by analyzing customer data and predicting their needs. This capability allows banks to offer tailored products and services, improving customer satisfaction and loyalty. ... The life sciences sector stands to benefit enormously from the adoption of generative AI. In this industry, generative AI is used to accelerate drug discovery, facilitate personalized medicine, ensure quality management, and aid in regulatory compliance. By automating and optimizing various processes, generative AI helps life sciences companies bring new treatments to market more quickly and efficiently. For instance, generative AI can draw on vast amounts of biological data to identify promising drug candidates far faster than conventional methods.


Overcoming Software Testing ‘Alert Fatigue’

Before “shift left” became the norm, developers would write code that quality assurance testing teams would then comb through to identify the initial bugs in the product. Developers were then only tasked with reviewing the proofed end product to ensure it functioned as they initially envisioned. But now the testing and quality control onus has been placed on developers earlier and earlier. One outcome of this dynamic is that developers are becoming increasingly numb to the high volume of bugs they encounter in the process, and as a result they are pushing bad code to production. ... Organizations must ensure that vital testing phases are robust and well-defined to mitigate these adverse outcomes. These phases should include comprehensive automated testing, continuous integration (CI) practices, and rigorous manual testing by dedicated QA teams. Developers should focus on unit and integration tests, while QA teams handle system, regression, acceptance, and exploratory testing. This division of labor enables developers to concentrate on writing and refining code while QA specialists ensure the software meets the highest quality standards before production.
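As a minimal sketch of the developer-owned testing layer described above, the Python unit test below exercises a single function in isolation; the function and its expected behavior are hypothetical, and system, regression, acceptance, and exploratory testing would remain with QA. Tests like this typically run automatically in CI on every commit, so defects surface before the change reaches later phases.

```python
import unittest


def apply_discount(price: float, percent: float) -> float:
    """Return the price after applying a percentage discount (hypothetical example)."""
    if not 0 <= percent <= 100:
        raise ValueError("percent must be between 0 and 100")
    return round(price * (1 - percent / 100), 2)


class ApplyDiscountTests(unittest.TestCase):
    # Developer-owned unit tests: small, fast checks of one unit of code.
    def test_typical_discount(self):
        self.assertEqual(apply_discount(100.0, 15), 85.0)

    def test_rejects_invalid_percent(self):
        with self.assertRaises(ValueError):
            apply_discount(100.0, 150)


if __name__ == "__main__":
    unittest.main()
```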


SSD capacities set to surge as industry eyes 128 TB drives

Maximum SSD capacity is expected to double from its current 61.44 TB by mid-2025, giving us 122 TB and even 128 TB drives, with the prospect of exabyte-capacity racks. Five suppliers have recently discussed and/or demonstrated prototypes of 100-plus TB SSDs. ... Systems with enclosures full of high-capacity SSDs will need to cope with drive failure, and that means RAID or erasure coding schemes. SSD rebuilds take less time than HDD rebuilds, but higher-capacity SSDs take longer. For example, rebuilding a 61.44 TB Solidigm D5-P5336 drive, which has a maximum sequential write bandwidth of 3 GBps, would take approximately 5.7 hours. A 128 TB drive would take 11.85 hours at the same 3 GBps write rate. These are not insubstantial periods. Kioxia has devised an SSD RAID parity compute offload scheme with a parity compute block in the SSD controller and direct memory access to neighboring SSDs to fetch the rebuild data. This avoids involving the host server’s processor in RAID parity compute IO and could accelerate SSD rebuild speed.
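For readers who want to reproduce those rebuild estimates, the short Python sketch below performs the same arithmetic: capacity divided by sustained sequential write bandwidth. It assumes decimal units (1 TB = 1,000 GB) and ignores RAID or erasure-coding overhead and host load, so it is a lower bound rather than a field measurement.

```python
def rebuild_hours(capacity_tb: float, write_gbps: float) -> float:
    """Estimate full-drive rewrite time: capacity / sequential write bandwidth."""
    seconds = (capacity_tb * 1_000) / write_gbps  # TB -> GB, then divide by GB/s
    return seconds / 3600


for capacity in (61.44, 122.0, 128.0):
    print(f"{capacity:7.2f} TB at 3 GBps ~ {rebuild_hours(capacity, 3):5.2f} hours")
# 61.44 TB ~ 5.69 hours and 128 TB ~ 11.85 hours, matching the article's figures.
```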


Putting Individuals Back In Charge Of Their Own Identities

Digital identity comprises many signals to ensure it can accurately reflect the real identity of the relevant individual. It includes biometric data, ID data, phone data, and much more. In shareable IDs, these unique features are captured through a combination of AI and biometrics, which provide robust protection against forgery and replication, and so give high assurance that a person is who they say they are. Importantly, these technologies provide an easy and seamless alternative to other verification processes. For most people, visiting a bank branch to prove their identity with paper documents is no longer convenient, while knowledge-based authentication, like entering your mother’s maiden name, is not viable because data breaches make this information readily available for sale to nefarious actors. It’s no wonder that 76% of consumers find biometrics more convenient, while 80% find them more secure than other options. ... A shareable identity is a user-controlled identity credential that can be stored on a device and used remotely. Individuals can then simply re-use the same digital ID to gain access to services without waiting in line, offering time-saving convenience for all.

