Today's Tech Digest - Dec 09, 2019

The PC was supposed to die a decade ago. Instead, this happened

Not all that long ago, tech pundits were convinced that by 2020 the personal computer as we know it would be extinct. You can even mark the date and time of the PC's death: January 27, 2010, at 10:00 A.M. Pacific Time, when Steve Jobs stepped onto a San Francisco stage to unveil the iPad. The precise moment was documented by noted Big Thinker Nicholas Carr in The New Republic with this memorable headline: "The PC Officially Died Today." ... And so, here we are, a full decade after the PC's untimely death, and the industry is still selling more than a quarter-billion-with-a-B personal computers every year. Which is pretty good for an industry that has been living on borrowed time for ten years. Maybe the reason the PC industry hasn't suffered a mass extinction event is that it adapted, and that the competing platforms weren't able to take over every PC-centric task. So what's different as we approach 2020? To get a proper before-and-after picture, I climbed into the Wayback Machine and traveled back to 2010.


Netflix open sources data science management tool

Netflix has open sourced Metaflow, an internally developed tool for building and managing Python-based data science projects. Metaflow addresses the entire data science workflow, from prototype to model deployment, and provides built-in integrations to AWS cloud services. Machine learning and data science projects need mechanisms to track the development of the code, data, and models. Doing all of that manually is error-prone, and tools for source code management, like Git, aren’t well-suited to all of these tasks. Metaflow provides Python APIs to the entire stack of technologies in a data science workflow, from access to the data through compute resources, versioning, model training, scheduling, and model deployment. ... Metaflow does not favor any particular machine learning framework or data science library. Metaflow projects are just Python code, with each step of a project’s data flow represented by common Python programming idioms. Each time a Metaflow project runs, the data it generates is given a unique ID. This lets you access every run—and every step of that run—by referring to its ID or user-assigned metadata.
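The run-versioning idea described above can be sketched in plain Python. This is an illustrative sketch only, not Metaflow's actual API (Metaflow expresses steps as decorated methods on a flow class): every run gets a unique ID, and each step's output is stored under (run ID, step name) so any past run can be inspected later. All names here are hypothetical.

```python
import itertools

# Sketch (not Metaflow's API): version every step's output by run ID.
_run_ids = itertools.count(1)

class RunStore:
    def __init__(self):
        self.artifacts = {}  # (run_id, step_name) -> artifact

    def run(self, steps, data):
        """Execute a list of (name, fn) steps, recording each output."""
        run_id = next(_run_ids)
        for name, fn in steps:
            data = fn(data)
            self.artifacts[(run_id, name)] = data  # versioned per run
        return run_id

store = RunStore()
steps = [("clean", lambda d: [x * 2 for x in d]),
         ("train", lambda d: sum(d))]
first = store.run(steps, [1, 2, 3])
second = store.run(steps, [4, 5])
print(store.artifacts[(first, "train")])   # 12
print(store.artifacts[(second, "train")])  # 18
```

Because outputs are keyed by run ID rather than overwritten in place, two runs of the same pipeline never clobber each other — the property that lets Metaflow users refer back to any run or step by its ID.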


AppSec in the Age of DevSecOps

Application security as a practice is dynamic. No two applications are the same, even if they belong to the same market domain and presumably operate on identical business use cases. Some of the many factors behind this variance include the technology stack of choice, the programming style of the developers, the culture of the product engineering team, business priorities, and the platforms used. This consequently results in a wide spectrum of unique customer needs. Take penetration testing as an example. This is a practice area that is presumably well-entrenched, both as a need and as an offering, in the application security market. However, in today's age, even a singular requirement such as this could make or break an initial conversation. While for one prospect the need could be to conduct the test from a compliance-only perspective, another's need could stem from a proactive software security initiative. Many others have internal assessment teams and often look outside for a third-party view.


Data centers in 2020: Automation, cheaper memory

Storage-class memory is memory that goes in a DRAM slot and can function like DRAM but can also function like an SSD. It has near-DRAM speed but offers persistent storage as well, effectively turning it into a cache for the SSD. Intel and Micron were working on SCM together but parted company. Intel released its SCM product, Optane, in May, and Micron came to market in October with QuantX. South Korean memory giant SK Hynix is also working on an SCM product that differs from the 3D XPoint technology Intel and Micron use. ... Remember when everyone was looking forward to shutting down their data centers entirely and moving to the cloud? So much for that idea. IDC's latest CloudPulse survey suggests that 85% of enterprises plan to move workloads from public to private environments over the next year. And a recent survey by Nutanix found that 73% of respondents are moving some applications off the public cloud and back on-prem, with security cited as the primary reason. And since it's doubtful security will ever be good enough for some companies and some data, the mad rush to the cloud will likely slow as people become more picky about what they put in the cloud and what they keep behind their firewalls.
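The "cache for the SSD" role can be sketched as a simple read-through tier: hot blocks are served from a small fast store (standing in for SCM) and misses fall through to a larger, slower one (the SSD), with the least recently used block evicted when the fast tier fills. This is a toy illustration of the caching pattern, not how any vendor's product actually works; all names are hypothetical.

```python
from collections import OrderedDict

class TieredStore:
    """Toy read-through cache: a small fast tier (standing in for
    storage-class memory) in front of a large slow tier (the SSD)."""
    def __init__(self, fast_capacity, slow):
        self.fast = OrderedDict()          # LRU-ordered hot blocks
        self.fast_capacity = fast_capacity
        self.slow = slow                   # backing store contents
        self.hits = self.misses = 0

    def read(self, key):
        if key in self.fast:               # served at near-DRAM speed
            self.fast.move_to_end(key)
            self.hits += 1
            return self.fast[key]
        self.misses += 1                   # fall through to the slow tier
        value = self.slow[key]
        self.fast[key] = value             # promote the block
        if len(self.fast) > self.fast_capacity:
            self.fast.popitem(last=False)  # evict least recently used
        return value

ssd = {f"block{i}": i for i in range(100)}
store = TieredStore(fast_capacity=2, slow=ssd)
store.read("block1"); store.read("block1"); store.read("block2")
print(store.hits, store.misses)  # 1 2
```

The second read of "block1" hits the fast tier, which is the whole point: frequently accessed data stops paying the slow tier's latency.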


Batch Goes Out the Window: The Dawn of Data Orchestration

Add to the mix the whole world of streaming data. By open-sourcing Kafka to the Apache Foundation, LinkedIn let loose the gushing waters of data streams. These high-speed freeways of data largely circumvent traditional data management tooling, which can't stand the pressure. Doing the math, we see a vastly different scenario for today's data, as compared to only a few years ago. Companies have gone from relying on five to 10 source systems for an enterprise data warehouse to now embracing dozens or more systems across various analytical platforms. Meanwhile, the appetite for insights is greater than ever, as is the desire to dynamically link analytical systems with operational ones. The end result is a tremendous amount of energy focused on the need for ... meaningful data orchestration. For performance, governance, quality and a vast array of business needs, data orchestration is taking shape right now out of sheer necessity. The old highways for data have become too clogged and cannot support the necessary traffic. A whole new system is required. To wit, there are several software companies focused intently on solving this big problem. Here are just a few of the innovative firms that are shaping the data orchestration space.
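At its core, the orchestration these tools provide means running interdependent data tasks in dependency order rather than as one monolithic nightly batch. A minimal sketch of that idea, using Python's standard-library topological sorter (illustrative only — the task names are invented and no vendor's API is shown):

```python
from graphlib import TopologicalSorter

# Each task lists the upstream tasks it depends on.
pipeline = {
    "ingest":    [],
    "clean":     ["ingest"],
    "warehouse": ["clean"],
    "ml_model":  ["clean"],
    "dashboard": ["warehouse"],
}

executed = []
for task in TopologicalSorter(pipeline).static_order():
    executed.append(task)  # a real orchestrator would launch the job here
print(executed)
```

An orchestrator builds on exactly this ordering guarantee, then layers on scheduling, retries, and lineage tracking — which is where the products below differentiate themselves.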
