October 07, 2023

No Need to Have a 'FOBO' for AI

It is a well-known fact that before AI takes your job, someone using AI will take it. To stay relevant in the job market, it is essential to adopt AI and automation tools that enhance your productivity, so that your role is not rendered obsolete. Here are some strategies that will help you stay ahead of the curve and compete and thrive in the fast-paced, dynamic world of employment. ... Being Human: Embracing human qualities like empathy, gratitude, compassion, and the zeal to strive for the betterment of our fellow human beings will always keep us ahead of the game. This is what distinguishes us from machines. Interdisciplinary skills: Developing skills across multiple disciplines and combining them will make you more versatile and valuable to employers. Problem Solving: It cannot be overstated that problem solving and the ability to think critically about the complex problems around us will keep us ahead of the machines.


Driving Digital Transformation Through Model-Based Systems Engineering

Digital engineering is revolutionizing critical fields such as health care. From sophisticated imaging devices and robotic surgical systems to telemedicine platforms that connect doctors and patients across vast distances, each of these systems depends on the integration of numerous complex components, and each must operate seamlessly to ensure optimal performance. A key approach that connects systems engineering to digital transformation and digital engineering is model-based systems engineering (MBSE). Whereas traditional systems engineering relies on document-based artifacts (e.g., text-based requirements and design documents) to support systems engineering activities, MBSE relies on digital system models instead. In essence, MBSE supports traditional systems engineering; it doesn't replace it. Rather, it offers an approach that aims to make systems engineering more efficient.
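To make the document-versus-model contrast concrete, here is a minimal, hypothetical sketch (the element names and fields are illustrative, not from any MBSE tool or standard) of requirements and components living in one linked digital model, where a traceability check becomes a query instead of a manual document review:

```python
from dataclasses import dataclass, field

# Hypothetical illustration: in MBSE, requirements and design elements
# live in a single digital model and stay linked, rather than sitting
# in separate text documents.

@dataclass
class Requirement:
    rid: str
    text: str

@dataclass
class Component:
    name: str
    satisfies: list = field(default_factory=list)  # linked Requirement ids

model = {
    "requirements": [
        Requirement("REQ-001", "Imaging device shall stream data to the console."),
        Requirement("REQ-002", "Telemedicine link latency shall not exceed 200 ms."),
        Requirement("REQ-003", "System shall log all access events."),
    ],
    "components": [
        Component("ImagingDevice", satisfies=["REQ-001"]),
        Component("NetworkLink", satisfies=["REQ-002"]),
    ],
}

def uncovered_requirements(model):
    """Requirements not satisfied by any component: a traceability check
    that is automatic over a model but manual over documents."""
    covered = {rid for c in model["components"] for rid in c.satisfies}
    return [r.rid for r in model["requirements"] if r.rid not in covered]

print(uncovered_requirements(model))  # ['REQ-003']
```

The point is not the data structure itself but that coverage gaps surface from the model automatically, which is the efficiency MBSE aims for.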


Optimize Your Observability Spending in 5 Steps

You can’t put these steps into practice with an observability agent alone. Agents are simply neutral forwarders, sending information downstream to be processed in observability analysis tools. You could implement some of these steps using open source tools and in-house development, but that comes with increased operational cost and complexity, requiring your team to build expertise that is not core to your business. Overall, the main challenge in putting these steps into practice is that the available tools are either like agents, which simply send information, or like observability tools, which simply receive it. You need to be able to process telemetry data in stream, transforming and routing it as it passes from agent to tool, to optimize and shape it for your downstream requirements. Our Mezmo Telemetry Pipelines were conceived to help organizations get better control of their data in stream. This approach enables you to control the flow between your data sources and your observability tools, and to manage in detail the optimization of your data before it arrives downstream.
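The transform-and-route idea can be sketched in a few lines. This is a hypothetical illustration of an in-stream pipeline, not Mezmo's actual API: each stage is a generator that filters, redacts, or routes events on their way from agent to tool (the field names and destinations are assumptions for the example):

```python
# Hypothetical in-stream telemetry pipeline: drop low-value events,
# redact sensitive fields, then route by severity.

def drop_debug(events):
    # Optimize: discard DEBUG records before they incur downstream cost.
    for e in events:
        if e["level"] != "DEBUG":
            yield e

def redact(events):
    # Transform: strip sensitive fields while the data is in flight.
    for e in events:
        e = dict(e)
        e.pop("user_ip", None)
        yield e

def route(events, destinations):
    # Route: errors go to the alerting tool, everything else to cheap storage.
    for e in events:
        dest = "alerting" if e["level"] == "ERROR" else "archive"
        destinations[dest].append(e)

incoming = [
    {"level": "DEBUG", "msg": "heartbeat", "user_ip": "10.0.0.1"},
    {"level": "INFO",  "msg": "request served", "user_ip": "10.0.0.2"},
    {"level": "ERROR", "msg": "timeout", "user_ip": "10.0.0.3"},
]

sinks = {"alerting": [], "archive": []}
route(redact(drop_debug(iter(incoming))), sinks)
print(len(sinks["alerting"]), len(sinks["archive"]))  # 1 1
```

Because the stages compose lazily, each event is shaped exactly once as it passes through, which is the "process in stream" property the excerpt describes.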


Why AI Regulations Are Needed to Check Risk and Misuse

Adopting a new technology poses certain risks, especially if it has not been previously deployed. That calls for risk mitigation strategies such as testing, sandboxing, proofs of concept, and smaller steps such as a minimum viable product before complete adoption. Mahadevan believes there will always be risks and that we "amplify the risk" to a large extent today. "Companies need to follow a framework and put together a risk mitigation panel, rather than focus on the risk itself. I insist that AI and the risk mitigation should become a part of the blueprint. And this is not a job for a CIO alone, it is a job for a CHRO, the risk manager, and for operations," Mahadevan said. Deepfakes and the violation of privacy are hotly debated topics in the industry today. Thomas said deepfakes will lead to many scams, causing victims to lose a lot of money. They also violate individual privacy and pose a substantial risk at a personal level. Deepfake technology uses a form of artificial intelligence called deep learning to create convincing video, photo, or audio clips of a subject, which are used for misinformation campaigns or to defraud and deceive relatives or friends.


New kind of quantum computer made using high-resolution microscope

It is unlikely to compete any time soon with the leading approaches to quantum computing, including those adopted by Google and IBM, as well as by many start-up companies. But the tactic could be used to study quantum properties in a variety of other chemical elements or even molecules, say the researchers who developed it. At some level, everything in nature is quantum and can, in principle, perform quantum computations. The hard part is to isolate quantum states called qubits — the quantum equivalent of the memory bits in a classical computer — from environmental disturbances, and to control them finely enough for such calculations to be achieved. Andreas Heinrich at the Institute for Basic Science in Seoul and his collaborators worked with nature’s ‘original’ qubit — the spin of the electron. Electrons act like tiny compass needles, and measuring the direction of their spin can yield only two possible values, ‘up’ or ‘down’, which correspond to the ‘0’ and ‘1’ of a classical bit.
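In standard quantum-mechanics notation, the electron-spin qubit described above is a two-level system: before measurement it can sit in a superposition of 'up' and 'down', and measurement collapses it to one of the two. A minimal sketch:

```latex
|\psi\rangle = \alpha\,|{\uparrow}\rangle + \beta\,|{\downarrow}\rangle,
\qquad |\alpha|^2 + |\beta|^2 = 1
```

Measuring the spin yields 'up' (the classical '0') with probability $|\alpha|^2$ and 'down' ('1') with probability $|\beta|^2$; the difficulty the excerpt describes is keeping $\alpha$ and $\beta$ stable against environmental disturbance long enough to compute with them.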


Net-zero carbon data centers: Expanding capacity amid evolving policy and regulation

The sting in the tail for data center developers is that emissions associated with the IT process load are now to be included in the calculation. Given that the annual energy consumption of even a modestly sized facility could run to hundreds of thousands of megawatt hours (MWh), this represents a very substantial cost for developers – unless they can drive their on-site emissions down below the 35 percent threshold. Outside of London, there is currently no policy for carbon offsetting, but it seems likely that other local authorities will follow London’s lead and introduce similar schemes in the future. In some regions, particularly the Nordics, planning policy has been introduced requiring new data centers to provide waste heat to local district heating infrastructure, or to be ‘heat network ready’ for connection to future schemes. Whilst a policy of promoting heat reuse may not lead to a direct reduction in data center emissions, it is seen as an important step towards decarbonizing the wider community, by displacing other, more carbon-intensive sources of heat.

Read more here ...
