The Fintech Digital Revolution Will Continue in 2022


The Fundamentals of Fintech Tackling the Security Issues For the Future

E-Book: $12.95

ORDER:

Call +1 888 857 9203

+2348133877820

[email protected]

[email protected]

www.mgireservationsandbookings.co.uk



The Fintech Digital Revolution Will Continue in 2022


2021 was a banner year for Fintech investments, with long-term trends and short-term pressures combining to accelerate the digital transformation of the financial services industry. One BIS study estimated that fintech has attracted a trillion dollars across more than 35,000 equity deals since 2010, with its share growing from 1% to 5% of equity deals globally.


2022 will most likely see this trend continue and perhaps even accelerate – but only if institutional players in the industry can stay ahead of the curve, something they have struggled to do in recent years. So what can we expect for Fintechs in 2022?

Fintechs are here to stay – if you can't beat 'em, join 'em


The top 100 Fintechs already account for a combined value of $2.7trn, while the top 100 banks are valued at $7.1trn. There is no denying that challenger banks, insurers, wealth managers and payment companies, as well as B2B Fintech solution providers, have established themselves and are here to stay. Laser-focused on the slowest-moving incumbents and on the tech-savvy, demanding 20-40 demographic, these new services have siphoned off millions of customers with their convenient, web- and mobile-first approaches.


Basic mobile banking has become commonplace, with challengers leading the charge; while the establishment has been playing catch-up, the fast-moving startups have moved on to wealth management and digital insurance solutions. The simple truth here is the age-old adage: if you can't beat 'em, join 'em – or, in many cases, integrate 'em. Incumbents will need to partner with complementary Fintech companies to fend off threats from others.


Decentralising finance – blockchain, cryptocurrency and smart contracts


The idea behind decentralized finance is to replace traditional intermediaries like banks, brokers and insurers with peer-to-peer relationships covering the entire world of financial services. The core components are blockchain, cryptocurrency and smart contracts (and, as always, data). It's clear that blockchain-based services will be a major part of any financial institution in the next 5-10 years, perhaps replacing centralized banking altogether for a generation raised on the tech. The goal is to offer more accessible, transparent, efficient and cheaper financial services.
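
To make the smart-contract idea concrete, here is a toy sketch of the escrow logic such contracts encode – settlement as deterministic code rather than an intermediary's decision. Real smart contracts run on a blockchain (typically written in a language like Solidity); this Python class is purely illustrative, and every name and rule in it is invented.

```python
# Toy illustration of smart-contract-style escrow: funds are released
# automatically when agreed conditions are met, with no intermediary
# deciding. Real contracts run on-chain (e.g., Solidity on Ethereum);
# this is a conceptual sketch only, not a working DeFi component.

class EscrowContract:
    def __init__(self, buyer: str, seller: str, amount: float):
        self.buyer = buyer
        self.seller = seller
        self.amount = amount
        self.funded = False
        self.delivered = False

    def deposit(self, sender: str, amount: float) -> None:
        # Only the buyer can fund the contract, with the agreed amount.
        if sender != self.buyer or amount != self.amount:
            raise ValueError("deposit must come from the buyer, for the agreed amount")
        self.funded = True

    def confirm_delivery(self, sender: str) -> None:
        # The buyer confirms delivery; settlement then follows from code.
        if sender != self.buyer:
            raise ValueError("only the buyer can confirm delivery")
        self.delivered = True

    def settle(self) -> str:
        # Settlement is deterministic logic, not a clerk or a court.
        if self.funded and self.delivered:
            return f"release {self.amount} to {self.seller}"
        return "conditions not met; funds stay in escrow"


contract = EscrowContract(buyer="alice", seller="bob", amount=100.0)
contract.deposit("alice", 100.0)
contract.confirm_delivery("alice")
print(contract.settle())  # release 100.0 to bob
```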


Venture capital has supercharged the development of these new categories and most traditional financial services companies are still at the starting line. However, even the best-funded startups need the reach of established incumbents, and perhaps even more importantly, their data.

AI in the archives – data will change customer experience for good


Deep reserves of data are the crown jewels of incumbent banks and insurers. But decades of records are as much a liability as an asset if one doesn't know how to access the insights inside them. The next phase of innovation and investment in Fintech will be data-driven applications that cater to and engage with individuals – an unexpected and valuable business proposition in an industry defined by highly regulated services.


Accessing the insights in this data will require AI and machine learning. These technologies may sound like buzzwords when shoehorned into pitches and product ideas, but they're a perfect match for anyone trying to make sense of huge piles of semi-structured data. Here again, startups will lead the way, offering services that can securely access and analyze billions of bytes of financial records to improve customer experience.
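
As a rough sketch of the kind of work involved, the snippet below uses scikit-learn to categorize free-text transaction records – one small example of turning semi-structured archives into something a customer-facing feature can use. The tiny dataset and categories are hypothetical; a production system would train on millions of labeled records.

```python
# Minimal sketch: categorizing free-text transaction records with a
# text classifier. The hand-made dataset is purely hypothetical.
from sklearn.feature_extraction.text import TfidfVectorizer
from sklearn.linear_model import LogisticRegression
from sklearn.pipeline import make_pipeline

records = [
    "POS DEBIT GROCERY MART 0441", "ATM WITHDRAWAL BRANCH 221",
    "DIRECT DEPOSIT PAYROLL ACME", "POS DEBIT COFFEE HOUSE 17",
    "WIRE TRANSFER RENT MARCH", "DIRECT DEPOSIT PAYROLL ACME CORP",
]
labels = ["groceries", "cash", "income", "dining", "housing", "income"]

# TF-IDF features + logistic regression: a simple, auditable baseline.
model = make_pipeline(TfidfVectorizer(), LogisticRegression())
model.fit(records, labels)

print(model.predict(["POS DEBIT GROCERY MART 0987"]))  # likely 'groceries'
```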

Diversify and conquer – the rise of embedded financial services


Equipped with highly specific, data-derived trends and perspectives, a financial organization will be able to offer better and faster risk assessment, customer-product matching, personalized recommendations, and improved security and customer service. Happy customers are the best bulwark against hungry and trendy Fintechs.


Equipped with data, non-bank entities like Amazon and other tech giants are now offering financial services like buy-now-pay-later and other loan-adjacent products, since consumer investment in one platform makes it far more likely they'll use its other offerings as well. Convenience is key, but trust is also hard to come by when large sums are in play – an advantage incumbents have over startups that have yet to become household names. Embedded finance therefore also gives incumbents the opportunity to integrate other financial products into their offerings.

Values on display – ESG can no longer be ignored


With the rise in visibility of issues like climate change, systemic racism, and political disinformation, ESG values will be more important than ever going forward. Not only do consumers care, but regulations, board decisions, and business logic are trending towards socially responsible investments and products. This is one area where institutions may be able to lead: it is rarely a surprise when a small startup claims carbon neutrality or offers progressive benefits – but it definitely is when a major bank or 50,000-strong corporation does so.


Value-driven business decisions aren’t just smart in this market, they’re the mark of forward-thinking leadership that believes we can build an economic system that benefits both people and the planet they live on. Ignoring ESG is no longer an option.






Cyber Attacks – Phishing


Many businesses continue to underestimate the phishing threat


Despite being around for almost three decades, phishing is still a popular cybercrime tactic, a new report from Sophos claims. Phishing owes most of its popularity to its simplicity, scalability and capacity to flex to current events.


To address the issue, most businesses are deploying cyber awareness programs and various training initiatives, with varying levels of success.


The biggest problem is that businesses often underestimate the destructive potential of phishing. Most of the time, they perceive it as a low-level threat, disregarding the fact that phishing is usually the first of many steps in highly complex and often devastating attacks.


Businesses have started to address the issue, however. Nine in ten have implemented a cyber-awareness program, and a further six percent plan to set one up.


But most of these programs (65 percent) were implemented within the past three years, Sophos adds, hinting that businesses have only recently begun addressing the problem in a holistic way.


Computer-based training programs seem to be most popular, as 58 percent of organizations use them. More than half (53 percent) use human-led training, while 43 percent run phishing simulations. Sophos also found that 16 percent combine all three techniques.


These programs should be a lot of help to organizations that run them, the report concludes, adding that employees should be “well-placed to withstand the barrage of phishing emails”.


Many cybercrime victims are repeatedly attacked by the same hackers


Victims often don't bother remedying the issue that caused the initial breach


On average, one in two companies worldwide has suffered multiple attacks from the same hacking group, a new report from AtlasVPN claims. What's more, almost two-thirds (61 percent) of those attacked did not remediate the flaws that made them vulnerable, making the criminals' job that much easier.


Companies in the UK seem to be suffering the most (55 percent), followed by those in the United States and Canada (50 percent), Europe (49 percent), and Latin America (48 percent).


Organizations are most vulnerable in the cloud (65 percent), but they're also susceptible to DDoS attacks (60 percent), phishing and social engineering (52 percent), malicious insider threats (45 percent), and DNS-based attacks (44 percent).


One of the biggest challenges, cited by 69 percent of respondents, is systems generating too many low-value security alerts and forcing IT teams to work through them before reaching the genuinely important ones – something many see as a waste of the IT team's time.
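
A common mitigation is to score and rank alerts before a human ever sees them, so that the long tail of low-value noise never reaches the analyst queue. A minimal sketch of the idea, with invented fields and weights:

```python
# Minimal sketch of alert triage: score alerts so analysts see the
# high-value ones first instead of wading through low-value noise.
# Fields and weights are invented for illustration only.
from dataclasses import dataclass

@dataclass
class Alert:
    source: str
    severity: int            # 1 (informational) .. 5 (critical)
    asset_criticality: int   # 1 (lab box) .. 5 (production database)

def triage_score(alert: Alert) -> int:
    # Weight the alert's own severity by how important the target is.
    return alert.severity * alert.asset_criticality

alerts = [
    Alert("ids", severity=2, asset_criticality=1),
    Alert("edr", severity=5, asset_criticality=5),
    Alert("dns", severity=3, asset_criticality=4),
]

# Highest-value alerts first; a cutoff could suppress the long tail.
for a in sorted(alerts, key=triage_score, reverse=True):
    print(triage_score(a), a.source)
```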


Another problem is the chronic staff shortage. In total, six in ten have a shortage of in-house experts who can use security technologies, 56 percent lack the staff to pick up the workload, and 53 percent can't find enough workers to deliver "lasting data-driven outcomes".


“As long as organizations do not address existing vulnerabilities and security issues, they risk being hit by cybercriminals again,” commented Ruth Cizynski, the cybersecurity researcher and writer at Atlas VPN.


“Organizations should prioritize internal processes that they can control over external security risks that they cannot."



Meta to read lips with new speech recognition tech



The Lead

[1] Meta claims its AI improves speech recognition by reading lips

[2] How John Deere created its autonomous tractor

[3] 4 healthcare cloud security recommendations for 2022




The Follow

[1] People perceive speech both by listening to it and by watching the lip movements of speakers. In fact, studies show that visual cues play a key role in language learning. By contrast, AI speech recognition systems are built mostly – or entirely – on audio. And they require a substantial amount of data to train, typically tens of thousands of hours of recordings.

To investigate whether visuals — specifically footage of mouth movement — can improve the performance of speech recognition systems, researchers at Meta (formerly Facebook) developed Audio-Visual Hidden Unit BERT (AV-HuBERT), a framework that learns to understand speech by both watching and hearing people speak.

Meta claims that AV-HuBERT is 75% more accurate than the best audiovisual speech recognition systems using the same amount of transcriptions. Moreover, the company says, AV-HuBERT outperforms the former best audiovisual speech recognition system using one-tenth of the labeled data — making it potentially useful for languages with little audio data. >> Read more.
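
AV-HuBERT itself is far beyond a snippet, but the underlying idea – encoding an audio stream and a lip-movement stream separately, then fusing them before prediction – can be caricatured in a few lines of PyTorch. To be clear, the model below is not Meta's architecture (which relies on masked self-supervised pretraining over clustered units); the dimensions and concatenation-based fusion are illustrative assumptions only.

```python
# Conceptual toy of audio-visual speech recognition: encode audio
# frames and lip-region frames separately, concatenate, and predict
# per-frame token logits. NOT AV-HuBERT; all sizes are illustrative.
import torch
import torch.nn as nn

class ToyAVModel(nn.Module):
    def __init__(self, audio_dim=80, video_dim=512, hidden=256, vocab=1000):
        super().__init__()
        self.audio_enc = nn.Linear(audio_dim, hidden)  # e.g., log-mel frames
        self.video_enc = nn.Linear(video_dim, hidden)  # e.g., lip-ROI embeddings
        self.fuse = nn.GRU(2 * hidden, hidden, batch_first=True)
        self.head = nn.Linear(hidden, vocab)

    def forward(self, audio, video):
        # audio: (batch, time, audio_dim); video: (batch, time, video_dim)
        fused = torch.cat([self.audio_enc(audio), self.video_enc(video)], dim=-1)
        out, _ = self.fuse(fused)
        return self.head(out)  # per-frame token logits

model = ToyAVModel()
logits = model(torch.randn(2, 50, 80), torch.randn(2, 50, 512))
print(logits.shape)  # torch.Size([2, 50, 1000])
```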

[2] Adding new brawn to the self-driving vehicle industry, John Deere unveiled a 40,000-pound autonomous tractor at this year’s Consumer Electronics Show that it says will be commercially available by the end of 2022.

The system uses six pairs of stereo cameras combined with GPS guidance to drive a Deere 8R tractor with a chisel plow and the capability to tow other equipment. A farmer can put the tractor to work with a swipe of a smartphone app and then walk away to spend time with family or attend to other business, using the app to monitor the tractor’s progress plowing a field or performing some other task. The farmer will receive alerts of anomalies the software doesn’t know how to handle. While it’s working, the tractor can also gather data about the health of crops in the field, the health and moisture content of the soil and other metrics.
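
The stereo cameras do the geometric heavy lifting: two views of the same point let software recover depth from the horizontal shift (disparity) between images, via Z = f * B / d. The numbers below are invented for illustration and are not Deere's camera specifications.

```python
# Back-of-the-envelope stereo depth: Z = f * B / d, where f is the
# focal length in pixels, B the baseline between cameras in meters,
# and d the disparity in pixels. Values are illustrative only.
def stereo_depth(focal_px: float, baseline_m: float, disparity_px: float) -> float:
    return focal_px * baseline_m / disparity_px

# A feature shifted 40 px between views, with a 0.3 m baseline and a
# 1400 px focal length, sits roughly 10.5 m away.
print(f"{stereo_depth(1400, 0.3, 40):.1f} m")  # 10.5 m
```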

“Until recently, agriculture has always been about doing more with more — more horsepower, more inputs, more acres — but the new digital era is changing all that. The last decade has been about doing more with less and providing farmers with additional tools to make more informed decisions,” said Jahmy Hindman, chief technology officer for Deere & Company.

[3] Despite initial hesitation, healthcare is increasingly moving to the cloud. In a July 2020 survey from Spok, 67% of healthcare professionals said there were no applications that they wouldn’t host in the cloud.

There’s no secret as to why: McKinsey has estimated that migrating to the cloud could generate up to $140 billion in additional value for healthcare companies by 2030 in terms of cost reduction, new product development, and overall growth.

To help healthcare continue its transition to the cloud – whether as part of an ongoing strategic initiative or as a sudden shift driven by the need to provide remote care during COVID-19 – Damian Chung, business information security officer for Netskope, offers four security recommendations for healthcare IT leaders in 2022:

Sign a BAA with your CSP.

See who controls your data.

Manage user access with adaptive trust.

Don't slow down innovation as budgets and hiring tighten.






More data will live on the edge in 2022



The Lead

[1] Data will continue to move to the edge in 2022

[2] D-Wave opens up to gate-model quantum computing

[3] AR will be the heart of the metaverse — not VR





The Follow

[1] How can software be faster, cheaper, and more resilient? For many developers, the answer in 2021 was to move the computation out of a few big datacenters and into many smaller racks closer to users on the metaphorical edge of the internet. 2022 promises more of the same.

The move is driven by physics and economics. Even when data travels at the speed of light, the time it takes to send packets halfway around the world to one central location is noticeable to users, whose minds start to wander after just a few milliseconds.
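
The arithmetic behind that claim is simple enough to sketch. Light in fiber travels at roughly two-thirds of its vacuum speed, so even an idealized round trip across the planet consumes a human-noticeable slice of time before any processing begins:

```python
# Ideal-case round-trip time over fiber, ignoring routing and queuing
# delays. Light in fiber travels at roughly 2/3 of c.
C_FIBER_KM_S = 300_000 * 2 / 3   # ~200,000 km/s

def rtt_ms(one_way_km: float) -> float:
    return 2 * one_way_km / C_FIBER_KM_S * 1000

print(f"{rtt_ms(20_000):.0f} ms")  # halfway around the world: ~200 ms
print(f"{rtt_ms(100):.0f} ms")     # a nearby edge rack: ~1 ms
```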

However, edge computing will continue to be limited by countervailing forces that may, in some cases, be stronger. Datacenter operators can negotiate lower electricity prices, which typically means locating right next to the point of generation – a few miles from a large hydroelectric dam, say. Keeping data in multiple locations synchronized can be a challenge, and some algorithms, like those used in machine learning, depend heavily on working with large, central collections of data.

Despite these challenges, many architects continue to embrace the opportunity, thanks to the efforts of cloud companies to simplify the process.

The ultimate edge location, though, will continue to be the phones and laptops themselves. Web app developers continue to leverage the power of browser-based storage while exploring more efficient ways to distribute software. >> Read more.

[2] Recent advances in quantum computing show progress, but not enough to live up to years of hyperbole. An emerging view suggests the much-publicized quest for more quantum qubits and quantum supremacy may be overshadowed by a more sensible quest to make practical use of the qubits we have now.

The latter view holds particularly true at D-Wave Systems Inc., the Vancouver, B.C., Canada-based quantum computing pioneer that recently disclosed its roadmap for work on logic gate-model quantum computing systems.

However, D-Wave's annealing qubits lack the general-purpose qualities of competing gate-model systems, and the degree of speed-up they provide has been questioned. Critics have also faulted D-Wave's purpose-built approach for targeting only a certain class of optimization problems.

Still, the company has a leg up on most competitors in experience, having fabricated and programmed superconducting parts since at least 2011.

The gate-model quantum computing crew’s benchmarks have come under attack, too, and its battles with scaling and quantum error (or “noise”) correction have spawned the term “noisy intermediate-scale quantum” (or NISQ) to describe the present era, where users have to begin to do what they can with whatever working qubits they have.

While it will continue to work on its annealing-specific quantum variety, D-Wave has joined a gate-model quantum competition where there appears to be plenty of room for growth.
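
The optimization problems annealers target are typically phrased as a QUBO (quadratic unconstrained binary optimization). The toy below uses D-Wave's open-source dimod package with a brute-force classical solver – no quantum hardware involved – to pick the better of two conflicting options; the values and penalty are invented for illustration.

```python
# Tiny QUBO of the kind annealers target: choose between two
# conflicting options while maximizing value. Linear terms reward
# picking an item; the quadratic term penalizes picking both.
# Solved here by brute force with D-Wave's open-source dimod package.
import dimod

# E(a, b) = -3a - 2b + 5ab  ->  minimized at a=1, b=0
bqm = dimod.BinaryQuadraticModel(
    {"a": -3.0, "b": -2.0},   # linear: value of each choice
    {("a", "b"): 5.0},        # quadratic: conflict penalty
    0.0,                      # constant offset
    dimod.BINARY,
)

best = dimod.ExactSolver().sample(bqm).first
print(best.sample, best.energy)  # {'a': 1, 'b': 0} -3.0
```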

[3] My first experience in a virtual world was in 1991, as a PhD student working in a virtual reality lab at NASA. I was using a variety of early VR systems to model interocular distance (i.e., the distance between your eyes) and optimize depth perception in software. Despite being a true believer in the potential of virtual reality, I found the experience somewhat miserable.

Even when I used early 3D glasses (i.e., shuttering glasses for viewing 3D on flat monitors), the sense of confinement didn’t go away. I still had to keep my gaze forward, as if wearing blinders to the real world. There was nothing I wanted more than to take the blinders off and allow the power of virtual reality to be splattered across my real physical surroundings.

Cut to 30 years later, and the phrase “metaverse” has suddenly become all the rage. At the same time, the hardware for virtual reality is significantly cheaper, smaller, lighter, and has much higher fidelity. And yet, the same problems I experienced three decades ago still exist. Like it or not, wearing a scuba mask is not pleasant for most people, making you feel cut off from your surroundings in a way that’s just not natural.

This is why the metaverse, when broadly adopted, will be an augmented reality environment accessed using see-through lenses. This will hold true even though full virtual reality hardware will offer significantly higher fidelity. The fact is, visual fidelity is not the factor that will govern broad adoption. Instead, adoption will be driven by which technology offers the most natural experience to our perceptual system.








No-code platforms ease development tasks, but make it harder to detect model bias



The Lead

[1] No-code AI development platforms could introduce model bias

[2] How Incorta uses AI to address supply-chain issues

[3] What 1,000x faster simulation means for digital twins



The Follow

[1] AI deployment in the enterprise skyrocketed as the pandemic accelerated organizations’ digital transformation plans. Eighty-six percent of decision-makers told PricewaterhouseCoopers in a recent survey that AI is becoming a “mainstream technology” at their organization. A separate report by The AI Journal finds that most executives anticipate that AI will make business processes more efficient and help to create new business models and products.

The emergence of no-code AI development platforms is fueling adoption in part. Designed to abstract away the programming typically required to create AI systems, no-code tools enable nonexperts to develop machine learning models that can be used to predict inventory demand or extract text from business documents, for example. In light of the growing data science talent shortage, use of no-code platforms is expected to climb, with Gartner predicting that 65% of app development will be low-code/no-code by 2024.

But there are risks in abstracting away data science work — chief among them, making it easier to forget the flaws in the real systems underneath.
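
One concrete example of such a flaw is uneven performance across subgroups, which the single aggregate accuracy figure a no-code dashboard typically reports can hide entirely. The sketch below runs, on synthetic data, the disaggregated check a data scientist would normally perform by hand:

```python
# Minimal disaggregated evaluation: compare a model's accuracy across
# subgroups instead of trusting one aggregate number. Data is synthetic
# and constructed so the label depends partly on group membership.
import numpy as np
from sklearn.linear_model import LogisticRegression

rng = np.random.default_rng(0)
n = 2000
group = rng.integers(0, 2, n)          # 0 = majority, 1 = minority
x = rng.normal(size=(n, 3))
y = ((x[:, 0] + 0.8 * group + rng.normal(size=n)) > 0.5).astype(int)

model = LogisticRegression().fit(x, y)  # note: 'group' is not a feature
pred = model.predict(x)

for g in (0, 1):
    mask = group == g
    print(f"group {g}: accuracy {(pred[mask] == y[mask]).mean():.2f}")
```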





[2] Prior to 2021, the term “supply chain” didn’t raise many red flags for most consumers, frankly because they didn’t have to think about it. Everything just happened. Buyers were so accustomed to getting things on schedule that it rarely became a regular topic of conversation.

That all changed in the second half of 2021. With the pandemic slowing down production lines and transportation in faraway places, the supply chain is now regularly in the headlines. This has been the greatest shock to global supply chains in modern history. Buyers often have to wait months for raw materials, durable goods, building materials, electronic devices, apparel, toys, and numerous other items. This remains a nagging problem that may continue well into 2022 – or even 2023.





Relative newcomer Incorta, which makes a software-as-a-service (SaaS)-based unified data analytics platform, comes at the supply chain from a different perspective. Its single-screen platform puts all of a company's data into a single system, replacing separate tools, to move data from source locations into a form that both line-of-business staff and data scientists can use more effectively. This is the data analysis used to project and identify supply-chain snags and find ways to solve them, much as GPS routes drivers around traffic snarls.

[3] About a decade ago, MIT researchers discovered a technique that speeds up physics modeling by 1,000 times. They spun this out into a new company, called Akselos, which has been helping enterprises weave the tech into various kinds of digital twins used to improve shipping, refining, and wind power generation.





A digital twin is a virtual representation of an object or system that spans its lifecycle, is updated from real-time data, and uses simulation, machine learning, and reasoning to help decision-making. Connected sensors on the physical asset collect data that can be mapped onto the virtual model.
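
Stripped to a skeleton, that definition amounts to a model object kept in sync with sensor readings and queried for what-if predictions. The toy below illustrates the pattern; the pump and its one-line wear law are invented stand-ins for the detailed physics models real twins wrap.

```python
# Skeleton of a digital twin: a virtual model updated from real-time
# sensor data and queried for what-if simulations. The pump and its
# wear law are invented for illustration only.
from dataclasses import dataclass, field

@dataclass
class PumpTwin:
    wear: float = 0.0                       # 0.0 new .. 1.0 failed
    history: list = field(default_factory=list)

    def ingest(self, vibration_mm_s: float, hours: float) -> None:
        # Update twin state from a sensor reading (toy wear law).
        self.wear = min(1.0, self.wear + 1e-4 * vibration_mm_s * hours)
        self.history.append((vibration_mm_s, hours, self.wear))

    def hours_to_maintenance(self, vibration_mm_s: float) -> float:
        # What-if query: at this vibration level, how long until wear
        # crosses the 0.8 maintenance threshold?
        return max(0.0, (0.8 - self.wear) / (1e-4 * vibration_mm_s))


twin = PumpTwin()
twin.ingest(vibration_mm_s=4.0, hours=1000)        # sensor feed
print(f"{twin.hours_to_maintenance(4.0):.0f} h")   # ~1000 h remaining
```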

What might these improvements mean for the industry as a whole? At a high level, faster simulation makes it easier to compare design tradeoffs, leading to more efficient products, reduced cost, enhanced performance, and better AI algorithms. Practical benefits have included trimming a third of the weight of wind towers and improving the safety of oil vessels.







Follow the roadmap for data science and machine learning



The Lead

[1] How to build a data science and machine learning roadmap in 2022

[2] How patient communication impacts healthcare’s data stack

[3] Report: 69% of enterprises embrace quantum computing




The Follow

[1] Closing the gap between an organization's decision to invest in a data science and machine learning (DSML) strategy and business units' need for results will dominate data and analytics leaders' priorities in 2022. Despite the growing enthusiasm for DSML's core technologies, getting results from these strategies remains elusive for enterprises.

Market forecasts reflect enterprises' early optimism for DSML. IDC estimates that worldwide revenues for the artificial intelligence (AI) market, including software, hardware, and services, will grow 15.2% year over year in 2021 to $341.8 billion, accelerate further in 2022 with 18.8% growth, and reach $500 billion by 2024. In addition, 56% of global enterprise executives said their adoption of DSML and AI is growing, up from 50% in 2020, according to McKinsey.

Gartner notes that organizations undertaking DSML initiatives rely on low-cost, open-source, and public cloud service provider offerings to build their knowledge and expertise and to test use cases. The challenge remains how best to productize models so they can be deployed and managed at scale. >> Read more.
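
"Productizing" here means getting a trained model out of a notebook and behind a stable, managed interface. The sketch below shows one common route – persisting a scikit-learn model with joblib and serving it via FastAPI – as a minimal illustration; the iris model is a stand-in for a real business model, and the stack is one choice among many.

```python
# Minimal sketch of productizing a model: persist it, then serve it
# behind a stable HTTP interface. FastAPI + joblib is one common stack
# among many; the iris model stands in for a real business model.
import joblib
from fastapi import FastAPI
from pydantic import BaseModel
from sklearn.datasets import load_iris
from sklearn.linear_model import LogisticRegression

# Training step (normally a separate pipeline): fit and persist.
X, y = load_iris(return_X_y=True)
joblib.dump(LogisticRegression(max_iter=1000).fit(X, y), "model.joblib")

app = FastAPI()
model = joblib.load("model.joblib")     # loaded once at startup

class Features(BaseModel):
    values: list[float]                 # four iris measurements

@app.post("/predict")
def predict(features: Features) -> dict:
    return {"class": int(model.predict([features.values])[0])}

# Run with: uvicorn this_module:app
```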

[2] Nearly two years after COVID-19 suddenly shifted healthcare from in-person to virtual appointments, people’s preferences for communicating with physicians have shifted.




Hospitals and healthcare systems have responded by ramping up telehealth offerings and increasing staff at their call centers, but a recent survey suggests that providers would benefit from a broader communication and outreach strategy. According to the November 2021 survey from Dynata and Redpoint Global, 80% of healthcare consumers prefer to use digital channels, such as online messaging, texting, or virtual visits, to communicate with their providers.

However, standing up a digital front door poses a significant challenge if communication and engagement happen across disparate channels.

“Many consumers can’t get a consistent experience across a healthcare organization’s web app, website, call center, retail health center, hospital or clinic,” said John Nash, chief marketing and strategy officer for Redpoint Global.





[3] Sixty-nine percent of global enterprises have adopted or plan to adopt quantum computing in the near term, according to a survey of enterprise leaders commissioned by Zapata Computing. The findings suggest that quantum computing is quickly moving from the fringes and becoming a priority for enterprise digital transformation, as 74% of enterprise leaders surveyed agreed that those who fail to adopt quantum computing will fall behind.

Broken down further, 29% of enterprises worldwide are early adopters of quantum technology, while another 40% plan to follow in their footsteps in the near future. Adoption is highest in the transportation sector, where 63% of respondents reported being in the early stages of quantum adoption. This may be a reaction to the ongoing supply chain crisis, which quantum could help relieve through its potential to solve complex optimization problems common in shipping and logistics.

That said, hurdles remain. The most commonly cited challenge is the complexity of integrating quantum with enterprises’ existing IT stack. To solve these challenges, enterprises are turning to quantum vendors.

