The S/4 Storage Conundrum
PCC's AI-Powered SAP Solutions for Transformative Business Growth

SAP’s new S/4 is fast but costly. Companies now archive data sooner to keep database storage down, which in turn creates a greater need for fast archive access.

Using AI to solve two problems at once: providing the right archive data, and providing it in the fastest possible way


Your archive data volume now exceeds your database volume, presenting a unique challenge: serving that data to end users reliably and performantly.

It may be counterintuitive, but most SAP customers with any data archiving strategy now have 5, 10, or 20 times more useful, valuable data sitting in their archives than in their active database. Especially in the S/4 environment, where database storage is speed-of-light fast but much more expensive, it is critical not only to provide fast, correct, and reliable access to your archive data but also to keep the database small and affordable through consistent archiving. AI has proved strategically crucial in addressing this issue, as it recently has across so many other application areas.
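
For a back-of-the-envelope sense of the economics, consider the sketch below. Every price and volume in it is hypothetical, since actual S/4 and storage costs vary widely by contract and region; the point is only the order-of-magnitude gap between in-memory database storage and archive storage.

```python
# Illustrative cost comparison: keeping data resident in an in-memory
# S/4 database versus moving it to archive storage.
# ALL figures below are hypothetical, for illustration only.

DB_COST_PER_GB_YEAR = 50.0      # assumed in-memory database cost ($/GB/year)
ARCHIVE_COST_PER_GB_YEAR = 1.0  # assumed archive storage cost ($/GB/year)

total_data_gb = 10_000          # 10 TB of business data (hypothetical)
archivable_fraction = 0.80      # share old enough to archive (hypothetical)

archived_gb = total_data_gb * archivable_fraction
resident_gb = total_data_gb - archived_gb

cost_without_archiving = total_data_gb * DB_COST_PER_GB_YEAR
cost_with_archiving = (resident_gb * DB_COST_PER_GB_YEAR
                       + archived_gb * ARCHIVE_COST_PER_GB_YEAR)

print(f"No archiving:   ${cost_without_archiving:>9,.0f}/year")
print(f"With archiving: ${cost_with_archiving:>9,.0f}/year")
print(f"Savings:        {1 - cost_with_archiving / cost_without_archiving:.0%}")
```

The catch, of course, is that the archived 80% is precisely the data your users still need to report on, which is why read performance against the archive becomes the next bottleneck.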

Narrow AI, or weak AI, as defined by Dr. John Martin, is an “artificial intelligence [system] that [is] designed and trained to perform a specific task or a limited range of tasks”. It can replicate human-like decisions that are “specialized and limited in scope”.

The primary benefit of SAP data archiving has always been reduced database size. Secondary benefits include managed risk, greater compliance, and faster IO when reporting on database data. Notably absent from this list are business user satisfaction, customer happiness, and faster IO when reporting on archive data. These criteria can now be met, provided AI is integrated into the archiving solution.

SAP data archiving has always consisted of two tasks: archiving the data, then reporting on it. The latter has grown in importance to the point where many customers and solutions are ill-equipped to report on archive data quickly in the S/4 environment. Fast, real-time reporting on archive data is precisely where AI can be applied: making access decisions sensitive to your environment, optimizing IO speed throughout the landscape, and freeing us from application redesigns whenever the landscape changes.

With the advent and proliferation of S/4, many SAP customers are driven to shrink their S/4 database by reducing online residence times, archiving their data faster and sooner than ever. This expedited archiving increases the volume of archive data and, with it, the need to access that data performantly. The need for AI-powered, high-speed reporting on SAP archive data is greater now than ever.

This form of Narrow AI is perfectly suited to the higher demands now placed on reliably accurate, high-speed reporting against increasingly high-volume archive repositories. In SAP archive reporting, Narrow AI is a quantum leap forward: we need AI to help us achieve our reporting goals, but we don’t yet need it to make our morning coffee.

Why your archive data access could be a problem that impacts your entire system

Don’t let your archive reads stay slow; they will become an anchor on your applications, dragging down your entire system.

Consider this example: you’re driving a Ferrari on the Autobahn at 200MPH, making great progress, covering 100 miles in 30 minutes! You get used to that blur of landscape and have no problems. Now you must exit the highway to get to your business meeting, and your Ferrari sits in traffic crawling at 2MPH, its powerful engine idling; it takes another 30 minutes to travel the last mile to your office. That is the “last mile” problem.

That’s what archive reads are like if you don’t optimize every archive read, every time. Your S/4 database delivers some, or most, of the data for your report, but your user needs to go back two years, and now archive IOs make the report run 10X or 100X slower. With PCC, every single archive IO is optimized to solve the “last mile” issue, and your reporting remains performant without rewriting your applications.

Making your archive IO perform as closely as possible to your database IO is critical

Your SAP environment, complemented by PCC’s AI, informs PCC’s algorithms in real time about what data to access and how, removing you and your programmers from critical query-plan decisions

You pay big dollars for high-speed databases and DBMS resources so that when data is read from the online database, your queries are monitored and tracked; if they are insufficiently fast, indices are built and hints are provided, ensuring your read times stay as fast as possible.

Why, then, rely on programmers to decide the access methods into your high-volume archive storage, methods that likely won’t provide high-speed access to all, most, or even any, of your archive data? When programmers implement archive reads, they decide at a point in time how the archives will be read. Then they copy the code they know works and replicate it into the next program that needs to read from the archives. However, we know empirically that the optimal access method varies from object to object, as well as with a host of other variables (see the sketch after this list), including:

1. How much data is in the current access request?

2. How large are the archive files?

3. Where are the archive files stored: on-premise or in the cloud?

4. How large are the individual objects within the archive files?

5. Does this archive object have large volumes of ancillary objects, e.g., change records, rebates, pricing conditions?

6. Is our target data concentrated in one file, several files, or dispersed randomly throughout the archive population?
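
As a minimal illustration of why a fixed, copy-pasted choice is brittle, here is a hypothetical decision heuristic in Python that weighs these variables to pick an access strategy. The strategy names, thresholds, and branch logic are entirely our own illustrative assumptions; they are not PCC’s actual algorithm, nor any SAP API.

```python
from dataclasses import dataclass
from enum import Enum


class AccessStrategy(Enum):
    """Hypothetical archive read strategies (illustrative names only)."""
    INDEXED_LOOKUP = "indexed lookup"    # an index points straight at the hits
    SEQUENTIAL_SCAN = "sequential scan"  # few or small files: read them in order
    PARALLEL_SCAN = "parallel scan"      # many large files: fan out readers


@dataclass
class ReadRequest:
    rows_requested: int          # 1. size of the current access request
    avg_file_size_mb: float      # 2. how large the archive files are
    stored_in_cloud: bool        # 3. on-premise vs. cloud (latency profile)
    avg_object_size_kb: float    # 4. size of individual archived objects
    has_ancillary_objects: bool  # 5. change records, rebates, conditions, ...
    files_containing_hits: int   # 6. how dispersed the target data is


def choose_strategy(req: ReadRequest) -> AccessStrategy:
    """Pick a read strategy from the request profile.

    The thresholds are made up for illustration; the point is that the
    right answer shifts as these variables shift, so hard-coding one
    strategy into each report is fragile.
    """
    # Small, pinpoint requests are cheapest through an index, and
    # high-latency cloud storage punishes whole-file scans.
    if req.rows_requested < 100 or req.stored_in_cloud:
        return AccessStrategy.INDEXED_LOOKUP

    # Ancillary objects multiply the data pulled per business object,
    # so large objects with many attachments stay on targeted reads.
    if req.has_ancillary_objects and req.avg_object_size_kb > 64:
        return AccessStrategy.INDEXED_LOOKUP

    # Many large files holding hits: fan out and scan in parallel;
    # a handful of small files can simply be read in order.
    if req.files_containing_hits * req.avg_file_size_mb > 500:
        return AccessStrategy.PARALLEL_SCAN
    return AccessStrategy.SEQUENTIAL_SCAN


if __name__ == "__main__":
    req = ReadRequest(
        rows_requested=50_000,
        avg_file_size_mb=100.0,
        stored_in_cloud=False,
        avg_object_size_kb=4.0,
        has_ancillary_objects=True,
        files_containing_hits=37,
    )
    print(choose_strategy(req))  # AccessStrategy.PARALLEL_SCAN
```

The point of the sketch is the shape of the problem: the right branch depends on variables that drift over time, so a strategy hard-coded into each report silently goes stale, whereas a decision layer that re-evaluates on every read does not.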

There are three potential ways to solve this problem. The first is to spend a lot of money, purchase an NLS database, and store your most valuable archive data in it. NLS can be a valid choice if you have the budget and time to invest in an implementation, plus the patience to wait for what one customer described as the “million-year ROI”. A second option is to bypass NLS, hire and retain only the best programmers in the archiving arena, rely on them to write optimized code at release time, and hope the environment and the other variables never change.

The third option is the PCC Connector solution powered by AI, the most cost-effective of the three. PCC has integrated powerful AI and decision-making algorithms into the solution to ensure that every read is optimized, every time. Behind that AI, we have integrated every SAP-supported mode of archive reporting, giving PCC Connector the broadest possible range of reporting choices.

PCC implemented AI in data archiving to solve the problem of always getting the right data to the right application at highly optimized speeds

In summary, we’ve seen how the advent of the SAP S/4 database accelerates DB reads while raising database costs, driving a renewed push to keep the database small through archiving. Hence, as the archive repository grows with increasingly current data, demand for reliably high-speed, accurate, and flexible archive data access skyrockets in the S/4 environment.

PCC, with 30 years of experience in SAP data archiving, has developed a new solution leveraging AI, Machine Learning, and many other enhancements. It overcomes the gaps in prior offerings and keeps your system highly performant regardless of configuration and other technical variables, ensuring your company can continue to drive growth from the volumes of still extremely valuable information residing in your archive repository.

Download our case study today to learn how we helped a client achieve a 67% reduction in overall solution cost with PCC’s AI-powered solution for SAP reporting.


PROCESS / EXPERTISE / INNOVATION


Jim Paschke
PCC CEO/Senior Data Architect


