Building a Data Fabric Across Public & Private Clouds
Michael Lasham
Experienced Technical Sales specialist covering data center compute, storage, networking, security and cloud technologies.
Today, many customers have started to realise that cloud alone is not the full answer, and that lift and shift does not deliver the promised benefits of cloud.
To get those promised benefits of cloud, our applications and data need to reach from the on-premises world, across private clouds and out to public clouds. Data should not be tied to any one location, and applications need to be designed to suit. Applications and data need to be in the right place at the right time.
New methods of deploying applications, such as Docker containers, have emerged that allow for fast deployment, easy scaling and equally fast tear-down. These containers allow customers to build micro-services which, put together, form an application. For example, your Amazon.com checkout is made up of several micro-services - "Shopping Cart", "Customers who bought XYZ also bought...", "Sponsored Products relating to your Cart". These are all containerised micro-services, and if one fails it does not stop the "Application" from working - you can still buy your product.
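To make the idea concrete, here is a minimal sketch of what one such micro-service might look like: a hypothetical, containerisable "shopping cart" endpoint written in Python. The service name, port and data handling are illustrative only, not Amazon's (or anyone's) actual implementation.

```python
# A minimal, hypothetical "shopping cart" micro-service sketch.
# A service like this would run in its own container and can be
# scaled, restarted or torn down independently of the others.
from http.server import BaseHTTPRequestHandler, HTTPServer
import json

CART = []  # in-memory stand-in for a real data store


class CartHandler(BaseHTTPRequestHandler):
    def do_GET(self):
        # Return the current cart contents as JSON.
        body = json.dumps({"items": CART}).encode()
        self.send_response(200)
        self.send_header("Content-Type", "application/json")
        self.end_headers()
        self.wfile.write(body)

    def do_POST(self):
        # Add an item (posted as JSON) to the cart.
        length = int(self.headers.get("Content-Length", 0))
        CART.append(json.loads(self.rfile.read(length)))
        self.send_response(201)
        self.end_headers()


if __name__ == "__main__":
    HTTPServer(("0.0.0.0", 8080), CartHandler).serve_forever()
```

Because the service only speaks HTTP and keeps no ties to a particular host, the same container image can run in a public cloud, a private cloud or on premises.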
Having an application that can only run in the public cloud may be of limited use. What if applications could be developed in the cloud but deployed using the same technology on premises, or written on premises and deployed in the cloud, or, more likely, written and deployed anywhere - whichever suits the business outcome?
Certainly the limiting factor of where the data lives should not stop this from happening, and that is where a company's Data Fabric comes into play.
NetApp has transformed itself over the last 4-5 years into a cloud software company. Sure, they still sell great storage appliances; however, they have invested heavily in cloud services. Just take a look at their cloud portal.
Customers can deploy Kubernetes clusters into AWS, Azure and Google Cloud, create high-performance volumes within those clouds and move data across those clouds, all through this one portal and all without owning any NetApp tin.
If they do own NetApp storage on premises, they can then use this portal to bring that data into their cloud - their on-premises storage is no longer isolated but can be seen as their private region of the public cloud. Kubernetes clusters and apps can be deployed across the hyperscalers as well as onto on-premises infrastructure, as the sketch below illustrates.
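As a rough illustration of what that looks like from the Kubernetes side, the snippet below requests a persistent volume through a NetApp-backed storage class using the official Kubernetes Python client. The storage class name "netapp-cloud-volumes" is a placeholder - the actual class name depends on how Trident, NetApp's CSI provisioner, has been set up in your cluster - but the same request works whether the cluster runs in a hyperscaler or on premises.

```python
# Hypothetical sketch: request a volume from a NetApp-backed storage class.
# Assumes the Kubernetes Python client is installed (pip install kubernetes)
# and that a NetApp/Trident storage class exists in the target cluster.
from kubernetes import client, config

# Placeholder name - substitute whatever storage class your
# environment exposes (cloud or on premises).
STORAGE_CLASS = "netapp-cloud-volumes"

pvc = {
    "apiVersion": "v1",
    "kind": "PersistentVolumeClaim",
    "metadata": {"name": "app-data"},
    "spec": {
        "accessModes": ["ReadWriteOnce"],
        "storageClassName": STORAGE_CLASS,
        "resources": {"requests": {"storage": "100Gi"}},
    },
}


def main():
    # Load credentials from the local kubeconfig; the same code runs
    # against an EKS, AKS, GKE or on-premises cluster.
    config.load_kube_config()
    api = client.CoreV1Api()
    api.create_namespaced_persistent_volume_claim(namespace="default", body=pvc)
    print(f"Requested 100Gi volume 'app-data' via storage class {STORAGE_CLASS}")


if __name__ == "__main__":
    main()
```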
With NetApp you can now run any application, on any protocol, using any storage media, anywhere you like, on your terms - your Data Fabric.
This is not vaporware or marketing fluff - it is real.
The latest addition to this suite is Fabric Orchestrator - what if you could easily protect all your data (in the cloud or on premises) with just a few clicks, identify cold storage that could be freed up to reduce cost, or create flows to move data from one cloud to another? See a demo below:
A number of NetApp services are available today in the public cloud, all of which can be deployed on premises too.
If you want to know more, drop me a line and I can help or put you in contact with a NetApp partner, or simply go to https://cloud.netapp.com and start a free 30-day trial of any of these services.