Segmentation - Is it a lost art?
The great Adrian Cockcroft has a very meaningful saying: "Don't do your own undifferentiated heavy lifting." This is particularly true when it comes to running, managing and operating infrastructure and infrastructure software.
One would seek a high degree of automation, portability, elasticity and efficiency, along with services that let you rely on others. Many folks are enamoured with the Netflix effect, and for good reason. But Netflix uses that capability to its business advantage in a very meaningful way; it is not just about automation and scale.
They enable their business to do things others cannot, and the most important thing they do is understand their users better than anyone else understands theirs. Simply put: being web scale and cloud native matters for your business.
You're then instrumented from the start to take advantage of the new computers in this world that are vastly abundant and cheap (i.e., any cloud of your choice) to build new applications and algorithmically reason over data at scale.
A fundamental yet powerful aspect is knowing your users. This is not possible without a segmentation model built for the current environment; the old world and the old way do not apply. It requires not just prediction at extreme scale and in memory, but also a dimensional model with feature selection, variables and feedback loops that are flexible enough to change at will, at scale.
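To make the idea of a segmentation model concrete, here is a minimal sketch: clustering users into segments from a couple of behavioural features. The feature names and values are hypothetical illustrations, not anything from a real system, and a production model would use far more dimensions and a proper ML library; this is just a bare k-means in pure Python to show the shape of the problem.

```python
import random

def kmeans(points, k, iters=20, seed=42):
    """Minimal k-means: segment users by their feature vectors."""
    rng = random.Random(seed)
    centroids = rng.sample(points, k)
    clusters = [[] for _ in range(k)]
    for _ in range(iters):
        # assign each user to the nearest centroid
        clusters = [[] for _ in range(k)]
        for p in points:
            idx = min(range(k),
                      key=lambda i: sum((a - b) ** 2
                                        for a, b in zip(p, centroids[i])))
            clusters[idx].append(p)
        # recompute centroids (keep the old one if a cluster emptied)
        for i, cluster in enumerate(clusters):
            if cluster:
                centroids[i] = tuple(sum(dim) / len(cluster)
                                     for dim in zip(*cluster))
    return centroids, clusters

# hypothetical user features: (sessions_per_week, avg_watch_minutes)
users = [(1, 5), (2, 8), (1, 6), (20, 90), (18, 85), (22, 95)]
centroids, clusters = kmeans(users, k=2)
```

The point of the sketch is the feedback loop: because the features are just columns, you can swap them, add new ones, and re-segment at will as the environment changes.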
I have seen many companies struggle with this, as the new-world transformation is challenging as a standalone function. However, it is mandatory if you want to understand how your users drive ad revenue, new product creation, customer marketing, content delivery and application development.
In the end you're seeking to get your users the content they want, where they want it, how they want it, and to deliver them new products they will consume. This is easily said but hard to do. I am a sponge for good content. I want it. I need it. I thrive on it. However, why do my most trusted sources not realize this at a granular level?
I am in a fortunate spot where I get to see how some of the world's leading media brands are going about this, and then help them with the transformation. Many struggle just to get a simple customer fact table that is tens of millions of rows deep and thousands of columns wide. The process of iterative feature selection and supervised learning over that table is so widely overlooked that folks are not capturing "the event" as data that needs to be weighted far more heavily than any of the previous features.
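To illustrate what "weighting the event" can mean in a supervised model, here is a hedged sketch: logistic regression trained with per-row sample weights, so fresh interaction events count more than stale profile features. The feature, labels and weights are invented for illustration; a real pipeline would train over the full fact table with a proper framework, but the per-sample weight in the gradient is the essential idea.

```python
import math

def train_weighted_logreg(X, y, weights, lr=0.1, epochs=500):
    """Logistic regression via SGD with per-row sample weights,
    so event rows can be weighted more heavily than older features."""
    w = [0.0] * len(X[0])
    b = 0.0
    for _ in range(epochs):
        for xi, yi, sw in zip(X, y, weights):
            z = sum(wj * xj for wj, xj in zip(w, xi)) + b
            p = 1.0 / (1.0 + math.exp(-z))
            g = sw * (p - yi)  # sample weight scales the gradient
            w = [wj - lr * g * xj for wj, xj in zip(w, xi)]
            b -= lr * g
    return w, b

def predict(w, b, xi):
    z = sum(wj * xj for wj, xj in zip(w, xi)) + b
    return 1.0 / (1.0 + math.exp(-z))

# hypothetical fact-table rows: one feature, "reacted_to_clip";
# the label is whether the user went on to consume the product
X = [[1.0], [1.0], [0.0], [0.0]]
y = [1, 1, 0, 0]
event_weights = [2.0, 2.0, 1.0, 1.0]  # event rows weighted 2x
w, b = train_weighted_logreg(X, y, event_weights)
```

Doubling a row's weight here is equivalent to seeing that event twice during training, which is one simple way to let recent behaviour dominate older signals.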
If you created a new product to deliver 60-second clips of sports content and I reacted or did not react, what do you do with that data? Is it driving your new product decisions?
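Capturing that reaction starts with a record of it. The sketch below is a minimal, hypothetical event shape and one obvious aggregation over it: a per-product reaction rate that could feed new-product decisions. The field names and product identifier are invented for illustration.

```python
from dataclasses import dataclass
from collections import defaultdict

@dataclass
class InteractionEvent:
    user_id: str
    product: str      # e.g. "sports-clips-60s" (hypothetical identifier)
    reacted: bool     # did the user engage with the clip?
    timestamp: float

def reaction_rate_by_product(events):
    """Roll raw events up into a per-product engagement signal."""
    totals = defaultdict(lambda: [0, 0])  # product -> [reactions, impressions]
    for e in events:
        totals[e.product][1] += 1
        if e.reacted:
            totals[e.product][0] += 1
    return {p: r / n for p, (r, n) in totals.items()}

events = [
    InteractionEvent("u1", "sports-clips-60s", True, 0.0),
    InteractionEvent("u2", "sports-clips-60s", False, 1.0),
    InteractionEvent("u3", "sports-clips-60s", True, 2.0),
]
rates = reaction_rate_by_product(events)
```

The non-reaction is recorded as deliberately as the reaction; without both, the rate is meaningless and the product decision is a guess.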
I see too many people spending too much time on automation, orchestration and consolidation. Important, for sure, but as the great James Watters says, doing that without scalable services in mind is just like repainting your car.
Instead, let's work together to build a new car that can take you places you have never been and fundamentally change your world.