How to Monetize & Tokenize Data

This is a transcript of my talk at the MOBI Colloquium in Los Angeles on Nov. 12, 2019. Credits to Dr. Trent McConaghy and Dr. Dimitri de Jonghe for the inspiration and ideas to drive this project forward.

Vehicles are generating data in ever greater amounts. They are becoming sensing machines, both for the external environment and for the passengers and drivers inside.

This rich trove of data is under-utilized.

There are a host of use cases that are only now being explored by stakeholders — from the car makers and suppliers, to startups and high-tech giants. Cities also want the data to improve traffic patterns and quality of life. Not to be forgotten, insurers also have a big role to play in a future world of mobility.

Data sharing is going to happen one way or another. The only question is how?

Do we stick with existing, legacy models of data sharing, where the large incumbents make deals amongst themselves, or do we move to an open system of data sharing that gives everyone a fair playing field?

Existing legacy models are tedious, take a long time to negotiate and are opaque to most stakeholders.

An open system sets out clear standards for data sharing that anyone can agree to, while reducing the friction required for sharing to happen.

I think an open system is where we need to go. It’s self-evidently better for everyone because the more data can flow, the faster we can roll out innovations across the entire spectrum of use cases.

I’m going to share with you a picture of the four building blocks needed to make this vision of open data sharing a reality. This open system allows every data provider to monetize and tokenize their data, so that data itself can be a sustainable business model. Once data can be used more openly, we can then apply the tools of finance to it.

The opportunity is huge. McKinsey wrote in 2016 that car data monetization could be worth $450–750 billion by 2030. If this target is hit, that means a completely new set of Apples, Amazons, Googles and Facebooks emerging in the next decade, just for car data sharing.

In this new open data sharing future, will it be the existing car incumbents, the high-tech platforms or upstart startups who come onto the scene?

Why isn’t data being freely shared today?

Most obviously, there’s no clear financial incentive to share data that outweighs the barriers. Everyone senses that they could make more money if they shared, but something is missing or blocking the way.

Pricing of the data is unclear. Pricing signals are non-existent or unreliable in an illiquid market — so a lack of data sales begets more of the same.

Even if pricing is clear, there’s a question of trust. What is the data being used for? Do I give up privacy? Am I giving up company secrets? Trust entails a bunch of aspects related to custody, access, and usage for the provider, and things like data quality and provenance of the data for the consumer.

Assuming the first two problems of price and trust are solved, then the third barrier is process. How do I discover data and negotiate use?

Those, in a nutshell, are the three big reasons why data isn’t being shared — price, trust and process.

I once had a conversation with an EU parliamentarian and he asked me, what is the one thing that could help to kickstart a Data Economy?

I told him, have every CFO include, as part of their balance sheet for intangible assets, an estimate for the value of their data.

Think about it. Once every company CFO has this number in hand, you can be sure that they’ll start asking hard questions about why the data isn’t being better leveraged in some way or another. Maybe it triggers a change in mindset of hoarding towards sharing.

Of course, just having a new asset class on your balance sheet isn’t the only thing. We have to actually remove the barriers to data sharing.


What are the building blocks to unlock this potential?

I think there are 4 — framing, financialization, infrastructure, & protocol

1. Framing

It starts with the proper framing of data. Is data an asset? If so, what kind of asset is data?

If we look at assets, you can think of land, companies, intellectual property and physical objects.

Where does data fit in?

Clearly, it’s not land, companies or a physical object — thus it fits under intellectual property.

And what kind of intellectual property? Patent, trademark, copyright, or trade secret.

Data is definitely not a patent or trademark, leaving either copyright or trade secret.

Which is it?

In practice, it can be either a copyright or a trade secret, or both.

Framed in this way, data is an asset that is intellectual property, governed by the laws on copyright and trade secrets. The good thing is that, globally, these laws are relatively harmonized.

So if data is intellectual property, how can we handle it?

There are a few precedents to explore.


2. Financialization

In 1985, the King of Pop, Michael Jackson, bought the Beatles music catalogue of 251 songs for $47.5m. One little fact that few people know is that, ironically, it was Paul McCartney who had taught Jackson the value of buying rights to music already beloved by the public. McCartney and Yoko Ono declined to buy the rights, so Jackson swooped in. Today, that catalogue is worth well over $1 billion.

In the case of data, we’re not there yet. There isn’t a beloved set of data that everyone uses, that is always fresh rather than stale, and that people are clamouring for. But it does point to the idea of scarcity of rights.

Everyone can listen to music and even have a copy on their iPhone, but for someone to have the privilege to earn a profit off the music, they need to own the rights. This is the scarcity that can be enforced. The good thing is that, as I said, laws for copyright are well established.

The anatomy of this construct is that each song can generate a revenue stream. Those songs can be bundled together to form a catalogue and sold off as a package. In the future, the package could be recomposed — with new songs or entire albums added, or songs and albums spun out separately into other catalogues. But the core concept is that there can be only one owner. Let’s leave this for a moment, and move on to another example.

In 1997, David Pullman created the first “Celebrity Bond”, whereby an artist gets paid upfront and investors receive a fixed interest rate coupon secured by the rights to future royalties. The first person this bond was created for was David Bowie. It was an asset-backed security covering 287 songs from 25 albums. Investors subscribed to the bond, $55m was raised, and over 10 years the bonds paid out, always on time, after which Bowie got his royalty streams back.

Since 1997, this type of financialization of music royalties has exploded. Pullman is still at it, and many others are in the game, including Goldman Sachs and other private equity shops.

In some areas, such as music, the intellectual property can be bundled up as rights and sold off. They can also be used as assets to back a bond security, with the royalties and rights used to guarantee the payment.
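
To make the mechanics concrete, here is a toy sketch of how a royalty-backed bond like the Bowie Bond services its investors. The coupon rate and royalty figure below are assumptions for illustration only, not the actual terms of the 1997 issue.

```python
# Toy illustration of a royalty-backed ("celebrity") bond.
# The coupon rate and royalty income are hypothetical numbers.
principal = 55_000_000            # amount raised from investors
coupon_rate = 0.07                # assumed fixed annual coupon, for illustration
term_years = 10
annual_royalties = 9_000_000      # assumed royalty income pledged to the bond each year

annual_coupon = principal * coupon_rate          # what investors are owed each year
coverage = annual_royalties / annual_coupon      # how comfortably royalties cover the coupon

print(f"Annual coupon: ${annual_coupon:,.0f}")
print(f"Royalty coverage ratio: {coverage:.2f}x")
# At maturity the principal is repaid and the rights revert to the artist,
# which is how Bowie got his royalty streams back.
```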

Data is intellectual property.

Data is intellectual property that is waiting for the pricing, trust and process to be simplified enough so that it can start to be traded.

Once data can be traded, the tools of the financial system can be applied to create novel financial assets based on the revenue streams generated.

3. Infrastructure

Now, I’d like to introduce the third concept.

The first was “Data is intellectual property”, and the laws around IP are pretty clear and well-established.

Second, “Intellectual property can be financialized”: we have examples from music and other creative industries, and it’s a rapidly growing business.

This third concept is that these financial tools can now be deployed at a much faster pace than ever before, on the open public infrastructure that now exists.

The public infrastructure that is currently the most mature is Ethereum. Let me take you through an example to illustrate how this works.

The easiest way to think about this is to picture a hotel. When you check in to a hotel, you get exclusive use of the hotel room, say room 888, for the period of time that you’ve booked it. That is a non-fungible token. Access to room 888 is only via a single privilege that you control. This access privilege can be created using the ERC721 standard for non-fungible tokens. The access token gives you sole privileges to access the room.

Now let’s say that you have additional guests staying with you. You can create 2 other keycards to access room 888. These keycards “wrap” the primary rights for use of the room encapsulated within the ERC721 token in Ethereum. In fact, you could have 10 keycards printed giving access to room 888, but sole control of the room remains with you. The keycards are what we call fungible tokens — freely interchangeable with one another, like a dollar bill. Here, you’d use the ERC20 standard for fungible tokens on Ethereum.

Let’s take this example further. Say it’s a family reunion and you booked the rooms from 881 to 888 — a total of 8 rooms. You have your brothers and sisters, aunts and uncles, cousins all piled in.

You made the booking for everyone because you’re the master of ceremonies. Maybe you want to leave gift packs for everyone. Each room has 3 keys for the people staying inside. But to make things easier, the hotel just gives you one key that can access all 8 rooms. This is what we’d call a composable access token. It’s the basket of all the rooms — both the access privilege ERC721 and the access tokens (ERC20) for each room — which is the ERC998 standard.

There you have it.

With this combination of ERC721 (non-fungible tokens), ERC20 (fungible tokens) and ERC998 (composable tokens), you can create an infinite combination of access controls on top of an asset, including data.
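
To make the hotel analogy concrete, here is a minimal sketch in plain Python rather than Solidity. The names RoomPrivilege, Keycard and ReunionBundle are purely illustrative stand-ins for the ERC721, ERC20 and ERC998 tokens; a real implementation would be smart contracts deployed on Ethereum.

```python
# Plain-Python model of the hotel analogy; illustrative names only.
from dataclasses import dataclass, field

@dataclass
class RoomPrivilege:            # ERC721-style: one unique token per room
    room: int
    owner: str

@dataclass
class Keycard:                  # ERC20-style: interchangeable access tokens
    room: int
    holder: str

@dataclass
class ReunionBundle:            # ERC998-style: a basket of privileges plus keycards
    owner: str
    privileges: list = field(default_factory=list)
    keycards: list = field(default_factory=list)

# You check in and receive sole control of room 888
room_888 = RoomPrivilege(room=888, owner="you")

# You issue keycards for your two guests; control of the room stays with you
guest_cards = [Keycard(room=888, holder=g) for g in ("guest_1", "guest_2")]

# For the family reunion, rooms 881-888 and their keycards are bundled into
# one composable token that a single master key can act on
bundle = ReunionBundle(owner="you")
for room in range(881, 889):
    bundle.privileges.append(RoomPrivilege(room=room, owner="you"))
    bundle.keycards.extend(Keycard(room=room, holder=f"family_{i}") for i in range(3))

print(len(bundle.privileges), "room privileges,", len(bundle.keycards), "keycards in one bundle")
```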

Going back to data for cars, assume each car has an ERC721 token for access privileges to the data generated. The ERC721 covers all data generated, for the life of the car. The ERC721 can be sold off or kept by the original owner.

If the ERC721 token is financialized, the owner can choose to issue 10 ERC20 tokens for usage rights — say for the car manufacturer, a couple of mechanic shops, the city of Los Angeles, 2 insurance providers, toll collectors and a few others. These ERC20 tokens are interchangeable, transferrable and sellable.

Then bring in someone like David Pullman, who starts collecting ERC721 access privileges and ERC20 tokens for data streams — say for drivers in LA — and sells them as a bundle to the City of Los Angeles for better traffic management.

What has happened, essentially, is that the car data has been both tokenized and monetized — using the existing tools of finance on top of public decentralized infrastructure like Ethereum.
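
As a rough illustration of that bundling step, here is a sketch of a broker collecting usage tokens from many drivers and pricing the bundle for a single buyer such as a city. The fee and discount are hypothetical numbers, not a proposed pricing model.

```python
# Illustrative sketch of bundling drivers' data-usage tokens for one buyer.
from dataclasses import dataclass

@dataclass
class UsageToken:                 # ERC20-style usage right for one car's data stream
    vin: str
    monthly_fee_usd: float        # hypothetical price set by the data owner

# A broker collects usage tokens from 10,000 LA drivers
la_tokens = [UsageToken(vin=f"VIN-{i:04d}", monthly_fee_usd=3.50) for i in range(10_000)]

# The bundle is offered to the City of Los Angeles at an assumed 20% volume discount
gross = sum(t.monthly_fee_usd for t in la_tokens)
offer = gross * 0.80
print(f"{len(la_tokens)} data streams, list price ${gross:,.0f}/mo, bundle offer ${offer:,.0f}/mo")
```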

Where financialization comes in is that once this infrastructure is applied to data, you can sell the ERC721, ERC20 and ERC998 tokens according to the pricing model that suits you or the consumer.

The radical suggestion I’m making is that as soon as the data sharing problem for car data is solved, we will head rapidly to a point where there are asset-backed bonds and lending based on data.

4. Process

What is the last piece?

It’s the pricing, trust and process of unlocking data. This is a hard but solvable piece, because it requires not only new technology but also a shift in mindset from closed to open.

My team and I have spent the last 6 years trying to crack this nut. We think we have a reasonable first attempt at making it happen.

The fourth piece is a protocol that implements access control on top of data, so that the ERC721, ERC20 and ERC998 tokens can be fully utilized. Let me unpack this.

Access control is the key piece that governs whether or not data can be used. For access control to work, the data provider needs to be able to define the conditions of access, and then rely on the access control tool to enforce the rules.

I believe that this access control protocol needs to be decentralized in order to scale globally. It needs to be completely trustless, not relying on any central party, and built on blockchain technology so that anyone can use it, so that it is censorship resistant and, most importantly, so that it isn’t controlled by any single nation state or mega-tech company.

This access control layer is Ocean Protocol.

Ocean Protocol has been in development since late 2017, when our CTO, Trent McConaghy, imagined what could happen if AI, blockchains and data could merge. The result is an elegant set of smart contracts that give data providers full flexibility in defining how consumers can access data, on their terms.

At the heart of Ocean Protocol are what we call Service Execution Agreements (SEAs). Another term we could have used is Decentralized Policy Stores, since they hold the access policies for each dataset. In terms familiar today, they’re simply Service Level Agreements for the data and the rights to access it, except machine readable.

These smart contracts govern access to the datasets, and the grease in the system is the Ocean Token, which can be used to pay for services such as data, compute and storage.
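
To give a flavour of what “machine readable” means here, below is a hedged sketch of the kind of fields such an agreement might carry and how the provider-defined access conditions could be enforced. The field names and identifiers are illustrative assumptions, not the actual Ocean Protocol SEA schema.

```python
# Hypothetical sketch of a machine-readable service agreement for data access.
from dataclasses import dataclass

@dataclass
class ServiceAgreement:
    dataset_id: str          # reference to the dataset being offered
    provider: str            # address of the data provider
    consumer: str            # address granted access
    price_ocean: float       # payment in OCEAN tokens
    access_type: str         # e.g. "download" or "compute"
    expires_unix: int        # when the granted access lapses

    def access_allowed(self, who: str, now: int) -> bool:
        """Enforce the provider-defined conditions: right consumer, not expired."""
        return who == self.consumer and now < self.expires_unix

sea = ServiceAgreement(
    dataset_id="did:op:example-vehicle-telemetry",   # illustrative identifier
    provider="0xProvider", consumer="0xCityOfLA",
    price_ocean=250.0, access_type="compute", expires_unix=1_700_000_000,
)
print(sea.access_allowed("0xCityOfLA", now=1_650_000_000))   # True
```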

Here you have the last piece: a trustless protocol that allows data providers to start experimenting with sharing data on their own terms, using a defined process that is both transparent and auditable.

Ocean can be deployed in the open, publicly — or within a company for more private data sharing. You can have consortiums like MOBI where there are multiple parties involved.

We’re still at the early stages of the technology, but once Ocean is more mature and tested out, all the other pieces — the infrastructure of Ethereum, the financialization tools for securitization and the framing of data as intellectual property — coalesce into a powerful architecture for the monetization and tokenization of data.


Closing

Someone once said, “the car is the browser for the physical world”. And it’s true. So many of our experiences with the world happen through the windshield of a car. Maybe it was getting driven around for soccer practice as a child. It could have been the first time the family visited the Grand Canyon.

We might have had our first date, cruising around.

In a new world, these experiences will still be with us.

But how could these experiences change?

Will the cars be safer, so that you never ever have to hear the words “Drive safely!” again?

Will it be more immersive, with various modes that passengers can select, such as a “tourist mode” where the historical details of each neighbourhood and building are narrated by an Alexa-like voice?

Streaming through this new world, is data.

Data that is monetized and tokenized. And just like the tools of finance have been used to make buying and selling homes, businesses and products easier, the tools of finance can be used to make the buying and selling of data more profitable, while building on open and transparent infrastructure.

Let’s make this vision a reality. I look forward to the day when we can celebrate the “Ballinger Bond”* for automotive data.

* Chris Ballinger is the Founder / CEO of MOBI who formerly worked on bond issuance for Bank of America.


Ready to learn more? Dr. Trent McConaghy has written a 3-part series of blogs explaining these concepts in more detail - data custody, data tokens and data & DeFi.

The Data Economy Challenge is open! 3 tracks. 3.4 Million OCEAN tokens and 6K Euros in prizes. Register today and help build the new Data Economy.


Ready to dive in? — Check out the Ocean Documentation to get a sense of what we’ve been working on, and fire all your technical questions in our Gitter chatroom.

Follow Ocean Protocol on Twitter, Telegram, LinkedIn, Reddit, GitHub & Newsletter for project updates and announcements.

Sheridan Johns

Building the Open Data Economy

5y

Thanks Bruce. Indeed, the future of data monetisation will mark the shift away from data ownership towards data access. 2020, the year of the MOBI Ballinger Bond?

Real question for you. Do folks “get” this when you discuss it w them? We are raising a Series A round and I do find getting investors to this level is a challenge. I’m just not making it clear how clearly the future will be where you are discussing and I need better and shorter sentences.

The methodology and market to create data products for the data asset market is exactly what we do at DrumWave. This narrative is what we testified before the US Senate re personal data. These streams of thought & work will merge, where DataCap is as powerful an indicator of success as MarketCap, or more.

Lucy Hakobyan, MBA

Innovation and Emerging Tech

5y

Thank you Bruce Pon for the truly great presentation!

Rob Begley

Info Mgt; InfoSec; BIM; Problem Solver; Hemp Advocate; Mullingar Man; Always Learning!

5y

Data + Blockchain = The Future! Thanks for posting Bruce Pon
