How technology is helping the fight against Child Sexual Abuse Material (CSAM) with Krunam CEO Chris Wexler

Nicole Bremner: Chris, this is a difficult and complex conversation to have. The most recent statistics, from 2019, show that nearly 30 million images and over 40 million videos of child sexual abuse material (CSAM) were released online. That's a lot.

Chris Wexler: Yes, and it's grown dramatically. It was about 10 million just four or five years ago, and even that number was probably artificially low. There are two reasons it exploded. One is that we have companies paying attention and looking for it for the first time. Many people give Facebook a lot of grief, and it's often earned, but in the case of fighting the inhumanity of CSAM, they've done a really good job.

They had 20 million reports last year, and 20 million sounds enormous, but they have billions of people on their platform. The most surprising part is that it's not necessarily on the dark web or hidden; it's on everyday tools that you and I use. It's on Google, Facebook, and Microsoft; perpetrators are hiding in plain sight, using all the same tools to commit these crimes. To hold CSAM is a crime in and of itself, but it's also documentation of the worst moment of someone's life. So every time that gets shared and viewed, that victim is victimised again.

We recognise this as part of the cycle, and we're just trying to break that part of the cycle. Part of what drives the production of this is people wanting to trade images. If it's hard to trade images, you're going to be less likely to get into that lifestyle. In the end, we're trying to protect the kids.

COVID has been the other element driving this forward: people stopped travelling and moved more to virtual, just like all of us - we've moved to more streaming, more Zoom calls. It's one of the reasons the industry has become so vigilant, and why tools like Krunam have become even more vital to policing this content.

Q: You've been an advocate in this fight for such a long time. What was the reason you got involved in fighting CSAM?

CW: Via human trafficking. My brother started a nonprofit 15 years ago now. I knew that human trafficking happened because I'd seen it in movies and whatnot, but I didn't really know the scale of it. Then I realised that one of the worst toxic wastes of capitalism is that there are still 30 to 40 million people enslaved in the world in various supply chains - rubber, chocolate, fill in the blank - and obviously the illegal sex trade, and that 70% of the victims are women and children. Once you realise the scale and the scope of the problem, you just have to get involved.

I was like, well, what can I do? At the time I had transitioned into advertising, which is not precisely the most impactful field, right? But I was consulting on the backend with organizations and companies that had started to fight this, and I got pulled into that ecosystem. Some people have a Eureka moment - I had a Eureka decade where I just slowly evolved into it.

Q: Take us through Krunam, and what exactly does it do to help with this fight?

CW: We are a technology company. We build tools to make it easier to mitigate, stop and investigate the inhumanity of CSAM. It originally started with two of our co-founders: Benjamin Gancz, a CSAM investigator in the UK, and Scott Page, our CTO, who was an expert in computer vision, AI and deep learning.

They had a Eureka moment at a conference. Ben realised that 70 to 80% of his time as an investigator was spent going through confiscated materials. They'd raid a perpetrator and find terabytes of images and videos. Then they'd have to go through and determine: is it illegal? What class is it? Can we identify the person? That left only 20% of his time for what humans do better, which is actually investigating complex crimes. So you ended up having to triage the investigations to the most heinous, the easiest to determine, or the youngest victims - sometimes as young as two. It's mind-boggling when you have to triage something that important.

Luckily, someone at the UK Home Office had started collecting images in a database called CAID (Child Abuse Image Database) in 2012. They started essentially bringing together all the images that they found. But it was just there as a kind of reference, and you'd have to connect things by hand to do what we were able to do. Scott ended up working on this on a pro bono basis at first, applying some of the latest in computer vision, deep learning, and AI to essentially train an algorithm to look at a picture and tell you what it is.

If you think about it on a very simple basis, you can show it a million squares, and every time there's a square, you go, yeah, that's a square. We've all heard some of the horrors with facial recognition and how AI struggles with that. There was a legitimate question whether the technology was up to the task, because you're looking at some fairly unique signals to infer that this is illegal content: the amount of skin showing, body positioning, and the relative size of bodies.
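
For readers curious what "training an algorithm to look at a picture" involves in practice, here is a minimal sketch of the general transfer-learning approach: take a network pre-trained on generic images and retrain only its final layer on new labels. The dataset path, labels, and hyperparameters below are hypothetical, and Krunam's actual model and training data are of course not public.

```python
# Minimal transfer-learning sketch: retrain only the final layer of a
# pre-trained network on a new labelled dataset. Paths, labels, and
# hyperparameters are hypothetical; this is not Krunam's model.
import torch
import torch.nn as nn
from torch.utils.data import DataLoader
from torchvision import datasets, models, transforms

transform = transforms.Compose([
    transforms.Resize((224, 224)),
    transforms.ToTensor(),
    transforms.Normalize(mean=[0.485, 0.456, 0.406],
                         std=[0.229, 0.224, 0.225]),
])

# Hypothetical dataset laid out as one folder per class label.
train_set = datasets.ImageFolder("data/train", transform=transform)
loader = DataLoader(train_set, batch_size=32, shuffle=True)

# Start from a network pre-trained on generic images ("a million
# squares"), then retrain just the classification head for our labels.
model = models.resnet50(weights=models.ResNet50_Weights.DEFAULT)
for p in model.parameters():
    p.requires_grad = False  # freeze the pre-trained backbone
model.fc = nn.Linear(model.fc.in_features, len(train_set.classes))

optimizer = torch.optim.Adam(model.fc.parameters(), lr=1e-4)
loss_fn = nn.CrossEntropyLoss()

model.train()
for epoch in range(5):
    for images, labels in loader:
        optimizer.zero_grad()
        loss = loss_fn(model(images), labels)
        loss.backward()
        optimizer.step()
```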

It was a Eureka moment when they came out of the lab and realised we were onto something. That work eventually evolved into a paid project with the UK Home Office, and the technology is now deployed nationally in the UK.

During COVID, we realised we needed to bring this out further, because all of this content is out there and propagating as it gets spread. I don't think most people realise that when you hit a report or flag button on a site for something that shouldn't be there, the process is surprisingly manual. It literally sends those images or that text to a human reviewer, usually in the developing world - Indonesia or the like - where a low-paid, not very well-trained person determines whether it's bad enough. Then it gets escalated to another human being.

With CSAM, it didn't make sense to us that an image thousands or millions of people might see online also has to be looked at by other humans all day, every day. Those moderators see the worst of humanity. Typically, people in these content moderation jobs last about nine months before they get PTSD and have to quit. It's the worst job you can imagine. Then it goes to a trained person who suspends the offender or reports it to authorities, following the protocol.

Our tool stops it from ever getting posted. You put it on the upload process, so that eliminates the thousands of views online. It can also stop the frontline employee from having to look at it, sending it directly to a trained individual who can mitigate the situation. We still want a human being involved in this - even though AI is at the point where it could just do this automatically, you have to have some human somewhere in the chain - but we're reducing exposure within an organisation by 20 to 30 times by having a tool do it. Computers do a much better job than the human brain on large-scale repetitive tasks with a low probability of finding something. When we went into real-world testing, we were surprised that our classifier outperformed human classification, because after 10 minutes of seeing this material, you just get exhausted.
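
As a rough illustration of where such a tool sits, here is a hedged sketch of an upload-time gate. The classifier stub, thresholds, and outcome names are invented for this example and are not Krunam's actual API or policy; the point of the design is that a human still makes the final call on anything uncertain, while no human ever has to see the confident matches.

```python
# Hedged sketch of an upload-time moderation gate. The classifier,
# thresholds, and outcome names are hypothetical, not Krunam's API.
from dataclasses import dataclass

@dataclass
class Verdict:
    label: str         # e.g. "benign" or "illegal"
    confidence: float  # classifier score in [0, 1]

def classify(image_bytes: bytes) -> Verdict:
    # Stand-in for the trained model; always returns "benign" here.
    return Verdict(label="benign", confidence=0.0)

BLOCK_AT = 0.95   # confident hits are never published
REVIEW_AT = 0.50  # uncertain cases go straight to a trained specialist

def handle_upload(image_bytes: bytes) -> str:
    verdict = classify(image_bytes)
    if verdict.label == "illegal" and verdict.confidence >= BLOCK_AT:
        # Blocked at upload: the content is never posted, so the
        # thousands of downstream views never happen.
        return "blocked_and_reported"
    if verdict.label == "illegal" and verdict.confidence >= REVIEW_AT:
        # Routed past frontline moderators to a trained specialist -
        # this is where the reduction in human exposure comes from.
        return "queued_for_specialist_review"
    return "published"
```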

It's a brutal way to live and to work. We think it's vital and humane, for the victims of these crimes and for the workers fighting for them, that people don't have to spend so much time with this material. We can also give warnings: if it's a particularly heinous image, we literally warn the reviewer to be careful, that this will be bad. We're trying to strip out the pain of fighting this as well as the pain of having your content distributed.

Some big companies are building their own tools for other elements, but luckily they're not allowed to keep any CSAM they run across. So by using government databases and legally obtained, privacy-safe data, we solve a data problem for companies as well: they don't have to break the law to build the same kind of tools we're building.

Q: Does Krunam also allow for the trawling of a site to look for potentially dangerous images?

CW: Yeah. That's one of the use cases we're actually talking to a major company about right now: proactively going out and scanning. It's a very flexible tool - we can run any image. And we're the only ones that can do video. Over 60% of CSAM is now video, because we all have a 4K camera sitting in our pocket. It really went from almost all images just a few years ago to over 60% video in the last couple of years. Our solution can do this because it's different in nature from the current technology: we can look at a video that was created five minutes ago and identify it.

The current technology has been in the news a bit with Apple announcing that they would scan for CSAM using a really nice technology called perceptual hashing - PhotoDNA. It's something Microsoft built back in 2009, but it only looks at known images. So if something has already been found and determined to be CSAM, it goes, oh, this is something I know. However, most CSAM is newly generated. That's frankly the most pressing material to get into authorities' hands, because those children are still at risk. And so those are the images we want to prohibit as quickly as possible.
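
PhotoDNA itself is proprietary, but the idea behind perceptual hashing is easy to illustrate. Below is a simple "difference hash" (dHash), offered only as an assumption-laden stand-in for the real algorithm. It shows why hash matching catches near-duplicates of already-known images but, by construction, can never flag newly generated material.

```python
# Illustrative perceptual hash (dHash) - not PhotoDNA, which is
# proprietary, but the same family: it flags near-duplicates of
# known images and, by construction, never brand-new material.
from PIL import Image

def dhash(path: str, size: int = 8) -> int:
    """Hash an image by comparing adjacent pixel brightness."""
    # Grayscale, shrink to (size+1) x size so each row yields
    # `size` left-vs-right comparisons.
    img = Image.open(path).convert("L").resize((size + 1, size))
    pixels = list(img.getdata())
    bits = 0
    for row in range(size):
        for col in range(size):
            left = pixels[row * (size + 1) + col]
            right = pixels[row * (size + 1) + col + 1]
            bits = (bits << 1) | (left > right)
    return bits

def hamming(a: int, b: int) -> int:
    """Count differing bits between two hashes."""
    return bin(a ^ b).count("1")

# Hypothetical database of hashes of previously identified images.
KNOWN_HASHES: set[int] = set()

def is_known(path: str, threshold: int = 10) -> bool:
    # A small Hamming distance means the upload is a near-duplicate
    # of a known image; new imagery simply misses the database.
    h = dhash(path)
    return any(hamming(h, k) <= threshold for k in KNOWN_HASHES)
```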

Q: What are some things that we can do as good human beings going about our own business to help with this fight?

CW: One thing is, if you see something, say something. If you do see content on a site, make sure to report it. I think that's really critical. The other element will be a little bit of a surprise: don't be afraid to talk about this problem. This is hard, and it's not popular at a dinner party, right? But the more people know it's a real problem, the more of a priority it's going to become.

When I started getting involved in fighting human trafficking, it was a completely invisible crime. In the last 15 years it's become much more visible, largely because many people in organizations have started talking about it as a real problem.

Take WhatsApp, an end-to-end encrypted messaging system - think long and hard about that. When you talk about privacy rights online, there are extremes on both sides that are really dangerous for society. If everything is being shared with a company or government, that's a very dangerous thing.

You know, nobody wants to get into a George Orwell 1984 scenario. Equally dangerous is if we go completely end-to-end encrypted and fully private, because that protects criminals, protects oligarchs, and lets the powerful hide their crimes. It's really critical for platforms and society to have a real, nuanced conversation about privacy. I was heartened by Apple deciding that it was time to scan, and then really disheartened by their response to a very small vocal minority that wants the internet completely private. They backed off.

My opinion, and Krunam's opinion, is that if you own and operate a platform, and you're profiting off advertising or subscription sales, it's your responsibility to make sure there aren't illegal activities happening on that platform. If everything is entirely encrypted and opaque to authorities, or even to the company itself, that is where the criminals go. They know they can hide, and we can't create an entire ecosystem for criminals to thrive in; it's going to damage us as a society and as a culture. I think too many of the arguments around privacy claim that if we allow any scanning, the government will be kicking down your door. That's a slippery slope argument that isn't fair. These are two competing goods. We need to have privacy where appropriate: financial transactions, conversations with your doctor, and legal conversations are areas where privacy is appropriate. If you and I are just sharing recipes and don't like the idea of Google looking over our shoulder, that's probably not a place where absolute privacy is essential.

But the big platforms that have used us as a product and mined all that data have created a distrust that drives people to privacy. I would say it's a privilege to have your unease be about Google or Facebook knowing something about you. It's a real privilege not to have to worry about being abused in that situation or having your image sent over those platforms. So we, as a culture, need to have a nuanced conversation about those two competing goods, online privacy and security, and draw the line at a point where we're at least watching for illegal activity. Everyone's going to draw that line a little differently, but I think the great political challenge of our next ten years is figuring that out globally.

Q: As you said, this is becoming more and more topical, especially with COVID, COVID vaccines, vaccine passports, and entry requirements for travel and events.

CW: You know, I think it's really pushing for common-sense reform of the marketing technology space. That's my background - I'm one of the culprits in leveraging people's data. Over the years - I think I added it up once - my teams placed over $10 billion in advertising. I was one of the first clients of Facebook. I was one of the first clients of Google. I see the power of that for business. I think that industry, which I was part of, too often wasn't honest enough about what it was doing.

As a result, when people found out, they freaked out, and at times bad actors were allowed to abuse that data. I was always careful because I was working for big companies; I'm like, if I do this and it's on the front page of the Wall Street Journal, that's a problem. We would always shy away from the scariest stuff. But I think, unfortunately, this is the cost of having a fully privatised public space.

When you and I go to the park and have a conversation, that's a public space - nobody owns it. But when we go online and talk, we're talking on a company's equipment, and they have certain rights when we do that. And frankly, certain responsibilities. That's part of the trade-off. Unfortunately, I think the US government in particular is really lagging in regulating those organizations, and it's time for us to do something about that. It's going to be hard - in the US especially, it's coloured by, I think, the insanity of our political system at the moment. I believe steps like GDPR and California's CCPA are good first steps, but we still need to hold these companies responsible.

I think they see it coming. They're investing billions of dollars into what they call trust and safety, or integrity, to ensure their communities are safe. They realise it's an existential threat, not only from regulation but from every one of us going online.

We're at the early stages of that. It's just going to take time for us to figure it out. I'm a student of history: when a technology comes into the culture, it usually takes us about 30 years to figure out what it means. It took about 30 years for us to figure out radio. It started with shortwave. Then the Navy got mad, saying a ship sank because a shortwave radio operator made a mistake - in fact, that was one of the controversies around the sinking of the Titanic. So they heavily regulated it, and we ended up with our kind of commercial radio system. That was the way they controlled it.

As for television, when Gilligan's Island was first broadcast, the Coast Guard got 45 calls from people worried about these people lost offshore. It seems laughable to us now, but at the time it felt so real to people that they didn't know what to do with it. By the 1970s, we had settled into what television really meant in culture and society.

We're probably just entering the third decade of the internet being broadly adopted. The first ten years were the hobbyists, nerds like me. Then the social media era ramped up, and we realised we could do so much with this - how amazing the tools are. This last decade is about refinement, figuring out how to make this work as a society.

I hope that we learn from the past and don't just hand this to a couple of large companies to control completely, that we still allow some of the innovation and spirit that's been so great on the internet, but in a way where users are protected. It's an exciting time. It's an important time, and it will have implications for our society for decades to come.

Q: Where does the name Krunam originate from?

CW: Whenever you start a company, you're like, "What are we going to name it?" We went back and forth, and we realised that a woman in Thailand had done amazing work that we were all inspired by. Her name is Kru Nam. She was a street artist up in Chiang Mai - doing well, making money, selling paintings to tourists. But as artists do, she took on side projects. She worked with the street kids there and taught them how to paint, and they asked, "What are we going to paint?" Kru Nam replied, "Paint your life." She was shocked by what they painted. She realised most of them were being trafficked in the local karaoke bars that were fronts for child abuse and exploitation.

Unlike 99.9% of humans, including myself, she just started walking into the karaoke bars and pulling the kids out. She had 20 kids in her tiny apartment in Chiang Mai when the traffickers came to her and said, "If you pull out one more kid - one more kid - we're going to kill you." With that, she fled to the northern part of the country.

She now has a compound there with three buildings and a school. She has saved thousands of kids. In fact, one of the first kids she pulled out is now a university graduate - the first stateless child in the history of Thailand to graduate from a university. She is constantly compelled to do the right thing and to change her tactics to save more kids.

We decided that a way to honour her was to take what she's doing in real life and bring it to the virtual world: the spirit of being compelled to do the right thing and constantly innovating to make that impact larger every day. We're just trying to live up to her example.

Q: What an incredible lady and what a legacy she's leaving! Does she know about you? Do you know her? Have you met?

CW: Yeah, we know each other. In fact, when we asked her, "Can we name our company after you?" she said, "Yeah, that's great. The more westerners know about me, the less likely the police will continue to give me problems here." She's fighting every day, and a little bit of fame in the western world protects her from the forces that are still protecting traffickers in Thailand. I've met her several times, and once we've controlled COVID enough, we all plan to make a little pilgrimage and go help out at the compound, because she's a great woman.

Q: You articulated very beautifully why you took on the name Kru Nam and why you set up this social enterprise. Can you explain what it is about social enterprise, and why you believe this is the next wave of social business over the coming decades?

CW: Yeah, I think there's a breakdown, driven by regulation, between the business and nonprofit worlds: you're either all about money or all about social good. If we look back, it's an accident of history - the tax laws pushed people that way. As a result, brilliant people had to make a decision: am I going to excel financially, or am I going to focus on my passion and be fed in other ways - if not literally fed, emotionally fed? What has happened is sad, and it's the reason one of our joint venture partners, Just Business, was created: the non-profit world's agenda has been hijacked by the people funding it.

There are large private foundations, the government, and donors. Instead of necessarily doing the best thing to solve a problem, nonprofits do the best thing to raise money, because that's how they have to live, hand to mouth. Just Business looked at that and decided it just isn't right. They started their first company, REBBL, ten years ago now - here in the States they say it's the fastest-growing natural drink company. It's made from roots, extracts, berries, barks and leaves. They started it - and this will sound crazy - to fight human trafficking in the Amazon. Why would a drink company get started to do that?

It's not just a profit share; it's literally built into the DNA of the company. They were on a project in Lima, Peru, and found that 80% of the trafficked kids there were from one remote area, thousands of miles away in the Amazon. They went up the river to figure out what was going on and found environmental degradation: companies buying up land to protect it, kicking off the people who had lived there for thousands of years and destroying natural economies. As a result, these kids were exposed to horrible economic and social harm - they were being sold simply because they couldn't be fed.

So REBBL created a resource extraction company that brought out roots, extracts, berries, barks, and leaves to sell on the open market. They set it up and then handed it over to the local community to run, and it became their company, with REBBL promising to buy materials from them. In the last five years, there have been zero children trafficked out of that area of the Amazon. This is about breaking cycles of abuse, which are often driven by environmental and economic strain.

They've worked on the same loops with companies in the Netherlands tied to Eastern Europe, where women were getting trafficked into the red light district. You see the same in Africa and Thailand. They've been building companies with a social mission that can scale enough to actually deal with the problem - not just pull people out once they've been damaged, but stop the problem at its root. That's the reason we're here at Krunam: to prevent this problem online.

We can't rely on a hundred people donating $30 a month. We need the power and the scale of capitalism to do that. We need to be able to raise and make money. Then we can have the dual goal of making money, pulling in the best talent, and solving problems at scale. When you're talking about 70 million images and videos a year, that's a problem at scale, and that's just not going to be taken care of with a traditional charity approach. So that's why we did it: we can tap into every bit of the power of capitalism to fight some of the toxic waste of capitalism.

Q: What's something that we can all do in order to filter out some of these companies?

CW: There are a lot of charity rating services. I struggle with them sometimes, because some of them just want low overheads, but sometimes overhead is what you need to solve a problem. I think it's about talking to organisations and asking the hard questions. I would get to know them rather than doing a cursory look online - give them a call, because they're usually amazing people who are fun to talk to, and you can learn something amazing by asking the hard questions.

I think one of the fantastic bits of power of the millennial generation, as they've entered the investment world, is that they've cared. They've directed their financial advisors to care about ESG (environmental, social, and governance) when they invest their retirement savings. 40% of investors care about ESG. When you think about your retirement, or you're talking to your company about its pension scheme, insist on going with companies with a high ESG rating. I think it's driving real change in corporate America. The job in corporate America is to raise and make money and keep your investors happy; that's the power of actually making ESG a criterion in your investing. If you do that, you're going to change the whole system, and that's a pretty powerful thing.

Q: Clearly CSAM is a challenging topic, and many will not want to go into too much depth, but is there anything we can do to make even a little difference?

CW: Call your local politician. When California put in the CCPA, it forced a policy change across the entire United States. When GDPR came in, it caused a global shift in how data protection is handled. Changes in small localities that hold companies accountable for policing their platforms will have ripple effects globally. And so you could be sitting in New Zealand and say, "Hey, if you don't do this, we're not gonna allow Facebook to operate."

Typically it starts with a couple of companies doing the right thing, a couple of countries doing the right thing, then it moves to a wholesale societal change. So make the change locally, if you can. That's my recommendation.

Q: What company size does Krunam generally work with? Do you work with SMEs as well?

CW: We can work with everything from the smallest to the largest - anybody that holds the public's images and videos, we'll work with. I think chat and messaging are key areas for this kind of control, because that's often where things are shared; there's a feeling of anonymity. It's rarely on your public Facebook page, but it's often in a group or a chat, so that's really critical.

Nicole Bremner: Chris, thank you very much for your time. It shows that someone with a martech and digital technology background can really make a difference - and thank you very much for that. Where can people get in touch with you?

Chris Wexler: I am open on LinkedIn - it's Chris Wexler. I'm also on Twitter @ChrisWexler. And our website is https://krunam.co/. Thanks for having me, Nicole.
