Krunam has created best-in-class AI and machine-learning technology to remove digital toxic waste, such as child sexual abuse material (CSAM) and other indicative content, from the internet, improving and speeding content moderation.
This article is sourced from the Evolve Podcast, a top social entrepreneur startup podcast. Listen or subscribe below.
Scroll below for important resource links & transcripts mentioned in this episode.
Krunam is committed to attacking the world's most intractable problems to make the world a better place. After working 25 years in the corporate world, Chris wanted to find a better match between his goals and making the world a better place. For years he had been giving advice and volunteering with Not For Sale, the anti-human-trafficking organization that his brother Mark Wexler and David Batstone started. Krunam was a perfect opportunity to take his expertise in the world of business and large-scale technology and bring it together with a true social enterprise.
Chris Wexler Founder Friday
Brandon Stover: [00:00:00] Welcome to Founder Fridays on Evolve, a podcast about social entrepreneurs changing the world. I'm your host, Brandon Stover, and today I'm here with founder Chris Wexler. If you're new to the show: on Fridays we feature inspiring founders from our very own Evolve community, and the rest of the week we have long-form interviews with a variety of social impact founders, visionary leaders, and social enterprise experts as they share how they built startups that are solving the world's greatest challenges. Today's featured founder is Chris Wexler, co-founder and CEO of Krunam, which is using best-in-class AI and machine learning to remove digital toxic waste from the internet, like child sexual abuse material and other indicative content, to improve and speed content moderation.
Go ahead and introduce yourself and tell us how your technology is revolutionizing the internet.
Chris Wexler: [00:00:50] Absolutely. My name is Chris Wexler. I am the CEO of Krunam PBC, which is a public benefit corporation, the evolution of the B Corp idea into an actual legal structure. We are an organization focused on making the internet a safer place for people, starting with our initial product, which is all around stopping child sexual abuse material online.
And we have a technology that is able to bring the next generation of identification and classification of this horrifying material to help large technology platforms, law enforcement, fill in the blank, identify it, classify it, deal with it, and get it off their platforms.
Brandon Stover: [00:01:41] What made you so passionate about jumping into this as a founder?
Chris Wexler: [00:01:45] A couple things. One is that I spent 25 years on the corporate side, and I was kind of the quote-unquote black sheep of my family. My sister and brother-in-law were in rural Honduras providing world-class health care to people there. And my brother started an anti-human-trafficking nonprofit.
My father started the health law department in Minnesota and was a principal figure in fighting big tobacco, et cetera. My mom kept seniors in their homes. And I sold motorcycles to middle-aged men and hoped they didn't kill themselves. So for years I wanted to find a better match between my goals and what I viewed as important in the world, which is making the world a better place.
And so for years I had been giving advice and volunteering with Not For Sale, which is an anti-human-trafficking organization that my brother Mark Wexler and David Batstone started, and I had always wanted to find a way into that ecosystem.
And so Krunam was a perfect opportunity to take my expertise, which is in the world of business and large-scale technology, and bring that together with a true social enterprise.
Brandon Stover: [00:03:05] Can you expand on just how big the problem is of child sexual abuse material?
Chris Wexler: [00:03:10] It's pretty jaw-dropping. I think most people are shocked when they hear that in 2019 alone, there were over 68 million reported incidents of CSAM, or child sexual abuse material. And that number is probably dramatically low, because about 60 million of those were reported by Facebook, which is the largest company really doing the hard work of rooting it out on a native basis.
And so the true number is actually probably a multiple of that. And with COVID and more people doing live streaming and so on, it looks like the numbers doubled in 2020. So this is a large-scale problem now.
It's one that revictimizes these children every time it gets redistributed, so it's one where we want to stop a cycle of abuse. As we build the process, we find the materials, we get to the source of where they come from, and we work with law enforcement to actually stop the abuse at its core. But just stopping the revictimization of sending these images out over and over again is critical.
Brandon Stover: [00:04:31] Yeah. Before getting on this call, I saw you had a post a while back talking about the toxic side effects of our digital lives. And I think this is a huge thing that we have to start looking at as a society. The internet is like a very early experiment, and we're just starting to see some of its negative side effects.
Chris Wexler: [00:04:49] Yeah, I think it's really interesting when you look at any technology. Culturally, it takes about 30 years for our human brains to really adapt to it. It took 30 years for us to figure out radio, and it took 30 years for us to figure out television. When television was new, we didn't know what to do when Gilligan's Island first went on the air, about these wacky castaways who get stranded.
The Coast Guard got dozens of calls from people worried about those stranded castaways, because viewers didn't understand that it wasn't real. We are probably in stage three of our dealing with the internet, and I've been involved in one way or another with internet culture from the beginning.
The first 10 years were really about proving it was real. Then we got to the point where we were seeing wide-scale application and connection. So there was promotion, and then connection, and I think that's really the social media era, when we saw nothing but boundless, endless opportunities.
And I think there's been a ton of positivity that's come out of it. You and I being able to have this conversation at a distance and then send it out through a podcast: these are all things that didn't exist, or were very nascent, 10 years ago. But what we're seeing now in our political life, in our social life, in cyberbullying, are the toxic side effects of an unregulated market.
And as a business person, that's an opportunity, right? As someone who cares about society, it's imperative. Anytime a business opportunity and a social imperative come together, that's a ripe spot for innovation and activity.
Brandon Stover: [00:06:30] Well, let's go ahead and talk about your solutions with the classifier and some of the things you're doing with deep learning and computer vision.
Chris Wexler: [00:06:37] Yeah. And I will just say, for your listeners and for you: I am not an engineer, and our amazing CTO, Scott Page, when he finally listens to this, will probably roll his eyes as I describe it.
But let me just back up a little bit and talk about how we came together as a company. We're actually a joint venture between three organizations. One is Vigil AI, which is an artificial intelligence company out of London. Another is Not For Sale, a global anti-human-trafficking organization that has projects on five different continents and has made an amazing impact around the world. And the third is Just Business, an impact investing organization and incubator that has been a critical part of social enterprises like REBBL, which you may know if you go to Whole Foods; it's one of the fastest-growing natural drinks in the history of the United States. There's also Lost City, which is redefining corporate relocation, particularly in the age of COVID; they're doing amazing work there. And American Battery Technology Company is another new company in their portfolio that's working towards recycling batteries for electric cars, which, as we run out of lithium and these rare earth minerals, is critical.
And so we're part of this ecosystem that is multi-stakeholder. We have technologists, we have business expertise, and we have the voice of the survivors and the nonprofit space, which is really critical for a problem like this. We also pull in law enforcement, because Vigil has strong ties to law enforcement. Our initial product coming out, the Vigil AI CAID classifier, comes from years of work by the Vigil AI team. They originally did the work pro bono in conjunction with the British government and British law enforcement, and initially the question was: can this even be done? So they're using a combination of techniques. One is cutting-edge computer vision.
They've done a lot of amazing work in other areas of computer vision, and they applied that knowledge here, along with deep learning and machine learning, so that the algorithm could understand what it was looking at. Then they applied it with a deep understanding of the needs of law enforcement, privacy, et cetera, in the space of CSAM.
And so essentially what we've been able to do is work with the British government, who have pulled together what's called the CAID database, the Child Abuse Image Database, which they use to help with investigations, identify victims, and prosecute perpetrators.
And so we put our computer vision onto that data set and work with them. It's a unique situation: with most machine learning, you're dumping a whole bunch of data into your servers and doing a whole bunch of work with it. We don't have that luxury. This is illegal content.
We can't hold it, and we never hold it. Instead, we're going into government offices and working with law enforcement to actually access the information and train there. And I don't know how familiar you are with artificial intelligence and computer vision, but it's really good at going:
"That's a cat, that's a dog," or "that's square, that's round." Those are pretty simple things that computer vision does really well. It struggles when it gets to human faces. Even there, there's a lot of moral quandary around facial recognition and its struggles with misidentification, particularly among people of color.
And that's after it has seen millions of examples; essentially, you show the algorithm a million examples and it learns, oh, okay, that's what that is. We were asking the algorithm to do something even harder, which is to understand human behavior through body positioning and through implied elements.
And so frankly, when it was first done, it was a big risk, because we didn't know if the technology could do it. Luckily, there was kind of a eureka moment: when they ran it through the first time, they went, okay, there's something here. And it took three years to get the technology to a point where we could bring it to market.
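To make the "show the algorithm a million examples and it learns" idea concrete, here is a toy supervised-classification sketch. This is not Krunam's model (which uses deep learning over image pixels); the feature vectors, labels, and nearest-centroid method are purely illustrative stand-ins for the train-then-classify loop Chris describes.

```python
# Toy illustration of supervised classification: the model "sees" labeled
# examples and learns a per-class summary (here, a centroid) that lets it
# label new inputs. Real systems use deep neural networks over pixels;
# these tiny hand-made feature vectors are hypothetical.

def train_centroids(examples):
    """examples: list of (feature_vector, label). Returns label -> centroid."""
    sums, counts = {}, {}
    for vec, label in examples:
        acc = sums.setdefault(label, [0.0] * len(vec))
        for i, v in enumerate(vec):
            acc[i] += v
        counts[label] = counts.get(label, 0) + 1
    return {label: [v / counts[label] for v in acc]
            for label, acc in sums.items()}

def classify(centroids, vec):
    """Assign vec to the class with the nearest centroid."""
    def dist(a, b):
        return sum((x - y) ** 2 for x, y in zip(a, b))
    return min(centroids, key=lambda label: dist(centroids[label], vec))

# Hypothetical 2-D features (say, ear pointiness and snout length).
training = [([0.9, 0.2], "cat"), ([0.8, 0.3], "cat"),
            ([0.2, 0.9], "dog"), ([0.3, 0.8], "dog")]
centroids = train_centroids(training)
print(classify(centroids, [0.85, 0.25]))  # → cat
print(classify(centroids, [0.25, 0.85]))  # → dog
```

The harder task Chris describes, inferring human behavior from body positioning, needs far richer features and millions of examples, but the learning loop has the same shape.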
Brandon Stover: [00:10:46] Is it basically classifying these images, and then you're working with the authorities to find the perpetrators? And are you selling the technology to the authorities to use with the database?
Chris Wexler: [00:10:58] So essentially what it's doing is scanning every pixel of an image and going, is this something like I've seen before? That's something you and I can do pretty quickly; it takes a computer a lot longer to learn it. And so we have millions of examples, sadly, that we've trained on.
And it's led to the next generation of identification beyond the current technology, which is an important technology called perceptual hashing. I'm not going to get too technical, but essentially it puts a fingerprint on a known image. (And a note on terms: CSAM is parochially called "kiddie porn," but we don't use the word pornography, because pornography implies consent, and there is none here.)
When you do perceptual hashing, you find an image and go, okay, we found this one; if we find it again, we'll be able to identify it. But that doesn't find anything that's new and novel, and unfortunately new content is being created every day.
And so the current technology of fingerprinting was important, but this is the next step. We're moving from finding maybe two to ten percent of the content that's out there to being able to identify almost all of it. That's a fundamental shift in our ability to automate this. I think it's critical not only because of the scale of the problem, with hundreds of millions of these images out there, but also when you think about how platforms like Facebook or Google actually find this content. Users on their platform see it, which is adverse exposure. It then gets flagged and sent to a human content moderator, who is often experiencing PTSD from having to see these images. And in the U.S., many of the large platforms have had to make large payouts to their contract moderators because of the emotional damage this has done to people.
So what did they do? They offshored this content moderation to developing countries like the Philippines and India. In my mind, that doesn't really solve the problem. It protects them from having to pay out, but all it does is damage a different set of people. So that is not a long-term solution.
Computers are really good at this because they don't have emotions, right? The work is repetitive. Even among highly trained CSAM investigators, effectiveness at classifying and identifying this type of content goes down after just 10 or 15 minutes, because it's repetitive and draining. A computer doesn't get tired. A computer doesn't get emotionally distressed. It just goes, yep, it's this; yep, it's that, and moves on. So it works really well at scale, and it protects not only the victims but also the human content moderators. Also, because the technology team was smart enough to build this as a multilevel classifier, we can actually see the relative levels of damage in the images.
So it covers everything from the worst of the worst all the way down to indicative content, distinguishing, say, sexual posing from the actual act. It's able to bring the worst of the worst to humans first for decisioning. Instead of just wading through millions of things, it says: here are the 10 users, or here are the 10 images, that you need to deal with right away.
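The "worst of the worst first" triage Chris describes can be sketched as a priority queue over classifier output. The tier names and scores below are hypothetical, not Krunam's actual severity scale.

```python
import heapq

# Sketch of severity-first triage: each flagged item carries a classifier
# tier (lower number = more severe), and a heap pops the most urgent items
# for human review first. Tiers and item IDs here are invented examples.

SEVERITY = {"category_a": 0, "category_b": 1, "indicative": 2}

def triage(flagged, top_n):
    """flagged: list of (tier, item_id). Return the top_n most severe IDs."""
    heap = [(SEVERITY[tier], item_id) for tier, item_id in flagged]
    heapq.heapify(heap)
    return [heapq.heappop(heap)[1] for _ in range(min(top_n, len(heap)))]

queue = [("indicative", "img-104"), ("category_a", "img-007"),
         ("category_b", "img-055"), ("category_a", "img-019")]
print(triage(queue, 3))  # → ['img-007', 'img-019', 'img-055']
```

The point of the design is that human decisioning time, the scarcest and most emotionally costly resource, is always spent on the most severe material first.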
Not only does it protect people in the process, but it speeds resolution. So on that level it's a win-win; it's what technology is best at.
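The perceptual-hash fingerprinting Chris contrasts with Krunam's classifier can be illustrated with a minimal "average hash": a known image gets a compact fingerprint, near-duplicates match by Hamming distance, and genuinely new material does not match at all. Real deployments (PhotoDNA-style hashing) are far more robust; the tiny grids below stand in for already-downscaled grayscale images.

```python
# Minimal perceptual "average hash" sketch: fingerprint a known image,
# then match near-duplicates by Hamming distance between fingerprints.

def average_hash(pixels):
    """pixels: 2-D grid of grayscale values. Returns a bit string where
    each bit is 1 if that pixel is brighter than the image's mean."""
    flat = [p for row in pixels for p in row]
    mean = sum(flat) / len(flat)
    return "".join("1" if p > mean else "0" for p in flat)

def hamming(h1, h2):
    """Number of differing bits between two equal-length hashes."""
    return sum(a != b for a, b in zip(h1, h2))

original = [[10, 200], [220, 15]]        # toy 2x2 "image"
recompressed = [[12, 198], [215, 18]]    # same image, slightly altered
unrelated = [[200, 10], [15, 220]]       # different image

h0 = average_hash(original)
print(hamming(h0, average_hash(recompressed)))  # → 0 (match: known image)
print(hamming(h0, average_hash(unrelated)))     # → 4 (no match)
```

This illustrates the limitation Chris raises: a fingerprint only catches images close to something already in the database, which is why a classifier that recognizes new and novel material is the next step.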
Brandon Stover: [00:14:40] What is your business model for this?
Chris Wexler: [00:14:43] One line of our business is with law enforcement, and we have law enforcement partners that buy this technology. We are also at the beginning stages of working with large technology platforms to help them with their content moderation. There are some forward-looking organizations that realize that this is highly risky for them not to deal with, and to deal with well. So in a risk-management way, this is critical for them to invest in, for their users and for the markets. If you're on the front page of the Wall Street Journal because there are millions of incidents of children being abused on your platform, that's not good for your shareholders.
So it's not exactly out of the goodness of their hearts, but it's definitely a case where the pocketbook and social good actually align. And so we're working with large technology platforms to implement it into their standing content moderation systems.
Brandon Stover: [00:15:45] What traction have you guys seen so far with it?
Chris Wexler: [00:15:49] Surprisingly strong, considering we haven't been in the market that long. We're already in mid-to-late-stage testing with several large clients, and with the pickup in law enforcement, the feedback we're getting from them is that this is, quote unquote, a game changer.
That's good to hear, because unfortunately, one of our founders, Ben, has a background as an actual child sexual abuse investigator. What he saw when he was doing this work is that he was spending 60 to 80% of his time going through materials that were either referred to them by other jurisdictions, because they recognized it was happening in the UK, or found on raids, versus actually taking these images and finding where these kids are.
And so he said, there has to be a better way. That's when he met up with Scott, our technology lead: hey, we have to find a better way. And as a result, we can flip the script for law enforcement, so they are not spending 60 to 80% of their time trying to figure out what to do, but actually doing.
And that's a fundamental shift as well.
Brandon Stover: [00:16:53] What do you guys think the opportunity for growth is in this market?
Chris Wexler: [00:16:56] This is a multi-hundred-million-dollar market just in the area of CSAM. But we have a wider vision. We talked about the stages of the internet; we're in that third stage now. In the last stage, these platforms were all about enabling speech, and that's critical: enabling free speech. You and I talking about this is enabling free speech, and that's great. What we haven't had is the technology to protect people from speech that is harmful. And in the United States, we have the First Amendment.
Everybody talks about the First Amendment and free speech, but there are classes of speech within the First Amendment that are not protected. You can't yell fire in a crowded theater. You can't make a true threat. You can't blackmail. You can't incite violence.
All of these things are things we can help identify through technology. And so our vision is to build a full suite of products that protect users from harmful and unprotected speech. We really view it as a new category in technology: protection. We have security from hackers and from viruses, et cetera, but protection from violent speech or cyberbullying is different.
There aren't a lot of companies working in this space, and we think the protection space is probably a billion-dollar-plus marketplace.
Brandon Stover: [00:18:23] What is the biggest struggle you're having as a founder right now in your startup?
Chris Wexler: [00:18:26] I mean, there are so many when you start a company, right? I think the biggest struggle is finding a way to talk about such a complex issue succinctly.
And I think I've already proven I'm not great at it on this podcast. But you're dealing with issues of social justice, issues of complex technology, issues of law enforcement, and also morality and ethics. There are so many elements that go into this one element of CSAM. And I have to figure out, when I'm talking to someone at a large technology platform, are they doing it to be defensive of their profits?
Are they doing it because they care about security? Are they doing it because they care about children? All of those types of people exist in these organizations. So understanding which door to open as I talk to them, whether it's all about the technology or all about the outcome, is probably the biggest challenge, because there are probably 10 different ways to talk about this.
And it's an emotionally charged issue, so you really have to be delicate with talking about it.
Brandon Stover: [00:19:37] Yeah. I think it's even harder. These are complex topics, and people struggle just to speak about them; trying to start a company around them, I think, would be exponentially harder.
Chris Wexler: [00:19:47] Yeah, it is a really difficult space to start a company in. And my background has been unique. I was on Wall Street, and I like to say I had the aptitude but not the attitude, and then I was in digital marketing for years. One of the things I saw when I was working with some of the world's greatest brands, like Microsoft and Harley-Davidson, is that as a service provider to these brands, or on Wall Street, the companies that did the best were the ones doing things that companies couldn't do for themselves, or didn't want to do for themselves.
So when we look at a complex issue like CSAM, like protecting people from harmful speech, that's a really complex thing that companies are not built around. Google is built to organize the world's information; they're not built to stop CSAM. Microsoft is amazing at cloud computing and Microsoft Office and fill in the blank.
They have all these products, but this isn't what they do full time. And so that's really a business opportunity: if you look at anything that's hard and complex that companies can't do for themselves, that's a huge business. So that's more of a cup-half-full approach.
But that's the exciting part of this: this is something where we can provide expertise to a lot of different kinds of companies.
Brandon Stover: [00:21:09] Yeah, I love that perspective. Well, where and how can people get involved with helping your startup?
Chris Wexler: [00:21:16] Well, to be honest, it's by helping kids. Whenever you can, talk to your legislators about holding companies accountable for taking care of this. And if you run across any incidents of abuse, report them immediately to the platforms and to the authorities.
It's only when we as a culture take this really terrible, dark thing and bring it into the light that we can deal with it. The day-to-day person can just be an advocate for these children. As far as our business goes, we'll take care of that, but I think it's really critical for us as a culture to make this an important issue.
And so just, you know, take care of the kids. That's a pretty easy ask, I guess,
Brandon Stover: [00:22:03] Love it, Chris, love it. Well, thank you so much for coming on the show today and sharing about your startup.
Chris Wexler: [00:22:09] Absolutely. Thank you, Brandon. I think what you guys do is great. So I appreciate the work you've been doing.
Brandon Stover: [00:22:14] That was Chris Wexler, co-founder and CEO of Krunam. As a reminder, if you want to hear more inspiring, purpose-driven founders like Chris, subscribe to the Evolve Podcast right now. It only takes 15 seconds, and in return you will hear a variety of social impact founders, visionary leaders, and social enterprise experts as they share how they built startups that are solving the world's greatest challenges.
So take out your phone and hit the subscribe button on your podcast app now.
The Evolve Podcast is focused on evolving the world through the evolution of the individual. Brandon Stover unpacks the stories and mindsets of extraordinary social impact founders, visionary leaders, and social enterprise experts as they share how they built startups that are solving the world's greatest problems. To listen to any of the past episodes for free, check out this page.