Fighting Human Trafficking in the Digital Age
In this episode, Mikel Del Rosario and Emily Kennedy discuss fighting human trafficking in the digital age, focusing on how her start-up company uses artificial intelligence to help law enforcement.
Timecodes
- 00:15
- Kennedy’s background in fighting human trafficking
- 05:18
- How does artificial intelligence help to fight human trafficking?
- 13:38
- Artificial intelligence and social media privacy issues
- 18:47
- How does Kennedy incorporate her faith into her vocation?
- 21:48
- Is there really an increase in human trafficking around the Super Bowl?
- 26:42
- What does it take to develop artificial intelligence software?
- 30:21
- What was involved in starting Marinus Analytics?
- 33:58
- Challenges facing young women in artificial intelligence
- 37:11
- What can churches and ministries do to fight human trafficking?
Transcript
- Mikel Del Rosario
- Welcome to The Table, where we discuss issues of God and culture. I'm Mikel Del Rosario, Cultural Engagement Manager here at The Hendricks Center at Dallas Theological Seminary. And our topic on the podcast today is "Fighting Human Trafficking in the Digital Age."
My guest in studio is Emily Kennedy, and Emily is the President and co-founder of Marinus Analytics. Welcome to the show, Emily.
- Emily Kennedy
- Thanks for having me. This is awesome.
- Mikel Del Rosario
- Yeah, it's great to have you here in the studio.
- Emily Kennedy
- Yes.
- Mikel Del Rosario
- Well, I want to start out, before diving into our topic today – start out talking about way, way back in the day, to when we first met, when you were a teenager. How did this whole idea of human trafficking even get on your radar?
- Emily Kennedy
- That's a great question. So, these days, human trafficking is kind of the crime du jour. It's something that everybody's talking about. But back when I first learned about it, as a 16-year-old, it was not that way.
And so, I learned about it through a couple of different ways. One was that I was traveling through Eastern Europe at the age of 16. And I grew up very naive, kind of in a bubble – a safe bubble. And we were traveling through some pretty rural areas, driving down from Hungary through Serbia and Macedonia, Albania, around there.
And I saw some kids on the street who were begging, trying to wash our car windows, which is fairly common. But they just had this air of desperation about them. They kind of swarmed our car, and it was like we didn't know what was going on.
And after we left that place, I asked my friend, who was from Hungary, who was from that area, what was going on there, and he said, "Well, those children are actually trafficked by the Russian mob to beg on the street. And they have to make a daily quota from begging and washing windows. And if they don't make that quota, they'll be punished by their traffickers."
And that was my first brush with human trafficking. And when I got to the US, I learned that it was not at all simply an international problem; it was happening in the US. And then one of my youth leaders at my church actually moved to – after he graduated college moved to Svay Pak, Cambodia, to the red light district to rescue children out of the child sex tourism/slave trade there.
So, just those were a couple of things that really hit me at a – I would say a pretty tender, young age, and that just really was something that I carried with me throughout the years after that.
- Mikel Del Rosario
- Did that become a calling for you at that point or was it later on in college?
- Emily Kennedy
- That's a good question. I don't know. Probably around that time because it was just something that stuck in my heart. Like I just couldn't stop thinking about it over the years. And so, I tried to just educate myself.
I remember, actually distinctly, walking out of church thinking, "I want to do something about that," as a teenager. But I really had no skills or knowledge of what was actually needed to solve the problem.
And so, it wasn't until college that I started looking into the technology aspects of it and started to actually get a concrete idea of how one might actually build a solution. So, yeah, in the beginning, it was more of a desire and – yeah, I think a calling in the sense that it really stuck with me; I just didn't really know how I was gonna go about that.
- Mikel Del Rosario
- Mm-hmm. So, you went off to college. And what was your major?
- Emily Kennedy
- I studied ethics, history, and public policy. I thought I was gonna do pre-law, probably go to law school. You know?
- Mikel Del Rosario
- Mm-hmm.
- Emily Kennedy
- Go that whole route. So –
- Mikel Del Rosario
- So, how did you end up working in the technology field?
- Emily Kennedy
- Yeah, which is funny. A lot of people think that I'm like a genius coder who's coding all this. No. I like to say that I get all of the geniuses together and then connect them with the people who need solutions.
So, for me, that was in, I think, either my junior or my senior year when I was looking to do my senior honors thesis. So, for my major, I wanted to do a thesis, to focus in on the problem of human trafficking with resources and in a way that I hadn't had time to do before.
And so, I started looking into it, started observing how the Internet had really changed the game when it came to human trafficking, and in the last 10 to 15 years, at that time, had really enabled traffickers to stay anonymous, to advertise more people with a broader audience, make more money and all of this, and started to learn how law enforcement was really behind the game on it because they didn't have tools to be able to make use of data.
So, I started working with some great researchers and programmers at the Carnegie Mellon Robotics Institute. And it was just – they were blowing my mind with what is machine learning? What is AI, and what tools can actually help us?
And so, that was the time when I started just playing around with software that they've developed at the lab, testing things out, and starting to see how can we basically take this huge amount of data – we're talking millions of ads – and turn it into actual intelligence for law enforcement?
- Mikel Del Rosario
- Wow. Well, when people hear the words "artificial intelligence," a lot of times they just think about things they see in movies or maybe video game characters that you play against.
- Emily Kennedy
- Of course.
- Mikel Del Rosario
- How would you explain to just a lay audience what artificial intelligence actually is?
- Emily Kennedy
- Sure. So, I'm definitely a fan of those movies. I love AI – what I call AI horror movies. I don't know if that's actually a genre, but it's definitely fun. AI can encompass a really wide range of things, but essentially I think of it as teaching robot brains to help humans.
So, AI isn't necessarily – it doesn't necessarily have to have a physical robot, but it's how do we automate a lot of the processes that humans have to do to process data? So, for us, it's the idea of there's hundreds of millions of ads that are selling sex online where these victims of sex trafficking are advertised.
And so, how do we process through all of that data to pinpoint, within seconds, the needle in the haystack that law enforcement is looking for – the victim or the perpetrator? As if a human could read through those millions of ads in seconds and find all of the results that relate to their case.
So, by no means is it replacing the work that law enforcement does; AI is not advanced enough to replace investigative processes that they do. But the goal is to really save time on those processes that involve processing a lot of data, reading a lot of data, so that instead of reading through thousands of ads, a detective can instead spend more of their time actually investigating.
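To make that idea a bit more concrete, here is a minimal, hypothetical sketch (not Marinus Analytics' actual code; the ad fields and function names are invented for illustration) of one simple form of that automation: indexing a large set of scraped ad records so that every ad sharing an identifier, such as a phone number, surfaces in milliseconds instead of after hours of manual reading.

```python
# Hypothetical sketch: index scraped ad records by phone number so a
# detective reads the handful of ads tied to a lead instead of thousands.
import re
from collections import defaultdict

PHONE_RE = re.compile(r"\b\d{3}[-.\s]?\d{3}[-.\s]?\d{4}\b")

def build_phone_index(ads):
    """Map each normalized phone number to the ads that mention it."""
    index = defaultdict(list)
    for ad in ads:  # each ad is a dict like {"id": ..., "text": ...}
        for match in PHONE_RE.findall(ad["text"]):
            digits = re.sub(r"\D", "", match)
            index[digits].append(ad)
    return index

def ads_linked_to(index, phone_number):
    """Return every indexed ad that reuses the given phone number."""
    return index.get(re.sub(r"\D", "", phone_number), [])

# Usage: build the index once over millions of ads, then each lookup
# takes milliseconds.
# index = build_phone_index(scraped_ads)
# leads = ads_linked_to(index, "555-123-4567")
```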
- Mikel Del Rosario
- Mm-hmm. So, you started your company to help law enforcement, to help these organizations. Tell us about some of the groups that you've worked with.
- Emily Kennedy
- Yeah, so, we spun out of Carnegie Mellon in 2014, and we currently work with local, state, and federal law enforcement across the US. We've worked in Canada for about three years now, and we've recently been working on our United Kingdom expansion, as well. So, Traffic Jam is available there, and that's kind of the start of our European expansion.
- Mikel Del Rosario
- Oh, wow. Tell us a little bit about Traffic Jam. What is Traffic Jam?
- Emily Kennedy
- Sure. So, Traffic Jam is a suite of AI tools that help law enforcement turn this huge amount of data into actionable intelligence for sex trafficking investigations.
So, we say "a suite of tools" because it puts a bunch of different tools in their toolbox. Human trafficking investigations are complex investigations; they need a lot of different ways to go about it. But one – just one example is facial recognitions.
So, we have a tool in Traffic Jam called "FaceSearch." And detectives can upload a photo of a victim; this would be from social media, maybe a missing persons poster. We've actually had many success stories involving a detective seeing an online news article about a runaway 16-year-old who's believed to be exploited for sex online. He can upload that image, within seconds find some potential matches, and then further investigate to determine if those are true matches. And we've had many, many success stories come out of that.
I think it's important to note that not all ads contain a trafficked person, and that's why AI is so important, because it really is the needle in the haystack. And so, we're enabling law enforcement to find those a lot more quickly.
So, we had a fairly recent case that involved – ultimately, over time, it involved 21 victims of a particularly violent trafficker. And the detective did great work. Using those tools – especially FaceSearch – he was able to put together this case against this trafficker in three months, when it would typically take about two years to put together.
And so, that's just powerful – I mean the time savings, the fact that timing is so crucial in finding these victims quickly. So, that's what we enable.
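As a rough illustration of that kind of facial search workflow (a hedged sketch only, not Marinus Analytics' actual FaceSearch implementation), one could use Amazon Rekognition, the facial recognition service from Amazon Web Services, which Kennedy mentions partnering with later in the conversation. The collection name and function name below are invented for the example, and every candidate match would still need human verification.

```python
# Hypothetical sketch using Amazon Rekognition: search one victim photo
# against a collection of faces indexed from publicly posted ad images.
import boto3

rekognition = boto3.client("rekognition")

def search_for_victim(collection_id, photo_path, min_similarity=80.0):
    """Return (similarity, source image id) pairs for candidate matches."""
    with open(photo_path, "rb") as f:
        response = rekognition.search_faces_by_image(
            CollectionId=collection_id,
            Image={"Bytes": f.read()},
            FaceMatchThreshold=min_similarity,
            MaxFaces=25,
        )
    # ExternalImageId is assumed to have been set to the source ad image
    # when the face was indexed; a detective still verifies every match.
    return [
        (match["Similarity"], match["Face"]["ExternalImageId"])
        for match in response["FaceMatches"]
    ]

# Usage (assumes a collection such as "ad-faces" was populated earlier
# with rekognition.index_faces on each scraped ad photo):
# candidates = search_for_victim("ad-faces", "missing_person_photo.jpg")
```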
- Mikel Del Rosario
- Wow. Well, tell us one of those success stories where you can contrast the way it used to be for them vs. how it can be now with the help of artificial intelligence.
- Emily Kennedy
- Yeah, absolutely. So, we had one recent success story. Before these tools were available, I actually had an FBI analyst tell me, probably about one or two years ago, that in order to try to find missing or runaway children who are sold online for sex, she would print out a photo of that victim, tape that photo to her computer screen, and then manually scroll through – if she had the time – hundreds or thousands of ads, hoping that she would find that victim.
And not only is that really just not likely to find someone, we also know that these groups move from city to city. So, what are the odds that she's even looking in the right city? And so, that is how things had gone before. You know, that was really the best they could do: try to do this manually.
And then we had a recent case where a detective was looking for specific victims. One in particular, he was looking for this victim with a photo from when she was 15, but she was currently 17 years old. So, the photo was two years old. And he was able to find successful matches of that victim.
And what's interesting is that when he was looking at the results, he didn't recognize her originally because her appearance had changed so much – makeup and hair, and she just looked so different – and yet the facial recognition was able to see that it was a potential match. And he only realized it when he was searching through the phone numbers, and he realized one of those phone numbers was linked to her real name.
And he realized – that's when the light bulb went off – and he realized, "Oh my gosh, this is the victim that I'm looking for," and he was able to put together that case really quickly. So, it's just really powerful what AI is able to do.
It by no means – I think people have this idea that you just push a button, and then you get the answer, and you just blame the AI for anything that it did wrong. And that's just not where we're at. I see it really as a collaborative tool where there's always human input pretty much at every step of the way. So, by no means would a detective get a result and then just go blindly off of that result. They have to verify it using other investigative techniques, and they have to go through those checkpoints.
So, having AI does not offload our ethical obligations by any means. And I think that's kind of how people tend to take it. But I see it really just as a collaborative tool that can further the work that these detectives do.
- Mikel Del Rosario
- Wow. What are some other tools that help law enforcement find not only people who have been or are being trafficked, but also people who are potentially being trafficked when they're just not sure – like runaways, for example. Can they track where people are and what they're doing? Is there anything that tips them off?
- Emily Kennedy
- That's an interesting question. So, for us, we're focused on this exploitation stuff specifically. So, in our cases, it would just be if the detective has a photo of the runaway; he thinks they might be exploited. Or maybe he has an old cell phone number; he can use those as starting points to search for victims.
So, I think that's the – one of the reasons that runaway cases can be a really good fit for these tools, because they typically are – like the detective is starting with a tip or some sort of lead on this person because maybe they have a family who's looking for them; maybe they have social workers who've given tips.
But foster children and runaways are – and I like to bring this up because people need to understand that those are the most vulnerable to sexual exploitation. People often think that it's someone getting kidnapped off the street or – and I think we see those stories of people saying, "Well, my child was almost trafficked." You know? And honestly, that's not typically the case.
It can happen every now and then, but it's not typically the case that someone is kidnapped and then has a family looking for them and they've been trafficked. So, that's, I think, important for people to understand.
- Mikel Del Rosario
- Well, today there are a lot of concerns about privacy issues on Facebook, on other social media –
- Emily Kennedy
- Oh my gosh, yeah.
- Mikel Del Rosario
- – and traffickers use some of these social media platforms to lure children in. Have you encountered any pushback in terms of the work that you're doing with AI because of privacy issues?
- Emily Kennedy
- Yeah. There's definitely pushback pretty much whenever people hear the words "facial recognition," even if people don't fully understand what we do. So, yeah, I definitely like to clarify. One thing to note is that the data that we look at is all publicly available. So, this is totally visible to the public; we're not hacking into anything; we're not taking private information. So, that's one thing to note.
Secondly, we're not a facial recognition company. We partnered with Amazon Web Services to deploy this. And honestly, the cat's out of the bag on facial recognition; it exists. And my personal opinion, as it has been with all technology, is as with any tool – you know, as with a car or an airplane or a pen – it can be used for good; it can be used for bad. And if we want to do something good with it, it's up to us to do that and to make it a force for good.
I always say, "AI for good," because people will use these technology tools for bad. But we strive to use it for the best application that I can think of.
And then thirdly, it's very important that we vet our user community, that we make sure that the law enforcement that we give access to are trained in a victim-centered approach; they're trained to identify cues that uncover trafficking victims as opposed to someone who's just working in _____ _____ prostitution.
And so, we do what we think is our part to include the training that's necessary to vet the users and make sure the right people are getting it into their hands.
- Mikel Del Rosario
- Hmm. So, what drives you to get up every day and keep fighting the good fight with this? Are there people who have inspired you over the years? Mentors?
- Emily Kennedy
- Mmm, that's a good question. I think just seeing the success definitely inspires. And we don't get to see that all the time, sometimes due to law enforcement privacy of their cases or whatever it may be. We don't always get to hear the successes.
So, people think – people ask us, "Do you interact with victims? Do you see them?"
No. We pretty much never do because we see our goal as supporting law enforcement, as giving them tools and then, hopefully, encouraging them to work with people who are trained in victim services, who are able to give them the appropriate things that they need.
As for mentors, I think my dad made a huge difference in my life just raising me to – I mean giving me that Eastern Europe opportunity, raising me to be aware of things going on in the world outside of my own little bubble. And really, I think, the way that he brought me up was the foundation for what I believe today.
And what I've said, actually, since we started this work – since I was working in my pajamas as a researcher – is that I can do all of this work and all of this research, and my team can do all this work, and if we enable the rescue of one person, it's absolutely worth it 'cause that person has infinite value. You know?
- Mikel Del Rosario
- Mm-hmm.
- Emily Kennedy
- And so, all of that work is worth it. You know? And you get into the business, and there's all these other stressors and challenges and things that drive you crazy, but ultimately that's why we do what we do.
- Mikel Del Rosario
- Mm-hmm. Have you encountered any challenges to your faith working in this industry?
- Emily Kennedy
- That's a good question. Maybe challenges in the sense that seeing all of this bad stuff all the time is – it can be really frustrating. And I think law enforcement deals with that way more than we do. I mean we don't see half of what they have to see up close and what they have to deal with.
But, yeah, it can be really frustrating when there's only so much that you can do. And I don't know; I think for me it's just been realizing all we can do is focus really hard on our niche, on our area of focus; stay focused and do the best that we can do; and then, hopefully, partner with other people who are doing great things.
But, yeah, it can definitely be discouraging to be immersed in this area, but at the same time, since I came across it as a 16-year-old, I couldn't get it out of my mind anyway. So, why not be doing something about it? And I think there's also the element of you kinda become numb to it over time, which I don't think is necessarily the best. You know?
And there's part of me – and I think a lot of people who are in the law enforcement field feel this, that, "I don't want to sit here and cry about it; I want to go do something; I want to make a difference." So, that impact definitely, definitely drives us.
- Mikel Del Rosario
- Mm-hmm. That's awesome.
- Emily Kennedy
- Yeah.
- Mikel Del Rosario
- Well, you mentioned how people have infinite value, infinite worth, and if you could just help one person, it would all be worth it. How else do you incorporate your faith with the work that you do?
- Emily Kennedy
- I think it's pretty simple: just honesty, integrity. I think something that I've always been really influenced by is – so, good statistics are really hard to come by in the human trafficking space.
And that was another thing that really motivated me to get into this work: seeing how little academic research had been done on it at the time, and also seeing that, because we talk about this issue so much in the US – the media and politicians and everybody's talking about it – I think there's this motivation to make the statistics as big as they can possibly be. You know? To have the biggest numbers that we can possibly say.
And I think there's that danger, whether it's for companies, or even NGOs who have to show their impact, to say these big numbers. But a lot of times, when you look at the data, there's just – they're old statistics.
In fact, if people want to look this up, I highly recommend a myth-buster kind of series of articles The Washington Post did about some commonly held human trafficking statistics. And for many of these studies that people are quoting all the time, even the people who wrote them have said, "This is a 20-year-old study; don't use these statistics anymore."
- Mikel Del Rosario
- Oh, wow.
- Emily Kennedy
- And I think why that concerns me so much is if we can't have integrity in that, then we're not gonna have integrity in our solutions either. And going back to the idea that a life is of infinite value, I get the desire to kind of say how big the problem is, but at the same time, I personally feel like we don't have to have big numbers for it to be a big problem because each person is so valuable.
So, I just like to encourage people. And what we've done is be very conservative in our statistics, in our estimates. The one that I like to rely on is from the National Center for Missing and Exploited Children, which said – I believe the latest stat is that one out of seven runaway children is likely to be exploited for sex.
And so, when you think about how many runaway children – I mean that's a lot. So – but I just encourage people to be careful with statistics because it really matters.
If we're not honest, and if we don't have integrity about our statistics, even though we think we're drawing more awareness, then when you get to the application side – like we're at with law enforcement – if they're going off bad statistics, then they're gonna misallocate their resources. They might be missing big opportunities to put their resources toward things that would actually make an impact.
And so, I've always just had – like held that really close that we have to back up what we say. We can't just say whatever we want; we have to have integrity, honesty, and that's always what we've led with.
- Mikel Del Rosario
- Yeah. Now, you did an analysis of a whole bunch of different ads and the ways that traffickers work. How much more trafficking activity is really going on around the Super Bowl vs. other events? You looked into that, right?
- Emily Kennedy
- Yeah, a great question. So, this is – I'm so glad you brought this up. People can Google it – it was called "Looking at How Public Events Affect Sex Trafficking Activity," public events like the Super Bowl. And NBC Universal actually covered that research with an article that everyone's welcome to go and read.
But yeah, essentially what we did is we looked at the Super Bowl and compared it to 33 other events – large events like conferences, CES, Formula One – you know, big gatherings of people. Because it had always been said – and particularly there's a couple of politicians who said the Super Bowl is the number one human trafficking event in the country.
And so, again, I remember reading that stat – quote-unquote "stat" – back in probably 2012 and thinking, "Wow, that's amazing. There must be some research to back that up. I'm so glad this problem is getting more awareness because of this." But then, when you dig into it, you realize there's no research at all. There's a couple of anecdotal stories, but there's no data to back this up.
And so, at that time that we came to do the study, we had gathered hundreds of thousands of ads for a couple of years, and we thought, "Well, hey, our data's not perfect. You know? But I would argue that the selling of sex online is a proxy for measuring human trafficking. If that activity increases, it's likely that human trafficking also increases."
And so, we compared the Super Bowl to all these other events, and we found that the Super Bowl did not have the most statistically significant increase in activity. In fact, there were many other events and things happening across the US at the time that had a much more statistically significant increase in activity.
And I emphasize "statistically significant" because before that study, a lot of groups had been looking at, "Okay, what is the total ads the weekend before the Super Bowl, and then the total ads during, and the total ads after?"
And what people need to understand is what we care about is statistical significance. And that means is it actually significant – the increase – as compared to historical data, as compared to data across the rest of the US? Can we actually say that there's a statistically significant increase? Because totals really don't matter; we have to know if it's significant.
And so, we found, actually, through that study, a couple of events that nobody was talking about that actually had a huge increase in that activity. Just one of those is the oil boom in Minot, North Dakota. That area has had a huge boom in population and a huge spike in trafficking activity.
So, if you look up that study, you can see the graphs, and it just kind of blows you away, because when we looked for a statistically significant increase for the Super Bowl, there was one, but it was kind of a conservative increase. It wasn't off the charts.
And then, when you look at the Minot, North Dakota, graph, it is off the charts, and it is what we would have expected for the Super Bowl. And so, again, that's so important to me and to my team because armed with that actual data, then law enforcement can make data-driven decisions because they're putting huge resources towards this stuff.
And I even talked to detectives who said, "Yeah. You know, if we had these resources that were given during the Super Bowl, we could arrest the same number of traffickers any weekend." And so, I just think data-driven decision-making is so important to ensure that we're using these really limited resources in the best way.
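To make the distinction between raw totals and statistical significance concrete, here is a minimal sketch (invented numbers and function name, not the published study's actual methodology) that compares an event weekend's ad count against that city's own historical weekend counts rather than against raw totals.

```python
# Hypothetical sketch: is an event weekend's ad count significantly
# elevated relative to that city's historical weekend counts, or is it
# just a big raw total because the city is big?
import statistics

def weekend_z_score(event_count, historical_counts):
    """z-score of the event weekend against historical weekends."""
    mean = statistics.mean(historical_counts)
    stdev = statistics.stdev(historical_counts)
    return (event_count - mean) / stdev if stdev else float("inf")

# Made-up example: a count of 1,200 against a baseline of ~1,060 gives
# z of about 2.1 -- elevated, but nothing like an off-the-charts spike.
print(weekend_z_score(1200, [1050, 1100, 980, 1150, 1020]))
```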
- Mikel Del Rosario
- Wow. Now, how can people access your study again?
- Emily Kennedy
- So, if you Google "How Public Events Affect Sex Trafficking Activity, Super Bowl, NBC Universal" it should come up. It's also on our website; it's around. If you google around, you should be able to find it, yeah.
- Mikel Del Rosario
- Okay. And what's your website?
- Emily Kennedy
- Our website is marinusanalytics.com. That's M-A-R-I-N-U-S analytics.com. And there's lots of articles on there. Keep – you can keep updated on what we're up to. We're also on Twitter @MarinusAI.
- Mikel Del Rosario
- Awesome. Now, how many government agencies and law enforcement groups do you work with now?
- Emily Kennedy
- We're in the hundreds now, yeah. So, a lot of different agencies. And we're able to have a broad reach. So, a lot of companies will focus federally because that's the biggest bang for their buck.
But our goal is not only reach but impact. And so, we'll work with the smallest police agency up to some of those federal agencies that are probably in your mind. So, yeah, we work with pretty much every law enforcement agency, yeah.
- Mikel Del Rosario
- How many kinds of people are involved in producing something like Traffic Jam or FaceSearch? In other words, we might think – I think immediately about coders; we might immediately think about tech people. But how many kinds of different jobs or job functions are represented there?
- Emily Kennedy
- Mmm, that's an interesting question. So, Traffic Jam was developed first as a research-grade software. So, it was developed in research. And it was first deployed, actually, in 2013, before we even had a company. A lot of people don't know that.
And so, it was my work to design it, to talk to law enforcement and get their input, and then to work with some of the research programmers – at that time, we were able to leverage some of the existing software. And that was a lot of the learning we had growing the tool: in research, it's more that you're trying to build a prototype, to put something together as fast as possible from existing parts.
And then – so, that was a small group. And we still have a small team now. We like to think that we're lean but mighty. And we've done a lot with just a very small amount of resources. We've not taken VC investment to date, although we're open to it. But we really wanted to maintain our social mission.
So, then when we spun it out of the university, we learned that a research-grade prototype just does not scale. It was not built to scale up to hundreds of millions of ads; it was built to be put together fast and to be usable and tested and all of that, but it just doesn't scale up.
And so, we had a number of people on our technical team. We have a research scientist and a software architect who we brought on to actually scale that, but still a fairly small team. We're able to do, like I said, a lot with the resources that we have, and that's always been our goal.
And then, as far as facial recognition, I mean there's lots, lots more people who went into that on the Amazon Web Services side. But their team made it really easy for us to implement it.
- Mikel Del Rosario
- Wow. It's just amazing to think how many people are involved to get it from something that is a program that sits on your computer to where it's deployed and helping people, and it's evolved with law enforcement and all of that. And then, not even to think about the actual people who are being freed from sex trafficking and the people who are putting the perpetrators behind bars, as well.
- Emily Kennedy
- It's a whole – yeah, a whole complex network of people. I mean when you're talking about, like you said, getting people in jail, I mean there's the whole ecosystem of – and I think people don't understand this typically, but many times victims will recidivate multiple times before they finally find stability in their rescue.
So, there's a lot of work done by victim services organizations and by the law enforcement who partner with them. And then there's the prosecutors and the analysts on their side putting together the case. There's the jury. It's really important that people understand these cases more because we need juries to understand what's going on, understand the difference between human trafficking cases and other cases. And we're still working on that. I think there's still a lot of work to be done to just increase that awareness.
- Mikel Del Rosario
- Mm-hmm. Well, I think it's amazing just to think about all the kinds – different kinds of people who are using their giftedness, who are using the job function that they have to participate in this. Because it helps you, one, appreciate just how broad this thing really is. And it helps people understand that their work actually matters.
- Emily Kennedy
- Oh my gosh, yeah.
- Mikel Del Rosario
- And that their work matters not only to all these people and to the victims, but it matters to God, too.
- Emily Kennedy
- Absolutely.
- Mikel Del Rosario
- 'Cause this is really, really something that is in the heart of God that you're getting to participate in.
And then there's the whole entrepreneurial side to you, right?
- Emily Kennedy
- Oh my gosh, yeah.
- Mikel Del Rosario
- Having a start-up – how did the start-up we're talking about come about?
- Emily Kennedy
- Yeah, so, I was never one of those people who was thinking, "I'm gonna start a company no matter what. I don't care what it is; I'm gonna start a company." That kind of came about through a number of different things.
One was that, like I said, in 2013 we had the prototype out. It was starting to help people. We were getting users, and I started getting calls from law enforcement all over the place, just across the country, saying, "Hey, I heard about you. Can you help me on my case?"
And so, that's when we realized, "We need to really make this into a true product. We need to make it into something that's easy to use." And so – wait, remind me of your question?
- Mikel Del Rosario
- How did you decide to start your start-up? How did that whole thing happen?
- Emily Kennedy
- That's right, yeah. So, it was kinda this long journey. But yeah, so, we started getting these calls from law enforcement and realizing, "We need to scale this up." And we didn't know exactly what would be the way to do that.
So, we started going into this program – which I highly recommend if you're a university researcher looking to commercialize university-developed research products – the National Science Foundation I-Corps program, which stands for Innovation Corps.
And so, we did that in January of 2014, and that was basically a boot camp of if you're a researcher, and you have never started a company, and you have no idea what you're doing, let's throw you into the deep end and teach you how to – what they call "get out of the building," essentially.
In about a month-and-a-half time period, we had to go out and interview a hundred potential users and potential customers and get feedback on what their needs were, what their pain points were. And that was really helpful because I think, as a researcher, you tend to have the problem that you care about and the things that you want to solve. But those may or may not be real-world problems, or something where a solution is really needed or is gonna affect the work that law enforcement does, in our case.
So, we did that. And then, yeah, after that, we learned a lot. We decided to spin off into a company. And then, in the years following, we've gotten a number of National Science Foundation grants to support our continued work, which has been awesome.
- Mikel Del Rosario
- Wow. And recently you were named a Mother of Invention –
- Emily Kennedy
- That's right.
- Mikel Del Rosario
- – by Toyota. Tell us a little bit about the Toyota Mother of Invention.
- Emily Kennedy
- Yeah. So, it's an awesome program. It's done in partnership with Women in the World, which is a group that does a bunch of events that highlight the work that women are doing across the world. So – I mean there's everything from lady sea captains who rescue refugees off the coast of Libya; there's an amazing woman who spoke at the New York event this year who works in India and has a safe house for victims of human trafficking.
And so, every year they select three or so Mothers of Invention, women who have companies that are involved in building technology solutions to social problems. So, everything from clean water, to public safety, to human trafficking.
And so, yeah, they selected me, at the end of last year, for their event to be honored this year. So, I went to the Los Angeles salon and was honored by Toyota there, which was amazing. And they gave us a grant to support our work. And then also got a chance to speak at Lincoln Center in New York City in April, which was really an amazing experience. And just going to the event – Women in the World event was amazing.
And so, they really helped to highlight the work that we're doing, get the word out – and it was an awesome experience.
- Mikel Del Rosario
- Wow. What are some of the challenges you would say that you face as a young woman in this field?
- Emily Kennedy
- Mmm, yeah, so it's been interesting because technology and law enforcement are both male-dominated fields. And I kinda just jumped in. So – but it's interesting to think about my evolution over that time. Because I remember walking into my first big presentation – this was probably in 2013 maybe, or 2012, at the LAPD Detective Symposium. And I was teaching a class, and I remember I walked in in this full suit that probably made me look even younger than I was at the time, which was 23 or so.
And I remember feeling very out of place, and they were probably wondering, "What are you doing here?" And then I walk up and start teaching, and they're like, "Hmm."
But what I found, for me, is that, standing out like that, being a young woman in this space definitely gets people's attention. But then it's my opportunity to use that and to say, "Hey…" Like a lot of times it's raising the bar. I think people need to have higher expectations for young women, because we can do so much.
And I guess that almost made it easier on me because once you have their attention, you say, "Okay, you put the bar down here. By the way, the bar needs to be way up here. I'm going to exceed your expectations."
And then another huge part of – like how do I actually impact the work that law enforcement's doing? How do I turn this from research that I really care about into something that has real impact? And a huge part of that is just bringing value.
So, I think this is true in so many industries. What they really care about is – they might be new to technology, or they might be kind of curious about what we're doing with AI – but what really gets people interested, especially law enforcement who we're trying to assist, is when they hear, "Oh, this brings me value. This is gonna save me 50 percent of the time on investigations."
Or like the case that we talked about, turn a two-year investigation into a three-month investigation. That value really changes their work. And so, once they hear that, that's one of the most important things. And just make it clear, "This is how we're bringing value."
Every training that I do now, I pretty much say, "I want to bring as much value as I can to you in however much time we have. I want you to be able to take away this tool, and this tool, and this tool in your toolbox that you can use today to find victims." So, it's really about making AI as accessible as possible.
- Mikel Del Rosario
- Wow. Who are the kinds of people that you train?
- Emily Kennedy
- So, I do a number of trainings. We actually have an online training every week because, again, we're a lean team. We work with law enforcement across the country, and now internationally. So, we found that's a great way to scale it: if law enforcement e-mails us tomorrow, they can be on a training this week and start using it. So, yeah, that's been super helpful.
And so, typically, I do a lot of trainings for local police, but it could be FBI, it could be kind of – I've trained the full range. So, yeah, it's all around.
- Mikel Del Rosario
- Wow.
- Emily Kennedy
- Yeah.
- Mikel Del Rosario
- Well, you've done so much. And there are a lot of pastors and ministry leaders who listen to the show. Are there ways that churches and ministries could plug in and say they want to be a part of the solution? What can they do?
- Emily Kennedy
- Absolutely. So, there's so many things you can do. I like to think start local. You know? If you're looking to give to a local organization, first do your research. See what everyone's doing. Go out and meet some people; figure out what's going on.
If you want to give, really vet those organizations. You want to make sure your money's going toward something that you support. So, do that.
I also like to say that I truly believe anybody, no matter their skills, can contribute to this problem. So, if you're –
- Mikel Del Rosario
- Or the solution to this problem.
- Emily Kennedy
- Exactly – hopefully not the problem – to the solution. Yeah, so, if you're a CPA, you could be a forensic accountant and donate your time to make sure that the trafficking victims get the money that they've been making this whole time instead of their trafficker.
If you're a writer, you could work with a nonprofit and help work on their marketing, help populate their blog. If you're in technology, obviously, you can look on our website and see if we have any jobs posted. I know we're actually currently hiring for a software engineer. So, check us out.
But there's just a ton of different ways to contribute. Probably if I had to say one takeaway, it would be: do your research. Because I think there's a lot of people reinventing the wheel because they don't do the full research and see what's already out there and who they can partner with. So, do your research, find people to partner with, and do it that way.
- Mikel Del Rosario
- Wow, that's amazing. Amazing. You know, when we're – we've been talking about this in this 21st century digital age, it reminds me of something that – when Jesus, in Luke, was reading from Isaiah – from Isaiah 61 – there's a part in there where God's special servant says, "I've come to declare freedom to the prisoners and sight to the blind and to set the captives free."
And then in Revelation, there's this – Revelation 18 has a judgment against those who sell the bodies of other people. And it's like this is such – the heartbeat of God –
- Emily Kennedy
- Absolutely.
- Mikel Del Rosario
- – that you get to manifest the character of God by bringing more justice into the world, by setting people free – really, common grace for the common good.
- Emily Kennedy
- Thank you.
- Mikel Del Rosario
- It's just amazing to think about how, as you say, a little group – your little team – can actually have a huge impact.
- Emily Kennedy
- Thank you. Yeah, we hope to. We hope to continue doing it.
- Mikel Del Rosario
- Well, thank you so much for being on the show, Emily.
- Emily Kennedy
- Yeah, thanks for having me. This was fun.
- Mikel Del Rosario
- And thank you so much for joining us on The Table today. We hope that you will join us once again next week here on The Table podcast where we discuss issues of God and culture.
About the Contributors
Emily Kennedy
Mikel Del Rosario
Mikel Del Rosario (ThM, 2016; PhD, 2022) is a Professor of Bible and Theology at Moody Bible Institute. While at DTS, he served as project manager for cultural engagement at the Hendricks Center, producing and hosting The Table podcast. You can find him online at ApologeticsGuy.com, the Apologetics Guy YouTube channel, and The Apologetics Guy Show podcast.