This special episode is the third of five talks delivered on stage at StartWell’s Event Space on King St W in downtown Toronto on November 28, 2019, for a globally roaming annual series called Dark Futures.
This talk was presented by Calla Lee (https://www.linkedin.com/in/callalee) and is titled “Tinder.”
Dark Futures is presented by globally renowned futurist, speaker, researcher, and author Nikolas Badminton (https://nikolasbadminton.com/).
Podcast Transcript
Qasim Virjee 0:27
Welcome back to this, the 27th episode of the StartWell Podcast. As always, I’m your host, StartWell’s founder and CEO Qasim Virjee. For this special episode, we’ve got the third talk that was presented at Dark Futures YYZ in our event space on King Street West in downtown Toronto. This talk is about Tinder, and it was presented by Calla Lee.
Calla Lee 0:57
Hi, everyone. This is good. All right, so, Tinder. It’s a pretty loaded topic, and I’m going to say that whether you’ve actually used it or not, it’s kind of undeniable that Tinder has changed the way we meet people, the way we date, and how we interact with each other.

So when Tinder first launched, it launched with an algorithm called the Elo rating system. If you’ve never used Tinder, it basically serves you up a stream of people. If you like them, you swipe right, and if you don’t like them, you swipe left to “Nope.” Then, if two people swipe right on each other, it’s a match. It’s really that simple. But it’s underneath that stream of people where it gets really interesting and complex. The Elo rating system is actually the same system used to rank chess players. So what you see in that stream of people is a collection of people who have all received a similar number of right swipes as you have; they cluster you into what they call groups that are based around your desirability score.

So that was their first algorithm, and their second algorithm launched earlier this year. They basically said that now that they have sufficient data, they can start writing their own algorithm. Their new one adjusts the potential matches you see each and every time your profile is liked or noped, and any changes to the order of your potential matches are reflected within 24 hours or so. So what does this tell us? It tells us that within 24 hours it’ll basically keep churning, and it’ll keep collecting data on you. So, Tinder cubed: what we’re actually looking for is, will Tinder one day be able to tell us whether, based on two-dimensional experiences, all the things that make us look really great on paper, those will actually translate into three-dimensional, long-lasting human interactions?
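For readers curious about the mechanics mentioned above, here is a minimal sketch of a standard Elo update in Python. This is the generic chess-style formula, not Tinder’s actual (unpublished) implementation, and treating a right swipe as a “win” for the swiped-on profile is an assumption made purely for illustration.

```python
def expected_score(rating_a, rating_b):
    # Probability that profile A "beats" profile B under the Elo model.
    return 1.0 / (1.0 + 10 ** ((rating_b - rating_a) / 400.0))


def update_elo(rating_a, rating_b, a_won, k=32):
    # Return updated ratings after one interaction
    # (here, a right swipe on A is hypothetically treated as a "win" for A).
    exp_a = expected_score(rating_a, rating_b)
    score_a = 1.0 if a_won else 0.0
    new_a = rating_a + k * (score_a - exp_a)
    new_b = rating_b + k * ((1.0 - score_a) - (1.0 - exp_a))
    return new_a, new_b


# Hypothetical example: a 1400-rated profile is swiped right on by a 1600-rated profile.
# The lower-rated profile gains more points than it would from a similarly rated peer,
# which is how "desirability scores" could drift apart and cluster users into groups.
print(update_elo(1400, 1600, a_won=True))
```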
Calla Lee 3:16
So we actually have all the things we need today to say, yeah, Tinder can do that. And there are actually four things; I’m going to walk you through them right now.

The first one is big data. It’s kind of everywhere, and it’s this invisible layer that just captures everything that we do. Whether it’s what words you search for, the places you go, the people you meet, how long you spend taking selfies, what you purchase and how frequently you purchase it, you name it, we track it. One of the best examples I know is the time Target was able to figure out that a teenage girl was pregnant before her father did. The way they did that is they took a look at the data and asked, what do expectant mothers purchase? And they were actually able to figure that out. So once you can do that, you’re at a point where we just continue to gather all of these things all the time, and the sister conversation to this has to be privacy concerns. And you’ll know that it’s important, because Apple has ads right now for their iPhones that tell you that privacy matters. But the reality is that companies like Target, Google, and Apple gather some of the most data of anybody. If you think about it, think about all the touchpoints that Apple actually has. They have your cell phone, they have your AirPods now, and they also have your biometric data from your Apple Watch. And if Apple has your biometric data, Google wants your biometric data, so they bought Fitbit earlier this year for $2.1 billion. The verdict is still out on who’s going to buy 23andMe, and then they will also own your genetic data. So big data is interesting, because it’s only as valuable as your ability to derive insights and knowledge out of it.

And that brings us to number two, which is machine learning. Machine learning is a subset of AI that basically says we’re going to teach machines to learn from the big datasets we have, but additionally, they’re also going to be able to continuously learn from the ever-evolving new information that we feed them. So the more they learn, the better they get. Google Translate is an interesting one that I really like to talk about, because it learns entire languages of the world, and then from there it’s able to take a look at images that you send it, figure out the patterns, and actually shoot back your translation for you. So now our machines are able to communicate with us to a certain level. But actually, the biggest innovation that has happened in machine learning so far is the time we gave machines a voice and a name. In February 2010, a voice assistant app launched on iOS, and within two months Apple bought it up. In 2011, when the iPhone 4S launched, Siri was born. And then three years later, Alexa was born.
Calla Lee 6:31
Now, Alexa and Siri kind of make machine learning seem innocuous. We ask them to play our favorite song, we ask them to read us the next recipe step, we ask them what the weather is, and they tell us all this stuff, and it’s all fun. But let’s not forget that these are all other ways that Amazon and Apple continue to gather data about us and actually listen to us. So the more we’re able to understand each other, and the more that data is able to understand us, we get to a point where the single data systems are kind of creepy. But what’s even creepier is when you think about linking multiple datasets together and matching you across them: when you put them under one roof, you can get something called a social credit system. If anybody’s ever watched the Black Mirror episode where there’s a world that has adopted a rating system, you can rate everybody based on their interactions from one to five. There’s a character in there who gets obsessed with her rating, and one day she experiences a series of unfortunate events that really hit her rating hard, to the point where her best friend uninvites her from her wedding, because her association with somebody with such a low rating actually lowers her rating. So obviously, because weddings make everybody a little crazy, she goes on this rampage, and this tirade drops her rating down to one, and then she gets arrested, sent to prison, and removed from the rating system.
Calla Lee 8:08
Now, this episode aired in 2016, but China has actually been piloting a social credit system since 2014. Their social credit system takes each citizen and each business and gives them a rating score. If you have a high rating, then you get better access to things like hotel discounts and better places to go, and you can make restaurant reservations super easily. If you have a low score, they might actually bar you and restrict you from good schools or even from getting a plane ticket. Now, China’s system is meant to be a standardized national assessment that will, one, regulate social behavior; two, improve quality of life; and three, promote traditional moral values.
Calla Lee 9:00
Yeah, a little creepy, right? But at the end of the day, a social credit system, no matter how much we use it, is still trying to understand us on a deep level. So a social credit system is just going to continue to collect data and collect data and collect data, and to it, the limit does not exist. When you have all of this data, eventually what you’re going to need to do is find a better way to understand all of it, which brings us to the wildcard, number four: quantum computing.

Quantum computing, in its simplest form, is able to process in one second what a normal supercomputer we have today would take up to a year to do. And not only is it able to process faster, it’s able to process exponentially more data than what we have today. Mostly, quantum computing was kind of in its concept phase up until about last month, when Google and NASA co-published a paper that said they had achieved quantum supremacy. They set up the quantum computer they had built together and fed it a problem, and it was able to solve it in 200 seconds, where our normal supercomputers today would take about 10,000 years to do the same thing. So if we now consider how big our digital and data footprint is, and we were able to push it into a quantum computer, would it actually know us better than we know ourselves? Would it know when we get mad, when we get sad? Would it know what makes us angry? In fact, would it even know who we hate, and consequently who we love, and how we love? So are quantum computing and a lifetime’s worth of data the actual leverage point we need for a machine to understand the three-dimensional qualities of the human experience, to be able to help us find better matches and maybe even solve our dating woes? No more ghosting, none of that.

So I want to leave you guys with a bit of a pocket scenario, and then a couple of thought starters. Let’s say that one day Tinder gets plugged into a social credit system, and from the day that you’re born it gives you your ranking based on who your family is, where you grew up, and what their beliefs are. Then it starts to tell you: these are the number of companions and friends that you can have, the ones we think you get along with. And that counter is going to change every day as you make decisions; you don’t like french fries, half of your friends are gone. Then, as you grow up, all of a sudden it also tells you: these are all the potential future companions you have, and that number changes. Then another day, you get a number that tells you how many casual sex partners you’d pair well with, and how many friendships you’d still have. And then one day, what it tells you is that you’ve been pre-selected for mandatory procreation, because your reputation score and your genetic data are so great that we want to see you continue.

So that’s a little pocket scenario, and I want to leave you guys with a couple of thought starters. The first one is: if we remove the entire journey, the experience, the struggle out of finding a real companion, is that actually going to commoditize the human experience? Are love and friendship going to kind of become social constructs of the past? Number two is: do we actually trust the people who are building these machines that are making these decisions for us?
In the case of China’s social credit system, whose traditional moral values are we looking to promote? Number three is: say you’re on a date in the future, and your date ends up lying to you. Is this system we have obligated to tell you, oh, their score went down a little bit? What did they lie about? Oh, well, this is what they lied about. And over time you’re actually able to see, oh, their score dropped really low once; what was that about? Social stalking would go to a whole new level. And number four is: what happens if we cross quantum computing with all this knowledge that we have of ourselves, our memories, the people we hang out with, all of that stuff? Will we actually want to? Will these robots know us better than we know ourselves? Will they be able to draw us a hot bath when they read our biometric data and say, oh, they’ve had a bad day?
Calla Lee 13:52
I got this. Will we actually prefer these over our actual human companions? Will these forever robots just be so great that they actually turn the human experience, and sex and friendship, into transactions or just mandatory acts of survival? Thank you.