Through Conversations

Everybody Lies: Seth Stephens-Davidowitz

Episode Summary

Welcome back to another edition of Through Conversations Podcast. This time, the thought-provoking Seth Stephens-Davidowitz joins us.

Episode Notes

Welcome back to another edition of Through Conversations Podcast. This time, Seth Stephens-Davidowitz joins us. Seth is an author, data scientist, and speaker who studies what we can learn about people from new internet data sources. His 2017 book Everybody Lies was a New York Times bestseller and an Economist Book of the Year. Seth is a contributing op-ed writer for the New York Times and has worked as a visiting lecturer at the Wharton School and a data scientist at Google. He received his BA in philosophy from Stanford, where he graduated Phi Beta Kappa, and his PhD in economics from Harvard in 2013. He is a passionate fan of the Knicks, Mets, Jets, and Leonard Cohen.

Seth is someone who will blow your mind. His most recent book, Everybody Lies, answers deep questions such as: How much sex are people really having? Are Americans still racist? We dive deep into his book and continue the conversation around it, but we also cover a lot of ground in many other fields, including free will, artificial intelligence, government, and much more. Even though this conversation was thought-provoking, I still have a lot of questions for him, and I hope we can have him back as a guest on this podcast.

If you find this conversation insightful, consider subscribing to the podcast on whatever podcast feed you use and sharing it with a friend; we truly appreciate your support.

And now, Seth Stephens-Davidowitz

---

Website: http://sethsd.com

Book: Everybody Lies: Big Data, New Data, and What the Internet Can Tell Us About Who We Really Are

Twitter: @SethS_D

Watch his keynote at Ciudad de Las Ideas 2018

---

In this episode we decided to write a transcript of the conversation instead of highlighting distinct parts from the episode. If you like this option better, let us know! If you prefer the highlights instead, let us know, too. If you like both, you know what to do.

You can find a copy of the transcript here.

---

Thanks for tuning in for this edition of Through Conversations Podcast!

If you find this episode interesting, consider subscribing to the podcast. And feel free to share it with anyone who comes to mind.

Instagram: @through_conversationspodcast

Twitter: @ThruConvPodcast

Website: throughconversations.com

Episode Transcription

Alex: [Music Intro]

Alex: Hey Seth.

Seth: Hi Alex.

Alex: Great to meet you. First of all, I have to say the vibes in the [Ciudad de Las Ideas] audience were incredible; they were going crazy for your keynote.

Seth: Oh, well thank you. It is a great event. And Andres was such an amazing host, and I think - - I don't want to stereotype, but I think the Mexican audience is kind of a little bit less prudish than some American audiences.

Alex: Yeah.

Seth: Which I kinda like, cause a lot of my book does talk about these topics like sexuality.

Alex: Yeah.

Seth: That aren't always discussed. And I think, yeah, again, I don't want to stereotype, but I think, you know, Mexican people have a nice sense of humor and a nice kind of openness towards ideas that, I think, made for a very good connection between my talk and the audience.

Alex: It was great.

Seth: I also definitely thought it was among my favorite audiences.

Alex: You should come back when you write your next book: Everybody Still Lies. You need to come again [to La Ciudad de Las Ideas].

Seth: Well, Andres said I'd have to have more information about Mexicans to analyze the talk. Although I did have that one fact about Mexicans. Do you remember that one? About the husbands and wives?

Alex: Like the poems for my pregnant...

Seth: Yeah, I think a few people from Mexico said they really liked that one, because they come across as great gentlemen. It's the top Google searches that people make about a pregnant wife. In the United States, the searches are things like: my wife is pregnant, what do I do? My wife is pregnant, now what? And in Mexico, it's: love poems for my pregnant wife. So Mexicans definitely show off when they Google search.

Alex: I read your book, and I was just impressed. After listening to you at Ciudad de Las Ideas, I couldn't stop writing down questions to ask you. Reading your book was a roller coaster ride, because you can feel depressed about some of the facts you mention, like racism, but then you feel excited about the myriad of possibilities big data holds.

Seth: Yeah.

Alex: So, from the perspective of morality, what do you think about big data? How can we stop using it to, as you say, stalk people, and start using it to help people?

Seth: Yeah, I think, you know, there are definitely concerns about some of the ethical implications of data and companies. People get freaked out when they hear stories like Cambridge Analytica, with the Trump campaign seemingly using data to manipulate people. I think some of those stories are a little overblown. The Cambridge Analytica story in particular: I know some of the tools they were using, and I don't think they were quite as powerful as they claimed. I think it was a bit of a snake-oil product they were selling. But in general, I do agree that big data is incredibly powerful, and corporations trying to make money can use it for bad ends. So I think the answer just has to be more regulation. The thing I'm worried about is that governments tend not to know much about data. They don't know anything about machine learning or artificial intelligence, so it can be really hard for them to properly regulate these tools. And even data scientists themselves don't always know exactly what's going on under the hood; they don't know all the reasons their algorithms make certain decisions. So it can be much more difficult to regulate some of these tools than traditional products.

Alex: In congressional hearings, you see politicians saying: I can't even access it, how can you explain it to me? They're trying to regulate something they don't know anything about. But do you think Google can be run sustainably, in a regulated manner? Who could be in charge? Do you hire a private company to do it? Who would want to work in the government regulating these companies?

Seth: Well, traditionally there have been really smart people who care a little bit less about money. Smart people always kind of have two options: they can try to make a lot of money, or they can try to do something good for the world, or they can find that slight niche of doing something good and making a lot of money. But it's not always that easy to find those niches. There have always been a lot of smart people who are happy to devote their lives to helping the world, to improving the world. You know, instead of being corporate lawyers, they choose to be public defenders or environmental lawyers or civil rights lawyers. So I think, similarly, a lot of people with data skills, instead of trying to work for one of the big companies getting more people to click on ads, will try to work with the government to regulate some of these companies.

Alex: Yeah. And right now, who has the upper hand? Who has the leverage in using big data: corporations or institutions?

Seth: Corporations are way ahead, partly because they have so much more data. A lot of the best people want to work for the corporations because the data's more fun: if you study artificial intelligence at a university, your resources are fairly limited, but if you work at Google, your research resources are nearly unlimited. So many of the best professors at least take breaks to work at Google. That said, I think that is also a check on some of these companies: scientists tend to have a pretty strong sense of ethics, they want research to be reproducible and to be used for good and not for harm, and they put a lot of pressure on these companies to follow good ethical guidelines. So if the companies want to get the best scientists, frequently they have to act in ethical ways. It's definitely harder for Apple to recruit the top scientists compared to other companies, because they don't make a lot of their research public. And scientists will sacrifice some salary to work for a company that has the traditional scientific values of opening up their data and freeing the information.

Alex: Well, one theme I kept thinking about while reading your book, and you mention it in a chapter, was the ethics. How ethical is it to look at the Google searches of a person who is going to attempt suicide, versus invading his or her privacy?

Seth: Yeah, I think that's right, but I think there are new ethical questions. So another one is that Google traditionally has believed that they don't want to be biased. They show people information based on algorithms, not based on what the engineers think you should see. And that's, I think, an important principle. If you search for tax policy, you're not going to right away get Donald Trump's policy is great, or Donald Trump's policy is horrible. You're going to get whatever their formula says are the most relevant articles, because otherwise Google would have an immense amount of power to manipulate the masses, to show people what advances Google's interests or the political desires of Google's owners, rather than the truth or what's most important to the users. But you can imagine there are situations where this breaks down. For example, if someone searches how to commit suicide, I think there was a period where, because people can be horrible, some of the top returns for these searches were message boards telling people basically: do it, jump, your life's a mess, if you're searching for suicide, you should commit suicide. And I think most people would say that in that situation Google has an ethical obligation not to rely on algorithms, but to overrule them and say: no, if someone searches for suicide, they should not see any sites that tell them they should commit suicide. They should see suicide hotlines and other professional interventions to help, you know, against suicide. And they do that. They intervene, and right now, for most suicide searches, they show a suicide hotline. So on one extreme, I think most people would say that if you search for a politician, Google does not have the right to just tell you that politician is horrible, and on the other extreme, if you search for suicide, Google does have the right to intervene in the information you get. But what happens in between? I talk in my book a lot about racism, and people search things like N-word jokes. So what happens if someone searches N-word jokes? Should they just be shown the jokes, or should they be told about civil rights and how it's not okay to laugh at other people? If someone searches how to join ISIS, should they be given ISIS's website, or should they be told that they should not join ISIS, and that ISIS is a terrorist organization no matter what they've heard? So where's the line at which an intervention is warranted? It's not an easy question to answer, and not enough people are thinking about it right now.

Alex: No, definitely. Thinking about what you say about Google's algorithm and how it's driven to get as many clicks as possible: what are the people inside Google thinking about? These questions, obviously, but are they thinking about trying to create a new political landscape? Because Google has been a true game changer. We saw it in 2016, and 2020 won't be an exception, with a presidential election and the Democrats going around trying to win it. Just hypothetically: I'm writing code on the back end of the website, trying to get as many clicks as possible, right? So do they say, hey, we need to stop this, or do they just go with it?

Seth: Yeah, well, Google's a big company and there are all kinds of teams within Google. To some degree, that's kind of a good thing, because one of the most effective forms of government is checks and balances: no one person makes the decision. There are all these different teams. There's a team within Google called Jigsaw, which is a think tank working on problems such as terrorism. They actually recently came up with a plan where they bought ads to fight terrorism. So if you made searches for how to join ISIS, you would see ads that Google had purchased explaining why you should not join ISIS. That's kind of an in-between solution, where you don't change the algorithm directly but you use advertising to try to persuade people against that position. And sometimes it's just one-off: somebody comes up with an idea, like we should add the suicide hotline if someone searches for suicide, and people agree with that policy. So there's not one answer, and I don't work at Google now, so I don't think there's really a systematic policy to deal with these decisions; they kind of come up as they come up, and then they have different ways of updating them. But I think the general principle is that it's based on an algorithm, and the algorithm shouldn't really have political leanings. It should be much more about what people are going to click on and giving people the information they want to see, and you kind of need extraordinary reasons to intervene. So something like suicide; I think poison was another one. If you search for poison, they made sure that right away you get a poison hotline, the information that is most relevant. But again, once you start thinking about this, it's never-ending. There are so many places where you might want to intervene. What if someone searches for vaccines and autism? I think most scientists say it's not true that vaccines cause autism, but a lot of people like those theories. So if you search vaccines and autism and it's just based on clicks, then presumably conspiracy theories would rise to the top. But then, take: did Russia and Trump coordinate? A lot of Trump supporters would say that's a conspiracy theory, that it's ridiculous, and that if someone searches Russian coordination, they should be told it's ridiculous, that it's not a true story. So is it okay to intervene there? How do you determine whether a theory is a conspiracy theory or a legitimate theory? It's not easy to determine. What if someone searches for global warming? Should they be told that global warming is real? Should they be given contrarian opinions on it? It's very complicated.

Alex: Recently I also read Yuval Noah Harari's book Homo Deus, and I associated a lot of it with your book, where he says that our intuition has been hacked and that the question of free will can be dropped. Now Google and the tech companies in Silicon Valley have hacked our intuition. So, let's say everyone is searching for these things, but are we truly searching within ourselves, or have we been driven down those roads?

Seth: I mean, I think that's always been the case. Again, it's not new that corporations have been trying to persuade us to do certain things. If you see a Tide commercial on television and then you go to Google to buy Tide laundry detergent, is that your free will? If you hadn't seen that commercial, you wouldn't have made that search. You know, if you see a celebrity wearing a Rolex watch, then you Google "buy Rolex watch"; if you had never seen that commercial, you wouldn't have even thought you needed a watch. And that's just companies. Parents try to persuade us, friends try to persuade us. I've never particularly believed in free will, because our brains are just neurons that are firing, and if you knew every particle in the universe at one moment, you could probably predict where every particle would be in the next moment. So it's hard to know how free will fits with the modern understanding of physics. That said, while it's always been an issue, and people have always been persuaded by many sources, I do think that corporations with enormous databases have an unprecedented ability to manipulate people in the ways they want. I think the main way we're seeing this is in how addictive the Internet is. I talk in the book about A/B testing, and how Google and all these other sites are running all these experiments where they show one version of the website to a thousand users and a different version to a thousand other users, and see which version gets people to spend more time on the site. Then whichever one wins that test is shown to the masses. And they do that over and over again until their site becomes incredibly addictive and people have a hard time putting down their phones. I think you could argue that Instagram and Facebook and Twitter are among the most addictive products in human history, and I think part of that is the power of data analysis, and particularly A/B testing.
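
To make that concrete, here is a minimal sketch in Python of the kind of A/B test Seth describes, comparing mean session time across two groups of a thousand users each. Everything here is hypothetical for illustration: the session-length numbers, the simulate_session_minutes helper, and the use of a simple Welch's t-statistic as the decision rule.

```python
import random
import statistics

# A minimal sketch of the A/B test Seth describes: show two versions of a
# site to two groups of users and keep whichever yields longer sessions.
# The numbers, group sizes, and helper below are all hypothetical.

def simulate_session_minutes(mean, n=1000):
    """Hypothetical session lengths (minutes), exponentially distributed."""
    return [random.expovariate(1 / mean) for _ in range(n)]

version_a = simulate_session_minutes(mean=5.0)  # current design
version_b = simulate_session_minutes(mean=5.4)  # candidate design

mean_a = statistics.mean(version_a)
mean_b = statistics.mean(version_b)

# Welch's t-statistic for the difference in mean session time.
standard_error = (statistics.variance(version_a) / len(version_a)
                  + statistics.variance(version_b) / len(version_b)) ** 0.5
t_stat = (mean_b - mean_a) / standard_error

print(f"A: {mean_a:.2f} min  B: {mean_b:.2f} min  t = {t_stat:.2f}")
# If the difference is significant, ship version B to everyone and start
# the next experiment -- repeated endlessly, as Seth describes.
```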

Alex: What's interesting about this is that they weren't designed to be like this, no? They were shaped by trial and error and a constant feedback loop between the users and the companies. Somehow we, the customers, were the ones unconsciously telling them: this is the way it works best. No? It wasn't their idea.

Seth: Yeah, I mean, some of it was just ideas that turned out to be really addictive. Somebody at Facebook had an idea: let's allow people to share photos. That wasn't an experiment or a test or market research, that was just an idea. Or someone decided: we should create the news feed and have a continuous update of what your friends are doing. That was an idea one person had, not an experiment. But once they have that idea, the size of the photos, or which friend goes to the top of the news feed, or the text of the posts in the news feed, or the buttons they include in the news feed, all of these rely on experiments, and they choose the text and the names and the buttons that are likely to get people to spend the most time on Facebook.

Alex: So right now, we could say that big brother is not the government; big brother is the tech companies.

Seth: Oh, the government has also done some creepy things, as Edward Snowden told us. But I think when the government plays big brother, they have to rely on the companies, because they can't do it themselves. They can conduct a census, but that's not going to get all that much information about people. They could study IRS records and learn a little more. But the really juicy information about people, the things we wouldn't want the government to know, tends to be in the hands of corporations, so for the government to become big brother, they'd have to collaborate with the corporations and then use that data for bad purposes.

Alex: The reason companies have such an incredible amount of information is that we don't have any incentive to lie to Google. As you say in your book, no one's watching me write what I'm writing, but if I'm talking to another person, or someone comes to my house to conduct a census, I have every incentive to lie, due to social desirability bias.

Seth: It's not even that you don't have an incentive to lie; you have an incentive to tell the truth. So I explain in the book: if you ask someone, are you gay?, a lot of people in places where it's hard to be gay won't say they're gay, but they will search for gay porn. And you can see the incentive there: if you enjoy watching gay pornography, you have an incentive to type that into your computer. If you have a health problem, you might not have an incentive to tell a stranger about it, they might think less of you, and you might not list it on a dating profile, but you would have an incentive to tell Google, because Google can give you help. If you enjoy racist jokes, you have an incentive to search for that on Google, because you're going to get the jokes you find really funny. So I think Google gives you an incentive to kind of confess your secrets, I would say.

Alex: I recently read that there's an AI psychologist being created at USC, and I don't know the precise percentage of people who feel more comfortable with the AI psychologist than with a human being, but overall, people are more comfortable telling their secrets to the artificial intelligence. Is this good or bad? Should we be working towards that role of being honest with an artificial thing, or should we try to get past the awkwardness of talking to each other and work on that?

Seth: I think the psychologist one's interesting, because there the incentives are kind of similar. You would have an incentive to tell your secrets to both an AI psychologist and a human psychologist, because they can help you best when you tell them.

Alex: But the thing is that if we're talking and I'm a psychologist and you're telling me something shocking, something even creepy, I can make a facial gesture, you know, and you might think: he was way too shocked by this, maybe I should stop saying that.

Seth: To some degree. I think if you really didn't care what people thought of you - -

Alex: You are not a human.

Seth: Yeah, it's almost a mental disease, right? If you meet someone on the street and you're like: hey, I have this sexual perversion, or you start listing your health problems or your marital problems, they'd just be like, what's wrong with you? So I think it's kind of normal that we feel a little wary telling human beings our secrets. And it's hard to compartmentalize to that degree, where: okay, this person looks exactly like the person I just saw on the subway, but the person on the subway I shouldn't say these things to, and this doctor I'm sitting next to, I should say these things to. Whereas if it's a robot, it's so different that you feel comfortable opening up about your secrets. But I do think that we are probably not honest enough in life. I talk in the book about relationship problems, and clearly there are many people in relationships who are confessing their concerns to Google instead of maybe having a conversation together.

Alex: One moment where I burst out laughing was when you said in your book that people search on Google "I love my girlfriend's boobs". And you said, "I don't know what they're expecting to find with this."

Seth: You should just tell your girlfriend, presumably she'd like to hear it.

Alex: Yeah. And one thing that just popped into my mind was: so let's say we are in a congressional hearing. But instead of politicians asking the questions, there are robots. Do you think it would be easier to find the truth?

Seth: People would be more honest? That's an interesting hypothesis, but again, it's not just about incentives. Incentives are one issue, and they would be the same, because you're televising the hearings; there would be no difference whether it's a robot or a politician asking. But it is possible that people would be more honest. I think of - - I don't know if you know the radio host Howard Stern. He's popular in New York, and he was so honest on his show: how horny he was, how he had a small penis, he'd talk in detail about his sex life, all these things that normal people don't love sharing with the world. And he was saying these things to - - at his peak, I think he had like 10 million listeners in the United States every morning. So he'd tell some embarrassing, humiliating story and 10 million people were listening to it. But he's just in this little radio studio with his friend and a couple of other people he works with, and you feel like you're just talking to your buddies. It feels like a little locker room talk, and you don't feel like 10 million people are listening while you're talking. If he instead had a huge studio audience, even though the incentives would be exactly the same, because just as many people were listening, he might act differently. So I think what you're getting at is that it's not just incentives that get people to be more honest. You can try to put them in situations that make them open up more.

Alex: Yeah, like setting the scene. Yeah, definitely. You mentioned Howard Stern on another podcast, how much you liked him; I think it was on the Smart People Podcast. What inspired you about him? His transparency? His openness about his life?

Seth: I think it was a little bit like . . . it's the same thing that drew people to Trump. And I'm not a Trump supporter; you can tell if you read my book, he scares the crap out of me with his authoritarian tendencies. But I think there is this sense, as you go through life, that a lot of what people are telling you is just kind of bullshit. It all feels very surface and very unreal. So I guess that's probably what drew me to the Howard Stern show when I was a kid, and everybody was drawn to him. I think everybody in my school listened to that show religiously. I lived in the New York area and he was in New York, so it was maybe even more popular there, but he was very popular all throughout the United States. I think it was this idea that he was just so different from everybody else, and the conversation was so much more honest. You know, these are things that everybody's thinking about; all men are thinking a lot about sex. Kind of everyone puts on this front as they go through life, and it can be seductive when someone doesn't have that front and is a little more politically incorrect. Again, I think that is what drew a lot of people to Donald Trump. I can understand the seductiveness of Trump, but I was not moved to support him.

Alex: And maybe Trump should read what you say in your book about how immigration helps pretty much everything.

Seth: Well, yeah, I talk about how immigrants are disproportionately represented among the most successful Americans. Successful Americans are just so much more likely to be immigrants than other Americans are. So I think one of the secrets of America is that we've drawn the best and brightest and most ambitious from around the world.

Alex: Definitely. One thing that also comes up in your book, and you talked about this at Ciudad de Las Ideas in 2018 too: during your keynote you mentioned that something weird happened when YouTube crashed, that searches on porn sites spiked. Do you think people search more for short-term, superficial things?

Seth: I don't know; YouTube probably has that data. My guess is there's probably a lot of variation, it depends on the person. I tend to watch short videos. I think if a doctor looked at my Internet behavior, he would diagnose me with ADHD, because I can't watch anything for more than like two minutes. I'm always bored and moving to the next thing. It's true on Spotify, too: I listen to a song for like two minutes and then I'm like, oh, let me find another song. My guess is there's a lot of variation, though on average, yeah, I would imagine people probably do watch longer things on YouTube than on YouPorn.

Alex: What I was thinking about with this question was: do we, as a society, look for long-term things? Like asking about government officials . . .

Seth: I think in general people probably search for that less than we'd like to think. Mostly, I think people tend to be more self-absorbed and selfish than they let on. I even did this little study where I looked at what people are searching for at 3:00 AM when they wake up in a panic. You know, a lot of my friends will say: I can't sleep, I'm worried about global warming, or I'm worried about the state of the world. And that's not really what people are searching for at 3:00 AM. They're not looking for global warming and its effects. They're looking at health conditions they might have, or concerns about losing their job, or how they can get a new job; much more concern about one's own situation than global concerns. I would guess that overall people search much more for ways to improve their own lives than they let on.

Alex: In your book you talk a lot about racism and racist issues in the United States. Are you, or is someone, using that data right now to see how the election will go?

Seth: Yeah, it's a little early. Everyone always asks me: can you predict the election using Google searches? It's early on, and even with the last election, I said there were clues that Trump could do better than expected; there are sometimes clues that a candidate's doing better. The clues tend to reveal themselves closer to the election. When people start looking into whether they're going to vote, you can predict whether turnout is going to be high, where turnout is going to be high, and which groups turnout might be high among. This far out, I'd be a little hesitant to make broad predictions about the election, though. There are probably insights there. I haven't looked into it as much as I probably should, since I get all these emails like: tell us what's going on in the election. As I said, I'm very ADHD and I move between topics; I'm interested in all kinds of different questions for other books. So it's hard for me to stick to the same thing for many years, but I'm sure there are clues there.

Alex: And during Trump's presidency, did you see racist searches on Google spike or decrease?

Seth: No, they haven't really gone up much. I think a lot of what Trump has done is just brought out racism that has always existed. So I think racism used to be secretive and now people are more open about it than they used to be. But I don't know that it's gone up. I don't see much evidence that it's actually gone up. If anything maybe it's gone down a little bit. Some people have been turned off by some of the open racism.

Alex: Well, let's switch gears a bit towards social media. One thing I found really counterintuitive in your book was that a lot of the information we see on Facebook comes from other people's viewpoints, which are not the same as ours, contrary to the mainstream idea that social media creates echo chambers and confirmation bias. Can you expand a bit on that?

Seth: Well, the argument's a little more subtle than that. It's not that on Facebook or on the Internet we're especially likely to encounter opposing political views; we actually are more likely to associate online with people who share our political views than with people who oppose them. But the argument these authors made, and many authors have now made and confirmed, is that that's also true offline, and it may be even more extreme offline than online. For example, people are more likely to live near people who share their political views. People are more likely to work with people who share their political views. People are more likely to socialize, to go to dinner parties or bowling alleys, with people who share their political views. So in the offline world we're constantly encountering people who share our political views; online, it seems, we're a little more likely to see opposing political views. And one of the reasons for that is that on Facebook we have our weak ties. You tend to connect not just with your really close friends or personal contacts, but with your crazy uncle you haven't seen in 10 years, or your high school acquaintances, and these people are more likely to have opposing political views. So in some sense you get a broader swath of the population on Facebook than in your everyday offline life.

Alex: That makes sense, no? Because you wouldn't want to be living near someone you could argue with and then start a fight. But on Facebook, at least, you feel this sense of anonymity.

Seth: Yeah, exactly. I mean, some people get so annoyed by the political views they see on Facebook that they block a bunch of the people with opposing views. They create their own filter bubble. Most of my friends share my political views; offline, I would say 95% of the people around me share my political views, but on Facebook it's probably closer to 60 or 70%.

Alex: That's interesting. Let's switch gears a bit. After you finished your research for the book and everything, did you yourself try to change the way you use your phone, or the way you use Google searches? Did you expect yourself to change? You also mention this in your book: the power of knowing all this is that we can change our habits and improve.

Seth: I was talking more about how we could use research as a society to learn how to fight problems; I don't know if the changes I was talking about were about how you personally use the sites. I don't think there's been a big change in my use of them. I do think twice about some of the emails I send, just because of some of the hacks that have happened. A mentor of mine said that he doesn't send an email unless he'd be okay with it being on the front page of the New York Times, which I think is a reasonable approach to the Internet. But there can be some balance. Maybe the answer is to have that philosophy but also be okay with more things being on the front page of the New York Times. Like, who cares if someone sees an email? You know, there were those hacks of celebrities where their nude photos were leaked, and it was incredibly embarrassing, horrible; obviously that's a horrific violation of someone's privacy. But I think one of the celebrities, maybe it was Scarlett Johansson, said something like: it was horrible and whatever, but it's only a body, and is there anything I did that was all that wrong? I sent photos to my boyfriend, and I don't see what's so bad about that or why I should be so embarrassed. And I thought that was kind of a good point. Again, that doesn't mean there shouldn't be massive security to prevent these types of hacks, and people do have the right to privacy, but I think people are a little uptight; people are so easily embarrassed by things that aren't really that embarrassing.

Alex: Yeah, definitely. What do you see in the next five or ten years - - well, let's keep it to five years - - being the most amazing breakthrough in big data? What do you think should be the first thing accomplished with it?

Seth: It's so hard to predict the future of data and the future of science. I think health is an area where we haven't had breakthroughs as big as I was maybe expecting, and I think we will have more breakthroughs in that arena. For one thing, I think we'll be predicting diseases much better, well before they're picked up by doctors.

Alex: Which one would you expect? Which one would you like it to be?

Seth: I think personalized health; I would put that in the domain of personalized health. Generally, our healthcare system relies on treating people as if they are average. We have kind of a standard treatment for each disease, but I think in the future we'll really personalize medicine, so that people will find what works for people most similar to them, not for people in general. And I think that trend is happening. Healthcare is moving slowly, in part because it's hard to share the data; there are all these regulations around data sharing, so it's hard to build these big, massive data sets. If we did a better job of sharing data among different sources, disease research would progress much faster.

Alex: Yeah. Do you think one approach to this could be using the information people share on 23andMe and those DNA testing sites?

Seth: Yeah, though I think some of the insights from 23andMe are a little overblown. You might have a slightly higher genetic predisposition to a disease, but as you go through life, you might get more clues from your life experiences than from the genetic clues. For example, you can now predict maybe 10% of the variation in IQ based on genetics. So you would have some clue of how smart someone was if you had their entire DNA. Technically, an employer could use that to make a hiring decision if they had access to someone's DNA, but it would be a very, very weak signal compared to all the other information they could use. If they knew the person's SAT scores, what college they attended, their grades, or if they just talked to them for 10 minutes, that would be a much better signal of their IQ than their genetics. That's a pretty common phenomenon: genetics gives a very rough range of behavioral outcomes, but life tests can be much more accurate. So, for example, if you want to know someone's risk of heart disease, you could look at their genetics, but you could also look at their blood pressure and cholesterol levels, and probably examine their heart in more sophisticated, modern ways, and learn a lot more about their risk of a heart attack than you'd learn from the genetics.
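
To illustrate Seth's point about weak versus strong signals, here is a small hypothetical simulation in Python: a genetic score built to explain roughly 10% of the variance in an outcome, next to a direct "life test" of that outcome. All coefficients are made up for illustration; only the 10% figure comes from the conversation.

```python
import random
import statistics  # statistics.correlation requires Python 3.10+

random.seed(0)

# Hypothetical setup: the genetic score explains ~10% of the outcome's
# variance, while the "life test" measures the outcome much more directly.
n = 10_000
genetic, outcome, life_test = [], [], []
for _ in range(n):
    g = random.gauss(0, 1)                 # genetic score
    y = 0.32 * g + random.gauss(0, 0.95)   # outcome: genetics ~10% of variance
    t = 0.9 * y + random.gauss(0, 0.44)    # life test tracks the outcome closely
    genetic.append(g)
    outcome.append(y)
    life_test.append(t)

def r_squared(x, y):
    """Share of variance in y 'explained' by x (squared Pearson correlation)."""
    return statistics.correlation(x, y) ** 2

print(f"variance explained by genetics:  {r_squared(genetic, outcome):.2f}")   # ~0.10
print(f"variance explained by life test: {r_squared(life_test, outcome):.2f}") # ~0.80
```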

Alex: Yeah. Also, the power of big data and Google searches is that, as you mentioned, you can associate searches of symptoms across people to say whether something is preventable, or to treat that person.

Seth: Yeah. I tell a story in the book about how researchers were able to predict that someone would get pancreatic cancer based on the symptoms they searched for, and I think that research is moving quickly. And it's not just using searches; it's also using other things, such as mouse movements. Somebody recently showed me a study by Microsoft researchers where they analyze search behaviors and mouse clicks to detect Parkinson's: if someone's fingers are shaky, that may be a sign of Parkinson's. And that can be a quicker process than having the person realize there's something wrong with them, go to a doctor, and get a test.

Alex: That's crazy. Those things are the best outcomes that can happen with technology. But we can also go the other way, to the dark side, no? Companies so concerned about profits that they forget there's a person behind the screen: how can I maximize my profits?

Seth: Yeah, definitely. Again, it's always been an issue with health insurance that insurance companies don't want to cover someone who's sick because they're likely to spend a lot more money on healthcare in the future.

Alex: Or loan companies? Banks . . .

Seth: Again, a lot of these issues have always been issues. If you have a bad credit score, you have a more difficult time getting a loan, and it may be that you have a bad credit score because you had some troubles in the past, and those problems are all behind you, but you're still screwed by your credit score. Similarly, new information, using sophisticated machine learning, may conclude that your probability of paying back a loan is very low, and you might not get a loan even if that's not true. It's not a new problem, but it can become a more severe problem with data.

Alex: Yeah. Like them seeing your Facebook news feed - - how could that interfere with an interview now?

Seth: Yeah, and I think another concern is that people can waste resources trying to look good on paper. It's kind of an arms race, where everybody spends more and more time giving signals that they're a reliable human being, and it's not good for anybody; it's a huge waste of resources. If people get loans only if they have a long Facebook profile with proper grammar and the perfect likes and the right interactions with their friends, then everybody will spend all this time on Facebook trying to get their profile just right, to prove they're the type of person who should get a loan, instead of being out enjoying their life.

Alex: Do you think we will follow what big data tells us, or who will dictate how we interact with these new technologies and this new information?

Seth: I'm fascinated by the future. I read a book a week on the future: somebody smart explains what the future is going to look like and I'm like, oh my God, he or she is exactly right; then I read someone else with a different view of the future and I'm like, oh my God, she's exactly right. So I think it's really hard to predict what things are going to look like in five or ten years, except in general areas. I think algorithms are going to play a much bigger role in many, many areas of society, and people are going to rely much more on computers to make decisions than on human beings. In medicine, you know, you talked about AI psychiatrists, but I think much of the medical profession is going to rely far more on algorithms than on people to diagnose and treat diseases. Education will also use algorithms much more to determine what lesson plans to use. Some businesses have already been revolutionized by data and analytics; some haven't, and I think the ones that haven't will be. The trend is clearly moving towards more analytics, and I don't think there's really been an industry that's gone too far in the direction of analytics. Most industries just keep moving more and more, and learning more and more, as they rely more on analytics. That's happening in baseball, which I talk about in the book. There were all these early insights from analytics that a player - -

Alex: From my team, actually, the Oakland Athletics.

Seth: Exactly. Moneyball and Billy Beane. There was this idea that you should have players who walk more, that you shouldn't bunt, that you shouldn't do hit-and-runs, and those kinds of insights came into baseball. But as more teams started relying on analytics, more and more insights happened. People thought the Moneyball thing was done; everyone had kind of copied what the A's did. But then the Astros used even more advanced analytics and had more success than the A's or the Red Sox or previous teams. And now the Tampa Bay Rays have invented new forms of pitching, where you have one pitcher who pitches just the first inning of a game, and that's all based on the analytics. So, you know, just when we think it's done, we get new kinds of revolutionary insights from data.

Alex: Yeah. Well, Seth, we've been going for almost an hour - - 55 minutes. What are you working on? Are you working on a new book?

Seth: Yeah, I am working on a new book, which hopefully will be out within the year, about how to use data to make better life decisions.

Alex: Like which career to choose . . .

Seth: Which career to choose, who to date . . .

Alex: Can't wait to read it, really. Thank you so much for this. This has been great, and I truly appreciate you taking the time. Hopefully I'll see you at the next Ciudad de las Ideas with the new book.

Seth: Yeah. Great. Thanks so much for having me.

Alex: Thank you so much. Take care.

Seth: Bye Alex.