Description

Is social media harming us? Dr. King, the Director of Consumer Privacy at the Center for Internet and Society at Stanford Law School, discusses what is wrong with current internet algorithms, unseen manipulation, and behavior modification techniques.

Transcript:

Lisa:Method to the Madness is next. 

You're listening to Method to the Madness, a biweekly public affairs show on KALX Berkeley, celebrating Bay Area innovators. I'm your host, Lisa Kiefer. Today I'm speaking with Dr. Jennifer King. She's the director of consumer privacy at the Center for Internet and Society at Stanford Law School. We'll be talking about the problems with social media today. Welcome to the program, Jennifer. 

Jennifer:Thank you. 

Lisa:You've recently gotten a new job at Stanford Law School. Can you first of all tell us what you're doing down there? 

Jennifer:Yes. I just graduated my PhD back here at Berkeley. 

Lisa:In what?

Jennifer:Information science. At Stanford, I am the director of consumer privacy at the Center for Internet and Society at Stanford Law School.

Lisa:You just started though.

Jennifer:At Stanford, yes. I started in April before I graduated. 

Lisa:Last week, I had an interesting conversation with Jaron Lanier, who just wrote a book called Ten Arguments for Deleting Your Social Media Accounts Right Now. I thought I'd have you on the show to talk about some of the ideas that we talked about since that is your area. Everybody knows there's something wrong right now in our society. Journalism is failing. Politics is failing. People are afraid they're losing their jobs to AI. Whether they are or not, they're afraid of it. There's a lot of social anxiety. What do you see as the problem with social media or do you?

Jennifer:With social media specifically? Because there's a lot there. I think one of the challenges with social media is that it de-individuates us or it takes us away from our humanity to some extent. It's the same way when you're driving in a car and there's that object between you and the rest of the world and you might be a totally reasonable person in real life and then you get behind the wheel and you get road rage or you just find that you treat people more like objects than other people. When you communicate with people through a computer, it's that same object between you and them. I think it prevents us in some ways from connecting with people. 

Lisa:There's a lot of research now that backs up, especially with young people, that there is more anxiety and more sadness. I don't know exactly how they're measuring sadness, but people are acting out differently, particularly young people, which is scary. I think we need to re-examine Google and Facebook and others. It's not that the people behind them are personally in the business of behavior modification, but the business model they've created with machine learning literally takes us on a downward path. It's not left or right. It's actually down, because the algorithms support and make money off of negative emotions.

Jennifer:Sure. I've worked in Silicon Valley, and I can tell you having been-

Lisa:Who did you work for?

Jennifer:I worked for Yahoo. I worked for other startups too, but I worked for Yahoo back in the early 2000s, and I was part of, not directly developing social media software, but part of that scene, you could call it, in the Bay Area around 2000 and after, part of those social networks that emerged during that time. I think we were all very optimistic, and there wasn't a lot of thought about what the consequences were of any of these things people made. It was mostly like, let's just try this and see what happens. I think at first, there was an optimism driving it. We're doing this because let's see what happens. It could be really interesting. I think that shifted. It shifted over time from that to let's do this and maybe we'll get acquired by somebody, to now let's do this and see how much personal data we can potentially mine from this product and from the people using it. Part of that is the consequence of building this entire infrastructure off the idea that it's free and not making people pay for it.

I think the other piece of it too is that most of the people in this space, I would argue, are not thinking about what these products would do or these services would do to kids. It was one thing to put a lot of this in the hands of people who already had a solid footing on what it meant to talk to people in reality. We didn't grow up with phones and we barely grew up with computers, many of us, and so we had a foundation for what it meant to interact with people. Now suddenly, you have kids who've grown up immersed in this technology and it's shifted to where it's almost as if they don't know how to interact with each other. 

Lisa:Right. It's a big intermediary for them. 

Jennifer:Yeah. Professor Sherry Turkle has written extensively on this. I think she's done some of the best research on it. 

Lisa:Where is she?

Jennifer:She's at MIT, and she's published several books in this area; that's where I'm drawing some of my own insights from.

Lisa:It's an unfortunate collision of math and human biology. 

Jennifer:Yeah. I would say, too, part of the challenge is that being a technologist has suddenly brought with it a lot of power in the society. We don't educate technologists to think about other people. If you are a Berkeley or a Stanford computer science student, for the most part, I don't believe you even had to take any ethics requirements in the past. I know that's changing, but you've been able to tinker with this giant social experiment without necessarily having any education or training or having been challenged to really think about the consequences of your actions on other people. It's mostly just been a chase to see what cool thing can we make next. I think we're seeing the consequences of that.

Lisa:We are. There seems to be a groundswell now of people, at least researchers, academicians, economists, who are now looking at all of this behavior modification and the implications. They're also looking at data as labor instead of data as capital because for the first time ever, I think there are just a few people who own these big, what Jaron Lanier called siren servers, and they're making money on everybody else. There's only one buyer and multiple sellers of information so it's a monopsony. 

Jennifer:Yes, a very hard word to say. 

Lisa:Yes. I want to talk about that, all of the data that's been pulled from us with our knowledge and without our knowledge. 

Jennifer:That's a tough one because, from my perspective, I study privacy and I study people. I try to understand information privacy: how people think about it, what they care about. I'm willing to bet that most of us have figurative piles of digital photos hanging out either on our personal computers or on our phones, and managing all those things is really hard. I don't think I know anybody who actually has a grip on the number of photos they take.

Lisa:I don't even look at them anymore. 

Jennifer:Right. I think you can extend that to your own data. We talk a lot about wanting to give people more control and put them in control, the idea being that if we could just somehow get our hands on this ephemeral data, then it will be okay. My skepticism with that just comes from the fact that it's such an information overload. It's possible we could build an infrastructure that makes it easy for people, or at least easier, but right now, I think the push to get people's hands on the data isn't necessarily going to have the effect we want it to, or that we might be hoping it will. I think there are good reasons for making the companies open up their platforms that have to do with issues of power and control and just trying to force a level of openness that doesn't exist presently. Whether that ends up empowering people individually because they can actually see what data is collected about them, I'm a little bit skeptical of that actually.

Lisa:What about data? People talk about universal basic income, but now people are talking about you've gotten these companies rich off of all this data and with your consent. You've given this away, but now-

Jennifer:Kind of your consent. 

Lisa:Yeah. There are people, groups like Datavest, and researchers. Even at Stanford, they're looking at the idea of monetizing your data so that, in place of a universal basic income, someday you might get a certain amount of money every month in return for the private life you've given away in that barter.

Jennifer:Not to wallow in trendy technologies right now, but I think we've ... I don't know if your listeners or if you've talked so much about blockchain. 

Lisa:Oh yeah, I've had people on here actually from the UC Berkeley blockchain group. 

Jennifer:Great. I don't know if blockchain is the answer to that problem, but it seemingly could be an answer to the data management piece. Every proposal I've seen in this vein has put the burden on the individual to manage it in a way that I don't think most people want to do. You can't manage your photos. You probably don't want to manage your personal data on a day-to-day basis either.

Lisa:Exactly.

Jennifer:I don't even balance my checking account anymore. I just ... What has to give? I have to say I don't know too much about the blockchain proposal insofar as I have seen it voiced as a potential solution for this distributed data management problem. 

Lisa:It seems to me that if Facebook and Google were smart, they would get off this business model, which is on a downward path anyway because it's going to implode. You can't treat data as capital forever. If they would say, okay, we realize what we're doing and now we're going to turn around and give you back something... but they'll probably never do that because of their business model; they make too much money. There are groups like Datavest that propose a co-op organization where they are the intermediary between you and the big computer monsters they lease to do this complex math, and blockchain would probably be part of that, keeping accounting records and-

Jennifer:Right. Making it manageable for end users, for individuals. I think that the challenge is that right now in some ways, collecting data is more valuable than it potentially has been before because companies are using this to feed their AI systems. It's a big training base. Given how much focus right now is on AI and improving those systems ... As an information scientist, I can tell you that you need data to train those systems to improve them. 

Lisa:Like language translation. 

Jennifer:Absolutely.

Lisa:You need real people. They're grabbing real people's translations in order to make the Google Translate work better. 

Jennifer:Which I think is actually a really excellent example of this being used for good in a sense.

Lisa:It is, but what about the jobs of human translators? There's no real artificial intelligence right now, but at some point, when perhaps there is, they won't have a job anymore.

Jennifer:Well, I don't know if it necessarily obviates all human translators, but I will tell you I was in Mexico last year. I wasn't going to hire a translator to go with me from place to place to place, but Google Translate was really helpful for trying to talk to a cab driver because my Spanish is terrible. 

Lisa:I agree with you there, but let's pay those human translators for that data. 

Jennifer:Sure. Yeah. Just to go back to that thought, though. One of the reasons I don't think you'll see the companies recognize that this could be a downward slope is that, as they're trying to improve their consumer AI systems right now, there is probably a fanatical need or desire for as much data as you can get. Given that, I think if you want to see the changes you're talking about, they will probably emerge through civil society and other groups putting together proposals and pushing them. I think you'll have to see it from the government side ultimately. I don't know if you'll see it in this country.

Lisa:There does have to be some oversight. I don't know. I feel like this problem is so urgent right now. When you look at the Annapolis shooting, which some people are saying was triggered by trolls online, and that could be misinformation. It's hard to find the truth, and that is hurting our society. Also with journalism, I use that as an example a lot because they missed the Trump election. They missed the recent race in Brooklyn, the young woman who beat out the longtime Democratic incumbent. That was completely missed. What's going on? They can't afford investigative journalists. Most organizations can't anymore, so finding out the truth is really difficult.

I think that's changing us. In so many ways, it's making us more siloed. We don't know what red states are thinking because we only see what the algorithms want us to see. It's creating this bifurcated society. In fact, it turns out a lot of technologists send their kids to Waldorf schools and Montessori schools because they're worried about this. 

Jennifer:I don't let my kids use a lot of technology. 

Lisa:You don't? Why?

Jennifer:Well, I guess to go back full circle to the social media piece. Again, I think using social media is a different experience for those of us who developed our skills in interpersonal communication and relationships in person, and it's a much different equation when you're talking about kids. It used to be that the internet was connecting us across space, and now we're seeing it used in a very hyperlocal way, connecting people who are sitting right next to each other. That's a very different vision, I think, than where we started from, and I don't think we've thought so much about what that means for the people inhabiting that space together. Certainly with teenagers, you see it in terms of the competition it fosters for, I want the best Instagram photo.

I would say there are two big parts to it. One of them is parents saying something, I mean really being involved and understanding what their kids are doing, which I realize is not always easy, especially if you're not particularly tech literate. As a parent, I'm often amazed how many small children I see who are just given phones while the parents are ignoring them, and they just go on and on and on. It just amazes me. There have also definitely been greater calls for tech companies to really start thinking more about the implications of what they're doing, not only on this but on a lot of parts of their work across society.

I think the types of restrictions we have on phones, for example, are in their infancy. We could do a lot more in terms of thinking through what an appropriate set of parental controls on a phone looks like. For example, metering kids' usage so you can teach them and bound it: this is what it means to be on your phone for 20 minutes, and when the 20 minutes are up, you're done. You're locked out.

Lisa:They can get around that stuff though. They're going to be so much more tech savvy than you or I.

Jennifer:I have younger kids, so I'm still-

Lisa:They'll just hack your restrictions. 

Jennifer:I'm still biased towards the fact that I can take the thing away from my five-year-old versus having a 15-year-old with a phone, which I realize is different.

Lisa:If you're just tuning in, you're listening to Method to the Madness, a biweekly public affairs show on KALX Berkeley, celebrating Bay Area innovators. Today, I'm speaking with Dr. Jennifer King. She's the director of consumer privacy at the Center for Internet and Society at Stanford Law School. 

Well, I wanted to ask you about your new job at Stanford Law School. California just passed this pretty intense data privacy law. It isn't as restrictive as Europe, but can you talk about that and explain what's going on to our listeners? 

Jennifer:The law that was just passed was the result of, we'll say, panic by the tech sector over the initiative that was to appear on the ballot in November. There was a ballot initiative, or it was placed on the ballot, that would have placed some more restrictions on privacy with respect to tech companies. Some of the provisions in the ballot measure ended up in this final bill, but not all of them. When I looked at this bill, and again I'm not a lawyer, so that's my disclaimer for my own analysis, there was one thing I actually was frustrated by, which I don't know if we'll see addressed ultimately, because a lot of the talk last week was around the fact that it doesn't go into effect until 2020, so we may see amendments to it.

It was that it doesn't place any limits on the collection of data or on the reselling of it. It gives consumers a little bit more power than they had before, but I'm actually fairly disappointed with the outcome of that bill because I don't think it really does much beyond allowing you to say, hey, don't sell my data. A lot of the big companies that we've been concerned about actually aren't selling your data to begin with. They're collecting it, and they're selling access to it, and that doesn't change at all under this bill. It doesn't curb some of what I think are the worst cases we see of data being collected without your explicit consent. It does nothing about that consent issue.

If you download a free app for a smartphone and the app developer is using a third party advertising service that serves ads in the app, that service is collecting data from your phone about your usage as you're using it. The same with any website that you're not blocking third party cookies or third party ad trackers on, if you're using a regular computer and a browser, those ad services are also collecting data from you or from your browser experience. This bill doesn't really do anything to curb that. 

Lisa:Does it do anything about the cameras on your phones and computers that are looking at your facial expressions, which goes into the machine learning algorithms as well, or about the listening that goes on with your devices?

Jennifer:Yes, you have devices in your pocket that can listen to you and can take your picture. Certainly the way they get consent from you is often not clear. 

Lisa:Most of the time, you don't read the consent anyway on these sites that you go to. 

Jennifer:However, it is against the law for them to be surveilling you without you having consented. At the same time, you might be using a service that wants to capture your voice as part of what it does, so take a smart speaker, for example. That's an area I've been looking at a bit lately.

Lisa:Like Alexa and Siri.

Jennifer:Right. They're voice activated. They need to listen to you. How long it listens, what it records, and what it does with that recording is an interesting question, but that is the essence of a smart speaker, so you do have to let it capture your voice. It's just a question of what then happens to that data.

Lisa:In your capacity, in your new job, what are the problems you're trying to solve in the near term? 

Jennifer:My job is research focused, so part of it is about the type of research that I am looking to do. Because I just graduated with my PhD, some of it is about publishing my own dissertation work.

Lisa:Which was on what?

Jennifer:Privacy. I don't think I want to go into the details. It's a long and complicated thing. 

Lisa:It's private.

Jennifer:It's not private, but I think it would bore a lot of people. Some of the issues that I've been interested in exploring in this new role are genetic privacy. Actually, a part of my dissertation research was on 23andMe users. I was very interested in looking at-

Lisa:What they do with that information?

Jennifer:Yeah. Also just people's expectations around it and what motivates them to have their DNA sequenced and what happens to your DNA after you give it to a service like that. That's an area I've been interested in looking at, as well as emotional privacy because I think one of the things that's been a side effect of Facebook and Cambridge Analytica and something I saw in my own work is that people often get the most concerned about their privacy when it comes to data about them that really gets to who they think they are. 

By that, I mean it's one thing for a credit reporting company to collect your address and your credit history. That's important information and, of course, we're upset if it gets breached. But your sense of privacy around it, I think, is different. For example, another piece of my dissertation research looked at people's search queries. One of the things I found, among the people I looked at, when I asked these 23andMe users about their genetic data as compared to their search queries, was that most of them were far more concerned about the content of their search queries than about their DNA. That was mostly because they felt like their DNA, sure, it identifies you uniquely, but they felt like it didn't tell people about them, whereas if you looked at five years of their unfiltered search queries, that could tell you much more about who they are, what they're thinking about, what they care about.

Lisa:That's interesting. Maybe because search queries are free, but with 23andMe, you have to pay to join that service. I've done it, so I know there's a certain fee. With that fee structure, maybe that makes people think, oh well, my data is private. It's not going to be-

Jennifer:The question of paying for it, yes and no. Yes, it definitely ... When people pay for something, what I've observed is that there are definitely more expectations around I paid for this, so they better not sell my data or at least I hope they won't. With free services, there's also an expectation of privacy. It's not as if most people use something like Google search and assume that their search queries are going to be used in a multitude of different ways against them or released to the public. People had privacy expectations in that data even if it was [crosstalk].

Lisa:That's important to talk about.

Jennifer:What Cambridge Analytica and Facebook have also shown us is the power of emotional data, which is something I'm also trying to focus on because I think that's the next frontier. I think it's the next frontier in terms of the types of data we're going to try to, let's say, extract from people. There are people focusing on emotion recognition as a way to improve different experiences, technological experiences. I, of course, being a skeptic, am always skeptical going into these things, so I'm really curious to keep an eye on companies that are doing emotion detection and see where that goes; the next type of data we'll be collecting about people would be your emotional state. There's lots of research into computer mediated communication that charts basically all of this. The research is there. You just have to know where to look for it and put it into play.

Lisa:Maybe we should start educating people at a very early age, like elementary school about privacy. Is that something-

Jennifer:You can talk to my rising fourth grader.

Lisa:Have you thought about that? We need to institute this in schools if we're going to-

Jennifer:Yeah, there are definitely people in the privacy research field who have worked on curriculum for at least high school students. I agree that it should probably go down at least to middle school and maybe to the fourth or fifth grade level. There are definitely people working on that. How widely distributed that curriculum gets, I think, is the challenge. It'd be nice if California as a state did something with it rather than it just being a one-off, one teacher in one school being interested in that issue.

Going back to the genetic data piece and the search query piece. One of the things though that is really interesting about the genetic data area is the fact that a lot of what you're doing with that is sharing it with other people in the service. Whether that's looking for relatives or with 23andMe, you can share it with the company for their development or for their research purposes. One of the things I thought was really interesting about the people I talked to who used it was how much they were motivated by that sharing, the research sharing with the expectation that, hey, if my data is used to develop a new drug that can help the world, great.

I'm a skeptic so my counterpoint was, sure, it could be used, but it might be used to develop a drug that then their pharmaceutical partner charges $50,000 a dose for. There's no-

Lisa:Right, or that you get absolutely nothing for-

Jennifer:Right. You don't get anything from it monetarily. That's another interesting area of people willingly contributing their data to a private database for private development with no guarantees that there'll be a public benefit from it. 

Lisa:I really think we need to innovate on that business model and, in some way, return and monetize this data that is benefiting only a few people. Look at Facebook: Mark Zuckerberg controls about 60% of it. They don't have that many employees. It needs to be more democratized.

Jennifer:Well, I would argue... I was reading something recently online that asked four notable internet theorists basically what went wrong. It got me thinking about what I would do. What would I have changed about the last 25 years? I think that going back to the mid to late '90s, the drumbeat from Silicon Valley, as much as there was an internet business at that point, was very much, leave us alone. Don't regulate us in any way. Don't crush the internet. Let it blossom. Let it grow. There was pretty much a total hands-off approach, with a couple of small exceptions along the way. I think if I went back in time, the thing I would change is not necessarily regulating, but making this expectation that there needed to be a public benefit.

I don't know how I would do that, to be honest, if it's that the companies needed to ... Actually, I think maybe not a bad model would be looking back at radio and the development of radio and the fact that you used to have the fairness doctrine and public service announcements. There was this explicit recognition that the radio waves were a public resource and that they would lease them to private broadcasters, but there had to be some public benefit that they gave back. I wish we could have made that more explicit in the development of the internet.

Lisa:Some people think what went wrong is that it was free, that if we had had to pay just a nominal amount of money for the right to browse or whatever, we wouldn't be dealing with all the advertising and behavior modification and so on.

Jennifer:I was interviewed recently by some undergraduates at Stanford, and they asked me some pretty challenging questions that I had to stop and think about too. Part of it was, why do you do this? Why are you interested in this stuff? Given how many bad things feel like they're happening today, it's a real challenge to think about why we're doing this. Why am I involved in technology? Why don't I just run away and do something else? I think because there have been some real positive changes, despite all of the negative ones. I guess at the end of the day, I feel like it's not worth giving up on it at this point. Not that we even could, but I think that we let industry drive everything for the last 25 years.

I think what you're seeing is a real recognition by people that they have to take this back into their own hands to some extent, both in terms of how they're being used and their data, and just the power these large companies have to shape society in a way that I think people are really recoiling from. How do we do that? I think some of the things we've talked about today are hints: people collectively getting together and thinking about what we can do to shift the power balance. I think it is important to remember that this technology gives you a lot. There are a lot of things. If you asked us, would we go back to 1995 and give up some of the things we have now, such as the ability to use a map online or a map on a phone? I think that's a pretty powerful tool.

Lisa:[crosstalk] from your child at school. 

Jennifer:Right. I always joke that when I first got a cellphone, I was living in Hawaii, and the first thing I did was go to the beach and call people back in California going, "I'm calling you from the beach."

Lisa:It's not the internet. It's not the technology that's a problem, I think. It's the-

Jennifer:It's the people.

Lisa:The behavior modification algorithms. I think we just need to change the model. We're not going to get rid of the technology, but we can make it better, like you say. I think that's wonderful. It's a good goal. You have a lot of work ahead of you.

Jennifer:Yeah. I can't retire anytime soon. 

Lisa:I'd like to have you back on at some point, once you've been in this role for a while, to see what you're thinking then.

Jennifer:Yeah. 

Lisa:You've been listening to Method to the Madness. You can find all of our podcasts on iTunes University. We'll be back in two weeks.
