Anil Dash
Tech's Moral Reckoning
A wildly popular blogger, tech entrepreneur, and Silicon Valley influencer, Anil Dash has been an early activist for moral imagination in the digital sphere — an aspiration which has now become an urgent task. We explore the unprecedented power, the learning curves ahead, and how we can all contribute to the humane potential of technology in this moment.
Guest
Anil Dash is the CEO of Fog Creek Software. He also founded Makerbase, Activate, and the non-profit Expert Labs, a research initiative backed by the MacArthur Foundation and the American Association for the Advancement of Science, which collaborated with the Obama White House and federal agencies.
Transcript
KRISTA TIPPETT, HOST: The technologist and social media influencer Anil Dash has written this of the industry he helped create: “We fancy ourselves outlaws while we shape laws, and consider ourselves disruptive without sufficient consideration for the people and institutions we disrupt. We have to do better, and we will.”
Anil Dash has been an early, vocal activist for moral imagination in the digital sphere, including advocating for metrics that encourage generative behaviors online. This has become a galvanizing discussion since the 2016 presidential election. What is arguably the most powerful industry in human history has entered the lives of most people on Earth with openly world-changing ambitions — but without a deliberate process of ethics, inclusivity, and accountability. With Anil Dash, we explore this unprecedented power, the learning curves we’re on, and how we can all contribute to the humane potential of technology in this moment we inhabit.
MR. DASH: We’re still sounding our way through this incorporation of technology into our lives. And it always does come down to — what are our values? And what do we care about? And what are the things we think are meaningful? And then using that as a filter to understand and control and make decisions around these new technologies. And that’s part of the reckoning I’d ask everybody who’s not in technology to have, is to raise that flag. At the time when somebody says, “You’ve got to try this new app,” “You’ve got to use this new tool,” think through what are the implications of, one, me using this, but two, if everybody does.
MS. TIPPETT: I’m Krista Tippett, and this is On Being.
[music: “Seven League Boots” by Zoe Keating]
MS. TIPPETT: I spoke with Anil Dash in a public event at the Avalon Theater in Easton, Maryland. We were there at the invitation of the Aspen Institute’s Wye Fellows and the Dock Street Foundation.
MS. TIPPETT: A long, long time ago, when we set the title for tonight, it was “Emerging Technologies and Old Fashioned Civics.” And I was thinking today that we should rename it “Emerging Technologies and Their Interaction with Emerging Politics and the Civics We Have.”
[laughter]
MR. DASH: I’ll take it.
MS. TIPPETT: Or maybe “The Civics We Need.” [laughs]
MR. DASH: I’ll take it.
MS. TIPPETT: OK. So that’s where we’re gonna go. So, something you’re known for inside your industry in Silicon Valley is knowing how systems work. And you’ve said you’re almost kind of obsessed about this. Whether it’s the way people adopt new technologies, or the ways governments regulate markets, or the ways new ideas spring up in popular culture. So I’m curious if you — how do you trace that? Where does that come from in you? Or when did it surface?
MR. DASH: I got my first computer in the household — we had a Commodore computer my father brought home for us to tinker on when I was 5 years old. And so I started coding when I was 5 — as my son has done too. It's like, oh, these things must be contagious. And my impression of what a computer was, was a tool you use to create and not to consume. And I think that influenced my view of what everything else was.
And through that lens, I came to understand that newspapers were that, that the low-fidelity, homemade broadcasts that sort of presaged YouTube that people were doing on Public Access was that same thing, that all these things — that hip hop music, which I loved, was people taking the tools they had and using the technology in new ways to express themselves. And that was the lens that sort of revealed the whole rest of the world, or at least where the media and entertainment and communications world was headed.
MS. TIPPETT: Also, you are a critic on the inside. You were pointing out contradictions and flaws in the tech industry at this stage. And I think that it’s good for us all to step back and realize this all has happened so quickly. And unfortunately, we’re not doing the work of discernment in advance; we’re doing it after the fact, which is how humans tend to function.
MR. DASH: [laughs]
MS. TIPPETT: Right? So you say nobody in your industry is saying, “I’m setting out to make the patriarchy worse. I’m setting out to — I’m saying I’m a rebel, but really…”
MR. DASH: I take it back. Some are now.
[laughter]
MS. TIPPETT: Some are? OK.
MR. DASH: So that’s the innovation that — we’ve broken the seal on that now. That’s great.
MS. TIPPETT: They’re not saying, “I’m being a rebel, but I’m acting like a robber baron.” That’s another way you’ve talked about it. They’re saying, “I’m making a tool.” And you’re saying, “That’s not good enough anymore.”
MR. DASH: Yeah. We bake our values into the choices we make when we design these tools. And the sort of turning point in my career about a dozen years ago was I was building some of the early blogging and social media tools. And the tools that my friends and I built were used to publish and launch Gawker and Huffington Post and many of these early blogs that became sort of seminal to the medium.
And it was interesting because there were hints all along that the choices we made, like on a whiteboard in our meeting room, had implications. So for example, there's a box you type in, just like when you write an email, the box you type in when you write a message. And we would make the box bigger in the publishing tool, and the posts on Gawker and on Huffington Post would get longer, right?
And we see this today where — anybody who’s used Instagram, you see yourself — you see people framing their photo to be square because it’s going to be shared in a square format, even though the phone itself can take a rectangular photo, and every other photo over the last century has been a certain other shape, and here we are making this adjustment.
And so it was striking to me that we had complete control over exactly how many words a journalist uses in writing a story by changing things that are arbitrary variables for us, and yet we would not connect the dots all the way through into all the choices we make about how communities work online, how people respond to each other, how accountable they are to each other. Those would have social impacts too. At that point, suddenly we would divest ourselves of responsibility.
MS. TIPPETT: It’s interesting, this moment we inhabit. We now say — and I think this is true — that we have a fifth estate as well as a fourth estate, this new sphere, which is public life, whether it’s set out to be or not, which is highly influential and has been incredibly influential in a presidential election now.
As a journalist, I see a lot of, actually, consonance between this kind of moral challenge that tech has and the moral challenge that journalism has. And they got all so mixed up and intertwined in this election. But I think the notion that — "This is what we do. We tell the news. We report what happened today. This is the form we use." But when it colludes with technology, and when you see the tools themselves form the craft — I mean, the 24-7 news cycle itself created the way journalism was functioning rather than the other way around.
MR. DASH: Very much so. Yeah, I mean, not to be glib, but it’s even worse than you describe.
[laughter]
MR. DASH: The thing that I’ve seen in particular in online media was — the world into which we started creating the social media tools around the turn of the century — it makes me sound like I’m ancient — I am ancient. The thing that jumped out to me was it was not centralized. There were — everybody had an individual site.
And what happened in short order, in about half a decade, was Google became the front door to all the content. You would go through one search engine and one news site, and that would be your entry point more or less. And one or two advertising platforms sort of took over, Facebook, Google.
And the people designing the products are well-intentioned. They are sincere in saying, "I'll tell you what — we'll just make it easy. We'll bring the content from all these different publishers, New York Times, Buzzfeed, and we'll put it all onto the Facebook platform or onto the Google platform and bring it inside our walls and make it fast and easy for everybody to browse on their phones."
Now incidentally, what happens with that is — “We’re just going to remove all of those other advertising and clutter things that are happening from all other companies that aren’t Facebook or Google.” So you can say, “We have our own staff of journalists and our own writers, and they’re out there reporting, and they’re doing everything. We are the masters of our own destiny.” But I don’t think that the publishers understand the tradeoff they’re making.
And it’s because of the social positioning of, technology as neutral. So like, “We’re a neutral platform anybody can publish on.” And then when you get to the current state of affairs, which is when you sell advertising, you are brokering attention. And so something that draws more attention and has more emotional appeal will be more successful and more lucrative. Then you say, well, some of the things that are most attention-getting aren’t true.
MS. TIPPETT: The Facebook fake news that we’re hearing about now.
MR. DASH: Right.
MS. TIPPETT: I mean, the Facebook fake news — I have to say I just thought it was just astonishing that Twitter played a role in a presidential election the way it did. I just — I don’t know who saw that coming. And then, of course, the catastrophe of polling, and the fact that you had algorithms, but people weren’t making…
MR. DASH: That’s a perfect example. I got to watch Twitter from before its public launch. I know the founders and a lot of the leadership very well, less so as there’s been a lot of regime change there.
MS. TIPPETT: [laughs] Regime change. That’s to your point that these are now titans.
MR. DASH: Yeah, exactly. But they were very vocal about their role in Arab Spring.
MS. TIPPETT: Yes.
MR. DASH: They were very vocal about how everybody in Tahrir Square is using Twitter. And when they at least nominally liked the results, then Twitter was taking the credit. And when they don’t like the results, Twitter is a neutral tool. Right? And I’ve been that guy. I’m not pointing fingers. But we did learn that lesson.
And then if you look at every other professional discipline, you look at somebody who goes to law school, somebody who goes to business school, journalism school, medical school, every single one of those disciplines has a professional society that sets standards. And if you don’t meet them, you can be disbarred. You can lose your medical license. There’s an expectation about what you’re supposed to do.
And in the educational process, there’s an extensive ethical curriculum. The bridge has to stay up; it can’t fall down. You have a historical tradition where, in medicine, they’re going back to Hippocrates. In law, you’re like talking about English common law that happened centuries ago. And then in computer science, they’re sort of radically anti-historical. Not even ahistorical, just like, there is nothing before now.
We refuse to see — there is no before time. And there is zero ethical curriculum. You can get a top-of-the-line, highest-credential computer science degree from the most august institutions while having had essentially zero ethics training. And that is, in fact, the most likely path to getting funded as a successful startup in Silicon Valley.
MS. TIPPETT: Do you think that’s partly because, as we say, it’s such a young field?
MR. DASH: Some of it is we haven’t had centuries to mature.
MS. TIPPETT: But then it’s so rapidly — achieve this disproportionate authority and power.
MR. DASH: That’s right. I mean the difference is in these centuries…
MS. TIPPETT: And we also haven’t thought of it as a discipline yet.
MR. DASH: Right, right. And the centuries over which engineering was maturing or that medicine was maturing, they worked it out, right? There were a lot of, like, “Well, that’s not actually medicine; that’s just butchery. Let’s clean that up.” Right? But at that time, they weren’t the wealthiest industry in the history of the world, so they had some time to work it out. And we didn’t do that. And we meanwhile amassed all of that power and all of that wealth, and I think that’s the reckoning we haven’t come to. And the stakes are incredibly, incredibly high.
I think we’re really going to face a reckoning as the economic impacts of that get stronger, as the cultural impacts of that get stronger. The idea that the halo around tech as “the good guys” is gonna sustain seems increasingly unlikely.
MS. TIPPETT: I want to read something that danah boyd wrote that you recommended. She’s been on the show. She’s fantastic.
MR. DASH: She’s incredible.
MS. TIPPETT: She’s another person many people outside of tech may not have heard of, but she’s a real leader and a great thinker. So she wrote — and I want to say before this, again, I hold journalists accountable, and I think this exact same conversation has to happen among journalists because there was a way — but you almost watched journalism not know how to — it was all so fast-breaking in the 2016 election…
MR. DASH: And so extreme.
MS. TIPPETT: …because something that you might morally want not to cover became news anyway. But still, I think there’s a huge reckoning. But anyway, she wrote, “I believe in data, but data itself has become spectacle. I cannot believe that it has become acceptable for media entities to throw around polling data without any critique of the limits of that data, to produce fancy visualizations which suggest that numbers are magical information.” As you said, “In the tech sector, we imagined that decentralized networks would bring people together for a healthier democracy.”
And there’s that idealism that I think really was there and is there in journalism too. And then she says, “We hung onto this belief even as we saw that this wasn’t playing out. We built the structures for hate to flow along the same pathways as knowledge, but we kept hoping that this wasn’t really happening. We aided and abetted the media’s suicide.”
MR. DASH: That’s fair. I bet a lot of you saw on election night, on The New York Times homepage, their animated, live, real-time data. They had these needles waving back and forth on the meter, which I think is probably the single most stressful implementation of technology I’ve ever seen in my life.
[laughter]
MR. DASH: And what it was — and I always think of this — actually, with systems thinking in general, in any institution, I always try to imagine the meeting — backtrack a few months, a few weeks to the whiteboard meeting, and a bunch of people with empty coffee cups sitting around sketching out what this thing is going to be. And somebody says — and it's probably a young guy, and he's probably a computer science grad — and he says, "You know, we can get the data in real time. We can get polling data, and we can put it on the homepage. And it'll be incredibly compelling."
And if you look at the page that made up that New York Times homepage while they were doing that chart on election night — and there were these wild swings of the needle for a long while. For hours, it was literally swinging back and forth between Clinton and Trump in “real time,” what they called “real time.”
MS. TIPPETT: [laughs] Right.
MR. DASH: And if you look at the code of the page, that was all fake. The data only updated, at most, a few times a minute. The needle never stopped moving. But I could see that. And what percentage of people that went to that page that night could see that? Maybe one percent, maximum. And it was so — one, it was causing undue stress to a lot of people. Two, it was, as danah said, making spectacle of data explicitly. And the animation was not true. It was not real.
And so what does it mean that there was a choice to put something false on their homepage when they are trying to decry all the falsity? And would they have allowed falsity from any other public-facing part of their organization except the technologists?
[music: “Miss You” by Trentemøller]
MS. TIPPETT: I’m Krista Tippett, and this is On Being. Today, I’m with Anil Dash, the entrepreneur and activist for humane technology.
[music: “Miss You” by Trentemøller]
MS. TIPPETT: Should we talk about Facebook, or is that like going down a rabbit hole?
MR. DASH: I’ve heard of it.
[laughter]
MS. TIPPETT: Is it a billion people around the world?
MR. DASH: More than that.
MS. TIPPETT: More than that. That is — you’re right. That is just so many. And so again, it’s like — when I was preparing for this, I was thinking I don’t usually have interviews where we talk about a company, but this is more than just a company. This is a human potential that is right now embodied in Facebook. But who knows if Facebook will be around 10 years from now? But the capacity to have the attention of a billion people.
MR. DASH: Yeah. It’s profound. I think when we — I started blogging in 1999. I was talking to people that were building the early social networking, social media tools. And I remember talking to a friend of mine, saying, “Someday —” well, at that time I started, there was like 50 or 100 blogs. And I’m like, “I’m late.” Right? Like, “Somebody’s taken all the good ideas.”
[laughter]
MR. DASH: And then the other 100 million showed up, and I was like, “Maybe I was early.”
[laughter]
MR. DASH: And that was true with social networks — the first time I met Mark Zuckerberg, he came to visit the company I was working at in San Francisco. And Facebook was — I don't know — six people, and they were at a couple colleges. And because they're in uncharted territory, unprecedented territory, it's hard to know when they're making good choices and just choices. And no one can anticipate the way that network effects play out at that scale because no one's ever done that before.
And I keep stumbling over one sort of anecdote. It's got two sides to it. One is the ongoing and very important industry discussion we're having in tech around inclusion and diversity, and who gets to participate and who gets to profit. And Facebook's numbers look essentially like most of the companies in Silicon Valley, which is to say in California, which you presumably know the demographics of. The technical staff at these companies is regularly less than two percent black or less than two percent Latino. And 40 percent of Californians are Latino.
So it’s a huge, huge disconnect. And for many years, the conversation has been, “Well, the pipeline’s not there, and you don’t have enough people graduating from CS programs.” And the moment I started to realize that there was something really truly amiss even beyond that was when you looked at the marketing staff or the legal staff or the administration staff, the non-technical roles of these companies, and the same thing was true.
MS. TIPPETT: OK. So they didn’t — they weren’t in need of the same new pipeline.
MR. DASH: Right. You can’t tell me there’s no black lawyers in California. You can’t tell me there’s no Latino marketers in California. So that’s the first part of the Facebook story, to me, about their reckoning with who they are and who profits from what they do. And the second part was Whatsapp, which is an incredibly popular — may be the most popular messaging app in the world. I use it to talk to my family in India. Most of the rest of the world uses it a lot more than the States. And Facebook understood its value because it was the thing that’s used by the rest of the world. And they bought it at the time when the team was maybe 20 people building Whatsapp. It was a very, very small team.
They bought Whatsapp for what ended up being over 20 billion dollars, 22 billion dollars when the deal closed. And a board seat for the founder. He’s on the board of Facebook now. And he’s a very smart guy, very incredible team. And the decision to make that purchase, even though Facebook’s a publicly traded company and all these things, happened in about four or five days. So when Mark Zuckerberg cares about something, and Facebook cares about something, they can deploy 20+ billion dollars in less than a week. That’s the scale of what it looks like when they really want to move on something, and they really want to deploy their resources and commit to it.
MS. TIPPETT: You’ve written a lot also about how echo chambers emerge from the algorithms. Do you read Seth Godin? Do you follow him?
MR. DASH: Yeah.
MS. TIPPETT: So Seth Godin is a kind of mentor to me and to a lot of good people. And he’s been in tech, like you, a long time before it was what it is. And he wants to insist that one thing that sets our generation — and by which I mean all of us — apart from previous generations of humans is that we can create our own tribes beyond bloodline and geography, that that is one thing that technology makes possible for us.
I just want to know how you — do you also see that as something that this technology makes possible? Do you see this happening? How do you think through how we navigate the danger of bubbles, and then this amazing capacity this gives us to walk out of them?
MR. DASH: That was the thing that drew me to sort of the internet era. Like, I liked computers, but for half my life, the computer wasn’t plugged into anything. It was sort of this island. And then it sort of woke up once it got connected to other people.
MS. TIPPETT: [laughs] Got conscious.
MR. DASH: Yes. I mean, it’s hard to explain to the young people I mentor in the tech industry now that we had computers that didn’t have the internet. And they’re like, “What would you do with it?”
[laughter]
MR. DASH: Like, what do you — you just stare at it. It’s like a TV that doesn’t have an antenna or a cable. What do you do with that? You just bask in the rays.
[laughter]
MR. DASH: And that’s what we did. And so that was — I would say far less than 10 percent of the people creating these tools today think that they should distinguish between the uses of these social tools as to whether they’re being used for constructive purposes or destructive purposes. And what they’re afraid of is a lot of things. I think one is that, “Well, who are we to judge? We don’t want to presume.” I’m like, “You’re not humble.” Like, the tech industry…
MS. TIPPETT: [laughs] Right.
MR. DASH: The false modesty of the tech industry is the most ridiculous argument that they start with because at the same time they’re like, “We’re here to change the world. We’re going to put rockets on Mars and make self-driving cars, but we don’t want to presume too much.”
[laughter]
MR. DASH: And so that’s sort of the starting point. And then they get into, “It’s really hard, and it takes people.” As it turns out, yes. Yeah, it does. It takes human judgement. And you have to say where you sit, and you have to make a call, and you have to make some people angry because they didn’t get to be jerks on your platform.
MS. TIPPETT: I mean, and that’s even stopping short of saying, “We are going to encourage generative relationships,” right?
MR. DASH: Yes, yes.
MS. TIPPETT: Robust civics. Not just everybody being nice and banning bad voices, but something that’s robust where difference is being engaged.
MR. DASH: Or explicitly designing for good behavior.
MS. TIPPETT: Yeah. I mean, you’ve talked about sort of how we measure and what we measure, and how — you’ve written a lot about — you have a huge social media following, and you’ve written a lot about what that means and what it doesn’t mean, really in a very searching way. You’ve said, could we have metrics about how we’re presenting, about whether we’re listening, about whether we’re showing gratitude?
Like, could we decide some — and we could do this with social psychology, right? We could use science that’s out there about what makes for healthier, happier lives that affect the world around them positively. And you’re suggesting that these platforms could actually give us feedback on those things.
MR. DASH: Yeah. For years, I was building a tool with a friend of mine, Gina Trapani. We actually built a company around it. It didn’t really succeed, but we had built this tool that let you see really how you were interacting with people online, and how often you congratulated somebody or thanked somebody, how often you amplified the voice of somebody that had a smaller platform than you did, how often you — we sort of conceptualized a lot of generative behaviors, a lot of positive behaviors, a lot of good civic behaviors. “How many times have you apologized?”
And it would just recognize it, give it back to people, show them: this is what you did, and this is how you're progressing. And gentle correctives about — like, "Well, you've talked about yourself a lot." What percentage of all the things you've shared on Facebook or Twitter were about yourself, and you saying "I" versus somebody else? And not prescriptive. We didn't say, "Don't do this." If we hold up a mirror, what do you see? And what's your judgment about it? And we ended up with, not a very large base of users, but a very, very dedicated base of users. I mean, the people that were using the tool — the emails I got were incredible.
One of the things we'd spent a lot of time on was a look back — what were you doing a year ago, two years ago, three years ago? And then the tricky things that come out — and we spent a ton of time thinking about, which I think is part of why Facebook and others who made similar features screwed it up even worse than we did, was — we would do things like mention somebody you had last spoken to a year ago and say — at first, it was like, "Do you want to get back in touch with Jim? You haven't talked to him in a year."
And really quickly, before we even put it out in front of the users, we were like, "We can't do that." It has to be — "Isn't it amazing how time flies?" was how we described it because sometimes somebody did want to be reminded to get back in touch with that person. Sometimes they hadn't spoken to them for a reason, and there was a lot of weight behind that. Sometimes the person had passed away.
Facebook built a year in review, a look back — you've seen this. And the first time it launched, an old friend of mine who had lost his daughter was shown, "This was the photo that got the most comments this year." And it was after she had passed at the end of that year. When he was sort of just starting to process it, it put her image back in front of him.
And it stayed with me for a long time because it was eminently preventable. And again, they mean well. And Facebook did respond and sent him a thoughtful apology, which I think was sincere. I don’t — I’m not saying these are bad people doing bad things. I’m saying these are good people doing bad things. And…
MS. TIPPETT: With a really powerful, very unformed — in fact, very unformed tool.
MR. DASH: And I have to think they could’ve anticipated it. And I think there’s gonna be that over and over and over. There are a lot of those mistakes to make.
MS. TIPPETT: Are you part of a larger conversation? Do you feel that this reckoning is happening or perhaps that this recent election will spur it on?
MR. DASH: There is a larger industry conversation about inclusion and diversity. And I can tell it's working because it's starting to be co-opted. So, every tech CEO is giving lip service to it whether they're sincere or not. So it's like, OK, that's a mark of success to some degree. The idea of more thoughtful, ethical, humane technology is something a lot of us care about. I think there is a groundswell. And I see this in the response to what I write, to what danah boyd writes, to what Eric Meyer writes — he's the father I was talking about earlier — a lot of people writing thoughtful work.
But what’s telling, to me, and what has been very instructive is I’ve looked at some of these larger social movements and moral movements. Like, I look at the success of Black Lives Matter. And it’s self-organizing. People can identify and be part of something larger than themselves without it being conducted by someone. And so they can say, “I’m declaring a value.” And in the shortness of a hashtag…
MS. TIPPETT: Right. Extraordinary.
MR. DASH: …you know what my values are. And we don’t have a hashtag. We don’t have a name for a movement. We don’t have a shorthand way of articulating, “Well, when I say this, or that I support this, that it represents this larger idea in culture.” And I don’t know how to get there. I wouldn’t presume that I can be part of creating that.
But I think when it arises, all the people who have this latent intention about being thoughtful about what technology can be will rally behind it. But I keep looking. I'm always like, "Where is the brash 23-year-old creator who scares the heck out of me, who makes me feel like I'm old and out of touch and I don't get it?" I know they're coming. I know that person's coming, and I know they're going to have the galvanizing name and idea for this to be able to coalesce it, but…
MS. TIPPETT: For this shaping technology to human purposes, shaping the technology enterprise?
MR. DASH: Yeah. I think for remaking the tech industry, for reforming it around being more ethical and humane. I think this is one of the most important missions around. I just think — because we have subsumed decision-making from media, from policy, from culture, from art, into the tech world, and we are influencing it. When we make the box bigger, the text gets bigger. Because we have that responsibility, then the urgency with which we have to address our moral failings is that much higher.
[music: “Canvas (Instrumental Version)” by Imogen Heap]
MS. TIPPETT: You can listen again and share this conversation with Anil Dash through our website, onbeing.org.
I’m Krista Tippett. On Being continues in a moment.
[music: “Canvas (Instrumental Version)” by Imogen Heap]
MS. TIPPETT: I’m Krista Tippett, and this is On Being. Today, I’m with the technologist Anil Dash, exploring the unprecedented power, the dangerous learning curves, and the humane potential of technology in this moment we inhabit. I spoke with him in a public event at the Avalon Theater in Easton, Maryland, at the invitation of the Aspen Institute’s Wye Fellows and the Dock Street Foundation.
MS. TIPPETT: You’ve asked publicly, “What is meaningful about all this time we spend online? What will we have to show for it?” I think a lot of us are asking that question. You don’t have to be a computer scientist to ask that question.
MR. DASH: No. Helps if you’re not.
MS. TIPPETT: [laughs] I wonder how becoming a parent — you said your son is 5 now, right?
MR. DASH: Mm-hmm.
MS. TIPPETT: Has intensified, shaped the way you’re working with these questions this kind of soul searching.
MR. DASH: It’s interesting. I’ve been blogging for — what — 17 years. It takes eight or nine years to get really good at it. And at 10 years in, I realized that I was going to be doing it the rest of my life. This was years before my son was born. And I started to think about, if I have a kid someday, what would they see?
And almost immediately, it changed things. It made me grow up a lot real quick. And part of it was wanting to be worthy of my words living on. I mean, I think there’s a real — there’s a shocking ephemerality to what’s happened on the internet. Most of the things that have ever been published on the internet are now gone. And that’s a weird realization ‘cause it’s a young medium.
And I thought I want to fight for, one, preserving my words and those of others. And two, I want to be worthy of preserving them. And we don’t build the tools that way. And I think, actually, there’s a very strong argument for tools that are ephemeral by design. I think having things that are designed to be short-term and just disappear is great, and we should not just rush to capture everything. There’s a weirdness to ordinary conversation being sort of preserved in stone too.
But some of this — some of what’s there is meaningful, and it’s useful for reflection. I think that’s been the most powerful tool for me is to go back to something I wrote 5 years ago, 10 years ago, 15 years ago and see what was right and what was wrong. And it’s different, at least from my perspective, than journaling or a diary would be.
MS. TIPPETT: Right. Previously, you would have a diary, but it would still be in a closet. And no one else would have ever seen it or would ever see it.
MR. DASH: Well, yes. And there was also — you think — you look at — any of you who have read the sort of letters between the Founding Fathers, they had an awareness that these would live on beyond them. So it’s correspondence, it’s nominally private, but it’s got sort of a winking eye to the fact that this will be having an audience in their absence, which is a really interesting form of writing.
And I think that, for me, not in the Founding Fathers sense, but in the how-you-write sense, is very parallel to me because I write generally from a very personal place. I'm in a room on my own, and there's no one around, but with the idea that the sort of shadow that it casts, a million people could see.
MS. TIPPETT: So my children are, right now, 18 and 22. And even in those four years, there was such an acceleration. And it was interesting also that the platforms they and their friends use completely shifted.
MR. DASH: Totally different tools. Yeah.
MS. TIPPETT: And I’m actually — I find myself being really grateful that I’m not a parenting — we were still in that window where I could say, “No, you will not have an iPhone until you’re 14,” or something, which I just don’t think you can do anymore, right? And so they were already kind of formed before all of the technology entered their lives. And I know it’s changed so much now in the meantime, and you have a 5-year-old. I mean, I wonder how are you thinking about that question.
MR. DASH: We don’t have a very intelligent cultural conversation about how kids engage with technology at all.
MS. TIPPETT: No. No.
MR. DASH: I think…
MS. TIPPETT: It’s like a guinea pig generation.
MR. DASH: Yeah. Well, it’s also — I also think of the concept of “screen time.” When you’re with young kids, you’ve heard this, right? “Do you limit your child’s screen time?” And it’s like, no. I engage with what he’s specifically doing. I don’t limit his page time. I just choose whether he’s reading a book or a magazine or whether it’s something that’s like a bunch of — he’s 5 years old, so he likes poop jokes. But — how much of that and how much of, like, smart stuff? And so the idea that they’re both on pages and are therefore equivalent is absurd, and yet we talk about screen time that way. I’m like, is he playing chess on the iPad? Or is he watching funny YouTube videos of animals falling over? Which is also awesome, but different.
And so that really — that always sticks with me because I think it’s a very unsophisticated way to look at things, and then we carry that forward. And that’s when they’re very, very young, right? 2, 3, 4, 5. They first start seeing screens. And my son maybe spends 15 minutes a day on the iPad, and he loves it, and that’s all he gets. But that’s always been the rule for him, so it doesn’t matter. And I limit it mostly just because we limit everything. I mean, you just don’t let a 5-year-old do whatever they want, or you end up in hell.
[laughter]
MS. TIPPETT: Are you saying — so, this is a radical idea. You apply the same wisdom you apply to other things to technology?
MR. DASH: Yeah.
MS. TIPPETT: Yeah, well that…
MR. DASH: Well, that’s the thing. It’s part of your life. I think that was the thing. I saw so many parents — and this is not judgment. I don’t judge other parents. Other parents are fine.
MS. TIPPETT: No, we’re all on this frontier, and we’re learning a language.
MR. DASH: But as we’re figuring it out, they treat it as if there is life — they say this — like, “This is real life, and then there’s computer world.” And I’m like, “That’s not the thing. That’s not how their lives are gonna be.” And I think I had an unusual perspective, in that I did start using computers before I was in kindergarten, just as my son has.
And he has way better programing tools. I was like, “Gosh, if I had these things.” He’s got — because we had to do these primitive blocky green graphics on the screen when I was a kid, and he’s got this Star Wars robot that he can go on the iPad and give it programming instructions, and it follows his directions to roll around the living room. And I’m like, “That is the coolest thing I’ve ever seen.” Like, you wait until they go to bed so you can play with it.
[laughter]
MR. DASH: And that’s — no, he’s going to listen to this I bet. So — I don’t do that. I don’t do that.
[laughter]
MR. DASH: But the thing that I think about is that that’s part of his life. It’s not over there. It’s not an artifice. It’s not the virtual world. It’s just life. And I think about that with so many experiences where, when we were fighting for validating social media and social networking, saying these would be important, these would be part of our lives and there’s a reason to include it, it was about this idea that sharing makes something better.
I fully reject the argument — people say this all the time. You know, “I saw this young person in a restaurant on their own, on their phone, not interacting with anyone.” What do you think they were doing? They were talking to people.
MS. TIPPETT: [laughs] Right.
MR. DASH: They were interacting with lots of humans all at once. And it makes me furious because I’m saying they’re being deeply social. It’s not in the mode that you know, but it’s actually better than when they were sitting alone at the diner with a book. And I think there’s been this misunderstanding and this misapprehension about what the tech is doing. It is connecting us to people.
And there’s so much attention paid — and with good reason — to the bullying and the other things, the cyber bullying and all those. A general rule of thumb is anything that begins with “cyber” is a lie. Like, if you say “cyber bullying” or “cyber crime,” it was probably — that’s one of those rare areas where they — it’s a behavior that existed before, and the “cyber” is not the issue. So children being unkind to each other…
MS. TIPPETT: Right. So nothing happens online that doesn’t happen offline.
MR. DASH: Right. And so being able to integrate it — now it can be worse because of the network effects. It can be amplified by the immediacy and the fact that it happens in your home. But the principles can carry across. And it has to be an integrated conversation, and that's the key. It's like, how much time do you limit your child talking to their friends? I don't care if it's on the phone, on the computer, on messaging, in real life, in person, out in public, whatever it is. If you have a set of rules, they apply across these things. But that demands a literacy and a fluency that I think takes a serious investment in time and understanding your child's context. And that's the hard part.
[music: “Southern Skies” by The End of the Ocean]
MS. TIPPETT: I’m Krista Tippett, and this is On Being. Today, I’m with Anil Dash, the entrepreneur and activist for humane technology.
[music: “Southern Skies” by The End of the Ocean]
MS. TIPPETT: This matter of technology and social healing, technology and weaving common life — when I see you writing about this, I see one thing you do — which is, again, related to what you just said — is you point people to analog places close to home.
MR. DASH: Yeah. I try. We’re still sounding our way through this incorporation of technology into our lives. And there are historical analogs for a lot of these things. There are things we can look back at. And the trick is to identify, where does the analog apply? And where is it irrelevant? And that line keeps shifting, especially as we learn more about the behaviors. I think — it always does come down to, what are our values? And what do we care about? And what are the things that we think are meaningful? And then using that as a filter to understand and control and make decisions around these new technologies.
But those of us in the tech world have not done ordinary folks any favors around making those decisions because we’ve adopted this stance that values don’t apply. And that’s part of the reckoning I’d ask everybody who’s not in technology to have, is to raise that flag. At the time when somebody says, “You’ve got to try this new app,” “You’ve got to use this new tool,” think through, what are the implications of, one, me using this, but two, what if everybody does?
I look at — I don't know — just to pick one out of a hat, like, Uber. A lot of people are like, "Oh, you should try Uber, and it'll get you a car service." And it's something I've thought really deeply about. Living in New York, being of Indian descent in particular, what happens to the taxi drivers is very personal to me. And we have a community in Queens, Jackson Heights, where there's an enormous number of South Asian immigrants. And somewhere — I don't remember the number — it was like 20 percent or something of households make some of their income from driving, delivery drivers, taxi drivers, limo drivers.
And — obviously, the taxi industry is also corrupt in its own way, so I'm not diminishing the challenges there — but that Uber has said, "We're going to bring you in, make you a driver, and essentially have full control over what your income is and how many fares you get, using an algorithm that's opaque to you," is terrifying. And then once they got the drivers on board — there are now more Uber drivers in New York City than there are yellow cabs — they said, "By the way, we're going to replace you all with self-driving cars as fast as we can." And that's gonna happen. And this is a crash we can see coming. That's the one we know, we anticipate.
MS. TIPPETT: Fortunately, what you’re asking people to do is think. [laughs]
MR. DASH: Yeah. But these are like the — yeah.
[applause]
MS. TIPPETT: I mean…
MR. DASH: And that’s it. And these are off the top of my head. These are the ones on the tip of my tongue.
MS. TIPPETT: So over the years, I've had conversations with people like Sherry Turkle and danah boyd, and Tiffany Shlain. This idea that is very hard for us to internalize, because we feel like this technology has landed on us and taken over our lives, permeated — and it has.
MR. DASH: But we have a choice, and our choices matter.
MS. TIPPETT: Right. And to internalize that this technology is in its infancy, and we are the grown-ups in the room.
MR. DASH: Yes, yes. 100 percent. It is 100 percent up to us.
MS. TIPPETT: It doesn’t feel like that, but it’s true.
MR. DASH: It doesn't. And we can ask those questions, right? When somebody says, "Try out this new app," it doesn't take very long to say, "Well, what happens if this works?" Everybody in the Valley — they get their company funded, they make a startup, and they say, "We desperately don't want to fail." And I'm like — I'm not worried about the failures; I'm worried about the companies that succeed. And that's got to be the obligation of the rest of us that care about these issues, to really deeply interrogate that whenever we can.
MS. TIPPETT: Your wildly successful Twitter feed, @anildash — is it — your name is “rap game Bodhi Rook?”
MR. DASH: Right now it is. Yeah. It changes a lot.
MS. TIPPETT: OK. But I love your — you just say in your profile — I love reading Twitter profiles. I think that is such a beautiful slice of humanity, at least the ones I read. “Trying to make tech a little bit more humane and ethical.” I once interviewed a French geophysicist, one of the people who discovered tectonic plates, and he pointed out that the word “human” in French is the same as the word “humane.” I don’t know why I thought of that when I was reading this.
I want to ask you how your — this life you’ve lived, these obsessions you have — which I think everybody in this room is grateful you are out there having these obsessions — how would you start to talk about how this has evolved the way you think about what it means to be human, humane? What does that mean?
MR. DASH: I describe myself as being in the technology industry, but technology always means “things invented after you were born,” basically. And so there was a time when the technology industry was the wheel. And there was the time when the technology industry was fire. And its every iteration along the way has been — the first people to do agriculture were the technologists of their time.
So I’m just saying that context of “this is only temporarily new” has been really, really helpful for me. And I think — and I guess it’s especially true, again, since becoming a parent, but just in general — like, marveling at the briefness of the time we have. And I think, how lucky to be at this genesis moment for something actually new. How rare to be at a time when things changed, even with all of the negatives that come, and all of the hard problems that come.
MS. TIPPETT: And all of the risk that always is there when change comes.
MR. DASH: Yeah. And of course, I know how fortunate I am to be on the right side of those changes. I think — my parents are from one of the poorest and most remote parts of India. My dad’s village today — a family of four around that area lives on between 600 and 800 dollars a year. And I think the quality of life improvement from my father growing up as a British subject with no vaccines and no clean running water to my son living in Manhattan is perhaps the greatest single generation leap in quality of life in the history of humanity.
And so that weighs on me a lot. To be bookended by these two incredible people, like, my parents on one end and my son on the other, it feels like a grave responsibility. To get to be the conduit between the greatness of what my parents have done and the greatness of what my son will do, I think is the thing I think about every day. So it’s — well, I have these tools, and they’re novel now, and they will be boring very soon. And so, en route to them being boring, how can I be sure that they are just?
[applause]
MS. TIPPETT: Anil Dash is the CEO of Fog Creek Software. He also founded Makerbase and Activate and Expert Labs, a non-profit research initiative backed by the MacArthur Foundation and the American Association for the Advancement of Science, which collaborated with the Obama White House and federal agencies.
[music: “Taskin” by Poppy Ackroyd]
STAFF: On Being is Trent Gilliss, Chris Heagle, Lily Percy, Mariah Helgeson, Maia Tarrell, Marie Sambilay, Bethanie Mann, and Selena Carlson.
MS. TIPPETT: Special thanks this week to Richard Marks, Amy Haines and Kathy Bosin at the Dock Street Foundation, Judy Price at the Aspen Institute Wye Fellows, and Austin Carter, Suzy Moore, and all of the fabulous staff at the Avalon Theater.
Our lovely theme music is provided and composed by Zoe Keating. And the last voice that you hear singing our final credits in each show is hip-hop artist Lizzo.
[music: “From The Lotus…” by Prince]
MS. TIPPETT: On Being was created at American Public Media. Our funding partners include:
The Ford Foundation, working with visionaries on the front lines of social change worldwide, at Fordfoundation.org.
The Fetzer Institute, helping to build the spiritual foundation for a loving world. Find them at fetzer.org.
Kalliopeia Foundation, working to create a future where universal spiritual values form the foundation of how we care for our common home.
The Henry Luce Foundation, in support of Public Theology Reimagined.
The Osprey Foundation – a catalyst for empowered, healthy, and fulfilled lives.
And the Lilly Endowment, an Indianapolis-based, private family foundation dedicated to its founders’ interests in religion, community development, and education.