Between Two Codes

Paul Ohm: Surveillance and the COVID-19 Pandemic

Episode Summary

Today, we sit down with Paul Ohm, the Associate Dean for Academic Affairs at Georgetown Law, to talk about technology and surveillance in the classroom. Dean Ohm was previously a federal prosecutor in the Department of Justice's Computer Crime and Intellectual Property Section and is an expert in surveillance and the law. He discusses how his work on surveillance has shaped classroom recordings at Georgetown and the lessons learned that can guide future technology decisions. Join us for an inside look at the choices Georgetown has made around technology in the classroom, in person and online, and how those choices interacted with Georgetown's response to the COVID-19 pandemic.

Episode Transcription

Dean Ohm (00:01):

That's the problem with baking in technologies of surveillance. They're always done for really good reasons and they're never dismantled. As soon as the emergency ends, creative and hardworking people will use it to solve other problems. And half of those problems we'll think are amazing; isn't the world better because of them? And half will be problematic, concerning, and lead to bad places. And then once in a blue moon—they'll come up with a use that's just evil.

Panya Gupta (00:34):

Hello and welcome to Between Two Codes, a podcast where law students talk to the experts at Georgetown Law about the intersection of law and technology. I'm your host for today's episode, Panya Gupta, and we're glad to have you join us. 

Between Two Codes is brought to you by the Institute for Technology Law and Policy at the Georgetown University Law Center. The Institute is training the next generation of lawyers and lawmakers with deep expertise in technology, law, and policy. You can find the Institute's many events at georgetowntech.org.

Before we get started, we'd love for you all to subscribe and rate us on your podcast app of choice and follow us at Between Two Codes on Twitter and Instagram. We have a lot more great episodes coming down the line, and you will not want to miss them. This week on Between Two Codes, we have Paul Ohm, professor and Associate Dean at Georgetown Law. Dean Ohm specializes in information privacy, computer crime law, intellectual property, and criminal procedure. He has worked extensively on issues of privacy and joins us today to talk about the use of technology before, during, and after the COVID-19 pandemic at Georgetown Law.

Panya Gupta (01:40):

As a heads-up, this episode briefly discusses Georgetown Law's policies around recording classes, both in-person and virtual, with Dean Ohm. Although released recently, we interviewed Dean Ohm and recorded this episode prior to an incident at Georgetown relating to classroom recordings.

Panya Gupta (01:46):

Thank you so much, Dean Ohm for joining us today and welcome to Between Two Codes. We're really excited to have you here.

Dean Ohm (01:53):

Oh, it's really a pleasure. Congratulations on the new podcast. I'm glad to be part of it.

Panya Gupta (01:59):

We're really appreciative of your interest in coming and speaking with us. We wanted to hear a bit about your thinking around technology usage at Georgetown and how that plays into larger themes about technology. I want to start by talking about something you mentioned as a guiding principle of yours, “Technology follows policy. Technology doesn't dictate policy—except when it does.” Can you explain that further and talk about how it has guided your thinking both at Georgetown and beyond?

Dean Ohm (02:35):

I think the important context for this conversation is: I've been a technology law scholar for 15 years and only in the last five years have I been a law school administrator as well. I'm the Associate Dean for Academic Affairs. That means I run the curriculum and I figure out who teaches what. But in the middle of—and even before—the pandemic, I realized that my prior thinking as a technology law scholar had to be brought to bear in the decisions I was making. It had to because I couldn't help myself. I am who I am. More importantly, everything we do resonates with what I've been writing about. I'm a critic of Facebook and of Google and of Twitter and Apple and all of the other giant tech companies. My core thesis is really boring and straightforward: the world is broken and tech companies are largely to blame. Now I'd have to put my money where my mouth is.

Dean Ohm (03:38):

I run a "tiny little nonprofit organization" with thousands of students and hundreds of professors. We use technology a lot! We had to use technology a lot in the pandemic. Let's go back to this thing, which I call my mantra. Whenever my staff would come to me (and I have a staff of around 30 employees, the Registrar's Office, the Office of Academic Affairs, more than I would have believed), they would say, "Okay, but the database we purchased 10 years ago does X. And because of that, Student Y who wants to do something… we just can't do it." And whenever that happened, I would say, "No, the technology cannot drive the policy. The policy has to drive the technology." And what that sometimes meant was we needed to change the technology, which is expensive, disruptive, and hard for the staff to do sometimes. I kind of said that repeatedly. I've probably said that 20 times since I became Associate Dean. But I have that little caveat at the end: except when it doesn't. That's a tip of the hat to the reality that sometimes technology really does tie your hands; it defines what's impossible, possible, expensive, inexpensive, and so on. There's a lot there, but that's my mantra and I've tried to apply it since I became Associate Dean—including during the pandemic.

Panya Gupta (05:08):

Speaking of the pandemic, we've seen our reliance on technology grow a lot, both at Georgetown and on a wider societal level. Technology is how we've stayed connected to one another when we can't be connected physically. It's how we've continued schooling, businesses—a lot of different functions that previously were not considered well suited to technology usage to the degree they are now. How did the administration think about technology? At the start of the pandemic, what did you anticipate or think the school needed to focus on?

Dean Ohm (05:58):

The pandemic has been this insane laboratory for embracing technology and barreling right past prior reasons why we wouldn't adopt some technologies. We had no choice. I think it's important to say at the outset that I don't want this focus on technology to distract from what's really important about the pandemic. People are suffering; people have died. In comparison to that, the kinds of things I worry about are petty and small. But you know, we didn't have a choice. It's almost the one-year anniversary, to the day, of when we decided to move everything online. We made that decision over spring break, and it was something that was so difficult and jarring to consider. We had to take hundreds and hundreds of classes, hundreds of professors who had never once in their life used Zoom, and we had to say, "This is now how you do it. You are going to adapt to this new reality." We had to do that in an environment where people were sick and where family members were sick and where people were having trouble getting back in the country. The role technology played is one that will be chronicled for decades. Actually, it's one that I want to build into my scholarship. I'm really starting to think in a kind of retrospective way. I said I was a scholar before Associate Dean. I'm going to be a scholar after I'm Associate Dean, because it's only a temporary job here. And so I'm beginning to think about the choices we made during the pandemic and whether or not we made the right choices and whether or not this says something bigger about how Facebook came to be Facebook and how Amazon came to be Amazon. Because at one point in their lives, they were a scrappy startup. Just like we were all scrappy startups in March of 2020.

Dean Ohm (08:01):

What I'd love to do is just go into lots of specific examples, but that's the overall idea.

Panya Gupta (08:08):

I would love to hear a bit more about these examples, but before we dive deep into the examples, I'm curious if there were any technology decisions that were made prior to the pandemic—prior to any thinking about the pandemic—that ultimately impacted the decisions that were made when the pandemic started and when you were in the “scrappy” startup phase at Georgetown?

Dean Ohm (08:37):

So one of the big, enduring lessons of the pandemic is—and this is probably an offensive phrase—but you come to the dance with who brought you. Okay, I guess it's not offensive, but anyway. You start from the raw materials you have to work with. And technologically, that means you look at your pre-existing platforms. You look at what they do well, what they don't do well. More importantly, you don't have time in an emergency to set up new platforms. And so I think for every university in America, in the world, every law school in America, their story technologically is the story of "what did we do beforehand?" And in that way, I think Georgetown is going to turn out to be pretty unusual. Georgetown for about four or five years had been embracing video recordings of classroom discussions, probably more than any law school in America.

Dean Ohm (09:31):

I don't think that fact has ever been written down or said aloud. Here I am, airing this fact for the first time. You know, we could do an entire hour on why that is. What were the kind of historical quirks? What role did student advocacy play in bringing that about? But at Georgetown, we had a video camera in every single classroom. We had a stack of computers in the corner recording every single class. We had a system where professors in each classroom had to opt out—not opt in—opt out of recording. And there's a long, long line of scholarship in law and in other fields that says opt-outs are destiny. If you create an opt-out, then the default choice tends to be really sticky. In this example—even if you're a professor who hates the idea of your classes being recorded—it's just hard to say no to something when the default is yes. Especially when your students really want it.

Dean Ohm (10:34):

And so that means we had a lot of classes that were recorded. We had a student body that was used to the idea that classes were available after the fact. It was never 100%. It probably wasn't even over 70%. I don't know the exact number pre-pandemic, but that gave us something that no other law school in America had: this kind of technological infrastructure for making recordings available after the fact, and the culture to adopt and accept something like that. I should be clear, because again, I don't think we've talked about this publicly: we didn't record clinics. We didn't record some tiny classes. We were mindful of special hypersensitivities to this civil surveillance system. I should also come clean and say, I didn't love our surveillance system. It was too much surveillance. I'm a surveillance scholar and I've studied for a long time how people are chilled when they know that a camera is watching. This is obviously well-trodden ground: some call it the "panopticon." I think one of the great ironies of our entire story is that the company we hired to do this is called Panopto. I had hilarious conversations with the IT director before the pandemic. I said, "You should tell that company that is the stupidest name they could possibly have picked." And I think he passed along the note, but clearly the company didn't let me rebrand them. To this day we use the system, and it's a very accurate name. I mean, maybe it's honest. Maybe I should be celebrating their honesty instead of criticizing their ineptitude, but it's a dumb name and it's our system. We have a literal panopticon at Georgetown Law and it's one that we didn't build for the pandemic. We inherited it for the pandemic. Just to close out the thought: it helped us during the pandemic. It helped us that we had this rack of servers in every classroom that was really good at recording things. We made a lot of design decisions that other law schools couldn't make that were all geared toward that little rack of computers in the corner.

Panya Gupta (12:41):

And for context, the classroom recording systems helped with both synchronous and asynchronous classes, providing classes virtually so that students all across the country and all across the world could continue attending.

Dean Ohm (13:02):

Yes, that's right. Georgetown, I think, has more students who come to us from abroad than probably any other law school in the country. And I'll remind everyone of what they already remember, which is that the pandemic coincided with really quite draconian announcements from the Trump Administration about visas and about students studying from overseas. This affected the fall semester more than the spring, if I'm remembering correctly. We had hundreds of students studying from China, South Korea, Australia, and parts of Europe, where it was the middle of the night when most of their classes met. The fact that we have this really sophisticated recording system—and let me just do a digression that's maybe too deep in the weeds for a podcast—also meant that, because we had a pre-scheduled recording system, we didn't need professors to hit the record button themselves.

Dean Ohm (14:03):

That's a huge benefit. I've heard, not only from other universities but from Georgetown's main campus, that one persistent problem was some professors would forget to hit the record button. This is what I've been studying as a tech scholar forever. There's this fancy word that scholars use, affordances—meaning, what does a technology, in its configuration and its user interface, make easy? And what does it make really hard? And it turns out what it makes easy gets done and what it makes hard doesn't get done. In our system, we had this wonderful affordance for recording with very little intervention. Which made recording easier, which helped all these students in China—like you just said. This is again what I've been studying for decades. At the same time, it meant I couldn't let a professor just set up their own Zoom account. They had to use my Zoom links. I love that. I call them mine. I feel like a father with these things. That drove a hundred other decisions that students probably experienced but didn't understand why they were true. It meant that in the spring, professors couldn't launch their own breakout rooms, because they had to use our Zoom accounts, and if we gave them what they call host rights and they hit the wrong button—they would screw up recordings for the next 10 classes. It meant that professors couldn't run their class over by 45 minutes, which students probably really loved, because another professor would appear at that link 15 minutes later. But at the same time, it meant that professors couldn't hang out for office hours afterwards using the same link.
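To make the idea of an affordance concrete, here is a minimal sketch in Python of what schedule-driven recording looks like in the abstract. The names and structure are hypothetical, not Panopto's or Zoom's actual interface; the point is simply that when the system, rather than the instructor, decides when the camera rolls, recording every scheduled class becomes the easy default.

```python
# A minimal, hypothetical sketch of schedule-driven recording (invented names,
# not Panopto's or Zoom's real API): recordings start from a central timetable,
# so no professor ever has to remember to press "record".
from dataclasses import dataclass
from datetime import datetime, time

@dataclass
class ScheduledClass:
    course: str
    room: str
    starts: time
    ends: time

def classes_to_record(timetable: list[ScheduledClass], now: datetime) -> list[ScheduledClass]:
    """Return every class that should be recording right now.
    The affordance: recording is the default path, not a button to remember."""
    current = now.time()
    return [c for c in timetable if c.starts <= current < c.ends]

if __name__ == "__main__":
    timetable = [
        ScheduledClass("Information Privacy Law", "Room 201", time(9, 0), time(10, 50)),
        ScheduledClass("Criminal Procedure", "Room 205", time(11, 0), time(12, 50)),
    ]
    for c in classes_to_record(timetable, datetime(2020, 9, 1, 9, 30)):
        print(f"start recording: {c.course} in {c.room}")
```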

Dean Ohm (15:52):

This one tiny little institutional technological decision, made for completely different reasons, changed the entire course of our technological capability. Largely for our benefit—but it also created friction. It created frustration. And then here's the big sweeping theme: it centralized what we did at Georgetown. It meant Paul Ohm and George Petassis—the head of our IT department—were kind of in charge of this centralized architecture. It wasn't laissez-faire, let every professor do what they wanted. And I'm sure that had knock-on effects that even I haven't really come to appreciate yet.

Panya Gupta (16:34):

You know, from a student's perspective, that background is so interesting to hear. I'm curious to hear more about these types of decisions. I know there are some really interesting ones that you've made related to QR codes, for example. Can you talk a little bit more about how those have been used and thinking around that?

Dean Ohm (16:56):

Yeah. I'm hoping to write an article about this. I've focused so far in this conversation on technology—how technology decisions are made, and how they affect kind of important human interactions. But of course, the really interesting overlay is public health. In this pandemic, we've all had to become amateur public health experts. Every one of us—I don't just mean administrators. All of our decision-making is driven by what the CDC says, by what university public health experts say. And we quickly came to understand that one thing public health really needed was contact tracing. If someone was in our building and that someone was ill, we would have to then figure out who else was in the area within six feet, perhaps without a mask. Contact tracing and the technological facilities that enable it are something that a lot of people have commented on—but here's our own little micro-version of that story.

Dean Ohm (18:00):

Early on, we wanted to have—as only one example—hybrid classes. We wanted to have classes where students came back in, but they'd wear masks and be socially distant. And the one question is: if a student ever appeared and then later found out that they were positive for COVID, how would we find out who was within six feet? Listener, think about the menu of options. One is the good old-fashioned way. You get them on the telephone, say, "Who was sitting next to you?", and rely on their imperfect memory. Zoom ahead to the most foolproof and invasive version, where we aim a camera at the students and we record where every single student sits and who they turn to during the break and whether or not they ever get within six feet.

Dean Ohm (18:43):

You can imagine this crazy artificial intelligence system where we measure the distance between faces and we determine how close you are. You can imagine something that's probably a little less complete where we track Wi-Fi registrations (this was actually floated by someone). You say, “At this time there were 32 iPhones and 12 Android phones next to the access point that services this classroom.” You could imagine we could ask students to register their Wi-Fi cards on their smartphones so we could then pull this out. The bottom line for all of this is: COVID contact tracing requires a surveillance system of some kind. We decided—and I drove a lot of this decision-making—to try and do the most minimally invasive surveillance system that would also enable decent contact tracing. We did an opt-in system where every seat would have a QR code.

Dean Ohm (19:44):

You would snap a picture of [the QR code] on your smartphone. It would send you to a form. You would put in your name, and somewhere in our cloud would be registered that "Panya Gupta, at this date, at this time, sat at this seat." And that's all it would say. It wouldn't get a picture of your face. It wouldn't even show when you left—because we're not asking students to do it on departure. It would give our contact tracing a happy medium between the old-fashioned "who was sitting next to you" and the fully surveillant "let's look at your picture." It's not perfect. Students forget to do it. I don't know when you left. You know, it's a little imperfect. It certainly doesn't let me know if you got too close to someone during a break. But I'm really glad about this choice.
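To make the trade-off concrete, here is a minimal sketch of the kind of record such an opt-in check-in produces and the crude query a contact tracer could run against it. The field names and the two-hour window are invented for illustration; this is a sketch of the idea, not Georgetown's actual system.

```python
# An illustrative sketch (hypothetical field names, not Georgetown's real
# system) of what the opt-in QR-code check-in collects: a name, a seat, and an
# arrival timestamp, and nothing else -- no photo, no departure time, no record
# of movement during breaks.
from dataclasses import dataclass
from datetime import datetime, timedelta

@dataclass
class SeatCheckIn:
    student_name: str
    room: str
    seat: str
    checked_in_at: datetime

def possible_contacts(log: list[SeatCheckIn], case: SeatCheckIn,
                      window: timedelta = timedelta(hours=2)) -> list[SeatCheckIn]:
    """Deliberately crude contact tracing: everyone who checked into the same
    room within a time window of the positive case. It cannot say who sat
    within six feet or who left early -- that imperfection is the point."""
    return [
        entry for entry in log
        if entry.room == case.room
        and entry.student_name != case.student_name
        and abs(entry.checked_in_at - case.checked_in_at) <= window
    ]
```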

Dean Ohm (20:26):

I think it strikes a nice balance. I think there's an old-school aesthetic benefit: there's this constant reminder that we're all in this together. That value is served, instead of us just watching you. And I'm sure there are some critics who think this is a lot of labor and it's not even a perfect contact tracing system. But I didn't want to build a perfect contact tracing system, because I don't want to build a perfect surveillance system.

Here's one other benefit: think about when the pandemic ends. Imagine we had built the Wi-Fi tracking system for smartphones that some people had pitched to me—guess what would have happened when the pandemic ended? Someone would say, “Well, we already have that software running and we already have it in the system. Why don't we use it to take attendance?”

Dean Ohm (21:16):

Or, "Why don't we use it to detect intruders in the building?" That's the problem with baking in technologies of surveillance. They're always done for really good reasons and they're never dismantled. As soon as the emergency ends, creative and hardworking people will use it to solve other problems. And half of those problems we'll think are amazing; isn't the world better because of them? And half will be problematic, concerning, and lead to bad places. And then once in a blue moon—they'll come up with a use that's just evil. That's just downright pernicious. They'll use it to track people on police watch lists. Who knows what they would do next? I characterize that as evil; maybe some listeners are thinking that'd be great. A QR code can't be abused in that way. When the pandemic ends, we're done. It's just going to be this funny sticker on desks. My guess is we won't be very good about cleaning them off. There'll be this sticky residue of a QR code and that's it. There's no inherent temptation toward mission creep, to use the surveillance system for some other purpose. It's narrowly tailored to the job at hand. That makes it worth all of the trade-offs about what it can't do.

Panya Gupta (22:34):

And that seems to relate to your mantra of how technology follows policy, not dictates policy, whereas a secondary use, or mission creep, of a surveillance system might result in existing technology changing policies, such as being used for attendance.

Dean Ohm (22:58):

Yeah, absolutely. Right. You can imagine that. Step one would be COVID contact tracing. Step two would be—this is really interesting—professors who want to use this for attendance. I've been an administrator for so long that I'm actually picturing the email I would write to the professors:

Dear professors, 

We have this wonderful new capability. Do you hate taking attendance? We know you do. You just go to this friendly app and you can find out which of your students were there.

That's step two. Step three is, "Wow! Look how many students are skipping class. That's really problematic." You know what step four would be. I'm turning this into a sci-fi novel. Step four would be during our next ABA accreditation review, someone would say, "Oh, it's bad that we know there are so many students skipping classes. This threatens our accreditation." (Which is often what someone with an ulterior motive will say in a law school administration.)

Dean Ohm (23:51):

Step five would be the worst—the email we send all students. 

Dear students, 

We have now put in place a mandatory attendance policy that is being tracked by technology. You will get an email once you've missed a class twice. After three times, if you don't have an explanation, you'll be withdrawn from the class.

I mean, that's not crazy or implausible. And you could imagine, if your response is, "No, no, no—this is fine. Professors of goodwill and students of goodwill would rise up and advocate against it, fight it, and kill it." And you know what, you're probably right. But think about all the energy that would require. Think about how that would distract us from the other things we need to do as a community. Think about if they were wrong and you lost that future battle. So I'm glad. I didn't even think it through before this conversation. I'm glad that the QR codes are the reason that, three years from now, we won't have to fight over a mandatory, technology-embedded attendance system. I'm going to sleep better tonight just realizing that that's one of the futures that I've helped foreclose, at least for now. Who knows what they're going to do next year, but for now I feel safe in thinking that we dodged a bullet here.

Panya Gupta (25:09):

Yes, the academic version of Black Mirror, it seems.

Dean Ohm (25:15):

Black Mirror has destroyed being a tech scholar, because they're so good at doing what we're supposed to do. It makes all of our stuff seem secondary and ham-handed compared to them.

Panya Gupta (25:26):

The other example that I was curious about, and I know many students are curious about, is the chat functionality in online classes on Zoom. Can you talk about your thinking and the general thinking about that?

Dean Ohm (25:41):

The reason I love talking about this example is that up until now we've just been talking about surveillance, and surveillance brings a lot of pros and cons to bear. We know what that debate looks like. The chat functionality is about a very different bucket of issues. One thing I want to introduce in this conversation is that technology is all about, or should be about, thinking about how it enables certain human values and discourages other human values. And there are lots of human values we can talk about. It could be equity, fairness, or efficiency. There are lots of things that we as humans want, and technology makes these things easier or harder. Just like my example about attendance policies and surveillance cameras. And I like the chat example because it is not about surveillance.

Dean Ohm (26:35):

It is about community. It's about pedagogy. It's about being social. Here's the issue: Zoom has a fine-grained set of chat functions, and you've all seen them. You've all been in a Zoom room where you can chat with everyone else. Zoom lets the administrator—which sometimes means the professor, sometimes means us—set the chat policy for their session. And there are three choices: everyone in the class can chat with the entire class, chat only with the professor, or chat privately with another member of the class. Those are the three choices you get to set. I had an early conversation with the professors. I said, "I need to set the default setting. What should the default setting be?"

Dean Ohm (27:34):

I'm going to share the private things said by professors in meetings, but I'm not going to identify any names. There was a non-trivial number of professors who said, “I don't want the chat in my class. It's a distraction. It's not consistent with my pedagogical goals. It's just not something that I want to keep track of. There's too many moving parts in online teaching.” 

I am totally sympathetic to all of that. I think whether or not you enable chat should be an individual choice by the professors. But there were two complications that caused me a lot of anxiety and I'm not even sure we've landed on the right one. Here's the first one and it gets a little in the weeds. Zoom allows you as the host of a room to turn—I'm sorry, I'm doing this slowly because I'm trying to remember all the details myself—but Zoom allows you as the host of the room to let everyone talk to everyone or to let everyone talk privately to one another. But Zoom doesn't let you make that latter choice during the class.

Dean Ohm (28:46):

You actually have to set it for all classes ahead of time, if that makes sense. Someone has to log into the Zoom server and say, this classroom allows private chat. And what we realized was—go ahead—

Panya Gupta (29:00):

Can I clarify, is that a permanent decision for the entire semester or can that change at any point? Later on too?

Dean Ohm (29:11):

For this conversation I should have looked into the details, because there's an important detail that's escaping me at this moment. I'm trying to think of a more general way I can frame it to still get the point across. There were some things that Zoom required us to set once, as a default for all classes. And once we set it, professors couldn't easily change it on the fly. One of those things was private chat—private chat turned out to be something that was really, really, really hard to let professors turn on and off on their own. I had to make the choice for all classes, and it flows from an earlier part of this conversation, which is: if I let every professor use their own Zoom ID, they could do whatever they wanted. But we decided not to do that because of our Panopto recording system; we had to centralize some of the control.
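As a rough illustration of that constraint, here is a sketch of the difference between a setting a host can flip per meeting and a setting that lives as one account-wide default under centrally managed accounts. The setting names and structure are invented for the example; this is not Zoom's real configuration API.

```python
# An illustrative sketch (invented names, not Zoom's real settings API) of the
# constraint described above: under centrally managed accounts, some settings
# are one account-wide default for every class, while others can be changed
# meeting by meeting.
ACCOUNT_DEFAULTS = {
    "chat_with_everyone": True,   # all-class chat
    "private_chat": False,        # one choice for every class, set centrally
}

PER_MEETING_SETTINGS = {"chat_with_everyone"}   # a host can adjust these on the fly
ACCOUNT_ONLY_SETTINGS = {"private_chat"}        # only the central administrator can change these

def can_professor_change(setting: str) -> bool:
    """A professor teaching on the school's central Zoom accounts can only
    adjust per-meeting settings; account-only settings are fixed for everyone."""
    return setting in PER_MEETING_SETTINGS

if __name__ == "__main__":
    print(can_professor_change("chat_with_everyone"))  # True: adjustable per class
    print(can_professor_change("private_chat"))        # False: one default for all classes
```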

Dean Ohm (30:03):

In this case it took away flexibility. Although I've kind of butchered the details, the takeaway was I couldn't allow private chat in classes—even for professors who wanted it—without allowing it for all classes. There were a couple of vocal professors who said vehemently and passionately, "I cannot have private chat going on in my class." What I want to tell the listener is you probably have an opinion about this one way or the other, and you probably feel pretty strongly about it, but I want you to understand that there was someone who had the opposite opinion from you. It was left to me as a decision maker to balance the competing interests. More importantly, deeply and emotionally felt competing interests. Let me just frame some of them for you. Why would a professor—and again, don't be too judgmental; whether you agree or disagree with these, you should have your own opinion—why would a professor want to prohibit private chat in a classroom?

Dean Ohm (31:05):

There are many answers. One is that it is just distracting, say some professors. "It just is one more thing my students will be doing instead of focusing on the conversation." Number two is that it opens the door to harassment: that some obnoxious student is going to use it to send something vile to another student. The professor kind of sees their role as the person in charge of the discourse in their classroom, and it bothers them that they might be in some way responsible for abetting harassment that they're going to have trouble with. Which would also mean, do we send these people to student discipline, etc.? These all sound very paternalistic, of course. Number three, I think some professors—although they didn't say it this way—didn't like the thought that the students would be talking. For all of these reasons and probably more—I'm probably not being super charitable to this argument—professors, some of them vehemently, said, "You cannot allow there to be any private chat in my classroom." Because of Zoom's choices and because of our choices, that stuck for all professors [private chat was disabled]. Let's take the student's point of view.

Dean Ohm (32:16):

We heard from students passionately, not a lot of them but a few: "I'm brand new to Georgetown. I've never been here before. I don't know a single soul yet." This was especially stark at the beginning of the fall semester. "What I would love more than anything is if I could follow up a student comment by saying, 'I totally love what you just said.' But I don't know what your email address is, because it's hard to take the little snippet on your Zoom screen and figure out your email address. And by the way, it might just be creepy if I send you an email saying, here's something about what you said in class. It'd be more organic, it'd be more interesting, for me to say, 'Can we chat for 15 minutes this afternoon about that amazing comment you just made?'"

Dean Ohm (32:56):

"Maybe that's how I find a study group. Maybe that's how I find a friend." The analogy I kept making is that, in the five minutes before your first law school class, most of us probably turned to the person next to us and said, "What do you think about law school so far?" And in a non-trivial number of cases, that person became a friend, a study group partner. We can't do that in Zoom. Zoom doesn't allow side conversations. I felt so much for students who expressed this opinion. This happened to be at the same time that we were worried a lot about student isolation and student wellness from Zoom and from Zoom fatigue. On more than one occasion, I almost overrode the professors, flipped that switch, and said, "We are now a school that has private chat in all classes." But I couldn't get myself to do it, for complicated reasons of governance: political will, picking fights, being distracted by a thousand other emergencies. But to this day, I'm not sure we made the right choice. For those who aren't at Georgetown, we don't allow private chat in our classes. And I regret that sometimes. I wonder if I could have improved the lives of dozens, if not hundreds, of law students in some small way. I don't think it would have revolutionized anything if we had enabled private chat. And so I'm just fascinated by the way technology both enabled this, but also foreclosed and dictated the end result.

Panya Gupta (34:26):

It's so interesting to hear the background thought process that goes into these decisions. I'm curious how your thinking—both during that decision-making process and now that you've seen how it's played out—about chat and the other technology considerations, QR codes, Panopto, etc., has impacted your wider thoughts on the use of technology in society, and how reliant we are on it.

Dean Ohm (34:59):

I mean, I'm thinking a lot about how I use the lessons of the last three years to enrich my scholarship. I think a lot of these are old ideas I have, but I've modified them or I am going to double down on things I've said in the past. Let me just ramble on a few of them. 

One is this mantra we started with: technology shouldn't dictate policy; policy should dictate technology. This chat example is a great example of when that falls apart. Zoom made some choices about their defaults and some choices about recording that foreclosed the best solution. The best solution to the chat debate would have been: by default, it's turned off. Professors who don't want it don't have to think about it, but a professor with two clicks could turn it on.

Dean Ohm (35:52):

That choice wasn't available to me. I know there are probably some Zoom experts out there saying, "No, no, you could have done that." But we couldn't have done it within the constraints of our other technological choices. Trust me, I looked into it. And that's frustrating to me. I wish I could have waved a magic wand and gotten Zoom to change 16 lines of code to give us the option we didn't have. But I can't do that. I don't have that power. Zoom has its own concerns and incentives. Not that I asked anyone, because I just assumed I would be ignored. That's why I had that caveat in my mantra: except when it doesn't. That's number one. Number two—and I think this is something that's a little abstract—there were moments when design choices we made for good purposes, to forestall bad purposes, such as the QR codes, meant there was something clunky about our solution.

Dean Ohm (36:49):

There was something inefficient about our solution. Our solution was full of friction. I'm using that word because Silicon Valley always talks about the frictionless model and how their job is to find all the rough edges in technology and sand them off. Everything is seamless. You're just luxuriating in eight hours of Netflix, because you didn't even realize the time was going by. One thing I've been saying in my scholarship for a few years, and now I'm going to double down on it, is that we have to realize that friction is where we preserve human values. Friction in technology. We should sometimes force our technology to be imperfect and clunky and odd and unhelpful. When we get frustrated by those moments of friction, we need to—this is going to sound very new age—we need to take a moment to be mindful of where that friction comes from and all of the other amazing things it's doing for us.

Dean Ohm (37:48):

We need to sit back, appreciate, and embrace the friction. I don't think that's usually what happens. Usually in tech companies—I've never worked for a tech company, but let me just pretend like I have. Usually in tech companies, points of friction are seen as the enemy, the thing to get rid of. If you're someone who is encountering friction, it's almost like an embarrassment. You're almost going to say, “Oh, it's too bad. Our users weren't able to do X.” I've been encouraging my staff to instead embrace the friction to say, “Yeah, you know what? You can't do this thing you want to do easily. And it's because we don't want to create a surveillance state.” Because mission creep always happens with the best of intentions. The thing that cracks open the door to mission creep is something that we all would agree is just an amazing, amazing feature.

Dean Ohm (38:45):

And so it's too easy to get wrapped up in that and say, "Well, you know, I recognize that this might open the door to problems, but how can we stand in the way of X? X is amazing."

Panya Gupta (39:07):

And I think we're really seeing the issue of friction in technology usage in current events. Facebook allowed for greater and global connectedness, but that was truly for better and for worse, as we're seeing now with the role of social media in organizing the January 6th insurrection. I wonder if technology companies ever contemplated the potential for negative consequences from their new innovations. Did they ever wonder how the potential of technology could be harmful, or was that kind of ignored or muffled under technology optimism?

Dean Ohm (39:37): 

I wonder if Facebook did that earlier in its conception. Now, there's no reason to think Mark Zuckerberg, with his weird pathologies, would have done this—but what if Zuckerberg in his Harvard dorm room had said, "You know what, someday people might storm the Capitol building in an attempt to overthrow the American government because of this design choice I just baked into Facebook"? As implausible as that sounds. That's what I think is missing in Silicon Valley. You brought up Black Mirror earlier. What if Silicon Valley embraced the Black Mirror thinking of worst-case scenarios instead of thinking their job is to search out and destroy every moment of friction? And that's one of the things I learned as Associate Dean. You can do that. You can learn to celebrate rather than regret the friction, but it doesn't come as second nature.

Panya Gupta (40:27):

That's kind of the work of a lawyer, too. To anticipate, think about corner cases, worst case scenarios, build them into a general law or a general policy. And then inevitably more corner cases will arise because that's what human nature is: unpredictability and uncertainty that we just have to work with. And then the hard cases, the really hard cases end up in court.

Dean Ohm (40:55):

That's a really, really apt analogy. I think one thing I would say, though, is we often use the terminology and framework of costs and benefits to do what you were just talking about. To me, those words are a trap. They're an intellectual framework that leads us to trouble. I think the point I'm trying to make is: costs are benefits, and benefits are costs. And that law is all about the scales. To me, that is the mistake. When you're building technological platforms, you cannot—it's not this "how can I have as many benefits as possible with as few costs as possible?" I think what I'm trying to argue for is: we think of laws, legal systems, and statutes as having infinite complexity, where we route around the good stuff and the bad stuff, and we have this perfect line. I think what we have to embrace more often with technology is that clean, bright lines are better. This shows up in other areas of law. We need to have things that have false positives, false negatives, benefits forgone, and costs that we have to eat. Because there's something about drawing these clear, straight lines that gives us all sorts of other benefits.

Panya Gupta (42:18):

I think we're at time now, but thank you so much for a fascinating discussion, Dean Ohm. It's fascinating how much overlap there is between the thinking that goes into technology and the thinking that goes into the work of a lawyer, and how much that overlap is being underutilized—honestly—in both professions and both fields.

Dean Ohm (42:43):

No, you're very welcome. I enjoyed this conversation. I think I'm going to keep this podcast and use it to write the introduction to my next article. Thank you for that. And congratulations. I mean, it's great to see Georgetown Law students doing this. I just love our students in so many ways, and the thought that you're creating an entire podcast and getting it off the ground is great. So best wishes. And I will definitely subscribe to you in my podcast feed as soon as I can.

Panya Gupta (43:05):

Fantastic. Thank you again for joining us today for Between Two Codes. And thank you all for listening to another episode.

 

Credits:

Producer: Panya Gupta & Ian Carrico
Audio Engineer: Luke Evans
Researcher: Judy Faktorovich & Connor Kane
Transcription: Liza Clark
Social Media: Ada Locke