I’m excited to bring you this interview with Michael Sacasas. Michael is the Associate Director of the Christian Study Center of Gainesville and the author of The Convivial Society, a newsletter about technology, society, and the moral life. We share his work often, and wanted to talk with him directly following a private exchange we had. I hope you enjoy our conversation.
Michael Wear: You know we’re big fans of your Substack, but the immediate prompt for this interview came when you followed up with me in response to a tweet of mine: “There’s a popular view that grace is mostly about sin, but the truth is that grace is, first and foremost, about life and life to its fullest.” You reached out and referenced some of the connections you see between technology, how we use it and think of it, and our understanding of grace. Would you share a bit about how you view the connection? What does grace have to do with technology?
Michael Sacasas: The connection between grace and technology becomes clear to me as I've lately revisited the early chapters of Genesis and the story of Cain. In the story, Cain murders his brother and is condemned by God to a life of futility and wandering. I was struck by Cain's refusal to abide in his status as a wanderer and how he turns toward technology in order to overcome the fragility and precarity of the human condition. He decides to settle and build a city, which he names after his son. Then we read that agriculture, the arts, and metallurgy all emerge within his family line. Later on, of course, we come to the story of the Tower of Babel, long read as a parable of technological hubris, and we find a similar pattern repeating itself.

In both cases, however, the text presents an alternative: the line of Seth and the family of Abraham. The contrast between these pairs can be framed as the contrast between the way of techno-scientific mastery and the way of grace. The former is characterized by the desire for power and control. The latter is characterized by a willingness to receive life as a gift. In both cases, I see a response to the human condition in a broken world. I don't think the point is that technology is bad or that we ought to be fatalistic or indifferent to suffering. Quite the contrary.

But these two paths are what I would describe as fundamental orientations toward the world, or two distinct ways of being in the world. The path of mastery and control tends toward hubris. It has little place for hope, except in what humanity can do for itself. It becomes incapable of recognizing the goodness or beauty of what it has not made. The mysteries of human experience from birth to death and everything in between become chiefly problems to be solved by the application of a technique. It seeks knowledge as a form of power. It is tempted, in T. S. Eliot's memorable formulation, by "dreams of systems so perfect no one will need to be good."
Given this orientation to the world, there is little to no room for grace, understood as precisely that which we cannot command, plan, fabricate, or master.
In many respects, the modern techno-scientific project is simply a much more sophisticated instance of the same project that animated the builders in the story of the tower of Babel. And all of us who inhabit the modern world, whether Christian or not, are shaped by this spirit that has informed the making of our pervasive technologies and dominant institutions. Christians, especially Protestants, will readily grant that they receive salvation as a gift, which they cannot earn or master. But we live the rest of our lives in the mode of mastery. We seek to control, plan, manage, and optimize our lives, and we are consequently exhausted and plagued by anxiety precisely because the project of mastery and control is untenable. Moreover, it's difficult for us to experience suffering as anything other than an unmitigated evil to be eliminated at any cost. And, to the degree that the path of techno-scientific mastery entails a regime of quantification and measurement, we blind ourselves to all that is good and beautiful but unquantifiable. So from a theological perspective, the technological structures of modern society tend to attenuate our capacity to receive life as a gift, to experience the grace of God. In a wonderful line from one of his Sabbath poems, Wendell Berry summed all of this up by reminding us that "we live the given life, and not the planned."
MW: Dallas Willard often said that “grace is not opposed to effort, but to earning.” There has been this explosion of apps which kind of self-consciously suggest, “yes, smartphones have harmed your emotional lives and mental health, but here’s this layer of technology to help turn this thing around.” And so there are meditation apps, and a whole class of Christian apps meant to help people read the Bible more. Churches are using social media and phone apps as a resource for congregants. I’ve seen advertisements for apps you can download to your phone with the promise that they will help you use your phone less. Can these kinds of interventions be helpful, or are they just another expression of “techno-scientific mastery”?
MS: My inclination is to say that, generally speaking, they tend to be just another expression of the impulse driving the pursuit of techno-scientific mastery, but it's worth unpacking how that works. To begin with, it's a case of a common tendency: when technology causes problems, turn to more technology as the solution. This is not always a bad thing to do. But it is worth asking whether, in certain cases, the better approach might be to revisit the initial use of technology before doubling down. There's also the question of what the appropriate means to achieve certain ends might be. One characteristic of thinking that has been shaped by the patterns of technological culture is that it assumes the interchangeability of means. In other words, it's the assumption that all the ways one might pursue a goal are equal so long as they get you to the goal. But this is simply not true. How something gets done may, in certain respects, be as important as the goal that is achieved. Or, to put it another way, some goals are intrinsically connected to a certain means. So, for example, I wouldn't say that it is wrong to read your Bible on an app, but reading on an app cultivates a different set of mental and even moral habits than reading the Bible embodied in book form. Relatedly, it is often the case that technologies promise to outsource labor that is intrinsic to the goals that we want to achieve. So when we use a tool that promises to do something for us or make something easier, we should be mindful that we are not, in fact, outsourcing some of the work and effort that is essential to the goal we want to achieve. Finally, although I'm sure there's more that could be said, it's important to think about our technologically mediated actions at different scales.
So, for instance, it is one thing to consider a single use of a single app, and it is another to consider the habitual use of an app situated within the technological ecosystem of the smartphone or even the wider digital media milieu. So it is worth thinking about the collective dimensions of our tech use. What systems am I unwittingly feeding by my use of seemingly innocuous apps, or even apps which, taken singly, appear helpful? As a church thinks about the spiritual formation of its congregants, does it make sense to push them toward social media environments that may form users in ways that are at cross purposes with Christian moral formation? We sometimes think that the technology is neutral and what matters is whether we are using it in morally good ways. But even if we are using social media to post and consume good content, our use of the platforms will have a formative influence that is independent of the uses to which we put them. So I would encourage us to think through these matters more deliberately, exploring both the explicit and implicit consequences of technology and the various scales at which those consequences manifest themselves. And coming back to the grace/mastery framing, we should consider, too, whether we are relieving ourselves of burdens that we ought to bear or deploying a tool in a spirit of mastery, by, for example, trusting to and hoping in the operations of the tool rather than the grace of God.
MW: What is the best decision we’ve made as a society regarding technology in the last thirty years or so?
MS: At the risk of appearing to avoid the question, my initial response is to suggest that we don't, in fact, make decisions as a society about technology (or much else, for that matter). Or, as I have sometimes put it, with respect to the ethics of technology there is no "we" there. And this is worth noting because it is a critical part of the problems we face with regard to technology: we lack a space for moral deliberation and practice. Many of our technologies exist in an odd space that is difficult to define given our usual categories. They are neither private nor public, in the way we tend to think of those terms. Similarly, they often present as matters of individual (or consumer) choice, but also have collective and social (or network) dynamics. This is partly why, as Colin Horgan once put it so well, we “ended up living in a world we all chose, but that nobody seems to want.”
There are vanishingly few mediating institutions that function as communities of moral deliberation and practice (reflecting a longstanding trajectory in modern western societies). Consequently, we get stuck in interminable debates about whether technologies ought to be regulated by the state or subject to personal discretion. This is not true of all technologies, of course. But even when it seems obvious that technologies ought to be subject to government oversight—nuclear power, for example—it's not clear to me that there is a "we" representative of society that is contributing to the decision making process. To take the example of social media, there was no collective deliberation over the last 15 years or so about allowing social media to structure so much of our public discourse. And now there is little agreement as to how to curb its noxious effects, in part because it's hard to identify the collective entity with the appropriate authority and power to do so in a democratic society. By contrast, China just issued national rules about how much time minors can spend gaming each week. The point is not to endorse China's move, but to illustrate the contrast to our situation in an open, liberal society that privileges individual freedom.
I realize there's a lot of stuff going on in the background of this response having to do with questions about community, moral authority, political legitimacy, and more. But the bottom line is this: we don't always have a framework for collective or communal decisions about how technology is adopted and implemented. So for that reason it's difficult to answer the question directly! The answer would inevitably be about a narrower dynamic—government policy, market forces, university research, etc.—than a specific direction that society has taken with regard to technology.