Program Or Be Programmed | Reflections & Notes

Posted on November 16, 2019



Douglas Rushkoff. Program Or Be Programmed: Ten Commands For A Digital Age. Soft Skull Press, 2010, 2011. (155 pages)

The GeekDad Interview With Douglas Rushkoff, Jason Cranford Teague [Wired]


REFLECTIONS


In response to the title of this book, I quip, “too late.” The forces of digital media, and specifically the ubiquity of platforms unfortunately named “social,” have done their damage, taken root in our individual and collective consciousness, and wreaked havoc on our souls, our relationships, and our democracy. Unlike other books with which I have interacted, there are several highlights below that are less about “hmm, that’s insightful,” and more about “No duh!” A digital expansion of mass choices leading to entitlement? Yup. Check. Binary language leading to polarizing epistemologies and tribes? Yup. Check. The inability to distinguish the fake from the real? Yup. Check, CHECK! Anonymity exposing the darker, shadow, and sinister demons of our nature? *Sigh. Yup. Check.

In truth, my sincere reflections are not this glib. I am deeply grateful to Rushkoff for such a clear and erudite articulation of the state of things, and it is by reading books like this that I feel more equipped to counterbalance the deleterious effects that our current technological/digital age is wreaking upon us all. However…

There are some considerations I’ve been wrestling with over the past several months that have me slightly morose. While authors like Rushkoff and books like Program Or Be Programmed are really fantastic, it appears to me that our evolutionary biology has left us incapable of meeting the demands to which books like this call us. We are incapable of rising above our more primal instincts, namely survivability and sociability. While these may sound like good aims, the cultures we have built have emerged more rapidly than our psychologies have evolved. And so we’re somewhat trapped, desiring to be functionally astute in this world while knowing we’re neurologically wired for another. In fact, the problem is quite explicit. Most of us already feel, and instinctively know, the negative effects social media is having on us and on our relationships. We even say it out loud: “I need to get off Facebook,” or “I should really put my phone down.” Studies abound on depression, loneliness, and other mental health ailments that are most prominent in technologically immersed communities. And yet we persist, not in rising to become better humans, but in descending into the life-extracting seduction of the flickering pixels. Is it possible that our frontal cortices are no match for the dopamine hits that the digital age has so finely hacked? I’m starting to feel as if there is an inevitable telos: a future of disdaining the most precious elements and virtues of our humanity, and a persistent penchant for outsourcing our selves to what we can build, rather than be.

I suppose I write this reflection because, while admittedly morose, I do not believe this is destiny. Perhaps reading books like this is the perfect “natural conditioning” needed to steer us towards a more human-centered future. Regardless, please consider reading this, and by reading, consider carefully how we collectively will program our tomorrow.


NOTES


Preface

…no, you don’t have to learn to program. | You do, however, have to learn that programming exists. These environments in which we’re all spending so much of our time these days–the websites and social networks where we do our work and play–they are not nature. They have been constructed by people (or at least for people) with real agendas. (8)

Understanding programming–either as a real programmer or even, as I’m suggesting, as more of a critical thinker–is the only way to truly know what’s going on in a digital environment, and to make willful choices about the roles we play. (8)

Introduction

Computers and networks are more than mere tools: They are like living things, themselves. (14)

The good news is we have undergone such profound shifts before. The bad news is that each time, we have failed to exploit them effectively. (18)

We teach kids how to use software to write, but not how to write software. This means they have access to the capabilities given to them by others, but not the power to determine the value-creating capabilities of these technologies for themselves. (19)

| Like the participants of media revolutions before our own, we have embraced the new technologies and literacies of our age without actually learning how they work and work on us. (19)

Before, failing meant surrendering our agency to a new elite. In a digital age, failure could mean relinquishing our nascent collective agency to the machines themselves. (20)

As a result, instead of optimizing our machines for humanity–or even the benefit of some particular group–we are optimizing humans for machinery. (21)

We are not just extending human agency through a new linguistic or communications system. We are replicating the very function of cognition through external, extra-human mechanisms. These tools are not mere extensions of the will of some individual or group, but tools that have the ability to think and operate other components in the neural network–namely, us. (21)

The industrial age challenged us to rethink the limits of the human body: Where does my body end and the tool begin? The digital age challenges us to rethink the limits of the human mind: What are the boundaries of my cognition? And while machines once replaced and usurped the value of human labor, computers and networks do more (22) than usurp the values of human thought. They not only copy our intellectual processes–our repeatable programs–but they also discourage our more complex processes–our higher order cognition, contemplation, innovation, and meaning making that should be the reward of “outsourcing” our arithmetic to silicon chips in the first place. (23)

We do not know how to program our computers, nor do we care. We spend much more time and energy trying to figure out how to use them to program one another instead. (24)

I. TIME
Do Not Be Always On

The human nervous system exists in the present tense. We live in a continuous “now,” and time is always passing for us. Digital technologies do not exist in time, at all. By marrying our time-based bodies and minds to technologies that are biased against time altogether, we end up divorcing ourselves from the rhythms, cycles, and continuity on which we depend for coherence. (28)

Recognizing the biases of the technologies we bring into our lives is really the only way to stay aware of the ways we are changing in order to accommodate them, and to gauge whether we are happy with that arrangement. Rather than accepting each tool’s needs as a necessary compromise in our passively technologized lifestyles, we can instead exploit those very same leanings to make ourselves more human. (40)

II. PLACE
Live in Person

Digital networks are decentralized technologies. They work from far away, exchanging intimacy for distance. This makes them terrifically suitable for long-distance communication and activities, but rather awful for engaging with what–or who–is right in front of us. By using a dislocating technology for local connection, we lose our sense of place, as well as our home field advantage. (41)

…digital media are biased away from the local, and towards dislocation. (43)

…digital technologies can bring news and pictures to us from far away, instantaneously and constantly. (48)

Meanwhile, what is happening just outside our window is devalued. As we come to depend on the net for our sense of connection to each other and the world, we end up fetishizing the tools through which all this happens. We associate our computer screens and email accounts with our most profound experiences of community and connection, and mistake blog comments sections for our most significant conversations. (48)

By recognizing digital media’s bias for dislocation, we are enabled to exploit its strength delivering interactivity over long distances, while preserving our ability to engage without its interference when we want to connect locally. (51)

III. CHOICE
You May Always Choose None of the Above

In the digital realm, everything is made into a choice. The medium is biased toward the discrete. This often leaves out things we have not chosen to notice or record, and forces choices when none need to be made. (52)

…early tests of analog recordings compared to digital ones revealed that music played back on a CD format had much less of a positive impact on depressed patients than the same recording played back on a record. (54)

…the problem is (54) not that the digital recording is not good enough–it is that it’s a fundamentally different phenomenon from the analog one. The analog really just happens–the same way the hands of a clock move slowly around the dial, passing over the digits in one smooth motion. The digital recording is more like a digital clock, making absolute and discrete choices about when those seconds are changing from one to the next. (55)
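Rushkoff’s clock analogy maps directly onto how digital audio actually works: a continuous waveform is sampled at discrete instants, and each sample is snapped to the nearest representable level. Here is a minimal pure-Python sketch of that quantization; the sample count and number of levels are arbitrary, chosen only for illustration:

```python
import math

def quantize(x, levels):
    """Snap a value in [-1, 1] to the nearest of `levels` evenly spaced steps."""
    step = 2 / (levels - 1)
    return round((x + 1) / step) * step - 1

# A smooth "analog" sine wave, sampled at 16 discrete instants...
samples = [math.sin(2 * math.pi * t / 16) for t in range(16)]

# ...then forced into a coarse 8-level digital representation.
# Everything between the 8 allowed levels is simply discarded.
digital = [quantize(s, 8) for s in samples]
```

The analog signal passes through every intermediate value “in one smooth motion”; the digital version can only ever occupy one of the eight chosen levels, which is exactly the “absolute and discrete choices” Rushkoff describes.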

The digital realm is biased toward choice, because everything must be expressed in the terms of a discrete, yes-or-no, symbolic language. This, in turn, often forces choices on humans operating within the digital sphere. (55)

[via: In other words, binary languages create binary epistemologies. The solution, then, is not more persuasion via binary tools and technologies, but analog relationships.]

For something to be digital, it has to be expressed in digits. (56)

For instance, information online is stored in databases. A database is really just a bit–but the computer or program has to be able to be able to [sic] parse and use what’s inside the list. This means someone–the programmer–must choose what questions will be asked and what options the user will have in responding: Man or Woman? Married or Single? Gay or Straight? It gets very easy to feel left out. Or old: 0-12, 13-19, 20-34, 35-48, or 49-75? The architecture of databases requires the programmer to pick the categories that matter, and at the granularity that matters to his or his employer’s purpose. (57)
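The constraint Rushkoff describes is easy to see in any schema definition. In this hypothetical sketch (field names and categories are invented for illustration, not taken from any real system), whoever writes the schema decides which answers can exist at all:

```python
# Hypothetical profile schema: the programmer, not the user,
# decides which answers are even possible.
SCHEMA = {
    "marital_status": {"married", "single"},
    "age_bracket": {"0-12", "13-19", "20-34", "35-48", "49-75"},
}

def validate(record):
    """Accept a record only if every field uses a pre-approved category."""
    for field, allowed in SCHEMA.items():
        if record.get(field) not in allowed:
            raise ValueError(f"{field}: {record.get(field)!r} is not an option")
    return record

# Fits the boxes, so it is representable.
validate({"marital_status": "single", "age_bracket": "20-34"})

# Anyone aged 76, or anyone whose life doesn't map onto these
# categories, simply cannot be recorded -- the schema forces a choice.
try:
    validate({"marital_status": "it's complicated", "age_bracket": "76+"})
except ValueError:
    pass  # rejected: the database has no slot for this person
```

The point is not this particular validation function but the asymmetry it makes visible: the user’s only freedom is to pick from the programmer’s pre-chosen granularity.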

Choice stops us, requiring that we make a decision in order to move on. … Each option passed over is an opportunity cost–both real and imagined. The more choices we make (or are forced to make) the more we believe our expectations will be met. But in actual experience, our pursuit of choice has the effect of making us less engaged, more obsessive, less free, and more controlled. And forced choice is no choice at all, whether for a hostage forced to choose which of her children can survive, or a social network user forced to tell the world whether she is married or single. (58)

| Digital technology’s bias toward forced choices dovetails all too neatly with our roles as consumers, reinforcing this (58) notion of choice as somehow liberating while turning our interactive lives into fodder for consumer research. (59)

…choice is less about giving people what they want than getting them to take what the choice-giver has to sell. (59)

Meanwhile, we are always free to withhold choice, resist categorization, or even go for something not on the list of available options. You may always choose none of the above. Withholding choice is not death. Quite on the contrary, it is one of the few things distinguishing life from its digital imitators. (60)

IV. COMPLEXITY
You Are Never Completely Right

Although they allowed us to work with certain kinds of complexity in the first place, our digital tools often oversimplify nuanced problems. Biased against contradiction and compromise, our digital media tend to polarize us into opposing camps, incapable of recognizing shared values or dealing with paradox. On the net, we cast out for answers through simple search terms rather than diving into an inquiry and following extended lines of logic. We lose sight of the fact that our digital tools are modeling reality, not substituting for it, and mistake its oversimplified contours for (61) the way things should be. By acknowledging the bias of the digital toward a reduction of complexity, we regain the ability to treat its simulations as models occurring in a vacuum rather than accurate depictions of our world. (62)

But not everything is a data point. Yes, thanks to the digital archive we can retrieve any piece of data on our own terms, but we do so at the risk of losing its context. (64)

In the more immediate sense, facts devoid of context are almost impossible to apply sensibly. They become the fodder for falsely constructed arguments of one side or other of the social or political spectrum. The single vote of a politician is used to describe his entire record, a single positive attribute of caffeine or tobacco receives attention thanks to public relations funding, and a picture of a single wounded child turns public opinion against one side of a conflict rather than against war itself.

| Both sides in a debate can cherry-pick the facts that suit them–enraging their constituencies and polarizing everybody. In a digital culture that values data points over context, everyone comes to believe that they have the real answer and that the other side is crazy or evil. Once they reach this point, it no longer matters that the opposing side’s facts contradict one’s own: True believers push through to a new level of cynicism where if the facts are contradictory, it means they are all irrelevant. The abundance of facts ends up reducing their value to us. (65)

| As a result, we tend to retreat into tribes, guided primarily by our uninformed rage. And we naturally hunger for reinforcement. (65)

…we overvalue our own opinions on issues about which we are ill informed, and undervalue those who are telling us things that are actually more complex than they look on the surface. They become the despised “elite.” (66)

Ironically, perhaps, as our digital experiences make us more simple, our machines are only getting more complex. … While digital technology liberated us from our roles as passive spectators of media, their simplifying bias reduced us once again to passive spectators of technology itself. (68)

With each upgrade in technology, our experience of the world is further reduced in complexity. The more advanced and predictive the smart-phone interface, the less a person needs to know to use it–or how it even makes its decisions. Instead of learning about our technology, we opt for a world in which our technology learns about us. It’s our servant, after all, so why shouldn’t it just do what it knows we want and deliver it however it can? Because the less we know about how it works, the more likely we are to accept its simplified models (68) as reality. (69)

…our inability to distinguish between a virtual reality simulation and the real world will have less to do with the increasing fidelity of simulation than the decreasing perceptual abilities of us humans. (70)

V. SCALE
One Size Does Not Fit All

On the net, everything scales–or at least it’s supposed to. Digital technologies are biased toward abstraction, bringing everything up and out to the same universal level. People, ideas, and businesses that don’t function on that level are disadvantaged, while those committed to increasing levels of abstraction tend to dominate. By remembering that one size does not fit all, we can preserve local and particular activities in the face of demands to scale up. (72)

Survival in a purely digital realm–particularly in business–means being able to scale, and winning means being able to move up one level of abstraction beyond everyone else. (74)

The net has turned scalability from a business option to a business requirement. (74)

Because the net is occurring on a single, oversimplified and generic level, success has less to do with finding a niche than establishing a “vertical” or a “horizontal.” … In either case, “scaling up” means cutting through the entire cloud in one direction or another: becoming all things to some people, or some things to all people. (75)

…all media are biased toward abstraction in one way or another. By allowing us to describe events that had already taken place or those that had happened to other people, speech disconnected the doer from the deed. Text, meanwhile, disconnected speech from the speaker. Print disconnected text from the scribe, and the computer (78) disconnected print from paper. (79)

Language is an abstraction of the real world, where sounds represent things and actions. It requires a tremendous amount of agreement, so that the same words mean the same thing to different people. (79)

Finally, the digital age brings us hypertext–the ability for any piece of writing to be disconnected not just from its author but from its original context. Each link we encounter allows us to exit from a document at any point and, more importantly, offers us access to the bits and pieces of anyone’s text that might matter at that moment. In a universe of words where the laws of hypertext are truly in effect, anything can link to anything else. Or, in other words, everything is everything–the ultimate abstraction. (80)

…we have stuff, we have signs for stuff, and we have symbols of signs. What these philosophers feared was that as we came to live in a world defined more by symbols, we would lose touch altogether with the real stuff; we would become entranced by our simulated reality, and disconnect from the people and places we should care about. (84)

Just as the framers of the Constitution and the Talmudic scribes before them understood, abstract codes of laws are fine–so long as we’re the ones writing them. (84)

VI. IDENTITY
Be Yourself

Our digital experiences are out-of-body. This biases us toward depersonalized behavior in an environment where one’s identity can be a liability. But the more anonymously we engage with others, the less we experience the human repercussions of what we say and do. By resisting the temptation to engage from the apparent safety of anonymity, we remain accountable and present–and much more likely to bring our humanity with us into the digital realm. (85)

In a hostile, depersonalized net environment, identity is one’s liability. (88)

The way to dampen the effects of this problem is not to retreat into anonymity ourselves, but to make being real and (88) identifiable the norm. (89)

The less we take responsibility for what we say and do online, the more likely we are to behave in ways that reflect our worst natures–or even the worst natures of others. Because digital technology is biased toward depersonalization, we must make an effort not to operate anonymously, unless absolutely necessary. We must be ourselves. (89)

Living in a 7 percent social reality has real effects. As MIT researcher Sherry Turkle has discovered, teens online rarely if ever apologize to one another. When they are caught having wronged someone, they confess–but they never say they’re sorry. It’s as if the factual statement of guilt matters more than having any feelings about it. Sorrow goes out with the other 93 percent. [Sherry Turkle, Alone Together: Why We Expect More from Technology and Less from Each Other] (93)

Once we surrender to the status of the anonymous, our resentment at having to do so will seep into our posts. We become even less present than we are to begin with, less responsible for what we do, and less likely to sense the impact we are having on others. We become yet more anonymous actors in a culture where it’s hard enough not to antagonize the people we know–much less those with whom we interact namelessly and facelessly. (95)

VII. SOCIAL
Do Not Sell Your Friends

In spite of its many dehumanizing tendencies, digital media is still biased toward the social. In the ongoing coevolution between people and technologies, tools that connect us thrive–and tools that don’t connect us soon learn to. We must remember that the bias of digital media is toward contact with other people, not with their content or, worse, their cash. If we don’t, we risk robbing ourselves of the main gift digital technology has to offer us in return for our having created it. (96)

It turned out, content is not king–contact is. (99)

What all these social networking businesses keep getting wrong, however, is that the net is not becoming a social medium. It already is one. … Our digital networks are biased toward social connections–toward contact. Any effort to redefine or hijack those connections for profit ends up compromising (99) the integrity of the network itself, and compromising the real promise of contact. (100)

The real way to “go social,” if they wanted to, would not be to accumulate more page friends or message followers, but rather to get their friends and followers to befriend and follow one another. (101)

We value our increased contacts for what they might provide and miss the greater value of the contact itself. (104)

But it is this contact, this desire to construct a social organism together, that has been the driving force of digital technology all along. The instinct for increased contact is the evolutionary imperative we feel to become something greater than ourselves. Just as atoms combined into molecules, molecules clustered into cells, and cells collected into organisms, we organisms are networking into greater levels of organization. (104)

The content is not the message, the contact is. The ping itself. It’s the synaptic transmission of an organism trying to wake itself up. (105)

VIII. FACT
Tell the Truth

The network is like a truth serum: Put something false online and it will eventually be revealed as a lie. Digital technology is biased against fiction and toward facts, against story and toward reality. This means the only option for those communicating in these spaces is to tell the truth. (106)

So the peer-to-peer bazaar that almost brought down feudalism was dismantled, and feudalism evolved into what we now think of as corporate capitalism. Sadly, along with the peer-to-peer economy went peer-to-peer communication. Companies tried to replace what had been relationships between people with relationships to brands. Instead of buying your beer from Bob the brewer, you’d buy it from the officially sanctioned monopoly. The seal on the bottle was to substitute for whatever human relationship existed before. To make this transition work, brands turned to the sorts of mythologies still in use today. The Quaker on a package of oats has nothing to do with the grain in the box; he is a story. (109)

| As the Industrial Age gathered steam, more products–even more disconnected from their producers–needed to be (109) sold. Ad agencies developed powerful brands to camouflage the factory-based origins of most of what people consumed. Industrial agriculture became the valley of a green giant, and factory-made cookies became the work of little elves working in a hollow tree. Mass media arose to disseminate all of these new myths, utterly devoid of facts. And as long as media remained a top-down proposition, there was very little fact-based, peer-to-peer communication to challenge any of it. People were working hard on assembly lines or in cubicles anyway, no longer experiencing themselves in their multiple social roles simultaneously. They were workers on the job trying to earn a paycheck, and consumers at home relaxing to the mythological drone of mass media. (110)

| Digital technology broke this. (110)

| The fundamental difference between mass media and digital media is interactivity. Books, radio, and television are “read only” media. … Digital media, on the other hand, are “read-write.” …we are transitioning from a mass (110) media that makes its stories sacred, to an interactive media that makes communication mutable and alive. (111)

The original reason computers were networked to one another was so that they could share processing resources. This makes them biased toward peer-to-peer activity. Mass media respected only the law of gravity: The people with the presses or broadcast facilities dropped their myths down onto the masses. Digital media go up, down, and sideways. In a sense there is no longer any up or down at all, as each node in the network can receive the message or refuse it, change it or leave it alone, and delete it or pass it on. (111)

| We’re back in the bazaar. (111)

It’s hard for any company to maintain its mythology (much less its monopoly) in such an environment. As we transform from media consumers back to cultural communicators, we message one another seeking approval and reinforcement. Myths and narratives will always be deconstructed, and mistruths eventually corrected. The bias of our interactions in digital media shifts back toward the nonfiction on which we all depend to make sense of our world, get the most done, and have the most fun. The more valuable, truthful, and real our messages, the more they will spread and better we will do. We must learn to tell the truth. (112)

…on the net, mythologies fall apart and facts rise to the surface. (114)

Those who succeed as communicators in the new bazaar will be the ones who can quickly evaluate what they’re hearing, and learn to pass on only the stuff that matters. (116)

The way to flourish in a mediaspace biased toward nonfiction is to tell the truth. This means having a truth to tell. (117)

IX. OPENNESS
Share, Don’t Steal

Digital networks were built for the purpose of sharing computing resources by people who were themselves sharing resources, technologies, and credit in order to create it. This is why digital technology is biased in favor of openness and sharing. Because we are not used to operating in a realm with these biases, however, we often exploit the openness of others or end up exploited ourselves. By learning the difference between sharing and stealing, we can promote openness without succumbing to selfishness. (118)

Digital technology’s (120) architecture of shared resources, as well as the gift economy through which the net was developed, have engendered a bias toward openness. It’s as if our digital activity wants to be shared with others. As a culture and economy inexperienced in this sort of collaboration, however, we have great trouble distinguishing between sharing and stealing. (121)

We are living in an age when thinking itself is no longer a personal activity but a collective one. (124)

The real problem is that while our digital mediascape is biased toward a shared cost structure, our currency system is not. (129)

Participation is dependent on knowing both the programming code necessary to make valuable additions and the social codes necessary to do it in ways that respect the contributions of others. (133)

| Digital society may always be biased toward sharing, but a real understanding of the codes through which it has been built makes stealing a nonstarter. (133)

X. PURPOSE
Program or Be Programmed

Digital technology is programmed. This makes it biased toward those with the capacity to write the code. In a digital age, we must learn how to make the software, or risk becoming the software. It is not too difficult or too late to learn the code behind the things we use–or at least to understand that there is code behind their interfaces. Otherwise, we are at the mercy of those who do the programming, the people paying them, or even the technology itself. (134)

That’s right: America, the country that once put men on the moon, is now falling behind most developed and many developing nations in computer education. We do not teach programming in most public schools. Instead of teaching programming, most schools with computer literacy (135) curricula teach programs. (136)

Digital technology doesn’t merely convey our bodies, but ourselves. Our screens are the windows through which we are experiencing, organizing, and interpreting the world in which we live. (138)

They are fast becoming the boundaries of our perceptual and conceptual apparatus; the edge between our nervous systems and everyone else’s, our understanding of the world and the world itself. (139)

Programming is the sweet spot, the high leverage point in a digital society. If we don’t learn to program, we risk being programmed ourselves. (139)

So the people investing in software and hardware development sought to discourage this hacker’s bias by making interfaces more complex. The idea was to turn the highly transparent medium of computing into a more opaque one, like television. Interfaces got thicker and more supposedly “user friendly” while the real workings of the machine got buried further in the background. (141)

…we were told not to look behind the curtain. (141)

Our enthusiasm for digital technology about which we have little understanding and over which we have little control leads us not toward greater agency, but toward less. We end up at the mercy of voting machines with “black box” technologies known only to their programmers, whose neutrality we must accept on faith. We become dependent on search engines and smart phones developed by companies we can only hope value our productivity over their bottom lines. We learn to socialize and make friends through interfaces and networks that may be more dedicated to finding a valid advertising model than helping us find one another. (146)

In the long term, if we take up this challenge, we are looking at nothing less than the conscious, collective intervention of human beings in their own evolution. (148)

The less involved and aware we are of the way our technologies are programmed and program (148) themselves, the more narrow our choices will become; the less we will be able to envision alternatives to the pathways described by our programs; and the more our lives and experiences will be dictated by their biases. (149)

| On the other hand, the more humans become involved in their design, the more humanely inspired these tools will end up behaving. … As biologists now understand, our evolution as a species was not a product of random chance, but the forward momentum of matter and life seeking greater organization and awareness. This is not a moment to relinquish our participation in that development, but to step up and bring our own sense of purpose to the table. It is the moment we have been waiting for. (149)