It seems that one of the first things to strike many humanists critically examining computers, code, and technology is the fact that these digital systems are built on only two digits: 0 and 1. The technology that drives our society is built on binary. To anyone even remotely familiar with postmodern thinking, such a statement at first appears immensely striking and profound. After all, one of the key projects of postmodernism over the past thirty to forty years has been to examine the role binaries--male/female, gay/straight, true/false, left/right, us/them, to name just a few--play in society, and to deconstruct those binaries. Struck by this seeming congruence with postmodern thought, many humanists treat binary code as an essential feature of computing, suggesting that, because computers use binary, society itself is increasingly dependent upon binaries. Unfortunately, in doing so they drastically overestimate the significance of binary in computing, while missing what it really is: a choice driven by convenience and long use. Worse, this fixation on binary code not only rests on a conflation of two radically different uses of the word "binary"--as a notation, and as a socially constructed dichotomy--it also obscures far more serious issues with how computers work and how data is represented.
In fact, binary is not fundamental to computers at all; it is used for entirely practical and incidental reasons. Turing machines (the theoretical model of a computer) can be constructed to run on any finite set of characters. They can run on any numerical base, be it two, three, sixteen, or forty-two. They can use Roman letters, Chinese characters, or Egyptian hieroglyphs. We simply use base two because it is convenient--it lets us build hardware out of switches with only two states, on and off, which makes life much, much simpler for the hardware engineers. Furthermore, any number that can be represented in base two can be represented in any larger base.
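To make the point concrete, here is a small Python sketch (the function name `to_base` is my own, purely illustrative) showing the same number rendered in base two, three, and sixteen. The value itself never changes; only the notation does:

```python
def to_base(n, base, digits="0123456789abcdef"):
    """Write a non-negative integer in an arbitrary base (2 through 16 here)."""
    if n == 0:
        return digits[0]
    out = []
    while n:
        n, r = divmod(n, base)       # peel off the least significant digit
        out.append(digits[r])
    return "".join(reversed(out))

n = 42
print(to_base(n, 2))    # '101010'
print(to_base(n, 3))    # '1120'
print(to_base(n, 16))   # '2a'

# The notations are interchangeable: converting back recovers the same number.
print(int(to_base(n, 2), 2) == int(to_base(n, 16), 16))  # True
```

Nothing about the number 42 is "more binary" than it is ternary or hexadecimal; base two is just the notation our switches happen to make cheap.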
Has the choice of binary had an effect on the ways people think about and develop software? Almost certainly. One can imagine, for instance, that if we used ternary computers, we might have a ternary numerical compare operator in assembly and programming languages. But I fear that the significance of binary notation in computing has been dramatically overstated. In fixating on it, we miss a far more important fact about the way computers store data and run programs: modern computers are digital, and thus discrete. Digital computers cannot represent analog data in its original form; such information must be converted into discrete form by sampling it at periodic intervals. Anything that falls between the gaps or outside of the range either drops away entirely, or gets "rounded" one way or the other. Information is lost; to use the technical engineering term, it gets aliased.
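A quick sketch, using only Python's standard library, of what "falling between the gaps" means. If you sample too slowly, two genuinely different analog signals become indistinguishable--here, a 9 Hz tone sampled at 8 samples per second produces exactly the same readings as a 1 Hz tone. This is the classic aliasing effect (the numbers are my own illustrative choices):

```python
import math

fs = 8                                  # sampling rate: 8 samples per second
ts = [n / fs for n in range(8)]         # one second's worth of sample times

slow = [math.sin(2 * math.pi * 1 * t) for t in ts]  # 1 Hz sine wave
fast = [math.sin(2 * math.pi * 9 * t) for t in ts]  # 9 Hz sine wave

# Between the sample points the two waves look nothing alike, but at the
# sample points they coincide: the 9 Hz tone "aliases" down to 1 Hz, and
# the difference between them is lost forever.
print(all(abs(a - b) < 1e-9 for a, b in zip(slow, fast)))  # True
```

Note that this loss has nothing to do with base two; a ternary or decimal machine sampling at the same rate would lose exactly the same information.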
Practically speaking, this means that software developers are encouraged to think in terms of rigid categories (of which binaries are a subset) with definite criteria. You offer the user a choice between male/female for gender, because it is far easier for the machine to use that data than to interpret the user's response to a free-text field. You start thinking of how to sort people, beliefs, and ideas into broad categories with well-defined boundaries. The innumerable, fine, messy distinctions that are so important, so vital, so beautiful in human life get stripped away--especially anything that happens to lie near the interstices of these so-carefully-defined boundaries--because ultimately, the machine only understands numbers. Even when you quantify this data along some kind of sliding scale (as, for instance, the Political Compass test attempts to do for one's political leanings), those distinctions get abstracted away, boiled down to a number which is then passed through some hard filter that tells you whether you're very left-wing or only somewhat left-wing, and then quite helpfully puts you on a graph with Gandhi, Hitler, and George Bush.
Speaking more practically, anyone who has ever tried to express sarcasm over the Internet has invariably gotten some furious response about how they must be some kind of monster to seriously believe in, say, eating Irish infants. Alternatively, consider someone who argues facetiously that, say, the use of contraception should be punishable by public flogging--and gets a flood of earnest, enthusiastic posts agreeing. Why does this sort of thing happen? Because the analog nuance, the voice inflection, all the other means which we use to indicate that we are being sarcastic and not actually serious, cannot easily be represented as printed text. Some people even resort to <sarcasm> tags or large quantities of ~tildes~ (see, doesn't that make even the most ordinary word look like it's dripping sarcasm?) in an attempt to get the point across--and even this sometimes doesn't work.
These are very serious problems, but they aren't problems with binary. In fact, they have nothing to do with binary. They would be just as substantial with any kind of digital computer, no matter what numerical base the underlying architecture uses. These problems are inherent to any system that attempts to convert the analog into the digital, no matter how it is built. So please, if you find yourself tempted to casually drop the term "binary" into the next piece you're writing about technology, stop, think, and ask yourself: is the fact that the hardware represents data in 0s and 1s really relevant to the question you are addressing? Would the issue you're addressing be any different if we lived in an almost-identical alternate universe where computers ran on ternary instead? Would whatever you were writing sound as cool if, in that alternate universe, you were referring to the computer's ternary instead of its binary?
If the answer to the first two questions is yes, then the fact that the machine uses binary is clearly relevant. Otherwise, it probably isn't--and odds are, the answer to all three is "no".
Note: I've also written about this in this comments thread.