My involvement with Critical Code Studies

The discovery of this area of study opens up a whole new epistemological frame through which I can define what I have been looking at in various instances when dealing with the analytics of data and informational flows, much of which is derived from digital sources. At the same time, I want to caution that code is not the sole artifact of digital electronics and computers; it already existed in ancient mathematics, mechanics, astronomy, weave patterns and various message-sending devices. One can find as much code in the analog as in the digital. Moreover, precursors of digital codes were already visible in the supposedly pre-digital age, in calculating machines, the punch cards of analog computers, and chronometric devices. If we go even further back, we may see snippets of code in Egyptian hieroglyphs and Babylonian cuneiform wedges.

My entry into critical code studies has as much to do with computer code as it has nothing to do with it. I am primarily interested in ontology and in understanding what exactly ontology entails. This was the same interest that drove eminent physicists such as David Bohm to speculate beyond the limited possibilities that other physicists of his time were willing to attribute to quantum forms of knowledge. To do that, he started looking at the foundation of quantum mechanics, which was and still is all about statistics. That is when he also began to think about discrete forms of informational flows, and this is where Claude Shannon and his theory of information came to mind. In thinking about quantum-information systems, Bohm and Hiley introduced the idea of active information, whereby a very small amount of energy entering a system can direct a much greater energy. If we think about this in terms of computer code, it is exactly that. To reduce the processing power needed to produce the code that generates a lot of other data, there is a race to produce simpler, streamlined, neater, more elegant and minimal code, which translates into constructing a more powerful algorithm. A powerful algorithm derives from elegant-looking mathematical equations that may themselves have been distilled from pages of messy equations and theorems.

The more minimal the code, the harder it is to write, but the more powerful it becomes. This probably goes against the computer science notion that the more machine-like the code is (that is, the more intricate and machine-obfuscated), the more powerful it is for controlling the machine. That was probably true at a time when we were not processing the zillions of pieces of information that we process today.
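To make the analogy concrete, here is a minimal sketch of my own (not drawn from any particular physics or production code): a handful of lines of C in which one small parameter, like Bohm and Hiley's active information, steers an arbitrarily large stream of complex output.

```c
#include <stdio.h>

/* Illustrative sketch: the logistic map x -> r*x*(1-x). One short loop
 * and a single parameter r "direct" an arbitrarily long stream of
 * complex, chaotic data, loosely echoing the idea of a small active
 * input steering a much larger process. */
int main(void)
{
    double x = 0.5;        /* initial condition */
    const double r = 3.9;  /* the one small "informational" input */
    for (int i = 0; i < 1000; i++) {
        x = r * x * (1.0 - x);  /* tiny rule, rich output */
        printf("%f\n", x);
    }
    return 0;
}
```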

One of my prospective projects (I say prospective as I am not yet ABD, though in reality it is more of a certainty for me, barring other issues) is to study the data simulation of experiments relating to the Large Hadron Collider. To do that, I have to study both the data produced through collisions of mesonic, baryonic, fermionic, bosonic and ionic particles, and how the physicists conduct Monte Carlo simulations using various physics simulation engines by inserting predefined parameters that they have predicted through their calculations. The attempt to reconcile simulated (theoretically predicted) outcomes with raw data generated from 'nature' is a study of phenomena known as phenomenology (which is probably closer to Husserl's earlier definition of the term than to his later revisions or to Heidegger's). In understanding how phenomenology works, I venture, one comes closer to understanding the ontological world.
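For readers unfamiliar with the technique, here is a toy sketch of that workflow, entirely my own illustration rather than any actual LHC code: a single predefined, theory-supplied parameter (a hypothetical particle lifetime TAU) is fed into a Monte Carlo loop, and the simulated distribution is then checked against what theory says it should reproduce, just as simulated events are reconciled with raw collision data.

```c
#include <stdio.h>
#include <stdlib.h>
#include <math.h>

#define TAU      1.5e-12   /* hypothetical theory-supplied mean lifetime, seconds */
#define N_EVENTS 100000

int main(void)
{
    srand(42);  /* fixed seed for reproducibility */
    double sum = 0.0;
    for (int i = 0; i < N_EVENTS; i++) {
        /* draw uniform u in (0,1], then t = -TAU * ln(u) is an
         * exponentially distributed decay time (inverse-transform sampling) */
        double u = (rand() + 1.0) / ((double)RAND_MAX + 1.0);
        double t = -TAU * log(u);
        sum += t;
    }
    /* the simulated mean should recover the predefined parameter,
     * analogous to comparing simulated distributions against raw data */
    printf("simulated mean lifetime = %.3e s (input TAU = %.3e s)\n",
           sum / N_EVENTS, TAU);
    return 0;
}
```

(Compile with the math library linked in, e.g. `cc mc.c -lm`.)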

So, where does critical code come into this? Data, of course! When we gather data, whether from 'nature' or through 'artificially' generated means, they appear to us as codes, because we can only see them after they have been processed by an intermediary, a machinic creature, and what we see are alphanumeric codes. We try to decipher and make sense of them. The deciphering process and the end result of that process are equally important. How do we then create a methodology for historicizing and deconstructing the process? This is where I hope conversations in critical code studies with an interdisciplinary audience will help. We should look at code in juxtaposition with the larger picture in which it exists.
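As a small illustration of that machinic intermediation (again my own sketch, with made-up bytes standing in for instrument output), the same four raw bytes only become legible once a program renders them as alphanumeric codes, and different renderings yield different 'data':

```c
#include <stdio.h>
#include <string.h>

/* Illustrative sketch: assumes 32-bit unsigned int and IEEE-754 floats.
 * The same four (made-up) bytes are shown under three different readings. */
int main(void)
{
    unsigned char raw[4] = { 0x42, 0x28, 0x00, 0x00 };  /* hypothetical instrument bytes */

    /* reading 1: hexadecimal codes */
    printf("hex   : %02x %02x %02x %02x\n", raw[0], raw[1], raw[2], raw[3]);

    /* reading 2: ASCII characters ('B' and '(') */
    printf("ascii : %c%c\n", raw[0], raw[1]);

    /* reading 3: the bits reassembled as a big-endian float (42.0) */
    unsigned int bits = ((unsigned int)raw[0] << 24) | (raw[1] << 16)
                      | (raw[2] << 8) | raw[3];
    float value;
    memcpy(&value, &bits, sizeof value);  /* type-pun safely via memcpy */
    printf("float : %f\n", value);
    return 0;
}
```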

The HASTAC forum on Critical Code will be going live real soon, as the co-hosts continue brainstorming ideas to put on the table for discussion, argument and deconstruction.

Richard Mehlinger

The fastest code generally is *not* elegant

I'd just like to push back a bit against this statement: "To reduce the processing power needed to produce the code that generates a lot of other data, there is a race to produce simpler, streamlined, neater, more elegant and minimal code. The more minimal the code, the harder it is to write, but the more powerful it becomes. This probably goes against the computer science notion that the more machine-like the code is (that is, the more intricate and machine-obfuscated), the more powerful it is for controlling the machine."

In fact, if performance is essential, the code is probably going to be written in C or maybe C++--not exactly elegant languages, in my own humble opinion--and if performance is *really* vital, instead of leaving optimization solely to the compiler, programmers will optimize the code by hand. What does hand-optimized code look like? Well, in a word, ugly. Loops are unrolled into chunks of copy-pasted code, functions are in-lined, and bulky, precalculated tables replace space-saving, elegant, but time-intensive calculation. Thus, counterintuitively, the most elegant code probably isn't the fastest; in fact, under real-world conditions the ugly, hand-optimized code may well prevail.
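For concreteness, here is a small C sketch of my own (the function names and data are invented for illustration) showing two of those hand optimizations side by side: a bulky precalculated lookup table replacing a repeated trigonometric call, and a four-way unrolled accumulation loop.

```c
#include <math.h>

#define N 1024  /* assume N is a multiple of 4 for the unrolled version */
static const double PI = 3.14159265358979323846;

/* Elegant version: clear and compact, but recomputes sin() on every pass. */
double sum_sines_elegant(const int deg[N])
{
    double s = 0.0;
    for (int i = 0; i < N; i++)
        s += sin(deg[i] * PI / 180.0);
    return s;
}

/* Hand-optimized version: a precalculated table replaces the time-intensive
 * sin() call, and the loop is unrolled four-fold so the partial sums can
 * proceed independently. Faster under the right conditions, but uglier. */
static double sin_table[360];

void init_sin_table(void)
{
    for (int d = 0; d < 360; d++)
        sin_table[d] = sin(d * PI / 180.0);
}

double sum_sines_ugly(const int deg[N])  /* each deg[i] must lie in 0..359 */
{
    double s0 = 0.0, s1 = 0.0, s2 = 0.0, s3 = 0.0;
    for (int i = 0; i < N; i += 4) {
        s0 += sin_table[deg[i]];
        s1 += sin_table[deg[i + 1]];
        s2 += sin_table[deg[i + 2]];
        s3 += sin_table[deg[i + 3]];
    }
    return s0 + s1 + s2 + s3;
}
```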

clarissal

ugly code

Thanks for starting that conversation, Richard :) I probably was too vague when I talked about elegant code, and I do agree with your contention. As someone who is guilty of different iterations of loops when writing code, I often create a lot of 'ugly' and clunky code in the process. I wasn't thinking so much aesthetically as optimally when I used the word 'elegant' (probably even that adjective has to be rethought). I was thinking about creating the kind of 'minimal' code that will not get lost in a maze of unexpected 'errors', especially when one has to build extremely large programs (thinking of some versions of Windows); about streamlining and 'cleaning up' code as much as possible; about knowing how to account for each module and being able to hand-optimize another coder's work. But you are right: the fastest code can sometimes be the messiest, and coders are often more interested in getting the thing to work the way they want it at that moment. Exigency versus navigability. It'll be interesting to see what this discussion generates from others.

But even when one thinks about interfaciality, the 'ugly' Linux/Unix is a lot more powerful than any GUI-type OS. And how would their code compare?

clarissal

Critical Code Studies now Live!

There are two forums.
Main forum: http://www.hastac.org/forums/hastac-scholars-discussions/critical-code-s...

Sub-forum for actual code critiques: http://www.hastac.org/forums/hastac-scholars-discussions/code-critiques

I look forward to the creative intellectual energy!