The discovery of this area of study opens up a whole new epistemological frame by which I can define what I myself have been looking at in various instances when dealing with the analytics of data and informational flows, much of which is derived from digital sources. At the same time, I want to caution that code is not solely an artifact of digital electronics and computers; it already existed in ancient mathematics, mechanics, astronomy, weave patterns, and various message-sending devices. One can find as much code in the analog as in the digital. Moreover, some of these codes appeared in the supposedly pre-digital age, in calculating machines, the punch cards of early computers, and chronometric devices. If we go even further back, we may even see snippets of code in Egyptian hieroglyphs and Babylonian cuneiform wedges.
My entry into critical code studies has as much to do with computer code as it has to do with matters beyond it. I am primarily interested in ontology and in understanding what exactly ontology entails. This was the same interest that drove eminent physicists such as David Bohm to speculate beyond the limited possibilities that other physicists of his time were willing to attribute to quantum forms of knowledge. To do that, he started looking at the foundation of quantum mechanics, which was and still is all about statistics. That is when he also began to think about discrete forms of informational flows, and this is where Claude Shannon and his theory of information came to mind. In thinking about quantum-information systems, Bohm and Hiley introduced the idea of active information, whereby very little energy entering a system can direct much greater energy. If we think about this in terms of computer code, it is exactly that. To reduce the processing power needed to produce the code that generates a lot of other data, there is a race to produce simpler, more streamlined, neater, elegant, and minimal code, which translates into having to construct a more powerful algorithm. A powerful algorithm often derives from elegant-looking mathematical equations that may themselves have been distilled from pages of messy equations and theorems.
The more minimal the code, the harder it is to write, but the more powerful it becomes. This probably goes against the computer science intuition that the more machine-like the code is (which is to say, the more intricate and machine-obfuscated), the more power it gives over the machine. That was perhaps true at a time when we were not processing the vast quantities of information that we are today.
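To make the contrast concrete, here is a toy sketch of my own (not from the original discussion): two ways of summing the integers 1 through n. The loop is the "machine-like" version, grinding through every step; the closed form is Gauss's formula n(n+1)/2, a minimal expression distilled from mathematical insight, which does in one operation what the loop does in n.

```python
def sum_loop(n):
    """Machine-like version: brute-force iteration, one step per integer."""
    total = 0
    for i in range(1, n + 1):
        total += i
    return total

def sum_closed_form(n):
    """Minimal version: Gauss's closed form n(n+1)/2, one arithmetic step."""
    return n * (n + 1) // 2

# Both agree on the answer; only the amount of machine work differs.
print(sum_loop(1000))         # 500500
print(sum_closed_form(1000))  # 500500
```

The point of the sketch is that the "harder" code to arrive at is the shorter one: the elegance is bought with the mathematical derivation behind it, not with more instructions.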
One of my prospective projects (I say prospective as I am not yet ABD, but in reality it is more of a certainty for me, barring other issues) is to study data simulation of experiments relating to the Large Hadron Collider. To do that, I have to study both the data produced through collisions of mesonic, baryonic, fermionic, bosonic, and ionic particles, and how the physicists conduct Monte Carlo simulations using various physics simulation engines by inserting pre-defined parameters that they have predicted through their calculations. The attempt to reconcile simulated data (theoretically predicted outcomes) with raw data generated from 'nature' is a study of phenomena, known as phenomenology (which is probably closer to Husserl's earlier definition of the term than to his later revisions or to Heidegger's). In understanding how phenomenology works, I venture that one comes closer to understanding the ontological world.
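The Monte Carlo engines physicists actually use are large dedicated systems, but the underlying idea can be sketched in a few lines. The classic toy example below is my own illustration, not anything from the LHC workflow: estimate π by drawing random points in the unit square under a pre-defined parameter (the sample count and a fixed random seed), then comparing the sampled result against the theoretically predicted value.

```python
import math
import random

def estimate_pi(n_samples, seed=42):
    """Monte Carlo sketch: sample random points in the unit square and
    count the fraction landing inside the quarter circle of radius 1.
    That fraction approximates pi/4."""
    rng = random.Random(seed)  # pre-defined parameter: a reproducible seed
    inside = 0
    for _ in range(n_samples):
        x, y = rng.random(), rng.random()
        if x * x + y * y <= 1.0:
            inside += 1
    return 4.0 * inside / n_samples

simulated = estimate_pi(100_000)
predicted = math.pi
print(simulated, predicted)  # the simulated value hovers near the prediction
```

Reconciling `simulated` with `predicted` here is a miniature of the phenomenological move described above: the random sampling stands in for 'nature', the closed-form π for theory, and the comparison between them is where the interpretive work happens.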
So, where does critical code come into all this? Data, of course! When we gather data, whether from 'nature' or through 'artificially' generated means, they appear to us as codes, because we can only see them after they have been processed by an intermediary, a machinic creature, and what we see are alphanumeric codes. We try to decipher and make sense of them. The deciphering process and the end result of that process are equally important. How do we then create a methodology for historicizing and deconstructing that process? This is where I hope the conversations in critical code studies with an interdisciplinary audience will allow me to do so. We should look at the code in juxtaposition with the larger picture in which the code exists.
The HASTAC forum on Critical Code will be going live very soon, as the co-hosts continue brainstorming more ideas to put on the table for discussion, argument, and deconstruction.