Minor Project Mid Term Mulling Over
My project began with an initial concern with how so many of the representations of knowledge gleaned from data mining databases are presented to us in visual form – so that our innate visual pattern-matching preponderance can be harnessed to make the process of finding interesting patterns in a dataset even easier. This troubled me, as it added another level of abstraction to a process which already deals in several abstractions from the physical reality of which the databases are the digital residue. These abstractions then carry the force of authority as knowledge, and hold sway over the reality from which they originated.
So my initial focus was on making the data contained in databases apprehensible to senses other than vision, in order to see whether the pattern-discerning faculties of those senses might be employed to navigate the info glut.
Following from this, a lot of my early research centred upon accounts of cultures which are not as ocular-centric as ours, and upon consulting scientific research into how well our other senses are deemed to perceive patterns. The former part of this research dovetailed nicely with my embodiment and experience work.
I was able to find general scientific consensus on the pattern-discerning properties of vision; I found far less empirical support for the idea that hearing might proceed in a similar manner.
With hindsight I can understand this recourse to neuroscience as a fairly standard urge of mine to situate any research in the realm of what is known, understood to be real, and quantifiably evidenced – in other words, a fallback to a way of working with, and understanding, a problem that is within my comfort zone: the empirical path.
Looking back I can see that this approach was a little misguided: I was not setting out to prove the validity or feasibility of using other senses to make databases intelligible, but rather to provoke people to think about why so much database-produced knowledge comes to be represented to them via visual aids.
The kernel of this realisation was distilled by a friend of mine studying neuroscience, who commented on my idea: “you may have difficulty in demonstrating any advantage in trying to portray such large and detailed qualities of information through a sense other than the visual, as the transformation of the information for detection through a different sense may necessitate simplifying that information”.
Aside from illustrating that I needed to work more on communicating my concepts to the people from whom I was seeking aid, it also got right to the core of what is at stake with databases. Databases, by their very nature, necessitate a simplification (and trivialisation) of the raw information (i.e. lived, ‘real’, life), and yet there seems to be not enough consideration paid to this simplification when it comes to appreciating the knowledge produced at the other end of the KDD process. (I must stress that within most KDD processes painstaking attention is paid to the fidelity of the information utilised; however, making sure the data covers enough bases and categories to produce novel and profit-turning information is not the same as paying consideration to what the database conceptual machine elides once it is set in motion.)
I hadn’t as much scope to get into the realm of KDD (knowledge discovery in databases) within my MAIM essay as I’d desired, so I took this opportunity to do more research on it, as well as getting to grips with the gritty business of establishing and querying a relational database via MySQL. It was the KDD research which was most crucial at this juncture, for I managed to find some explanatory lectures on the most basic knowledge discovery algorithms. These lectures made clear that the means by which KDD operates are, at a philosophical level, different to how SQL works. When you use SQL you are querying the dataset, implying you have an idea of what you are looking for, at least in terms of which tables and columns you should be accessing. KDD positions itself a little differently: its algorithms proceed via ‘interestingness criteria’.
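The distinction can be sketched in miniature. Below is a hedged toy example (the `baskets` table and its data are invented): the SQL path asks a question the user already has in mind, while the KDD path scans every candidate association rule and keeps whatever clears a confidence-style ‘interestingness’ threshold, with no question posed in advance.

```python
import sqlite3
from itertools import combinations

# --- The SQL path: you must already know what you are asking for. ---
con = sqlite3.connect(":memory:")
con.execute("CREATE TABLE baskets (basket_id INTEGER, item TEXT)")
rows = [(1, "bread"), (1, "butter"), (2, "bread"), (2, "butter"),
        (3, "bread"), (3, "jam"), (4, "butter")]
con.executemany("INSERT INTO baskets VALUES (?, ?)", rows)
# The query presupposes a hypothesis: "how many baskets contain bread?"
n_bread = con.execute(
    "SELECT COUNT(DISTINCT basket_id) FROM baskets WHERE item='bread'"
).fetchone()[0]

# --- The KDD path: no hypothesis, just an interestingness threshold. ---
baskets = {}
for bid, item in rows:
    baskets.setdefault(bid, set()).add(item)
rules = []
for a, b in combinations({i for _, i in rows}, 2):
    for x, y in ((a, b), (b, a)):
        with_x = [s for s in baskets.values() if x in s]
        conf = sum(1 for s in with_x if y in s) / len(with_x)
        rules.append((x, y, conf))  # rule "x implies y" with its confidence
interesting = [(x, y, c) for x, y, c in rules if c >= 0.6]
```

The mining half surfaces rules like “jam implies bread” without anyone having thought to ask about jam – which is exactly the philosophical gap the lectures pointed at.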
Thankfully this new appreciation of a dissonance, if not quite an opposition between SQL and KDD, did not force me to throw an early idea on the scrapheap.
One of the first ideas that occurred to me while reading scientific literature on the senses was to tap into the pattern-detecting abilities of our ear, given how attuned to music and rhythm each one of us is. This link from pattern to music cross-fertilised with an interest in the Incan Quipu (the common link being that I thought that the way my headphones always get tangled could be perceived as a form of unconscious data organising) to produce a project idea whereby:
the user is faced with a set of speakers (playing cacophonous noise) and an audio interface whereby she can plug in headphone jacks (in the initial vision they were still connected to tangled headphones). The noise coming out of the speakers represents the sum total of data in a table playing row by row. As headphones are plugged in, only certain columns are selected. The user can play with the audio set-up until she finds a noise output that is somewhat pleasing, and then she can see the SQL query to which said noise relates.
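The jack-to-query mapping can be sketched very simply. This is a hypothetical sketch, not the built piece: the table name, column names and the crude value-to-sample scaling are all my own assumptions.

```python
# Hypothetical: each jack socket corresponds to one column of the table;
# the set of plugged sockets determines both what feeds the speakers and
# the SQL query shown to the user afterwards.
COLUMNS = ["age", "income", "postcode", "purchases"]  # invented columns

def query_for(plugged):
    """Build the SELECT that the current jack configuration represents."""
    cols = [c for i, c in enumerate(COLUMNS) if i in plugged]
    return "SELECT {} FROM customers".format(", ".join(cols) or "*")

def rows_to_samples(rows):
    """Naive sonification: wrap each numeric value into a signed sample
    in [-1.0, 1.0), played back row by row."""
    return [((v % 256) - 128) / 128.0 for row in rows for v in row]

q = query_for({0, 3})  # jacks 0 and 3 plugged in
```

With nothing plugged in, everything plays (`SELECT *` – the cacophony); each jack narrows the noise and, with it, the query the listener ends up having ‘written’ by ear.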
This initial project was meant simply to be a reflection on Edgar Codd’s belief that SQL “should protect the user from knowing how the data was organised in the database”. I felt the Quipu was a nice link given it’s another means of organising data, but in a hierarchical manner – the same mode of organisation that SQL and relational databases made redundant.
However, this project also worked in the context of a commentary upon KDD: in this scenario the user is using her ears to hear what she thinks is interesting, and then she can observe the data that corresponds to what she deemed interesting (for all that matters, it could be utterly redundant).
Recently I realised that this project is unconsciously indebted to John Searle’s Chinese Room thought-experiment rebuttal of “strong AI”.
After establishing the feasibility of this project (certainly possible to do, in Alan’s estimation) I talked to Graham, and he pushed me back towards a more experimental methodology, encouraging me to enjoy the investigation. I am grateful for this nudge, because looking back on the audio experiment I can see that I was falling into the same manner of working which I had intended to break from by being part of non-group last term. I adopted two more lo-fi and abstract ways of exploring sensory avenues into databases – contact mic-ing the server, and looking into ways to input and extract data from databases via breathing. As a trade-off I decided to keep plugging away at the data-aural project while focusing efforts on these new avenues. This was useful, as the data-aural project was already at the stage where a heavy amount of code was going to be required to check its feasibility.
By contrast research into the breath and contact mics was immediately more hands on; a sort of ‘do it live’ experimentation ethos.
This worked well with the contact mic – I got my external hard drive mic’d, and was content with the sound produced, but I realised that I would need to get my hands on a PC tower to truly contact-mic the whole of the PC and hear what a database server ‘sounds’ like.
Trying to find something that would register exhaled breath was a degree more difficult. Two solid days of experimenting with old PC fans as a means of detecting the speed of exhaled breath yielded very little; with Alan’s assistance I realised I would need a brushed or ball-bearing fan in order for some voltage to be detected (i.e. using it as a reverse dynamo). As a stop-gap I set up a circuit using an old bog roll, an LDR and the fan to see if some sort of reading could be garnered from the spinning of the fan blades. Results could be obtained, but the circuit was so rough and ready that some serious modifications would be required. However, spending so long working with fans planted a lot of seeds for ideas in my head.
Switching from working experimentally to working steadily on the audio interface made me realise that I was getting something from the tinkering experimentation that I wasn’t getting from the audio interface. In effect the audio interface was a fully formed concept; all I was gaining from further work on it was an appreciation of the difficulties involved in executing it (though these are certainly important things to learn), and no interesting observations could be gleaned from it until it was actually realised as a final product.
Working in an experimental way also enabled me to follow a “path of least resistance” with the audio interface too – after Graham suggested sending the data straight to the sound card as a means of ensuring the least possible amount of abstraction, I did some playing with output using Audacity in conjunction with command line output, to see if it matched what I envisioned for the project. It didn’t, and I will elaborate upon this later.
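For reference, Audacity can open headerless audio via File > Import > Raw Data, which makes it a quick way to audition table rows as sound without writing a playback engine. A minimal sketch of that kind of experiment (the row data, file name and 16-bit scaling are my assumptions, not the actual pipeline used):

```python
import struct

def rows_to_pcm(rows, path="table.raw"):
    """Pack each numeric field of each row as a little-endian signed
    16-bit sample, clamped to the int16 range. The resulting headerless
    file can be auditioned in Audacity via File > Import > Raw Data
    (Signed 16-bit PCM, little-endian)."""
    with open(path, "wb") as f:
        for row in rows:
            for v in row:
                f.write(struct.pack("<h", max(-32768, min(32767, int(v)))))
    return path
```

Because the data goes to disk untouched apart from clamping, what you hear is about as close to the raw table contents as a sound card will allow.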
While the toilet roll LDR RPM counter worked, I did desire a more finessed solution, and I identified a DIY sip-and-puff device or a hacked peak flow meter as likely candidates. It was while researching these ideas in Newcastle that my mind became preoccupied with switching the breathing interface from exhalation to inhalation (in correlation with the amount of cultural objects we inhale to enjoy). Alan again had rough and ready solutions for that – but ones that would not be easily implemented. As a return to DIY exhalation Graham suggested the USB weather wave; I duly disassembled it, but the inner parts looked like they would be too much work to reassemble into something that would register exhalation easily.
The next solution was an old serial mouse: this is the best candidate to date, as it picks up exhalation quite easily – the only hitch is getting at the X/Y coordinates within the computer. I am excited about using a mouse in a new manner to interface with a PC, given its colossal status as an iconic HCI device.
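Getting at those coordinates is mostly a matter of decoding the mouse’s movement packets. A PS/2-style mouse reports movement in three-byte packets: a flags byte (whose bits 4 and 5 carry the X and Y sign), then the X and Y deltas. A hedged sketch of pulling ‘puff’ events out of such a stream – the threshold, and the idea of treating movement magnitude as breath strength, are my assumptions:

```python
def decode_packet(b0, b1, b2):
    """Decode one 3-byte PS/2-style mouse packet into (dx, dy).
    Bits 4 and 5 of the flags byte are the X and Y sign bits."""
    dx = b1 - 256 if b0 & 0x10 else b1
    dy = b2 - 256 if b0 & 0x20 else b2
    return dx, dy

def exhalation_events(stream, threshold=3):
    """Treat large movement magnitudes as 'puffs' – assuming breath on
    the rigged mouse jostles its optics/ball enough to register."""
    for i in range(0, len(stream) - 2, 3):
        dx, dy = decode_packet(stream[i], stream[i + 1], stream[i + 2])
        if abs(dx) + abs(dy) >= threshold:
            yield abs(dx) + abs(dy)

# On Linux the raw packet stream is exposed at /dev/input/mice, e.g.:
# with open("/dev/input/mice", "rb") as mice:
#     for puff in exhalation_events(mice.read(300)):
#         print(puff)
```

An old serial (RS-232) mouse frames its bytes differently, but the principle – signed deltas recovered from a small fixed-size packet – is the same.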
In some ways the two paths I’ve been following illustrate two methods of working. With one I had a fully realised end goal that I had to work towards, and would need to acquire coding competency to get there. With the other I had a very nebulous idea, and the extra time spent tinkering with various means of actually achieving an exhalation interface to a computer (never mind a database) provided me with other interesting notions I would like to interrogate. This means that I am not learning code in a manner that will make me competent in it, but equally it means I am finding other ways to do things; more importantly, I am learning to do things simply, by breaking them down into easier paths if the code doesn’t work out. I still find it maddeningly frustrating when I waste hours working out something that someone who has the fundamentals down would twig fairly quickly. However, I am now resigned to knowing that it would take too much time to get up to coding scratch in the languages I need, and I feel this time is better spent tinkering, as I seem to arrive at more interesting ideas that way – and for me, right now, learning to think creatively and outside the box is more important than acquiring rudimentary coding skills.
That being said, I am a little uncertain about where to take this project next:
The audio KDD interface has scalability issues – while I think I can get queries linking two or more tables easily enough, I do worry about how feasible it will be to translate the knobs into “select ?? from table where column = <quantity as defined by analog input from arduino>”.
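One way to keep that translation manageable is to whitelist the selectable columns and bind the analog reading as a query parameter, rather than splicing the Arduino value into the SQL text. A sketch under assumptions (sqlite3 standing in for MySQL, and an invented `readings` table):

```python
import sqlite3

con = sqlite3.connect(":memory:")
con.execute("CREATE TABLE readings (sensor TEXT, value INTEGER)")
con.executemany("INSERT INTO readings VALUES (?, ?)",
                [("a", 3), ("b", 7), ("a", 7)])

COLUMNS = ["sensor", "value"]  # hypothetical whitelist of selectable columns

def knob_query(select_idx, analog_value):
    """Map a selector knob position to a whitelisted column, and pass the
    Arduino's analog reading in as a bound parameter ('?'), never as
    pasted-in SQL text."""
    col = COLUMNS[select_idx % len(COLUMNS)]
    sql = f"SELECT {col} FROM readings WHERE value = ?"
    return [r[0] for r in con.execute(sql, (analog_value,))]
```

The `%` wrap means any knob position maps to some legal column, so the hardware can never produce a malformed query – only more or less interesting ones.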
Furthermore, there is the issue of the audio sent to the soundcard. The sound as it can be heard now is the result of Perl translating Arduino input into SQL queries, and the result is not quite the overlayering effect desired.
The breath interface still needs to be taken from receiving input from the mouse to storing values in the database as records. Then there is the matter of configuring the mouse to move through the database – and also of getting the database to blow air back at the user.
I envision the contact mics working alongside one or both of the above projects; in an ideal world the contact mics would pick up the sound of both these databases being interacted with.
And there are a number of abandoned ideas that could still be rehashed or experimented with further:
- A database which animates a stroboscope: as a more abstract way of databases creating patterns for our visual sense
- A utilisation of the static of old PC monitors and databases
- Something with piezos and jawbone conduction