Way back when, my undergraduate class was treated to a guest speaker from the Darmstadt University of Applied Sciences. Our guest – whose surname I have regrettably forgotten – introduced us to technology hype. The following week, Michael Loftus (@ml_loftus), Head of the Faculty of Engineering and Science at CIT, spoke to us on the same topic. Before turning to the education sector, Michael was a director at Gartner Consulting – the name in IT strategy. Needless to say, he was very familiar with the theory behind Gartner’s hype cycle. For a group of technophiles nearing the end of their four-year stints as undergrads, understanding this cycle was a valuable addition to our knowledge. We went into those lectures thinking that the latest technology was unquestionably the greatest technology, and emerged with a far more informed view. That view, in a nutshell, is that one must do one’s best to avoid the inflated expectations that surround any emerging technological trend.
We aren’t just living in the digital age – we are living in the era of technology hype. Buzzwords are being tossed about with far too much ease, and, worse still, people are buying into the hype these terms create. Take “cloud computing”, for example. Cloud computing arrived on the scene like some sort of electronic messiah, sent to free us from the supposed shackles of localised computing. I remember when I first heard the term. I excitedly looked up the early definitions that accompanied the cloud revolution, and thought, with considerable disappointment: “hmmm, how’s this different to what we’ve been doing already?” Granted, a number of more recent definitions do successfully distinguish what we have now from what we had before, but in general, most definitions of cloud computing could be applied to internetworking technology in the broadest sense. Yet everyone continues to speak about “the cloud” as if it were the ARPANET reincarnated (maybe it is).
Mobile computing is, to my mind at least, the most recent offender as far as technology hype is concerned. There appears to be a sustained push towards developing for mobile platforms – a constant focus on mobile as “the future”. Perhaps that is true of domestic usage, but in industry and academia it never will be. Take my own daily routine as an example, which would be typical of your average doctoral candidate. I read, I take notes on what I’ve read, and, if I’ve something important to say (which is less frequent than it should be), I do some writing. As a digital humanist concerned with stylistics, I do some light coding and data crunching as well. Sure, my iPad is great for reading text that I don’t have in printed form, or for running slides during a presentation, and my Android phone lets me respond to emails anywhere, at any time. But if any of these devices were removed from my life, would my research be affected? No – it wouldn’t. You can’t write a thesis chapter on a tablet – well, you could, but not very efficiently – nor can you write any decent amount of code on your phone. When I’m doing electronic textual analysis, I need proper hardware to get through the heavy lifting.
Don’t misunderstand me here: I’m not saying that mobile devices are merely expensive toys. Being deprived of a proper array of ports does make certain products feel like toys (yes, I know, here comes “the cloud” to save the day), but I do see the scholarly potential of such technology. That potential, however, will always be secondary to those clunky, immovable machines that weigh down your desk and cripple your lap: it’s there that the writing, the analysis that precedes it, and the resource-intensive activities that accompany research will continue to reside. At least for the foreseeable future.