Is the Internet making us stupid? In his latest book, Nicholas Carr suggests that, at the very least, it may be changing our thinking patterns. In The Shallows, he cites a UCLA study in which several seasoned web users were asked to conduct Google searches alongside several web neophytes. Scans showed that the two groups' brains fired differently, particularly in the dorsolateral prefrontal cortex, which is associated with decision-making. The phrase 'this is your brain on Google' springs to mind.
It was further shown that after a relatively short period of Internet use, the brains of the Internet 'newbies' changed to match those of the web veterans, indicating, according to Carr, that it is relatively easy to change the very physical act of thinking through short-term exposure to the Internet. Now, consider the hundreds of millions of us who sit at a desk for hours each day doing nothing but hypertext-based work. Moreover, reflect on how early our schoolchildren are exposed to these new technologies. I was recently informed, to my horror, that my five-year-old would not be taught cursive handwriting because it was no longer deemed relevant. When Apple coined the phrase 'think different', I doubt it meant us to go this far.
Carr suggests that we are beginning to think more broadly, instead of deeply. Cue the by-now hackneyed arguments about modern students' inability to read a full-length novel, and the attention deficit disorder that plagues the average knowledge worker who is torn between a panoply of hyperlinked documents each day.
Such arguments may well be true, but they are also boring and overcooked. The real meat of the debate lies in whether this switch to broader thinking is a good thing. Or whether, indeed, one has to choose at all: perhaps it is possible to maintain both depth and breadth by compartmentalising our use of online technologies and devoting time to more meditative activities.
One of the most interesting reflections on this argument appears in the London Review of Books. In his review of The Shallows, Jim Holt says that while it may indeed be possible eventually to augment our own 'postcode' memory (the part of our brain into which we cram facts and figures) with search engines, there may be some unhappy side-effects.
Ostensibly, it would be more productive to wire an Internet connection directly into our heads (something which will surely be doable within a couple of decades) and use it as a form of extended memory. Why bother remembering when William Howard Taft was US president, when you could simply think a search query and have the data returned to you?
However, things get sticky when one considers associative memory. This, as Holt points out, is the fountainhead of creativity. It is the landscape in which metaphors emerge, and it is the filter through which big ideas ultimately trickle.
Holt points to the French mathematician Henri Poincaré, who would immerse himself in facts and theory for days without reaching a conclusion, but who would then experience a sudden epiphany in mathematical theory while stepping onto a bus. Poincaré concluded that soaking himself in ideas and facts enabled his unconscious memory to process them in ways that led to creative results, which appeared when he wasn't even thinking about mathematics.
This is something that computers can't do, Holt says, warning us against throwing the creative baby out with the bathwater. If we 'outsourced' our postcode memory to the Internet, would we eliminate creativity by stifling the subconscious processing of associative ideas?
There is another potential future, of course. Holt is a self-professed late adopter who doesn't have his finger on the technological pulse. Computers might well be able to learn how to process ideas using a simulated form of associative memory, after all. The industry is already working hard on semantically linked information storage, in which concepts are wired directly into the data representing them. Semantic search, in which the search engine understands the ideas you are looking for, has long been a holy grail for the search business, and we are getting closer. Perhaps, in the future, the Internet might have its own 'Eureka moments', without our help? What would that form of digital creativity look like?
Last week I attended QCon London, a conference focused on enterprise development that spans multiple technologies, including Java, .NET, open source, databases, mobile, and general development methodologies.
It is among my favourite conferences thanks to its vendor neutrality and the high quality of the speakers and attendees it attracts. Nevertheless, only a tiny fraction of developers make it to QCon. Vendor events like Microsoft TechEd or Oracle OpenWorld/JavaOne are bigger, and great for keeping up with what that vendor is doing, but they tend to be less thought-provoking because the content is steered by what one company wants to promote. Even so, only a small minority of developers make it to events of either kind.
There are lots of reasons for staying away: they are too expensive, or your company will not make the time available, or you are not convinced that there is enough signal above the noise, or you are too busy simply keeping up with the work you have already. Are conferences an unnecessary and costly distraction?
My view is the opposite. Of course, I am a journalist and it is my job to track what is new; but I do some development as well, and I feel there is a real risk of falling into a safe pattern of work that makes us blind to new ideas, ideas that can quickly repay, in productivity or quality, the time spent learning about them.
It is also remarkable how a good event can revive enthusiasm for the craft of writing software, something easily lost in the humdrum world of requirements, deadlines and sometimes dysfunctional corporate structures.
As for QCon, there were several things I came away with, though bear in mind that there were six tracks, so any one person could attend only a sixth of what was on offer, keynotes aside.
I spent some time on the mobile track. A session by Fraser Speirs on the Apple iPad in education was irritatingly Apple-centric, but also stimulating in showing how a new model of computing can bring about profound and beneficial changes. I think we will see a lot of iPads in business computing too.
Jerome Dochez spoke on the future of Java EE; not the most exciting session, but helpful in showing how Java is embracing the cloud computing model.
Google's Patrick Copeland spoke on innovation at Google, with the underlying question being how to create a culture that is friendly rather than hostile to innovators and their ideas.
At the .NET State of the Art track I learned about creating RESTful services, both with the open source OpenRasta and with the official WCF Web APIs; I had not been paying attention to this area, and it was an example of how attending a conference can highlight existing and important developments that you might otherwise miss. Of course, this being QCon, the speakers were OpenRasta's author Sebastien Lambla and, for WCF, Microsoft's Glenn Block: exactly the people you would want to hear on these subjects.
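The core idea those sessions covered, mapping URIs to resources and returning representations of them, can be sketched in a few lines. The following is a minimal, framework-free illustration in Python rather than OpenRasta or WCF code, using a hypothetical 'sessions' resource:

```python
from http.server import BaseHTTPRequestHandler, HTTPServer
import json

# Hypothetical in-memory resource store; a real framework such as
# OpenRasta or the WCF Web APIs maps URIs to resources in a similar spirit.
SESSIONS = {"1": {"title": "RESTful services", "track": ".NET State of the Art"}}

class SessionHandler(BaseHTTPRequestHandler):
    def do_GET(self):
        # The URI identifies the resource: GET /sessions/1 returns its JSON.
        parts = self.path.strip("/").split("/")
        if len(parts) == 2 and parts[0] == "sessions" and parts[1] in SESSIONS:
            body = json.dumps(SESSIONS[parts[1]]).encode()
            self.send_response(200)
            self.send_header("Content-Type", "application/json")
            self.end_headers()
            self.wfile.write(body)
        else:
            self.send_response(404)
            self.end_headers()

    def log_message(self, *args):
        # Silence per-request logging for this sketch.
        pass

# Usage: HTTPServer(("localhost", 8000), SessionHandler).serve_forever()
```

The point of the sketch is the uniform interface: the client needs to know only the URI and the HTTP verb, which is what makes REST services so loosely coupled.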
On the last day at QCon I got best value from the NoSQL track. A simple example in a talk on graph databases and Neo4j, where the database needed to model social connections and answer questions like "Who are this person's friends, and the friends of his friends?", convinced me that SQL relational databases are not the answer to every kind of data storage problem. Note that NoSQL stands for "Not only SQL" rather than "Never SQL"; you should choose the right data store for what you are storing.
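The friends-of-friends question is really a traversal problem rather than a join problem, which is what makes it awkward in SQL and natural in a graph store. A minimal sketch, in plain Python with hypothetical names and adjacency sets rather than Neo4j's Cypher query language:

```python
# Hypothetical social graph as adjacency sets. In Neo4j this would be
# nodes joined by FRIEND relationships, queried with roughly:
#   MATCH (p {name: 'alice'})-[:FRIEND*1..2]-(f) RETURN DISTINCT f
friends = {
    "alice": {"bob", "carol"},
    "bob": {"alice", "dave"},
    "carol": {"alice", "eve"},
    "dave": {"bob"},
    "eve": {"carol"},
}

def friends_of_friends(person):
    """Direct friends plus friends-of-friends, excluding the person."""
    direct = friends.get(person, set())
    second = set()
    for f in direct:
        second |= friends.get(f, set())
    return (direct | second) - {person}

print(sorted(friends_of_friends("alice")))  # ['bob', 'carol', 'dave', 'eve']
```

Extending this to third-degree connections in SQL means another self-join per hop, while a graph database simply follows edges, which is why social data was the talk's showcase.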
The Guardian's Matt Wall described how the Guardian web site is migrating from Oracle to MongoDB, giving the rationale and describing the benefits. I had never looked at MongoDB before, and it was a fascinating talk.
At a high level, QCon has its roots in Agile development methodology; and if you study this you find that much of it boils down to fostering communication between all of a project's stakeholders (not just developers). If you came away with one good idea for improving communication in your own organisation, the whole event may well have been worthwhile.
It does not have to be QCon. My point is that going out, talking to your peers, and getting this kind of input is enormously worthwhile, even in busy or economically testing times.