Hidden in plain sight
Our failure to acknowledge that science is spurred by the same creative impulse that fuels the arts does a disservice to future scientists and to society as a whole, argues Geraint Wiggins
Creativity is one of the things that makes us human. While some other species, such as corvids (the family of birds that includes magpies, crows and ravens), demonstrate the capacity to reason creatively in the search for food, humanity is the only species known to create for abstract pleasure alone, and the only species known to demonstrably reflect on its own creativity, producing theories of art, music, architecture and design. Alongside that reflection lies admiration: "great creators" seem to be admired in all cultures, even in those, like Stalin's USSR, that suppress individual difference.
Creativity, then, is very precious to us. And we can sometimes be very precious about it. My research aims to understand human creativity by means of computational cognitive modelling. When giving talks outside my usual audience, I've learned to start off by making it clear that I don't aim to replace human creators, and follow that up by making the case that understanding something doesn't render it any less amazing. Even so, I often experience outright hostility, especially from artists and musicians, who are shocked that I would dare to ask the question of how creativity works. Indeed, it was only recently that the study of machine creativity (established, now, as the field of computational creativity in its own right) became accepted currency within artificial intelligence, which, as a research field, has been around for more than 50 years. Even those who are convinced that machines can do intelligent things sometimes resist the notion that they might do creative things, too.
One of the common arguments against machine creativity is that, if a programmer wrote the program that created something, then the programmer, not the machine, is the real creator. This might be a valid argument, were it not for the fact that computers can now learn. The field of machine learning has burgeoned since the mid-1980s, and has demonstrated considerable theoretical and practical success. Its contribution to computational creativity is that a researcher (or an artist) can set up a machine learning system in such a way that it educates itself about a creative domain, and then generates new artefacts (or at least their specifications) from the resulting learned model, without human intervention. A good analogy is the famous promotion of Wolfgang Amadeus Mozart by his uber-pushy parent, Leopold. Leopold undeniably created Wolfgang the boy, and he undeniably trained Wolfgang the performer and composer. But no one could reasonably claim that Leopold wrote Wolfgang's symphonies.
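The idea can be made concrete with a deliberately tiny sketch, not any particular research system: a program that "educates itself" about a creative domain by learning which note tends to follow which in a handful of toy melodies, then generates a new melody from the learned model. The corpus and note names here are invented for illustration.

```python
import random

def learn(melodies):
    """Build a first-order Markov model: which note follows which, how often."""
    model = {}
    for melody in melodies:
        for a, b in zip(melody, melody[1:]):
            model.setdefault(a, []).append(b)
    return model

def generate(model, start, length, seed=None):
    """Sample a new melody from the learned transitions, without a human author."""
    rng = random.Random(seed)
    melody = [start]
    for _ in range(length - 1):
        choices = model.get(melody[-1])
        if not choices:  # dead end: no observed continuation for this note
            break
        melody.append(rng.choice(choices))
    return melody

# An invented toy corpus: the "creative domain" the system learns from.
corpus = [
    ["C", "D", "E", "C", "G", "E", "C"],
    ["C", "E", "G", "E", "D", "C"],
    ["G", "E", "D", "C", "D", "E"],
]
model = learn(corpus)
print(generate(model, "C", 8, seed=1))
```

The generated melody is not in the corpus, yet the programmer never wrote it down; the Leopold analogy holds at this miniature scale, and real computational-creativity systems replace the toy transition table with far richer learned models.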
As a result of my studies in this fascinating area, I have become less and less convinced of the common idea, probably originating in the self-indulgent Romantic period, that creativity as a capacity is specific to certain special individuals and certain special domains. Henry Plotkin, in his excellent textbook on evolutionary psychology, Evolution in Mind, suggests that creativity is the sine qua non of language generation, meaning that we are all creating new sentences to express newly created thoughts, every day. Of course, it doesn't follow that everything said expresses an important or novel idea, but creativity is a relative thing: what is amazingly innovative in a three-year-old is not so impressive in a 33-year-old. Furthermore, not all good creativity has to be "great" creativity: working out how to open an over-wrapped package in the absence of scissors may not compare with the creation of the Mona Lisa or Beethoven's Ninth Symphony but it is probably more useful at the time.
So my suggestion is that, by failing to question our value judgements, we have developed both myopia and tunnel vision with respect to creativity: we only see it when it is very large and we only see it in certain places. Such selective identification is likely to be a barrier to understanding, as it precludes observation of the entire phenomenon. This leads me to ask: what about creativity outside the arts and humanities? Is there any? If so, where?
For me, as a scientist, an obvious place to search for other kinds of creativity is in the scientific literature. But, aside from the study of creativity in other domains, that literature is mostly devoid of reference to creativity: it very rarely describes what it does in creative terms. This is probably largely due to a laudable preference for objectivity: amazing thinking is described in the passive voice, and emotional response to amazing outcomes is at most understated and - more usually - absent. The closest epithet one regularly finds for "creative output" is "discovery", and even that is uncommon.
I'd like to unravel that word, "discovery", a little. Literally, it derives from the idea of making visible something that was not, but it carries no connotation of method. In historical terms, it often refers to the location of a place (as in "Columbus discovered America", which also serves to demonstrate the relativity of the concept). In science, it sometimes refers to the recognition of the significance of a serendipitous event. For example, Henri Becquerel discovered radioactivity when he left uranium salts next to a photographic plate in a drawer, having abandoned his original experiment on phosphorescence because the sun had failed to shine; he found later that the plate had developed patterns consistent with exposure to light.
Another type of "discovery" is that of a new mathematical structure, along with a demonstration of its properties.
Both of these, though, are creativity in disguise. Serendipity, a happy accident, is not enough. The accident and its significance need to be recognised, and doing that takes imagination: were it not so, then Becquerel's discovery would have been not of radiation but merely of marks on a photographic plate. Even to imagine a prosaic account (perhaps the plate was faulty, for example) requires a basic kind of creative reasoning - the kind that Sherlock Holmes calls "deduction", but which is more precisely termed "abduction". Becquerel, though, realised that a deeper cause was required, and proposed spontaneous emission of radiation from the uranium, by analogy with light.
The kind of creativity that takes place in the purest of scientific philosophies, mathematics, seems to me to be no different in kind, but of a different magnitude. To hypothesise (invent, discover) a new mathematical structure, or a new proof, it is necessary to imagine elements of sets, constraints, rules, expressions and other things that cannot exist in the real world (such as higher-dimensional objects), and which the imaginer can never have directly experienced. Alternatively, perhaps it means imagining the movement of symbols, as mathematics is written, or both: the research jury is still out. Furthermore, sometimes, mathematics involves deliberately denying the evidence of one's experience to imagine a contradictory hypothetical world: such was the great leap behind Einstein's theories of relativity. Mathematicians often talk about "seeing" solutions, referring to their mind's eye, as though the solution were already there, waiting to be seen.
Between the recognition of serendipity and the abstract purity of mathematical imagination lies a broad range of scientific events, each one involving creativity that is every bit as great as that of a novelist, painter or composer. But are these great creators feted like rock stars? In general, no: the public sees them as (at best) scary intellectuals or (at worst) nerds.
Part of the reason for this is that one needs substantial background knowledge to understand most of the great leaps forward in science - increasingly so, as science advances. Immediately, therefore, there is a barrier between this kind of creator and her audience, of a kind that simply doesn't arise in music or art. Broadcasters, scientists and, sometimes, artists all try earnestly to bridge this gap, but to do so without patronising and obscuring is difficult. Even when the ideas are successfully conveyed, scientists still cling to their objectivity, and talk themselves down, and these stunning feats of creativity are often presented as though they were simply applications of straightforward logic. Commonly, too, remarkable human reasoning is pushed aside in favour of weird physical effects that can be more easily illustrated on a screen: for example, we regularly hear about the predicted Higgs boson, the particle, in science reports, but how often do we hear the tale of how Peter Higgs, the person, realised that he needed to predict it?
Background knowledge may be a problem in another way, too. There is no doubt that expertise is important in creativity, at least if it is to have an impact in the world. But acquiring and understanding such knowledge requires a level of self-application that is far from cool in school corridors. Since we spend so much time telling our schoolchildren that science is very difficult, and stereotyping scientists as terminally uncool, perhaps it's no wonder that some of them are suspicious and perhaps a bit resentful of those of their peers who can do it.
But background knowledge and skill are just as important to creativity in music, sculpture or cookery as they are to creativity in audio-electronics, materials science or chemistry. So why aren't Evelyn Glennie, Nika Neelova and Jamie Oliver classroom nerds like so many of their scientific counterparts?
In my opinion, the root of the problem lies in the impression given when we teach science. I recall being taught at (a rather good) school how to do experiments and to write them up. There were hypothesis, apparatus, method and results. There was not one jot of Popperian or Lakatosian philosophy to explain why there should be these things, and, crucially, there was no discussion of how one would arrive at a hypothesis: it was merely presented as something of which one needed to know the validity (and not, incidentally, to attempt to falsify). Likewise for methods. Overall, the impression given was that scientific progress was the incremental application of known deductive rules, requiring no imagination at all. The teaching was focused entirely on the surface of the problem: on what to do, and not on why to do it.
At the time, I didn't notice the omission, and, I imagine, neither do most of the other schoolchildren who go through the same process. My undergraduate education (maths and computer science), although enlightening in many ways, was no better in this respect, and contained even less scientific philosophy. This problem doesn't begin at primary or secondary level, but spreads its doleful seed from our universities, where teachers are taught.
Twenty years' experience of teaching computer programming has led me to the realisation that we are, much of the time, barking up the wrong tree. The norm in computer science departments is to choose a programming language (often only one) and then define its syntax, repeatedly demonstrating the behaviour of each construct, often with simple, disparate examples. What we mostly fail to do is address what programmers starting out really need: the cognitive skills to imagine abstract processes applied to abstract objects, and to imagine problems in these terms. Some freshers have learned to do this already, but many have not. The fact that generally we are not even trying to develop students' capacity to do this (because we are focused on the syntax of programming languages instead) explains why those in the latter category often don't make much progress. The results of programming modules remain notoriously bimodal.
I do not believe there is a practising scientist, mathematician or engineer in the world who would deny that their work is creative, if asked. But it is very uncommon for them to say so unprompted. It is time for a sea change in the teaching of science, maths and engineering, to celebrate the amazing creativity in these disciplines and, more importantly, to encourage our students to take part in it. Celebrating creativity does not have to mean abandoning scientific objectivity: there is a place for everything, and we must simply put each thing in its place. For my part, I have for a few years now been promoting the idea of teaching programming using techniques from the arts and design, focusing on the development of creative practice, per se.
I'll keep trying. Like the most exciting kind of creative hypothesis, it's crazy. But it just might work.
Geraint A. Wiggins is professor of computational creativity at Queen Mary, University of London.