"No," he replied, "It took me all my life." JL
Eliot Gattegno reports in Quartz:
In order to arrive at a creative solution to any problem, we have to have knowledge and understanding of the factors, institutions, and phenomena at play. In the convergent phase, where you assess the ideas you’ve generated, “being creative” starts to depend heavily on what we already know. It’s not enough to only encourage ideation and other LinkedIn-esque traits. Put serious effort into achieving mastery—of a skill, a field, a discipline—to truly recognize the value of that eureka moment.
The idea that comes out of nowhere. The eureka moment. If we could figure out how to get there faster and automate the process, humankind would be forever changed, right? This is something we can’t stop obsessing over as a society—but maybe we’re thinking too hard about it.
Two years ago, the New York Times reported on a whimsical new trend on college campuses: studying creativity itself. Schools were suddenly offering minors in creative thinking and asking their students to problem-solve for problem-solving’s sake. The classes seemed to make the students more confident, and had benefits that were tangible if slight: one student figured out a quicker way to re-shelve DVDs at his library job.
And yet this worship of creativity has haunted me since I first read the article. Aren’t we thinking of “creativity” too broadly here? Is it truly something we can study on its own, divorced from the problems and distractions and flash cards of the real world?
The commonly accepted definition of “creativity” is quite vague: “a novel work that is accepted as tenable or useful or satisfying by a group at some point in time.” But that definition does have its roots in neuroscience. Creativity is a characteristic that emerged in Homo sapiens by virtue of extended cortical development, which privileged environmental influences on the brain over genetic determinism. Recent studies have suggested that creative people are more fluent at generating responses to problems overall. And generating responses is part of what the study of creativity teaches.
Creativity is often mistaken for a universal trait, but in order to arrive at a creative solution to any problem, we have to have knowledge and understanding of the factors, institutions, and phenomena at play. If creativity relies on knowledge of a subject plus all that fluent, I-got-it-from-the-Muse brainstorming, then studying creativity for creativity’s sake might feel progressive—but it’s actually quite narrow.
Can creativity be learned?
Scientific evidence does link the creative process to certain patterns of brain activity—and it is possible to “train” some of these patterns the way you would build a muscle. So yes, you can learn to be more creative in certain ways. But that’s just one side of the neurological coin.
Any effective creativity training program should focus on both aspects of the creative process: “divergent” thinking (the generative, brainstorming phase in which tasks are carried out without judgement or hesitation), and “convergent” thinking (the part where you hone and assess the wealth of ideas that you’ve already generated). The divergent stage is the time when ideas are freely generated without rejection, no matter how bad they may seem. And this is all many creative studies minors focus on—that sexy, sky’s-the-limit phase when everyone is tossing ideas around, the wilder the better.
Additionally, people who find it easiest to arrive at those “novel works” of our vague working definition of creativity are able to achieve psychological “flow”—being completely immersed in a task. That, too, can be learned—or at least improved—through techniques such as meditation, which can enhance the various brain oscillations and states needed to achieve flow.
So yes, a creative studies minor can be useful for the first part of “being creative”—the divergent phase. But when it comes to the convergent phase, learning to be broadly “creative” isn’t enough. In the convergent phase, where you assess the ideas you’ve generated, existing knowledge is incredibly important. In other words, “being creative” starts to depend heavily on what we already know.
Whatever happened to expertise?
This prior knowledge of a system or field may be the most important aspect of “creativity”—much more so than divergent thinking.
Some of the most compelling experimental evidence describing brain activity patterns during the “convergent” phase of a creative task implicates the medial temporal lobe, including the hippocampus—the part of the brain that humans use when making, storing, and accessing memories—and the hippocampus lights up like a firecracker during memory recall. Hippocampal activation during the “convergent” part of the creative process may indicate that subjects are calling upon existing knowledge to complete the task and ultimately generate unique or novel outputs. The mathematician Terence Tao hinted at the same end point, albeit less neurologically, when he said that the ability to apply and intuit arises from mastery.
This is why learning to brainstorm and listening to the Muse isn’t enough when it comes to studying creativity. In order to “be creative,” in order to problem-solve with the best of them, we need to work on becoming not just artists—but experts.
Because of this, it’s not enough to only encourage ideation, problem-solving, and other sexy, LinkedIn-esque traits. Students, leaders, and business developers shouldn’t just aspire to be nebulously “creative,” but should put serious effort into achieving mastery—of a skill, a field, a discipline. Only then will they be able to truly recognize the value of that eureka moment.