Entries from November 1, 2010 - November 30, 2010
Artificial Worlds and the Cinema

On the set of Avatar
The cinema was dominated for a hundred years by a certain kind of theatricality, mise-en-scène, in which a set or a scene, something visible, determined the structure of shots, the actors’ roles within it and, most importantly, the director’s creative vision. Now the actor and the director must imagine the scene they are in to a far greater degree than ever before, which is why production processes like pre-visualization and post-production have become so important.
In the literal sense there is now no scene. There is nothing scenic about a green or blue wall onto which images and events, backgrounds and foregrounds will later be digitally grafted, or about a motion capture studio where everything happens in complete abstraction from reality. Rather, to be in a scene in the digital cinema is to enter into an imaginary wonderland, a Narnia of the mind, a fantasy upon which and through which actors produce their roles. As in the past, they have to produce themselves as characters, but within a carefully constructed space that is existentially artificial and quite bereft of physical markers. In a sense, because special effects are so important to the cinema in the 21st century, the process of production is more akin to the creation of animated films, which is why the cinema is now a hybrid or mixed medium.
This is why the contemporary documentary cinema is so wildly popular. Throughout the 20th century, documentaries were of marginal value to the film business and generally viewed by small audiences with specialized interests. This has changed completely. I attribute the shift to the need to capture events and people with some spontaneity and to produce meaning by engaging with the world — the lens as window without the interference of special effects.
Artifice is of course foundational to all genres of cinema. There is a young character by the name of Lucy in the film The Lion, the Witch and the Wardrobe. Her role is central to the narrative, especially in the beginning when she discovers Narnia, a winter wonderland of snow and extraordinary creatures. In order to make sure that her reaction to the “scene” would be as innocent and as open as possible, the creators of the film kept her away from the set until they needed to film her shots. Then they blindfolded her and brought her into the studio. She still only saw a fragment of the final version of the environment that had been created for the film — heavily composited, reshaped through a variety of sophisticated tools and technologies — but the director wanted her to look as if she had never seen what the filmmakers had created. They needed her innocence to be as genuine as possible in order to bring some authenticity to the shot, as if they were afraid of the artifice of the special effects. This is the challenge generated by creative processes that move from screen to screen until they finally make it to the big (film or television) screen.
A series of phantoms, images that barely exist outside of the computer, scenes that are not built by actors but by the agile use of technology, all of this adds up to an imaginary world, one that is generated by the use of technologies that have transformed the production process.
An imaginary scene built upon the imagination of viewers doubles the effects of identification and viewing, and perhaps explains why the movie Avatar was such a success. The director, James Cameron, realized that in order to convey the intensity of the world he had created for his actors, he had to play each scene back to them. As much time was spent witnessing the artificial world of Pandora as was spent managing the complexity of acting within the confines of a studio.
Ironically, this is precisely what viewers have to grapple with while watching the film. They have to enter a world they know is artificial, believe in its geography and physicality and struggle with the knowledge that the world only exists in their imaginations. They are imitating the struggle of the actors who have to devise all sorts of ways of legitimizing their roles within non-existent spaces.
Artifice, audience and imagination have merged.
Learning in the 21st Century (Part Three)

I recently had the privilege of talking to a group of parents about the culture of schools and the education that their children were receiving during what is clearly a transitional phase in the history of education.
Many of the parents were very worried about their children, and with some justification. This was a boys’ high school and the parents were concerned that their sons were spending an inordinate amount of time on computers and playing video games. I put up a slide with the words moral panic written in bold, and this seemed to capture their feelings — a combination of hostility, fear and acceptance.
However, my intention in putting up the slide was not to reinforce the moral panic that they were feeling, but rather to explore the implications of the shifting cultural space now occupied by a generation that lives within the “net.”
Distinctions between online and offline life are no longer relevant nor are they germane to the way people learn. The continuum of relationships set up through mediated environments will only become more complex as societies explore the many layers of information and knowledge that now define not only relations among people but also among societies.
We are living within a period of history that is not dissimilar to the massive changes experienced during the late 18th and early 19th centuries. These changes were as much a product of scientific invention as they were of fundamental social change. In fact, a key feature of that period was the advent of real scientific solutions to previously difficult challenges. At the same time, many old ways of thinking had to change as science gave empirical explanations for what had hitherto been accounted for by religion or superstition.
Social and cultural changes ‘dislocate’ societies in various and often unpredictable ways. For example, the Internet makes schools not so much centres of learning as social spaces for the exploration of relationships, which may include immersion in particular disciplines but not in the manner to which we have become accustomed over the last fifty years. The issue is not only the availability of numerous venues for learning, but also the choices students make and the emphasis they place on learning experiences in different places.
As John Falk and Lynn Dierking emphasize in a recent and brilliant article in American Scientist (Nov–Dec 2010), students spend only five percent of their lives in the classroom and learn most of what they know about the sciences outside the classroom. “We contend that a major educational advantage enjoyed by the U.S. relative to the rest of the world is its vibrant free-choice science learning landscape—a landscape filled with a vast array of digital resources, educational television and radio, science, museums, zoos, aquariums, national parks, community activities such as 4-H and scouting and many other scientifically enriching enterprises.” (p. 486)
Since Falk and Dierking are talking about K-12 as well as post-secondary education, it would not be hard to infer that an even lower percentage of university students treat the classroom as their main venue for learning. This raises interesting issues for policymakers who have focused all their efforts on grading and testing without recognizing that informal learning is the dominant mode of learning.
I believe that parents are worried because mediated environments can lessen social interaction and can decrease, if not eliminate, the qualities of everyday conversation so essential to our well-being. They are also worried because the information on digital culture is itself so contradictory. Statistics appear every day from varying sources that suggest a whole variety of impacts caused by the swift appropriation of the Internet for nearly everything we do on an everyday basis. This is, so to speak, more of a source of the ‘panic’ than the actual engagement of children and adults with digital experiences.
In part four, I will look into the issues of moral panic and digital culture in greater detail with an emphasis on the importance of this discussion for learning and education.
Learning in the 21st Century (Part Two)

One of the recurring themes in discussions about learning and education is that our post-secondary institutions are always, to varying degrees, on the verge of decline or even death. “The American Liberal Arts College died today after a prolonged illness. It was 226 years old.” (Washington, D.C., 2 July 1862) Quoted in the Winter 1971 edition of the History of Education Quarterly, Vol. 11, No. 4, p. 339.
In 1862, colleges in the US shifted from a skills orientation to broader curricula more concerned with social, economic, artistic and cultural issues than traditional approaches to job-ready training. It is important to remember that in the 19th century it was not necessary to go (as Richard Hofstadter has put it) “…to college to become a doctor, lawyer, or even a teacher, much less a successful politician or businessman….Higher education was far more a luxury, much less a utility, than it is today.” (History of Education Quarterly, Vol. 11, No. 4, p. 340)
The key word in what Hofstadter says is “utility.” Today, in our rush to promote the utility of education, we have reduced learning to a series of “courses” defined in large measure by a structure that privileges speed over gradualism. Intuitively, learners know that new knowledge cannot be ‘acquired’ through the simple consumption of information. Intuitively, teachers know that tending to the emotional intelligence and needs of their students is perhaps more important than promoting rote learning. Nevertheless, schools try to squeeze learning into narrow disciplinary boundaries. So much of the structure of schools works against change, including the fact that the hiring of new teachers is still defined by discipline.
When economies go into crisis, policymakers look to schools to solve the immediate challenges of unemployment and thereby raise expectations that schools will simply ‘produce’ the workers needed to solve the economic challenges. This is also why the for-profit sector in education has grown so large: it plays into the fears learners have that they will not be employed unless they acquire the specific skills needed for specific jobs. Policymakers amplify this even further by linking funding for public institutions to labour market data that is often years behind the economy itself.
In a globalized environment, it is increasingly difficult to predict economic direction and to manage complexity. Schools should be the places where we encourage complex thinking and doing, creating and collaborating. Instead, we rush to prove both the value of education and its outcomes. In the process, we have created straitjackets that prevent invention, innovation and, crucially, the human imagination from flourishing, and thereby actually decrease the opportunities for change and impact.
Our educational institutions are not dying, although some will disappear. The rhetoric around their value has become embedded in the fabric of Western democracies. The challenge is precisely to understand how that value can be transformed to reflect and enhance the ability of learners to generate, shape and contribute to knowledge-based societies.
Part Three will examine some of the central characteristics of the knowledge society and whether schools are in fact the pivot for the new digital era.
Learning in the 21st Century (First of a series)

These days there are many documents and reports circulating about 21st Century learning and outcomes. For example, here is a classic programmatic statement: “Within the context of core knowledge instruction, students must also learn the essential skills for success in today’s world, such as critical thinking, problem solving, communication and collaboration.”
BUT, and it is a big but, this has always been the ambition of most schools, most teachers and most governments. Who doesn’t want their students to be good communicators? Would any school suggest that problem solving is unimportant? Collaboration has always been celebrated as essential to learning.
So, is this just rhetoric? Are these just convenient descriptors without any real content or are they essential and new aspects of the learning process?
Context is crucial here. Can schools built on a mid-twentieth century industrial model of education promote critical thinking in the 21st century?
Can twenty to forty students sitting in a classroom develop the insights needed to meet and challenge not only their own points of view, but also those of others?
In 1828, the Yale Report (a foundational document in the history of American education) appeared and here is a brief quote from page six: “From different quarters, we have heard the suggestion that our colleges must be new-modeled; that they are not adapted to the spirit and wants of the age; that they will soon be deserted unless they are better accommodated to the business character of the nation.”
Sound familiar? Have our schools ever been able to meet the needs of the age? I doubt it. More often than not, education and learning are sources of dispute, mediators in the culture wars or progenitors of conflict. These are not bad characteristics; it is just that learning, for better or worse, is not about information, schools or responding to what teachers suggest or talk about. The social space of schools is much like social media: places of conversation where the unintended outcome is often far more important than any of the artifice used to frame conversations in a specific way.
The hubris of educational institutions is that they believe they are central to the lives of their students and are the hubs around which learning takes place. For the most part, learning is neither clear (as to intent — you may want to learn, but everything from your emotional state to the classmates and teachers you have muddies the waters) nor is it linear. The lack of linearity drives policymakers crazy. They have forgotten, of course, that play is central to learning and that deficiencies in the understanding of information and knowledge cannot so easily be cajoled into positive outcomes. In fact, the drive to constrain the inherently chaotic nature of learning leads to examinations and modes of evaluation that measure not what has been learned, but how effectively students can play the outcomes games required of them.
Part Two will examine the specifics of outcomes and the expectations of policymakers.