
Entries in Learning Communities (30)

Tuesday, June 20, 2006

The context for learning, education and the arts (4)

(This entry is in five parts: One, Two, Three, Four, Five)

So why explore the intersections of human thought and computer programming? My tentative answer would be that we have not understood the breadth and depth of the relationships that we develop with machines. Human culture is defined by its on-going struggle with tools and implements, continuously finding ways of improving both the functionality of technology and its potential integration into everyday life. Computer programming may well be one of the most sophisticated artificial languages which our culture has ever constructed, but this does not mean that we have lost control of the process.

The problem is that we don’t recognize the symbiosis, the synergistic entanglement of subjectivity and machine, or if we do, it is through the lens of otherness, as if our culture is neither the progenitor nor really in control of its own inventions. These questions have been explored in great detail by Bruno Latour, and I would point readers to his articles in Common Knowledge as well as his most recent book, Aramis, or The Love of Technology. There are further and even more complex entanglements here related to our views of science and invention, creativity and nature. Suffice it to say that there could be no greater simplification than the one which claims that we have become the machine or that machines are extensions of our bodies and our identities. The struggle to understand identity involves all aspects of experience, and it is precisely the complexity of that struggle, its very unpredictability, which keeps our culture producing ever more complex technologies and which keeps the questions about technology so much in the forefront of everyday life.

It is useful to know that within the field of artificial intelligence (AI) there are divisions between researchers who are trying to build large databases of “common sense” in an effort to create programming that will anticipate human action, behaviour and responses to a variety of complex situations, and researchers who are known as computational phenomenologists. “Pivotal to the computational phenomenologists’ position has been their understanding of common sense as a negotiated process as opposed to a huge database of facts, rules or schemata.” (Warren Sack)

So even within the field of AI itself there is little agreement as to how the mind works, or how body and mind are parts of a more complex, holistic process which may not have a finite systemic character. The desire however to create the technology for artificial intelligence is rooted in generalized views of human intelligence, generalizations which don’t pivot on culturally specific questions of ethnicity, class or gender. The assumption that the creation of technology is not constrained by the boundaries of cultural difference is a major problem since it proposes a neutral register for the user as well. I must stress that these problems are endemic to discussions of the history of technology. Part of the reason for this is that machines are viewed not so much as mediators, but as tools — not as integral parts of human experience, but as artifacts whose status as objects enframes their potential use.

Computers, though, play a role in their use. They are not simply instruments, because so much has in fact been done to them in order to provide them with the power to play their role. What we more likely have here are hybrids, a term coined by Bruno Latour to describe the complexity of interaction and use that is generated by machine-human relationships.

Another way of understanding this debate is to dig even more deeply into our assumptions about computer programming. I will briefly deal with this area before moving on to an explanation of why these arguments are crucial for educators as well as artists and for the creators and users of technology.

Generally, we think of computer programs as codes with rules that produce certain results and practices. Thus, the word processing program I am presently using has been built to ensure that I can use it to create sentences and paragraphs, to write, in other words. The program has a wide array of functions that can recognize errors of spelling and grammar, create lists and draw objects. But we do have to ask ourselves whether the program was designed to have an impact on my writing style. Programmers would claim that they have simply coded in as many of the characteristics of grammar as they could without overwhelming the functioning of the program itself. They would also claim that the program does not set limits to the infinite number of sentences that can be created by writers.

However, the situation is more complex than this and is also subject to many more constraints than initially seems to be the case. For example, we have to draw distinctions between programs and what Brian Cantwell Smith describes as the “process or computation to which that program gives rise upon being executed and [the] often external domain or subject matter that the computation is about.” (Smith, On the Origin of Objects, Cambridge: MIT Press, 1998: 33) The key point here is that program and process are not static, but are dynamic, if not contingent. Thus we can describe the word processor as part of a continuum leading from computation to language to expression to communication to interpretation. Even this does not address the complexity of relations among all of these processes and the various levels of meaning within each.

To be continued........


Sunday, June 18, 2006

The context for learning, education and the arts (2)

This Entry is in Five Parts. (One, Two, Three, Four, Five)

Let me begin by quoting the head of IBM, Lou Gerstner, in reference to Deep Blue, the computer developed to play chess at the grandmaster level:

“Deep Blue is emblematic of a whole class of emerging computer systems that combine ultrafast processing with analytical software. Today we’re applying these systems to challenges far more vital than chess. They are used for example in simulation — replacing physical things with digital things, re-creating reality inside powerful computer systems.” (“Think Leadership,” Vol. 3, No. 1, 1998: 2)

Now, what is important here is not only the references to Deep Blue and very fast computer systems, but the assumption that the replacement of physical things with digital things re-creates reality inside computer systems and, by extension, in reality itself. This may well be true and may well be happening, but we need to examine the implications of the claim and locate this claim within a cultural, social and economic analysis. And we need to become quite clear about the meaning of the term simulation, which is used most often to refer to an artificial environment that either replaces the real or, in Jean Baudrillard’s words, becomes the real. Simulation as I will use it refers to the creation of artifacts, their use and their integration as well as co-optation into an increasingly digital culture.

“And soon we’ll see this hyper-extended networked world made up of a trillion interconnected, intelligent devices — intersecting with data-mining capability. Pervasive Computing meets Deep Computing.” (Gerstner: 3)

I will return to the implications of this quote through a variety of different routes. Historically, the advent of new technologies in the 20th century has generally been paralleled by claims of social effect and cultural transformation, and these are synoptically represented by the continued influence of Marshall McLuhan on present thinking about technology and its effects. I will not examine McLuhan’s ideas in great detail; suffice it to say that many of the assumptions guiding his cultural appropriation by a variety of writers, commentators and politicians do not stand up to rigorous scrutiny. For example, McLuhan’s famous statement that “The Medium is the Message” grew out of a report that he wrote in 1959-60 for the Office of Education, United States Department of Health, Education and Welfare. It was entitled “Report on Project in Understanding New Media.” In it McLuhan analyses media such as television using the tools of cognitive psychology, management theory and economics. For McLuhan, media include speech, writing, photography, radio, etc. And he is puzzled by why the effects of these media have been overlooked for, as he puts it, “…3500 years of the Western world.” (McLuhan, 1960: 1)

McLuhan searches for an explanation, and much of the research for the project is prescient and fascinating, as well as a precursor to the publication of “Understanding Media” in 1964. When it comes to the famous aphorism about the medium and the message, McLuhan reveals a rather interesting foundation for much of his later research.

“Nothing could be more unrealistic than to suppose that the programming for such media could affect their power to re-pattern the sense-ratios of our beings. It is the ratio among our senses which is violently disturbed by media technology. And any upset in our sense-ratios alters the matrix of thought and concept and value. In what follows, I hope to show how this ratio is altered by various media and why, therefore, the medium is the message or the sum-total of effects. The so-called content of any medium is another medium.” (McLuhan, 1960: 9)

It is clear from this statement that the medium is actually the subject, that it is human beings whose sense-ratios are altered by participating in the experiences made possible through the media. It is not the content of the communication, but the encounter between the medium and subjectivity, that alters or disturbs how we then reflexively analyse our experience. Although “the medium is the message” is generally interpreted in formal terms, and although it has been appropriated as a generalization used to explain the presence of media in every aspect of our lives, McLuhan is here playing with cognitive and psychological research as it was developed in the 1950’s. More importantly, at this stage, he is avoiding a binary approach to form/content relations. He is effectively introducing a third element into the discussion, namely, the human body.

Saturday, June 17, 2006

The context for learning, education and the arts (1)

This entry has five parts. (One, Two, Three, Four, Five)

The context for learning, education and the arts has altered dramatically over the last few years as has the cultural environment for educators and artists/creators. Part of what I would like to do here is examine the intersection of a number of crucial developments that I think have transformed the terrain of technology, education, art and culture.

This is a grand claim, and I would be the first to admit that we are being incessantly told that change has become the major characteristic of the late 20th century. But I do think that we are witnessing shifts which will have a profound effect not only on the social and political structure of Western countries but on the ways in which we see ourselves, act upon and within the communities of which we are a part, and create meanings, messages and information for the proliferating networks that now surround us.

The one important caveat here is that although I am concerned with the transformations we are experiencing, I will in no way claim that we are undergoing a revolutionary change. I tend to see history as evolutionary, which in no way precludes dramatic shifts from occurring. As intellectuals, artists, technology developers and educators, I believe it is our responsibility to become active within this environment and to develop the critical and creative tools to respond to the ongoing evolution of “an emerging aesthetic of interactivity in which aesthetic goals are linked with ethical goals and are based on a perspective of caring for both the individual and the larger economic, political, ecological, social and spiritual circumstances that create contexts for the individual.” (Carol Gigliotti, “Bridge To, Bridge From: The Arts, Technology and Education,” Leonardo, Vol. 31, No. 2, April-May 1998, p. 91)

Our cultural claims about the various factors that produce change tend to be linear, the line being one that moves along a fairly straight, if not narrow, trajectory from the less complex to the more complex. The approach that I will take looks at the displacements that are created by the movement from one phase to another, movement in this instance being more like transportation, framed by what Bruno Latour has described as the “connections, short circuits, translations, associations, and mediations” that we encounter daily. (Bruno Latour, “Trains of Thought,” Common Knowledge, Vol. 6, No. 3, Winter 1997, p. 183)

So, I will begin by exploring the various conjunctures and disjunctures created by the presence of digital technologies in nearly every aspect of the cultural context of the early 21st century. My goal, however, is not an overview, but rather, to raise as many questions as I can in order to introduce increasing levels of mediation both to our understanding of the digital and to our creative transformation of the digital into various media of communication.

To be continued.....


Saturday, June 10, 2006

Geographies of Dissent (2)

There is another term that I would like to introduce into this discussion, and that is counter-publics. Daniel Brouwer, in a recent issue of Critical Studies in Media Communications, uses the term to describe the impact of two “zines” on public discussion of HIV-AIDS. The term resonates for me because it has the potential to bring micro and macro into a relationship that could best be defined as a continuum, and it suggests that one needs to identify how various publics can contain within themselves a continuing, often conflicted and sometimes very varied set of analyses and discourses about central issues of concern to everyone. It was the availability of copy machines beginning in 1974 that really made ‘zines’ possible. There had been earlier versions, most of which were copied by hand or by using typewriters, but copy machines made it easy to produce 200 or 300 copies of a zine at very low cost. In the process, a micro-community of readers was established for an infinite number of zines. In fact, the first zine convention in Chicago in the 1970’s attracted thousands of participants. The zines that Brouwer discusses, which were small to begin with, grew over time to five and ten thousand subscribers. This is viral publishing at its best, but it also suggests something about how various common sets of interests manifest themselves and how communities form in response.

“One estimate reckons that these ‘Xeroxed, hand-written, desktop-published, sometimes printed, and even electronic’ documents (as the 1995 zine convention in Hawaii puts it) have produced some 20,000 titles in the past couple of decades. And this ‘cottage’ industry is thought to be still growing at twenty percent per year. Consequently, as never before, scattered groups of people unknown to one another, rarely living in contiguous areas, and sometimes never seeing another member, have nonetheless been able to form robust social worlds.” (John Seely Brown and Paul Duguid, The Social Life of Documents) Clearly, zines represent counter-publics that are political and are inheritors of 19th century forms of poster communications and the use of public speakers to bring countervailing ideas to large groups. Another way of thinking about this area is to look at the language used by many zines. Generally, their mode of address is direct. The language tends to be both declarative and personal. The result is that the zines feel like they are part of the community they are talking to and become an open ‘place’ of exchange with unpredictable results. I will return to this part of the discussion in a moment, but it should be obvious that zines were the precursors to Blogs.

As I said, the overall aggregation of various forms of protest, using a variety of different media in a large number of varied contexts, generates outcomes that are not necessarily the product of any centralized planning. This means that it is also difficult to gauge the results. Did the active use of cell phones during the demonstrations in Seattle against the WTO contribute to greater levels of organization and preparedness on the part of the protestors, and therefore to the message they were communicating? Mobile technologies were also used to “broadcast” back to a central source that then sent out news releases to counter the mainstream media and their depiction of the protests and protestors. This proved to be minimally effective in the broader social sense, but very effective when it came to maintaining and sustaining the communities that had developed in opposition to the WTO and globalization. Inadvertently, the mainstream media allowed the images of protest to appear in any form because they were hungry for information and needed to make sense of what was going on. As with many other protests in public spaces, it is not always possible for the mainstream media to control what they depict. Ultimately, the most important outcome of the demonstrations was symbolic, which in our society added real value to the message of the protestors.

To be continued...


Sunday, May 14, 2006

Breakfast Speech on Learning, May 6, 2006 (Emily Carr Institute Graduation)

“Most people believe that it is education that will save us. But this bland, sweeping, and unexamined assertion reduces us into continuing to uncritically support and tinker with the current story of schooling. It is education that will save us, but not any kind of education—only education of a certain kind: only education that is generative and life-affirming, that invites, engages, and integrates the fullness of our children’s capacities and ways of knowing, and that nurtures the creation of integral minds committed to the creation of a truly just and wise global civilization. Only education that develops our capacity to become more fully human is truly worthy of the human spirit. Only education that invites deep learning and reconnects us to life will light and sustain the fire within.”

(Stephanie Pace Marshall)

Learning is a complex and challenging subject. The learning experience both within schools and outside of them has been an area of debate and contention for centuries and we still do not know that much about the optimum conditions for learning or even how humans internalize information and process knowledge. In this context, post-secondary and K-12 institutions are struggling to respond to sometimes-excessive expectations on the part of students and their communities, trying at one and the same time to create value and be valuable.

Stephanie Marshall quotes Mary Catherine Bateson: “You can’t prepare the child for the job market that will exist 20 years from now. So how can you build a curriculum that will shape an individual to be a pioneer in an unknown land — because that’s what the future is?” (Stephanie Pace Marshall, “The Learning Story of the Illinois Mathematics and Science Academy,” http://www.learndev.org) The future cannot be known, and we do our children a great disservice when we suggest to them that getting a degree, for example, should be connected in a linear way to their future employment. This means that a creative student exploring their often profound and sometimes confusing desire to craft or produce a work of art has to struggle to explain both the value of their creative process and the outcomes of their creative engagement in the context of an employment picture that may not produce a simple fit. A philosophy student, or even a learner with a philosophical outlook, may find speculative thought judged to be less than useful, largely because it cannot be connected to a clear and discernible outcome. To me, learning is as much about the practice of engaging with materials and ideas as it is about speculative thinking that cannot and should not be translated into a concrete form.

It is interesting to note that the present model for most universities is and has been a contested one. Notions of original research and inquiry only took hold in the late 19th century. Public education as we know it is relatively young, with some of the biggest growth coming in the 1960’s. The idea of teaching the liberal arts in a university only reached some critical mass in the late 19th century, while in the 1930’s, research and graduate teaching were prioritized over undergraduate teaching and public service. It was only in the 1960’s that Clark Kerr proposed that a single institution “could perform multiple missions to benefit society.” (John C. Scott, “The Mission of the University: Medieval to Postmodern Transformation,” Journal of Higher Education, Vol. 77, No. 1, Jan.-Feb. 2006, p. 3) These different positions span the history of post-secondary education and learning and remain in place today, with institutions bearing the weight of trying to distinguish among strategies and choices that are not well understood either by the public or by government.

Have you ever wondered why educators continue to rely so heavily on lecture formats within classrooms? In medieval times, before the printing press was invented, before it was possible to disseminate ideas to a broader populace, teachers, who were generally clerics, spoke to students, read from the Bible and from other available material. They read and spoke very slowly so that the students could take notes, which was the only way for learners to reproduce the ideas and information for their own personal use. The teachers of the 12th century gained great authority from this teaching strategy. It was the beginning of a process of institutionalization, which to this day remains central to the practice of teaching. But does it remain central to the practice of learning? How do we bring new insights into our understanding of learning? Have we reached the point where our institutions, their rules, regulations, policies and practices are not able to optimize the conditions within which learning can take place?

It is within the context of this discussion that I am so very pleased to introduce Chris Kelly to you. Chris’s biography is rich and varied: he was Superintendent of Schools and Chief Executive Officer for the Richmond School Board for nine years and is completing his third year as Superintendent of the Vancouver School Board. As an educator and administrator, Chris’s experience includes elementary and secondary teaching, Aboriginal education, special education, curriculum development, and professional and organizational development. He is presently President of the Canadian Education Association, sits on the Advisory Committee to the Deans of Education and Science at UBC, and is a member of the Board of Directors of the Institute for Global Ethics.

What I have described here only reflects a small portion of what Chris does: how he interweaves his passion for learning and education with the tremendous responsibilities of managing a large K-12 system, how he manages at the same time to play a public role as an advocate for our educational system, how beautifully and clearly he articulates his concerns for the quality of learning and the needs of students. Chris and I have known each other for some years now, and every time we have met, our discussions have been rich and varied. So, it pleases me tremendously to announce to you today that we have agreed in principle to explore the possible creation of a specialized high school in Art and Design in Vancouver that would be supported by and developed with Emily Carr Institute. Chris will talk a little more about this, but you can rest assured that we intend to follow through on this visionary project, which we feel will ensure a place, a strong place, for the creative arts in the curriculum of young learners.