Systemic Evaluation Methodology: The Emergence of Social Learning From Environmental ICT Prototypes
5, October 2004 (© 2004)
DOI: 10.1007/s11213-004-5789-7
This paper investigates why and how systems approaches can help in evaluating the
design of new Information and Communication Technologies (ICTs) as social learn-
ing platforms. It focuses on the prototypes created by the research project Virtualis,
whose objective is to promote social learning on environmental concepts and practices
amongst a variety of stakeholders. The paper presents the principles of systems thinking
and practice that helped in formulating such evaluation processes. It illustrates how
both a peer systemic evaluation process (within the research team) and a participatory
evaluation process (involving potential future users of the ICTs) were carried out.
1. INTRODUCTION
The formulation of this systemic evaluation methodology took place in the con-
text of the Virtualis research project.2 This project involved a multidisciplinary
team of Information and Communication Technology (ICT) specialists, “learning
experts,” and environmentalists, all interested in exploring how ICTs can help a
variety of stakeholders coming from different backgrounds in understanding the
natural environment and how human activities can affect it. We constructed var-
ious ICT prototypes aimed not only at improving environmental awareness and
stakeholders’ involvement in environmental decision making but also at facilitat-
ing changes of (environmental) practices in order to promote more sustainable
ones. The premise of Virtualis was a special interest in ICTs as “democratic,
non-exclusive learning platforms”—that is, platforms that have the potential to
1 Department of Systems, Faculty of Technology, Open University, Walton Hall, Milton Keynes, MK7
6AA United Kingdom; e-mail: s.m.simon@open.ac.uk.
2 The Virtualis project, or “Social Learning on the Environment with Interactive Information and Com-
munication Technologies,” was funded by the EU and lasted from Sept 2001 to April 2004. For more
detail, check http://systems.open.ac.uk/page.cfm?pageid=sustdevptV
© 2004 Plenum Publishing Corporation
give a voice to groups of people who are normally ignored, either for social or for
political reasons, or because they are considered as “non-experts.” We believe that
“environmental knowledge” is multifaceted and that learning platforms that wel-
come and merge a diversity of knowledges on environmental issues and sustainable
practices are needed.
Here, we describe the learning processes that, we hope, will take place when
people use these ICTs as “Social Learning” platforms: the objective of these ICTs
is not only for people to learn individually, but also to use an interactive tool
as a way to share their knowledge. This project puts a special emphasis on the
communication and exchange that can take place between expert and “non-expert”
stakeholders.
The evaluation tool described can help us identify whether the objectives of these ICTs have been met and, consequently, what makes a good interactive ICT, the use of which can generate social transformational learning.
Various types of methodologies can be used when evaluating ICT tools. In
this paper, we explain why developing a systemic methodology is of particular
relevance when working on ICT prototypes that are focused on social learning
processes. We also present the two main dimensions of this methodology (a peer
evaluation process taking place within the research team and a participatory eval-
uation process involving future users) and how it was implemented while the
Virtualis water prototypes were being developed by the Cranfield Virtualis team.
Interestingly, these evaluation exercises highlighted that social learning would potentially take place not only when the ICTs were eventually used but also while the ICTs were being constructed, within the very team of Virtualis partners.
Fig. 1. Activities, application domains, and areas of interest in the Virtualis project.
This systems diagram presents six main systems of interest focused on the
prototypes:
– their users, be they experts or nonexperts;
– their design;
– their policy dimensions;
– the changes that their use will, we hope, generate;
– the environmental domains they cover; and
– the learning content and strategy they encompass.
The larger, overall system, “social learning,” illustrates the fact that the learning that takes place when the prototypes are being used goes beyond the mere use of the tool: it extends to the way in which the learning effects changes in society, in policy design, in participatory processes, or in the improvement of the prototypes over time.
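As an illustrative aid only, the containment described above could be sketched as a simple nested mapping. The names below paraphrase the six systems listed in the text; none of this structure appears in the Virtualis software itself:

```python
# Hypothetical sketch: the six systems of interest from Fig. 1,
# nested inside the overall "social learning" system.
# Names are paraphrased from the text, not taken from any Virtualis artifact.
social_learning = {
    "users": "experts and nonexperts using the prototypes",
    "design": "how the prototypes are constructed",
    "policy": "policy dimensions of the prototypes",
    "change": "changes their use is hoped to generate",
    "environmental domains": "themes the prototypes cover",
    "learning": "learning content and strategy they encompass",
}

# Each subsystem can interact with every other (e.g. learning effecting
# changes in policy design); enumerate the unordered pairs once each.
interactions = [(a, b) for a in social_learning for b in social_learning if a < b]
print(len(social_learning), "systems of interest,", len(interactions), "pairwise interactions")
```

Enumerating the pairwise interactions is one way to make the evaluator's task concrete: for each pair, one can ask what links the two systems of interest.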
One can identify the interactions taking place between the components of
each of these systems. For instance, within the system “Environmental themes
covered by the prototypes,” it is interesting to examine the links between the
information provided within the prototypes, the way in which stakeholders collab-
oratively define an environmental problem or issue, and the process through which
a stakeholder becomes sensitive to an issue, whether this issue is directly related
to his/her daily life or not.
Looking at these three components can help us understand a great deal about
the meaning of environmental information and the perception of “environmen-
tal problems” in a context of social learning. These three components are also
related to the other components of this system—for instance, stakeholders’ motivation to learn and construct meanings about the environment can be strongly driven by the wish or need to improve the quality of the environment, or by new policy measures.
have got conflicting objectives and aspirations, however, the qualities valued in the learning process, such as empowerment, may actually increase conflict, yielding a team that achieves less than the sum of what its members could achieve working in isolation.
Team learning in Senge’s outlook (Senge, 1990) is best pursued through
methods of dialogue. In this way, people make a genuine attempt to appreciate
matters of concern through the eyes of people who raise the concerns. People
learn from this by expanding their understanding of circumstances that prevail.
Systems thinking can help to bring together people’s mental models in a shared
systemic language, generating team learning and understanding, and a shared sense
of purpose.
Within team learning, one type of open learning (the most common) is participatory. It can be described as the freedom to speak one’s mind and to state one’s view; it is reflected through interactive processes in ICTs, as well as debates with other stakeholders, and can encourage wide modes of involvement in decision making. Another, generative, type of learning comes from reflective openness: this entails challenging one’s own thinking. It necessitates surfacing the assumptions that shape our views and then subjecting those assumptions to open criticism.
One important challenge of the Virtualis ICTs is to demonstrate that team
learning can empower their users, be they environmental experts or not. One way
to do so is by generating a common sense of purpose in which they focus energy
in a meaningful way. Systems thinking may empower people by enabling them
to begin to appreciate, rather than be confused by, the interrelated nature of the world and how this might offer insights into their experiences. In order to put these
principles into practice, systems thinkers have been keen to promote “interactive
planning.” Interactive planning builds on the premise that obstruction to change sits
mainly in the minds of participants, rather than separately “out there” in the problem
context. Interactive planning is a form of scenario-building methodology that offers, in tangible form, helpful guidelines about realistic intrinsic desires and shared vision. The Virtualis ICTs have been developed with a view to helping stakeholders better understand the impacts they have on the environment and how they could, individually or collectively, change their practices and lifestyles. A change in their
way of thinking and behavior would reflect the outcome of a successful learning
process.
We have chosen to develop our systemic evaluation with these principles in
mind, the objective being to check whether the ICT platforms developed in the
project genuinely allow team learning and interactive planning to take place in an
empowering learning environment. In order to do so, we have focused on
– examining three types of systemic interactions (between learning stake-
holders, environmental applications and between learning and change) in
a peer evaluation and
collective actions (environmental practices), hence affecting the way in which they
initially perceived the environmental issue. Uncertainty and surprise also entered
the “equation” and contributed to the practical testing of the original understanding.
The social learning processes based on the environmental domains explored
in the prototypes are therefore made of a “mixture” of practical experience, per-
ceptions, and understanding of an environmental issue, provision of “scientific
information,” and uncertainty and surprise. By “mixture,” I mean
that these components are linked to each other through the deliberation processes
facilitated by the prototypes.
The evaluation questions related to the links between environmental domains,
as well as their relevance from a systemic perspective, are presented in Table I.
These questions could be used and adapted by anybody designing a questionnaire
aimed at evaluating, in a systemic way, ICTs focused on environmental issues.
More questions could be asked, of course. We found that explaining why asking
these questions was relevant (right column of the table) proved to be an important
part of the design of the questionnaire as well as of the learning process involved
in identifying “what makes evaluation systemic.”
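To make the question-plus-rationale design concrete, here is a minimal, hypothetical sketch of how such a systemic evaluation questionnaire could be represented; the class and field names are ours, not part of the Virtualis prototypes, and the two items merely paraphrase rows of Table I:

```python
from dataclasses import dataclass

@dataclass
class EvaluationItem:
    """One row of a systemic evaluation questionnaire:
    the question asked, and the rationale for asking it."""
    question: str
    rationale: str

# Two items paraphrased from Table I; a real questionnaire would hold many more.
questionnaire = [
    EvaluationItem(
        question="Are the environmental issues covered contextualized?",
        rationale="Contextualizing is a crucial component of systemic "
                  "approaches: it helps explore the boundaries that "
                  "delineate an issue from its external environment.",
    ),
    EvaluationItem(
        question="Are the issues relevant to the users?",
        rationale="Highlights the link between the representation of "
                  "reality in the ICT and the user's real-life experience.",
    ),
]

# Pairing each question with its rationale keeps the "why is this
# systemic?" reasoning visible to whoever adapts the questionnaire.
for item in questionnaire:
    print(f"Q: {item.question}\n   Why: {item.rationale}\n")
```

Keeping the rationale as data, rather than as a detached explanation, mirrors the point made above: justifying each question is itself part of the learning involved in designing the evaluation.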
The outcome of our evaluation of the Virtualis water and agricultural prototypes can be presented as a series of recommendations that are valid for anybody trying to develop the same type of ICTs. They relate to the following themes:
– Systemic visualization of interrelated environmental issues: Illustrating the links between environmental domains and issues can be aided by the creative, innovative representational systems provided by new ICTs. Virtual reality, for instance, can help the user explore an environmental domain and get an overview of how different issues and aspects of this domain relate to each other.
– Links between geographical scales: The progression in learning between
the various types of Virtualis prototypes (e.g. between the personal barom-
eter, focused on individual environmental impacts, and the scenario gener-
ator, focused on aggregated impacts at the society level) allows the users to
understand the context of their impacts and what effects they have, once
combined with other stakeholders’ impacts.
– Links between the specific issue and the broader context: The instructions
given (text, videos, audio, animations) and the Virtual Reality tool in par-
ticular, should clarify the context within which the environmental issue is
being explored. We felt that this contextualization was potentially missing from the prototypes we evaluated and that the integration of hyperlinks to sites that help users better understand the background or context could be a good starting point to correct this shortcoming.
– Social learning processes should allow users to understand the links be-
tween environmental domains through the perceptions of stakeholders
Table I. Evaluation Questions Related to the Links Between Environmental Domains, and Their Relevance From a Systemic Perspective

Issues covered—is the information provided correct, up to date, systemic?
    Here the evaluator identifies whether the prototype offers a reductionist view of environmental domains or whether it allows the users to appreciate the broader, systemic meaning of the issues presented in the prototype.

Are the issues relevant to the users?
    The links this evaluation question is intended to highlight are the links between the representation of reality in the ICT and the real-life experience of the user—how is the prototype highlighting such a link?

Are the environmental issues covered contextualized?
    “Contextualizing” is a crucial component of systemic approaches. In order to appreciate the systemicity of an issue, one can be helped by exploring the boundaries that delineate the issue from its “external environment.”

Concepts used—how do they affect the users’ perceptions of the issue at stake?
    This evaluation question relates to the choice of concepts used to animate the curiosity and interest of the users on the subject. The question is how these concepts bridge the reality of the issues and the perceptions of these issues by the users.

Are meanings clearly explained?
    This evaluation question is intended to assess whether the understanding of what defines the issues and domains explored is both clear and shared. The prototypes should help the users to share their understandings of an issue.

Is the information on the environmental theme precise and up to date?
    The prototypes are intended to address current “priority issues.” We also need to ensure that the ways in which people’s perceptions, understanding, and sensitivity to certain issues evolve with time are taken into consideration.

Are users likely to be motivated to reuse the prototypes?
    This evolution of understanding and ways of addressing an issue can, in part, emerge from using the prototype and through embracing an iterative mode of learning. The question is whether the prototype can be used as an inviting means for people to make sense of various environmental issues and to improve their understanding of them—which might ideally include them being able to appreciate the systemicity of environmental issues.

Are notions such as complexity and uncertainty related to the environmental themes addressed in the prototypes?
    The environmental domain must be presented in a way that is neither simplistic nor reductionist. The prototypes must help in addressing the complex characteristics of the issue discussed, as well as the uncertainties attached to it. The social learning process is intended to acknowledge and face both complexity and uncertainty as important characteristics of environmental issues, to be taken into consideration both in “environmental information” and in environmental decision making processes.

Is the presentation of the issues clear? Does it affect the users’ perceptions of the issues?
    The use of ICTs allows one to represent issues in alternative, creative ways—we therefore believe in the importance of making the best out of these ICTs’ capabilities, in view of both animating the users’ curiosity and opening up their perspectives on the issue.

Is the learning structure presented in the prototype clear and understandable?
    The learning path designed through the prototypes must help the user in developing a systemic cognitive representation of the environmental issue.

Does the prototype provide hyperlinks to useful URLs that contain information on these issues?
    The advantage of using hyperlinks is that the users can concentrate on one main domain, while appreciating the ways in which they are linked to other domains.

Does the prototype provide hyperlinks to other types of similar ICT tools?
    These might represent the four domains in different ways, and this might help the users understand better the links between different types of environmental domains.

Is more than one environmental domain covered by the prototype?
    Although the prototypes focus on one environmental domain and hyperlinks can be used to highlight the links with other issues, the facilitation carried out within the prototype needs to make it explicitly clear that, for instance, environmental domains focused on in different prototypes or suites of prototypes (for instance, water and agriculture) are very closely interrelated.
Here again, these questions can constitute the basis of other evaluation questionnaires for ICTs focused on social learning and participation. Indeed, debating with other researchers, like yourselves, about which other questions might be seen as relevant in the context of the systemic evaluation of participatory processes would be a social learning experience in itself.
The various ways in which social learning is to be promoted in the Virtualis
prototypes are through people’s sharing of knowledge and experience, deliberation,
and collaborative tasks. Certain skills have to be developed through the social
learning process. Thus, meaningful interaction of the form argued by Cobb (1996)
requires some negotiation of meanings, probing one another’s understanding, in an attempt to generalize meaning across different experiences. For social learning
to take place, the prototypes must therefore be tailored appropriately and certain
characteristics of ICTs have to be valued—for instance, the ability of people to use
these tools at a distance and to work on interactive tools. Hypermedia have in fact
been described by many as “enablers of social learning” (Jonassen, 1993, 1996)
in that they can help generate a shift from electronically presenting information to
providing support for the learner in constructing knowledge and deriving meaning.
Table II.

Relevance of the subjects covered and discussed for a varied audience?
    Often, people disagree on environmental issues; they have different objectives, interests, or constraints. For them to become interested in learning together and from each other, not only must the facilitation of these deliberations be carefully designed, but people must also be interested in the subject to start with. The prototype must highlight how relevant the issue is to the users.

Clear instructions concerning the involvement and participation of the users in collaborative learning?
    People from different backgrounds, professions, and views on an environmental issue do not communicate as often as would benefit society. Learning from conflicts is something we are not used to valuing, and therefore people are not used to deliberating and learning from each other in a participatory, nonexclusive way. The instructions included in the ICTs on how users can deliberate, negotiate, and collaborate must be genuinely facilitating. These instructions can really be the key to successful social learning.

Clear learning structures?
    Facilitating social learning processes is one thing; helping the participants in capturing the learning from it is another. Whether they are linear or not, the learning paths designed in the prototypes must help the users in realizing that various learning steps have been “climbed” and that something (learning and potentially practical change) is emerging from this process.

Hyperlinks toward other learning ICT platforms?
    Social learning refers to what is being learnt about but also how learning is taking place. In terms of learning, participation and collaboration come into play not only regarding the exchange of various views on an issue, or the exchange of information and knowledge, but also when people are prepared to learn differently, through different processes. Social learning can therefore also take place when it comes to “learning to learn.”

Possibility of using the ICT learning tool at a distance?
    The type of participation that Virtualis is encouraging could involve people who live far from each other. This is one of the most interesting characteristics of ICTs, and such prototypes should make the most of it.

Identification of targeted users?
    This should help the prototype designers in building in some flexibility (in terms of presentation of the material, type of help provided, examples given, type of learning feedback provided, etc.) and in helping participants to communicate better too.

Relevance of the modes of learning for a varied audience of users?
    There are different ways of learning about the same issue. To retain the motivation and attention of all types of participants, the modes of learning must be varied enough and appropriate to different tastes and needs.

Promotion of autonomous learning?
    For social learning to take place, people must trust each other and also their facilitator. The trust must not only focus on what is learnt and how, but also on people’s belief that their becoming autonomous learners is one of the main outcomes of this social learning process.

Bias taken into account in collaborative learning?
    The existence of bias in the way in which issues and/or perspectives are sometimes presented in society has to be addressed in the context of the social learning process, not only in order to establish an atmosphere of trust but also to demonstrate a rigorous exploration of knowledge and meanings.
Table III. Questions Focused on the Links Between Social Learning and Societal Change
Context of issues discussed in the prototype, and context within which the prototype is used and developed?
    Often, stakeholders do not realize that a particular issue is actually practically relevant to them. Articulating this relevance to them, and helping them understand in which context the ICT platform is used and useful, then becomes crucial.

Clear expression of the objectives of the prototypes, context, and expected outcomes?
    Most users will want to know why they are doing what they are doing in the prototypes—the learning and practical outcomes (changes in practices) constitute the major motivators for the users.

Is the user motivated to go all through the learning experience until he sees a result (reflected through changes of practice)?
    Maintaining the motivation of the users all along is crucial—otherwise the social learning process will be incomplete and meaningless. This can be done both by articulating what has been learnt and how, plus what is going to be explored next, and also by recalling the context within which this exercise is being carried out, i.e. why it is useful to them.

Do the instructions help users realize when they have learnt and how?
    A component of “social learning” lies in helping the participants become autonomous learners. The instructions must help them first by articulating such learning steps and outcomes, and then by helping the participants in reflecting on learning and practices.

Gains in knowledge from the learning experience when using the prototypes?
    Reflecting capacities developed throughout the learning experience must help the users in articulating what was learnt and what changed through the learning processes. Not only should the users be able to reflect on this, but they should also be encouraged to articulate it clearly.

Links between prior knowledge and practice and the outcomes of learning?
    Various pedagogic strands have been developed to describe “social learning” processes. One that we particularly value derives from experiential and student-centred learning: it is important that the users realize (through the design and learning processes facilitated in the prototypes) that their prior knowledge and experience is being valued and genuinely contributes to the social learning process. If too much comes from the “machine,” . . . we are back to a traditional “expert versus nonexpert” learning process.

Common, collaborative construction of meanings?
    Social learning is not only about exchanging knowledge and information but also about learning and working together. From a social learning process, collective outcomes should emerge—the construction of meanings, the concerted decisions to change a way of managing a resource . . . The role of the facilitator/moderator in the prototype then becomes absolutely crucial.

Change of attitudes toward the subject explored as a result of learning through using the prototype?
    This is (potentially) one outcome of the social learning process, one that the moderator must help the users realize. The prototypes could provide information for people to be better equipped to actually change and to overcome the barriers in place that are preventing them from doing so.
This long-term evaluation process was carried out within the temporal and
logistical constraints of the project: the teams that focused on the construction of
the ICTs and those that focused on the pedagogic evaluation of these “tools” had
different ways of learning and working, different time constraints,4 and often used
different “languages.” Having carried out this type of peer evaluation, we conclude that, regrettably, even though it respected some important systemic principles, it was a rather linear, one-sided process. Above all, we learnt that the
systemic dimension we concentrated on had mainly focused on the design, content,
and future use of the prototypes but not enough on the interactions taking place
between the members of our own research team during the construction of the
prototypes. This was partly due to the fact that evaluating our own social learning
processes had become a sensitive issue:
– The coordination of the project proved to be difficult, not only because
people came from different disciplines and had different tasks to focus on
but also because, as in any European research project, many partners
were involved (11 in our case, with at least 2 people involved per host part-
ner institution) and asked to provide deliverables that would complement
each other but which ended up doing so only at the end of the project’s life.
– Evaluating our own social learning process and the difficulties involved
when a genuine social learning process is taking place amongst people
coming from different disciplines and different cultures proved to become
a tricky exercise, from a political perspective. A critique of our learning process was directly synonymous with a critique of our ability to do what we actually researched and . . . preached. We did carry out our evaluation but made sure that it was presented as an outcome of our learning process, something positive to build on—our funders, for their part, were keen to hear that various evaluation processes had taken place throughout the project and that iterative modifications of the ICTs had helped in generating tools that were “nearly perfect.”
– The timing of the delivery of the material to evaluate thus did not allow for any substantial iteration and modification, making the evaluation exercise appear quite linear.
If more time had been allocated to do so, the ICT evaluation process would, in fact, have greatly benefited from involving the prototype designers in the design of the evaluation questions. In effect, we wondered whether the improvement of the prototypes would not have been more effective if we had facilitated the collective design of the evaluation questionnaire with the designer team—rather than asking that team to fill in the evaluation questionnaire once it had been written. We attempted to address this shortcoming in our evaluation methodology in a participatory evaluation session we carried out with potential future users, as described in the next section.

4 The evaluation team was dependent on the production of the tools it was to evaluate and the “ongoing”
be, from a social learning perspective, a mistake. What works and what doesn’t,
from a learning point of view, depends on the learner and the facilitator. The
Open University makes a point of concentrating on student-centred experiential
learning, where the “teacher” is a facilitator. As a complement to participatory eval-
uation processes carried out in the context of other technological developments
(such as those focused on Genetically Modified Organisms, Joss and Bellucci,
2002), the OU Virtualis team therefore focused its participatory evaluation pro-
cesses on enabling certain recommendations and collective outcomes to emerge,
hence empowering the “users” by allowing them to take part in the design of the
(communication) technology itself.
The session therefore focused on
– honouring people’s prior knowledge on how ICTs can promote social learn-
ing and why;
– identifying what evaluation criteria seem to be of particular importance in
a broad brainstorming session organised around three themes
– the relevance of ICTs to improve people’s awareness and participation
in environmental debates and decision-making processes
– the relevance of ICTs to motivate learning processes
– ICTs, social learning and change
– discovering the prototypes in small groups
– filling in the evaluation questionnaires
– discussing the questionnaires and
– identifying ways in which the prototypes could be best disseminated and
used.
The evaluation of the prototypes constituted, in fact, a platform for broader
reflections and recommendations on how ICTs can help promote social learning.
A variety of recommendations emerged from the session.
have different power levels.” Ultimately, what emerged from this first discussion
is that “the question [of whether ICTs can facilitate participation] is impossible
to answer. The technological tools are neutral: what needs to be evaluated is how
they are used—we can only evaluate the actual usage of tools.” The discussion
then focused on three areas. First, tutors were keen to address the question of “participation around what issues?” On the one hand, people acknowledged
that the use of ICTs can help in providing up to date information, from a range of
sources. But they expressed concerns regarding the choice of information provided
to people. Would people trust this information? How relevant would this informa-
tion be to people if they don’t choose it? Will the presentation of information in
ICTs make it appear unimportant because it is part of a “game”? And, besides, would
the use of such tools in the arena of decision making dictate what type of issues,
selected, “frozen” by decision makers, would be discussed? Would the choice of
issues discussed be updated, maintained frequently enough?
This led the participants to discuss who would take part in using such ICT
platforms. The fact that these tools are targeted at the “general public” generated
more scepticism than enthusiasm in the group, as expressed by one of them:
“Are this form of participation and choice of issues equivalent to ‘the others’
telling us (the general public) what to do—while commercial organizations are still
not changing their understanding of environmental issues nor their environmental
practices?” Besides the question of “who are these tools for?,” the question of
“who can they realistically be for?” was raised. While the issue of accessibility
was regarded as less and less of an obstacle, at least in European countries, the
questions of who would be motivated to use such tools and how to bring these issues to the attention of a wide audience were discussed at length. As was stressed,
local (environmental) authorities’ websites—including the ones with consultation platforms—do exist, but that does not mean that people take part in using them. It was
emphasized that, ideally, a maximum user coverage, including corporations, was
needed for an effective use of these prototypes. In other words, the use of these ICTs
by a self-selected audience who would, anyway, be interested in these issues, would
somehow defeat their object—they would “preach to the already converted.” Much
more of a challenge would be to direct the attention of a wide range of people to the
existence of these tools by, for instance, using other media (radio, TV, magazines,
etc.). Interestingly, many tutors emphasized the importance of the “human factor”
in promoting the use of such ICTs. Thus, “young people might come to use these
ICTs by browsing or by ‘word of mouth’ between their peers and school pupils,
excited by the use of ICTs, will help older generations in getting familiar with and
interested in them." So the non-ICT aspect of the initial introduction to the tools was presented as being of major importance, somewhat independently of the way in which the ICTs were designed. The variety of media encompassed in ICTs was therefore
presented as having a positive impact on their users. But it was recognized that
users had to be ICT literate in order to make the participation processes more
492 Simon
effective; failing that, the navigation tools included in the ICTs, the way in which the information is presented, and the interactive tools integrated in them have to genuinely enable people to take part comfortably in the social learning process.
The tutors spontaneously divided the second brainstorming session, which focused on the relevance of ICTs for facilitating learning processes, into two parts:
a) The first part focused on "motivation ingredients" for learning. From a design perspective, the tutors highlighted the fact that the learning instructions and the facilitation within the ICTs need to address the issue of "retention of knowledge and understanding," and that reflective and self-assessment questions therefore need to be included. The temporal dimension of the learning processes was also related to the design of the ICTs, in that the tutors stressed that users need to have time to follow the learning through each prototype. Besides, and strikingly, the principal focus of discussion became "interaction, interaction, interaction" as the best motivation to learn. The main advantage of using ICTs in learning processes was explained by the fact that they facilitate "learning from others": "The interaction component is the crucial motivator for deeper learning." The use of ICTs for doing so was presented as particularly interesting because it is still novel to the users: this means of learning is progressively breaking down barriers long erected between education and leisure, and the learning experiences can be varied, with the user being in control of his/her own learning path.
b) The second part of this brainstorming session covered the characteristics of ICTs that would help in promoting participation and interactive learning. Many advantages of using ICTs as learning platforms were highlighted: they take out some of the tiresome reading, and they offer all sorts of multimedia potential (sound, sight, interactive environments, 3D representations of information, etc.). They were seen by the group as generally giving learning more scope. However, the tutors also highlighted the fact that, in order to make the best use of these new tools, people must be trained to use them, as they would be to use a new language.
In the third brainstorming session, we addressed the issue of transformational change, i.e., how the learning processes experienced in such ICTs could result, for instance, in practical changes in the users' ways of life. The main change that the group highlighted was the fact that "ICTs will certainly change the way in which we socially interact." But whether these changes are for the better or for the worse depends on the overall framework and societal strategy within which they take place. As one tutor stressed, "affordable housing is high on the agenda in the UK, but what about helping with making computers affordable to more people?" This would help bring new learning opportunities to previously excluded groups (disabled people, people for whom the cost of higher education is prohibitive, etc.). Besides, the use of ICTs as educational/learning tools will have to be introduced carefully since, whether these ICTs are supposed to be participatory or not, "current evidence appears to suggest that the use of ICTs in education reinforces societal divisions."
The prototype worked beautifully and the navigation was faultless. The com-
ments therefore focused more on the content, assumptions, guidance, and presen-
tation of the material.
Interestingly, most comments focused on the crucial need to contextualize both the content of the learning and the learning tool itself. In other words, the reason(s) why any stakeholder might be interested in using these tools, how they would be invited to do so, why they would be interested in focusing on an environmental domain such as water when they perceive that there is enough water as it is in a country like the UK, etc., needed to be presented, if not in the prototypes themselves, at least in a "learning support" that would introduce them. Using participants' knowledge, as opposed to mere data and information, was also highlighted as one of the main issues.
The users also found the prototypes relatively dry, and it is noticeable that, even amongst older generations, expectations for up-to-date graphics and sounds in such ICT tools have become very high.
5. CONCLUSION
Evaluating ICTs from a systemic perspective can mean different things. One fundamental principle of evaluation is to reflect carefully on the context within which these tools are going to be used, by whom, why, and to what extent these ICTs can offer what other tools apparently cannot. In this paper, we have illustrated how both a peer systemic evaluation process, within the research team, and a participatory evaluation process, involving potential future users of the ICTs, were formulated and carried out on this basis.
ACKNOWLEDGMENTS
Thanks to James Fisher and Susan Ballard, who contributed to the Virtualis research project at the OU, as well as to Wendy Fisher, who helped with the organization of the evaluation session in Leeds.