Tuesday pre-conference workshops

Tuesday 9 September 2014

A printable version (PDF) of the pre-conference workshop program can be downloaded here.
View Monday program here.

Click on the workshop titles to view the workshop description.

9am – 12:30pm

Data visualisation in evaluation (full day)

Ellen Vasiliauskas

> register


First, Do No Further Harm (full day)

Gill Westhorp

> register

Unleashing the power of Story in evaluation: Performance stories, episode studies and the most significant change technique
(full day)

Jess Dart; Lee-Anne Molony

> register


12:30pm LUNCH

1:30 – 5pm

An Introduction to the Essential Competencies for Program Evaluators
(half day)

Jean King

> register

Designing and Embedding Strategic Learning and Performing Systems for Organisations, Departments, Programmes and Projects
(half day)

Annalize Struwig; Kate Averill

> register

[Full-day workshops continue: Ellen Vasiliauskas; Gill Westhorp; Jess Dart and Lee-Anne Molony]


Data visualisation in evaluation

presented by Ellen Vasiliauskas; d-sipher, Sunshine Coast, Queensland, Australia

Ellen is an experienced evaluator, having project managed, designed and undertaken major evaluation and market research projects for over 20 years for large and small government agencies, not-for-profits, and the private sector. Her particular interest is in data visualisation and logic modelling of complex projects to assist in applying evaluation learnings in strategic and policy contexts. She holds a Master's in Neuro-Linguistic Programming and is a Master Business and Personal Coach.

In this workshop, participants will gain an overview of the meaning of data visualisation as it applies to evaluation and current trends. They will learn techniques for applying data visualisation to reporting and presentation of results in evaluation. The benefits of these techniques are many – from more effective communication of results and clarity on the meaning of large amounts of complex data to better engagement of decision makers and recall of information.

However, not everything that looks good communicates meaning. This workshop will guide participants through the good and bad of data visualisation and applications to quantitative and qualitative data and reporting. Participants will gain knowledge in various methods and processes of data visualisation in quantitative and qualitative applications. Techniques will include modelling, developing visual maps, the use of metaphor and storytelling, use of colour to convey meaning, graphical presentation, mind maps, storyboards and so forth.

The workshop will explore:

  • Definitions of data visualisation as it applies to evaluation.
  • The case for data visualisation and the needs of stakeholders.
  • How audiences absorb data and information.
  • How does data become clear, intuitive and even fun?
  • What makes for clear data visualisation and what makes things murky?
  • Strategies for enhancing visual presentation and reporting.
  • Rules for charts and simple quantitative presentation formats.
  • Options for visualising qualitative data.

The aims of this workshop are:

  • to understand broad directions and trends in data visualisation
  • to understand the meaning of data visualisation as it applies to evaluation
  • to understand the relevance of data visualisation and needs of evaluation audiences and stakeholders
  • to understand the difference between good and bad data visualisation
  • to understand fundamental rules for charts and simple quantitative presentation formats
  • to explore options for visualising qualitative data
  • to be aware of the resources available in data visualisation.

Learning strategies will include individual and group exercises as well as whole-group Q&A. Participants are invited to bring their own quantitative or qualitative data visualisations to discuss; selected examples will be used to apply the techniques learnt in group exercises. Target group and prerequisites: the workshop is pitched at beginners and those new to data visualisation.

This workshop will enhance participants’ capability to more effectively communicate evaluation findings and engage with their stakeholders.

> back to overview  > register

First, do no further harm

presented by Gill Westhorp; Director of Community Matters Pty Ltd

Community Matters is a consultancy firm specialising in realist research and evaluation methodologies. Gill is also a University Fellow at Charles Darwin University and an Associate in the School of Global, Urban and Social Studies at RMIT University.

One of the contributions of realist methodologies in evaluation has been highlighting that programs work differently for different people. Net program impacts are by definition an average – implying that for some people, the results are less positive or indeed negative. Negative impacts are sometimes concentrated amongst the most disadvantaged. This is a particular dilemma in social services programs which seek to improve outcomes for disadvantaged groups.

The purpose of the workshop is to begin to address a vexed question: how can evaluation assist policies and programs to avoid doing harm to disadvantaged groups?

In the first session, some of the evidence that programs can and do ‘do harm’ will be presented, and principles of complexity theory and realist evaluation will be introduced. The second session will sketch an original theoretical framework for understanding how and why social programs can make things worse for the disadvantaged. Workshop participants will be asked to use this framework to sketch how a policy or program they work with might cause further harm. Principles for programs to avoid doing further harm will then be proposed and participants will be asked to sketch how their policy or program might be modified to avoid doing further harm. In the third session, a stakeholder simulation and a rotating panel of experts will be used to develop ideas for evaluation approaches and roles for evaluators that might be used to assess whether programs are doing further harm, to whom, and how programs might be modified to avoid harm and build benefits. In the final session, the implications for three steps in evaluation design (evaluation questions, program theory and data) will be examined, using activities grounded in the participants’ own policies or programs.

This workshop is suitable for intermediate and advanced evaluation practitioners and for commissioners with a solid understanding of evaluation practice.

> back to overview  > register

Unleashing the power of Story in evaluation: Performance stories, episode studies and the most significant change technique

presented by Dr Jess Dart; Clear Horizon Consulting; Melbourne, Australia and
Lee-Anne Molony; Clear Horizon Consulting; Melbourne, Australia

A good story defines relationships, a sequence of events, and cause and effect, and those elements are likely to be remembered as a complex whole. If stories about the impact of a program can infiltrate the collective memory of an organisation or community, the members will gain and retain a more deeply shared understanding of what is being achieved. This not only helps communicate achievements but also creates a common base for dialogue about what is desirable in terms of expected and unexpected impact.

This workshop introduces and explores three contemporary approaches to using story in evaluation: performance stories, episode studies and the most significant change technique. The techniques work at very different scales, from the story of a whole program, to the story of a policy change, to the story of an individual beneficiary. Regardless of scale, all three harness the structure and depth that story offers and can breathe life into a mixed-method evaluation process.

Story techniques can form a key part of an effective monitoring and evaluation plan, or an evaluation study. They can draw out changes that are intangible and emergent, and work particularly well in complex programs. They also complement more indicator-based approaches to monitoring.

Using participatory techniques, this full day workshop is aimed at beginner to intermediate participants. It provides an overview of each method, examples of how they have been used and explores their differences and the contexts in which they are best applied.

> back to overview  > register

An Introduction to the Essential Competencies for Program Evaluators HALF-DAY (pm)

presented by Jean King; Distinguished Teaching Professor, Department of Organizational Leadership, Policy and Development, University of Minnesota

This workshop is designed to teach participants the Essential Competencies for Program Evaluators, a set of knowledge, skills, and attitudes in six categories. The session will begin with the analysis of program evaluation vignettes representing diverse areas of practice to show both the common competencies across settings and those unique to specific contents or contexts. Following a brief history of how the competencies were developed, the session will then examine the competencies in all six categories: professional practice, systematic inquiry, situational analysis, project management, reflective practice, and interpersonal skills. This discussion, which builds on the continuum of interpersonal evaluation practice, will ground participants in the competencies’ content and allow people to ask questions as they think about their own evaluation practice. After a short break, participants will develop concept maps to explore how the competencies make sense in their roles or content areas. Comparative discussion will further illuminate the competencies, and then participants will complete a self-assessment tool and discuss how to set priorities and action steps for professional development. Most of the session will consist of interactive exercises with just enough lecture to frame the discussion.

You will learn:

  • The Essential Competencies for Program Evaluators
  • How to assess your own competencies to identify knowledge and skill gaps
  • How to improve your practice through strength-based reflection

> back to overview  > register

Designing and Embedding Strategic Learning and Performing Systems for Organisations, Departments, Programmes and Projects HALF-DAY (pm)

presented by Annalize Struwig; Evaluation Consult; Wellington, New Zealand and
Kate Averill; Evaluation Consult; Wellington, New Zealand

This workshop focuses on key principles and practical tools for designing and embedding strategic learning and performing systems within organisations, programmes and projects. This systematic, innovative approach utilises transformative evaluative thinking to enhance connections between planning, implementation and management processes. The value of linking strategy and evaluative techniques within organisations, programmes and projects is demonstrated using practical examples.

Participants will gain knowledge and understanding of how to design and embed learning and performing systems within organisations, programmes and projects using a combination of systems mapping, integrated evaluative design and management, and organisational psychology. The workshop focuses on practical tools for designing integrated learning and performing systems, including organisational culture, competencies and capability pathways to extend evaluative design and capability within organisations, programmes and projects. The half-day workshop will be interactive and is aimed at managers, programme personnel and evaluation practitioners. The implications of mainstreaming transformative evaluative practices, and the emerging role of specialist evaluation practitioners, will be discussed.

> back to overview  > register