
Workshops

2013 Canadian Evaluation Society Conference – Toronto | June 9 – 12

CES Professional Development Workshops

Workshops are held on Sunday, June 9, 2013
Language of presentation is English unless otherwise noted

Download Print-Friendly Version →

 

Competencies for Canadian Evaluation Practice

Workshop facilitators identified up to three Evaluation Competencies to be addressed in their workshop. Click to view a summary of the Competencies for Canadian Evaluation Practice.

Full Day Workshops

(1) Empowerment Evaluation – Beginner
(2) Handling Data from Logic Model to Final Report – Intermediate
(3) Using Innovative ICT Tools for Effective Evaluation of Social Impact – Intermediate
(4) CANCELLED: Web Analytic Toolkit for Evaluators – Intermediate
(5) Testing the Logic in Logic Models: Lessons of Experience – Intermediate
(6) Causal Inference for Qualitative and Mixed Methods – Intermediate
(7) Rapid Impact Evaluation – Intermediate/Advanced
(8) Lessons from Using Culture-based Approaches in First Nations Settings – Advanced

Morning Workshops

(9) CANCELLED: Making Your Case: Needs Assessments that Get Results
(10) Project Management for Evaluators – Beginner
(11) Evaluation of Training Programs – Intermediate
(12) CANCELLED: SOAR: A Unique Approach in Strategic Planning – Intermediate
(13) Building Evaluation Capacity and Culture – Intermediate
(14) CANCELLED: Development of Logic and Process Models to Identify Performance Indicators (In French) – Intermediate
(15) Conducting Research on Evaluation and Getting Published – Intermediate
(16) Advanced Issues in Evaluation Survey Research and Design – Advanced

Afternoon Workshops

(17) Demonstrating Research Impact – Beginner
(18) Using Qualitative Data Analysis Within and Across Settings – Beginner
(19) CANCELLED: An Executive Summary is Not Enough – Beginner
(20) Smart Data Visualization – Intermediate
(21) Write to be Read – Intermediate
(22) CANCELLED: International Development and Accounting for Change
(23) Process of Learning in a Developmental Evaluation – Intermediate/Advanced
(24) Advanced Approaches to Evaluation Frameworks – Intermediate / Advanced


1. Empowerment Evaluation

Empowerment evaluation builds program capacity and fosters program improvement. It teaches people how to help themselves by learning how to evaluate their own programs. The approach is guided by process use: the more that people conduct their own evaluations, the more likely they are to find their findings and recommendations credible, and the more likely they are to use them. Key concepts include the critical friend, cycles of reflection and action, and a community of learners.

The basic steps of empowerment evaluation include: (1) Establishing a mission; (2) Taking stock – creating a baseline; and (3) Planning for the future – establishing goals and strategies to achieve objectives. Actual performance is compared with benchmarks and goals.

Employing lecture, activities, demonstration and discussion, the workshop will introduce you to the steps of empowerment evaluation and tools to facilitate the approach. It will also use case studies to highlight how empowerment evaluation produces measurable outcomes.

You will learn:

  • Basic steps of empowerment evaluation.
  • Key concepts guiding the approach (focusing on accountability).
  • Selecting appropriate technological tools to facilitate an empowerment evaluation.

Evaluation competencies to be addressed include:

Dr. David Fetterman is the founder and major proponent of empowerment evaluation. He is the author of three books on the topic and published Empowerment Evaluation in the Digital Villages: Hewlett Packard’s $15 Million Race Toward Social Justice (Stanford University Press). David maintains the Collaborative, Participatory, and Empowerment Evaluation blog for the American Evaluation Association (AEA). He is a past president of the AEA and the recipient of its highest honors in theory and practice.

Level: Beginner
Prerequisites: None
Scheduled: Sunday June 9th from 9:00am – 4:00pm


2. Handling Data from Logic Model to Final Report

This workshop has drawn large audiences for the last three years at the Summer Institute sponsored by the Centers for Disease Control and the American Evaluation Association. Learn how to collect, analyze, and present data from complex evaluation studies in ways that are feasible for the evaluator and meaningful to the client. Based on her more than twenty-five years of consulting experience, Gail will share some hard-won lessons about how to interact with stakeholders, ask the right questions, collect the right data and analyze and present findings in useful ways.
You will have the opportunity to work in small groups to tackle some common data collection and data handling problems. Actual work samples will be provided. At the end of the workshop, you will take away some fresh ideas and a number of useful tools and techniques for application in your program evaluation context.

You will learn:

  • How to use a logic model as a study scaffold.
  • How to ask the right evaluation questions and to develop a data collection plan.
  • How to analyse, summarize, integrate and report data.

Evaluation competencies to be addressed include:

Gail Barrington was recently named to the Dynamic Dozen, the top-rated presenters at the American Evaluation Association. For more than 25 years, she has owned and managed her consulting firm, Barrington Research Group, Inc. Gail has conducted over one hundred program evaluation and applied research studies, from the federal to the grassroots level, and has developed many ways to collect, organize and analyze complex data and to prepare excellent reports. She recently published Consulting Start-up & Management: A Guide for Evaluators & Applied Researchers (SAGE, 2012).

Level: Intermediate
Prerequisites: Some evaluation experience, elementary knowledge of logic models and their use, and some knowledge of qualitative and quantitative analysis and report writing
Scheduled: Sunday June 9th from 9:00am – 4:00pm


3. Using Innovative ICT Tools for Effective Evaluation of Social Impact: Case Studies from Africa and Live Demonstration

This workshop will introduce you to innovative ICT (Information and Communication Technologies) tools and techniques to measure and report project / programme outcomes to your stakeholders (e.g. donors, funders, supervisors or the general public). You will become familiar with the components of an effective monitoring and evaluation plan using ICT, and methods and tools to conduct data collection, statistical analysis and reporting.

Through highly interactive and personalized coaching, we will cover the following subjects: differentiating supervision, monitoring & evaluation, and research; components of a successful monitoring & evaluation plan; ICT-based M&E and advanced ICT tools (e.g. mobile phone and tablet PC based data collection, storage and dissemination); data types and data collection challenges; quantitative and qualitative methods; and writing an effective research publication and M&E report.

You will learn:

  • Use of ICT tools for data collection and analysis.
  • Use of GIS for mapping targeted interventions.
  • Concepts of M & E, Research and Impact Evaluation.

Evaluation competencies to be addressed include:

Dr. Valentine J Gandhi is a development economist and knowledge manager with 11 years’ experience in Asia and Africa, working at both the policy and grassroots levels. He is the founder of The Development CAFÉ. Valentine is also a consultant for several international donors and UN agencies on impact evaluation, organizational capacity building, team building and gender training.
Mrs. Vida Razavi is a sociologist who integrates ICT-based tools in her research, particularly on activating empathy among school children. She is editor of The Development Review, an international journal run by The Development CAFÉ.

Level: Intermediate
Prerequisites: Some evaluation experience, elementary knowledge of logic models and their use, and some knowledge of qualitative and quantitative analysis and report writing
Scheduled: Sunday June 9th from 9:00am – 4:00pm


4. Web Analytic Toolkit for Evaluators

NOTE: This session is no longer offered

This workshop will provide an introduction to web analytics and their use in program evaluation. Participants will gain an understanding of how web analytics can be used in program monitoring and evaluation, learn about common indicators and metrics and how to create reports and dashboards, and gain first-hand experience setting up web analytics.
The workshop is designed for program evaluators with little to no experience with web analytics, who would like to expand their capacity to incorporate web and social media data into their monitoring and evaluation work. The workshop will also appeal to program managers who want to better understand how these metrics can be used to optimize program performance.

You will learn:

  • How to apply web analytics in program monitoring and evaluation.
  • Common indicators and metrics available in web analytics.
  • First-hand experience setting up web analytics.

  


5. Developing and Testing the Logic in Logic Models: Rules and Lessons of Experience

The workshop is designed to explore the use of logic models in program and project planning and evaluation. Various types of logic models, and the commonly used definitions, will be presented and explored. A set of tests developed to assist in verifying the consistency of logic models will be presented, and participants will apply each test to their own logic models or to logic models developed by the instructor. Participants will also have the opportunity to add their own logic model tests and explore them with other participants. Examples from health, international development, and agriculture will be used, and approaches commonly used by governments, not-for-profits and other agencies will be discussed.

You will learn:

  • To describe the various types of logic models.
  • To identify ways to test the logic.
  • To apply the tests to programs in various sectors: health, international development, etc.

Evaluation competencies to be addressed include:

Dr. Harry Cummings teaches programme evaluation at the University of Guelph. He has run workshops for professional development purposes for 20 years and has regularly presented and run workshops at CES and EES conferences. He has published on logic models and results based evaluation. He recently (2012) designed the two day logic model course for CES.

Level: Intermediate
Prerequisites: Some experience with logic models
Scheduled: Sunday June 9th from 9:00am – 4:00pm


6. Causal Inference for Qualitative and Mixed Methods

“Causation: The relation between mosquitoes and mosquito bites. Easily understood by both parties but never satisfactorily defined by philosophers and scientists.” – Scriven (1991).
Many people argue that causal inference simply can’t be done without large-scale quantitative studies, high-powered statistical techniques, and the ability to control the program or intervention. But aren’t there ways to get an approximate answer to the causal question, even using qualitative or mixed method evidence? I think there are – and that’s what this workshop is about.

This workshop will demonstrate eight practical, common-sense strategies to build an evidence base for causal contribution: (1) Ask observers; (2) Match content to outcomes; (3) Modus operandi; (4) Logical timing; (5) Dose-response link; (6) Comparisons; (7) Control variables; (8) Causal mechanisms. You will learn how a judicious mix of evidence can be woven to build a case for a causal claim – to a level of certainty that makes sense in that context.

You will learn:

  • Eight practical, common sense strategies for causal inference.
  • How to build causal elements into interview and survey questions.
  • Using multiple sources of evidence to build a case for causal contribution.

Evaluation competencies to be addressed include:

Dr. E. Jane Davidson is the author of Evaluation Methodology Basics: The Nuts and Bolts of Sound Evaluation (2004, Sage Publications) and Actionable Evaluation Basics: Getting succinct answers to the most important questions. She runs a popular blog, GenuineEvaluation.com, with Dr. Patricia Rogers; together they share a commitment to improving the quality of evaluation and an unwillingness to accept credentials or power as a substitute for quality. Jane delivers keynote addresses and professional development workshops internationally, as well as webinars and other evaluation learning opportunities online at http://RealEvaluation.com. She won the American Evaluation Association’s Marcia Guttentag Award in 2005.

Level: Intermediate
Prerequisites: At least some practical experience with conducting, commissioning, or using outcome evaluation
Scheduled: Sunday June 9th from 9:00am – 4:00pm


7. Rapid Impact Evaluation

To date, rapid impact evaluations (RIAs) have required less than two months, have cost under US$25,000, and have performed well on the usual tests of validity and reliability. RIA can be used ex post for summative purposes or ex ante as part of formative or developmental evaluations. RIA triangulates the expert judgment of three different classes of experts to arrive at judgments of the incremental change in effects attributable to the intervention. It employs the ‘negotiated alternative’, a new approach to counterfactuals, as well as a simplified approach to measurement. RIA is designed to achieve high levels of use and influence and can result in highly valid and reliable evaluation judgments of the incremental change in targeted impacts attributable to the intervention.

This workshop introduces the RIA approach to impact evaluation using small groups and group discussion interspersed with short descriptions provided by the facilitator.

You will learn:

  • Principles of the Rapid Impact Evaluation approach.
  • Principles of evaluation in natural resource and sustainable development settings.
  • How a method can be successfully designed to improve prospects for use and influence.

Evaluation competencies to be addressed include:

Andy Rowe has over three decades of experience in evaluation and has worked in North America, Africa, Asia and Europe. He is a former president of the Canadian Evaluation Society and holds a PhD from the London School of Economics. He now works mainly in natural resource settings and has developed several innovative methods to address gaps in evaluation practice, including his Rapid Impact Evaluation approach.

Level: Intermediate / Advanced
Prerequisites: Experience and knowledge of different purposes of evaluation and familiarity with evaluating outcomes
Scheduled: Sunday June 9th from 9:00am – 4:00pm


8. Lessons from and the Practice of Using Culture-based Approaches in First Nation Settings

Utilizing culturally focussed facilitation methods such as open space, photo visualization, group discussion, wax modelling, and breathing techniques, this workshop will review appropriate Aboriginal evaluation methods and discuss how to overcome difficulties. We will also review the various ways in which cultural ways of knowing and technology can be used to enhance Aboriginal evaluations by examining our past experiences. We will then review a culture-based evaluation success story, and participants will investigate ways to enhance the evaluation with cultural teachings and technology. Finally, we will look at our experience and lessons learned from the field, review reports in which we have used cultural knowledge and technological evaluation tools, and discuss how these methods could enhance your own reports.

You will learn:

  • Appropriate Aboriginal evaluation methods and how to implement evaluations that work in First Nation settings.
  • Various methods for using technology in evaluation.
  • Success stories in culture-based evaluation.

Evaluation competencies to be addressed include:

Andrea L. K. Johnston, CE, is CEO of Johnston Research Inc. With 15 years of Aboriginal evaluation experience, she has managed over 120 local, regional and national projects. Andrea is recognized as an expert in the use of Aboriginal indigenous knowledge in evaluation. In 2010 she received the CES-ON Excellence in Evaluation Award. She was guest editor for the Canadian Journal of Program Evaluation (CJPE) Winter 2010 edition and authored two papers: ‘Aboriginal Ways of Knowing: Aboriginal-led Evaluation’ and ‘Using Technology to Enhance Aboriginal Evaluations’. Andrea currently chairs the Board of the CES-Ontario Chapter.

Lori Meckelborg, B.A. (Hons.), is an experienced researcher with over ten years of experience in project management and consulting in an Aboriginal context. Lori’s work has focussed on population health, program evaluation and strategic planning. Her interests include participatory approaches to program evaluation, ethics in evaluation and culturally responsive evaluation.

Level: Advanced
Prerequisites: At least two years of evaluation experience and an understanding of basic evaluation methodology practices
Scheduled: Sunday June 9th from 9:00am – 4:00pm


9. Making Your Case: Needs Assessments that Get Results

NOTE: This session is no longer offered

Do you need a refresher on conducting needs assessments, or want to learn how? Make your needs assessment more comprehensive and compelling by conducting one that covers all the bases, provides a strong argument for support, and yields the best program design possible.

You will learn:

  • The role of needs assessments in program planning.
  • The 6 D’s required for a comprehensive needs assessment.
  • The most common sources of data for a needs assessment.

10. Project Management 101 for Evaluators

In a time when resources are strained and organizations are required to produce more with less, project management is the key to delivering projects in a timely and cost-efficient manner. Using a case study format, this workshop will provide evaluators new to project management with practical tools and techniques to identify and manage project risks and to plan and manage schedules and resources, both human and financial.
Topics to be covered include the project life cycle (i.e. project initiation, planning, execution, closing), project constraints, and the ‘project management triangle’ (i.e. time, cost, scope).

You will learn:

  • Project life cycle phases and how to identify the logical sequence of activities/processes to facilitate accomplishment of project goals.
  • To apply tools and techniques to address project constraints such as scope, time and budget.
  • To define activities, determine sequencing, estimate resource requirements, estimate duration of activities and develop a time-phased project schedule.

Evaluation competencies to be addressed include:

Judy Lifshitz, M.S.W., P.M.P., C.E., is an Evaluation Manager in the Office of Audit and Evaluation at Agriculture and Agri-Food Canada. Her responsibilities include evaluation planning, conducting and reporting. Judy is a member of the CES National Capital Region Professional Development Committee and Core Mentoring Working Group.

Level: Beginner
Prerequisites: None
Scheduled: Sunday June 9th from 9:00am – Noon


11. Evaluation of Training Programs

Often we read that participants enjoyed a training program. While participants’ affective responses to the program are important, how do we know the training program worked to impart knowledge, change behaviours and meet its designed outcomes?

This intermediate-level workshop will build interactively on participants’ knowledge of logic models and introduce them to Kirkpatrick’s and Guskey’s levelled approaches to training program evaluation. Hybrid approaches to evaluating training programs, using logic models and the levelled approaches as frameworks, will be explored. These approaches allow a fuller look at the effects of training programs, providing funders and program managers with many sources of data on which to base their decisions. Participants will develop throughout the session, and take away, a comprehensive evaluation plan based on their learnings and prior experience. Data collection, analysis and visualization approaches to match the various stages/levels of the hybrid approaches will also be discussed and demonstrated.

You will learn:

  • The basics of Kirkpatrick’s and Guskey’s levelled approaches to training program evaluation.
  • How to integrate logic models and the levelled approaches into hybrid approaches that will be useful in evaluating training programs.
  • To develop an evaluation plan for a comprehensive evaluation of training programs.

Evaluation competencies to be addressed include:

Dr. Sid Ali, CE, is an experienced educator and researcher with expertise in educational measurement and program evaluation. He earned an M.Ed. and Ph.D. in Measurement and Evaluation from the University of Toronto. Sid is a strategic thinker and has conceptualized, implemented and managed multiple large-scale program development and evaluation projects.

Level: Intermediate
Prerequisites: Knowledge of logic models and basic program evaluation approaches
Scheduled: Sunday June 9th from 9:00am – Noon


12. SOAR – Building a Strengths-based Strategy: A Unique Approach in Strategic Planning

NOTE: This session is no longer offered

This workshop will provide an overview on an alternative approach to the strategic planning process. This approach, known as SOAR – Strengths, Opportunities, Aspirations and Results, is a strategic planning framework that focuses on strengths and attempts to understand the entire organizational interaction by including the voices of relevant stakeholders. At the conclusion of the workshop, participants will have a better understanding of the approach as well as the tools that can be used within their own organizations.

You will learn:

  • To develop a better understanding of how to frame strategic planning.
  • To understand the differences between SWOT and SOAR.
  • To develop new techniques to identify organizational strengths.

13. Building Evaluation Capacity and Culture

Having a supportive organizational culture is essential to the advancement and integration of evaluation. This workshop – drawing on recent literature from Mayne, Kim, Boyle and Lemaire, and Sonnichsen – will present the conceptual basis for evaluation capacity and culture. A synthesis of the literature on ‘Understanding Organizational Capacity for Evaluation’ (CJPE, Vol. 23(3)) will be integrated into the concepts. A model to assess and compare capacity and culture will be presented.

Participants will be asked to assess themselves and their organizations using the model. Issues such as where the evaluation regime is located, how evaluation is linked with strategic planning and budgeting, how evaluation is used in decision-making and how demand for evaluation is created will be part of the diagnosis of evaluation culture. Participants will be given a workbook that will enable them to assess their own settings and develop strategies to take back to their workplace.

You will learn:

  • The importance of an evaluation culture to the practice of evaluation.
  • To understand the need to use capacity to build culture.
  • How to build strategies that can be applied to building a culture.

Evaluation competencies to be addressed include:

Dr. Kaireen Chaytor, CE, served on the National Council of the Canadian Evaluation Society (CES) and as president of the Nova Scotia chapter of CES. The Canadian Evaluation Society recognized Kaireen in 2003 with the national award for contribution to theory and practice in evaluation and in 2011 installed her as a Fellow of the Canadian Evaluation Society. As an evaluation consultant she has conducted evaluations for federally and provincially funded projects and for the non-profit sector.
Dr. Nancy Carter is Director, Evaluation Services for the Nova Scotia Health Research Foundation where she provides guidance and advice to catalyze evaluation initiatives for Nova Scotia’s health system. Nancy holds a Ph.D. in Organizational Behaviour and Human Resource Management from the University of Toronto’s Rotman School of Management. She has facilitated workshops at previous CES conferences and other professional peer reviewed conferences. Nancy currently teaches an Introduction to Evaluation course she developed for a local regional health authority.

Level: Intermediate
Prerequisites: Some experience attempting to integrate evaluation into organizations or knowledge of organizational learning
Scheduled: Sunday June 9th from 9:00am – Noon


14. Performance measurement and evaluation: The development of logic and process models to identify performance indicators

– Presented in French

NOTE: This session is no longer offered

Based on the content of courses in evaluation and performance measurement delivered at l’ÉNAP in Gatineau, the workshop consists of three sections:

1. A theory-based approach to development of an integrated evaluation and performance measurement framework, with a focus on practical issues in the formulation of assumptions for research, the identification of relevant indicators and the development of integrated performance measurement and evaluation strategies.
2. The illustration of the integrated approach discussed in Section 1, using an actual Performance Measurement Framework (PMF) developed for the Employment Benefits and Support Measures (EBSMs) at Human Resources and Skills Development Canada (HRSDC). This PMF received an award from HRSDC, and the framework developer received a DM Award of Excellence for the quality of the document and its contribution to the EBSM Renewal.
3. A participatory exercise in the development of limited versions of PMFs, pre-selected to deal with specific challenges, allowing participants to test their skills and exchange lessons learned.

You will learn:

  • The distinction between evaluation, audit and management theories, and how performance measurement relates to them within the context of Results-Based Management.
  • An integrative approach to performance measurement through the formulation of change theory and the use of modelling techniques to support development of strategic and operational plans for public interventions.
  • To identify and select performance indicators to inform and support implementation of government policies, programs and initiatives, and to respond to broader organizational needs and reporting requirements.

15. Conducting Research on Evaluation and Getting Published

Many evaluators have the research bug, but don’t get around to conducting and publishing research on evaluation. This workshop is designed to help get evaluators moving. It demonstrates the need for more empirical research (and conceptual development) on evaluation. Participants will learn how to conduct research on evaluation using hands-on, experiential, small-group work. They may bring their own research questions to the table and benefit from the input of peers on how to go about conducting their study. Finally, participants will receive guidance and tips from the editor of the Canadian Journal of Program Evaluation on how to prepare manuscripts for publication in peer reviewed journals.

You will learn:

  • The latest and greatest on the state of empirical research on evaluation.
  • How to conduct research on evaluation.
  • How to prepare a manuscript for peer reviewed publication.

Dr. Robert Schwartz is Editor-in-Chief of the Canadian Journal of Program Evaluation and has published several peer-reviewed articles in the areas of evaluation, performance measurement, accountability and public health policy. He delivered a well-received presentation on getting published at CES Halifax 2012 and has delivered professional development workshops in several countries with consistently excellent participant feedback.

Level: Intermediate
Prerequisites: None
Scheduled: Sunday June 9th from 9:00am – Noon


16. Survey Research for Evaluation: Advanced Issues in Design and Implementation

Survey research is ubiquitous in evaluation. While this method is extremely flexible and highly useful, it raises a number of issues that need to be faced to ensure useful results. The purpose of this session will be to review and discuss four advanced topics: (1) Moving from the evaluation framework to planning the survey (i.e. operationalization of concepts, sampling, scales, budgeting); (2) Ensuring the meaning of questions is shared (i.e. pretesting and translation); (3) Approaches to enhancing response rates; and, (4) Assessing survey research conducted by others. Facilitators will structure a brief presentation of each topic, frame the issues encountered and provide solutions they have identified in the literature and within their own practice. Participants will be expected to contribute with their own experience and examples, and to apply critical thinking throughout the seminar.
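As a rough illustration of the planning arithmetic behind topics (1) and (3) above – sampling, budgeting and response rates – here is a minimal sketch of a standard sample-size calculation (Cochran's formula with a finite-population correction), written in Python purely for illustration; the population size, confidence level and response rate shown are hypothetical assumptions, not workshop materials.

```python
import math

def required_sample_size(population: int, margin_of_error: float = 0.05,
                         z: float = 1.96, p: float = 0.5) -> int:
    """Cochran's sample-size formula with finite-population correction.

    z = 1.96 corresponds to a 95% confidence level; p = 0.5 is the most
    conservative assumption about the response proportion.
    """
    n0 = (z ** 2) * p * (1 - p) / margin_of_error ** 2   # infinite-population size
    n = n0 / (1 + (n0 - 1) / population)                  # finite-population correction
    return math.ceil(n)

def invitations_needed(sample_size: int, expected_response_rate: float) -> int:
    """Gross up the sample for anticipated non-response (a budgeting input)."""
    return math.ceil(sample_size / expected_response_rate)

if __name__ == "__main__":
    n = required_sample_size(population=2000, margin_of_error=0.05)
    print(n)                              # about 323 completed questionnaires
    print(invitations_needed(n, 0.35))    # invitations to send at a 35% response rate
```

Grossing the required sample up by the expected response rate is one simple way the sampling, response-rate and budgeting topics connect in practice.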

You will learn:

  • How to move from evaluation planning to questionnaire design and budgeting.
  • How to build questionnaires for maximum reliability and validity.
  • How to maximize response rates.

Evaluation competencies to be addressed include:

Benoit Gauthier, CE, has contributed to over 100 evaluations since the beginning of his career and has developed several evaluation frameworks. Mr. Gauthier has taught several courses for l’École nationale d’administration publique. He and Dr. Simon Roy have co-authored several articles for the Canadian Journal of Program Evaluation.
Dr. Simon Roy, CE, has conducted more than 60 program evaluations since 1995. Simon developed the first CES Logic Model Course and co-authored the latest CES Survey Course. Simon teaches program evaluation and received the CESEF award for Contribution to Research on Evaluation Practice.

Level: Advanced
Prerequisites: Participants should have at least two years of evaluation experience. It is also expected that participants will have designed or implemented some surveys of their own
Scheduled: Sunday June 9th from 9:00am – Noon


17. Demonstrating Research Impact: Measuring Return on Investment with an Impact Framework

Impact evaluation informs decision making for a multitude of programs and policies. In health research, the task for stakeholders (e.g. researchers, research institutions, funders, and knowledge-users) is to demonstrate the impacts of supporting health research. The Canadian Academy of Health Sciences (CAHS) developed a framework to assist with demonstrating research impacts, which has been applied successfully across Canada and internationally.

This workshop will introduce participants to the CAHS framework and its components. Participants will learn about the model itself and an extensive list of indicators developed to evaluate health research impacts based on this framework. Practical examples and small group exercises will teach participants how to apply the framework to their own work. While developed for health research, the framework is applicable to other forms of research. Participants will receive a workbook useful for taking their learning back to their workplace to share and use in future work.

You will learn:

  • Basic understanding of the concepts of impact evaluation.
  • The Canadian Academy of Health Sciences’ (CAHS) framework and indicators to “Measure Returns on Investment in Health Research” and research in general.
  • To apply the CAHS Framework through a guided case study exercise.

Evaluation competencies to be addressed include:

Dr. Nancy Carter is Director, Evaluation Services for the Nova Scotia Health Research Foundation where she provides guidance and advice to catalyze evaluation initiatives for Nova Scotia’s health system. Nancy holds a Ph.D. in Organizational Behaviour and Human Resource Management from the University of Toronto’s Rotman School of Management. She has facilitated workshops at previous CES conferences and other professional peer reviewed conferences. Nancy currently teaches an Introduction to Evaluation course she developed for a local regional health authority.
Rob Chatwin is Manager, Performance Accountability and Evaluation at the Nova Scotia Health Research Foundation. Rob has over 30 years’ experience in health and human service organizations at the community, regional, provincial and national levels in Alberta, British Columbia and Ontario. Prior to joining the NSHRF, Rob worked for the Public Health Agency of Canada – Community Acquired Infections where his work focused on evaluation, performance measurement, monitoring and planning. During his time in Alberta as Director of Accountability and Planning

Level: Beginner
Prerequisites: Some experience with writing evaluation reports is helpful but not necessary
Scheduled: Sunday June 9th from 1:00pm – 4:00pm

18. Using Qualitative Data Analysis Within and Across Settings

This workshop is in three parts. Part One provides an overview of QDA methods and grounded theory. Part Two is a paper exercise involving quotes, codes and memos. Part Three is an exercise in grounded theory and network mapping. Both exercises use real data on paper. The presentation is verbal and interactive, backed up with PowerPoint slides. You will receive summaries of qualitative methods, software, and internet resources. Manual data analysis methods versus software options will be discussed.
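To make the coding-and-memoing idea concrete, here is a minimal sketch of how coded quotes, codes and memos might be represented and tallied by hand. It uses Python for illustration only, with hypothetical quotes and code names; it is not part of the workshop's paper exercises or of the QDA packages the workshop surveys.

```python
from collections import Counter
from dataclasses import dataclass

@dataclass
class CodedSegment:
    """One quote from a transcript, tagged with analyst codes and an optional memo."""
    respondent: str
    quote: str
    codes: list[str]
    memo: str = ""

# Hypothetical coded data, standing in for real interview transcripts
segments = [
    CodedSegment("R01", "The training gave me confidence to apply for jobs.",
                 ["confidence", "employment"], memo="Possible outcome theme"),
    CodedSegment("R02", "I only attended because my caseworker insisted.",
                 ["external pressure"]),
    CodedSegment("R03", "Practising interviews with peers helped the most.",
                 ["peer support", "confidence"]),
]

# A first manual-analysis step: tally how often each code appears
code_counts = Counter(code for seg in segments for code in seg.codes)
for code, count in code_counts.most_common():
    print(f"{code}: {count}")

# Retrieve every quote carrying a given code, as QDA software does on demand
confidence_quotes = [seg.quote for seg in segments if "confidence" in seg.codes]
print(confidence_quotes)
```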

You will learn:

  • An overview of qualitative data analysis (QDA) methods and grounded theory.
  • How to code and memo data by applying a range of QDA methodologies.
  • How to use a variety of QDA software (e.g. Atlas-ti, QDA Miner and N-Vivo).

     
Evaluation competencies to be addressed include:

Reed Early, CE, has 20 years’ experience in the fields of evaluation and research and has taught university-level courses in quantitative and qualitative evaluation methodology. As representative to CES National Council, past President of the BC CES Chapter, co-Chair of CES 2010 and past Chair of the CES National Professional Development Committee, he maintains an active interest in advancing the field of evaluation. Reed has a particular interest in methodology and software for qualitative data analysis.

Level: Beginner
Prerequisites: None
Scheduled: Sunday June 9th from 1:00pm – 4:00pm

19. An Executive Summary is Not Enough: Effective Reporting for Evaluators

NOTE: This session is no longer offered

As an evaluator you are conscientious about conducting the best evaluation possible, but how much thought do you give to communicating your results effectively? Do you consider your job complete after submitting a lengthy final report? Reporting is an important skill for evaluators who care about seeing their results disseminated widely and their recommendations actually implemented.

This interactive workshop will present an overview of three key principles of effective reporting and engage participants in a discussion of the role of reporting in effective evaluation. Participants will leave with an expanded repertoire of innovative reporting techniques and will have the opportunity to work on a real-life example in groups.

You will learn:

  • The role of communication and reporting in good evaluation practice.
  • Three key principles for communicating results effectively.
  • Four innovative reporting techniques.

20. Smart Data Visualization

Crystal clear charts and graphs are valuable – they save an audience’s mental energies, keep a reader engaged, and make you look smart. You can achieve that level of smart communication using the tools you already own.
In this 3-hour workshop, you will learn the research-based graphic design best practices that inform smart data visualization. We will focus on the fundamentals of good graph design, including how to work from the default settings in Microsoft Excel to design visualizations with impact. You are strongly encouraged to bring printed graphs you are currently using for in-workshop revision and discussion. As a result, you will be able to use the graphing software you already have in smarter ways.
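As a small illustration of the decluttering and direct-labelling ideas described above, here is a minimal sketch using Python's matplotlib (an assumption for illustration; the workshop itself works from Microsoft Excel's defaults, where the equivalent changes are made through the chart formatting options). The data and output file name are hypothetical.

```python
import matplotlib.pyplot as plt

programs = ["Program A", "Program B", "Program C", "Program D"]
completion = [0.82, 0.64, 0.57, 0.41]   # hypothetical completion rates

fig, ax = plt.subplots(figsize=(6, 3))
bars = ax.barh(programs, completion, color="#4878a8")

# Declutter: drop the box, ticks and axis marks that the defaults add
for spine in ax.spines.values():
    spine.set_visible(False)
ax.tick_params(left=False, bottom=False)
ax.set_xticks([])

# Label the data directly instead of sending readers back to an axis
for bar, value in zip(bars, completion):
    ax.text(value + 0.01, bar.get_y() + bar.get_height() / 2,
            f"{value:.0%}", va="center")

ax.set_title("Course completion rate by program (illustrative data)", loc="left")
ax.invert_yaxis()          # keep the list order top-to-bottom
plt.tight_layout()
plt.savefig("completion_rates.png", dpi=150)
```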

You will learn:

  • Visual processing theory and why it is relevant for evaluators.
  • Graphic design best practices based in visual processing theory.
  • To apply graphic design best practices and visual processing theory to enhance data visualizations with simple, immediately implementable steps.

Evaluation competencies to be addressed include:

Dr. Stephanie Evergreen is the founder and past chair of AEA’s Data Visualization and Reporting TIG. She has published on this topic in New Directions for Evaluation and is co-editing an NDE volume on data visualization. Her forthcoming book, Presenting Data Effectively, will be published by Sage in Fall 2014. At Evergreen Evaluation she delivers private workshops and webinars on this and related topics for clients such as Education Development Center and United Way of the Bay Area, and gives guest lectures at Western Michigan University. Stephanie holds a contract with the American Evaluation Association to train its presenters, both in its webinars and via the Potent Presentations Initiative.

Level: Intermediate
Prerequisites: Participants need to be comfortable using Excel to create simple graphs using the default graph options
Scheduled: Sunday June 9th from 1:00pm – 4:00pm

21. Write to be Read

Evaluation reports that are highly accessible and interesting can generate learning among all stakeholders. Clear, readable writing is especially important for readers whose first language is not English. While writing styles learned in academic settings may actually hinder effectiveness, business communications experts have developed simple strategies that researchers can apply to create more inviting and compelling documents.

In this highly interactive workshop, you will practice a number of simple strategies to keep your readers engaged. The session is both fun and valuable, and has been popular at previous national and provincial CES events.

You will learn:

  • How to choose wording that is easy for non-evaluators and ESL speakers to understand.
  • How to structure dynamic sentences and compelling, cohesive paragraphs.
  • This workshop addresses several Competencies (below) for Canadian Evaluators.

Evaluation competencies to be addressed include:

Dr. Christine Frank, CE, is an experienced evaluator and veteran adult educator. As principal of Christine Frank & Associates, she regularly conducts program evaluations and training for a range of organizations. Chris also taught business communications for many years and co-authored a textbook on that subject. A member of the CES Credentialing Board, she holds a PhD in Education and was a core instructor in the postgraduate Research Analyst Program at Georgian College in Barrie from 2000 to 2008.

Level: Intermediate
Prerequisites: Some experience with writing evaluation reports is helpful but not necessary
Scheduled: Sunday June 9th from 1:00pm – 4:00pm

22. International Development and Accounting for Change: Measuring Success, Failure and Finding Ways to Impact Future Choices in Aid.

NOTE: This session is no longer offered

Evaluation across boundaries requires pushing through and beyond comfort zones. Methodologies must consider contextual factors and study what works and what has failed in order to find effective ways to apply lessons of the past to better decisions for tomorrow. The field of international development assistance and global poverty reduction is rife with debate on accountability and effectiveness and is compounded by complexities of history, culture and the ever-shifting geo-political and security landscape.

Two leading Aid organizations in Canada have spearheaded bold and exciting initiatives to shed light on major gaps in international aid: Engineers Without Borders Canada now publishes an Annual Failure Report, and Plan Canada has spearheaded a ground-breaking campaign “Because I am a Girl”. This workshop will highlight the genesis, challenges, methodologies and communication strategies for addressing unpopular topics. Participants of all levels are welcome and will be invited to share their experience and ideas in small group work.

You will learn:

  • Key concepts, principles, information resources associated with evaluation and monitoring in the international development assistance / poverty context.
  • The principles of Results-Based Management.
  • The importance of client-inclusion and strategic communication of results.

23. Processes of Learning in a Developmental Evaluation

Foundational to developmental evaluation (DE) are ‘processes of learning’ that support social innovation and program development. In 21st-century public sector work, complexity, adaptation and innovation are the norm in providing services to the changing realities of society. New programs are typically grounded in an understanding of some critical social needs or problems, the stakeholders and the target population, but have not yet finalized a standardized model of intervention or delivery to address those needs. As plans are designed and actions and activities are implemented, DE becomes a means of assessing and guiding the actions and activities of the program that are contributing to progress towards meeting those social needs.

This workshop will provide an overview of the principles of developmental evaluation and introduce process learning tools that have been used by seasoned evaluators to guide the development, sustainability and effectiveness of the program intervention.

You will learn:

  • How to apply interpersonal and situational practice competencies to attend to unique interests, emergent issues, and complex contextual circumstances.
  • How to decipher and infuse evaluative thinking by applying technical and management practice competencies and embedding them into the work cycle, elements of evaluation design, data collection, analysis, interpretation and reporting.
  • How to become an integral and effective team member by applying Reflective Practice competencies as an intervention to facilitate the process of learning.

Evaluation competencies to be addressed include:

Dr. Wendy Rowe is a professor in the School of Leadership. She teaches courses and conducts workshops in leadership development, change management, action research, evaluative inquiry, performance measurement, team work and facilitation, strategic planning and financial management. She is also an experienced facilitator in program evaluation, working with organizations to enhance their performance and capabilities. She facilitates the Essential Skills Series evaluation course for CES on a regular basis.
Dr. Keiko Kuji-Shikatani facilitates job-embedded professional learning on a daily basis in the public sector, using logic models as a tool for infusing evaluative thinking and guiding developmental evaluation. She has provided professional learning workshops on developmental evaluation to various audiences in Canada and internationally. She also teaches an introduction to program evaluation in the International Educators Training Program at Queen’s University.

Level: Intermediate / Advanced
Prerequisites: Foundational knowledge of evaluation
Scheduled: Sunday June 9th from 1:00pm – 4:00pm

24. Beyond the logic model: Advanced approaches to developing evaluation frameworks

Quality evaluation frameworks are one of the key determining factors leading to successful evaluations. Apart from good logic models and appropriate evaluation issues, quality frameworks will include an appropriate strategy to address the evaluation issues, including issues of effectiveness, efficiency and economy. The best strategies will reflect an in-depth analysis of the available data, including its quality, and adapted methodologies that will allow the evaluators to gather evidence and assess the incremental impacts of programs, including optimal comparison strategies.

The purpose of this session will be to review advanced techniques in assessing data as well as principles and techniques to identify the best evaluation strategies given the specific parameters of the program, purpose of evaluation and available information sources. Finally, the session will cover techniques in costing evaluation options, including evaluations involving qualitative and quantitative methods.

You will learn:

  • How to identify and assess available data, including data quality, for method selection and development purposes.
  • How to select the best evaluation strategies, including the blend of specific methods and comparison strategies, to best address the evaluation issues.
  • How to estimate costs of various options to evaluate programs.

Evaluation competencies to be addressed include:

Dr. Simon Roy, CE, has conducted more than 60 program evaluations since 1995, including several projects involving surveys of regional and national scope. He co-authored the latest CES course on surveys. Simon teaches program evaluation on a part-time basis at the University of Ottawa.
Benoit Gauthier, CE, has contributed to over 100 evaluations since the beginning of his career, and has developed several evaluation frameworks. Mr. Gauthier has taught several courses for l’École nationale d’administration publique. Both have co-authored several articles for the Canadian Journal of Program Evaluation.

Level: Intermediate / Advanced
Prerequisites: Participants should have at least 2 years of evaluation experience and basic knowledge of evaluation frameworks
Scheduled: Sunday June 9th from 1:00pm – 4:00pm
