Systems in Evaluation TIG

September 2016 Newsletter


Welcome!

If “systems” makes your pulse race, please read on to learn more about how you can take part in the SETIG elections, SETIG action teams, and AEA 2016 in Atlanta.

Systems in Evaluation TIG Elections:  Voting begins today!  The talented Ginger Fitzhugh is stepping down after three years on the leadership team.  The new Program Co-Chair will work alongside Brandon Coffee-Borden to ensure that AEA 2017 showcases the very best member contributions.  We are also expanding our leadership team to include a Co-Chair, who will work with Heather Britt to oversee the planning and development of TIG activities. Please vote!

(Un)Conference Action Team:  In June, the TIG hosted another successful international virtual gathering of those eager to learn and share experience about “systems” in evaluation practice.  To accommodate truly global participation, one session was held at a time convenient for the Americas, Africa, and Europe, and a second at a time convenient for Asia, Australia, and New Zealand. Many thanks to our dedicated and creative Action Team!  Read more below about this year’s event and how to get involved!

AEA 2016:  Evaluation 2016 will be packed with SETIG-related workshops, panel presentations, think tanks, and posters. We hope to see you in these sessions and at the SETIG business meeting on Thursday, 6:20–7:10 PM.  Let’s continue the conversation at a social immediately following the business meeting! This year we will be co-hosting the social with the Non-Profits and Foundations TIG. Stay tuned for details, to be shared in mid-October, about the venue for the social and how to RSVP!

Manifesto Action Team:  Last year the TIG polled its members and colleagues about the systems/complexity concepts, approaches, and tools they used in their evaluation work. This is an important step toward identifying the core elements and principles of systems practice in evaluation. Read about the preliminary findings below.

The SETIG Leadership Team,

Heather Britt, Chair (email Heather)

Ginger Fitzhugh, Program Co-Chair (email Ginger)

Brandon Coffee-Borden, Program Co-Chair (email Brandon)



What Do We Mean by Systems in Evaluation? A Preliminary Analysis of TIG Poll Results

In October 2015, the leadership of the American Evaluation Association’s Systems in Evaluation Topical Interest Group (TIG) polled its members on what systems and complexity concepts, tools, methods, and approaches they were using in their evaluation work.  TIG members were also encouraged to forward the poll to colleagues outside of the TIG. Of the 120 people who responded to the online survey, over half (54 percent) were TIG members, almost one in five (19 percent) were AEA members but not part of the TIG, and the rest (27 percent) were not AEA members. Most respondents (81 percent) resided in North America; the rest were from other regions.  Just over half (53 percent) had ten years or less of evaluation experience, a third (32 percent) had 11 to 20 years of experience, and the remaining 15 percent reported more than 20 years of evaluation experience.

Asked what systems and complexity concepts, tools, methods, and approaches they were using in their evaluation work, respondents gave answers that fell into three broad categories:

  1. Systems Thinking in Evaluation: Systems/Complexity Concepts
  2. Systems Thinking in Evaluation: Tools, Methods, and Approaches
  3. Evaluation Tools, Methods, Approaches Not Specific to Systems

Some evaluators are incorporating systems and complexity concepts into their evaluation work; respondents reported 82 uses of such concepts.  Favorites included the Cynefin framework (10 mentions) and the system attributes of interrelationships, perspectives, and boundaries (9 mentions).  Such systems concepts can be incorporated into both systems-specific and non-systems-specific methods.

Systems and Complexity Concepts Used by Respondents | Number of Reports
Random, simple, complicated, complex dynamics (Cynefin) | 10
Boundaries, perspectives, and interrelationships (Williams) | 9
Emergence | 6
Complex adaptive systems theory | 5
Theorists: Prigogine, Ackoff, Per Bak, S. Kaufman (1 each) | 4
Simple rules | 4
HSD CDE theory | 4
Interdependence and synergy | 4
Nested socio-ecological levels, cross-scale interactions | 4
Wicked problem theory | 3
Donella Meadows systems leverage, leverage points | 3
Brenda Zimmerman’s matrix | 3
Path dependence, critical events, and ripple effect | 3
Unpredictability, uncertainty | 3
Feedback loops | 3
Systems change theory | 2
Nonlinearity | 2
Tipping points | 2
Time lags, delayed responses, spatial/temporal mismatches | 2
Systemic change: not same as changing an organizational system | 2
Viable systems model | 1
Open systems | 1
Co-evolution | 1
Endogeneity (system structure affects system behavior) | 1

Many evaluators are using tools, methods, and approaches drawn directly from the systems field, or evaluation approaches based on systems concepts, with the systems and complexity concepts infused or “baked into” the design.  Respondents made 78 mentions of such tools, methods, and approaches, with developmental evaluation receiving the most mentions (13), followed by outcome mapping (8) and systems dynamics modeling (8).

Systems Tools, Methods, and Approaches Used by Respondents | Number of Reports
Developmental evaluation (Patton) | 13
Systems dynamics modeling (7), stock and flow diagrams (1) | 8
Outcome mapping (7), systems analysis (1) | 8
Social network analysis | 6
Adaptation/adaptive action (3), adaptive management (2) | 5
Systems thinking, lens, awareness (4), zooming in/out (1) | 5
Complexity-informed theories of change | 3
Multiple, interconnected causal pathways | 3
Systems design thinking | 3
Systems diagrams – rich pictures | 3
Visualization of systems concepts (1), iceberg diagram (2) | 3
Systems conceptual frameworks | 2
Matching evaluation methods to system dynamics, pace of change | 2
Soft systems methodology | 2
Contribution analysis to systems-level outcomes | 2
Monitoring changes in systemic conditions | 2
Systemic drivers/levers of change | 1
Mental Models (Senge) | 1
CHAT | 1
ABLe (Foster-Fishman and Watson) | 1
Systems Evaluation Protocol (SEP) (Cornell) | 1
Systems Change Evaluation Planning (Hargreaves) | 1
Systemic education reform evaluation | 1
USAID Learning Lab system evaluation tools | 1

Perhaps most striking, the data revealed 39 reports of evaluation tools, methods, and approaches that may be compatible with, but are not specific to, systems and complexity theory.  When asked about their application of systems concepts and methods, evaluators named 27 such compatible but non-systems-specific tools, methods, and approaches drawn from the evaluation field in general. Notably, no one listed less compatible methods, such as randomized controlled trials or linear regression.

Non-Systems Tools, Methods, and Approaches Used by Respondents | Number of Reports
Collective Impact | 4
Appreciative inquiry | 3
Transformational leadership | 3
Collaborative interpretation of sense-making | 2
Participatory evaluation | 2
Practice principles (not best practices) | 2
Art of hosting (1), world cafes (1) | 2
Rapid cycle evaluation (1), what/so what/now what (1) | 2
Advocacy evaluation | 1
Balanced scorecard | 1
Comparative case studies | 1
Delphi method | 1
Double-loop learning | 1
Environmental scans | 1
In-depth interviews | 1
Integrative propositional analysis | 1
Performance stories | 1
Power inequities | 1
Most significant change | 1
Probes | 1
Process evaluation | 1
Qualitative comparative analysis | 1
Spotting patterns | 1
Strategic Learning | 1
Strategy map of program recommendations | 1
Theory-driven evaluation | 1
Triangulation of data | 1

What does this mean for the systems in evaluation field?  Respondents reported incorporating a broad range of systems and complexity concepts, tools, methods, and approaches into their evaluation work.  If so, evidence of such use needs to be documented and shared, building up the field’s body of work, demonstrating its value, and modeling its use.

Another important finding is that some evaluators are using more generic evaluation tools, methods, and approaches and are calling them “systems evaluation” without reference to specific systems and complexity concepts.  We can draw two potential conclusions from this finding.

  • The boundary around systems concepts and approaches relevant to evaluation — what’s in and what’s out — is unclear to poll respondents.
  • Poll respondents may have a limited understanding of how to effectively modify generic techniques to incorporate systems and complexity concepts.

This trend threatens to dilute the utility that systems concepts and approaches bring to evaluation.  Similarly, the evaluation field’s contribution to discourse among those working in systems would also suffer.  It would be foolish to generalize definitively from this poll to the field, as the survey is not a representative sample of systems evaluators. However, it would be equally foolish to disregard the findings and their implications for the Systems in Evaluation TIG.

The results of this poll should motivate us to: (1) develop a set of systems in evaluation principles, or “manifesto,” identifying the defining characteristics of the use of systems and complexity theory in evaluation; (2) use these principles to determine what fits in, or needs to be added to, the TIG’s conference program; and (3) provide more training and technical assistance to evaluators on how to integrate systems and complexity concepts more effectively into generic evaluation tools and methods.

Please help the Manifesto Action Team develop this set of principles by helping to draft them or serving as a reviewer. At a minimum, please plan to come to the TIG’s business meeting on October 26 to discuss them! For further information, contact Meg Hargreaves (mhargreaves@communityscience.com) or email the TIG (systemsinevaluationtig@gmail.com).


Systems in Evaluation TIG Elections

The Systems in Evaluation TIG is holding elections for the positions of Co-Chair and Program Co-Chair, each for a term of three years.

  • The TIG Co-Chair will assist the TIG Chair in convening the TIG business meeting at the annual conference. This meeting will be scheduled in consultation with the TIG Program Chair as part of the conference program.  The TIG Co-Chair is also responsible for overseeing the planning and development of any TIG activities outside of the annual meeting.
  • The TIG Program Co-Chair is responsible for working with the Lead Program Co-Chair to manage the submission and review process each spring and for working in consultation with the TIG leadership team to plan the TIG’s business meeting during the annual conference.

Voting is open now and will close at 11:00 pm Eastern US Time on Wednesday, October 12.

Results will be announced at our business meeting during Evaluation 2016 in Atlanta and on our website. New co-chairs take office on January 1, 2017.

You must be a member of the AEA Systems in Evaluation TIG to vote. If you are a member but did not receive an email from the TIG, check your member profile on the AEA website to make sure that the Systems in Evaluation TIG is still listed in your TIG memberships. If not, you can easily update your TIG member status by checking the box provided.



SETIG (Un)Conference Action Team

The SETIG organized its second annual virtual unconference in June, connecting people worldwide in discussions around systems in evaluation. Due to popular demand, this year’s event was held twice to better accommodate people in different time zones: one event targeted the Americas/Caribbean/Europe/Africa and another was organized for Australia/New Zealand/Asia/Pacific. Sixty-nine people attended the Americas event and 29 attended the first-ever event Down Under. Participants covered the globe from New Zealand to Nigeria, with additional attendees from Canada, Ireland, Australia, and the USA.

An unconference is a loosely structured meeting that emphasizes the informal, emergent exchange of ideas between participants rather than following a conventionally structured program of speakers and presenters. The SETIG (Un)Conference was held virtually. Attendees chose from a large array of sessions on topics related to systems thinking and evaluation approaches. Session topics included:

  • Designing a systems-based evaluation: Challenges and strategies
  • Support for a systems approach in systems evaluation: Strategies for promoting buy-in
  • Distinguishing systems evaluation: Where are the boundaries?
  • The cultural fit of systems thinking in Indigenous evaluation contexts
  • A systemic approach to evaluating Whānau Ora in a Pasifika context
  • Designing useful systems visualizations: Tools & strategies
  • How might we think systemically when tackling value for money in evaluation?
  • Emergence and attractors in complex systems: How these concepts inform our evaluation methods

Feedback from participants was positive (particularly from those in the Down Under session), and most said they would attend another (Un)Conference. Participants offered suggestions for improving the unconference format, which the planning committee will take forward next year.

We are currently compiling the proceedings from the 2016 (Un)Conference and will post them on the SETIG website. In the meantime, check out the proceedings from our first-ever (Un)Conference, held in October 2015.

A big thanks to our 2016 team:

  • Planning Team Leads: Bethany Laursen (Americas) and Judy Oakden (Down Under)
  • Registration & Website: Jan Noga
  • Unhangout Platform: Bethany Laursen
  • Promotion: Jenny Lawlor
  • Facilitator Coordination: Jen Nunez
  • Evaluation: Kylie Hutchinson and Alana Robilliard
  • SETIG Liaison: Ginger Fitzhugh

Planning for (Un)Conference 2017!

Get involved! We’re looking for volunteers to help organize next year’s (Un)Conference. If you’re interested, contact the Systems in Evaluation leadership team at systemsinevaluationtig@gmail.com.

Are you interested in running an unconference or using the Unhangout platform? You can read more about what we learned about using Unhangout here.


SETIG Website and Social Media Action Team

The SETIG Website and Social Media Action Team maintains a web-based infrastructure to tell the TIG’s story, share the work of the TIG, provide resources and information to members, and support interaction and connections among members before, during, and between annual meetings.

During the next several months, the team hopes to expand the functionality of the TIG’s website to provide greater opportunity for members to share information and resources and connect with each other. The team is also interested in exploring the use of social media to further develop our community of practice.

We are looking for volunteers! Please contact the SETIG Webmaster if you can:

  • Create content about SETIG activities and resources for website and/or social media
  • Work within the WordPress platform to maintain the website
  • Provide technical support for the website
  • Curate social media spaces such as LinkedIn, Facebook, or Twitter

Contact SETIG Webmaster, Jan Noga, at Jan.Noga@pathfinderevaluation.com or go to the SETIG’s website and use the Contact Us page.
