Often, science communication practitioners in the thick of designing and implementing plans, writing, making videos (and more) don’t really have time to research their field – what works, what doesn’t, and why.

Given the nature of the industry, and the frequent dearth of time and funding, science communicators are keen to have a solid evidence base for what they do.

That’s why, this month, we muse on what kinds of questions science communication practitioners want answered by people researching science communication.

And welcome to 2018; may it be an exciting and productive one for you all. Do let us know what you think about this issue (or tweet us below) or about anything you’d love to read about.

Regards from the @EconnectTeam:
Jenni Metcalfe, Sarah Cole, Jane Ilsley, Toss Gascoigne, Claire Heath & Madeleine Stirrat.

Just give me the damn hammer!

By Jenni Metcalfe

I had the privilege last November of participating in three days of discussions on the theory and practice of science communication with 21 others from around the world, at the Rockefeller Centre at Bellagio on beautiful Lake Como.

Many of our thoughts revolved around how to build the links between practitioners and scholars: How can we best learn from each other and, in the process, improve the way we communicate about science?

A scholar bemoaned the fact that practitioners want black-and-white solutions; they want to be given a hammer and nails and to know that, if they connect these in the right way, it will all work out!

And after 28 years of being a science communicator (with some research thrown in), I thought, “yes; make it easy – just give me the damn hammer and tell me it will work”.

What a relief that would be: off-the-shelf science communication tools with a money-back guarantee that they will work.

But, of course, I know that is simplistic and that there will always be shades of grey. I also know that one solution does not fit all needs.

An important outcome of the Bellagio meeting was that we all agreed to find ways for science communicators and scholars to collaborate better, for mutual benefit.

Here are some questions that I think would be great to explore in collaboration with scholars and other practitioners:

  1. How do we work with scientists, research managers and policymakers to critically review and reshape the way we currently do science communication?
  2. What is the role of professional science communicators in helping people to access scientific information for individual and policy decision-making?
  3. How do we reconcile the conflicting roles that a science communicator sometimes faces between being a science advocate versus providing an objective, and sometimes critical, review of science?
  4. How can we use social and online media to engage more people in critical review and discussion of new and proposed technologies without polarising debates?
  5. What motivates scientists and non-scientists to engage with each other on an equal basis to jointly solve problems and create new knowledge?

[Image: Rockefeller Science Communication Conference attendees, six of them silhouetted, seated by a window at the workshop; their copyright]

Researchers: we want to know our audiences

By Toss Gascoigne

What research do I want to inform my science communication practice?

It comes down to two things: what are the most effective ways to achieve the aim of the event, and how do I know whether it has worked?

For science communication practitioners running events, releasing information about new research or trying to encourage behaviour change, information about the audience is vital. What does the audience already know?

What is going to motivate them to change (e.g. to a healthier lifestyle), and what will cause them to switch off?

The most useful document to me in running the annual ‘Science meets Parliament’ events was a survey on how Parliamentarians prefer to be approached by interest groups.

The 2006 Survey of Politicians’ Lobbying Preferences (PDF, 1.9 MB) was published by Client Solutions, a Canberra-based company with the tagline ‘Research-based Public Affairs’. They surveyed all 227 Australian MPs to discover the best way to raise ideas with them.

‘Science meets Parliament’ brought 150 scientists to Canberra for individual meetings with politicians. If we were to be effective, we needed to know what Parliamentarians thought, what they wanted and how they wanted it delivered.

The report contains all sorts of information: the best time of day to organise a meeting, how many people should be in the group, how long the meeting should last. It lists the top five lobbying mistakes identified by MPs, including wasting time on insignificant issues, misstating the facts, and raising problems without having any solution in mind.

Parliamentarians say interest groups are wasting their time trying to influence events with campaigns based on letter-writing, email and advertising.

Information about the audience is the most helpful thing researchers can provide, closely followed by methods of effective evaluation.

As the American businessman John Wanamaker is reputed to have said 100 years ago: “Half the money I spend on advertising is wasted; the trouble is I don’t know which half.”

I suspect it’s the same with science communication activities…

[Image: the audience at a 2012 Science meets Parliament event, eight people including a woman looking at a laptop; copyright CSIRO]

Evaluation: expensive, undervalued, ethereal?

By Sarah Cole

Evaluation plans can seem to echo our science communication plans – lofty. Yet they are often given even lower priority than the communication plan, despite being an essential part of one.

Funding for implementation, time to collect data, expertise to develop valid measures, sharing results, using the results to improve… These challenges to evaluation are stark when science communication activities are themselves scrambling for resources.

The comparatively ethereal objectives of science communication contribute to this issue – activities that aim to:

  • give enjoyment
  • inspire creativity
  • inspire people
  • change attitudes
  • change values
  • give knowledge
  • increase understanding.

… all of which are perfectly worthwhile (if vague), yet harder to develop efficient measures for.

Eric Jensen, in a [still] refreshing and critical editorial (2014), decries common evaluation in science communication: “poor-quality evaluation”, he says, “has been feeding questionable data and conclusions into the science communication system for years”.

He sees problems with:

  • basic errors and poor practice in survey design, sampling and analysis
  • “fragile” evidence for learning
  • ‘routine neglect’ of key evaluation indicators (e.g. non-visitors to events)
  • evaluation’s “failure to live up to research standards”.

Many evaluations still do not seem to address the criticisms above, despite the availability of numerous resources on how to evaluate, such as Table 2 within this paper, deep triangulation methods, or databases such as this.

Evaluation challenges are also set against more contentious questions within our societies, such as:

  • Are science policymakers and institutions setting appropriate goals for science communication?
  • What kinds of science communication-related outcomes are valued and why?
  • Whose interests are served by an emphasis on outcomes such as pro-science attitudes, versus more open, democratic ideas such as equipping scientific citizens?

So, I believe it would be useful for practitioners to know:

  • Which methods are best/most effective for going deeper than a ‘bean count’/’show of hands’ at an event (if we’re choosing just one or two)?
  • Are the statistical errors in evaluation something specific to practitioners and science communication, or the standard statistical errors that researchers in any field fall prey to? Which techniques best redress those routine errors?
  • Are there examples of non-country-specific evaluation ‘collation’ projects which outlast short funding periods? How are they maintained?
  • What can we learn from art and the performing arts about evaluating our work, and about having qualitative measures taken seriously as evidence?

The ‘hammer’ conversation

We appreciated the nuance in Peter Broks’ report of the Rockefeller Science Communication Conference’s discussion of the theories and disciplines of science communication. It delves into why so few communicators have the chance to reflect.

[Image: a hammer with a red handle; Public Domain Pictures]