There are two important reasons for evaluating the success of digital engagement projects:
- To assess what has been achieved.
- To improve future practice.
Effective evaluation is not something that can be tacked on to the end of an engagement process. It needs to be considered from the start, begin as early as possible and continue throughout the process. Three categories of metric are likely to be useful:
- Those which relate to being aware – a measure of the number of people who have visited the dialogue;
- Those which relate to being informed – a measure of the visitors who have clicked to access further information resources, to learn more;
- Those which relate to being engaged – a measure of the number of people who have given feedback using any of the means available.
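These three measures form a funnel, and the conversion rate between stages is often more informative than any raw count. A minimal sketch, using entirely hypothetical figures:

```python
# Hypothetical counts for one online dialogue (illustrative numbers only)
aware = 1200        # people who visited the dialogue
informed = 300      # visitors who clicked through to further information
engaged = 90        # people who gave feedback by any available means

# Conversion rates down the funnel
print(f"informed rate: {informed / aware:.0%}")    # prints "informed rate: 25%"
print(f"engaged rate:  {engaged / informed:.0%}")  # prints "engaged rate: 30%"
```

Tracking these rates over time shows whether a dialogue is merely attracting visitors or actually converting them into participants.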
A number of techniques can be used to collect these metrics. For example:
- Attitudinal, behavioural and demographic data (managers and users), to see the different types of people who were involved.
- Process observation, to see how people participated and interacted or joined and left.
- Content analysis, to see the outputs of people’s participation.
- Site analytics (e.g. Google Analytics, counters, referrers), to see how many people participated, where they came from, and how long they stayed.
- Pre- and post-activity surveys or interviews, to see people’s experiences of participation and the effect it had on them.
- Search engine rankings / search volumes, to see how easily people can find out about the participation opportunities.
The following facets can be measured to determine the level of success:
- Extent and manner of use (effectiveness).
- Range of users (representation).
- User and stakeholder satisfaction (quality, what changed?).
- Input costs relative to outputs.
- Level of stakeholder support (barriers to continuity).
- User and stakeholder perception about design (process).
- Repeat visits and ‘up-stepping’ of citizens in the engagement process.
- Who was/wasn’t involved (public/stakeholder groups) and why/why not.
- Overspill in terms of increased participation on other channels.
and the following democratic criteria:
- Representation – who did and did not participate?
- Political equality – were any groups excluded from participating?
- Engagement – what was the quality and quantity of participants’ involvement?
- Exposure – to what degree was the process publicised?
- Transparency – how open was the process?
- Conflict and consensus – did the process cause participants’ opinions to diverge or converge?
- Community control – did participants have or take ownership of the process?
The following measures can be used to quantify these metrics:
- % Change in number of Facebook fans or Twitter followers;
- % Change in website or blog content views / downloads;
- % Change in Vimeo or YouTube subscriptions;
- % Change in blog RSS subscribers;
- % Change in website or blog returning visitors;
- % Change in Facebook post interactions (Facebook comments + likes divided by total number of impressions);
- % Change in number of blog comments written;
- % Change in number of Twitter mentions;
- % Change in ratio of the organisation’s Facebook posts to user comments/replies;
- % Change in Facebook ‘unlikes’ and Twitter unfollowers;
- % Change in the number of positive Facebook posts in the last 100 posts;
- % Change in the number of positive blog comments in the last 100;
- % Change in the number of positive Twitter mentions in the last 100;
- % Change in the number of likes/shares etc. from embedded social media accounts;
- Change in % of web traffic coming from social media sources;
- % Change in Twitter retweets of posts;
- Top retweets;
- % Change in YouTube/Vimeo content views generated by shared or embedded content;
- % Change in blog and web content trackbacks/pingbacks from content that has been linked or referenced.
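Most of these indicators reduce to a percentage change between two reporting periods, plus the interaction-rate formula noted in the list above. A minimal sketch, with all figures hypothetical:

```python
def percent_change(previous, current):
    """Percentage change in a metric between two reporting periods."""
    if previous == 0:
        raise ValueError("previous period value must be non-zero")
    return (current - previous) / previous * 100

def interaction_rate(comments, likes, impressions):
    """Facebook post interactions: (comments + likes) / total impressions."""
    return (comments + likes) / impressions

# e.g. Twitter followers grew from 400 to 500 month-on-month
print(percent_change(400, 500))       # prints 25.0
print(interaction_rate(5, 45, 1000))  # prints 0.05
```

Recording the same measures at regular intervals makes the percentage changes comparable across channels.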
Alternative: Effective Number of Issues Index
The Effective Number of Issues (ENI) adapts a technique that ecologists use to measure the biodiversity of a community of plants or animals. If we think of the information flow as giving rise to a community of issues, we can use the ENI to calculate the diversity of this community and thus the efficiency of the process.
To calculate the ENI you need a list of all the different issues that are used in, or emerge from, the consultation process, and a count of the number of times each one appears. You also need the ENI formula: ENI = exp(H′), where H′ = −∑ pᵢ ln(pᵢ). Here pᵢ is the relative frequency of issue i and ln is the natural logarithm.
More information can be found here: https://www.discuto.io/en/info-page/eni-calculation
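As a sketch, the ENI can be computed directly from a tally of issue mentions (the issue names below are illustrative):

```python
import math
from collections import Counter

def effective_number_of_issues(issue_mentions):
    """ENI = exp(H'), where H' = -sum(p * ln p) is the Shannon
    entropy of the relative frequencies of the issues."""
    counts = Counter(issue_mentions)
    total = sum(counts.values())
    h = -sum((n / total) * math.log(n / total) for n in counts.values())
    return math.exp(h)

# Four issues raised equally often: the effective number is 4
mentions = ["housing", "transport", "parks", "schools"] * 5
print(effective_number_of_issues(mentions))
```

When all issues appear equally often the ENI equals the raw number of issues; the more the discussion is dominated by a few issues, the lower the ENI falls towards 1.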
Online Forum Goals
In order to monitor progress, it is often necessary to create goals and keep track of accomplishments over time. The following milestones describe typical developments, ordered by time from launch, in the first year of an online forum, but they can be adapted to most online methods.
– Forum is still active.
– Some regular traffic.
– Experiencing some membership growth.
– Policy officials are aware of forum/maybe reading posts.
– Some community organisations have begun to post announcements in forum.
– 25 – 50 percent growth in subscriptions since launch.
– Local media is paying attention to discussions.
– 10 or more “regular” posters (post at least once per week).
– Participants are starting new discussions.
– Steering committee communications and meetings regularly attract a diverse group of community members.
– Officials are participating – most lurk, but some post.
– 50 – 100 percent growth in subscriptions since launch.
– Occasional story in local media that originates from forum.
– Some examples of citizen or government action that have resulted from forum discussions.
– You have hosted at least one in-person gathering or party for participants to meet one another.