Ten actions to take from our latest web marketing campaign

Email

Following the finding that most of our visitors arrive through email or direct publicity, we have been looking more closely at how recipients use our “New at IFPRI” email. Open rates and click-through rates give us an idea of the interest in the different materials. We found higher click-through rates on images than on text links, and more interest in shorter materials.

Action: increase our focus on the email campaign, analyzing and adjusting formats.
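The two rates mentioned above are simple ratios. A minimal sketch, using made-up campaign numbers rather than our actual figures:

```python
# Hypothetical campaign figures -- illustrative only, not actual IFPRI data.
delivered = 5000   # emails successfully delivered
opened = 1400      # unique opens
clicked = 350      # unique click-throughs

open_rate = opened / delivered            # share of recipients who opened
click_through_rate = clicked / delivered  # share who clicked a link
click_to_open_rate = clicked / opened     # clicks among those who opened

print(f"Open rate:          {open_rate:.1%}")           # 28.0%
print(f"Click-through rate: {click_through_rate:.1%}")  # 7.0%
print(f"Click-to-open rate: {click_to_open_rate:.1%}")  # 25.0%
```

Tracking click-to-open alongside the raw click-through rate helps separate "nobody opened it" problems from "they opened it but nothing interested them" problems.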

Social Media

We were surprised by how quickly social media is building an audience for our materials. Some products clearly suit the audience better than others: the video was widely retweeted on Twitter, while the book itself was less so. It was also critical to make the best use of tags to attract new audiences, to follow more people from the IFPRI Twitter account to build followers, and to attract retweets from larger or key audience accounts.

Action: include these lessons in the social media guidance currently under development.

Video/Multimedia materials

The success of the video in attracting more users and raising awareness shows the importance of considering multimedia products. We are increasingly developing presentations to explain new findings, products, or services. Key to this is hosting these materials where users are already looking for them; we therefore make extensive use of YouTube and SlideShare.

Action: develop explanatory materials as presentations or interactive products.

The Website

With everyone emphasizing the importance of Web 2.0 and social media tools for web communication, it was interesting that our results underscored the value of the website itself in bringing an audience to IFPRI products. We have learned from the keywords used to reach the site that users focus on their topical interests rather than on the organizational structure of the site.

Action: We have developed more topic pages on the website (our work in focus) and a series of options for users to subscribe to content by topic (RSS).

Facebook and LinkedIn

Analysis of visitors to Millions Fed showed the importance of Facebook and LinkedIn for attracting a targeted audience.

Action: Continued development of LinkedIn to attract alumni of IFPRI and development of Facebook to capture a younger audience.

Quoting reach rather than just numbers of visitors

In the course of the analysis, we discovered the value of quoting our visits as a proportion of a country's overall internet population. We would like to develop this idea further and compare our figures with those of other organizations.

Action: Compare statistical analysis of IFPRI reach with other development organizations working in agriculture and food policy research.
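The reach figure described above is simply visits divided by a country's internet population, which can reorder countries dramatically compared with raw visit counts. A small sketch with invented numbers (not our actual analytics data or real population figures):

```python
# Hypothetical figures -- visit counts and internet-user populations
# below are invented for illustration, not actual data.
visits = {"Ethiopia": 4_200, "India": 95_000, "United States": 210_000}
internet_users = {"Ethiopia": 450_000, "India": 81_000_000, "United States": 231_000_000}

# Reach: share of each country's online population that visited the site.
reach = {country: visits[country] / internet_users[country] for country in visits}

for country, share in sorted(reach.items(), key=lambda kv: kv[1], reverse=True):
    print(f"{country}: {share:.4%} of the online population")
```

With these toy numbers the United States sends the most raw visits, but Ethiopia has by far the highest reach, which is exactly the kind of reversal that makes the proportional measure worth reporting.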

Dialup and low bandwidth

We found that dialup connections are still used to access our site, but only from Germany, India, the US, and Australia. We will continue to ensure fast load times and caching of our materials.

Action: We are looking to provide more guidance to low-bandwidth users, promoting email delivery rather than a very low-bandwidth version of the site.

Access by mobile phone

We found that very few people view the site on a mobile phone, but we are investigating whether this is because we don't offer a mobile interface.

Action: As above, we would prefer to promote the use of feeds and email for accessing our content on mobile phones.

Measuring success

By our own standards we were very successful in raising awareness of the product, and the strategy of using more social media and Web 2.0 tools to get the message out clearly worked. However, in terms of readership of the final product, other web-based publications produced during the year were more widely read.

Advertisements

Measuring impact on the web

Like many development research organizations, IFPRI staff often debate how best to measure the impact of our work. Unlike some of our partner CG Centers, IFPRI focuses on policy-level interventions, which are rather different from producing new crop varieties to increase agricultural yields, household incomes, and so on. Thus, it could be claimed (and in fact has been claimed by some) that IFPRI is primarily in the business of producing “intellectual varieties.” Defining what we produce, however, is quite different from knowing how to measure impact, which underscores the importance of choosing which indicators are worth paying attention to. Below I list a few potential indicators for measuring impact on the web, along with my own impressions of their overall value and utility.

  • Page views/visitors: Many webmasters and other managers of web statistics reports have questioned the value of these figures for years, yet they continue to be reported year after year. Criticisms include the lack of proper contextualization (from what constitutes a ‘hit’ or ‘visitor session’ to how we compare our statistics with others’) as well as the lack of any clear correlation between these easily measured indicators and others designed to measure impact.
  • Downloads: It seems to follow that the more a publication is downloaded, the more people are reading it, thereby expanding its sphere of influence. In practice, however, the same person might download a publication multiple times, and even more important than the ‘how many’ question is the ‘by whom’ question. As with page views/visitor sessions, Google Analytics’ mapping features can be highly useful for obtaining this type of data and for making sure that research outputs reach audiences in the countries they are intended to benefit.
  • Citations: Once again, context is everything. Many research organizations report only on their peer-reviewed publications, and it follows that they are interested only in citations in peer-reviewed journals, books, etc. Several online databases offer citation tracking, including Google Scholar, ISI Web of Science, CrossRef’s Forward Linking program, and CiteSeer, among others. One might be tempted to add these citation figures together to obtain a clearer picture of overall impact, but because several of these services count the same citations, the sum would double-count. Benchmarking your organization’s citations vis-à-vis your counterparts would probably prove more effective in monitoring the impact of your research publications.
  • Mentions in the media/blogs: Many research organizations have monitored this for years, but services like RSS feeds and automated Google News alerts have made it much easier than in the past. The same applies to monitoring blogs via advanced search operators in Google and Google Blog Search, as well as Technorati.
  • RSS feeds: Once again, many question the value of measuring ‘hits’ as an indicator of impact. Instead, they argue, we would be much better served by paying attention to the number of feed subscribers, though this figure can also fluctuate heavily over time. For our purposes, FeedBurner fits the bill rather nicely, even when sticking only to its free, basic service.
  • Search engine rankings: Obviously, there is one search engine that need not be named (at least, not for the umpteenth time in this post!) that most organizations pay attention to above all others, yet many fail to take into account differences across its country-specific platforms. My own advice is to track your ranking in the search results for the keywords your organization is targeting, and to look into purchasing AdWords for your institution via the Google Grants program (free for non-profits).
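The double-counting problem raised under ‘Citations’ above can be made concrete with a toy sketch. The services and identifiers below are invented stand-ins for the citing-document lists a tracking service might return: naively summing the per-service counts overstates the total, while taking the union of citing-document identifiers does not.

```python
# Invented citing-paper identifiers (DOI-style strings) as reported by
# three hypothetical citation-tracking services -- not real data.
service_a = {"10.1/a", "10.1/b", "10.1/c", "10.1/d"}
service_b = {"10.1/b", "10.1/c", "10.1/e"}
service_c = {"10.1/a", "10.1/e", "10.1/f"}

# Naive approach: add the counts from each service together.
naive_total = len(service_a) + len(service_b) + len(service_c)

# Deduplicated approach: union of identifiers counts each citation once.
unique_citations = service_a | service_b | service_c

print(f"Naive sum:        {naive_total}")            # 10
print(f"Unique citations: {len(unique_citations)}")  # 6
```

In practice matching records across services is messier than this (identifiers are missing or inconsistent), which is part of why benchmarking against comparable organizations within a single service is often the more reliable comparison.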