Electronic materials usage statistics – Deakin University workflow and issues, September 2012

Current workflow

·  Usage statistics are collected for all electronic resources, wherever possible.

·  Usage data spreadsheets are saved in the library fileshare and published to the library intranet as completed.

·  We aim to update usage data for all resources twice a year (mid-year and end of year) and to update the cost-per-download spreadsheet once a year. Additional updates are provided if a resource is being considered for renewal or cancellation.

·  Where possible, regular alerts are set up so that vendors notify us when usage data is available.

·  Links to usage statistics, together with usernames and passwords, are maintained in the ERM resource records.

·  Twice a year staff harvest the usage statistics manually from vendor websites.

·  COUNTER reports are obtained wherever possible – JR1 for journals, BR1 for books, DR3 for databases. Where no COUNTER reports are available (e.g. HeinOnline, IBISWorld), we use whatever report is available that counts the number of times the resource has been accessed.

·  Statistics are downloaded for the time period required.

·  Statistics are then added manually to a spreadsheet that records the stats for the product over several years and provides a graph showing comparisons over recent years. The same spreadsheet is used for all resources to allow easy comparisons (see example on next page).
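The spreadsheet calculations described above – a yearly YTD total plus a year-on-year percentage change – can be sketched in a few lines. This is an illustration only: the CSV layout is assumed (real vendor exports vary), and the sample figures are borrowed from the 2010 and 2011 rows of the journal package example in this document.

```python
import csv
import io

# Hypothetical monthly download counts per year, laid out as they might appear
# in the shared spreadsheet (columns: Year, Jan..Dec). Real vendor report
# layouts vary and usually need manual tidying first, as the text notes.
SAMPLE = """Year,Jan,Feb,Mar,Apr,May,Jun,Jul,Aug,Sep,Oct,Nov,Dec
2010,523,617,854,1602,1231,692,750,1270,862,745,656,599
2011,600,627,1258,1741,1770,828,952,1740,1350,777,755,538
"""

def ytd_totals(csv_text):
    """Return {year: total downloads} from a Year,Jan..Dec table."""
    totals = {}
    for row in csv.DictReader(io.StringIO(csv_text)):
        year = int(row["Year"])
        totals[year] = sum(int(row[m]) for m in row if m != "Year")
    return totals

def percent_change(totals):
    """Year-on-year % change, as shown in the comparison spreadsheet."""
    years = sorted(totals)
    return {y: round(100 * (totals[y] - totals[p]) / totals[p])
            for p, y in zip(years, years[1:])}

totals = ytd_totals(SAMPLE)
print(totals)                   # {2010: 10401, 2011: 12936}
print(percent_change(totals))   # {2011: 24}
```

The 24% figure matches the 2011 row of the example table, which suggests the spreadsheet uses the same simple previous-year comparison.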

Current issues with usage statistics

·  The process is time consuming, whether it is done entirely in-house or partly by an external provider. Responsibility for it forms part of the following staff members' duties twice a year:

o  A HEW 4 in the serials access team obtains data from vendors that is not received automatically.

o  A HEW 4 frontline staff member collates some of the data.

o  A HEW 4 frontline officer publishes the data on the intranet.

o  A HEW 7 client services staff member coordinates the process, follows up on gaps, analyses the data and provides regular reports.

·  Deakin trialled ScholarlyStats for 12 months but found the data was not timely, was unavailable for many providers, and still required a fair amount of manual sorting, filtering and extracting, so it did not save much staff time. Deakin is considering trialling 360 Counter in 2013.

·  Many ebook usage reports are still not COUNTER compliant, or even internally consistent.

·  Our discovery layer created some distorted statistics.

·  Some vendor reports show quite different figures for some months at mid-year compared with the end-of-year report. There are still plenty of inconsistencies and anomalies.
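The mid-year versus end-of-year discrepancies mentioned above could be surfaced with a simple automated comparison. This is a sketch, not a tool Deakin describes using: the function name, the 10% tolerance and the sample figures are all hypothetical.

```python
# Hypothetical anomaly check: flag months where the mid-year report and the
# end-of-year report from the same vendor disagree by more than a tolerance.
def report_discrepancies(mid_year, end_year, tolerance=0.10):
    """Return (month, mid_count, end_count) tuples that differ by more
    than `tolerance` as a fraction of the larger figure."""
    flagged = []
    for month in mid_year:
        a, b = mid_year[month], end_year.get(month, 0)
        base = max(a, b, 1)          # avoid division by zero
        if abs(a - b) / base > tolerance:
            flagged.append((month, a, b))
    return flagged

# Made-up figures: the vendor's end-of-year report revised March downward.
mid = {"Jan": 881, "Feb": 923, "Mar": 1861}
end = {"Jan": 881, "Feb": 990, "Mar": 1640}
print(report_discrepancies(mid, end))   # [('Mar', 1861, 1640)]
```

Flagged months could then be queried with the vendor before the figures are used in renewal decisions.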

Journal package example
Year / Jan / Feb / Mar / Apr / May / Jun / Jul / Aug / Sep / Oct / Nov / Dec / YTD Total / % change
2003 / 5 / 2 / 8 / 8 / 13 / 22 / 24 / 15 / 29 / 40 / 8 / 8 / 182
2004 / 10 / 9 / 48 / 35 / 44 / 24 / 14 / 55 / 80 / 100 / 45 / 15 / 479 / 163%
2005 / 96 / 86 / 305 / 572 / 586 / 212 / 232 / 103 / 202 / 62 / 159 / 113 / 2,728 / 470%
2006 / 323 / 182 / 620 / 579 / 501 / 232 / 298 / 556 / 558 / 441 / 257 / 182 / 4,729 / 73%
2007 / 238 / 257 / 816 / 922 / 1,104 / 452 / 420 / 1,070 / 813 / 709 / 404 / 236 / 7,441 / 57%
2008 / 323 / 404 / 884 / 1,541 / 1,439 / 661 / 525 / 1,072 / 1,036 / 958 / 454 / 286 / 9,583 / 29%
2009 / 409 / 398 / 1,334 / 1,318 / 1,122 / 525 / 718 / 1,211 / 942 / 727 / 654 / 465 / 9,823 / 3%
2010 / 523 / 617 / 854 / 1,602 / 1,231 / 692 / 750 / 1,270 / 862 / 745 / 656 / 599 / 10,401 / 6%
2011 / 600 / 627 / 1,258 / 1,741 / 1,770 / 828 / 952 / 1,740 / 1,350 / 777 / 755 / 538 / 12,936 / 24%
2012 / 881 / 923 / 1,861 / 2,697 / 2,441 / 1,446 / – / – / – / – / – / – / 10,249 (Jan–Jun)
Year / Total downloads / Sub cost / Cost per download / Pro-rata (months)
2007 / 7,441 / $00000 / $2.06
2008 / 9,583 / $00000 / $1.50
2009 / 9,823 / $00000 / $1.38
2010 / 10,401 / $00000 / $1.30
2011 / 12,936 / $00000 / $0.96
2012 / 10,249 / $00000 / $0.83 / 6
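The cost-per-download column above, including the pro-rata treatment of the partial 2012 year, can be sketched as follows. The subscription cost is a placeholder, since the real figures are redacted in the table; the download counts are taken from the 2011 and 2012 rows.

```python
# Hypothetical cost-per-download calculation, including a pro-rata figure for
# a partial year (e.g. 2012 with only six months of data). The $12,500
# subscription cost is a made-up placeholder for the redacted "$00000" values.
def cost_per_download(sub_cost, downloads, months_elapsed=12):
    """Cost per download; pro-rate the annual cost when the year is incomplete."""
    effective_cost = sub_cost * months_elapsed / 12
    return round(effective_cost / downloads, 2)

print(cost_per_download(12500, 12936))                    # 0.97 (full year)
print(cost_per_download(12500, 10249, months_elapsed=6))  # 0.61 (Jan-Jun only)
```

Whether the spreadsheet pro-rates the cost or annualises the downloads is not stated; this sketch assumes the former.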


RMIT University Library Electronic Resources Usage Statistics

This document records the variety of statistics currently kept within the University Library relating to electronic database usage.

2. Electronic resources usage

1.  Automated Data Collection

We use the Ex Libris UStats product, which harvests COUNTER/SUSHI-compliant data from vendors. It is limited to vendors that are SUSHI compliant and does not cover all of the vendors we subscribe to.

We collect only journal requests, not database searches or database sessions.

We have not manually uploaded any other statistical data into UStats.
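A SUSHI harvest of the kind UStats performs is a SOAP exchange whose body is a ReportRequest. The sketch below builds such a request body; the requestor and customer IDs are placeholders, and while the element names follow the NISO SUSHI schema, they should be checked against each vendor's implementation before use.

```python
from string import Template

# Minimal sketch of a COUNTER-SUSHI ReportRequest body (the payload a
# harvester like UStats sends inside a SOAP envelope). All credential values
# below are placeholders, not real accounts.
SUSHI_REQUEST = Template("""\
<ReportRequest xmlns="http://www.niso.org/schemas/sushi"
               xmlns:c="http://www.niso.org/schemas/sushi/counter">
  <Requestor>
    <ID>$requestor_id</ID>
  </Requestor>
  <CustomerReference>
    <ID>$customer_id</ID>
  </CustomerReference>
  <ReportDefinition Name="JR1" Release="3">
    <Filters>
      <UsageDateRange>
        <Begin>$begin</Begin>
        <End>$end</End>
      </UsageDateRange>
    </Filters>
  </ReportDefinition>
</ReportRequest>
""")

request_xml = SUSHI_REQUEST.substitute(
    requestor_id="example-requestor",   # placeholder credential
    customer_id="example-customer",     # placeholder customer ID
    begin="2012-01-01",
    end="2012-06-30",
)
print(request_xml)
```

The point of the protocol is visible here: one request fetches a whole JR1 date range, removing the manual download step described elsewhere in this document.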

2.  Manual data collection

Statistics are downloaded or received from publishers, either manually or automatically (files are sent by publishers following email alerts). This data, usually in .csv files, is uploaded to the Library staff intranet.

3.  ERM

Ex Libris Verde – usage data is linked from within Verde back to activated targets in UStats.

No other usage data (or links to data) have been manually uploaded into Verde.

4.  Other systems used

Statistics direct from usage of the EZproxy server

Data is taken from hits on the EZproxy server every time a customer reaches the NDS login page. The statistics are updated daily and collected monthly. The data is collected by AWStats, but this is slowly being phased out.
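Counting login-page hits from an EZproxy access log, as AWStats does here, amounts to pattern-matching the log lines. The sketch below assumes Apache common log format and a "/login?url=" path; both the sample lines and the path are illustrative, not RMIT's actual configuration.

```python
import re
from collections import Counter

# Hypothetical EZproxy access-log lines in Apache common log format.
# The "/login?url=..." pattern identifies which resource the user was
# logging in to; real EZproxy configurations may log differently.
LOG_LINES = [
    '10.0.0.1 - - [12/Jun/2012:09:15:02 +1000] "GET /login?url=http://example.com/db HTTP/1.1" 200 1024',
    '10.0.0.2 - - [12/Jun/2012:10:02:11 +1000] "GET /login?url=http://example.org/journal HTTP/1.1" 200 2048',
    '10.0.0.1 - - [13/Jun/2012:14:30:45 +1000] "GET /login?url=http://example.com/db HTTP/1.1" 200 1024',
]

LOGIN = re.compile(r'\[(\d{2}/\w{3}/\d{4}):[^\]]*\] "GET /login\?url=(\S+) ')

def logins_per_resource(lines):
    """Count login-page hits per target resource URL across the log."""
    counts = Counter()
    for line in lines:
        m = LOGIN.search(line)
        if m:
            counts[m.group(2)] += 1   # group(2) is the target resource URL
    return counts

print(logins_per_resource(LOG_LINES))
```

Logins per resource is exactly the figure the Melbourne section later notes RAPTOR provides, and which makes COUNTER and non-COUNTER resources comparable.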

SFX statistics

Only high-level data is collected – the number of sessions, searches and downloads each month, cumulated year to date. This provides an instant snapshot.

MetaLib statistics

Only high-level data is collected – the number of searches and downloads from targets within the Ex Libris MetaLib software. Data is collected monthly and then cumulated year to date.
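The monthly-to-YTD cumulation used for both the SFX and MetaLib figures is a running sum. A minimal sketch, with made-up monthly search counts:

```python
from itertools import accumulate

# Hypothetical monthly search counts (Jan-Apr); real figures would come from
# the monthly SFX/MetaLib reports described above.
monthly_searches = [120, 95, 210, 180]

# Running year-to-date totals, kept alongside the monthly figures.
ytd = list(accumulate(monthly_searches))
print(ytd)   # [120, 215, 425, 605]
```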


Collection of Electronic Usage Statistics at the University of Waikato

·  Statistics are collected from all database / journal vendors who make them available and that we know about

·  Preference is given to COUNTER-compliant statistics, but others are collected if those are all that are available

·  Only full-text downloads, logins and searches are collected

·  Statistics are collected by Library staff and loaded on to spreadsheets

·  Where possible alerts are set up to either email the appropriate statistics or to send an email alert that the statistics for the latest month are available

·  Statistics from major databases are collected monthly, other less frequently, maybe just once a year

·  No harvesting applications (SUSHI-based tools etc.) are used, as it is difficult to justify the expense and none of them is comprehensive in coverage

·  At the end of each year statistics are cumulated and entered on to a single spreadsheet which includes cost per search and per full-text download

·  Statistics collected are a useful guide at renewal time, but a number of other things need to be taken into account


University of Melbourne

UoM and COUNTER Usage Data:

UoM has an unsatisfactory and incomplete approach to gathering, consolidating, analysing and making available COUNTER usage data.

1. Record-keeping: We record the availability of COUNTER statistics from the publisher as part of the acquisition process. We then record relevant details for access and delivery methods.

2. Automated Data Collection: We use SwetsWise Selection Support (ScholarlyStats enhanced with cost-per-use data from Swets subscription cost information) to manage the gathering, consolidation and user-driven reporting of data from the 48 largest publishers/platforms.

a. SSS covers a minority of our COUNTER-compliant publishers/platforms (48 of ~150) but a preponderance of total COUNTER downloads.

b. Because the availability of cost-per-use data depends on the title or package being subscribed via Swets, this is not, in fact, a useful feature.

c. SSS is an unattractive choice for casual use, so it is essentially a tool for providing data on demand.

3. Manual Data Collection:

a. For data not covered by SSS (around 100 other publishers or platforms), we gather data from publisher sites based on customer-initiated email alerts from the publisher. This data is gathered in a piecemeal fashion, often only in response to a specific need, such as gathering evidence for a renewal or cancellation decision.

4. III ERM development project:

a. The current situation is fragmented and unsatisfactory. We plan to reach as near complete coverage as practicable, to exploit automated processes as much as we can, and to make use of the JR1 stats management capacity of our III ERM system. To that end we have begun a project to move as many reports as possible from SSS to ERM, and to include as many of the rest in ERM as staff availability allows.

i. We intend to use SUSHI as much as possible to ingest data directly from publisher sites into ERM.

ii. We intend to use capacity freed up in our SSS account to accommodate reports from non-SUSHI-compliant publishers. We can then transfer these via SUSHI from SSS into ERM.

b. Because of the present partial coverage, our broader library usage reporting makes no reference to COUNTER or other publisher-provided reports. It is our intention that ERM will allow us to provide complete and timely JR1 reports for both internal and external reporting purposes.

5. Other systems used:

a. SFX click-through statistics can provide a snapshot of usage with little staff effort. However, SFX tends to undercount usage and is incomplete in its coverage.

b. EZproxy logfile analysis using AWStats is used where COUNTER stats are not available, or to supplement non-COUNTER-compliant publisher reports. It has occasionally been useful in cancellation/retention decisions. The current AWStats server has reached the end of its useful life and we are looking at RAPTOR as a replacement. RAPTOR promises to be much more useful in that, unlike AWStats, it gives accurate figures on logins per user-defined resource, thus providing a way to meaningfully compare COUNTER and non-COUNTER resources. It can also integrate LDAP data to show usage per e-resource by the aggregated demographic characteristics of the user population. It promises to be an important and unlaborious complement to COUNTER reports. More information on RAPTOR at http://iam.cf.ac.uk/trac/RAPTOR


University of Western Sydney

JR1 reports are run as needed for our main databases, ebook and streaming video subscriptions.

Cost per download is then calculated and this information is used when deciding to renew subscriptions.

We are currently looking at the feasibility of running the JR1 reports every six months.

Google Analytics is used to provide insights into how visitors are using our library website, discovery layer, research repository, special collections, library catalogue, subject guides etc.
