
It’s always good to find out about new project management tools and tips, so it was well worth spending a few hours the other week at a training session introducing the One Page Project Management (OPPM) approach. OPPM was something that I was vaguely aware of but not something that I knew much about, and I probably started from a position of slight scepticism that it was possible to encapsulate everything you needed for reporting on your project onto a single page. Well, not at A4 size anyway.

The training, run by David Sommer, was based around the One Page Project Management idea from Clark Campbell (you can see more information about OPPM and download a free version at their website https://www.oppmi.com/).  I understand David runs the UKSG Practical Project Management courses.

[Screenshot: One Page Project Management website]

The training covered the ideas behind the concept and then concentrated on working through the template, establishing your project and showing how you use it on a day-to-day basis. It was good to then be able to run through a real project and try to fit it into the template.

The template includes a header with things like the project goal and completion date, then a set of five or so objectives, a number of tasks and a list of people. The template forms a matrix, a bit like a Gantt chart, that shows progress with your project. So you start with open circles in each time period on the timeline for an activity, and then fill in the circles when that activity for that time period has been completed. The template also lets you assign people and denote their roles, and we had a bit of a debate about how we used the notation in that area.

Also in the template was space for some more subjective measures that essentially capture ‘confidence’, using a Red/Amber/Green traffic light system to denote, for example, how confident you are that the project will deliver on time. There’s also a section for a couple of measures that might be cost, staff resource or % of content ingested, for example. And finally there’s a small box for a commentary.
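To make the shape of the template a bit more concrete, here’s a minimal sketch (in Python, purely as illustration; the class and field names are my own and nothing to do with the official template) of how you might model the matrix of tasks, time periods and circles, along with the RAG confidence measure:

```python
from dataclasses import dataclass, field

# Illustrative model of the OPPM matrix: tasks run across time periods,
# each period starting as an open circle (False) and filled in (True)
# when that period's work is complete.

@dataclass
class Task:
    name: str
    owner: str                                    # person assigned to the task
    periods: list = field(default_factory=list)  # one bool per time period

    def complete(self, period: int):
        """Fill in the circle for a completed time period."""
        self.periods[period] = True

@dataclass
class Oppm:
    goal: str
    completion_date: str
    objectives: list                              # the five or so objectives
    tasks: list = field(default_factory=list)
    rag: str = "Green"                            # subjective Red/Amber/Green confidence

plan = Oppm(
    goal="Migrate the library website",
    completion_date="2013-07-31",
    objectives=["Website usability testing completed"],
)
plan.tasks.append(Task("Usability testing", owner="AB", periods=[False] * 6))
plan.tasks[0].complete(0)   # first time period done: fill the circle
```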

A few things struck me during the session. It’s interesting that it makes you think really carefully about your objectives for your project, and there was a tip about phrasing your objectives in the past tense, e.g. ‘website usability testing completed’. It looks quite good at getting people to focus on the key information and on where there are key decisions to be made. Often there’s a tendency in projects to focus on what’s been done rather than on what is still to do. I’m always in favour of ‘exception’ reporting, where attention is focused on the things that are going off track or have changed, particularly where they need some action to get things back on track. It looks like OPPM might be helpful for that.
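Continuing the sketch above, exception reporting could be as simple as pulling out the tasks whose circles should have been filled in by now but haven’t been; the function name is my own invention:

```python
def exceptions(plan, current_period):
    """List tasks with unfilled circles in any period up to the current
    one, i.e. work that is off track and needs attention."""
    return [
        task.name
        for task in plan.tasks
        if not all(task.periods[: current_period + 1])
    ]

print(exceptions(plan, current_period=2))   # -> ['Usability testing']
```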

Using OPPM as a focus of project meetings is also an interesting idea. It was said that what tends to happen is that people become more focused on making sure they have done their tasks by the time of the meeting. That’s a perpetual bane of a project manager’s existence, so I liked that idea. I also liked the suggestion that you should print out your OPPM plan at A3 size and stick it on the wall. People passing could then see how your project was going at a glance. And maybe steer clear if there were too many uncompleted tasks.

Overall it was a good day’s training session, quite practical and concentrating on working through the template and seeing how it works in practice. We’re planning on adopting it for our project reporting processes, so it will be good to see how we get on with it. Across our teams we’ve several projects running, so it will be interesting to see how it copes with the different types of projects that we have. But first impressions are that it does pretty much what it says, in giving you a method of encompassing the key messages about the progress of your project in a single page.

Encouraged by some thinking about what sort of prototype resource usage tools we want to build to test with users in a forthcoming ‘New tools’ section, I’ve been starting to think about what sort of features you could offer to library users to let them take advantage of library data.

Early steps
For a few months we’ve been offering users of our mobile search interface (which just does a search of our EBSCO discovery system) a list of their recently viewed items and their recent searches. The idea behind testing it on a mobile device was that giving people a link to their recent searches or items viewed would make it easier for them to get back to things that they had accessed on their mobile device by clicking single links, rather than having to bookmark them or type in fiddly links. At the moment the tool just lists the resources and searches you’ve done through the mobile interface.

[Screenshot: Mobile search results screen]
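Behind the scenes, a feature like this only really needs a small per-user store of recent activity. Here’s a hypothetical sketch of what that might look like; none of the names reflect our actual implementation:

```python
from collections import deque

RECENT_LIMIT = 10       # keep only the last ten entries per user

recent_searches = {}    # user id -> deque of recent search strings
recent_items = {}       # user id -> deque of (title, url) pairs

def record_search(user, query):
    recent_searches.setdefault(user, deque(maxlen=RECENT_LIMIT)).append(query)

def record_item(user, title, url):
    recent_items.setdefault(user, deque(maxlen=RECENT_LIMIT)).append((title, url))

# A returning user then gets single-click links back to what they saw:
record_search("u123", "learning analytics")
record_item("u123", "One Page Project Management", "http://example.org/item/42")
print(list(recent_items["u123"]))
```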

But our next step is to make a similar tool available through our main library website as a prototype of the ‘articles I’ve viewed’ idea. And that’s where we start to wonder whether the mobile version of the searches/results should be kept separate from the rest of your activities, or whether user expectations would be that, like a Kindle ebook that you can sync across multiple devices, your searches and activity should be consistent across all platforms.

At the moment our desktop version has all your viewed articles, regardless of the platform you used. But users might in future want to know which device they used to access the material, perhaps because some material isn’t easily accessible through a mobile device. That opens up another question: the mobile version and the desktop version may be at different URLs, so you might want them to be pulled together as one resource, with automatic detection of your device when you go to access the resource.

[Screenshot: Articles I’ve read]
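One way to square that circle, sketched very roughly below, is to keep a single activity stream per user, tag each entry with the device it came from, and detect the device again at access time to hand back the appropriate URL. The URL patterns and the user-agent test are illustrative assumptions, not our actual setup:

```python
activity_log = []   # one shared stream per user, whatever device they used

def record_view(user, item_id, device):
    # Record the device alongside the view, so it can be shown later.
    activity_log.append({"user": user, "item": item_id, "device": device})

def url_for(item_id, user_agent):
    """Serve the same logical resource at the right URL for the device."""
    if "Mobile" in user_agent:
        return f"http://library.example.org/m/item/{item_id}"
    return f"http://library.example.org/item/{item_id}"

record_view("u123", "abc42", device="mobile")
print(url_for("abc42", user_agent="Mozilla/5.0 (iPhone) Mobile Safari"))
```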

Next steps
With the data about what resources are being accessed and what library web pages are being accessed, it starts to open up the possibility of some more user-centred use of library activity and analytics data.

So you could conceive of spotting a spike of users accessing the Athens problems FAQ page and being able to tie that to users trying to access Athens-authenticated resources. Being able to match activity with students being on a particular module could allow you to automatically push some more targeted help material, maybe into the VLE website for relevant modules, as well as flag up a potential issue to the technical and helpdesk teams.
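As a very rough sketch of what that first step might look like, you could compare today’s hits on the FAQ page against a recent baseline and raise a flag when they jump; the threshold here is an arbitrary assumption:

```python
def faq_spike(daily_hits, threshold=3.0):
    """daily_hits: list of hit counts on the FAQ page, most recent last.
    Flag a spike when today's count far exceeds the recent average."""
    baseline = sum(daily_hits[:-1]) / max(len(daily_hits) - 1, 1)
    return daily_hits[-1] > threshold * baseline

if faq_spike([12, 9, 14, 11, 48]):
    print("Possible Athens authentication problem - alert the helpdesk "
          "and push help material to the relevant modules' VLE sites")
```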

You could also contemplate mining reading lists and course schedules to predict when particular activities are scheduled, and automatically push relevant help and support or online tutorials to students. Some of the most interesting areas seem to me to be around building skills and using activity (or lack of activity) to trigger promotion of targeted skills-building activities. So, knowing that students on module X should be doing an activity that involves looking at a given set of resources, you could detect the students who haven’t accessed those resources and offer them some specific help material, or even contact from a librarian. Realistically those sorts of interventions simply couldn’t be managed manually and would have to rely on some form of learning analytics-type trigger system.
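That kind of trigger is conceptually simple even if the plumbing isn’t. As a hypothetical sketch, it’s essentially a set difference between the students enrolled on the module and the students with a recorded access to the expected resources; all the names here are illustrative:

```python
def students_needing_help(enrolled, access_log, expected_resources):
    """access_log: iterable of (student, resource) access events.
    Return enrolled students with no recorded access to the expected
    resources, i.e. candidates for an automatic intervention."""
    accessed = {student for student, resource in access_log
                if resource in expected_resources}
    return set(enrolled) - accessed

enrolled = {"s1", "s2", "s3"}
log = [("s1", "db-A"), ("s1", "db-B"), ("s3", "db-A")]
for student in students_needing_help(enrolled, log, {"db-A", "db-B"}):
    print(f"Offer {student} targeted help material, or contact from a librarian")
```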

One of the areas that would be useful to look at would be some form of student dashboard for library engagement. This could give students some data about what engagement they have had with the library, e.g. resources accessed, library skills completed, library badges gained, library visits, books/ebooks borrowed and so on, maybe set against averages for their course, and perhaps with some metrics about what high-achieving students on the previous run of their course did. Add to that a bookmarking feature, lists of recent searches and resources used, and lists of loans/holds, finished off with useful library contacts and some suggested activities that might help them with their course, based on what is known about the level of library skills needed in the course.
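As a rough sketch of the data side of such a dashboard, assuming you can pull per-student counts for a course, the comparison against course averages is straightforward; all the metric names and numbers here are invented for illustration:

```python
from statistics import mean

course_metrics = {   # per-student engagement counts for one course
    "s1": {"resources": 40, "loans": 6, "visits": 12},
    "s2": {"resources": 15, "loans": 1, "visits": 3},
}

def dashboard(student):
    """Set one student's figures against the averages for their course."""
    mine = course_metrics[student]
    return {
        metric: {
            "you": value,
            "course_average": round(
                mean(m[metric] for m in course_metrics.values()), 1),
        }
        for metric, value in mine.items()
    }

print(dashboard("s2"))   # e.g. resources: you 15 vs course average 27.5
```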

Before you can do some of the more sophisticated learning analytics-type activities, I suspect it would be necessary to have a better understanding of the impact that library activities/skills/resources have on student retention and achievement. And that seems to me to argue for some really detailed work to understand library impact at a ‘pedagogic’ level.

I’d been thinking early this morning about writing up a blog post around some thoughts about ‘Library Analytics’, and thinking that it was interesting how ‘Library Analytics’ had been used by Harvard for their ‘Library Analytics Toolkit’ and by others as a way of talking about web analytics, but that neither really seemed to me to be quite analogous to the way that the Learning Analytics community, such as SoLAR, views analytics. There are several definitions of Learning Analytics; this one is from Educause’s 7 things you should know about first-generation learning analytics:

Learning analytics (LA) applies the model of analytics to the specific goal of improving learning outcomes. LA collects and analyzes the “digital breadcrumbs” that students leave as they interact with various computer systems to look for correlations between those activities and learning outcomes. The type of data gathered varies by institution and by application, but in general it includes information about the frequency with which students access online materials or the results of assessments from student exercises and activities conducted online. Learning analytics tools can track far more data than an instructor can alone, and at their best, LA applications can identify factors that are unexpectedly associated with student learning and course completion.

Much of the library interest in analytics seems to me to have mainly been about using activity data to understand user behaviour and make service improvements, but I’m increasingly of the view that whilst that is important, it is only half the story. One of the areas that interests me about both learning analytics and activity data is the empowering potential of that data as a tool for the user, rather than the lecturer or librarian, to find out interesting things about their behaviour, or get suggested actions or activities, and essentially to be able to make better choices. And that seems to be the key: just as reviews and ratings on sites like TripAdvisor are helping people to be informed consumers, so we should be building library systems that help our users to be informed library consumers.

So it was great to see the announcement of the JiscLAMP project this morning (http://infteam.jiscinvolve.org/wp/2013/02/01/jisc-lamp-shedding-light-on-library-data-and-metrics/), launching the Library Analytics and Metrics project and talking about delivering a prototype shared library analytics service for UK academic libraries. I was particularly interested to see that the plan is to develop some use cases for the data, and it was great that Ben Showers shared some of the vision behind the idea. It’s a great first step to put data on a solid, consistent and sustainable basis, and should build a good platform to be able to exploit that vast reservoir of library data.
