I’m keen on using MediaWiki as a platform for learning analytics. Projects which can use analytics on Wikipedia (and related MediaWiki projects) are interesting because a) they’re hopefully useful to the individual project, and b) they provide a proof of concept for use outside MediaWiki, within school-hosted environments, etc.
One thing I’ve been talking to people about recently is the possibility of analytics (and/or badging) on user contributions, as a way to ‘accredit’ training (either within dummy wikis or from previously existing data), and perhaps to provide pointers to more experienced editors about areas they might like to explore for new training. The idea is to support incoming users by checking they can actually deploy the lessons learnt in real wiki contexts, and also to help moderately experienced editors learn new skills and engage with the training materials (which might also encourage them to take part in outreach/education projects – so much the better!).
Expect an update on this when I’ve had time to actually play…but a really simple (dumb) way to do this is:
- Scrape the tables from user contributions pages (e.g. using the gdocs scraper – there are various options; importHtml doesn’t seem to play well, so I’ll have to look through Martin Hawksey’s great guide)
- For each edit referred to, extract the content of the edit to see what was altered (this might involve a comparison against the previous revision – and it won’t work for all edits). We can also look for flags like “N”, which indicates a new page was created
- Check the next edit to see if it was reverted
- Use a bag-of-words approach to see what was attempted (cross-referencing the revert check above to see whether it succeeded or not)
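The new-page check, the revert check, and the bag-of-words comparison could be sketched roughly as below. This is a minimal sketch, not a working pipeline: the record shapes mimic what MediaWiki’s usercontribs API returns, but the sample values are made up, and `looks_reverted` is a deliberately crude summary-text heuristic.

```python
from collections import Counter


def is_new_page(contrib):
    """Page creations carry a 'new' flag in the API output
    (the "N" seen in contribution listings)."""
    return bool(contrib.get("new"))


def looks_reverted(next_comment):
    """Crude heuristic: does the *next* edit's summary mention a revert?
    Real revert detection (e.g. revision SHA1 matching) is more robust."""
    lowered = next_comment.lower()
    return any(marker in lowered for marker in ("revert", "rv ", "undid", "undo"))


def words_changed(old_text, new_text):
    """Bag-of-words diff: words added and removed between two revisions,
    ignoring order and position entirely."""
    old_bag = Counter(old_text.lower().split())
    new_bag = Counter(new_text.lower().split())
    return new_bag - old_bag, old_bag - new_bag  # (added, removed)


# Fabricated contribution record, shaped like an API result
contrib = {"title": "Sandbox", "new": True, "comment": "created page"}
print(is_new_page(contrib))                                  # True
print(looks_reverted("Undid revision 12345 by Example"))     # True
added, removed = words_changed("the cat sat", "the cat sat down")
print(sorted(added))                                         # ['down']
```

The bag-of-words step is intentionally blunt: it tells you *what vocabulary* an edit introduced (templates, citation markup, talk-page signatures…) without needing a full diff algorithm, which is all the accreditation check needs at this level.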
Obviously we’re also interested in other things, like whether or not users contribute on talk and article pages, whether they edit a range of pages or just the one, whether they can upload to Commons, etc. But at a basic level, I think this is a pretty straightforward project.
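That breadth-of-activity check is easy to sketch too. Assuming the same usercontribs-style records (the `ns` field is the namespace number: 0 for articles, 1 for talk pages), the sample data below is fabricated for illustration:

```python
from collections import Counter


def contribution_profile(contribs):
    """Summarise breadth of activity from a list of contribution records:
    how many distinct pages were edited, and how edits spread across
    namespaces (ns 0 = article, 1 = talk, etc.)."""
    pages = {c["title"] for c in contribs}
    by_namespace = Counter(c["ns"] for c in contribs)
    return {
        "distinct_pages": len(pages),
        "edits_by_namespace": dict(by_namespace),
        "edited_talk": by_namespace.get(1, 0) > 0,
    }


# Fabricated sample records mimicking the API shape
sample = [
    {"title": "Ada Lovelace", "ns": 0},
    {"title": "Talk:Ada Lovelace", "ns": 1},
    {"title": "Ada Lovelace", "ns": 0},
]
print(contribution_profile(sample))
# {'distinct_pages': 2, 'edits_by_namespace': {0: 2, 1: 1}, 'edited_talk': True}
```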