Honeycomb Bulge (image: Michael Coghlan, https://www.flickr.com/photos/mikecogh/9179980498)

Last week I went to CIKM (Conference on Information and Knowledge Management) in Melbourne. I was primarily there for Friday’s ‘[Evaluation of Collaborative Information Seeking and Retrieval]1’ workshop, at which I presented two short papers from my PhD work. One covered the kinds of learning indicators we might extract from collaborative information seeking (CIS) tasks; the other covered a temporal analysis of such data in a learning context, and how that might be effectively conducted.

Simon tells us about learning indicators during the IS&R process @ECol2015 conference pic.twitter.com/EXklLiFqJo

— Leif Azzopardi (@leifos) October 23, 2015

On the Monday I also attended a couple of tutorials: one on ‘Data Analytics on Social Media & Social Networks’, which gave a nice overview of social network analyses, and the other on veracity mining in big data. The latter tutorial included some [interesting work]2 on how one can check the veracity of claims across multiple sources, and on how information (or misinformation, such as rumours) traverses networks. I was particularly excited to see Jaime Teevan keynoting on the first day, on the topic of ‘slow search’ – the value of increasing information seeking duration (contra the usual search engine aim of reducing the time it takes you to find an answer).

conference is about to start. Follow along with my SlowSearch keynote online: https://t.co/46WdBbQcqB — Jaime Teevan (@jteevan) October 19, 2015

Liveblogging conference https://t.co/vVOVxXeFLX currently Jamie Teevan keynote on Slow Search.

— Jeff Dalton (@JeffD) October 19, 2015

as we search we reframe needs, see content in new ways, change way we interact with content – vocab & results diversify @jteevan conference

— Simon Knight (@sjgknight) October 19, 2015

The keynote included a couple of nice social-science manipulations of search engine interactions and information seeking contexts.

Bing Distill – SQ&A via queries & community created answers to those queries https://t.co/yzOAgMtNcR – interesting conference — Simon Knight (@sjgknight) October 19, 2015

 

Will crowdsourcers behave maliciously for you? Cool psychology manipulation from @jteevan et al https://t.co/viBibNJFC1 conference — Simon Knight (@sjgknight) October 19, 2015

A couple of talks across the conference referred to the identification of factive or assertive statements – statements that either make a factual claim or presuppose one – and how such information might be used.

http://t.co/AKR24chOQb …ClaimBuster examines any transcript to find sentences that should be fact checked conference

— Simon Knight (@sjgknight) October 18, 2015

BiasWatch: […] Discovering and Tracking Topic-Sensitive Opinion Bias in Social Media http://t.co/wL0hZbBqos cc @regis_alenda conference — Dr. Bisounours (@BisounoursJp) October 19, 2015

news suggestions for wikipedia entities https://t.co/TxaNvKJr5n cikm15 Like @edsaperia openaccess reader https://t.co/OgmsvS3xUW @fetahubesnik

— Simon Knight (@sjgknight) October 20, 2015

Linguistic Models for Analyzing and Detecting Biased Language https://t.co/oDWxX3eGfR Using wikipedia edit history – v cool

— Simon Knight (@sjgknight) October 20, 2015

For example, one talk discussed ‘[ClaimBuster]3’, which takes US presidential debate transcripts and flags the sentences that are most ‘fact checkable’, with the aim of highlighting the claims that fact-checking organisations should focus on (reducing their filtering load). I can see some potential application of this kind of approach within education, and will publish a blog post on that soon. CIKM also wins the prize for most surreal conference dinner location (an aquarium) – I was sat next to the crocodile…

Dinner companion. conference pic.twitter.com/3QVqJ330P4 — I am K (@vanessa_murdock) October 21, 2015

Footnotes

  1. http://www.irit.fr/ECol2015/

  2. http://daqcri.github.io/dafna/#/dafna/home_sections/home.html

  3. http://ranger.uta.edu/~cli/pubs/2015/claimbuster-cikm15-hassan.pdf