Workshop on Implicit Measures of User Interests and Preferences
Friday, August 1, 2003
Workshop Overview
The goal of the workshop on Implicit Measures is to explore how various implicit measures of user interests can be used in information retrieval and filtering applications where it is difficult to obtain explicit user feedback. Since this is the first workshop on this topic at SIGIR, we encourage participation from people with different backgrounds and perspectives including predictive modeling, experimental analysis, and applications development. Applications from information retrieval, collaborative filtering, e-commerce, user modeling, and human-computer interaction are encouraged. As an outcome of the workshop we hope to identify key theoretical modeling issues, systematize engineering principles and best practices, and spark new research directions.
Important Dates
June 2, 2003: Position statements and abstract submissions due
June 16, 2003: Notification of acceptance
July 7, 2003: Final notebook statements and abstracts due
August 1, 2003: Workshop
NOTE: Everyone wishing to attend the workshop must submit a position statement by June 2, 2003. Submission details are below.
Organizers
Susan Dumais, Microsoft Research (sdumais@microsoft.com)
Krishna Bharat, Google (krishna@google.com)
Thorsten Joachims, Cornell University (tj@cs.cornell.edu)
Andreas Weigend, Amazon (aweigend@amazon.com)
Workshop Schedule (tentative)
9:00-9:15 - Introductions [S. Dumais, moderator]
9:15-10:00 - Invited Talk, Steve Lawrence (Google Inc.), Implicit feedback: Good may be better than best. [Extended Abstract] [Talk] [T. Joachims, moderator]
10:00-10:30 - D. V. Sreenath (Wayne State University), W. I. Grosky (University of Michigan-Dearborn), F. Fotouhi (Wayne State University), Deriving emergent web page semantics. [Extended Abstract] [Talk] [K. Bharat, discussant]
10:30-11:00 - break
11:00-11:30 - David Brown and Mark Claypool (Worcester Polytechnic Institute), Curious Browsers: Automated gathering of implicit interest indicators by an instrumented browser. [Extended Abstract] [Talk] [T. Joachims, discussant]
11:30-12:00 - Steve Fox (Microsoft), Evaluating implicit measures to improve the search experience. [Extended Abstract] [Talk] [T. Joachims, discussant]
12:00-12:30 - Douglas W. Oard (University of Maryland), Anton Leuski (University of Southern California), Stuart Stubblebine (Stubblebine Research Labs), Protecting privacy of observable behavior in distributed recommender systems. [Extended Abstract] [Talk] [A. Weigend, discussant]
12:30-2:00 - Lunch and Posters
Poster - Diane Kelly (Rutgers University), Understanding implicit feedback: A naturalistic user study. [Extended Abstract]
Poster - Thorsten Joachims (Cornell University), Evaluating retrieval performance using clickthrough data. [Extended Abstract] [Slides]
2:00-2:30 - Travis Bauer (Sandia National Laboratories), WordSieve: Learning task differentiating keywords automatically. [Extended Abstract] [Talk] [K. Bharat, discussant]
2:30-3:00 - J. Díez, J. J. del Coz, O. Luaces and A. Bahamonde (Universidad de Oviedo at Gijón), A clustering algorithm to find groups with homogeneous preferences. [Extended Abstract] [Talk] [S. Dumais, discussant]
3:00-3:30 - Jie Wu and Karl Aberer (Swiss Federal Institute of Technology, EPFL), Semantic web graph implied by user preferred activities. [Extended Abstract] [Talk] [A. Weigend, discussant]
3:30-4:00 - break
4:00-4:30 - Andreas Lorenz and Andreas Zimmermann (Fraunhofer Institute for Applied Information Technology), Inferring interests from user movements: The LISTEN approach. [Extended Abstract] [Talk - cancelled] [S. Dumais, discussant]
4:30-5:00 - Final Discussion and Wrap Up
Workshop Details
Background:
In most information retrieval or filtering applications, it is difficult to get explicit feedback from users about the relevance of the results, the appropriateness of the presentation, and, more generally, the quality of their experience. Yet researchers assume explicit judgments for many activities, such as tuning and selecting ranking algorithms, combining information, user modeling, and information presentation. This workshop will explore how implicit measures of user interest (such as dwell time, clickthrough, and user activities like annotation, printing, and purchasing) can be used to develop predictive models for a variety of purposes. Example uses in the context of information retrieval include: improved ranking and relevance assessment (e.g., the extent to which implicit measures can be used to evaluate the quality of systems, ranking algorithms, and recommendations, or as input to relevance feedback algorithms); personalization in search, filtering, or presentation; personalization in the large, considering both individual and aggregate data; browsing or searching agents; automatic hyperlink generation; and adaptive web site design. We also encourage examination of theoretical issues such as modeling approaches (Bayesian techniques and other predictive models), gold standards for user behavior (e.g., relevance judgments, purchases), combining implicit and explicit preferences, and the biases introduced by reliance on implicit measures.
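As a toy illustration of the kind of predictive modeling the workshop targets, implicit signals such as clickthrough, dwell time, printing, and bookmarking might be combined into a per-document interest score. This is a minimal sketch under assumed signal names and weights; it is not a method proposed by any workshop participant:

```python
# Hypothetical sketch: combine implicit interest indicators into one score.
# The signal names and weights below are illustrative assumptions only.

def interest_score(clicked, dwell_seconds, printed, bookmarked):
    """Weighted combination of implicit interest indicators for one page."""
    score = 0.0
    if clicked:
        score += 1.0                          # user selected the result
    score += min(dwell_seconds, 300) / 300.0  # reading time, capped at 5 min
    if printed:
        score += 2.0                          # printing suggests strong interest
    if bookmarked:
        score += 2.0                          # saving for later likewise
    return score

# A result that was clicked and read for 60 seconds:
print(interest_score(True, 60, False, False))  # 1.2
```

In a real system such scores would serve as noisy relevance labels or features for a trained model rather than being used directly, which is exactly where the biases discussed above (e.g., position bias in clickthrough) become important.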
We encourage presentations and participation from both researchers interested in theoretical issues and practitioners who have deployed systems that make use of implicit indicators of user interest.
Topics of interest include, but are not limited to:
Novel techniques for collecting and using user activity data
Information retrieval or collaborative filtering applications that use implicit indicators of interest (e.g., buying, usage or reading patterns) rather than ratings from users
Integrating information about individual and aggregate group behaviors for search or collaborative filtering
Adaptive web presentation and content selection based on inferred preferences (e.g., surfing assistants, intelligent agents, personalized web pages and sites)
Practical issues in deployment of personalized systems (e.g., how do users react; are users confused by non-repeatability; how can users manage state)
Personal profiles and feedback influencing search results
Merging ranking from multiple sources using implicit usage information
Search or relevance evaluation with implicit measures
Modeling the evolution of interests and preferences over time or across tasks
Effective predictive modeling techniques
Limitations and biases encountered when relying on implicit measures
Privacy issues
Planned Activities:
Planned activities include invited talks and introductory overviews in key areas, and short talks grouped by topic, each followed by substantial discussion periods. The workshop will also include a closing discussion of lessons learned, open issues, and potential follow-ups.
Submission Process for Participants
Two kinds of participation are solicited:
Attendees: researchers and practitioners with interests in implicit measures.
Presenters: Presentations about predictive modeling techniques or experiences with systems that have used implicit measures of user interest, preferences, etc.
All applicants must submit a short position statement (maximum 250 words) describing their background and interest in implicit measures of user interests and preferences. Participants who would like to present should, in addition, submit an abstract (maximum 750 words) detailing the major points and/or results they would cover in a talk. The program committee will review submissions of both kinds and issue invitations. Workshop attendance will be limited to presenters and attendees selected by the committee based on their submissions.
Submissions are due by June 2, 2003 (5pm PST). Email your submissions (ASCII or PDF preferred) to Susan Dumais, sdumais@microsoft.com.
A notebook of participants’ position statements and presentation descriptions will be distributed to all attendees. A summary of workshop activities and emerging issues and trends will be prepared for the SIGIR Forum by the organizers.
Last modified Apr 28, 2003. Please send comments or corrections to sdumais@microsoft.com