With the deployment of interactive media at the interface between users and machines, and the massive amounts of data that such interfaces generate, the dream of personalizing a user’s experience is now a reality: searching the web is personalized based on the user’s past searches; buying goods is personalized based on what other users with a similar profile have searched for and bought in the past; attending a class in an educational setting is personalized based on the strengths and weaknesses of the student; taking a virtual tour of a museum is personalized based on the interests and constraints of the visitor. At the heart of personalization lies the simple premise that a model of a user’s behavior can be learned through interaction with the user, and that once that model has been learned, one can actively intervene to improve the user’s experience based on it.
Missing from this premise is the realization that any intervention has the side effect of altering the user’s predisposition toward the choices they face. A suggestion on how to proceed may elicit either compliance or defiance, so that the user’s behavior deviates from what it would have been had the suggestion been absent. The IF project seeks to formally analyze how such side effects propagate through decision-making, and to design elicitation and intervention processes that are impervious to them. Such introspective processes are essential to prevent a vicious cycle in which machines and users reinforce a false model of how the users are thought to behave.
Computational Cognition Lab, Open University of Cyprus
Project Website: http://cognition.ouc.ac.cy/loizos?get=Michael_2015_DisembodiedPredictors