Human–computer interaction tools with gameful design for critical thinking about the media ecosystem: a classification framework

In response to the ever-increasing spread of online disinformation and misinformation, several human–computer interaction tools to enhance data literacy have been developed. Many of them employ elements of gamification to increase user engagement and reach a broader audience. However, there are no systematic criteria for analyzing their relevance and impact in building resilience to fake news, partly due to the lack of a common understanding of data literacy. In this paper, we put forward an operationalizable definition of data literacy as a form of multidimensional critical thinking. We then survey 22 existing tools and classify them according to a framework of 10 criteria covering their gameful design and educational features. Through a comparative and contrastive analysis informed by a focus group, we provide a principled set of guidelines for developing more effective human–computer interaction tools to teach critical thinking in the current media ecosystem.

Introduction

The infodemic has shown that the public's ability to recognize false and misleading information during a disease outbreak is crucial to diminishing (a) risk-taking behaviors by the misinformed and (b) mistrust in institutions and media, which hampers public health responses and recovery. Due to the proliferation of information across digital media, fact-checking is struggling to keep up with the spread of misinformation. As a result, the number of infodemically vulnerable people is increasing at a rapid pace (https://reutersinstitute.politics.ox.ac.uk/UK-COVID-19-news-and-information-project).

The phenomenon of misinformation has been exacerbated by the advent of the Networked Society and the rise of AI systems designed to accomplish tasks by mimicking how the brain works rather than by helping humans evaluate their reasoning patterns. As a result, while advances in natural language generation leveraging GPT-3 have yielded systems able to produce news articles indistinguishable from those written by journalists (https://www.theguardian.com/commentisfree/2020/sep/08/robot-wrote-this-article-gpt-3), automatic fact-checking systems still struggle to identify disinformation bundles.

To counter this scenario, the European Commission advocated as early as 2007 for a media literacy campaign targeting media systems in the digital infosphere, including their economic and cultural dimensions. This urgency was reiterated in the Digital Competence Framework for Citizens proposed by the Digital Education Action Plan (2018), with a stress on data literacy. The skills designated under this umbrella term are received differently by different demographics and, owing to fast-paced changes in the media ecosystem, constitute a challenge for standard curricula.

Gamified environments based on human–computer interaction have proved to be efficient learning tools: while boosting users' digital skills, they provide rapid feedback, guarantee the freedom to fail, and offer a sense of progression within a storytelling environment that prompts focus and concentration (Stott and Neustaedter 2013). A variety of media literacy games hosted on digital platforms are currently publicly accessible. As underlined by Miles and Lyons' cross-indexing (2019), these games differ in terms of learning outcomes, type of learning experience (e.g., knowledge anticipation vs. reflection), and type of action (e.g., simulation or puzzle). However, there is currently no systematic framework to assess which data literacy skills are addressed or to evaluate a system's design against its desired outcomes. This is partially due to the lack of an agreed-upon notion of data literacy, which remains a blurred concept that must be updated hand in hand with changes in the digital media ecosystem. More specifically, as pointed out by Carmi et al. (2020), there is confusion among scholars and policy makers about what critical thinking means when applied to the digital ecosystem. As a result, existing human–computer interaction tools address only some aspects of the critical thinking skills required for media literacy in the online (mis)information ecosystem. To inform the future development of such educational tools, we propose a classification framework to analyze existing tools from a functional perspective.

The paper is organized as follows: Sect. 2 discusses the underpinnings of critical thinking for media literacy in relation to the misinformation ecosystem within the digitized society, identifying five components. Drawing from the operationalizable definition of critical thinking for media literacy and from the literature review, Sect. 3.1 puts forward a suite of 10 criteria to classify human–computer interaction tools for teaching media literacy. Section 3.2 reports a survey of existing tools analyzed against these criteria with the aid of a focus group. Against the backdrop of the survey results, Sect. 4 is devoted to discussing the limitations of currently available tools. Section 5 summarizes the theoretical and empirical contributions of the study, offering recommendations to inform the design of a new generation of human–computer interaction tools to teach critical thinking for media literacy.

By Elena Musi, Lorenzo Federico and Gianni Riotta