Human-Media Interaction is devoted to the investigation, design and evaluation of novel forms of human-media interaction. Accordingly, we welcome submissions that examine and explore new interaction paradigms, that report new sensor and actuator technology, and that further knowledge about human-computer interaction, human-human interaction, and human behavior more generally. In particular, we encourage research that pursues insight into how and why people use interactive media, and that recognizes such insight as the basis for designing interactive systems that are more socially capable, safe, effective and fun. We want to foster studies of the sensor technologies that enable behavior sensing, from position and proximity sensing to vision, speech, touch and situation recognition technology for observing a participant’s body movements, gestures, facial expressions, eye movements, and speech and language use. In addition, we will be a space for research on smart environments and how they sense and interpret multimodal information to proactively change their own properties and facilitate interactive participation.
Moreover, we are receptive to articles that approach interaction modelling not solely from the perspectives of computation and artificial intelligence, but also from that of social psychology. We therefore accept research on empathy, humor and cognitive modelling, and research that recognizes the importance of knowledge about particular, context-specific applications of interactive technologies. We likewise welcome investigations into how interactive sensors will be embedded in our physical and virtually augmented physical environments.
This section also welcomes papers that focus on modelling human interaction behavior with smart environments. Ideal contributions will report support for such interaction that is relevant, entertaining or efficient, depending on the application; will exploit novel interaction technologies and display facilities (including virtual humans and social robots) in user interfaces; and will bring together research communities working on interface design for mobile and ubiquitous applications, user modelling, sensor and actuator technologies, tangible interfaces, smart materials, and interfaces that recognize and interpret social interaction cues and human behavior patterns and can generate appropriate feedback and support, in particular by judging when this feedback should come from artificial (embodied) agents or from social robots. Papers that address related issues such as corpus collection and annotation, corpus data analysis, reasoning, knowledge representation, machine learning techniques for user modelling, usability, (user-centered) design, engagement, experience, and evaluation are also welcome.
Indexed in: Google Scholar, CrossRef
Human-Media Interaction welcomes submissions of the following article types: Book Review, Code, Correction, Data Report, Editorial, General Commentary, Hypothesis & Theory, Methods, Mini Review, Opinion, Original Research, Perspective, Protocols, Review and Technology Report.
All manuscripts must be submitted directly to the section Human-Media Interaction, where they are peer-reviewed by the Associate and Review Editors of the specialty section.
Articles published in the section Human-Media Interaction will benefit from the Frontiers impact and tiering system after online publication. Authors of the published original research judged by readers to have the highest impact will be invited by the Chief Editor to write a Frontiers Focused Review, a tier-climbing article; this process is referred to as "democratic tiering". Author selection is based on the article impact analytics of original research published in all Frontiers specialty journals and sections. Focused Reviews are centered on the original discovery, place it into a broader context, and aim to address the wider community across all of Psychology, ICT and the Digital Humanities.