Exploring the Information Cues of Danmaku Comments to Stimulate Users' Affective Generation in Reaction Videos

Published: 22 October 2023

Abstract

Reaction videos, a new form of online video in which users record their instant reactions to a particular piece of content, have emerged on social media in recent years. Their distinctive content composition and their hedonic and emotional character make the information cues that influence affective generation in danmaku comments quite different from those in traditional videos. To explore which information cues in reaction videos stimulate users' affective generation through danmaku comments, we conducted thematic coding with the content analysis method, sampling the danmaku comments, video content, and reactors' responses from 11 popular videos across different categories to identify the information cues that influence user affect in danmaku comments. The preliminary findings show three main types of information cues in reaction videos: the content of the original video, the reactors' reactions, and the danmaku comments themselves, which trigger danmaku users' affect from the perspectives of orientation type, parasocial interaction, and peer influence, respectively.
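To make the coding step concrete, the sketch below (not drawn from the paper) shows how such a scheme could be applied: two hypothetical coders independently assign each sampled danmaku comment to one of the three cue categories, and their agreement is checked with Cohen's kappa, a standard reliability measure in content analysis. The category labels, sample codes, and helper function are illustrative assumptions, not the authors' instrument.

```python
# A minimal sketch (not from the paper), assuming two hypothetical coders who
# each assign danmaku comments to one of the three cue categories the study
# identifies; inter-coder agreement is checked with Cohen's kappa.
from collections import Counter

CUE_CATEGORIES = ["original_video_content", "reactor_reaction", "danmaku_comments"]

# Hypothetical, independently assigned codes for the same five comments.
coder_a = ["reactor_reaction", "danmaku_comments", "original_video_content",
           "reactor_reaction", "original_video_content"]
coder_b = ["reactor_reaction", "danmaku_comments", "original_video_content",
           "danmaku_comments", "original_video_content"]

def cohens_kappa(codes_a, codes_b):
    """Chance-corrected agreement between two nominal codings."""
    n = len(codes_a)
    observed = sum(a == b for a, b in zip(codes_a, codes_b)) / n
    freq_a, freq_b = Counter(codes_a), Counter(codes_b)
    expected = sum(freq_a[c] * freq_b[c] for c in CUE_CATEGORIES) / (n * n)
    return (observed - expected) / (1 - expected)

print(f"Cohen's kappa = {cohens_kappa(coder_a, coder_b):.2f}")  # ~0.71 for this toy data
```

In practice, a kappa around 0.7 or higher is often taken as acceptable agreement before a coding scheme is applied to the full sample.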


Published In

Proceedings of the Association for Information Science and Technology, Volume 60, Issue 1
October 2023
1234 pages
EISSN: 2373-9231
DOI: 10.1002/pra2.v60.1

Publisher

John Wiley & Sons, Inc.

United States

Author Tags

  1. Reaction video
  2. Danmaku
  3. Information cues
  4. Affective generation
  5. Content analysis

Qualifiers

  • Research-article
