Prevailing ‘Negative Communities’ Online: Algorithmic Influence and User Self-awareness

Abstract: In recent years, the proliferation of short videos imbued with negative sentiment on TikTok has elicited widespread societal attention. This paper investigates the dissemination of negative emotional content on the social media platform TikTok. Specifically, it analyzes user data from several short video platforms, including TikTok, and finds that users aged 18-35 constitute the primary audience for short video content. Against this backdrop of propagating negative emotions, this paper proposes the research hypothesis of a "negative community" and conducts an in-depth exploration of the following questions: 1) Is the algorithm of TikTok complicit in the spread of negative sentiment? 2) Has this dissemination cultivated unique cultural communication attributes, coalescing into a community-based audience? 3) In the formation and proliferation of negative sentiment, which plays the more significant role: the algorithm or the user?


Introduction
2017 is widely regarded as the breakout year for short videos around the world. Short videos make use of people's fragmented time and are more engaging and interactive, with low cost and high traffic [1]. Through short videos, users are exposed to more of the world than they would usually have access to [2]. When artificial intelligence algorithms empower short videos, videos on specific topics are pushed more precisely to users' accounts. However, with the popularity of short videos, videos carrying negative energy have been appearing in public view on the TikTok (Douyin) platform, with content tagged "emo" and "sunny depression" accumulating over a million likes. This negative sentiment has spread throughout society. Generally speaking, users aged 18-35 are the main viewers of short video content; the youth population has become the primary audience of short videos. Some scholars have shown that negative online emotional experiences partially mediate the influence of boredom tendencies on college students' online deviant behavior. Jalonen also argues that social media has become an arena for venting negative emotions, and it has been suggested that the large volume of negative emotion on social media needs to be managed properly [3,4]. Previous studies provide a good picture of the fact that negative emotions spread online and must be properly managed. However, how online negativity is created, whether "negative communities" exist, and how they should be managed remain unresolved. This leads to the following motivations for this paper: 1) to examine whether the algorithm of TikTok is an accomplice to the spread of negative sentiment; 2) to explore whether this spread has developed unique cultural communication properties that have coalesced into a community-based audience, i.e., whether "negative communities" exist; 3) to investigate, in the process of forming and spreading these cultural communication properties, who bears more responsibility: the algorithm or the user.
In this paper, the above issues are studied through literature review, questionnaire, and interview methods, and conclusions are drawn. This paper finds that users spread emotions on TikTok through comments and other means, often without fully recognizing the impact on others. Therefore, the solution is based on improving users' self-awareness, correcting their passive motivation, and educating them about such issues.

"Negative Community" and "Negative Emotions"
The first concept that requires elucidation is the "negative community": a hypothetical academic term used within this context to refer to a particular interest group on TikTok whose members are often exposed, actively or passively, to negative content such as videos, copywriting, or other negative emotional expressions. Members of this community frequently express or receive negative emotions (including sadness, despair, resignation, melancholy, anxiety, panic, and anger) through their posts and comments, or interact in other ways that directly or indirectly contribute to the propagation of negativity, such as liking or sharing such content. The concept of a "negative community" is not widely accepted or recognized in academic theory, and no specific scholar or institution has proposed it. It is more of an abstract descriptive term used to characterize a possibly existing community or online space marked by negative behavior and atmosphere.
In current academic circles, concentrated discussions on concepts similar to this "negative emotional community" can be distinguished by four characteristics: 1) a focus on psychologically vulnerable groups, such as the spread and reception of negative emotions among teenagers on social media; for example, Patchin and Hinduja investigated "digital self-harm," where teenagers post self-harming content about themselves on social media [5]; 2) investigation of malicious behaviors, hate speech, cyberbullying, and the spread of false information in social network communities, which involve criminal-level behavior patterns [6,7]; 3) research on social media user behavior connected to the onset and amplification of depression and other mental health issues [8-10]; 4) studies on system algorithms and improvements in the underlying system related to emotion, such as the accuracy of different data mining and machine learning classifiers in detecting emotional data, as illustrated in the latest related work [11]. The research hypothesis here adopts a perspective that diverges from these conventional research directions, offering a comprehensive and quotidian viewpoint. This paper proposes to extend the phenomenon of emotional submersion in the "negative community" hypothesis beyond depression to various everyday negative emotions, such as anger triggered by quarrels or general low spirits. These negative emotions may not be apparent, and they may have no direct connection with depression-related fields; however, they substantively influence every aspect of our daily lives. Moreover, the concept of a "negative community" is linked with the attributes of cultural dissemination, hypothesizing that the content rendered and precipitated in the process of negative emotional transmission can create elements with specific cultural identities. Such cultural markers cause reciprocal influences between user behavior and algorithmic mechanisms within the chain of causality, leading to a sustained overflow of emotional impact. To the best of our knowledge, this research framework combining emotion with the cultural transmission of communities is novel. This paper hopes this research method can provide a fresh perspective and new pathways of thought for further studies.

Survey Questionnaire
To achieve the research objectives, this paper crafted a survey questionnaire whose items were divided into three sections. First, the demographic section includes variables such as age and frequency of TikTok usage. Second, the negative emotional impact section adopted a hierarchical item design: preliminary items directly asked users about their perceptions of TikTok's influence, followed by hypothetical scenarios, such as breakups or disputes with family members, prompting respondents to consider and report its impact in real-life situations. This design was based on two rationales: on the one hand, this paper aimed to gain as objective an understanding of the genuine impacts as possible; on the other hand, this paper hoped that such a hierarchical design could indirectly reveal users' self-awareness of the impact. The third part of the survey focused on the question of responsibility between algorithms and users, for which this paper designed a series of topics involving complex intersecting causal relationships; for instance, whether users would share content with negative emotions in their online social circles, or whether they would attempt to modify TikTok's content recommendation mechanism. The purpose of this part was to gain a deeper understanding of the dynamic interaction between algorithms and user behavior, thereby elucidating their roles and responsibilities in forming the "negative community."

Tables Description and Results Analysis
The data from Table 1 highlight two key points: 1) users may fail to fully recognize the potential impact of TikTok on their emotions and lifestyles; 2) TikTok may only partially serve its intended purpose as a tool for entertainment and relaxation, and it may have significant negative emotional impacts. Users' views on TikTok are complex and contradictory. On the one hand, they view TikTok as a tool for entertainment and relaxation and do not believe it influences their daily emotions or mental state. On the other hand, when placed in specific life scenarios, they do perceive TikTok's impact on their emotions. Therefore, most users may prioritize the immediate entertainment and relaxation TikTok brings while overlooking its role in shaping their leisure habits, life rhythm, and emotional regulation. This may indicate low self-monitoring among users or the covert nature of emotional transmission on the TikTok platform. Hence, the design of such platforms and algorithms must enhance user notifications, emotional monitoring, and education. TikTok may significantly affect users' emotions in ways that are, from the user's perspective, negative and difficult to perceive. The next key point is to investigate who is responsible for these negative effects.
Table 1: The impact of negative emotions (from the user perspective).

Category | Respondent rate
Respondents who do not often see negative content on TikTok | 47.08%
Respondents who regard TikTok solely as an entertainment platform (no negative impact on mood) | 55%
Respondents who think that stopping the use of TikTok would improve their mood and living state | 69.17%
Respondents who think that using TikTok during a breakup worsens their mood | 82.92%
Respondents who think that using TikTok during an argument with family members worsens their mood | 52.08%
Respondents who actively disseminate content with negative emotions to their social circles | 56.25%

In Table 2, cross-tabulation analysis indicates that users who would not actively seek to change the algorithmic recommendations constitute the group that encounters controversial topics most often (73.56%). This further underscores the insensitivity of the algorithms, suggesting that they persist in promoting antagonistic content and are unlikely to change their recommendation logic without user intervention. Moreover, these users are the most likely to actively disseminate content with negative emotions to their social circles (48 users, 55.17%, the highest proportion in the "I will share" option), potentially leading to an overflow of emotional influence. Could the algorithms disrupt the existing recommendation cycle at certain frequencies to avoid forming trends of emotional propagation? On the other hand, this group generally perceives TikTok as a mere leisure tool unlikely to affect their mood (28 users, 32.18%, the highest proportion in the "TikTok is only for leisure" option). However, they are inclined to express emotions under emotionally upsetting videos (37 users, 42.53%, the highest proportion in the "I will express emotions" option). Driven by algorithms, users interact with negative emotional content and share it in their social circles.
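As a consistency check on the cross-tabulation above, the reported counts and percentages imply a subgroup of about 87 respondents (e.g., 48/87 ≈ 55.17%). The following minimal Python sketch verifies this; note that the subgroup size of 87 and the count of 64 for the controversial-topics option are inferred from the reported figures, not stated explicitly in the text:

```python
# Consistency check for the Table 2 cross-tabulation percentages.
# GROUP_SIZE (87) and the count 64 are inferred from the reported
# counts/percentages; they are not stated explicitly in the paper.
GROUP_SIZE = 87

reported = {
    "encounters controversial topics": (64, 73.56),  # count inferred
    "I will share": (48, 55.17),
    "TikTok is only for leisure": (28, 32.18),
    "I will express emotions": (37, 42.53),
}

for option, (count, pct) in reported.items():
    computed = round(count / GROUP_SIZE * 100, 2)
    print(f"{option}: {count}/{GROUP_SIZE} = {computed}% (reported {pct}%)")
    assert abs(computed - pct) < 0.01  # matches the reported percentage
```

Each computed share matches the reported percentage to two decimal places, which supports reading all four figures as proportions of the same 87-respondent subgroup.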
Nonetheless, they may be unable to evaluate the emotions involved objectively. This further highlights users' lower self-monitoring and potentially lower self-awareness: motivations are relatively passive, and users may treat emotional venting as relaxation without clearly recognizing the potential negative emotional impacts, so the probability of self-awareness is relatively low. Therefore, this paper argues that both the algorithm and users bear a certain proportion of responsibility for propagating negative emotions. User behavior and algorithmic mechanisms complement each other, enhancing the efficiency of negative emotional transmission.

Interview Questions
1. Do you often scroll through videos of negative emotions on TikTok?
2. If you encounter a video on TikTok that resonates with you and carries negative emotions, will you post a comment in the comments section to express your opinion, or like a comment?
3. When you post a negative sentiment in the comments section or share a negative comment from a video with your friends, do you realize that other TikTok users or your friends are also affected by this negative sentiment? Or do you think it is a subconscious act that you do not care much about?
4. When you do the above, do you want people to deeply agree with you, or do you want people to feel your emotions at this moment and care about you? Or do you want to make other TikTok users and friends feel this emotion?
5. Do you think the frequent negative videos on TikTok are caused by the algorithm, or are there more subjective reasons?
6. How do you view the increasing number of negative emotion videos on TikTok? Do they help you vent and resonate with your inner emotions, or do they deepen your negative emotions?

Analysis
This paper received responses from three users, as shown in Tables 3, 4, and 5.

Table 4: Interviewee 2.
1. Constantly scrolls through negative videos on TikTok.
2. Says the things they feel.
3. Knows it may affect other people's emotions, but does not think about it much in the moment.
4. Wants to vent and may want to attract some attention.
5. Often clicks "like" and keeps scrolling; thinks it may be their own problem.
6. It deepens their negative feelings.

Table 5: Interviewee 3.
1. Occasionally scrolls through negative videos on TikTok.
2. Likes a comment.
3. Just shares their feelings without thinking too much or trying to influence others.
4. Wants other users to feel their mood at this moment; feels the video speaks for them.
5. Thinks there are problems, and that they themselves may be the main reason.
6. The video helped them say what they wanted to say; it is easy to empathize with.

This paper interviewed the three TikTok users in turn through the interview questionnaire and found that all three had watched videos of negative emotions on the TikTok platform and would resonate with certain videos, liking them or leaving comments in the comment section. When asked whether the algorithm or their own subjective reasons were more responsible for frequently encountering videos with negative emotions on TikTok, two users thought their own problems were a large part of the reason. Interestingly, when these three users posted negative comments in the comment section or shared a negative comment from a video with friends, all indicated that they did not worry much about it: it was subconscious behavior, simply emotional catharsis, yet they also hoped their behavior could attract other users' care, attention, and resonance. From these answers, this paper concludes that most TikTok users are passive in their lack of self-awareness and do not consider the impact on others in the moment. This lack is a form of neglect, not an informed, willful action; as a result, they spread emotions through comments and other means without fully recognizing the impact. Analogously, prior work has observed that while consumers and researchers embrace a broadening conception of gender, the online marketplace seems to work to reinscribe stereotypical notions of gender, a paradox that raises many important research issues [12]. Accordingly, the solution lies in improving users' self-knowledge, correcting their passive motivation, and providing education on such issues.

Conclusion
The popularity of short videos has changed the reading habits of many people, and short videos have become an indispensable channel of information communication. Short videos carrying negative emotions are spreading at an unimaginable speed. With the gradual increase of users on the TikTok platform, almost every user has watched negative emotion videos, and the hidden negative emotions they carry have begun to spread around us. Through interview and questionnaire research, this paper finds that "negative emotional communities" centered on negative emotions have formed on the TikTok platform, and their number cannot be underestimated. Users' lack of self-knowledge further drives the spread of negative emotional videos. The root cause is that the formation of a "negative emotional community" is not only the result of the TikTok algorithm's intelligent calculation but also a product of users' lack of self-knowledge. Based on this research, this paper argues that TikTok users should consciously improve their self-knowledge, break out of the "information cocoon" the algorithm creates, and avoid passively becoming disseminators of negative emotions, so that they are not controlled by algorithms in daily life. Platforms also bear some responsibility. Education or alert mechanisms should be introduced on the basis of detailed research: for example, internal data can be used to extract larger user samples for analysis, and mechanisms can be piloted and tuned on internal samples until they reach the most effective state. Active intervention is reasonable within the legal framework when users' motivation is passive. An education and alert mechanism should prevent the spread of negative emotions through soft design while preserving users' enthusiasm for commenting; flexible interventions that do not harm activity levels can take care of users' psychological state, in line with national "clean internet" initiatives.

Proceedings of the International Conference on Global Politics and Socio-Humanities. DOI: 10.54254/2753-7048/21/20230105

Table 2: Algorithm and users' responsibility for the spread of negative emotion.