SPSM chats about trends related to a person’s ability, or “right,” to talk about suicide on social media platforms, 10/29/17, 9pCT. While many people perceive social media as highly “free,” “open,” and “unregulated,” multiple technology trends will increasingly shape social media “speech” about suicide. Join our chat to find out more, or drop in your two cents on the matter:
- Suicide can be treated similarly to topics such as violent hate speech or pornography when it comes to social media platform moderation. In fact, many platform moderation policies and guidelines lump these types of content together or treat them similarly.
- Algorithms that shape the moderation or display of social media content often reflect stigma and prejudice from the “real world.” This likely affects how content about suicide is displayed and shared.
- Much like other types of “moderated” content, conversations about suicide or suicidality could be increasingly driven “underground” to the “dark web.” This may mean migration away from the common social media platforms most familiar to mental health professionals, crisis centers, and health care policy makers, which could slow or reduce the reach and effectiveness of social media-centric strategies in suicide prevention.
- And what about suicide content with poor or outright violent messaging that most of us would find alarming? (This link displays a non-guideline-compliant image presenting facts about suicide paired with violent messaging toward the LGBTQ community.)
- And don’t even get us started on Net Neutrality. Based on community feedback, we’ll hold a whole SPSM chat about it right before a congressional vote on the matter, to help you be an informed suicide prevention stakeholder and citizen.
Watch us LIVE here: