If you come across a user who appears to be at risk of self-harm or is expressing suicidal thoughts, it’s completely natural to feel concerned—especially if you’ve interacted with them before. Reaching out or flagging this kind of content is an important and responsible step.

The most helpful thing you can do is use the report function in the app. This ensures the situation is brought to the attention of our moderation team, who are trained to respond with care and urgency. When posts are flagged under suicide or self-harm concerns, our team follows a dedicated safety protocol. This may include reaching out to the user, directing them toward appropriate local crisis resources, and offering supportive guidance. In some cases, content may also be removed to reduce potential harm while the user is being supported.

Please know that all reports are taken seriously, and action is taken where appropriate. However, to protect user privacy, we aren’t able to share specific updates about another member’s situation.

While it can be tempting to step in directly, it’s important to remember that you’re not expected to handle this on your own. Reporting allows the right support systems to take over.

It’s also important to take care of yourself. Supporting someone in distress, or repeatedly encountering distressing content, can feel heavy and overwhelming. If you find that it’s affecting your wellbeing, it’s okay to step back, mute, or block the user to create some space.

Looking out for others is incredibly valuable—but making sure you feel safe and supported matters just as much.