Like others, I made my own assumptions about this feature and, as it turns out, it’s something other than I’d first presumed. In fact, it mirrors the dynamics of real-life friendships.
Snapchat’s approach is rooted in human-centered design. In real life, conversations between and among friends aren’t saved, transcribed or recorded in perpetuity. Most of us are more at ease and can be our most authentic selves when we know we won’t be judged for every word we say or every piece of content we create.
One misperception I’ve heard is that Snapchat’s delete-by-default approach makes it impossible to access evidence of illegal behavior for criminal investigations. This is incorrect. Snap has the ability to, and does, preserve content existing in an account when law enforcement sends us a lawful preservation request. For more information about how Snaps and Chats are deleted, see this article.
A natural concern for any parent when it comes to online interactions is how strangers might find their teens. Again, Snapchat is designed for communications between and among real friends; it doesn’t facilitate connections with unfamiliar people the way some social media platforms do. Because the app was built for communicating with people we already know, by design, it’s difficult for strangers to find and contact specific individuals. In addition, Snap has added protections to make it even more difficult for strangers to find minors, like banning public profiles for those under 18. Snapchat allows minors to surface in friend-suggestion lists (Quick Add) or Search results only if they have friends in common with the other person.
Along those same lines, I’ve heard concerns about the Snap Map – a personalized map that allows Snapchatters to share their location with friends, and to find locally relevant places and events, like restaurants and shows.

By default, location-sharing on the Snap Map is set to private (Ghost Mode) for all Snapchatters. Snapchatters have the option of sharing their location, but they can do so only with people they’ve already accepted as friends – and they can make location-sharing decisions specific to each friend. It’s not an “all-or-nothing” approach to sharing one’s location. Another Snap Map plus for safety and privacy: if people haven’t used Snapchat for several hours, they’re no longer visible to their friends on the Map.

Most importantly from a safety perspective, there’s no way for Snapchatters to share their location on the Map with someone they’re not friends with, and they have full control over which friends they share their location with – or whether they share their location at all.

A newer tool we want parents and caregivers to be aware of is Friend Check-Up, which prompts Snapchatters to review their friend lists and confirm that those included are still people they want to be in contact with. Anyone you no longer want to communicate with can easily be removed.
Generally, people who are communicating on Snapchat have already accepted each other as friends.
Early on, the company made a deliberate decision to treat private communications between friends, and public content available to wider audiences, differently. In the more public parts of Snapchat, where material is likely to be seen by a larger audience, content is curated or pre-moderated to prevent potentially harmful material from “going viral.” Two parts of Snapchat fall into this category: Discover, which includes content from vetted media publishers and content creators, and Spotlight, where Snapchatters share their own entertaining content with the larger community.
On Spotlight, all content is first reviewed with automated tools and then undergoes an extra layer of human moderation before it becomes eligible to be seen by more than, currently, a couple dozen people. This helps ensure the content complies with Snapchat’s policies and guidelines, and helps catch risks that automated moderation may have missed. By seeking to control virality, Snap lessens the incentive to publicly post illegal or potentially harmful content, which, in turn, leads to significantly lower levels of exposure to hate speech, self-harm and violent extremist material, to name a few examples – as compared with other social media platforms.