Has Snapchat Taken Things Too Far with the Snap Map?

With internet-facilitated crimes on the rise, the Snap Map increases the vulnerability of social media users, especially young people.

At times it’s hard to remember life without the constant beckoning of the internet, whether by computer or cellphone. Today we’re more “connected” than ever, cultivating personal relationships with the help of Facebook, professional networks with LinkedIn and our social networks with Instagram, Twitter, Snapchat, Facebook (again) and more. This couldn’t be truer for today’s youth, who are growing up in an increasingly tech-saturated world. It’s more normalized to share Instagram stories or Snapchat “snaps” than to pass handwritten notes in class. The world is literally at their fingertips, both the good and bad parts, and with unlimited access, it’s scary to think what they can access, or who is trying to access them.

The technological landscape is advancing, and internet-facilitated crimes, like sextortion, are on the rise. Sextortion is a form of blackmail in which the perpetrator coerces the victim into providing sexually explicit material, often by threatening to expose images already in their possession. Who are the people doing the exploiting? According to the National Center for Missing and Exploited Children, the perpetrators can be anyone: “Some blackmailers are known to the victim, while others are people the sender has never met in real life, often someone he or she has befriended on a social network, gaming platform or other online space.”

A Look Behind the Filters


Today it’s nothing to take a selfie with the latest breakdancing hot dog filter on Snapchat and send it to your entire contact list, or make it public. Instant self-promotion seeks the same immediate gratification in return, in the form of likes, retweets, shares and direct messages. Technology is not synonymous with tangibility, but the upside of these applications is growing access to information, cultures and personal or professional connections that weren’t as accessible before. But what happens when those lines blur and others manipulate or exploit what’s accessed and shared online? Studies already link social media use to depression, a devastating mental health issue, but what about the potential physical risks of social media?

Social media platforms like Snapchat and Kik have come under scrutiny in the past for facilitating the sexual exploitation of minors, with Snapchat even making multiple appearances on the National Center on Sexual Exploitation’s (NCOSE) Dirty Dozen List. The list, released annually since 2013, calls out 12 organizations that are considered “the 12 leading contributors to the sexual exploitation of women.” As to why Snapchat landed on the list for 2017, the NCOSE explains, “Snapchat is frequently used for sexting and the sharing of self-produced child sexual abuse images. Additionally, the built-in feature Snapcash has enabled Snapchat and its users to monetize and profit from the exchange of pornography. Snapchat’s business model facilitates sexual exploitation, yielding hefty profits for the company without any regard for the associated harms.” Snapchat has drawn suspicion since its inception for its automatic-delete feature, which can minimize risk for those who use it to send explicit content. Evidently, Snapchat is no stranger to scandal and scrutiny, and it shows little sign of changing course.

Put a Pin in It


Despite receiving heat for its Snapcash payment mechanism, Snapchat launched another questionable feature on its application: the Snap Map.

“Want to know what your friend is doing at the restaurant by your house or how their work event is going? Tap their Bitmoji on the map and you can chat with them directly.” This tip, found in a recent Refinery29 article, provides a tutorial on how to use Snapchat’s seemingly innocuous new feature. This location feature, if enabled by the user, allows for real-time sharing of locations with friends and/or the public on Snapchat. Users have to opt in to this feature, but according to Snapchat’s support team, “Once you’ve started using the Snap Map, you won’t be able to disable it altogether — but you can always hide your location on it by going into Ghost Mode!” Although Ghost Mode sounds like the safest bet, there’s a catch: snaps uploaded to the Our Story feed will still show up in the Snap Map, even if the user is in Ghost Mode, a warning Snapchat itself issued.

A Perfect Storm of Tech Abuse and Vulnerable Youth


Snapchat’s disappearing messages, direct payment capabilities and now real-time location sharing create an opportunistic environment for predators. Police, parents and even some teens agree that the Snap Map can spell trouble for young users, arguably the most vulnerable of Snapchat’s 71 million daily active users (DAU) in the United States alone. In addition to privacy and stalking concerns, increasing evidence links technology to sexual exploitation. Young people seeking connection and expecting instant gratification can be easy targets, especially those who come from unstable environments to begin with. The vast majority of young people reported missing to the National Center for Missing and Exploited Children have been involved in the child welfare system, and roughly one in six are believed to be victims of child sex trafficking, a form of sexual exploitation. Indeed, Thorn, another agency fighting the commercial sexual exploitation of children, is finding that growing numbers of youth make initial contact with their trafficker online rather than in person.

Searching for Solutions

How can social media users, especially minors, navigate the virtual world safely when bad actors aren’t easily distinguishable? What responsibility do social media outlets bear for protecting their users from harm? As platform audiences grow younger and the technology more sophisticated, these answers will only become more important. Until then, it seems the safest approach is Caveat “Snapper,” or “Let the Snapper beware.”


