
Tech Tuesdays - Blac Chyna & Rob Kardashian Beyond the Tabloid Scandal


The Blac Chyna and Rob Kardashian scandal may be just a PR circus to some, but no matter what opinion you hold of these individuals, a more important topic is being discussed. Revenge porn has long been a fear for many women, and laws are now being adapted to protect individuals from these tech crimes.

"It’s Like Having an Incurable Disease"

For years, Kara Jefts lived with a terrible secret. When she met a guy, she wouldn’t reveal her last name until they had been on four or five dates. When she began a new job, she would immediately befriend the IT expert who could help her block hostile emails. When she spoke with a new boss, she would force an awkward conversation about her romantic history. Her secret was so terrible because it wasn’t a secret at all: for the past five years, nude photos of Jefts have been only one email, Facebook post, or Google search away.

Jefts is a thoughtful academic in her mid-30s, an archivist and art historian at a Chicago university who never intended for images of her naked body to circulate on the internet. But in 2011, soon after Jefts ended her long-distance relationship with a boyfriend who lived in Italy, explicit screenshots from their Skype conversations began to appear online. They were emailed to her family and friends, posted on Facebook with violent threats against her, and even appeared on websites devoted to exposing people’s sexually transmitted diseases, with false allegations about her sexual history.

There’s a name for what Jefts has experienced, a digital sex crime that has upended thousands of lives but still mostly eludes law enforcement: nonconsensual porn, more commonly known as revenge porn. The distinction is one of motive, not effect: revenge porn is meant to harass the victim, while nonconsensual porn covers any image circulated without the subject’s agreement. Both can result in public degradation, social isolation, and professional humiliation for the victims.

Enabled by the technological and cultural upheaval that put a camera in every pocket and created a global audience for every social media post, nonconsensual porn has become increasingly common. Practically every day brings reports of a new case: A 19-year-old woman in Texas blackmailed into having sex with three other teens after a former partner threatened to release an explicit video of her. A 20-something in Pennsylvania had strange men coming to her door after an ex-boyfriend posted her pictures and address with an invitation to “come hook up.” An Illinois school superintendent in her 50s was fired after her ex-husband allegedly sent an explicit video of her to the school board.

Some of these private photos and videos find their way to porn sites, where “revenge” is its own genre. More often, they’re posted on social media, where all the victim’s friends can see them. Facebook received more than 51,000 reports of revenge porn in January 2017 alone, according to documents obtained by The Guardian, which led the site to disable more than 14,000 accounts. A 2016 survey of 3,000 internet users by the research institute Data & Society found that roughly 1 in 25 Americans have either had someone post an image without permission or had someone threaten to do so; for women under 30, that figure rose to 1 in 10. And a June Facebook survey by the anti-revenge porn advocacy group Cyber Civil Rights Initiative found that 1 in 20 social media users have posted a sexually graphic image without consent.

The problem exploded into public view earlier this year, when hundreds of active duty and veteran Marines were found to be circulating explicit images of current and former women service members. The images were posted in a secret Facebook group, passed around the way their grandfathers might have traded copies of Playboy. Roughly two dozen service members have been investigated since the scandal broke in January, leading the Marines to formally ban nonconsensual porn in April. In May, the House unanimously voted to make nonconsensual porn a military crime subject to court-martial.

In some cases, the perpetrators are hackers who target famous women, searching for compromising photos to leak. Last year, Saturday Night Live star Leslie Jones was hacked and her nude pictures were spread online. In 2014, nude photos of Jennifer Lawrence and other female celebrities were hacked and leaked in one of the biggest nonconsensual porn cases to date. It's a problem nearly everywhere in the world: in May, nude photos purportedly of Rwandan presidential candidate Diane Shima Rwigara appeared online days after she announced her intention to challenge the nation’s longtime leader, Paul Kagame.

This type of harassment shows how sexual violation can now be digital as well as physical. And its rapid spread has left law enforcement, tech companies and officials scrambling to catch up. When evidence lives in the cloud and many laws are stuck in the pre-smartphone era, nonconsensual porn presents a legal nightmare: it’s easy to disseminate and nearly impossible to punish.

Advocates are trying to change that, in part by pushing a Congressional bill that would make nonconsensual porn a federal crime. But there are obstacles at every turn, from the technological challenge of fully removing anything from the internet, to the attitude of law enforcement, to very real concerns over legislation that could restrict free speech. In the meantime, victims live in fear of becoming a 21st century version of Hester Prynne. “I have to accept at this point that it’s going to continue to follow me,” Jefts says. “It’s kind of like having an incurable disease.”

Why Would Anyone Share a Nude Photo?

Jefts never thought of herself as the kind of person who would send nude photos. She is circumspect and professional, and acutely aware of the power of images. But then she met a man who lived an ocean away, and quickly fell in love. Skype was critical to keeping the relationship alive, and the pair often sent each other photos and videochatted in ways that sometimes became sexual. “If it’s World War II and your husband leaves, you send letters and pictures, you have this correspondence that helps maintain that emotional connection,” she explains. “It’s more instantaneous [today] because of the technology, but the origin of it is the same.”

While some nonconsensual porn comes from pictures that are hacked or taken surreptitiously, in many cases the images were flirtatiously traded between partners as sexts. According to a 2016 study of nearly 6,000 adults by researchers at Indiana University, 16% of respondents had sent a sexual photo, and more than one in five had received one. Of those who received nude photos, 23% reported sharing them with others, and men were twice as likely as women to do so.

Boomers might be baffled by this practice, but for many under 30 sexting isn’t seen as particularly transgressive. “It’s embedded in modern relationships in a way that makes us feel safe,” says Sherry Turkle, a professor of the social studies of science and technology at MIT. “This is a question that doesn’t need an answer if you grew up with a phone in your hand.”

According to Turkle, many digital natives are so comfortable on the internet that they imagine that there are rules about what can and can’t happen to the content they share. “If you feel the internet is safe, you want to share everything, because it’ll make you feel closer and it’s a new tool,” she says. “People made up a contract in their minds about the online spaces they’re in.”

Women sometimes circulate male nudes, but studies show the vast majority of nonconsensual images are photos of women spread by men. When accused, some men say they were hacked and the photos must be coming from another source. Others admit that they posted the photos out of anger, lashing out over a perceived slight. One Louisiana tattoo artist told police he posted a sex tape of his ex on a porn site as retribution after she damaged his car. A Minnesota man reportedly admitted he posted explicit images of his ex-wife on Facebook because he was jealous of her new boyfriend.

The dissemination of images can be as much about impressing other men as it is about humiliating the victim. Boys once presented stolen underwear as trophies from conquests — now, a nude selfie can signal the same thing. As a result, schools around the nation have dealt with what are often referred to as sexting rings. In 2014, more than 100 teens in a rural Virginia county were investigated for circulating more than 1,000 nude photos of mostly underage girls on Instagram. A Colorado District Attorney chose not to bring charges against teens who were circulating photos of high school and middle schoolers in 2015. Similar incidents have popped up recently in schools in Ohio, New York and Connecticut. The practice has become common enough that the American Academy of Pediatrics developed a guide for parents on talking to children about sexting.

“Lots of this isn’t intentional,” says Erica Johnstone, a San Francisco attorney with a practice dedicated to sexual privacy. “It’s just part of the hypermasculine culture: sex pictures become like currency.”

Why It's So Hard to Stop the Spread

On an otherwise ordinary day in 2011, Holly Jacobs decided to Google herself. When a porn site came up in her search results, Jacobs went into what she now describes as “a complete state of shock.”

“I could feel the blood rushing out of my head,” she says. “I was turning white as the page was buffering.” She would soon learn that her photos were posted on nearly 200 porn sites. A collage of nude images had been sent to her boss and co-workers. Explicit pictures of her were shared with her father on Facebook. She says she almost lost her job at a Florida college after someone online accused her of masturbating with students there, and she eventually stopped working as a statistical consultant because “every time I met with a client I wondered if they had seen me naked.”

“I never thought this kind of violation was happening to everyday people,” says Jacobs, who originally sent the photos to someone she knew and trusted. “I didn’t realize there was a market for naked photos of people nobody knows.”

Jacobs says she was diagnosed with depression and PTSD, and became afraid to meet new people for fear that they would find the photos. “It was a living nightmare,” she says. “I kept being rejected by police, the attorneys, the FBI because they kept saying there was nothing they could do.”

Now in her 30s, Jacobs ended up legally changing her name to escape her online footprint. But she also decided to fight back. She started the Cyber Civil Rights Initiative (CCRI), a nonprofit devoted to helping victims of nonconsensual porn reclaim their identities. Since the group launched its helpline in 2014, more than 5,000 victims have called CCRI, Jacobs says, adding that it now gets between 150 and 200 calls a month.

“I’m a good person and I didn’t do anything wrong,” she says. “There’s nothing wrong with sharing nude images with someone I trust, so something needs to be done about this.”

Many victims think the moment they see their nude photos online is the worst part of their ordeal. Then they start having awkward conversations with bosses, fielding relatives’ questions about obscene social media posts, and getting strange looks from co-workers. It becomes impossible to know who has seen your photos, and what they think of you if they have. And when these victims start trying to get the pictures taken down, they realize something even worse: this type of cyber crime can leave a lasting digital stain, one that is nearly impossible to fully erase.

“Once the images and videos have been exposed or published, the internet is permanent,” says Reg Harnish, the CEO of cyber-risk assessment firm GreyCastle Security, who worked with Kara Jefts to successfully remove most of her photos. But even if you get an image scrubbed from one site, there’s no way to guarantee it hasn’t been copied, screenshotted, or stored on a cache somewhere. “There are literally hundreds of things working against an individual working to remove a specific piece of content from the internet,” he says. “It’s almost impossible.”

When victims seek help from law enforcement, they rarely get an effective response. “This is a case they put at the bottom of the stack,” says Johnstone, who represents victims of revenge porn. “They think that the victim was asking for it because they created the content that got them into the situation. They think they’re not as deserving of police hours as someone who was the victim of a physical assault.”

Jefts says she filed six police reports in three different New York counties (where she was living at the time) and got several restraining orders against her ex, but legal remedies were futile. Police officers often didn’t know how to handle digital crimes, and even when they sympathized with her predicament, they said there was nothing they could do because her ex no longer lived in the same state or even the same country. The restraining orders had “zero impact,” she says, and the harassment continued until she sought help from tech experts like Harnish, who helped her get the photos taken down.

As a result of growing awareness and increased pressure from victims and advocates, the number of states with a law addressing revenge porn has jumped from 3 to 38 since 2013. But the statutes are inconsistent and riddled with blind spots, which make them particularly difficult to enforce.

“There are no state laws across the U.S. that fit perfectly together,” says Elisa D’Amico, a Miami lawyer and co-founder of the Cyber Civil Rights Legal Project. “It depends on where your victim is, where your perpetrator is, where someone was when they viewed pictures.”

One of the biggest inconsistencies among state laws is the way they treat motive. Some states criminalize nonconsensual porn only if there is “intent to harass,” a targeted campaign to debase and humiliate the victim, as with Jefts. But in many cases, like the Marine photo sharing scandal, the distribution of images isn’t intended to harass, because the victims were never supposed to know that their pictures had been shared. According to the CCRI’s June survey of 3,000 Facebook users, 79% of those who said they had spread a sexually explicit image of someone else said they did not intend to cause any harm.

To those who have had their most intimate moments exposed on social media, such thinking misses the point. “These were images that I took under the assumption that it was a consensual, private relationship,” says Jefts, who has devoted her career to studying the power and dissemination of images. “The context in which they were shared changed their meaning. That trumps their original intention.”

To address the legal patchwork, U.S. Rep. Jackie Speier is planning to reintroduce a bill this June to make nonconsensual pornography a federal crime, regardless of whether the suspect intended to harass the victim. “The intent of the perpetrator is irrelevant really,” says Speier, a Democrat whose district includes San Francisco. “Whether he’s doing it for jollies or money, it’s destroying another person’s life.” Facebook and Twitter have backed her bill, called the Intimate Privacy Protection Act, or IPPA, as has billionaire Trump supporter and internet privacy advocate Peter Thiel. It also has bipartisan support from seven Republican co-sponsors.

But Speier’s bill, which stalled in committee last year, has vocal critics who oppose enacting new criminal laws that target speech. The American Civil Liberties Union (ACLU) objects to the very portion of the bill embraced by victim advocates: the part that criminalizes nonconsensual porn regardless of intent. “The Supreme Court has correctly said again and again that when the government criminalizes speech, intent is a crucial component,” says Lee Rowland, a senior staff attorney for the ACLU’s speech, privacy and technology project. “We do not put somebody in jail in this country simply because their speech offends someone else.”

With the law enforcement response in flux, tech companies have begun to respond to growing pressure to help address the problem. Under the 1996 Communications Decency Act, platforms like Google and Facebook aren’t liable for the content they host, which means they can’t be held legally responsible for the nonconsensual porn on their networks. But in response to an outpouring of user requests, several major websites have developed new policies to help fight revenge porn. In 2015, streaming porn site Pornhub announced it would remove revenge porn from its site, and Google announced it would remove the images from its search results. Twitter and Reddit have also updated their rules to prohibit nonconsensual porn. In April, Facebook unveiled a tool that enables users to flag content they think is being shared without consent; company technicians then check if it’s appeared anywhere else on the network to prevent it from spreading further. But this kind of response from tech companies requires significant manpower, since nonconsensual porn is difficult to identify. Unlike child pornography, which can often be spotted on sight, an image posted without consent doesn't necessarily look different than one posted willingly.
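How does a platform spot a re-upload of an image it has already removed? One common approach, and the general idea behind photo-matching tools like the one described above, is perceptual hashing: reduce each image to a short fingerprint that changes very little when the picture is resized, recompressed, or lightly edited, then compare fingerprints. The toy Python sketch below shows a minimal "average hash" comparison; the function names, the 8x8 pixel grid, and the match threshold are illustrative assumptions, not a description of Facebook's actual system.

# Illustrative sketch only: a toy "average hash" comparison, similar in spirit
# to the perceptual-hashing approach platforms reportedly use to match
# re-uploads of a known image. Names and thresholds here are hypothetical.

def average_hash(pixels):
    """Compute a 64-bit average hash from an 8x8 grayscale pixel grid."""
    flat = [p for row in pixels for p in row]
    avg = sum(flat) / len(flat)
    # Each bit records whether a pixel is brighter than the average.
    return sum(1 << i for i, p in enumerate(flat) if p > avg)

def hamming_distance(h1, h2):
    """Count the bits that differ between two hashes."""
    return bin(h1 ^ h2).count("1")

def looks_like_known_image(candidate_hash, known_hashes, threshold=10):
    """Flag a candidate if its hash is close to any previously reported image."""
    return any(hamming_distance(candidate_hash, h) <= threshold for h in known_hashes)

# Toy usage: a slightly brightened copy of a known "image" should still match.
original = [[(r * 8 + c) % 256 for c in range(8)] for r in range(8)]
reupload = [[min(255, v + 3) for v in row] for row in original]
known = {average_hash(original)}
print(looks_like_known_image(average_hash(reupload), known))  # True

In practice, a fingerprint match only says two images look alike, not whether either was shared with consent, which is why the flagged content still goes to human reviewers, as the article notes.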

No matter what steps Congress and tech companies take, nonconsensual porn remains a problem without easy solutions. And as lawyers sue and lawmakers debate, millions of pictures are still out there circulating, multiplying, waiting to ruin a life.

- time.com
