Take Up Space Unapologetically: Tackling Online Abuse

Learning about the tools and ways we can manage our privacy online is incredibly important in the digital age. We should all be equipped with the knowledge to make informed decisions about our own digital footprint. There are a myriad of reasons why people choose to be more private than public on social media, and vice versa.

However, I’m growing wary of the general advice given by online safety institutions encouraging people to manage, control or lock down their privacy settings on social media in order to ‘protect’ themselves from forms of online abuse, particularly image-based abuse, which this piece will focus on.

I argue that such advice may be necessary in specific circumstances, but is problematic as a general course of action because it:

  1. cannot guarantee individuals protection from online abuse;
  2. may mitigate the risk of abuse but often fails to manage victims’ expectations;
  3. shifts responsibility away from perpetrators;
  4. disproportionately disenfranchises certain groups and individuals;
  5. is a short-term fix with long-term consequences;
  6. screams victim blaming under the guise of protection;
  7. is not conducive to creating an online world in which we are all safe and free to express ourselves, let alone exist, without being abused; and
  8. fails to actually address the underlying problem at hand.

At the fundamental level, there is no guarantee that anyone can completely protect themselves in the digital age from certain forms of online abuse, including image-based abuse.

Image-based abuse takes many forms: distributing, surreptitiously recording, or threatening to distribute or record, intimate images/videos without consent. It includes non-consensually sharing altered intimate images/videos. In the digital age of ‘upskirting’ and ‘downblousing’, people can be victimised without even knowing it. A person’s image can be taken from a LinkedIn profile picture, altered into pornography and shared online. The reality is that some forms of online abuse occur beyond our control, even if we follow the advice of controlling or locking down our privacy settings on social media.

The most compelling reason why it may be important, or in fact necessary, to advise people to control or lock down privacy settings on social media in order to protect themselves from image-based sexual abuse is that it may mitigate the risk of abuse occurring or continuing to occur, especially when victims may be in danger. There are two points to make here.

First, when some online safety institutions encourage people to control their social media settings, the advice is not accompanied by the explanation that doing so only mitigates the risk of online abuse; it does not guarantee protection from it.

Failing to qualify statements and calls to lock down your social media fails to adequately manage the expectations of victims and the public. What’s more concerning is that it gives victims and the public a false sense of security that they are protecting themselves if they follow such advice.

Second, there are horrific cases in which a victim is in danger or is living in fear of the perpetrator or perpetrators. Cases where the abuse is relentless, merciless and unforgiving. Cases where the victim’s safety is of paramount importance, and that means doing everything possible to try to keep the victim safe. As a survivor of image-based abuse, there were times in my journey when I deactivated social media because the emotional distress was overwhelming. In such cases it may be necessary to encourage victims to manage their social media settings, as sad and unfair as that is. However, I believe such advice should be reserved for specific circumstances rather than offered as a general course of action for the public.

Why? Because as a general course of action, even if it may mitigate the risk of online abuse, it places the onus, burden and responsibility squarely on everyone except the perpetrator. It places the responsibility on us to protect ourselves from online abuse, when the only people who should be changing their behaviour are the perpetrators who are committing the abuse.

Now, you may be thinking: obviously it’s the perpetrators who should be the ones changing their behaviour, but there are ‘bad’ people in this world who are going to commit these abuses anyway, so common sense would dictate that an appropriate course of action is to control or lock down our social media settings.

While I hear you and understand what you are saying, I would still argue that the defensive approach to managing, controlling or locking down your social media settings is not going to work long-term and is not conducive to creating an online world in which we are all safe and free to express ourselves, let alone exist, without being misappropriated or abused. I’ll explain why shortly.

For now, let’s examine who would be most affected by such general advice. We know that image-based abuse disproportionately affects certain groups in our society: young women, the LGBTQI community, people with disabilities, and others. So when calls are made for people to control their social media settings, it’s these groups who would be the most receptive to the advice, and therefore the most affected by it.

We know that social media is used as an economic opportunity for people to build personal brands or grow businesses, it’s used as a platform to engage in and contribute to social and political discourse, and it’s used to connect with friends and family. Sometimes, using social media is necessary for work and career progression.

Social media carries so many benefits, and by encouraging people to control or lock down their settings you are disproportionately locking certain people out of them, further disenfranchising certain groups and vulnerable individuals. It’s these groups who lose out the most from the cultural life of our times, leaving other demographics to dominate the social media landscape.

In the short term, generally encouraging or advising people to control or lock down their social media settings may mitigate the risk of abuse occurring, though there is still no guarantee. In the long term, the consequences of such advice can adversely impact the very people you are trying to protect: by shaping an online discourse that excludes the voices of certain groups and individuals, by socially isolating certain groups in our society, and by disempowering people and depriving them of economic opportunities, among other things.

I’d even go so far as to argue that encouraging people with general advice to manage, control or lock down their social media settings to protect themselves from online abuse is akin to telling people to lock themselves in their houses because the real world is full of dangers.

It’s well-meaning but it screams victim blaming under the guise of protection.

We see victim blaming all the time. It’s the kind of attitude that attacks and criticises the conduct of the victim, instead of the perpetrators of a crime. It’s the kind of attitude that shifts accountability and responsibility away from perpetrators and places it on the victim. It’s the sentiment that somehow the victim is at fault for the wrongdoings committed against them, or worse that the victim deserves the harm.

Victim blaming attitudes are rife in discussions of rape, image-based sexual abuse and family and domestic violence:

If she wasn’t wearing such revealing clothes she wouldn’t have been raped. If she didn’t send nude photos, he wouldn’t have uploaded them online. If she didn’t post “revealing” photos to social media, they wouldn’t be photoshopped into porn. If she was being abused at home she should’ve just left him.

Attitudes that shift responsibility away from perpetrators of crime are dangerous for many reasons, but I believe the most concerning is that they are not conducive to creating an online world, let alone a world, in which we are safe to express ourselves, let alone exist, without being abused. To illustrate this, I’ll go back to a point made earlier: that essentially there are always going to be ‘bad’ people in this world who commit atrocities, so common sense would dictate that a good course of action is to control or lock down our social media settings. I would concede that you’re right, there are always going to be people who perpetrate harm onto others, but I fail to see how the abuse will stop if you keep advising people to control or lock down their social media settings in order to protect themselves from online abuse.

  • To what end are you advising people to do just that?
  • Are we just going to keep retreating while perpetrators may or may not be held accountable for their actions?
  • And even if we retreat by controlling our social media settings and perpetrators are also held accountable for their behaviour, we’re still the ones who lose out all round. 

If this path continues, I see no end. We’ll be stuck in a cycle where we are forever on the defensive, thereby fostering an online world of fear which makes space for perpetrators to our detriment. We can’t just stop living because there are bad people out there. We can’t just stay stuck in the house because there are dangers in the real world, and we shouldn’t be missing out on fully participating in the online world because there are people who perpetrate online abuse. I say:

Take Up Space Unapologetically

Lastly, general advice encouraging people to manage, control or lock down their social media settings does not address the underlying problem at hand. It does not address the reality that perpetrators are treating the people they prey upon, commonly women, with no regard for that person’s humanity or dignity. It does not address the motivations behind why perpetrators commit online abuse. Frankly, efforts should focus on holding perpetrators accountable rather than encouraging people to do this, that or the other to maybe safeguard themselves.

While equipping people with the knowledge to make informed decisions about their digital footprint is important, general advice encouraging people to manage, control or lock down their social media settings in order to protect themselves from forms of online abuse is problematic. I would urge leaders in the online safety space to reconsider giving it.

 

Featured Image: Photo by William Iven on Unsplash

Senate Passes Civil Penalty Regime to Combat Image-Based Abuse

Today the Senate passed the Enhancing Online Safety (Non-consensual Sharing of Intimate Images) Bill 2017 with some surprising, significant amendments. This Bill is part of the Australian Government’s efforts to combat image-based sexual abuse, and was developed from a public consultation (submissions and public workshops) into a proposed civil penalty regime, conducted by the Department of Communications and the Arts between May and July 2017.

The Australian Government’s proposed framework is to establish a Commonwealth civil penalty regime to complement:

  • The world-first image-based abuse complaints portal run by the Office of the eSafety Commissioner which provides: information and advice, options for removing and reporting abusive images and videos, and resources and case studies; and
  • Existing Commonwealth and state and territory criminal offences.

The Bill establishes a civil penalty regime that would, as outlined in the explanatory memorandum: “prohibit the non-consensual posting of, or threatening to post, an intimate image on a ‘social media service’, ‘relevant electronic service’, e.g. email and SMS/MMS, or a ‘designated internet service’, e.g. websites and peer to peer file services”, among other things.

It imposes a civil penalty, rather than criminal liability, of $105,000 for individuals who contravene the prohibition, and a civil penalty of $525,000 for corporations that fail to comply with a ‘removal notice’, which may require a social media service, relevant electronic service or designated internet service to remove an intimate image from their service.

The Bill also empowers the eSafety Commissioner to investigate complaints, issue formal warnings and infringement notices, and provide removal notices and written directions to ensure future contraventions do not occur.

The general consensus in the Senate this week was that Labor, the Australian Greens and the Nick Xenophon Team welcomed and supported the Turnbull Government’s Enhancing Online Safety (Non-consensual Sharing of Intimate Images) Bill 2017, although, as Labor Senator Deborah O’Neill pointed out, the “Turnbull government has been dragging its feet and has taken far too long to address this issue of image based abuse. The bill comes in the fifth year of the Liberal government and over two years after Labor first proposed stronger measures”.

While Labor supported the Bill as a step in the right direction, they did not think it went far enough. Labor called on the government to criminalise the non-consensual sharing of intimate images, citing:

  • The COAG Advisory Panel on Reducing Violence against Women and their Children, which recommended in April 2016 that strong penalties for the distribution of intimate material without consent be developed to “clarify the serious and criminal nature of the distribution of intimate material without consent”;
  • Concerns raised by the Commonwealth Director of Public Prosecutions, in a submission to a Senate inquiry by the Senate Legal and Constitutional Affairs References Committee, that “there are limitations on existing Commonwealth laws to adequately deal with ‘revenge porn’ conduct”; and
  • Research from RMIT and Monash University showing that 80% of Australians agree “it should be a crime for someone to share a nude or sexual image of another person without that person’s permission”.

The Australian Government responded to the push to criminalise image-based abuse at the Commonwealth level by pointing out that there is already an existing Commonwealth criminal provision in place: s 474.17 of the Commonwealth Criminal Code Act 1995, covering the misuse of a carriage service. However, this non-specific, existing provision has been widely criticised for its limited applicability to image-based abuse.

As a result, a significant amendment to the civil penalty regime was successful in the Senate today, namely to amend the Criminal Code Act 1995 to include specific criminal offences that would criminalise sharing, and threatening to share, intimate images without consent. While this amendment to introduce criminal offences in conjunction with the proposed civil penalty regime may return to the Senate after transmission through the House, it could mean an incredible move toward justice for victims of image-based abuse.

In the Senate debate, the Australian Greens stated that they were disappointed that the Bill was brought on for debate in such ‘haste’, without allowing for proper scrutiny (e.g. an inquiry). Australian Greens Senator Jordon Steele-John pointed out that “many of those consulted are under the impression that they will subsequently be given the opportunity to give their thoughts, opinions and expertise in regard to the outcome.”

In light of the lack of proper scrutiny of this Bill, another amendment (sheet 8364 revised) was agreed to: the establishment of an independent review, with a written report, of the operation of the civil penalty regime within three years after the commencement of the proposed legislation.

In addition, the Australian Greens expressed concern that the Turnbull Government has not allocated any funding to the cost of running the scheme. While the explanatory memorandum of the Bill provides a ‘Financial Impact Statement’ which states that the civil penalty regime “might have a minor impact on Commonwealth expenditure or revenue” and that “any additional funding will be considered in the 2018-19 Budget”, there is a level of uncertainty as to the extent of funding needed to carry out this scheme. Labor Senator Louise Pratt also highlighted the “minimal resources that the eSafety Commissioner currently has for undertaking this kind of work”. I anticipate that the question of funding will be discussed in the House.

Also, One Nation Senator Hanson spoke about her own experience of being subjected to the ‘degrading’ and ‘embarrassing’ publication of images of a partially nude woman alongside false claims that they were pictures of her. However, Hanson went on to express some dangerous rhetoric about image-based abuse:

Hanson said:

“As the old saying goes, sometimes it takes two to tango. I say to anyone out there who thinks that intimate images of themselves are okay to send via text message or email: ‘Stop it. Keep it for the bedroom.’ People, regardless of your age, it’s in what is told to you by your parents and how you feel about yourself: people have to take responsibility for their own actions. Young people who get requests for intimate images of themselves early in relationships should not do it. Relationships don’t always last, and the person they are with may very well turn nasty on them. I’m very pleased to say that One Nation are a part of putting a dent in this abhorrent trend of shaming people using online methods and intimate images, but I reiterate: I want every man, woman and young adult to know that they too must play a role in ensuring their private photos are kept private.”

This rhetoric by Hanson perpetuates an insidious culture of victim blaming. It sends a harmful message that victims are partly responsible for the horrific and criminal actions of perpetrators, and it may discourage victims from speaking out or seeking help because they feel they are to blame. Perpetrators who share, threaten to share or record intimate images without consent are the ONLY people responsible for image-based abuse – not the victims. Many people, young people and adults alike, engage in the consensual practice of sharing intimate images in a respectful, healthy, safe, loving or intimate way. But image-based abuse is the clear absence of consent and respect. Image-based abuse is perpetrated for various reasons: to humiliate, shame, intimidate, coerce, control, harass and violate victims; it is also perpetrated for sexual gratification, social notoriety and financial gain. Our standards and expectations of behaviour shouldn’t be so low that we hold victims partly responsible for the heinous actions of perpetrators.

When it comes to young people, there is a growing problem of young girls feeling pressured to send intimate images of themselves, and this is something that desperately needs to be addressed with respectful education initiatives and programs. We must teach young people about the safe use of technology and its associated risks, about consent and respect, and we must empower young girls to take control of their online presence and agency. But we mustn’t, in any way, send the message that young people who send intimate images of themselves are somehow responsible for the actions of perpetrators who betray their trust or personal privacy.

To echo the sentiments in the Senate: this Bill is a significant step in the right direction, and when taken in conjunction with the amendment to introduce Commonwealth criminal offences, today marks a significant move toward long-awaited justice for victims.

I am extremely grateful to the Australian Government, Senator the Hon. Mitch Fifield, the Department of Communications and the Arts and all the stakeholders involved in the public consultation of this Bill, as well as everyone who has worked hard for years fighting for justice and accountability. Here’s to hoping for a smooth passage in the House. This is fantastic news!

Image-Based Abuse: The Phenomenon of Digitally Manipulated Images

Image-based abuse, colloquially referred to as ‘revenge porn’ (‘revenge porn’ is a misnomer) is an umbrella term. It refers to the non-consensual sharing of intimate images. Contrary to popular belief, there is much more to image-based abuse than the textbook ‘revenge porn’ scenario of the ‘jilted ex-lover sharing nude photos of their ex without consent’. Image-based abuse can be perpetrated in a number of ways, for a number of reasons including (among other things) to control, harass, humiliate, shame, coerce or sexually objectify a victim.

Image-based abuse is the recording, sharing, or threatening to record or share, intimate images without consent. ‘Image’ means a photo or video. ‘Intimate image’ means an image of a person engaged in a private act, or of a person’s private parts, or of a person in circumstances in which one would expect to be afforded privacy. ‘Intimate image’ can also mean an image that has been ‘altered’ without consent (digitally manipulated, doctored, photoshopped, etc.) to show a person in any of the above (i.e. engaged in a private act, etc.).

Photo: Noelle Martin (Me). Source: ABC NEWS (Dave Martin)

To date there is little to no research, data or information on the phenomenon of digitally manipulated images, but the issue is known to academics, researchers, cyber safety experts and women’s groups, and it is being incorporated into some recent law reform initiatives in Australia.

As a survivor-turned-advocate of this particular type of image-based abuse (link to my story here), I hope to provide some much needed insight into this form of image-based abuse and the many ways it can occur in the digital age. I will also provide a few tips on what to do if this happens to you.

The insight I provide below cannot tell you the exact extent of this phenomenon or how frequently it occurs, but I can tell you that there are horrific online cultures (websites/threads) that host and facilitate the creation and distribution of digitally manipulated images. I can tell you some of its forms, and I can tell you that I’m not the only one. Recent comprehensive research conducted in Australia shows that 1 in 5 Australians experience image-based abuse; while this figure takes into account other forms of the issue too, the prevalence of image-based abuse in general is telling.

Forms of Digitally Manipulated Image-Based Abuse

  1. ‘Face Swapping’ 

This form is where person A’s face is photoshopped onto pornographic material in such a way as to suggest that person A is truly depicted in the pornographic material. For me, this form manifested itself when my face was:

  • photoshopped onto images of naked adult actresses engaged in sexual intercourse;
  • photoshopped onto images where I was in highly explicit sexual positions in solo pornographic shots;
  • photoshopped onto images where I was being ejaculated on by naked male adult actors;
  • photoshopped onto images where I had ejaculation on my face; and
  • photoshopped onto the cover of a pornographic DVD.

I must also point out that these altered images of me quite literally identified me by name in the image. My name was edited onto the bottom of these images in fancy font to suggest that I was some adult actress.

2. ‘Transparent Edits’

This form of image-based abuse is where a person’s clothes are digitally manipulated to make them appear see-through. For example, a woman’s blouse can be edited so that nipples appear visible through the clothing (this happened to me).

3. ‘Cumonprintedpics’

This form of image-based abuse is where a perpetrator has ejaculated onto an image of person A and has taken a photo of their semen (with or without their penis) on person A’s image. The perpetrator can then post this second image (containing person A’s image and the perpetrator’s penis/semen) online. There are many forums and websites that feature galleries of this kind of image-based abuse (this happened to me).

4. ‘Bodily Alterations’ 

This form of image-based abuse is where a perpetrator digitally manipulates an image of person A by enlarging or enhancing person A’s private parts, particularly the breasts or behind. The alterations are usually very extreme.

5. ‘Juxtapositions’ 

This form of image-based abuse is where a perpetrator doesn’t necessarily alter an image of person A, but instead juxtaposes (places side by side) an image of person A next to, say, a pornographic image of person B, where person B has a similar-looking appearance/body to person A. The perpetrator can explicitly or implicitly indicate that the pornographic image of person B is person A.

6. ‘Unidentifiable Alterations’

This form of image-based abuse is where a perpetrator digitally manipulates an image of person A (into highly sexual material) but person A cannot be (objectively) identified at all. In this grey area, I believe it really doesn’t matter whether person A can be identified by third parties; what matters to me is whether person A can identify themselves, because being the subject of digital manipulation is EXTREMELY violating and degrading in itself. Plain and simple.

These are some of the many ways the phenomenon of digital manipulation can occur.

What can you do if this happens to you?

Unfortunately, the laws in Australia are limited. The NSW Parliament has recently passed an image-based abuse bill that will criminalise distributing, recording, or threatening to distribute or record, intimate images (including ‘altered’ images) without consent. South Australia and Victoria have ‘revenge porn’ laws, but neither explicitly mentions ‘altered’ or digitally manipulated images. The Federal Government is in the process of creating a civil penalty regime to complement existing criminal penalties, which could potentially cover digitally manipulated images. And the Office of the eSafety Commissioner is working on an online complaints mechanism for images shared without consent.

In the meantime, there are options. The eSafety Women website provides a list of what you can do. You can:

  • Collect all the evidence
  • Report it to the police
  • If you are over 18, you can report it to ACORN (Australian Cybercrime Online Reporting Network)
  • If you are under 18, you can report it to the Office of the eSafety Commissioner.
  • You can contact the webmasters/content hosts and request the removal of the material. (Proceed with caution)
  • Google has a reporting function to remove intimate images that have been shared without consent. Google can remove such images from its search results.
  • Facebook also has the tools to remove intimate images that have been shared without consent from Facebook, Messenger and Instagram.
  • Contact a lawyer and seek advice.
  • Contact local women’s groups/ domestic violence groups.
  • Sign petitions urging Australia to change the law ASAP.

Just remember, you are NOT alone. Wherever you are in the world. ❤ 

If you or someone you know may be suffering from mental illness, contact SANE, the National Mental Health Charity Helpline on 1800 187 263 or Lifeline, a 24 hour crisis support and suicide prevention service on 13 11 14.

A Cautionary Tale of Sexual Cybercrime: The Fight to Reclaim my Name

This is a cautionary tale of my experiences as a victim of sexual cybercrime. I’m filled with fear, hesitancy and an overwhelming sense of vulnerability at the prospect of writing this piece. I’ve written a little about my experiences before, but never as candidly as in what follows. This time around, I’m fighting to reclaim my name and image, a name and image that have been stolen from me and used to depict me as something I’m not.

So here goes…

It all started a couple of years ago when I discovered, through a simple Google reverse image search, that dozens of photos from my social media were plastered all over pornographic sites: xhamster.com, sex.com, cumonprintedpics.com, motherless.com, titsintops.com, you name it.

But let me make one thing clear: none of my photos are or were sexually explicit. They were just ordinary images of myself that, like everyone else my age and everyone else in today’s internet culture, I would post on social media.

Photo of me taken at age 17

It’s my understanding, after years of dealing with this issue, that the picture to the right is the one that started it all, or at least the one that caught the attention of some pervert out there.

Somehow the perverts responsible had also managed to find out all of my details, which were also posted on these porn sites: my name, where I lived, what I studied. Some people on the thread were even trying to find out the name of my childhood best friend so they could hack into my Facebook.

What’s more, is that on these pornographic sites were extremely explicit and highly offensive comments about myself that are to this day branded in my mind: ‘Cover her face, and I’d fuck her body,’ and ‘the amount of cum that has been spilt over her could fill a swimming pool.’ I was also called a ‘whale.’

The discovery was traumatising. I was frightened that a perpetrator would try to contact me in person. It was brutal. I immediately went to the police station, but this was before ‘revenge porn’ was dominating public discussion. The police told me that essentially there was nothing they could do, as nothing illegal was going on: once you upload a photo to Facebook, anyone can take it and do anything they want with it. I was told to contact the websites myself to have the images taken down, and to ensure that my social media settings were set to private.

I know now that what was happening to me is called ‘parasite porn’: the term used when ordinary images are taken from a person’s social media site and posted on threads in pornographic sites, usually alongside highly offensive, explicit and objectifying comments.

I also know that there are so many more young women who are victims of ‘parasite porn’ but haven’t a clue, all the while being preyed on by perverted men. The screenshot below is taken from just one website:

As you can see, some young women from Instagram are being preyed upon.

These perverted men might argue that what they’re doing may be questionable, but that technically they aren’t breaking any laws or rules. Unfortunately, they would be right. Under Facebook’s Statement of Rights and Responsibilities, ‘When you publish content or information using the Public setting, it means that you are allowing everyone, including people off of Facebook, to access and use that information, and to associate it with you.’

Perpetrators of ‘parasite porn’ might not be breaking any rules or laws right now. But it’s not far-fetched to imagine that at some point in the future, society witnesses a rise in the incidence of ‘parasite porn’ and we ask ourselves: how are we allowing this? Is it really okay for others to do anything they want with an image they find online, even if it means objectifying, sexualising and preying on the victim? Is this the risk young women have to take to have an online presence? How will we deal with this issue?

So while ‘parasite porn’ might not break any rules or laws, what it does do is open the floodgates to an even crazier world: the world of ‘morphed porn’, where ordinary images are manipulated and superimposed onto naked bodies, or edited to create a more sexualised effect, and posted on porn sites.

This is where my story takes a turn for the worse…

I soon learnt that my face was being photoshopped onto naked women and that I was being depicted as an adult actress: some solo shots, some with other porn stars, and in one image I’m being ejaculated on by two men. Today, Photoshop is so advanced that it’s really not that difficult to morph an image and make it look real, and some of mine do, which has been the cause of many sleepless nights worrying about my future employability.

The newest morphed image has me photoshopped onto the cover of a porn film: ‘Buttman’s Big Tit Adventure Starring Noelle Martin and 38G monsters’, it says.

From the initial discovery and throughout this process, I contacted all the relevant government agencies and even the Australian Federal Police. I explained my story numerous times but I was always transferred or directed to the next agency or simply not responded to.

So I just had to take matters into my own hands. I frantically went about getting the websites removed, with varying degrees of success. Luckily, most sites obliged my request for deletion, until I came to one particular site: the site containing the ‘morphed images’. I had sternly requested that the page be deleted, but the webmaster refused to do so unless I sent him intimate images of myself. When I of course refused and demanded the page be removed, he threatened to send the photos to my university and my father. I knew better than to give in to blackmail, so I held strong, but the site wasn’t deleted until much later.

Yet again, I know there are so many girls who literally don’t know about this, and it’s a terrifying prospect. The screenshot to the right is from just one site.

Now, some of you may be thinking that I should’ve just had my photo settings on private, or that I shouldn’t upload ‘risqué’ photos, or that I should just quit social media forever.

I thought the same for a long time, I was filled with shame, embarrassment and disappointment. But I’ve come to terms with the fact that I shouldn’t be ashamed at all. I haven’t done anything wrong. Like many others, I’m just another victim of sexual cybercrime.

In fact, now I would say that, firstly, no matter how careful you are with your privacy settings on social media, there are always ways around them. These perverts can and do look through club photos taken by the club photographer, events pages and even your friends’ accounts.

Secondly, blaming the victim is the easy option, especially in this culture of victim-blaming, where victims of ‘revenge porn’ are asked why they sent nude photos in the first place, instead of why the boys posted them online. We should be asking why these perverted men aren’t being held to account for their actions and for the harm they have caused not only me, but all the other victims subjected to sexual cybercrime.

Lastly, while it may be common knowledge that the internet is a dangerous place and we should all be careful about what we put on it, NOBODY expects that when they upload a photo to Instagram or Facebook they’ll end up being depicted as an adult actress, with their name and image smeared and misrepresented in a sexually explicit and highly offensive way.

Today, the media is dominated by news of ‘revenge porn’. We know about its harms to victims: they are more vulnerable to suicide, depression, emotional distress, humiliation, and the list goes on. What we don’t hear about are the issues of ‘parasite porn’ and ‘morphed porn’, maybe because most of the victims don’t know they’re victims, which is terrifying enough. But an even more terrifying prospect is that you don’t need to have taken or sent a sexually explicit photo to be at risk.

If you discover that you’re also a victim of ‘parasite porn’ or ‘morphed porn’, there is still hope. Google now allows you to request the removal of certain photos and videos posted without consent from Google Search results.

Fitting, it seems, are the words of Brené Brown, one of the world’s most renowned researchers on shame and vulnerability:

When we deny the story, it defines us. When we own the story, we can write a brave new ending.

So here I am, reclaiming my name.

Featured Image: Zac Quitzau Facebook: Zac’s Doodles