Deepfakes and Cheap Fakes – Issues with Terminology and the Incorrect Labelling of Deepfakes

In 2017, Motherboard journalist Samantha Cole was the first to report on deepfakes in her article ‘AI-Assisted Fake Porn Is Here and We’re All Fucked’. Cole reported on a Reddit user called ‘deepfakes’ who had, in his words, ‘found a clever way to do face-swap’ using ‘a machine learning algorithm, using easily accessible materials and open-source code that anyone with a working knowledge of deep learning algorithms could put together.’ Cole reported that this user had created and posted on Reddit fake pornographic videos of Gal Gadot, Scarlett Johansson, and Taylor Swift, among other celebrity women – marking the beginning of what has grown into a serious global problem, one that disproportionately affects women.

Data & Society affiliates Britt Paris and Joan Donovan describe a deepfake as ‘a video that has been altered through some form of machine learning to “hybridize or generate human bodies and faces”’. Reddit has since banned deepfakes on its platform, prohibiting the ‘dissemination of images or video depicting any person in a state of nudity or engaged in any act of sexual conduct apparently created or posted without their permission, including depictions that have been faked’. Nonetheless, Cole’s reporting sparked broader discussions about the implications of this technology: what it means for our ability to distinguish the real from the fake, and what it means for individuals who might be targeted by fake pornographic depictions of themselves (technology-facilitated abuse/image-based sexual abuse).

Since 2017, viral deepfakes have been created of Barack Obama, Tom Cruise, and the Queen, depicting them saying and doing things they never said or did.

Law professor and deepfake scholar Danielle Citron and Robert Chesney of the University of Texas School of Law have written about the potentially harmful implications of deepfakes for individuals and society. For individuals, they wrote:

[t]here will be no shortage of harmful exploitations. Some will be in the nature of theft, such as stealing people’s identities to extract financial or some other benefit. Others will be in the nature of abuse, commandeering a person’s identity to harm them or individuals who care about them. And some will involve both dimensions, whether the person creating the fake so intended or not.

Citron and Chesney also discussed the implications of deepfakes for distorting democratic discourse, eroding trust, manipulating elections, and national security, among other things. In her TED talk, ‘How deepfakes undermine truth and threaten democracy’, Citron said that ‘technologists expect that with advancements in AI, soon it may be difficult if not impossible to tell the difference between a real video and a fake one.’

While deepfakes pose a threat to society and democracy, a 2019 report by Deeptrace (now Sensity), ‘The State of Deepfakes: Landscape, Threats, and Impact’, found that 96% of deepfakes are pornographic, and that 100% of those pornographic deepfakes depict women. Perhaps unsurprisingly, given the origin story of ‘deepfakes’ and the way they were first used to create fake, non-consensual pornographic material, the human and gendered implications of deepfakes remain the most significant threat of this technology.

The incorrect labelling of deepfakes

Deepfake experts and technologists have debated at length what actually constitutes a ‘deepfake’ and whether less advanced forms of media manipulation, such as ‘cheap fakes’, should be considered deepfakes. Cheap fakes require ‘less expertise and fewer technical resources‘, whereas deepfakes require ‘more expertise and technical resources‘. ‘Cheap fakes’ is a term coined by Paris and Donovan, who define one as:

‘an AV manipulation created with cheaper, more accessible software (or, none at all). Cheap fakes can be rendered through Photoshop, lookalikes, re-contextualizing footage, speeding, or slowing.’

According to experts, the term ‘deepfake’ is being used loosely and incorrectly to refer to any kind of manipulated or synthetic media, even when the material in question is not technically a deepfake. Media outlets, in particular, have labelled content as deepfakes when it is not.

The incorrect labelling of content as deepfakes – including the labelling of less advanced manipulated material as deepfakes – has raised concerns among experts that it muddies the waters and compromises our ability to accurately assess the threat of deepfakes. For example, Mikael Thalen, a writer at the Daily Dot, reported on a US case involving the ‘mother of a high school cheerleader‘ who was ‘accused of manipulating images and video in an effort to make it appear as if her daughter’s rivals were drinking, smoking, and posing nude.’ Thalen’s article reported that:

The Bucks County District Attorney’s Office, which charged [the mother] with three counts of cyber harassment of a child and three counts of harassment, referenced the term deepfake when discussing the images as well as a video that depicted one of the alleged victims vaping.

According to Thalen’s reporting, ‘[e]xperts are raising doubts that artificial intelligence was used to create a video that police are calling a “deepfake” and is at the center of an ongoing legal battle playing out in Pennsylvania’. Commenting on this case, world-leading deepfake experts Henry Ajder and Nina Schick pointed out in a tweet by Schick that ‘the footage that was released by ABC wasn’t a #deepfake, and it was irresponsible of the media to report it as such. But really that’s the whole point. #Deepfake blur lines between what‘s real & fake, until we can’t recognise the former.’

My experience with deepfakes and cheap fakes

At the age of 18, I discovered that anonymous sexual predators had been creating and sharing fake, doctored pornographic images of me, and had been targeting me well before my discovery (since I was 17 years old). Over time, the perpetrators continued to create and share this fake content, which continued to proliferate on the internet and became more and more graphic. The abuse I was experiencing, coupled with a number of other factors – including that there were no specific laws to deal with this issue at the time – eventually led me to speak out publicly about my experiences of altered intimate imagery in 2015/16 and to help fight for law reform in Australia. After a few years, distributing altered intimate images and videos became criminalised across Australia (thanks to the collective efforts of academics, survivors, policy makers and law makers). However, during and after I had spoken out, and during and after the law reform changes across Australia, the perpetrators kept escalating their abuse. In 2018, I received an email warning that there was a ‘deepfake video of [me] on some porn sites’. I was later sent a link to an 11-second video depicting me engaged in sexual intercourse, and shortly after I discovered another fake video depicting me performing oral sex. (The 11-second video has been verified by a deepfake expert to be a deepfake; the video depicting me performing oral sex appears, according to a deepfake expert, to be a cheap fake or manual media manipulation. The altered and doctored images of me are not deepfakes.)

Issues with terminology and the incorrect labelling of deepfakes

While I agree with deepfake experts and technologists that incorrectly labelling content as a deepfake is problematic, there are a number of issues I have with this terminology discussion and the incorrect labelling of deepfakes, which I set out below.

Before I discuss them, I should point out that deepfakes and cheap fakes have been, and can be, created for a number of different purposes and for a number of different reasons, not simply as a way to carry out technology-facilitated abuse. However, given what we know about deepfakes, and how they are primarily used to create non-consensual pornographic content of women, these terms are inextricably linked to their impact on women, and therefore ought to be considered through that lens.

I should also point out that I am neither an academic, a deepfake expert, nor a technologist. But I do have a vested interest in this issue given my own personal experiences, and I believe I have just as much of a right to include my perspectives on these issues. These conversations should be as diverse as possible in the marketplace of ideas, and should not be limited in who can take part when the issue directly and disproportionately affects so many women in society.

First, the term ‘deepfake’ is itself problematic, owing to its origin as a way to commit non-consensual technology-facilitated abuse. Why should we give credence to a term that has been weaponised primarily against women, and in doing so immortalise the ‘legacy’ of a Reddit user whose actions have brought to the fore another way for perpetrators to cause enormous harm to women across the world? I take issue with labelling AI-facilitated manipulated media as ‘deepfakes’, even though it is the term the world knows it by. There ought to be a broader discussion about changing our language and terminology for this technology, because the current term does not adequately capture the primary way it has been, and continues to be, used by perpetrators to carry out abuse against women.

Second, the term ‘cheap fake’ is also problematic to the extent that it refers to material that has been manipulated – in less advanced ways than a deepfake – to create fake, intimate content of a person. Using the term ‘cheap fake’ to describe fake pornographic content of a person is potentially harmful because it undermines the harm a victim may be experiencing. The word ‘cheap’ connotes something less valuable and less important, which is particularly damaging in the context of people targeted by technology-facilitated abuse.

Third, I would argue that from a potential victim’s perspective, it is irrelevant that there even is a distinction between deepfakes and cheap fakes because a potential victim might still experience the same harm, regardless of the technology used to make the content.

Fourth, as AI advances, the capacity of laypeople (and even experts) to distinguish the real from the fake will likely diminish, including their capacity to determine what is a deepfake and what is a cheap fake. While there is significant validity to experts casting doubt on the incorrect labelling of deepfakes, I worry that as deepfake technology advances, an ‘asymmetry of knowledge’ will emerge between technologists and experts, who will be best placed to determine what is real or fake, and laypeople, who may not be able to do so. This asymmetry ought to be a primary consideration when casting doubt on deepfake claims, especially claims made innocently, whether by a victim of technology-facilitated abuse or by media organisations reporting on it. Innocently claiming something is a deepfake can be distinguished from intentionally claiming something is a deepfake when it is not, or vice versa – in which case it is important for experts to cast doubt on such claims.

In conclusion, there ought to be a broader discussion of our use of the terms ‘deepfakes’ and ‘cheap fakes’, given their origins and primary use case. There is also validity in experts casting doubt on claims of deepfakes; however, important factors ought to be considered when doing so, including the relevance of the distinction and whether claims were made innocently or not.

Whoopi victim blaming Bella Thorne on intimate images is dangerous and harmful

Whoopi Goldberg is making headlines for victim blaming actress Bella Thorne for taking intimate images of herself.

Thorne recently took to Twitter after being threatened by a hacker with her intimate images. Thorne posted her intimate images on Twitter to take her power back, writing “…U can’t control my life u never will.”

In a discussion on The View on Thorne’s response to the hacker, Whoopi said: “if you’re famous – I don’t care how old you are – you don’t take nude pictures of yourself” because then those images become available to hackers.

Whoopi continued “and if you don’t know that in 2019 that this is an issue, I’m sorry, your age does not – you don’t get to do that.”

Thorne responded to Whoopi’s victim blaming comments on Instagram saying that the interview “made me feel really bad about myself.” “I can only imagine all the kids who have their shit released and then they commit suicide” said Thorne.

In a discussion on The View in 2017, Whoopi made similar comments about Blac Chyna, after Robert Kardashian posted intimate images of Chyna online without her consent. Whoopi said, “stop sending pictures of your body, stop!” She continued, “learn from Anthony Weiner – stop sending this stuff!”

Whoopi’s victim blaming of intimate image abuse victims on more than one occasion reflects a pervasive and insidious attitude that victims of all forms of sexual abuse know all too well.

It is the kind of attitude that attacks and criticises the conduct of the victim instead of the perpetrators of a crime – the sentiment that somehow the victim is at fault for the wrongs committed against them, or worse, that the victim deserves the harm.

Words to the effect of:

  • If they didn’t want their nude photos leaked, they shouldn’t have taken the photos in the first place.
  • She shouldn’t have worn that short skirt, what did she expect.
  • Don’t walk home alone at night, if you don’t want to get raped.
  • But the school boy probably liked it.
  • But she’s slept with so many people.
  • But she shouldn’t have gone home with him.

What happened to Thorne was sexual abuse. Period. It was a gross violation of her privacy, dignity, agency, self-determination and humanity. Celebrity or not.

Everyone is entitled to respect, dignity and to be free to exercise agency over their bodies and sexuality – on their own terms, free from exploitation and abuse.

Katelyn Bowden, Founder and CEO of BADASS, a nonprofit organization dedicated to providing support to victims of revenge porn/image abuse said:

“It’s her body, and she has every right to own pictures of it without them being used against her. Celebrities deserve privacy like everyone else.”

Mia Landsem, activist fighting against intimate image abuse in Norway said that we should all be allowed to express our sexuality using technology, but shouldn’t be victim blamed for doing so.

Victim blaming celebrities like Thorne and Chyna is also dangerous. It reinforces a culture of shame where victims of intimate image abuse feel that they cannot seek help or talk about their abuse to others. It gives licence to perpetrators of this abuse by shifting accountability away from them.

Research from Australia by Dr Nicola Henry, Dr Anastasia Powell and Dr Asher Flynn, researchers at RMIT and Monash University, reports that “victims of image-based abuse [intimate image abuse] experience high levels of psychological distress”.

It was also reported that threats to distribute nude images are:

“…particularly harmful for victims, not only because of the consequences that can flow if the image is made public, but also owing to the acts that emerge from such threats, including unwanted sexual acts, restrictions of movement, exclusion from social life and monetary deprivation.”

The idea that the way to avoid or mitigate the risk of being threatened with your intimate images is not to take them in the first place misses the point. Why is it our responsibility to ‘avoid’ being sexually exploited and abused? Why isn’t it entirely the perpetrator’s responsibility not to sexually exploit and abuse others?

You don’t hear people say things like: if you don’t want your car stolen, don’t own a car. So why do we hear victim blaming attitudes so often in relation to intimate image abuse victims?

And look, while Whoopi’s comments are disappointing, I really hope that this becomes a teachable moment. The ability to deconstruct and challenge our own views, biases and prejudices is integral if we are ever to grow and change. Nobody is perfect – I’m sure as hell not perfect. And while I can’t speak for everyone, a sincere apology and changed behaviour does go a long way…

We can be better. We have to be better.

Bella – I am so sorry for what has happened to you but I am enormously proud of your strength, courage and resilience through all of this. Just know that there is an army of support behind you.


Sexual predators edited my photos into porn – how I fought back

TW: Image-based sexual abuse/sextortion


I am LITERALLY SHAKING with emotion as I share my TEDxPerth talk about my experiences of image-based abuse.

It’s also bittersweet because to this very day I am still experiencing this horrific crime. Not that long ago an anonymous sexual predator doctored me onto the body of a woman wearing a semi-transparent, nipple-exposing t-shirt with the words ‘I AM A DUMB COW’ written on it, which was shared online. The same sexual predator also doctored another image of me on the cover of another adult movie next to the words ‘TREAT ME LIKE A WHORE’. These were the LEAST sexually explicit of the most recent wave of doctored images of me.

There was a time when I would see these doctored images of me on pornographic sites and uncontrollably cry myself to sleep. But now I am so determined to do what I can to combat image-based abuse so that no other person has to be the subject of this dehumanising and potentially life-ruining criminal behaviour, because this issue is SO much bigger than me or any one person.

It is a global issue. It can and does happen to anyone – particularly women, people with disabilities, the LGBTQI community and other vulnerable groups.

While Australia and many countries around the world have criminalised or are in the process of criminalising image-based abuse (revenge porn), there is only so much one country or state can do to combat an issue that transcends jurisdictions.

The international community (including social media and tech companies) MUST work together to help combat this issue because right now too many victims are left without justice. Technology is advancing faster than our laws, and predators are continuing to come up with new ways to abuse others. We need a global plan of action. And we need it now.

There is so much more I would’ve liked to say in this talk, especially to those who are experiencing image-based abuse. If that is you – I really want you to know that you are not alone. You are loved. You are supported. And the fight for justice is as strong as ever. Yes, things might get really tough. People might victim blame and slut shame you. There might not be any justice or recourse. People might invalidate your experiences because they don’t understand that what happens online has real world consequences.

But PLEASE don’t lose hope or give up. Please know you are not to blame – it’s YOUR BODY, YOUR CHOICE – ALWAYS. Please stay strong. I know it is easier said than done, but take each day as it comes. Surround yourself with those who love and support you, because things CAN get better. I know it.

(Below I have included the details of the world-first image-based abuse portal created by the Office of the eSafety Commissioner under the amazing leadership of Julie Inman Grant – from personal experience I can tell you that this service is incredible for those who are looking for support if you are dealing with this. The staff are professional, kind and so caring.)

I also want to take the time to reiterate something I said in the talk: I do NOT in any way want to take credit for ‘changing the law’ AT ALL. In this journey I have had the privilege of meeting fellow survivors and activists who have fought with all their hearts and might for change – this is on their backs. It’s on the backs of ALL the victims and survivors who have dared to speak out.

One warrior in particular is Brieanna Rose who inspires me to my very core. She has been instrumental in this change and she deserves to be recognised. She is an incredible warrior. And it is an honour to know her. I love you Brieanna. I am so grateful for all the work you have done and continue to do for justice.

I have also met some of the incredible academics who have been pivotal in enacting change – Dr Nicola Henry, Dr Anastasia Powell and Dr Asher Flynn, who have contributed so much in this area. Their work and passion is invaluable, and we owe them a great deal of thanks. This is on their backs. They are incredible.

This is also on the backs of women’s rights advocates, tech safety experts, policy advisers, lawyers, politicians and especially the amazing people who helped create such a life-changing piece of legislation at the NSW Attorney General’s Department, including NSW Attorney General Mark Speakman, who actually included me in this process. Thank you for giving me a voice, and thank you for giving me a chance to reclaim my name. I can’t tell you how much it has meant to me.

This is also on the backs of so many other stakeholders in Australia and around the world who have worked for years fighting for change and justice in this area. The process of changing the law is not easy, it is a long and convoluted process and I am so grateful to every single person who has played a part in fighting against image-based abuse in Australia and beyond. I am so proud of all your work.

I also want to make it clear that I could not have gone through this journey without the support of my immediate family (Dad, Mum and my 4 sisters – I love you and thank you for putting up with my non-stop crying during the worst of times) and my best friends, Liam Downey, Mads Duffield and Tanaya Kar, who have supported me from day 1 – I love you and I am forever indebted to you; you were there for me at my worst and I can never repay you. Thank you to ALL my other close friends who have lifted my spirits and given me strength in my darkest days – you know who you are, I love you dearly!

To everyone who has followed this journey and taken the time to reach out – I’ve said this before and I’ll say it again – it does not go unnoticed and I appreciate it more than you know. I want to thank two particular law professors at my uni – Zara J Bending and Shireen Daft – who have listened, encouraged, empowered and shown so much love, care and support to me; you have been such pillars of strength for me. Karin Bentley has not only done so much work fighting for tech safety for women, but has supported me and gone out of her way to let me share my experiences, bring me to the table, and give me a voice – something that people don’t do often. I am so incredibly grateful to you.

I also want to give A HUGE THANKS to TEDxPerth for allowing me to share my experiences in the first place. Thank you for seeing value in what I had to say and having faith in me. Thank you to all the organisers and curators for VOLUNTARILY doing SO MUCH work putting the event together. TEDxPerth 2017 was a success and it’s thanks to you. To Andrea Gibbs and Emma, who were nothing short of phenomenal: they helped me so much with this speech, and were brutally honest with me when it sucked BAD. I am so grateful for your help and support – I really can’t express in words how much it meant to me.

If you are currently experiencing a form of image-based abuse, please contact police or there is support available through the WORLD-FIRST image-based abuse portal here: https://www.esafety.gov.au/image-based-abuse/

Senate Passes Civil Penalty Regime to Combat Image-Based Abuse

Today the Senate passed the Enhancing Online Safety (Non-consensual Sharing of Intimate Images) Bill 2017 with some surprising, significant amendments. This Bill is part of the Australian Government’s efforts to combat image-based sexual abuse, and was developed from a public consultation into a proposed civil penalty regime (submissions/public workshops) conducted by the Department of Communications and the Arts between May and July 2017.

The Australian Government’s proposed framework is to establish a Commonwealth civil penalty regime to complement:

  • The world-first image-based abuse complaints portal run by the Office of the eSafety Commissioner which provides: information and advice, options for removing and reporting abusive images and videos, and resources and case studies; and
  • Existing Commonwealth and state and territory criminal offences.

The Bill establishes a civil penalty regime that would, as outlined in the explanatory memorandum: “prohibit the non-consensual posting of, or threatening to post, an intimate image on a ‘social media service’, ‘relevant electronic service’, e.g. email and SMS/MMS, or a ‘designated internet service’, e.g. websites and peer to peer file services”, among other things.

It imposes a civil penalty, rather than criminal liability, of $105,000 for individuals who contravene the prohibition, and a civil penalty of $525,000 for corporations that fail to comply with a ‘removal notice‘, which may require a social media service, relevant electronic service or designated internet service to remove an intimate image from their service.

The Bill also empowers the eSafety Commissioner to investigate complaints, issue formal warnings and infringement notices, provide removal notices and written directions to ensure future contraventions do not occur.

The general consensus in the Senate this week was that Labor, the Australian Greens, and the Nick Xenophon Team welcomed and supported the Turnbull Government’s Enhancing Online Safety (Non-consensual Sharing of Intimate Images) Bill 2017 – although, as Labor Senator Deborah O’Neill pointed out, the “Turnbull government has been dragging its feet and has taken far too long to address this issue of image based abuse. The bill comes in the fifth year of the Liberal government and over two years after Labor first proposed stronger measures”.

While Labor supported the Bill as a step in the right direction, they did not think it went far enough. Labor called on the government to criminalise the non-consensual sharing of intimate images citing:

  • The COAG Advisory Panel on Reducing Violence against Women and their Children who recommended in April 2016 that strong penalties for the distribution of intimate material without consent be developed to “clarify the serious and criminal nature of the distribution of intimate material without consent”;
  • Concerns raised by the Commonwealth Director of Public Prosecutions, in a submission to an inquiry by the Senate Legal and Constitutional Affairs References Committee, that “there are limitations on existing Commonwealth laws to adequately deal with ‘revenge porn’ conduct”;
  • Research from RMIT and Monash University that 80% of Australians agree “it should be a crime for someone to share a nude or sexual image of another person without that person’s permission”.

The Australian Government responded to the push to criminalise image-based abuse at the Commonwealth level by pointing out that there is already an existing criminal provision in place: s 474.17 (misuse of a carriage service) of the Commonwealth Criminal Code Act 1995. However, this non-specific, existing provision has been widely criticised for its limited applicability to image-based abuse.

As a result, a significant amendment to the civil penalty regime was successful in the Senate today, namely to amend the Criminal Code Act 1995 to include specific offences criminalising sharing, and threatening to share, intimate images without consent. While this amendment to introduce criminal offences in conjunction with the proposed civil penalty regime may return to the Senate after transmission through the House, it could mean an incredible move toward justice for victims of image-based abuse.

In the Senate debate the Australian Greens stated that they were disappointed that the Bill was brought on for debate in such ‘haste‘ without allowing for proper scrutiny (e.g. inquiry). Australian Greens Senator Jordon Steele-John pointed out that “many of those consulted are under the impression that they will subsequently be given the opportunity to give their thoughts, opinions and expertise in regard to the outcome.”

In light of the lack of proper scrutiny of this Bill, another amendment (sheet 8364 revised) was agreed to: the establishment of an independent review (with a written report) of the operation of the civil penalty regime within three years of the commencement of the proposed legislation.

In addition, the Australian Greens expressed concern that the Turnbull Government has not allocated any funding to the cost of running the scheme. While the explanatory memorandum of the Bill provides a ‘Financial Impact Statement‘ which states that the civil penalty regime “might have a minor impact on Commonwealth expenditure or revenue”, and that “any additional funding will be considered in the 2018-19 Budget”, there is uncertainty as to the extent of funding needed to carry out this scheme. Labor Senator Louise Pratt also highlighted the “minimal resources that the eSafety Commissioner currently has for undertaking this kind of work”. I anticipate that the question of funding will be discussed in the House.

One Nation Senator Pauline Hanson also spoke about her own experience of the ‘degrading’ and ‘embarrassing’ publication of images of a partially nude woman that were falsely claimed to be pictures of her. However, Hanson went on to express some dangerous rhetoric about image-based abuse:

“As the old saying goes, sometimes it takes two to tango. I say to anyone out there who thinks that intimate images of themselves are okay to send via text message or email: ‘Stop it. Keep it for the bedroom.’ People, regardless of your age, it’s in what is told to you by your parents and how you feel about yourself: people have to take responsibility for their own actions. Young people who get requests for intimate images of themselves early in relationships should not do it. Relationships don’t always last, and the person they are with may very well turn nasty on them. I’m very pleased to say that One Nation are a part of putting a dent in this abhorrent trend of shaming people using online methods and intimate images, but I reiterate: I want every man, woman and young adult to know that they too must play a role in ensuring their private photos are kept private.”

This rhetoric by Hanson perpetuates an insidious culture of victim blaming. It sends the harmful message that victims are partly responsible for the horrific and criminal actions of perpetrators, and it may discourage victims from speaking out or seeking help because they feel they are to blame.

Perpetrators who share, threaten to share, or record intimate images without consent are the ONLY people responsible for image-based abuse – not the victims. Many people, young people and adults alike, engage in the consensual practice of sharing intimate images in a respectful, healthy, safe, loving or intimate way. Image-based abuse is the clear absence of consent and respect. It is perpetrated for various reasons: to humiliate, shame, intimidate, coerce, control, harass and violate victims, as well as for sexual gratification, social notoriety, and financial gain. Our standards and expectations of behaviour shouldn’t be so low that we hold victims partly responsible for the heinous actions of perpetrators.

When it comes to young people, there is a growing problem of young girls feeling pressured to send intimate images of themselves, and this is something that desperately needs to be addressed through respectful education initiatives and programs. We must teach young people about the safe use of technology and its associated risks, about consent and respect, and we must empower young girls to take control of their online usage and agency – but we mustn't, in any way, send the message that young people who send intimate images of themselves are somehow responsible for the actions of perpetrators who betray their trust or personal privacy.

To echo the sentiments in the Senate: this Bill is a significant step in the right direction, and when taken in conjunction with the amendment to introduce Commonwealth criminal offences, today marks a significant move toward long-awaited justice for victims.

I am extremely grateful to the Australian Government, Senator the Hon. Mitch Fifield, the Department of Communications and the Arts and all the stakeholders involved in the public consultation of this Bill, as well as everyone who has worked hard for years fighting for justice and accountability. Here’s to hoping for a smooth passage in the House. This is fantastic news!

Proposed Commonwealth Civil Penalties for Image-Based Abuse

The Commonwealth Government has introduced a bill, the Enhancing Online Safety (Non-consensual Sharing of Intimate Images) Bill 2017, aimed at prohibiting and deterring persons and content hosts from sharing intimate images without consent.

Under the proposed law, individuals who post, or threaten to post, an intimate image may be liable to a civil penalty of $105,000.  

Social media providers such as Facebook and Twitter, and other content hosts, electronic services and internet services, may be subject to a 'removal notice' requiring them to remove an intimate image shared without consent from their service. If they fail to comply with a requirement under a 'removal notice', they may be liable to a civil penalty of $525,000.

And as an extra layer of deterrence, persons may be given a direction to cease sharing an intimate image without consent in the future. Persons who fail to comply with such a direction may also be liable to a civil penalty.

My initial thoughts and concerns about the proposed law:

Overall, I am extremely happy to see such strong and hefty fines for perpetrators and content hosts. I am very pleased that the Commonwealth Government has introduced specific, nation-wide proposed laws for combatting image-based abuse. And I am very grateful to the Department of Communications, Senator Mitch Fifield and everyone involved in drafting this Bill, as well as all the stakeholders who participated in the public consultation process.

My main concern is that there is no express provision creating a statutory right for victims to either claim compensation or damages for the harm this can cause them.

Image-based abuse can cause significant harm to victims including emotional distress, violation, shame, humiliation, damage to their reputation and employability and disruption to their employment or education. Victims can fear for their safety and have suicidal thoughts and/or attempt suicide. I know of victims who have had to take time off work because the emotional distress is so significant. I know victims whose studies have been affected by the actions of perpetrators.

In the area of competition and consumer law, companies who breach such laws can be liable to fines, and for affected consumers the law creates a statutory cause of action for damages for loss or damage. Yet, under this proposed civil penalty regime, while perpetrators and content hosts may be liable to very hefty fines, victims won’t have the express statutory right to access damages (or compensation).

The non-consensual sharing of intimate images is a GLOBAL problem, and it is pervasive in our society, with 1 in 5 Australians having experienced image-based abuse according to RMIT and Monash researchers. And according to research by the Office of the eSafety Commissioner, 'women are twice as likely to have their nude/sexual images shared without consent than men', and 'women are considerably more likely to report negative personal impacts as a result of image-based abuse'. In August this year image-based abuse became a crime in NSW, and since then there have already been 20 charges. So, it is highly likely that should this civil penalty regime be enacted, a lot of fines will be handed out. Thus, victims should get the justice they deserve for the harm they suffer.

Also, I am happy to see that the definition of 'intimate image' in this Bill extends beyond material (photos or videos) depicting, or appearing to depict, a person's private parts or a person engaged in a private act. It also specifically includes material depicting a person without attire of religious or cultural significance, where that person consistently wears that attire whenever they are in public. This inclusion of religious and cultural factors in the definition of 'intimate image' represents an intersectional and culturally sensitive approach to image-based abuse, which is fantastic!

Having said this, I am proud that Australia is taking such strong action. 


Image-Based Abuse: The Phenomenon of Digitally Manipulated Images

Image-based abuse, colloquially referred to as 'revenge porn' (a misnomer), is an umbrella term for the non-consensual sharing of intimate images. Contrary to popular belief, there is much more to image-based abuse than the textbook 'revenge porn' scenario of the jilted ex-lover sharing nude photos of their ex without consent. Image-based abuse can be perpetrated in a number of ways and for a number of reasons, including (among other things) to control, harass, humiliate, shame, coerce or sexually objectify a victim.

Image-based abuse is the recording, sharing, or threatening to record or share, intimate images without consent. 'Image' means a photo or video. 'Intimate image' means an image of a person engaged in a private act, of a person's private parts, or of a person in circumstances where one would expect to be afforded privacy. 'Intimate image' can also mean an image that has been 'altered' without consent (digitally manipulated, doctored, photoshopped, etc.) to show a person in any of the above (i.e. engaged in a private act, etc.).

Photo: Noelle Martin (Me). Source: ABC NEWS (Dave Martin)

To date there is little to no research, data or information on the phenomenon of digitally manipulated images, but the issue is known to academics, researchers, cyber safety experts and women's groups, and it is being incorporated into some recent law reform initiatives in Australia.

As a survivor-turned-advocate of this particular type of image-based abuse (link to my story here), I hope to provide some much-needed insight into this form of image-based abuse and the many ways it can occur in the digital age. I will also provide a few tips on what to do if this happens to you.

The insight I provide below cannot tell you the exact extent or frequency of this phenomenon, but I can tell you that there are horrific online cultures (websites/threads) that host and facilitate the creation and distribution of digitally manipulated images. I can tell you some of its forms, and I can tell you that I'm not the only one. Recent comprehensive research conducted in Australia shows that 1 in 5 Australians experience image-based abuse; while that figure takes into account other forms of the issue too, the prevalence of image-based abuse in general is telling.

Forms of Digitally Manipulated Image-Based Abuse

1. 'Face Swapping'

This form is where person A's face is photoshopped onto pornographic material in such a way as to suggest that person A is truly depicted in the pornographic material. For me, this form manifested itself when my face was:

  • photoshopped onto images of naked adult actresses engaged in sexual intercourse;
  • photoshopped onto images where I was in highly explicit sexual positions in solo pornographic shots;
  • photoshopped onto images where I was being ejaculated on by naked male adult actors;
  • photoshopped onto images where I had ejaculation on my face; and
  • photoshopped onto the cover of a pornographic DVD.

I must also point out that these altered images quite literally identified me by name. My name was edited onto the bottom of these images in fancy font to suggest that I was an adult actress.

2. ‘Transparent Edits’

This form of image-based abuse is where a person's clothes are digitally manipulated to appear see-through. For example, a woman's blouse can be edited so that nipples appear visible through the fabric (this happened to me).

3. ‘Cumonprintedpics’

This form of image-based abuse is where a perpetrator ejaculates onto an image of person A and then photographs their semen (with or without their penis) on person A's image. The perpetrator can take this second image (containing person A's image and the perpetrator's penis/semen) and post it online. There are many forums and websites that feature galleries of this kind of image-based abuse (this happened to me).

4. ‘Bodily Alterations’ 

This form of image-based abuse is where a perpetrator digitally manipulates an image of person A by enlarging or enhancing person A’s private parts, particularly the breasts or behind. The alterations are usually very extreme.

5. ‘Juxtapositions’ 

This form of image-based abuse is where a perpetrator doesn't necessarily alter an image of person A, but instead juxtaposes (places side-by-side) an image of person A with, say, a pornographic image of person B, where person B has a similar-looking appearance/body to person A. The perpetrator can explicitly or implicitly indicate that the pornographic image of person B is person A.

6. ‘Unidentifiable Alterations’

This form of image-based abuse is where a perpetrator digitally manipulates an image of person A (into highly sexual material) but person A cannot be (objectively) identified at all. In this grey area, I believe it really doesn't matter whether person A can be identified by third parties; what matters is whether person A can identify themselves, because being the subject of digital manipulation is EXTREMELY violating and degrading in itself. Plain and simple.

These are some of the many ways the phenomenon of digital manipulation can occur.

What can you do if this happens to you?

Unfortunately, the laws in Australia are limited. The NSW Parliament has recently passed an image-based abuse bill that will criminalise distributing, recording or threatening to distribute or record intimate images (including 'altered' images) without consent. South Australia and Victoria have 'revenge porn' laws, but neither explicitly mentions 'altered' or digitally manipulated images. The Federal Government is in the process of potentially creating a civil penalty regime to complement existing criminal penalties, which could cover digitally manipulated images. And the Office of the eSafety Commissioner is working on an online complaints mechanism for images shared without consent.

In the meantime, there are options. The eSafety Women website provides a list of what you can do. You can:

  • Collect all the evidence
  • Report it to the police
  • If you are over 18, you can report it to ACORN (Australian Cybercrime Online Reporting Network)
  • If you are under 18, you can report it to the Office of the eSafety Commissioner.
  • You can contact the webmasters/content hosts and request the removal of the material. (Proceed with caution)
  • Google has a reporting function to remove intimate images that have been shared without consent. Google can remove such images from its search results.
  • Facebook also has the tools to remove intimate images that have been shared without consent from Facebook, Messenger and Instagram.
  • Contact a lawyer and seek advice.
  • Contact local women’s groups/ domestic violence groups.
  • Sign petitions urging Australia to change the law ASAP.

Just remember, you are NOT alone. Wherever you are in the world. ❤ 

If you or someone you know may be suffering from mental illness, contact SANE, the National Mental Health Charity Helpline on 1800 187 263 or Lifeline, a 24 hour crisis support and suicide prevention service on 13 11 14.


Trading Pleasure for Consent

Let’s get one thing straight: stealthing is sexual assault.

You could be forgiven for not knowing what stealthing is – but that is part of the problem. Recently the HuffPost claimed stealthing was a 'new sex practice', but since then people all over the world have come forward to tell their stories, implying there is nothing new going on here. We are just finally talking about it.

The term itself is fairly new and the internet has been quick to inject the phrase into the online lexicon. But in case you're still not familiar with it, allow me to summarise:

Stealthing is the act whereby one party removes the condom during sex without the other party’s knowledge or consent. Gross, right?

The recent surge of online debate over stealthing began when Alexandra Brodsky of Yale Law School posted a study suggesting that the trend was on the rise in the US and calling for new laws to concretely safeguard victims.

Source: Instagram/@honestly_quotes

In recent years, courts from all over the world have found stealthing to be a clear breach of bodily integrity and a non-consensual sexual act. Bills have been introduced in the US to criminalise it in California and Wisconsin, and a similar piece of legislation is under consideration in the UK.

Now that you know what stealthing means you're probably thinking 'Oh, I've heard stories about that. Hasn't that been going on for ages?' And the sad truth is yes, it probably has. The development of the law around sexual assault and other crimes of a sexual nature has been painstakingly slow. Some parts of Australia had no laws against marital rape until 1987, and we only managed to introduce legislation criminalising image-based abuse, commonly referred to as 'revenge porn', this year. We've been well behind the game.

This slow progress can also be seen with stealthing. No cases of stealthing have been brought before the courts in Australia, and no legislation specifically mentions the ramifications if protection is removed during intercourse without both parties' consent. I can understand the law being slow when it is catching up with technology, but condoms aren't exactly the latest and greatest in contraception. So what's the deal?

If I were a betting woman – and I'm not, but if I were – I would guess that the reason there has been no action in this area of law is that nobody is reporting it. Like most issues with sexual assault, it all comes down to whether victims step forward. And as usual this comes with a whole other mix of problems, from not understanding that what happened was 'assault', to not wanting to get a friend or loved one in trouble. One online account by a victim of stealthing also noted that the police did not take her matter seriously when she gave her statement. Sound familiar?

Time and time again victims of sexual assault are having to fight against this overriding theme that consent is not as important as pleasure. Allegations of rape always contain questions over whether the victim was ‘asking for it’ or whether the victim simply regretted it the next day. Sex is fun, sex is pleasurable, people love to have sex! So victims are asked if they are sure they didn’t consent, and if they are sure it was rape. Because to some people any sex is still sex.

Stealthing is the ultimate example of this. Offenders remove the condom, most typically because they can experience more pleasure without it, be it from the physical sensation or the feeling of degrading the other party. And traded away for this pleasure is the consent of the victim, who has no idea that the terms upon which they agreed to have intercourse have been rewritten.

Imagine sex like a contract. Both parties put forward their terms. Lights off. Reciprocal orgasms. But most importantly: a condom. Then during the execution of the contract the terms are changed. And not just any term, but one of the big ones. One of the terms that protects a party’s physical autonomy – the term that protects them from falling pregnant or potentially contracting an STI. That shield is literally taken away.

If you agreed to enter a boxing match on the condition you wear protective gear, wouldn't you be angry if halfway through the match your opponent took your helmet away and continued to punch you?

So while Australian law remains silent on stealthing, it is important that victims don’t. Men, women and non-binary victims who have had their bodily integrity compromised by the selfishness of another. People who have been violated and assaulted by offenders who have consistently gone unpunished.

Stealthing is not a prank. It is not a joke. There is nothing funny about sexual assault.

And as far as I'm concerned that's all stealthing is: sexual assault. And the sooner we stop trying to divert the conversation about sex-based crimes with discussions centred around pleasure, the better.

Featured Image: Encouraging Life Organisation which provides services on ‘reproductive, sexual health and comprehensive sex education’

‘Not Your Honey’ – When Sexual Empowerment Disempowers

Words: Jessica Sheridan

One of the difficult daily conundrums for women is the pressure to be sexy, but not too sexy. We are encouraged to wear high heels, but not too high, to wear low cut tops, but not too low cut. Honestly it’s a minefield of social faux pas trying to balance the two camps, and it often results in the stifling of our sexuality for fear of being too sexually open.

But women should be able to talk about sex. More than just that, women should be able to talk about pleasure, sexual desires and dislikes, the sensuality of their bodies – everything. I believe women should stand their ground and own their sexuality, recognising that their pleasure is just as important as their partner's and their bodies really are a wonderland. Women should not have to feel ashamed of being sexy.

Honey Birdette is one brand that claims to stand for this idea. On their website, they introduce themselves as ‘Pleasure parlours’ created to ‘inject a sense of sensuality into the Australian bedroom.’ Many people are likely familiar with the brand: their decadent shop fronts of gold and black can hardly be missed, and they are known for selling luxury lingerie and sex toys unashamedly. And rightly so – there should be no shame in consensual sexual pleasure.

But not everything is always as it seems.

Recently, ex-employees of Honey Birdette have come forward to speak out about the brand, alleging poor work conditions, sexism, and sexual harassment. At a protest in Melbourne on December 9th, a group of ex-employees gathered to draw attention to the backwards working conditions they were subjected to. The ex-employees were seen burning bras and carrying signs that read 'Not Your Honey' in protest of the mistreatment and sexual harassment they faced during their employment.

Former Honey Birdette employees fight back against poor working conditions. Source: Twitter

And it’s not just the protest. A petition has started online calling for change to Honey Birdette’s dress code, policies, and attitude toward sexual harassment. The campaign creator Chanelle Rogers wrote in her preamble to the petition:

‘I saw workers humiliated and threatened by management because they weren’t wearing perfectly applied lipstick all day, their heels weren’t high enough, and because they didn’t “talk the way a Honey should talk”. I saw workers sexually harassed and intimidated by customers – and when these women spoke up, management told them to suck it up.’

One story by ex-employee Dominic Jericho Drury has also been shared hundreds of times on Facebook, detailing their own experience working at Honey Birdette. They likened their employment with the company to an ‘abusive relationship – obviously insane from the outside but alluring enough to still suck people in.’ They recalled repeated harassment by customers, claiming ‘we had to put up with this, as there was no way we would be supported if looking after ourselves came before making a sale.’ Their story highlights the extremes expected of employees to be considered a true Honey.

Call to action as women stand up against Honey Birdette. Source: Twitter

Over the past twenty-four hours, the Honey Birdette Facebook page has been inundated with posts from customers who say they will boycott the store. Many of the posts – mostly from women – demand that Honey Birdette change their policies, or share stories from other ex-employees supporting the protest's allegations. While it is amazing to see women standing together to protect the rights of their fellows, Honey Birdette are yet to acknowledge or respond to the protests. There have been no posts by the page or on their website addressing the accusations.

These stories paint a picture nothing like the one Honey Birdette presents when it claims to 'empower women.' In order to empower women, you have to respect them, treat them fairly, and allow them to stand up for themselves. From smaller issues like requiring staff to wear perfect red lipstick and high heels for long shifts, to bigger issues like shutting down complaints of sexual harassment, the protest and petition are shedding a very ugly light on a company that was created with feminist ideas in mind.

It is not empowerment when women are forced to show their bras and wear stilettos just to keep their job. It is not empowerment when women are paid to have people talk to them in unwanted sexually explicit ways. It is not empowerment when women are scared to speak up about feeling uncomfortable in the workplace for fear of losing their job. This is not sexual empowerment. This is not even women empowerment. Silencing sexual harassment allegations and enforcing dress codes that play on sexualising women for the public (read as: male) gaze is disempowering.

It's one of those problems that seem to stem from trying to apply a quick fix to a deeply ingrained societal issue. Sexual empowerment is not as simple as wearing a lacy bra or holding a riding crop. It is not red lipstick during the day or stilettos as high as possible. Sexual empowerment is about choice, and feeling good about those choices. If you take away the ability to choose, then you make it impossible to empower women.

Dress codes and workplace policies are a fact of life. But sexism and sexual harassment shouldn’t be.

Featured Image: Source: Facebook

 

Pornographic Cybercrimes: Does the Law Protect Your Personal Privacy?

It seems like every other day we hear stories in the news of young girls taking their own lives because their nude photos were plastered online without consent. We hear about celebrity nude photo hacks. We hear about government ministers in the Northern Territory embroiled in revenge porn scandals. More recently, the issue has hit closer to home, with school students from over 70 Australian schools caught up in a pornography ring featuring thousands of non-consensual sexually explicit images of young girls.

The news is dominated by instances of 'revenge porn', that is, the distribution of sexually explicit or intimate images of another person without consent, usually by ex-lovers. But we rarely hear about 'parasite porn', which is when ordinary images are taken from a person's social media site and posted on threads on a pornographic site, usually alongside offensive and objectifying comments. In other words, you might not have taken a single sexually explicit photo of yourself – but you're still not immune from being the target of sexual cybercrime.

We also rarely hear about 'morphed porn', where ordinary images are manipulated, superimposed on naked bodies and posted on porn sites. The bottom line is that in today's age of technology, while revenge porn may be on the rise, it is not the only issue compromising our personal privacy.

There has been some talk that law-makers should re-name and categorise revenge porn, parasite porn and morphed porn into what is known as image-based sexual assault. I’d argue that the categorisation of ‘image-based sexual assault’ is preferable as it would encompass a broader range of sexual cybercrimes.

It is very important to know that while young women are the primary targets of such invasions of privacy, anyone can fall victim to sexual cybercrime – yes, even males. I know of one case where a guy had dressed up as an animal for a costume party, only to later find his photo on one of those furry fetish porn sites.

So what laws, if any, are in place to protect our personal privacy in this digital age?

Well, the law has not caught up with advancements in technology and unfortunately Australia is yet to criminalise revenge porn. Only two state jurisdictions, South Australia and Victoria, have implemented revenge porn legislation. For example, in Victoria it is an offence, punishable by up to two years' imprisonment, to maliciously distribute, or threaten to distribute, intimate images without consent. However, these criminal provisions have been criticised as too 'weak' a punishment for perpetrators and too 'broad' in scope to capture the harm caused by revenge porn.

Since the majority of Australian states have not criminalised revenge porn, victims predominantly have to rely on civil actions to seek redress for invasions of personal privacy, possibly through copyright or defamation proceedings. However, contrary to popular opinion, a general tort protecting personal privacy does not exist in Australia. As such, courts have tried to fit cases involving circumstances of 'revenge porn' into existing causes of action. As a result, what we have ended up with is a quasi-privacy tort, namely an equitable action for breach of confidence, as set out in the notable personal privacy case of Giller v Procopets.

The recent case of Wilson v Ferguson applied the principles set out in Giller v Procopets and relied on an action for breach of confidence in circumstances of ‘revenge porn.’ In this case, Ferguson and Wilson were involved in sexual relations and shared sexually explicit photos and videos of each other during their relationship. When the relationship ended Ferguson posted the intimate photos of Wilson to Facebook for public viewing without consent. Wilson was left severely emotionally distressed.

But is this quasi-privacy tort effective in dealing with the rise of revenge porn?

Firstly, this quasi-privacy protection fails to effectively punish perpetrators and deter future incidents of sexual cybercrime. The harms felt by victims of sexual cybercrime are significant: victims are more vulnerable to suicide; others experience stalking, depression, emotional distress and humiliation; some have had their employability affected, and others have lost their jobs. Is it really enough to simply award an injunction and provide monetary compensation to victims under this quasi-privacy protection?

Such harms warrant the criminalisation of revenge porn and the imprisonment of perpetrators. Criminalising revenge porn would provide stronger punishment for perpetrators and would deter future incidents of sexual cybercrime.

Additionally, this quasi-privacy protection in Australia fails to provide adequate justice for victims. It is somewhat paradoxical that civil actions intended to protect our personal privacy don't necessarily achieve this outcome: an action for breach of confidence means that victims may not remain anonymous, unlike the protection that criminal prosecution affords. In fact, victims may be reluctant to seek civil redress because it is extremely time-consuming, costly and emotionally taxing for already vulnerable victims, and may increase publicity of the photos.

But even if Australia’s laws were to change – there are inherent problems for lawmakers in addressing these issues due to the nature of the digital landscape:

  1. There are difficulties in enforcement and punishing perpetrators, especially where sites are run outside of Australia.
  2. Once an image is online it can be very hard to remove because images can be shared instantaneously all over the internet and before the law can step in much of the damage is already felt by the victim.
  3. There are difficulties in detecting intimate photos, as quite often victims are not aware that their intimate photos have been posted online; by the time they become aware, the images may have gone viral, making removal near impossible.

In America, the situation is quite different. Already, around 34 states have revenge porn legislation. Most revenge porn legislation in America is based on either the New Jersey or the Californian model, and the two differ significantly. For example, in New Jersey it is a crime, punishable by up to 5 years' imprisonment, to disclose any photograph, film, videotape… of another person whose intimate parts are exposed or who is engaged in a sexual act without consent. Unlike New Jersey, California's revenge porn law requires an intent to cause serious emotional distress and that the depicted person actually suffers serious emotional distress.

For Australia, all hope is not lost. In late 2015, Tim Watts MP introduced a Private Members’ Bill in the House of Representatives that would criminalise revenge porn, although it wasn’t passed into law. In March 2016, the NSW Legislative Council Standing Committee on Law and Justice released a report on serious invasions of privacy and on September 5 2016, NSW Attorney-General Gabrielle Upton announced that the NSW Government will seek to criminalise revenge porn.

However, deciding to criminalise revenge porn is just one step in dealing with this issue. For NSW and the rest of Australia, questions arise as to what this new law would prescribe: Would the penalties be stronger than the two years' imprisonment set out in Victoria and South Australia, or closer to the 5 years under the American models? How will it try to reconcile the inherent problems of enforcement and of the removal and detection of photos? Will this new law also capture instances of 'parasite porn' or 'morphed porn'?

So, how do you find out if you're the victim of sexual cybercrime? A simple Google reverse image search is a start to see if any of your photos are anywhere on the internet. If, however, you find images of yourself on pornographic sites without your consent, Google now allows you to request the removal of photos or videos from its search results. We've waited a long time for revenge porn legislation, but at least now the future is looking promising for Australia.


Written by Noelle Martin. A version of this article has previously been published in The Brief- The Macquarie University Law Society Publication.

Yeah the Boys Meet Up, Third Wave Feminism Turns Up

Words: Jessica Sheridan

How many more times this year will I have to hear the same old excuse of 'locker room talk'? For anyone who still does not understand what third-wave feminists are trying to do when we shut down sexist jokes for encouraging misogyny and rape culture, we now have another very clear example of exactly what we are talking about.

On the 4th of November, an event entitled 'Yeah The Boys Meet Up' was created on Facebook by a page of the same name. The event seems to have been originally intended as a meet-up for fans of the popular Facebook page Yeah the Boys, which has over 470,000 likes. The page posts ironic memes and status updates jesting at lad culture, and with such a following it is not unusual that fans would band together to organise a meet-up at Sydney's own Coogee Beach.

However, what probably started as a well-intended social meet-up very quickly turned toxic.

Attendees, the majority of whom were young and male, began posting sexist jokes that became more and more heinous as time went by. What started as ‘just a bit of fun’ deteriorated into threatening and hateful language within a matter of days. Many posts called for physical violence against any ‘two holes’ who turned up to the gathering, while others asked if boys could bring along their ‘two hole’ to share with other boys in attendance.

In case you were wondering, ‘two hole’ was their word for women.

Screenshot from ‘Yeah The Boys Meet Up’ Facebook event

The page drew a lot of attention fast, and for all the wrong reasons. Many people – men and women alike – posted on the page asking that the event be taken down, or imploring the attendees to stop disrespecting women. This was largely met with jokes about feminists being ‘triggered’, and with vulgar, sexually explicit replies whenever women tried to explain why the event was offensive. Men, and sadly some women, banded together to defend the posts, making aggressive and violent threats against those who criticised the blatant misogyny. The event, which racked up almost 10,000 attendees, spiralled out of control; even the Yeah the Boys Facebook page posted to disassociate itself from the meet-up. The event was eventually cancelled four days after it was created.

Screenshot from ‘Yeah The Boys Meet Up’ Facebook event

You might be wondering – what the hell happened? A lot of people were left confused by the event, for all kinds of reasons. Some people didn’t understand how something which had started as joking and banter had derailed into such a sexist and hateful mess. Some didn’t think it was fair that women were angry about the event, because it was just an all-boys meet up after all – so what if girls weren’t invited? Others claimed anyone complaining about the posts on the event page just couldn’t take a joke.

But there is a very real reason why I am not laughing. There is a very real reason why it’s not funny even though I understand the joke completely.

This is rape culture.

I don’t care about the event name. Frankly, the phrase ‘Yeah the Boys Meet Up’ means nothing to me. Women are not angry that the event appeared to be for boys only; that is not the issue.

The issue is that the type of jokes and sexist banter used on the page inevitably turned into hateful, violent sexism. I say inevitably because this is a pattern we have seen time and time again. We saw it only days ago when a Melbourne law student was added to a Facebook group chat in which four of her peers were explicitly describing what they wanted to do to her sexually. We saw it when Donald Trump tried to defend his descriptions of sexual assault with the archaic notion of locker room talk. Time and time again we see examples of people who do not seem to understand that their jokes have crossed the line into violent sexism and misogyny.

Screenshot from ‘Yeah The Boys Meet Up’ Facebook event

The issue is that jokes too easily turn into threats. Banter too easily turns into sexually aggressive descriptions of what men would like to do to women. Locker room talk too easily turns into the reduction of women to their ability to sexually please men (‘two holes’) and nothing more. This is an exact example of how rape culture is perpetuated in our society. This is why third wave feminism is all about shutting down the sexist joking and the misogynistic banter. Because too easily and too often it leads to a lack of respect for women, which in turn can lead to violence against women.

It is literally that simple, and yet that terrifying.

Until women and men are equal – truly equal – in society, then sexist jokes, banter and locker-room talk will inevitably lead to a culture in which women are seen as lesser beings and objects for sexual pleasure, nothing more. This is rape culture, and this is why third wave feminism is so important.