Joanna Hardy, Criminal Barrister at Red Lion Chambers, highlights the urgent need for new legislation to tackle ‘cyber flashing’ and related sexual offending, including the sending of unsolicited sexual images and the posting of revenge porn.
Imagine you are a woman on a packed train. It is standing, jostling-for-elbow-room only. One hand clings to the nearest rail. The other clutches your phone. Your attention remains transfixed on the screen as the carriage thunders along. You crane your neck to see which station is next.
Suddenly, your phone vibrates with an urgency you do not recognise. You are between stations. There is no signal. Your screen is taken over by a request – one you did not initiate and were not expecting. “iPhone (3) would like to share a photo”. Your thumb lifts from the screen and hesitates. Before you press or accept anything, a preview of the image appears. It is a photo of a stranger’s erect penis.
Your mind races and your eyes dart. Who sent this? Are they watching you? Do they get this tube every day? Do they get off at your station?
For decades, the rights of women to feel safe on public transport have been threatened by so-called real life interactions. Lewd remarks. Wolf whistles. Lonely carriages containing unknown strangers. Chock-a-block tubes with pressing bodies, gratuitously close. More recently, up-skirting, down-topping and those with a proclivity for photographing high-heeled feet.
Sexual harassment, however, is now being delivered digitally and anonymously. And all you know is that the offender is nearby.
AirDrop is a handy feature on iPhones. It uses Bluetooth and Wi-Fi to send files from handset to handset. If you are at a party and want to share photos of the event, AirDrop is a convenient way to do it. But if you leave your AirDrop receiving setting open to everyone – and it is easily done – your phone remains able to receive a file from any person, long after the party has ended. The name of your phone is visible, and anything like ‘Joanna’s iPhone’ is likely to reveal your gender. Anyone within about 30 feet can then send you an image. Even if you do not accept it, the automatic ‘preview’ box reveals all.
The receipt of unsolicited photographs of this nature, dubbed cyber-flashing, is widespread. A 2018 YouGov poll revealed 41% of women aged 18 to 36 had received unwanted photos of a penis. Receipt is not limited to the train or to AirDrop. “Just turn it off” is not an answer. Reports reveal transmission by text, social media and dating websites. Sophie Gallagher for the Huffington Post has reported widely and impressively on the issue, including interviews with women who have received such photos in lecture theatres and bars. They reported feeling sick, panicked and exposed.
What is to be done? There is not, yet, a dedicated law specifically targeting this tech-enabled conduct. Shoehorning modern behaviour into old laws tends to be knotty. Public decency laws are archaic and have already fallen short when applied to up-skirting. One-off victimisation on public transport or on a dating app will not always contravene harassment laws. The core of the Malicious Communications Act 1988 was drafted before most millennials were born and was never aimed at cyber sexual offending.
The offence of exposure (Section 66 of the Sexual Offences Act 2003) was enacted over 15 years ago, long before the ubiquity of smartphones. It prohibits a defendant intentionally exposing his genitals where he intends that someone will see them and be caused alarm or distress. It has long been used to prosecute in-the-flesh ‘flashers’, and the wording proved flexible enough to successfully prosecute a man who exposed himself via FaceTime in 2014. But is it fit for purpose?
Firstly, it may produce piecemeal case law seeking to widen or restrict the notion of ‘digital exposure’ for the modern age. Secondly, what if an offender sends a complainant a batch of 120 ‘stock’ images of random penises? He would not be guilty of a Section 66 offence, which prohibits only the exposure of the defendant’s own genitals. Thirdly, is the defendant’s intention to cause alarm or distress, as Section 66 requires, still an appropriate test in these circumstances? Would the targeting and sending of unsolicited sexual photographs to someone without their consent be a better one? It is a debate that needs to be heard.
A victim-centric approach is certainly favoured by the Women and Equalities Committee, which heard extensive expert evidence about all categories of image-based sexual offending and concluded: “The Government should introduce a new law on image-based sexual abuse which criminalises all non-consensual creation and distribution of intimate sexual images, including altered images, and threats to do so. This should be a sexual offence based on the victim’s lack of consent and not on perpetrator motivation, and include an automatic right to life-long anonymity for the complainant…”
Cyber-flashing joins an unhappy family of image-based offences. Professor Clare McGlynn, the leading academic authority on the topic, has cogently set out the inadequate ‘ad hoc’ provisions governing them. Legislation, current and draft, offers inconsistent answers to important questions of anonymity, offender motivation, public/private places and consent. The resulting mess is unpredictable for lawyers, difficult for the police and unsatisfactory for those on the receiving end.
The Women and Equalities Committee proposes that the government streamline the law. If drafted with care, such legislation could adequately cover cyber-flashing, upskirting, revenge porn, and photoshopped pornography. It might even cover new methods of image-based offending that we have not yet contemplated. Nuances in defences, sentencing and ancillary orders could be accommodated. Instead of micro-legislating for each act, or for each technology, we could recognise the common themes of image-based sexual abuse. Professor McGlynn notes that Australia managed to introduce legislation that “straightforwardly criminalises all non-consensual creation and/or distribution of intimate sexual images” including threats to do so and images that had been altered.
A government response to the issue is due imminently. In the meantime, the British Transport Police continue their excellent campaign for safety, including a discreet text message service. Some women have re-named their telephones with male names or deactivated certain functions entirely. Others point to tech companies themselves and argue that risks associated with AirDrop could be eradicated at source by default privacy settings or removal of the ‘preview’ photo entirely.
A compelling answer, however, is to fight technical complications with legislative simplicity. The current patchwork of old law, new law, shoehorning and making-do will be no match for the technology lurking around the corner. Lawmakers cannot continue to react to the crime du jour. By the time a Bill has crawled through Parliament, the next tech-enabled method will have reached digital maturity. This is whack-a-mole offending, and modernisation is now long overdue.