AI is constantly portrayed as either our new potential best friend or Terminator-style, singularity-driven doom. To an IT expert with a law degree, the law's responses seem to be lagging behind AI-generated creations, particularly deepfakes.

A deepfake is a product in which one person's image and/or voice is substituted for another's. The images can be still or moving – the format is not the concern: the concern is the output, which may either be labelled as fake or be passed off as real. Either way, the product is not a genuine representation of the individual depicted.

A further problem is that even an artificially created face can resemble an individual closely enough that they might perceive themselves as being represented. Presently, artificially generated faces are typically produced by “averaging” the images of many people, yet those artificial faces can nevertheless resemble real people very closely. Having been mistaken for other girls and women multiple times (especially one of my own distant cousins, who alarmed a law professor when he found out we were not the same woman), I am very conscious of how little similarity is required to evoke the image of a person.
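As a rough illustration of that “averaging” idea, the Python sketch below builds a composite face by taking the pixel-wise mean of a folder of roughly aligned photos. The folder name, image size and file pattern are my own illustrative assumptions rather than anything from a real system, and modern generators are far more sophisticated than this, but the principle that a synthetic face is built out of many real ones is the same:

    # A minimal sketch of an "average face": the composite belongs to no one,
    # yet it can still resemble real people. Paths and sizes are hypothetical.
    from pathlib import Path

    import numpy as np
    from PIL import Image

    FACE_DIR = Path("faces")   # hypothetical folder of aligned face photos
    SIZE = (256, 256)          # resize everything to a common shape

    # Load each photo as floating point so the mean is computed accurately.
    stack = [
        np.asarray(Image.open(p).convert("RGB").resize(SIZE), dtype=np.float64)
        for p in sorted(FACE_DIR.glob("*.jpg"))
    ]

    # The composite is simply the pixel-by-pixel mean of all the photos.
    average = np.mean(stack, axis=0).astype(np.uint8)
    Image.fromarray(average).save("average_face.png")

Even this naive composite tends to look like a plausible human face, which is part of why a synthetic face can so easily evoke a real one.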

The problem of deepfake porn
For most publication inventions, including the printing press, the first use was the Bible and the second was pornography. Deepfakes, however, were among the first to go the other way, heading straight to pornography. Defenders of deepfake pornography argue that it should be permitted because it is often labelled as a deepfake, and because the women (it is usually women who are abused) complaining about their faces being overlaid onto pornography are not actually being touched without their consent. The argument continues that it is no different from someone imagining the individual being faked in that sexual situation, and that there should never be laws that impinge on an individual's imagination, because that would amount to totalitarian thought control.

While I agree that there should not be laws preventing people from imagining whatever they wish within the confines of their own minds, the difficulty I have with this argument is that the person being faked cannot experience what another individual is imagining and is not confronted with the images in that person's mind. I can see why people are entirely permitted to write fanfic and to reduce their (sometimes disgusting) desires to writing, but I would argue that there is a more visceral response to witnessing someone abusing what appears, even superficially, to be your body than there is to reading about someone abusing your body. When reading a literary work, much of the action and its visual description is left to the imagination; pictures, and especially moving images, are far more immersive. While I accept that individuals can be reasonably distressed by written pornography, I also understand that moving-image pornography representing the individual (whether a deepfake or a lookalike) will have a much greater impact, because humans (those of us able to see and hear) are highly visual and aural creatures.

Existing Crimes that would protect against Deepfakes
Currently, in Australia we have laws that protect against stalking, harassment or causing “offence” using a carriage service (i.e., the telephone or the internet). It would be a novel case, but I believe a deepfake, whether a fake endorsement or deepfake porn, would satisfy the “causing offence” element. I think an “ordinary person” (the test) would be offended by seeing themselves or a loved one engaging in sex they were not engaging in, or representing products they would not endorse.

The difficulty people are discovering with all IT law is that police are not familiar with the technology and prosecutors are unsure about jurisdiction, so matters such as these, although clearly crimes, are not presently prosecuted. With education, deepfakes would be covered by the existing “causing offence” laws, but I suspect the real problem will be police resources.

Note that I initially objected strenuously to “causing offence” being made a crime, on the grounds that I did not consider causing offence sufficiently criminal to warrant prison time; but in this context, as opposed to the law's original purpose, I can see that it can sensibly be applied to deepfakes.

In relation to depictions of minors, any deepfake presenting itself as a child in a sexual situation will be child pornography. Even an AI-generated likeness of a child in a sexual situation will be child pornography, because what matters is the apparent age of the “person” engaging in the conduct, not their actual age. These laws are sadly seeing increasing use as minors depict other minors in sexual situations to cause them deliberate harm through “bullying”.

Existing Civil Remedies
The trouble is that police are already overtaxed by physical crimes and are certainly under-educated in this area, so “self help” might be a quicker and more effective route. If someone were to obtain a court order, breaching that order would be criminal, so a financial remedy and a take-down order would go some way towards vindicating a victim. The problem is that only the sufficiently wealthy or well-connected are able to obtain a court order to shut down an offensive publication.

Defamation is one law that could be used against deepfakes, because associating someone's persona with an activity that would be viewed as “seriously” injurious to their reputation could be defamatory. People are not generally aware that Australia has both civil and criminal defamation, and deepfake porn would, in my view, constitute both, because it would give rise to the inference that the person portrayed would engage in that activity in front of a camera. Stating that it was a deepfake would still not remove the “sting” of the defamation.

Another law that would address such conduct is the civil action against a person (including a company) engaging in misleading or deceptive conduct, or a misleading endorsement. The difficulty with this law is that deepfake porn often states directly that it is not the person it purports to be, so a higher court would need to decide that a small disclaimer is insufficient to undo the misleading effect of presenting someone's face and/or voice. Courts have, from time to time, ruled that weak disclaimers are insufficient to undo the confusion that would exist in the mind of a “consumer” (these are consumer protection laws), so there could be an avenue for a joint defamation and misleading or deceptive conduct action.

Intellectual property, the body of law most people reach for when someone takes an intangible they believe belongs to them, is unsuitable, because intellectual property protects expressions of ideas that are created; it is not designed to protect things that already exist, such as the appearance of an individual. Even photographs have very limited protection under copyright (and the right is vested in the photographer, not the person photographed).

Australia does not recognise a tort of privacy, so there is no applicable privacy law.

Conclusion

Although it means deviating from my original libertarian views on protecting one's likeness, the increasing prevalence of deepfake porn has led me to conclude that police should be educated and that deepfakes which cause offence should be forbidden by criminal and civil law and properly investigated by the State. My reasoning, as set out above, is that deepfakes, although currently imperfect to some eyes, are sufficiently convincing to cause considerable genuine loss, damage and anguish, not only to the person represented but also to their family and friends, whether that person is alive or dead.


