Written by: Michela Rossettini, Working Group on Digital Policy

Edited by: Celina Ferrari

Abstract

This paper examines the criminal accountability of platforms in addressing technology-facilitated gender-based violence (TFGBV), focusing on Italian and European legal frameworks. It begins by defining TFGBV and situating it within feminist criminological theories to highlight its gendered dimensions. The study explores the application of criminal accountability to digital platforms, analysing the Italian and EU legal systems and their alignment with international conventions, using Section 230 of the U.S. Communications Act as a benchmark. Key concepts such as duty of care, human rights due diligence, and design justice frame the discussion on platform responsibilities. Through an analysis of landmark cases, safe harbour provisions, and self-regulatory practices, the paper illustrates how platforms manage liability in TFGBV cases. Finally, it evaluates current regulatory models in the EU and Italy, proposing pathways for policy reforms to strengthen criminal accountability and effectively combat TFGBV.

  1. Introduction 

Technology is a double-edged sword: on the one hand, it has the potential to empower and emancipate women; on the other, it is used to perpetrate technology-facilitated gender-based violence (hereinafter “TFGBV”). The lived experiences of women and men within online spaces drastically differ. This research paper explores what TFGBV means, feminist criminology, and the legal realm that tackles the crimes of TFGBV. Regarding the latter, an overview of the legal frameworks at the international, European, and Italian levels will be given, in order to understand the current legal frameworks and grasp the future of platform regulation.

  2. The meaning of technology-facilitated gender-based violence

The term “technology-facilitated gender-based violence” is an umbrella term created to respond to the need for data collection, research, and policy on issues that undoubtedly concern gender-based violence but slightly diverge from what is normally understood as such. Thus, the term TFGBV encompasses the ways in which digital media technologies and platforms exacerbate existing patterns of gendered violence and introduce new modes of abuse (Dragiewicz et al., 2018). However, a plethora of other terms are also used, e.g., online gender-based violence, information and communications technology (ICT) facilitated gender-based violence, and cyber violence. Overall, scholars, academia, and policymakers are discouraged from devoting too many resources to defining something as volatile and ever-evolving as TFGBV. Indeed, the Special Rapporteur on Violence Against Women, in her 2018 report, stated that:

“The rapid development of digital technology and spaces, including through artificial intelligence (AI), will inevitably give rise to different and new manifestations of online violence against women. […] As digital spaces morph and develop, so too must the application and implementation of human rights norms to these areas.” (United Nations, 2018)

Thus, it is more productive to spend resources drafting regulations that protect women online rather than using the scarce resources available to academically define the nitty-gritty of TFGBV. Nonetheless, the table below summarises some offences that fall under the phenomenology of TFGBV.

Table 1: Terms and definitions of some “forms” of TFGBV (Dunn, 2020)

Non-consensual distribution of intimate images (also known as “revenge porn”) – A person’s sexual images are shared with a wider-than-intended audience without the subject’s consent.

Cyberflashing – A person sends a penis image to another without their prior agreement or consent (McGlynn & Johnson, 2021).

Sextortion – The perpetrator threatens to release intimate pictures of the victim to extort additional explicit photos, videos, sexual acts, or sex from the victim (United Nations, 2018).

Doxing – The publication of private information, such as contact details, on the Internet with malicious intent, usually with the insinuation that the victim is soliciting sex (United Nations, 2018).

Impersonation – Online impersonation occurs when a malicious actor steals someone’s online identity to cause damage (Tripathia, 2022).

  3. Feminist criminal law theories

To correctly comprehend the multi-faceted phenomenon of TFGBV, it is important to grasp the peculiarity of the context in which these crimes are perpetrated. Indeed, they are committed within online spaces into which the offline “culture of humiliation” has been shifted, becoming the main lens through which cultural narratives are told. This means that images are used to humiliate individuals for pure enjoyment, and this is the exact cultural context in which female victimisation and male perpetration of image-based sexual abuse occur (Sandberg & Ugelvik, 2016; Patchin & Hinduja, 2020). Throughout this paper, the term “revenge porn” will not be used to refer to the latter: even though the crime is well known to civil society under that nomenclature, the term is filled with misconceptions. The mere adoption of legal provisions targeting “revenge porn” rather than non-consensual pornography sparked debates that only led to discrimination against victims, since this point of view implicitly suggests that disseminating a person’s intimate picture without their consent is not a harmful act per se and only becomes so when motivated by malice.

When tackling criminology, it is of vital importance to put the topic of gender at the forefront of the discourse. This is done through feminist criminology theories, which examine the lived experiences of women and girls to provide meaningful context and insight into criminological issues – crucial for a phenomenon that overwhelmingly affects women (Marganski, 2020). Feminist criminology sheds light on the fact that the social world is systematically shaped by relations of sex and gender, which impact both the self and the interactions a person has with others (Daly & Chesney-Lind, 1988).

TFGBV is perpetrated more commonly by men because, in patriarchal cultures, they are taught to assert their virility, whereas women are merely expected to be nurturers while still being over-sexualised and treated as commodities (Daly & Chesney-Lind, 1988; Vaughan et al., 2023). For this reason, violating others through misogynist vitriol, degradation, and humiliation becomes an act of revenge. This misogynist vitriol is drenched in sexuality in the sense that its practice over-sexualises women, and it also spills into information and communication technologies, which expand the opportunities for men to take part in “getting off” on shared content (Marganski, 2020). Following the same line of thought, young single men are more likely to engage in computer offences to demonstrate their masculinity via what are known as gender-informed power moves, carried out with the intention of exerting control and domination. This has the externality of keeping online spaces as masculinised as possible, turning them into sealed silos that directly influence the subsequent actions of the people within them (Marganski, 2020).

Male support theory discusses macro-level structural violence around patriarchal structures. Indeed, males socialised into a patriarchal structure who support what has been stressed above (i.e., gender stereotypes and the objectification and subjugation of women) will surround themselves with male friends who hold the same set of beliefs (Holt & Liggett, 2020). Furthermore, the creation and improvement of technology, and the skills connected to it, are highly masculinised sectors of society. Thus, the maintenance and coding of websites that help diffuse controversial content are mostly handled by male technology users. Naturally, this feeds the vicious cycle of exclusion of women from online spaces, nurturing the normalisation of violence against women by male users (Holt & Liggett, 2020).

  4. Platform Accountability

Social media companies have the burden of deciding, millions of times a day, which content to regulate and, most importantly, how to regulate it. Each platform has its own set of legal agreements stipulated with the users accessing the platform, i.e., its terms of service (hereinafter “ToS”). Because no rules for speech regulation have been formally adopted as international human rights law, these ToS take different shapes and forms (Benesch, 2020).

The decisions internet intermediaries make have a direct impact on the lives of many. However, notwithstanding the countless recommendations and frameworks put in place, platforms most of the time present themselves as neutral intermediaries. Furthermore, most social media platforms and internet intermediaries enjoy what is called a “safe harbour” because they are based in the United States: under Section 230 of the U.S. Communications Act, internet intermediaries are shielded from liability for third-party content (Suzor et al., 2019; Section 230, 1996). In December 2024, the U.S. Senate passed the Take It Down Act, which enshrines, in section 3, subsection four, the same liability shield found in the above-mentioned Section 230. The Act merely obliges platforms to take down the content within 48 hours of receiving a valid removal request (S.4569 – Take It Down Act, 2023).

All online platforms provide their Terms of Service, to which users agree when signing up for the website or when posting on the platform. For this paper, the social media platforms most frequently used by Gen Z were investigated to understand whether procedures are in place for reporting the crime of non-consensual distribution of intimate images.

TikTok allows users to report hate speech, harassment, and nudity but lacks a specific option for “non-consensual sharing of intimate images,” which can only be reported under “other” (Reporting, n.d.). Instagram offers fewer categories but includes an option to report the sharing, or threats of sharing, of intimate images (Instagram, n.d.). X, formerly Twitter, offers a reporting option for “non-consensual intimate images” and provides additional support through its Help Centre (How to Report Violations of X Rules and Terms of Service, n.d.). Reddit enables reporting for harassment, impersonation, and non-consensual media, while Telegram allows users to flag messages for illegal adult content, including non-consensual sexual imagery (Rules & Reporting, n.d.; User Guidance for the EU Digital Services Act, n.d.). Most platforms provide accessible tools, but they differ in specificity and clarity; it is also important to underline that having to go through this process to get content deleted is traumatic for victims. Furthermore, it is not always guaranteed that the malicious content will be removed after a report (European Institute for Gender Equality (EIGE), 2022).

Image 1: Screenshot from TikTok (Reporting, n.d.)

Image 2: Screenshot from Instagram (Instagram, n.d.)

Image 3: Screenshot from X (How to Report Violations of X Rules and Terms of Service, n.d.)

Image 4: Screenshot from Reddit (Rules & Reporting, n.d.)

Image 5: Screenshot from Telegram (User Guidance for the EU Digital Services Act, n.d.)

  5. International soft and hard law instruments

In 2015, civil society organisations, together with experts from around the world, drafted the “Manila Principles.” These are compiled into an instrument designed to serve as a resource in the development of intermediary liability policies that foster and protect a free and open Internet. The stimulus for their creation came from international legal frameworks and human rights provisions, mainly the International Principles on the Application of Human Rights to Communications Surveillance. The drafted principles focus solely on pure intermediary liability. They largely overlap with the approach taken by most legal systems, thus reinforcing the position that intermediaries are not to be held accountable for the content published through them and allowing them to shield themselves from the claims that victims of gendered cybercrimes could raise (Electronic Frontier Foundation, 2015).

  6. International legal framework

The International Covenant on Civil and Political Rights (hereinafter “ICCPR”) stands as a point of reference for setting out a regulatory framework that ensures transparency and the correct design of regulations, meaning that these should protect the relevant population while imposing a minimal level of restriction on the right to freedom of expression. Thus, speech restrictions under the ICCPR are allowed for the respect of the rights or reputations of others, as enshrined in Article 19. Article 20 states that any advocacy of national, racial, or religious hatred that constitutes incitement to discrimination, hostility, or violence shall be prohibited by law (ICCPR, 1976).

Furthermore, the United Nations drafted the “Guiding Principles on Business and Human Rights,” a soft law instrument applicable to all States and business enterprises. It is based on the principle that: 

“States must protect against human rights abuse within their territory and/or jurisdiction by third parties, including business enterprises” (Guiding Principles on Business and Human Rights, 2011).

Overall, these guiding principles focus mainly on indicating that companies should develop human-rights policies by conducting due diligence (Benesch, 2020). Moreover, they instruct businesses to consider other sources, such as the International Bill of Human Rights, the International Labour Organization’s Declaration on Fundamental Principles and Rights at Work, the Universal Declaration of Human Rights (UDHR), and the International Covenant on Economic, Social and Cultural Rights (ICESCR) (Benesch, 2020).

In August 2024, the United Nations Ad Hoc Committee reached an agreement on the “Draft United Nations Convention against Cybercrime; strengthening international cooperation for combating certain crimes committed by means of information and communications technology systems and for the sharing of evidence in electronic form of serious crimes” (United Nations Treaty on Cybercrime Agreed by the Ad Hoc Committee, 2023). This whole process was supported by the Council of Europe, which ensured consistency with the Budapest Convention’s human rights thresholds and rule-of-law safeguards. It has been reported that, throughout the drafting process, various countries sought accession to the Budapest Convention.

The Budapest Convention on Cybercrime completely lacks any reference to the gender element of cybercrimes. The same goes for the Lanzarote Convention on the Protection of Children against Sexual Exploitation and Sexual Abuse (Almenar, 2021). The Istanbul Convention targets aggressions committed against women in real-life scenarios, without directly mentioning offences that happen in cyberspace. Furthermore, in 2021, the case Volodina v. Russia (No. 2), no. 40419/19, allowed the European Court of Human Rights to develop its jurisprudence on cyber violence. The Court affirmed that online violence is a form of violence against women and that member states of the Council of Europe have a positive obligation to protect women from such acts (Volodina v. Russia (No. 2), 2021).

The Committee on the Elimination of Discrimination against Women then picked up the pace and addressed the problem in various general recommendations, the first in 2015 with General Recommendation No. 33, which recognised the role of digital spaces in the empowerment of women (Almenar, 2021). This was then clarified in 2017 with General Recommendation No. 35, which states that the Convention on the Elimination of All Forms of Discrimination against Women applies to technology-mediated environments (Committee on the Elimination of Discrimination against Women (CEDAW), 2017).

  7. European Union legal framework

In May 2024, the European Union issued a directive, i.e., an act that sets out results that member countries must achieve by a set deadline – in this case, 14 June 2027 – aimed at combating violence against women and domestic violence. The directive addresses the crimes that fall under the TFGBV umbrella, underlining the necessity of harmonised definitions of offences and their penalties. Furthermore, the directive requires member states to set up reporting systems for victims and encourages them to promote frameworks for self-regulatory cooperation between intermediary service providers (Directive (EU) 2024/1385, 2024).

Before the drafting and approval of this directive, a plethora of instruments already enshrined legal obligations pertaining to TFGBV. Indeed, three instruments can be taken into consideration when it comes to the liability of platforms: the 2000 e-Commerce Directive, the General Data Protection Regulation (GDPR), and the Audiovisual Media Services Directive. The latter was seen as a cathartic moment for victims of cyber gender harassment, although, to this day, the burden remains on users to track down their private images online and act by themselves, contacting the platforms and requesting the removal of said content. This process, however, is time-consuming and expensive.

The Audiovisual Media Services Directive obliges video-sharing platforms to take measures to protect the public from content whose dissemination constitutes an offence under EU law. In January 2022, the European Parliament approved the Digital Services Act – which amended the e-Commerce Directive – imposing obligations and responsibilities on intermediaries in the E.U. single market, even across borders, while ensuring a threshold level of protection for users, regardless of where they are in the European Union (Europe Fit for the Digital Age, 2020). Furthermore, the GDPR can be used as a remedy for image-based sexual abuse: even though the regulation is not aimed at tackling the crime directly, it enshrines the right to be forgotten in Article 17. The realisation of the right to be forgotten is pivotal for victims’ lives, preventing the destruction of their careers or discriminatory practices against them. It furthermore protects and promotes victims’ privacy and autonomy, creating a shield from perpetrators and abusers (Nguyen, 2022).

  8. Italian legal framework

The “red code” (codice rosso in Italian) is a package of laws aimed at equipping victims of gender-based violence with an up-to-date penal code in Italy. Its adoption was a reaction of the legal system to the dramatic consequences that TFGBV has had on victims’ lives. The bill was introduced three years before its adoption, in 2016, immediately after Tiziana Cantone’s suicide.

Tiziana Cantone was a 30-year-old woman from Naples who gave in to her partner’s request to be filmed while having sex with other men: the couple recorded a total of six videos. In April 2015, these were consensually shared with acquaintances, who then disseminated the videos to other people – without the couple’s consent (Comunello et al., 2022). The videos quickly began to spread over multiple WhatsApp chats and ended up on major pornographic portals – without Cantone’s consent. In a short time, an excerpt of the video – where Tiziana says to the man recording, “You’re making a video,” followed by “bravo” – became a meme-worthy punchline (Reynolds, 2017). Cantone committed suicide after a long struggle with anxiety and depression. Nevertheless, she left a legacy, sparking a discussion about the non-consensual distribution of intimate images and its impact on victims’ lives.

A provision tackling the non-consensual distribution of intimate images was integrated into the Italian penal code in 2019. Art. 612-ter explicitly addresses this crime, providing for the imprisonment of the offender for one to six years and a fine ranging from €5,000 to €15,000. The same penalties apply to whoever receives or acquires the pictures and proceeds to share them without the consent of the people pictured. Furthermore, Art. 612-ter provides that the sentence is aggravated if the crime is committed by the partner – even if the latter is divorced or separated from the victim. Notably, the Italian provision introduces an innovative feature not found in any other legal framework, providing a solution to the problem of secondary distributors. This allows for a useful – and not merely symbolic – application of criminal law, enforcing punishment for serious conduct (Caletti, 2021).

  9. Conclusions

Technology-facilitated gender-based violence is a phenomenon drenched in “human fallacy,” both formal and informal. Thanks to feminist legal theories and criminology, it is clear that there is a structural problem – patriarchy. The latter has deeply shaped legal frameworks, legal codes, and academia, and society is now unravelling the knot. That is done through advocacy, education, and the drafting of international instruments aimed at shaping national legislation: this is why, in the long run, international treaties – read through the lens of feminist-informed legal theories – are the right means to address the phenomenon. A reform of Section 230 of the U.S. Communications Act is needed, although, given the current panorama of American politics, this seems highly improbable. The Republican Party did, however, support the bipartisan Take It Down Act, doing so to protect “Americans’ privacy” (Dial, 2024). As shown multiple times throughout the paper, countries – the European Union’s member states – will eventually adopt the provisions enshrined in the above-mentioned international treaties. Undoubtedly, E.U. Directive 2024/1385 marked a pivotal moment for women in the European Union, and together with the UN draft convention on cybercrime, these instruments will give women real tools to combat their perpetrators – although, clearly, much work still needs to be done on the platforms’ side, especially when it comes to accountability.

Bibliography

Almenar, R. (2021). Cyberviolence against Women and Girls: Gender-based Violence in the Digital Age and Future Challenges as a Consequence of Covid-19. Trento Student Law Review, 3(1), Article 1. https://doi.org/10.15168/tslr.v3i1.757

Benesch, S. (2020). But Facebook’s Not a Country: How to Interpret Human Rights Law for Social Media Companies. Yale Journal on Regulation Bulletin, 38(86), 86–111.

Caletti, G. M. (2021). Can Affirmative Consent Save ‘Revenge Porn’ Laws? Lessons from the Italian Criminalization of Non-Consensual Pornography. Virginia Journal of Law & Technology, 25(30).

Committee on the Elimination of Discrimination against Women (CEDAW). (2017). General Recommendation No. 35 on Gender-Based Violence against Women, Updating General Recommendation No. 19: Committee on the Elimination of Discrimination against Women (No. CEDAW/C/GC/35; Convention on the Elimination of All Forms of Discrimination against Women, pp. 279–305). https://documents.un.org/doc/undoc/gen/n17/231/54/pdf/n1723154.pdf

Comunello, F., Martire, F., & Sabetta, L. (Eds.). (2022). Shameful Traces and Image-Based Sexual Abuse: The Case of Tiziana Cantone. In What People Leave Behind: Marks, Traces, Footprints and their Relevance to Knowledge Society (Vol. 7, pp. 347–359). Springer International Publishing. https://doi.org/10.1007/978-3-031-11756-5

Council of Europe Convention on Preventing and Combating Violence Against Women and Domestic Violence (2011). https://www.coe.int/en/web/gender-matters/council-of-europe-convention-on-preventing-and-combating-violence-against-women-and-domestic-violence

Daly, K., & Chesney-Lind, M. (1988). Feminism and Criminology. JUSTICE QUARTERLY, 5(4), 497–538.

Dial, S. (2024, December 11). Take It Down Act combatting ‘deepfakes’ revenge porn passes U.S. Senate. FOX 4 News Dallas-Fort Worth. https://www.fox4news.com/news/take-down-act-combatting-deepfake-revenge-porn-passes-u-s-senate

Directive (EU) 2024/1385 of the European Parliament and of the Council of 14 May 2024 on Combating Violence against Women and Domestic Violence (2024). https://eur-lex.europa.eu/legal-content/EN/TXT/?uri=OJ%3AL_202401385

Dragiewicz, M., Burgess, J., Matamoros-Fernández, A., Salter, M., Suzor, N. P., Woodlock, D., & Harris, B. (2018). Technology facilitated coercive control: Domestic violence and the competing roles of digital media platforms. Feminist Media Studies, 18(4), 609–625. https://doi.org/10.1080/14680777.2018.1447341

Dunn, S. (2020). Supporting a Safer Internet Paper No. 1: Technology-Facilitated Gender-Based Violence: An Overview. Centre for International Governance Innovation. https://www.cigionline.org/publications/technology-facilitated-gender-based-violence-overview/

Electronic Frontier Foundation. (2015). Manila Principles on Intermediary Liability. https://manilaprinciples.org/principles.html

Europe fit for the Digital Age: Digital platforms (Press Release No. IP/20/2347). (2020). European Commission. https://ec.europa.eu/commission/presscorner/detail/en/ip_20_2347

European Institute for Gender Equality (EIGE). (2022). Combating Cyber Violence against Women and Girls. https://eige.europa.eu/publications-resources/publications/combating-cyber-violence-against-women-and-girls?language_content_entity=en

Guiding Principles on Business and Human Rights, HR/PUB/11/04 35 (2011). https://www.ohchr.org/sites/default/files/Documents/Publications/GuidingPrinciplesBusinessHR_EN.pdf

Holt, K., & Liggett, R. (2020). Revenge Pornography. In T. J. Holt & A. M. Bossler (Eds.), The Palgrave Handbook of International Cybercrime and Cyberdeviance (pp. 1131–1149). Springer International Publishing. https://doi.org/10.1007/978-3-319-78440-3_73

How to report violations of X Rules and Terms of Service. (n.d.). X. Retrieved 8 December 2024, from https://help.x.com/en/rules-and-policies/x-report-violation

Instagram. (n.d.). Instagram. Retrieved 16 November 2024, from https://www.instagram.com/

International Covenant on Civil and Political Rights (1976). https://www.ohchr.org/sites/default/files/ccpr.pdf

Reynolds, J. (2017, February 13). Italy’s Tiziana: Tragedy of a woman destroyed by viral sex videos. BBC News. https://www.bbc.com/news/world-europe-38848528

Marganski, A. J. (2020). Feminist Theories in Criminology and the Application to Cybercrimes. In T. J. Holt & A. M. Bossler (Eds.), The Palgrave Handbook of International Cybercrime and Cyberdeviance (pp. 623–651). Springer International Publishing. https://doi.org/10.1007/978-3-319-78440-3_28

McGlynn, C., & Johnson, K. (2021). Criminalising Cyberflashing: Options for Law Reform. The Journal of Criminal Law, 85(3), 171–188. https://doi.org/10.1177/0022018320972306

Patchin, J. W., & Hinduja, S. (2020). Sextortion Among Adolescents: Results From a National Survey of U.S. Youth. Sexual Abuse, 32(1), 30–54. https://doi.org/10.1177/1079063218800469

Reporting. (n.d.). TikTok. Retrieved 16 November 2024, from https://www.tiktok.com/safety/en/reporting

Rules & Reporting. (n.d.). Reddit. Retrieved 3 December 2024, from https://support.reddithelp.com/hc/en-us/categories/360003247491-Rules-Reporting

S.4569 – Take It Down Act, No. S.4569 (2023). https://www.congress.gov/bill/118th-congress/senate-bill/4569/text

Sandberg, S., & Ugelvik, T. (2016). The past, present, and future of narrative criminology: A review and an invitation. Crime, Media, Culture, 12(2), 129–136. https://doi.org/10.1177/1741659016663558

Section 230, Communications Decency Act (part of Title V of the Telecommunications Act of 1996), 47 U.S. Code § 230 (1996). https://www.eff.org/issues/cda230

Suzor, N., Dragiewicz, M., Harris, B., Gillett, R., Burgess, J., & Van Geelen, T. (2019). Human Rights by Design: The Responsibilities of Social Media Platforms to Address Gender‐Based Violence Online. Policy & Internet, 11(1), 84–103. https://doi.org/10.1002/poi3.185

Nguyen, T. (2022). European ‘right to be forgotten’ as a remedy for image-based sexual abuse: A critical review. KnowEx Social Sciences Journal, 2(1), 59–72. https://doi.org/10.17501/27059901.2021.2105

Tripathia, R. (2022). Social Media Impersonation. Jus Corpus Law Journal, 2(4), 361–373.

United Nations. (2018). Report of the Special Rapporteur on violence against women, its causes and consequences on online violence against women and girls from a human rights perspective (Report of the Special Rapporteur No. A/HRC/38/47). United Nations General Assembly. https://documents.un.org/doc/undoc/gen/g18/184/58/pdf/g1818458.pdf

United Nations treaty on cybercrime agreed by the Ad Hoc Committee. (2023, December). Council Of Europe. https://www.coe.int/en/web/cybercrime/-/united-nations-treaty-on-cybercrime-agreed-by-the-ad-hoc-committee

User guidance for the EU Digital Services Act. (n.d.). Telegram. Retrieved 3 December 2024, from https://telegram.org/tos/eu-dsa

Vaughan, C., Bergman, S., Robinson, A., & Mikkelson, S. (2023). Measuring technology-facilitated gender-based violence (p. 18). University of Melbourne – United Nations Population Fund. https://www.unfpa.org/sites/default/files/pub-pdf/UNFPA_Measuring%20TF%20GBV_%20A%20Discussion%20Paper_FINAL.pdf

Volodina v. Russia (No. 2), No. 40419/19 (ECtHR 14 September 2021). https://hudoc.echr.coe.int/fre?i=001-211794
