Here’s how blockchain could help crack down on abusive images

  • Blockchain could be an effective and efficient solution to help rid the internet of abusive images.
  • Removing abusive images from circulation can help victims heal.

Last month, India was shaken to its core by an alleged gang rape of a 15-year-old girl, which the perpetrators reportedly filmed and shared online. Unfortunately, this type of crime and its documentation occur in every country. For survivors of sexual violence, knowing that images of their ordeal exist and circulate online can cause even greater emotional damage. The images can also be used to blackmail and silence the victim.

Cloud storage platforms and social media networks, where photos and videos are stored every time they are posted or shared, typically don’t address this issue, citing respect for user privacy. Technological developments, however, can give cloud platforms a way to remove illegal images while limiting user privacy concerns.


Most people who share and view offensive images do so on completely legal and popular social media platforms and through regular messaging services. Research from the National Center for Missing and Exploited Children measured the growth of child sexual abuse images on the internet. The results are horrendous: a frightening increase from 3,000 reports of such images in 1998, to 1.0 million in 2014 and 18.4 million in 2018. Social media provides both viewer participation and ample storage, which means that large files, such as videos, can be stored and shared easily. So the key to cutting off the supply of such content is to remove it from the cloud.

A technological solution
The technology exists, however, to help cloud storage platforms remove all illegal and abusive images from their databases and prevent new ones from being added. This solution combines both blockchain technology and a technology called PhotoDNA, which was developed by Microsoft and Dartmouth College. PhotoDNA creates a unique fingerprint of a digital image or video. This fingerprint mostly remains the same even if the image is cropped, resized, edited with filters, or manipulated in some other way. Importantly, while it is easy to generate a fingerprint from an image, it is impossible to decode the image from a fingerprint. There is therefore no risk that fingerprints will be misused to secretly disseminate images. In the case of images of sexual exploitation, this is of particular importance.
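PhotoDNA’s actual algorithm is proprietary, but the core idea of a robust perceptual fingerprint can be illustrated with a much simpler toy: an “average hash”, which records only whether each pixel is brighter than the image’s mean. This sketch is purely illustrative and far weaker than PhotoDNA; the image data below is made up.

```python
# Toy perceptual "average hash" -- illustrative only. PhotoDNA's real
# algorithm is proprietary and far more robust than this sketch.

def average_hash(pixels):
    """Fingerprint a small grayscale image (a list of rows of 0-255 ints).

    Each bit records whether a pixel is brighter than the image's mean,
    so mild edits (brightness shifts, light noise) flip few or no bits.
    """
    flat = [p for row in pixels for p in row]
    mean = sum(flat) / len(flat)
    return "".join("1" if p > mean else "0" for p in flat)

def hamming_distance(h1, h2):
    """Count differing bits: a small distance means 'probably the same image'."""
    return sum(a != b for a, b in zip(h1, h2))

# Example: a 4x4 image and a uniformly brightened copy hash identically,
# while reversing the hash tells you nothing about the pixels themselves.
original = [[10, 10, 200, 200],
            [10, 10, 200, 200],
            [200, 200, 10, 10],
            [200, 200, 10, 10]]
brightened = [[p + 20 for p in row] for row in original]

print(hamming_distance(average_hash(original), average_hash(brightened)))  # 0
```

Note how the one-way property the article relies on falls out naturally: the hash discards almost all pixel information, so the image cannot be reconstructed from it.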

Blockchain is a decentralized, distributed database that allows multiple parties who don’t necessarily trust each other to create a trusted source of truth to share, update and work with. Unlike a centralized database, no single party owns or controls it; every participant holds an identical copy.

Some law enforcement agencies already have databases of child sexual exploitation images and their fingerprints; however, they usually don’t share them with anyone. What they lack is a way to coordinate their activities, both locally and globally, to remove these images regardless of jurisdiction. This is where blockchain technology is a game changer. It can enable a coordinated global database, accessible to all yet owned by no one, used to scan for and remove images wherever in the world they are published. Using both technologies, agencies could share the fingerprints they have already collected as part of a joint global effort. Each agency could then access this information, even if it has no fingerprints of its own to contribute.
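A minimal in-memory model can make the shared-database idea concrete: each entry records a fingerprint, the contributing agency, and a hash chaining it to the previous entry, so history cannot be silently rewritten. The class, agency names, and fingerprint values below are all hypothetical; a real deployment would replicate the ledger across many nodes with a consensus protocol.

```python
import hashlib

class FingerprintLedger:
    """Sketch of an append-only, attributed fingerprint registry.

    This is a single-process illustration of the data model only -- a real
    blockchain adds replication and consensus across mutually untrusting nodes.
    """

    def __init__(self):
        self.entries = []  # list of (fingerprint, agency, entry_hash)

    def add(self, fingerprint, agency):
        """Append a fingerprint, hash-chained to the previous entry so any
        later tampering with history would break the chain."""
        prev = self.entries[-1][2] if self.entries else "genesis"
        entry_hash = hashlib.sha256(
            f"{prev}|{fingerprint}|{agency}".encode()
        ).hexdigest()
        self.entries.append((fingerprint, agency, entry_hash))

    def contains(self, fingerprint):
        """Read-only lookup, as a cloud platform would use it."""
        return any(fp == fingerprint for fp, _, _ in self.entries)

ledger = FingerprintLedger()
ledger.add("a3f1...", "Agency-A")  # placeholder fingerprint values
ledger.add("b77c...", "Agency-B")
print(ledger.contains("a3f1..."))  # True
```

Note that the ledger stores only fingerprints and attribution, never images, which is exactly why it carries so little risk if exposed.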

Because it is based on blockchain, the database is secure by design. In our particular use case, there is also no incentive for hackers to attack it: the database contains no images, only image fingerprints. As explained above, a fingerprint is meaningless on its own; it cannot be used to recreate the image, nor can it be linked back to the original. Its only value here is to offer a way to label or classify images without viewing or sharing them.

Cloud storage platforms would have read-only access to this blockchain database of illegal image fingerprints. Generating a fingerprint of an image is quick and easy, and so is looking a fingerprint up in the database. Cloud platforms could therefore generate a fingerprint of any uploaded image and check whether it exists in the database. The same process would allow them to scan existing archives for illegal content that had already been uploaded. Local law enforcement would provide instructions on the next steps once an image has been identified as illegal. The following diagram summarizes the screening procedure:

Image: Detection of illegal images from a user’s point of view. (Source: Orbs)
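The screening step in the diagram reduces to a simple decision: fingerprint the upload, check the shared database, and either store the file or reject it and notify law enforcement. The function and action names below are illustrative, not a real platform API.

```python
def screen_upload(image_fingerprint, illegal_fingerprints):
    """Sketch of the upload-screening decision (names are hypothetical).

    A platform would compute the fingerprint with a PhotoDNA-style hash,
    then consult its read-only view of the shared fingerprint database.
    """
    if image_fingerprint in illegal_fingerprints:
        # The required follow-up (reporting, preservation of evidence)
        # would come from local law enforcement in each jurisdiction.
        return "reject_and_report"
    return "store"

known_illegal = {"a3f1...", "b77c..."}  # placeholder fingerprint values

print(screen_upload("a3f1...", known_illegal))  # reject_and_report
print(screen_upload("c9d2...", known_illegal))  # store
```

The same check, run over archived data instead of new uploads, covers the retroactive scan described above.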

Importantly, the whole process will be completely transparent, as it will be recorded on the blockchain, making any new information added to the chain visible to all participants. Law enforcement will be able to see who added which fingerprints. Cloud platforms will know exactly which fingerprints they need to check in which jurisdictions. Law enforcement agencies can easily verify whether platforms are enforcing their restrictions correctly. This transparency alone is one of the main reasons to use a blockchain.

To be sure, no solution is perfect, and this one alone cannot eradicate images of exploitation. Users would most likely object to their content being screened, let alone tagged or removed. And once illegal content is discovered on a cloud platform, the onus could be on that platform to take the next step and coordinate with law enforcement, opening up a broader debate on free speech and censorship.

Furthermore, technology cannot stop highly determined perpetrators. These people will continue to post offensive pictures of children in hidden corners of the internet, also known as darknets. There will never be a single solution that removes all sexual images of children from the internet.

However, while we cannot eliminate the problem, we can significantly reduce it. Perpetrators who are highly technologically savvy will need to be tracked down and stopped in other ways. But for the majority of online users, who lack the skills to access the darknet, we can and must remove these images quickly and easily. Importantly, because the blockchain is immutable, every transaction on it is documented forever. This means that if someone tried to upload illegal images, was rejected, and then deleted those images from their own systems, the blockchain would still hold a record of the attempted upload. This, I imagine, could be a very useful tool for law enforcement.

As a co-founder of Orbs, the largest blockchain infrastructure company in Israel, it was important for me to use our technology for good and promote blockchain for social impact. As a result, I founded the Hexa Foundation, a non-profit organization that promotes the use of blockchain technology for social impact. At the Foundation we focus on educating governments about the potential added value and efficiency of blockchain and running blockchain projects that create social impact. We are currently in talks with governments and law enforcement agencies to explore the opportunity to use blockchain as a solution that will help cloud platforms remove these images.

The change will not be easy. As with many solutions that require coordination, the hardest part is aligning the interests and policies of law enforcement agencies globally. There is reason for optimism here: since no one claims ownership over this database and it is shared among all players, the incentive to participate should be greater than in other consortium efforts, where politics can sometimes hinder progress. Still, at least one major agency would have to lead the way. Some of the agencies we are in contact with are showing great interest, and we hope they are willing to take the lead. Wide adoption, however, will take time. It will also take a change of mindset and new training to ensure that a broader set of law enforcement agencies understand the role new technologies such as blockchain could play in crime prevention.

Blockchain enables coordination between agencies in a way that was not possible before, and thus creates an effective and efficient solution, one that I hope will help victims heal, globally.

Offensive images of children are in themselves a form of abuse. Anyone who shares or views such images becomes an accomplice to the original abuse and further harms the victim. There have been many reports of survivors haunted by the continued circulation of images recorded years ago. It can make it extremely difficult, if not impossible, for them to put their suffering and trauma in the past; instead, their ordeal continues. By removing even part of the problem, we can have a positive impact today.
