Sharesome Transparency Report
Quarter 1 - 2024
Enforcing our Community & Content Guidelines
1. Content on Sharesome
Sharesome maintains a dedicated content moderation team that enforces our Community & Content Guidelines. Before any content appears on Sharesome, it is scanned against whitelists and reviewed with state-of-the-art digital technologies to check whether it is allowed on the platform.
Human moderators and admins manually review all content that passes this initial check within 24 hours. We also allow users to report content, accounts, and topics, and we review every user report.
This Quarter
- Posts reported by users: 1,573
- Posts deleted after being reported by users: 496
- Posts reported by automated tools: 4,482
- Posts deleted after being reported by automated tools: 1,832
- Total amount of content posted: 2,047,323
- Posts deleted after review by moderators and admins: 1,105
- Total amount of content deleted: 3,433
2. Creator Accounts
Sharesome reviews all applications for creator accounts on Sharesome using a combination of whitelist scans and human moderation teams to verify the age and identity of creators.
This Quarter
- Verified star requests: 252
- Verified stars approved: 67
- Verified user requests: 1,149
- Verified users approved: 473
- Verified brand requests: 321
- Verified brands approved: 7
3. Accounts on Sharesome
We check both creator and regular user accounts after an account is opened. We proactively re-check accounts using both technology and human intelligence, and we investigate any concerns from our community or other interested third parties like law enforcement, commercial partners, or non-governmental organizations.
This Quarter
- Active user accounts: 47,676
- Accounts deleted by admins: 1,073
- Accounts suspended by automated tools: 1,139
- Accounts reinstated after review by admins: 88
- Total amount of deleted accounts: 2,124
4. Combatting CSAM
Sharesome is combatting the creation and distribution of child sexual abuse materials (CSAM). CSAM is any image or video of sexually explicit conduct, including nudity, involving a person less than 18 years old. These images amount to child sexual abuse and exploitation.
Our proactive prevention and detection efforts include both automated and human review. Additionally, we respond immediately to reports submitted by our users and by third parties such as NGOs to detect, remove, and report suspected CSAM on our platform.
We deployed proprietary state-of-the-art Age-ID technology to discover never-before-seen CSAM, which our admins then reviewed and confirmed.
We also use the Hash Check Service from EOKM, an instant image identifier tool designed to quickly and easily identify online CSAM for removal.
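At its core, a hash check service compares the fingerprint of an uploaded file against a database of fingerprints of known illegal material. The following is a minimal illustrative sketch of that matching step; the function names, the example blocklist, and the use of SHA-256 are assumptions for illustration only (the actual EOKM Hash Check Service has its own API and may use different hashing, such as perceptual hashes).

```python
import hashlib

# Hypothetical blocklist of hashes of known prohibited material.
# (The entry below is simply the SHA-256 of the bytes b"test",
# used so the example is runnable; real services are fed curated
# hash lists maintained by organizations like EOKM.)
KNOWN_HASHES = {
    "9f86d081884c7d659a2feaa0c55ad015a3bf4f1b2b0b822cd15d6c15b0f00a08",
}

def sha256_of(data: bytes) -> str:
    """Compute the SHA-256 fingerprint of an uploaded file's bytes."""
    return hashlib.sha256(data).hexdigest()

def matches_blocklist(upload: bytes) -> bool:
    """Return True if the upload's hash appears in the blocklist,
    flagging it for removal and reporting."""
    return sha256_of(upload) in KNOWN_HASHES
```

Exact-hash matching only catches byte-identical copies; production systems typically also use perceptual hashing so that resized or re-encoded copies of known material are still detected.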
This Quarter
- Pieces of suspect media removed: 901
- Pieces of suspect media reported: 51
5. Assisting Law Enforcement
We are ready to cooperate with law enforcement globally but have not received a legal request in Q1-2024.
6. Takedown Requests
Sharesome responds promptly to copyright takedown notifications. Under our repeat infringer policy, the uploader is notified or, where warranted, banned.
Everyone can use our Content Removal Form to send a takedown notification.
In Q1 2024 we processed 128 copyright takedown requests and deleted the content accordingly.