Moderation Policy
Document reference: TDOLLS-MOD-v1.0
Issued by: Corestack Digital Ltd
Last updated: March 2026
1. Purpose and Scope
1.1 What This Policy Covers
This Moderation Policy explains how Corestack Digital Ltd reviews, assesses, and takes action in relation to content and conduct on the TDolls platform. It covers:
- How content is detected and flagged for review
- How moderation decisions are made
- What actions we may take and in what circumstances
- How and when users are notified of moderation actions
- How users may appeal moderation decisions
- Our obligations under applicable digital services legislation
1.2 Who This Policy Applies To
This policy applies to all users of the Platform — Service Providers, Service Users (Clients), and unregistered visitors who submit reports — and to all content and conduct on the Platform including Proxima by TDolls.
1.3 Relationship to Other Documents
This policy should be read alongside:
- Content Policy — which defines what content is and is not permitted
- General Platform Terms — which set out general prohibited conduct
- Complaints Policy — which governs complaints about our moderation decisions
- Service Provider Terms of Service and Service User Terms of Service — which set out account-level consequences for breaches
2. Our Approach to Moderation
2.1 Principles
Our moderation approach is guided by the following principles:
- Proportionality — the action we take is proportionate to the severity and nature of the breach
- Consistency — we apply this policy consistently across all users and content types
- Transparency — we tell users what action we have taken and why, where legally permissible
- Fairness — users have the opportunity to appeal decisions they believe are incorrect
- Speed — we prioritise serious and illegal content for urgent review and act promptly on reports involving potential harm
2.2 Intermediary Status
We are an intermediary platform. We do not produce the content that appears on the Platform and are not liable as a publisher for user-generated content, provided that we act expeditiously upon becoming aware of illegal or policy-violating content. This policy reflects our commitment to fulfilling that responsibility.
2.3 DSA Compliance
Where the Platform is accessed by users in the European Economic Area, we operate in a manner consistent with the EU Digital Services Act (DSA), including the requirements of Article 17 (statement of reasons), Article 20 (internal complaint-handling), and Article 21 (out-of-court dispute settlement). Users in the EEA have additional rights described in Sections 7 and 8.
3. How Content Is Detected
3.1 Three Detection Routes
Content and conduct that may breach our Content Policy or the General Platform Terms are identified through three routes:
Route 1 — User and third-party reports
Any user or visitor may report content or conduct via the reporting tools available on each profile and content page, or by emailing moderation@tdolls.net. Reports may also be submitted by third parties — including law enforcement agencies, regulatory bodies, and non-governmental organisations — via the same channels.
Route 2 — Automated detection
We use automated tools to scan uploaded content for known illegal material, including child sexual abuse material (CSAM), using industry-standard hash-matching technology. Automated tools may also flag content based on other signals including metadata, upload patterns, and behavioural indicators associated with policy violations.
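By way of illustration only, the sketch below shows the basic shape of hash-matching detection. Everything in it is hypothetical: the hash list, the function name, and the use of SHA-256 are simplifications, and production systems rely on perceptual-hash databases supplied by bodies such as NCMEC and the IWF rather than plain cryptographic hashes.

```python
import hashlib

# Placeholder entries standing in for hashes of known illegal material,
# as would be supplied by an industry hash-sharing programme.
KNOWN_ILLEGAL_HASHES: set[str] = {"0" * 64}

def matches_known_material(file_bytes: bytes) -> bool:
    """Return True if the upload's hash appears in the known-material list.

    Production systems use perceptual hashing (e.g. PhotoDNA or PDQ),
    which survives resizing and re-encoding; a plain SHA-256 match is
    used here only to keep the sketch self-contained.
    """
    digest = hashlib.sha256(file_bytes).hexdigest()
    return digest in KNOWN_ILLEGAL_HASHES

# Scan an upload before it is published.
if matches_known_material(b"uploaded file contents"):
    print("Match: block upload and escalate for urgent review")
```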
Route 3 — Proactive human review
Our moderation team conducts proactive review of content at key points including during Service Provider onboarding and verification, following account escalations, and on a sample basis across the Platform. Proactive review is weighted toward higher-risk content categories and newly registered accounts.
3.2 Reporting Tools
The Platform provides in-context reporting tools on every profile page, content item, and message thread. Reports can be categorised by the reporter as relating to:
- Illegal content (including CSAM, trafficking-related content, or non-consensual imagery)
- Content involving a minor
- Harassment or threatening conduct
- Fake or misleading profile
- Copyright infringement
- Other policy violation
Reports categorised as involving illegal content or minors are treated as urgent and reviewed as a priority. See Section 5.2.
3.3 Law Enforcement and Authority Referrals
We cooperate with law enforcement agencies and regulatory authorities. Where we receive a valid legal request — including a court order, warrant, or notice from a competent authority — to review, remove, or preserve specific content or account data, we will comply in accordance with our Law Enforcement Request Policy.
4. Moderation Review Process
4.1 Triage
All flagged content enters a triage process on receipt. During triage, a moderator or automated system assesses:
- The nature and category of the potential violation
- Whether the content requires urgent action (see Section 5.2)
- Whether additional context or information is needed before a decision can be made
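As a simplified illustration of the triage step, the sketch below routes a report to a queue based on its category. The queue names, dataclass, and function are hypothetical and are not part of this policy; only the category handling mirrors Sections 3.2, 4.1, and 5.2.

```python
from dataclasses import dataclass

# Report categories follow Section 3.2; the queue names are illustrative.
URGENT_CATEGORIES = {"illegal_content", "content_involving_minor"}

@dataclass
class Report:
    report_id: str
    category: str
    needs_more_context: bool = False

def triage(report: Report) -> str:
    """Assign a flagged item to a review queue, mirroring Section 4.1."""
    if report.category in URGENT_CATEGORIES:
        # Urgent items go to a senior moderator within one hour (Section 5.2).
        return "urgent_senior_review"
    if report.needs_more_context:
        # Additional information is gathered before a decision is made.
        return "pending_information"
    return "standard_review"

print(triage(Report("r-1", "content_involving_minor")))  # urgent_senior_review
print(triage(Report("r-2", "harassment")))               # standard_review
```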
4.2 Review
Following triage, content is reviewed by a trained moderator. The moderator assesses the content against the Content Policy and the relevant provisions of the General Platform Terms, taking into account:
- The content itself and its context on the Platform
- The account history of the user who uploaded the content
- Any information provided by the reporter
- Any response or context provided by the account holder, where this has been sought
- Applicable legal requirements
4.3 Decision
Following review, the moderator makes one of the following decisions:
- No action — the content does not breach our policies and no action is required
- Content removal — the specific item is removed or hidden
- Content restriction — the item's visibility is restricted (for example, reclassified from public to private)
- Warning issued — the account holder is warned that their content or conduct is in breach and that further breaches may result in escalated action
- Account restriction — the account holder's ability to upload, message, or perform other actions is temporarily restricted
- Account suspension — the account is suspended pending further investigation or pending the outcome of an appeal
- Account termination — the account is permanently closed
- Referral to authorities — the matter is reported to relevant law enforcement or regulatory authorities
More than one action may be taken simultaneously. For example, content may be removed and the account suspended, or a referral made to authorities while the account is terminated.
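For illustration, the possible outcomes can be modelled as an enumeration, with a decision recorded as a set of actions to reflect that several may be taken at once. The type and value names below are hypothetical, not part of this policy.

```python
from enum import Enum

class ModerationAction(Enum):
    NO_ACTION = "no_action"
    CONTENT_REMOVAL = "content_removal"
    CONTENT_RESTRICTION = "content_restriction"
    WARNING = "warning"
    ACCOUNT_RESTRICTION = "account_restriction"
    ACCOUNT_SUSPENSION = "account_suspension"
    ACCOUNT_TERMINATION = "account_termination"
    REFERRAL_TO_AUTHORITIES = "referral_to_authorities"

# A decision records a *set* of actions, because more than one action
# may be taken simultaneously (Section 4.3).
decision: set[ModerationAction] = {
    ModerationAction.CONTENT_REMOVAL,
    ModerationAction.ACCOUNT_SUSPENSION,
}
print(sorted(action.value for action in decision))
```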
4.4 Moderation Timelines
We aim to complete moderation reviews within the following timescales:
| Report Category | Target Review Time |
|---|---|
| Illegal content (CSAM, trafficking, NCII) | Within 1 hour of receipt |
| Content involving suspected minor | Within 1 hour of receipt |
| Harassment or threatening conduct | Within 24 hours |
| Fake or misleading profile | Within 72 hours |
| Copyright infringement | Within 5 business days |
| Other policy violation | Within 5 business days |
These are target timescales. We will always prioritise urgent and illegal content over lower-priority reports. Volume of reports, complexity of the case, and the need for additional information may affect actual timescales.
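As an illustration of how the targets above translate into review deadlines, the sketch below computes a target deadline from a report's category and time of receipt. The category keys and function are hypothetical, and business days are simplified to calendar days.

```python
from datetime import datetime, timedelta, timezone

# Target review times from the Section 4.4 table. These are targets,
# not guarantees; "business days" are treated as calendar days here.
TARGET_REVIEW_TIME = {
    "illegal_content": timedelta(hours=1),
    "content_involving_minor": timedelta(hours=1),
    "harassment": timedelta(hours=24),
    "fake_profile": timedelta(hours=72),
    "copyright": timedelta(days=5),  # 5 business days in the policy
    "other": timedelta(days=5),      # 5 business days in the policy
}

def review_deadline(category: str, received_at: datetime) -> datetime:
    """Return the target review deadline for a report."""
    return received_at + TARGET_REVIEW_TIME[category]

received = datetime(2026, 3, 1, 12, 0, tzinfo=timezone.utc)
print(review_deadline("illegal_content", received))  # 2026-03-01 13:00:00+00:00
```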
4.5 Moderation Records
We maintain a record of all moderation decisions, including the content reviewed, the decision reached, the reasons for the decision, and the action taken. These records are retained for three years from the date of the decision. Records relating to illegal content referrals are retained for seven years.
5. Urgent and Illegal Content
5.1 Zero Tolerance Content
The following content categories are treated as zero tolerance. On identification — whether through a report, automated detection, or proactive review — we will act immediately without prior warning to the account holder:
- Child sexual abuse material (CSAM) or any content depicting a minor in a sexual context
- Non-consensual intimate imagery (NCII / revenge pornography)
- Content that directly facilitates human trafficking or sexual exploitation
- Content depicting real violence or torture
Action on zero tolerance content includes immediate removal, immediate account termination, and mandatory referral to relevant authorities, including the National Center for Missing & Exploited Children (NCMEC) where required, the Internet Watch Foundation (IWF), the Royal Gibraltar Police, and any other relevant law enforcement agency.
We do not provide prior notice to account holders before taking action on zero tolerance content. We do not provide an internal appeal route for account terminations resulting from zero tolerance content breaches. These matters are referred directly to the relevant authorities.
5.2 Urgent Review Escalation
Reports categorised as involving illegal content or a minor are escalated immediately to a senior moderator and reviewed within one hour. The account associated with the reported content may be suspended automatically while the urgent review is conducted.
5.3 Mandatory Reporting Obligations
We are subject to mandatory reporting obligations in respect of certain illegal content under Gibraltar law and, where applicable, the laws of other jurisdictions. In particular:
- We are required to report CSAM to relevant authorities including law enforcement and designated bodies
- We cooperate with Europol, the Internet Watch Foundation, and other bodies working to combat online child exploitation
- We maintain records of all mandatory reports in accordance with legal requirements
6. Notice to Users
6.1 When We Notify Users
Where we take action against a user's content or account, we will notify the affected user by email within a reasonable time of the action being taken, subject to the exceptions in Section 6.2.
Our notification will include:
- A description of the content or conduct that was the subject of the action
- The specific policy provision or legal basis for the action
- The action taken
- Information about how to appeal the decision
- The timeframe for submitting an appeal
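For illustration, the notification contents above can be modelled as a simple record. The structure below is hypothetical; Article 17 of the EU DSA and Section 6.1 specify what the notice must contain, not how it is represented.

```python
from dataclasses import dataclass
from datetime import date

@dataclass
class StatementOfReasons:
    """Fields mirroring the notification contents listed in Section 6.1."""
    content_description: str
    policy_or_legal_basis: str
    action_taken: str
    appeal_instructions: str
    appeal_deadline: date

notice = StatementOfReasons(
    content_description="Profile photo uploaded on 2026-02-10",
    policy_or_legal_basis="Content Policy, provision on misleading profiles",
    action_taken="content_removal",
    appeal_instructions="Email appeals@tdolls.net within 28 days",
    appeal_deadline=date(2026, 3, 12),
)
print(notice.action_taken)
```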
6.2 Exceptions to Notification
We will not notify a user of a moderation action in the following circumstances:
- Where the action relates to zero tolerance content (Section 5.1), in which case notification is not given for CSAM or related content
- Where we are legally prohibited from disclosing the action or its basis, including where disclosure would prejudice a law enforcement investigation
- Where notification would tip off a user who is the subject of an active investigation
In these cases the account will be terminated or suspended without prior notice or explanation beyond what is legally required.
6.3 Reporter Notification
Where you have submitted a report, we will acknowledge receipt and notify you of the outcome of our review to the extent possible without disclosing information about other users or ongoing investigations. We aim to notify reporters of outcomes within the timeframes in Section 4.4.
7. Appeals
7.1 Right to Appeal
Any user who has had content removed, or had their account restricted, suspended, or terminated as a result of a moderation decision, has the right to appeal that decision through our internal appeals process, subject to the exceptions in Section 7.2.
7.2 Exceptions
There is no internal appeal right for:
- Account terminations resulting from zero tolerance content breaches (Section 5.1)
- Actions taken pursuant to a valid court order, warrant, or direction from a competent authority
In these cases the appropriate route is through the relevant legal or regulatory process, not through our internal appeals process.
7.3 How to Submit an Appeal
Appeals must be submitted within 28 days of the date of the moderation notification. To submit an appeal:
- Email appeals@tdolls.net with the subject line "Moderation Appeal — [your account email]"
- Include your account details, the date of the notification, and a clear explanation of why you believe the decision was incorrect
- Include any supporting evidence you wish us to consider
7.4 Appeal Review
Appeals are reviewed by a senior moderator who was not involved in the original decision. The reviewer will consider:
- The original moderation decision and the reasons for it
- The grounds of your appeal
- Any new or additional evidence you have provided
We aim to complete appeal reviews within 14 days of receipt of a complete appeal submission. Where an appeal requires additional time due to complexity, we will notify you and provide an updated timeframe.
7.5 Appeal Outcomes
Following an appeal review, we will notify you of one of the following outcomes:
- Appeal upheld — the original decision was incorrect; the action is reversed and content restored or account reinstated
- Appeal partially upheld — the original decision was partially incorrect; a modified action is substituted
- Appeal dismissed — the original decision was correct and is maintained
Our decision on appeal is final for the purposes of our internal process. Where you remain dissatisfied following an appeal, you may pursue the matter through our Complaints Policy or through the external routes described in Section 8.
8. External Routes and DSA Rights
8.1 Complaints Policy
If you are dissatisfied with a moderation decision or the way your appeal was handled, you may submit a formal complaint under our Complaints Policy. Complaints are handled separately from the moderation appeals process.
8.2 EU DSA — Out-of-Court Dispute Settlement
If you are a user based in the European Economic Area and are dissatisfied with the outcome of an internal appeal relating to a moderation decision, you have the right under Article 21 of the EU Digital Services Act to refer the matter to a certified out-of-court dispute settlement body. We will cooperate in good faith with any such process.
Details of certified dispute settlement bodies will be published at tdolls.net/dsa when available.
8.3 Regulatory Complaints
You may also lodge a complaint with a relevant regulatory authority. For data protection matters related to moderation, you may contact the Gibraltar Regulatory Authority (GRA) at privacy@gra.gi. For DSA-related matters involving EEA users, you may contact the relevant Digital Services Coordinator in your country of residence.
9. Transparency
9.1 Transparency Reporting
We are committed to transparency about how we moderate the Platform. Where required by applicable law, including the DSA, we will publish periodic transparency reports setting out:
- The volume of reports received by category
- The volume of content removed or restricted by category
- The volume of accounts actioned by type of action
- The volume of appeals received and their outcomes
- Mandatory referrals made to law enforcement or authorities
Transparency reports will be published at tdolls.net/transparency.
9.2 Statement of Reasons
Where we remove or restrict content, or suspend or terminate an account, we provide a statement of reasons as described in Section 6.1. This statement is recorded internally and made available to you. This reflects the requirement under Article 17 of the EU DSA and equivalent good practice standards.
10. Changes to This Policy
We may update this Moderation Policy from time to time in response to changes in applicable law, regulatory guidance, or platform requirements. When we make material changes we will:
- Update the "Last updated" date and version number
- Notify registered users by email
- Display a notice on the Platform
The current version of this policy is always available at tdolls.net/moderation, tdolls.eu/moderation, and tdolls.uk/moderation.
11. Contact
For content reports:
Email: moderation@tdolls.net
In-Platform: Use the report button on any profile or content item
For moderation appeals:
Email: appeals@tdolls.net
For safeguarding and illegal content:
Email: safeguarding@tdolls.net
Corestack Digital Ltd
[Registered address, Gibraltar]
[Gibraltar company registration number]
This document is version TDOLLS-MOD-v1.0.