Baaed FREE SEO Suite maintains a small surface of user-submitted content (saved tool reports, account profiles, support correspondence, and AI-tool outputs that you choose to retain). This Policy describes the rules content must follow, how we moderate, and how anyone can report violations. It also satisfies our notice-and-action duty under Article 16 of the EU Digital Services Act (DSA).
1. Scope of User Content
The following are "user content" under this Policy:
- Saved tool reports (e.g. Website Reviewer audits, YouTube Summary records).
- Account profile data (display name, profile picture).
- Tool inputs you submit and that we cache or store at your request.
- AI-tool outputs you save to your account.
- Public pages you have asked us to host on your behalf (where any such feature exists).
Tool inputs that are processed in real time and then discarded are not "user content" under this Policy; they are governed by our Privacy Policy.
2. Content Standards
The standards in this section apply to all user content. The most frequently violated standards are summarised here; the complete list is in our Acceptable Use Policy.
- Illegal content. Content that violates the law of any jurisdiction relevant to its creation, hosting, or access — including but not limited to content that infringes intellectual property, violates privacy or data-protection law, or constitutes child sexual abuse material.
- CSAM. Zero tolerance. Suspected child sexual abuse material is reported to the U.S. National Center for Missing & Exploited Children (NCMEC) under 18 U.S.C. § 2258A and to local law enforcement, and the offending account is permanently terminated.
- Harmful or harassing content. Content that targets a person or group with hatred, threats, harassment, or that incites violence on the basis of a protected characteristic.
- Deceptive or fraudulent content. Phishing, impersonation, fake reviews, fabricated documents, election disinformation, deepfake imagery of identifiable persons.
- Malware and abuse. Code or files designed to compromise systems, distribute malware, or evade safety mechanisms.
- Privacy violations. Doxxing, non-consensual intimate imagery, or processing personal data of others without a lawful basis.
- Regulated content where you lack authorisation. Counterfeit goods, unauthorised pharmaceuticals, weapons, regulated financial products.
3. How We Moderate
- Reactive moderation. We act on reports we receive (Section 4). This is our primary moderation model.
- Automated flags. Some categories of content (e.g. CSAM via PhotoDNA-style hashing in providers we use, blocked-word lists applied to AI-tool prompts) are flagged automatically before the content reaches any person.
- Spot checks. We perform periodic sampling of saved reports and AI-tool outputs to identify systemic issues.
- Appeals. If your content was removed in error, you can appeal — see Section 7.
Apart from the automated flags described above, we do not review content before publication. Saved reports and other user content are processed and made available immediately on submission.
4. How to Report Content (DSA Art. 16 Notice and Action)
Anyone — user, non-user, rights-holder, or authority — may submit a notice of content that is allegedly illegal or in breach of this Policy. Send an email to info@baaed.com with the subject line "Content Report" and include:
- The exact URL of the content concerned (or other information sufficient to identify it precisely).
- An explanation of why you consider the content to be illegal or violating this Policy, including the specific law or rule infringed.
- Your contact details (name, email). Anonymous reports are accepted but we may be unable to follow up for clarification.
- A statement of good faith belief that the information in the notice is accurate and complete.
For copyright complaints, please use our dedicated DMCA Notice & Takedown Policy instead, which has additional requirements under 17 U.S.C. § 512(c)(3).
5. What Happens After You Report
- Acknowledgement within 2 business days.
- Review by a human moderator.
- Decision — we may take any of the following actions:
  - Remove or disable access to the content.
  - Demote or restrict the content's visibility.
  - Terminate or suspend the offending user's account.
  - Refer the matter to law enforcement.
  - Decline to act, with reasons.
- Notification — we notify you of the outcome and, where the content is removed, we also notify the affected user (with reasons) so they may appeal under Section 7.
For straightforward violations we typically act within 5 business days. For complex or contested matters we act within 30 days.
6. Statement of Reasons (DSA Art. 17)
Whenever we remove, disable access to, demote, or otherwise restrict user content, we provide the affected user with a statement of reasons that includes:
- The specific action taken.
- The geographic scope of the restriction.
- The factual and legal grounds.
- Whether the action was based on a notice, automated detection, our own initiative, or a legal order.
- The available appeal channels.
7. Internal Complaint Handling (DSA Art. 20)
If you believe our moderation decision was incorrect — whether your content was wrongly removed or a notice you submitted was wrongly rejected — you may appeal by replying to our decision email within 6 months. Appeals are reviewed by a person who was not involved in the original decision, in good faith and not solely on the basis of automated means.
8. Out-of-Court Dispute Settlement (DSA Art. 21)
EU recipients of the Service who are not satisfied with the outcome of an internal complaint may select any certified out-of-court dispute settlement body to resolve the dispute. The lists of certified bodies are published by the Member States' Digital Services Coordinators. We commit to engaging in good faith with any certified body the user selects.
9. Trusted Flagger Notices (DSA Art. 22)
Notices submitted by entities formally designated as "trusted flaggers" by a Digital Services Coordinator under Art. 22 of the DSA are processed with priority and without undue delay.
10. Measures Against Misuse (DSA Art. 23)
We may temporarily suspend, after issuing a prior warning, the processing of notices and complaints submitted by individuals or entities that frequently submit manifestly unfounded notices or complaints. We may similarly suspend the accounts of users who frequently provide manifestly illegal content, in proportion to the gravity and frequency of the violations.
11. Government and Law-Enforcement Orders
We comply with valid legal orders from competent authorities. Where a court order or governmental request requires content removal, we act within the time required by the order. Where the law permits, we notify the affected user.
12. Transparency Reporting
We publish an annual transparency report (or more frequent updates as required by law) summarising the volume and nature of content moderation actions, notices received, complaints handled, and government requests. The report is linked from this page when published.
13. Contact for Authorities
For Member State authorities, the European Commission, and the European Board for Digital Services under the DSA, the single point of contact is described on our Legal Notice / Imprint page.
14. Updates to This Policy
We may update this Policy as our content surface or applicable law changes. Material changes are announced on this page; the "Last updated" date is the authoritative marker.
15. Contact
All content-moderation enquiries: info@baaed.com