EU Digital Services Act Notice
ChefSphere's statement under the EU Digital Services Act — single point of contact, notice-and-action, statements of reasons, and user rights.
This Notice describes how ChefSphere complies with Regulation (EU) 2022/2065 — the Digital Services Act ("DSA") — which applies to online platforms that host user-generated content in the European Union. The Notice supplements the Terms of Service, the Acceptable Use Policy, and the Privacy Policy. ChefSphere is not a Very Large Online Platform within the meaning of Art. 33 DSA, so the additional duties for VLOPs do not apply to us.
1. Provider information (Art. 11–12 DSA)
- Legal name: ChefSphere OÜ
- Registered office: c/o E-Residency Hub OÜ, Ahtri tn 12, 10151 Tallinn, Harju maakond, Estonia
- Estonian registry code: 17470129 (Tartu County Court Registration Department, business register card No. 1)
2. Single points of contact
- For Member State authorities, the European Commission, and the Board (Art. 11 DSA): [email protected]
- For recipients of the service (users) (Art. 12 DSA): [email protected] or the in-app "Report" action, in any of the five official languages of the Service (English, German, French, Spanish, Arabic).
These mailboxes are monitored during business hours; we reply within the applicable legal deadlines.
3. Legal representative in the EU
ChefSphere OÜ is established in the European Union (Estonia). A legal representative under Art. 13 DSA is therefore not required.
4. Terms and conditions and their enforcement (Art. 14 DSA)
The rules that apply to users are the Terms of Service and the Acceptable Use Policy. Those documents describe prohibited content, how we moderate, the measures we apply, and the appeal channels. We apply the rules diligently, objectively, and proportionately, taking into account the rights and legitimate interests of all parties involved, including fundamental rights under the EU Charter.
5. Notice-and-action mechanism (Art. 16 DSA)
Anyone can submit a notice that content on ChefSphere is illegal. We offer three channels; one of them (email) is public and does not require a ChefSphere account, so that any person or entity — including rights-holders, regulators, and non-users — can notify us:
- Email (public, unauthenticated): [email protected]. Available to anyone.
- In-app "Report" action: available on any post, profile, message, listing, review, or live stream. Requires a ChefSphere account because the report is tied to the reporter for Art. 23 anti-misuse bookkeeping.
- Structured online form (account-holders, recommended for high-volume rights-holders):
POST /api/v1/compliance/takedown-notices. Requires an account because each submission is bound to an authenticated identity for Art. 22 trusted-flagger prioritisation and Art. 23 anti-misuse counting. Returns a tracking ID.
A notice — via any channel — should include, so that we can act diligently and without undue delay:
- A sufficiently precise indication of the content concerned (URL, item ID, user handle, or the reference shown in the reporting flow).
- A substantiated explanation of why the person believes the content is illegal, including the legal basis (for example a specific national law, EU regulation, court order, or intellectual-property right).
- The name and electronic contact of the notifier, unless the notice concerns one of the specific offences listed in Articles 3 to 7 of Directive 2011/93/EU (child sexual abuse material), where anonymity is accepted.
- A statement of good faith that the information in the notice is accurate and complete.
We send an electronic acknowledgement of receipt and a reasoned outcome to the notifier where contact details are provided.
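For rights-holders automating submissions to the structured form, the required elements above map naturally onto a JSON body. The following is a minimal sketch only: the field names (content_ref, legal_basis, and so on) are illustrative assumptions, not the documented schema of POST /api/v1/compliance/takedown-notices.

```python
import json

# Hypothetical field names; the real schema for
# POST /api/v1/compliance/takedown-notices may differ.
REQUIRED_FIELDS = {
    "content_ref",           # precise indication: URL, item ID, or user handle
    "explanation",           # substantiated reasons for alleged illegality
    "legal_basis",           # e.g. a national law, EU regulation, or IP right
    "good_faith_statement",  # statement that the notice is accurate and complete
}

def build_notice(content_ref, explanation, legal_basis,
                 notifier_name=None, notifier_email=None):
    """Assemble an Art. 16 notice body. Name and contact may be omitted
    only for notices concerning Directive 2011/93/EU offences."""
    notice = {
        "content_ref": content_ref,
        "explanation": explanation,
        "legal_basis": legal_basis,
        "good_faith_statement": True,
    }
    if notifier_name and notifier_email:
        notice["notifier"] = {"name": notifier_name, "email": notifier_email}
    missing = REQUIRED_FIELDS - notice.keys()
    if missing:
        raise ValueError(f"notice incomplete: {sorted(missing)}")
    return json.dumps(notice)
```

Validating the required fields client-side before submission avoids round-trips on notices that could not be acted upon diligently.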
Once a notice is triaged and decided, the moderator who accepted it executes the removal, restriction, or suspension synchronously, in the same administrative action, via the Moderation Console. In the same transaction, a Statement of Reasons (Art. 17 DSA) is written to our DSA SoR store and queued for submission to the Commission's DSA Transparency Database (Art. 24(5) DSA). A compliance.takedown.accepted durable event is emitted in parallel for metrics, audit, and future automation; today no downstream content-state change gates on that event, because the moderator performs the content-state change directly.
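The single-action flow described above can be sketched as follows. This is a simplified illustration under assumed names: the stores, the event bus, and the function shape are invented for the example and are not ChefSphere's actual internals.

```python
import uuid
from dataclasses import dataclass, field

@dataclass
class FakeStore:
    """Stand-in for the DSA SoR store, the Transparency DB queue,
    and the event bus in this sketch."""
    rows: list = field(default_factory=list)

def accept_notice(content_state, sor_store, db_queue, event_bus,
                  item_id, decision, reasons):
    """One administrative action: the content-state change, the SoR
    write, and the Transparency DB enqueue happen together; the
    durable event is emitted in parallel and nothing gates on it."""
    sor_id = str(uuid.uuid4())
    # 1. The moderator's decision is applied directly and synchronously.
    content_state[item_id] = decision  # e.g. "removed"
    # 2. The Statement of Reasons is written in the same transaction...
    sor_store.rows.append({"id": sor_id, "item": item_id, "reasons": reasons})
    # 3. ...and queued for the Commission's Transparency Database.
    db_queue.rows.append(sor_id)
    # 4. Fire-and-forget metrics/audit event (not load-bearing today).
    event_bus.rows.append(("compliance.takedown.accepted", item_id))
    return sor_id
```

The design choice worth noting is that the event is observational: if the event pipeline is down, the takedown and its Statement of Reasons still complete.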
6. Trusted flaggers (Art. 22 DSA)
Notices submitted by bodies designated as trusted flaggers by a Digital Services Coordinator under Art. 22 DSA are given priority. Trusted flaggers can identify themselves by including their trusted-flagger designation reference when writing to [email protected].
7. Statements of reasons and internal complaint handling (Art. 17 and 20 DSA)
When we act on your content or account for reasons of alleged illegality or incompatibility with our Terms or AUP — for example removing a post, reducing its visibility, suspending a monetisation feature, restricting your account, or rejecting a listing — we give you a statement of reasons that includes: the decision, whether it concerns removal, visibility restriction, monetisation restriction, or account restriction; the territorial scope; the facts and circumstances relied on; where relevant, the automated means used; the legal ground or the Terms / AUP clause relied on; and how to challenge the decision.
You can use our internal complaint-handling system free of charge for six (6) months after the decision by replying to the notice we sent you or by writing to [email protected] with the reference number. Complaints are reviewed by qualified personnel, not solely by automation, and decided without undue delay. If we reverse the decision we restore the content, remove the restriction, or reinstate the account as applicable.
8. Out-of-court dispute settlement (Art. 21 DSA)
If you are not satisfied with the outcome of an internal complaint, you can refer the dispute to an out-of-court dispute settlement body certified by the competent Digital Services Coordinator of the Member State where you are established, or where the content was made available. Such bodies are listed by national Coordinators and by the European Commission. ChefSphere will engage in good faith with a certified body. Using an out-of-court body does not affect your right to bring the dispute before a court.
9. Suspicion of criminal offences (Art. 18 DSA)
If we become aware of any information giving rise to a suspicion that a criminal offence involving a threat to the life or safety of a person or persons has taken place, is taking place, or is likely to take place, we promptly inform the competent law-enforcement or judicial authorities of the Member State concerned — directly, or through Europol where no Member State of contact can be identified — and provide all relevant information available. Where a victim is identifiable, we also give that person the contact details of the authorities notified, if this is not manifestly contrary to the person's interest.
10. Measures against misuse (Art. 23 DSA)
We may temporarily suspend users who frequently submit notices or complaints that are manifestly unfounded, or users who frequently post content that is manifestly illegal. Suspension is preceded by a warning and is proportionate. Patterns considered include the absolute number of manifestly unfounded submissions, their ratio to total submissions, the gravity of the misuses, and the intent (where identifiable).
11. No dark patterns (Art. 25 DSA)
The design and organisation of ChefSphere does not deceive, manipulate, or materially distort or impair the ability of users to make free and informed decisions. In particular we do not: (i) give visual prominence to a choice over another when asking for a decision in a way that biases the choice, (ii) repeatedly request a decision that the user has already made, especially a decision that concerns fundamental rights or the Service's privacy settings, (iii) make the procedure of terminating a service more difficult than subscribing to it, or (iv) obscure material information behind multiple taps or screens. If you believe any Service flow violates Art. 25 DSA, please report it to [email protected] or to the Estonian Digital Services Coordinator.
12. Transparency reporting (Art. 15 DSA)
We publish an annual, machine-readable transparency report on our moderation activity at /legal/transparency-report/{year}. The report is generated directly from our compliance database and covers every sub-paragraph of Art. 15(1) DSA: authority orders (Art. 9), notices received through the notice-and-action mechanism (Art. 16), content moderation engaged at own initiative, internal complaints (Art. 20), use of automated means, the accuracy proxy (share of automated decisions overturned after appeal), suspensions under Art. 23, and the number of Statements of Reasons submitted to the Commission's DSA Transparency Database under Art. 24(5) DSA.
Under Art. 24(2) DSA, providers of online platforms publish, at least every six months, the average monthly active recipients of the service in the Union, calculated as an average over the preceding six months, in the same report. Because ChefSphere is below the DSA Art. 33 VLOP threshold (45 million average monthly active recipients in the Union) and the measurement pipeline for the six-month rolling average is still being instrumented, the field is currently emitted as null in the machine-readable transparency report and is accompanied by a calculation-methodology note. Once the measurement pipeline goes live, the figure will be populated automatically in every subsequent six-month report without any other change to this notice.
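The Art. 24(2) figure is a simple rolling mean over the preceding six months of monthly active recipients. A sketch of the calculation, emitting None (serialised as null in the machine-readable report) while the pipeline has produced fewer than six instrumented months; the month counts in the test are invented:

```python
from statistics import mean

def avg_monthly_active_recipients(monthly_counts):
    """Average monthly active recipients in the Union over the
    preceding six months (Art. 24(2) DSA). Returns None, rendered
    as null in the report, until six full months of data exist."""
    if len(monthly_counts) < 6:
        return None
    return round(mean(monthly_counts[-6:]))
```

This matches the behaviour described above: the field flips from null to a number automatically once six months of measurements are available, with no change to this Notice.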
Reports are cached for 24 h and updated at the statutory interval.
13. Cooperation with authorities (Art. 9 and 10 DSA)
We cooperate with orders from EU Member State authorities to act against illegal content and with orders to provide information, as set out in Art. 9 and 10 DSA. We require orders to be legally compliant, proportionate, and duly substantiated.
14. Advertising (Art. 26 DSA), ranking transparency (Art. 27 DSA), and profiling
- Advertising. ChefSphere does not currently run third-party behavioural advertising inside the app or website. If we introduce advertising, the Art. 26 obligations (clear identification of ads, the natural or legal person on whose behalf the ad is presented, and the parameters used to determine the recipient) will be met in the UI and documented here.
- Ranking transparency. Under Art. 27 DSA, we disclose the main parameters used in our recommender systems and any option to modify or influence them. Content ranking (swipe discovery, community feed ordering, meal-plan generation, ebook marketplace sort, tools marketplace sort) uses a mix of user preferences and usage signals. The complete list of main parameters, with plain-language explanations and their relative importance, is available per surface at the following endpoints:
/api/v1/legal/tools/ranking and /api/v1/legal/ebooks/ranking, and is also surfaced in-context via the "How this list is ranked" link shown next to every marketplace sort control.
- Non-personalised alternative. ChefSphere is not currently a Very Large Online Platform (VLOP) under DSA Art. 33, so the Art. 38 obligation to provide a non-personalised option that is not based on profiling does not apply. If ChefSphere is ever designated as a VLOP, we will introduce a non-personalised recommender option in compliance with Art. 38 within the statutory transition period. Today, users can still influence ranking indirectly through in-app filters, explicit feedback, and content-category controls on each recommender surface.
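An Art. 27 disclosure from the per-surface ranking endpoints could be shaped like the following. Every field name and value here is a hypothetical illustration of "main parameters with plain-language explanations and relative importance", not the actual response schema of /api/v1/legal/tools/ranking.

```python
# Hypothetical Art. 27 disclosure shape; illustrative only.
disclosure = {
    "surface": "tools-marketplace",
    "main_parameters": [
        {"name": "relevance_to_saved_preferences",
         "explanation": "How closely a tool matches categories you follow.",
         "relative_importance": 0.5},
        {"name": "recent_engagement",
         "explanation": "Views and saves by other users in the last 30 days.",
         "relative_importance": 0.3},
        {"name": "listing_quality",
         "explanation": "Completeness of the listing and its review score.",
         "relative_importance": 0.2},
    ],
}
# In this sketch, relative importances sum to 1 per surface, so the
# plain-language weights are directly comparable.
total = sum(p["relative_importance"] for p in disclosure["main_parameters"])
```

Keeping the weights normalised per surface makes the in-context "How this list is ranked" view trivially comparable across the tools and ebooks marketplaces.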
15. Children and minors
The Service applies specific protections for users who are minors, including restricted feature access, content filters, and no profiling-based advertising. See the Children & Age Policy.
16. Changes
We update this Notice when our Service, our processes, or EU law change. Material changes are announced in-app or by email at least 30 days before they take effect.
17. Contact summary
- Authorities / Board / Commission: [email protected]
- Users / notice-and-action: [email protected]
- Legal notices: [email protected]
- Privacy / DPO: [email protected]