Content Policy

Public code should stay playful, open, and safe for the people who run it.

Because people can actually run what gets posted here, moderation has to protect them without turning public creation into a locked-down, joyless experience.

The platform allows experimentation, remixing, and weird web publishing, but it does not allow code, media, or behavior that turns public playback into harassment, malware, exploitation, or abuse.

Start here

If this is your first time on Vibecodr, these principles are the fastest way to understand what this policy protects and why.

Public means playable

Public vibes should stay publicly accessible.

We do not treat public playback as an account-locked surface. Moderation, safety, and capability checks must preserve anonymous public viewing for genuinely public work.

Safe by default

Permissive publishing still needs protective runtime boundaries.

User code runs behind sandbox, network, and capability controls so one creator's experiment does not silently become another person's risk.
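
To make the boundary concrete, here is a minimal sketch of how a player embed could derive its isolation settings from a capability grant. The `sandbox` attribute tokens and CSP directives are real browser features; the `PlaybackCapabilities` shape, the helper names, and the defaults are illustrative assumptions, not Vibecodr's actual implementation.

```typescript
// Hypothetical capability grant for one public vibe.
interface PlaybackCapabilities {
  network: boolean; // may the vibe fetch external resources?
  popups: boolean;  // may it open new windows?
}

// Build the iframe sandbox attribute for public playback.
function sandboxAttribute(caps: PlaybackCapabilities): string {
  const tokens = ["allow-scripts"]; // the code must run, but it never
  // gets "allow-same-origin": the vibe stays isolated from platform
  // cookies and storage.
  if (caps.popups) tokens.push("allow-popups");
  return tokens.join(" ");
}

// Build a Content-Security-Policy that enforces the network capability.
function cspFor(caps: PlaybackCapabilities): string {
  // Without the network capability, block outbound requests entirely.
  const connect = caps.network ? "connect-src https:" : "connect-src 'none'";
  return `default-src 'self'; ${connect}`;
}

// A locked-down default: scripts only, no popups, no outbound network.
const defaults: PlaybackCapabilities = { network: false, popups: false };
```

The point of the sketch is the direction of the defaults: every capability starts off, and publishing more ambitious work means asking for more, not escaping the sandbox.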

Social by design

Collaboration is encouraged, not punished.

Remixing, feedback, and public discovery are core product paths. Policy should target abuse and deception, not ordinary curiosity or experimentation.

Durable enforcement

Moderation actions should be understandable and consistent.

When we hide, quarantine, or remove something, the outcome should make sense across the feed, player, profile, conversations, and embeds.

How it works in practice

This is the concrete side of the policy: what is never allowed, how violations are handled, and where Vibecodr draws the line.

Malware and exploitation

Malware, exploit kits, credential theft, and covert abuse paths are prohibited.

Creators cannot publish code or supporting assets intended to exfiltrate data, exploit viewers, evade platform controls, or meaningfully increase another user's exposure to harm.

Harassment and hate

Abusive social behavior is out of bounds even when wrapped in code.

Runnable projects, comments, metadata, and remixes cannot be used to target people with harassment, hate speech, intimidation, or coordinated abuse.

Deception and spam

Public discovery surfaces are not for bait-and-switch or junk amplification.

We remove repetitive spam, fraudulent claims, impersonation, deceptive monetization funnels, and experiences designed mainly to manipulate engagement.

Escalation model

Enforcement ranges from removal to quarantine to account action.

Depending on severity and intent, we may block publication, quarantine content from public surfaces, remove it entirely, or suspend the account that published it.
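
The ladder described above can be sketched as a simple mapping from severity to outcome. The severity tiers, action names, and repeat-offense rule below are assumptions chosen to illustrate the shape of the model, not Vibecodr's actual moderation schema.

```typescript
// Hypothetical severity tiers and enforcement actions.
type Severity = "low" | "medium" | "high" | "critical";
type Action =
  | "block_publication" // never reaches public surfaces
  | "quarantine"        // pulled from public circulation, not deleted
  | "remove"            // removed entirely
  | "suspend_account";  // action against the publishing account

// Map a violation to an outcome; intent is modeled here only as
// whether the account has offended before.
function escalate(severity: Severity, repeatOffense: boolean): Action {
  if (severity === "critical") return "suspend_account";
  if (severity === "high") return repeatOffense ? "suspend_account" : "remove";
  if (severity === "medium") return "quarantine";
  return "block_publication";
}
```

Whatever the real tiers are, the property worth keeping is that the same inputs always produce the same outcome, so enforcement stays understandable and consistent across surfaces.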

FAQ

Do public vibes require viewers to sign in before they can play them?

No. Public means publicly playable unless a specific safety or policy control says otherwise. Account state should not silently turn public vibes into private ones.

Can a vibe be removed from feeds without deleting the author's work entirely?

Yes. Quarantine is one of the moderation tools available when content should stop circulating publicly but still needs a narrower visibility outcome than permanent deletion.
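
As a sketch of what "narrower visibility" could mean, the function below gates each surface on moderation state. The state and surface names are illustrative assumptions; the policy above only promises that quarantine sits between full circulation and deletion.

```typescript
// Hypothetical moderation states and viewing surfaces.
type ModerationState = "active" | "quarantined" | "removed";
type Surface = "feed" | "direct_link" | "author_profile";

// Decide whether a vibe is visible on a given surface.
function visibleOn(state: ModerationState, surface: Surface): boolean {
  if (state === "removed") return false;     // gone from every surface
  if (state === "quarantined") {
    return surface === "author_profile";     // author keeps their work,
  }                                          // but it stops circulating
  return true;                               // active: fully public
}
```

Usage follows directly: a quarantined vibe disappears from feeds and shared links while the author's copy survives, which is exactly the middle ground the answer above describes.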

Does code-only content get different rules than images or posts?

The format changes the enforcement details, but not the policy standard. Runnable code is evaluated as content plus behavior, so both social harm and technical abuse matter.

What should I optimize for when building public vibes?

Build for invitation, clarity, and safe interaction. Public work should be runnable, understandable, and respectful of the people who open it.