Bing delisted 1.5 million Neocities sites. Here’s why that matters for the open web
Neocities is one of those rare internet services that still feels like the old web: hand-built pages, strange little fandom shrines, personal research notebooks, art experiments, and earnest “here’s what I learned” write‑ups. It’s the opposite of the templated, engagement-maximized, AI-summary-driven experience that increasingly defines mainstream online life.
So when Microsoft’s Bing search engine effectively delisted Neocities and its roughly 1.5 million hosted sites, it wasn’t just an SEO hiccup. It was a case study in what happens when the modern web’s discovery infrastructure becomes opaque, automated, and hard to appeal—especially for small, non-commercial communities.
Based on reporting from Ars Technica and Neocities’ own public statements, Bing’s block appears to have done two damaging things at once:
- It made a huge slice of the “small web” nearly invisible to anyone using Bing or a Bing-powered search experience.
- It created space for copycats and potential phishing pages to rank for Neocities-related queries while the real thing was suppressed.
This post breaks down what likely happened, why platforms like Neocities are uniquely vulnerable to bulk demotions, and what the episode tells us about the future of search in a world drowning in low-quality content.
What is Neocities (and why people care)?
Neocities is a static site host founded to keep the spirit of GeoCities alive—an internet where regular people had personal pages and weird aesthetics without needing permission from a platform. It’s explicitly about control and individuality: you can hand-write HTML/CSS, upload files, and build a site that looks like you.
That matters because discovery on today’s web is increasingly centralized:
- You don’t “surf” sites as much as you scroll feeds.
- You don’t navigate link trails as much as you ask a chatbot.
- You don’t bookmark as much as you trust algorithmic recall.
In that landscape, search engines function like public infrastructure. If a major search index silently removes an entire host—or treats it as low-trust by default—it’s not merely hurting traffic. It’s changing what kinds of creativity remain findable.
What exactly did Bing do?
According to Neocities founder Kyle Drake, Bing blocked the entire neocities.org domain, including the main site and user subdomains (for example: example.neocities.org), from appearing in Bing search results.
Neocities’ own blog post is careful to say what the block wasn’t:
- Not a known malware outbreak
- Not a technical misconfiguration that Bing explained to them
- Not a clearly communicated policy violation
- Not a quality collapse caused by generative-AI spam (Neocities claims it has “almost none”)
Ars Technica reported that the situation persisted through multiple attempts to reach Bing support—where Drake repeatedly encountered automated support loops and an AI chatbot rather than a human who could diagnose a bulk indexing decision.
Then there’s the more alarming part: Drake reported seeing Bing rank a copycat page resembling Neocities’ front page, potentially collecting user credentials. Even if that page wasn’t a deliberate phishing operation, the combination of (1) suppressing the authentic site and (2) ranking lookalikes is exactly how users get tricked.
Why could a search engine delist a million sites at once?
From a search engine’s perspective, Neocities is a single host with an enormous number of subdomains. That architecture is common for user-generated website platforms. It also creates a temptation for “bulk trust scoring.”
Search engines fight abuse at internet scale. The simplest enforcement mechanism is to apply site-wide or host-wide classifiers:
- If enough subdomains host spam or malware, the host can get flagged.
- If crawlers see patterns that resemble doorway pages or link farms, whole clusters get downranked.
- If “low-quality” heuristics trigger (thin pages, copied content, aggressive templates), a domain can get treated as junk.
That logic can be defensible in the abstract. But it’s brittle when the host’s mission is to allow creative freedom and weirdness. Many legitimate Neocities pages are:
- Minimalist (a few paragraphs and links)
- Highly stylized (heavy GIFs, unusual layouts)
- Unmaintained (older “set it and forget it” pages)
Those traits don’t necessarily mean spam. They just don’t look like modern commercial sites.
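To make the failure mode concrete, here is a deliberately toy sketch of host-wide "bulk trust scoring." This is not Bing's actual algorithm; every heuristic, name, and threshold below is invented for illustration. The point is how aggregate scoring lets a few "thin-looking" pages drag down an entire host:

```python
# Toy illustration of host-wide "bulk trust scoring" -- NOT Bing's actual
# algorithm. All heuristics and thresholds here are invented for the example.

def looks_low_quality(page: dict) -> bool:
    """Crude signals of the kind anti-spam classifiers use: thin content,
    missing metadata, stale pages. A hand-made personal site can trip
    all three while being perfectly legitimate."""
    return (
        page["word_count"] < 200            # "thin" page
        or not page["has_meta_description"]  # no SEO plumbing
        or page["days_since_update"] > 730   # "unmaintained"
    )

def host_verdict(pages: list[dict], block_threshold: float = 0.6) -> str:
    """If enough subdomain pages look 'low quality', the whole host gets
    demoted -- legitimate neighbors included."""
    flagged = sum(looks_low_quality(p) for p in pages)
    return "block entire host" if flagged / len(pages) >= block_threshold else "allow"

# A hand-built fan shrine: short, no metadata, untouched for years.
shrine = {"word_count": 150, "has_meta_description": False, "days_since_update": 1200}
# A polished commercial page sails through the same checks.
store = {"word_count": 900, "has_meta_description": True, "days_since_update": 3}

print(host_verdict([shrine, shrine, store]))  # the shrines drag the whole host down
```

Under these invented thresholds, two sincere fan pages outweigh one "professional" page and the verdict flips to a host-wide block, which is the brittleness the list above describes.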
The uncomfortable truth: anti-spam systems are now anti-small-web systems
Search quality teams have been under pressure for years—first from traditional SEO spam, then from content farms, and now from generative AI “slop” that can manufacture endless near-duplicate pages.
When you have that kind of adversary, you end up building systems that reward:
- Clear authorship signals
- Strong reputational backlinks
- Consistent engagement
- Freshness and update cadence
- Structured data and “best practices” compliance
But the small web often optimizes for none of that. It optimizes for personal expression.
So the risk is that anti-abuse systems start acting like a gatekeeping layer—one that unintentionally excludes communities whose content is human, sincere, and valuable, but not “professionalized.”
In other words: we can build a web where the only discoverable human writing is the kind that looks like it was produced by a marketing department.
Why this matters beyond Bing’s ~4–5% market share
It’s easy to shrug and say: “Just use Google.”
But Bing’s influence is bigger than its direct usage because:
- It’s the default search engine in many Windows environments.
- It powers other Microsoft products and supplies results to search partners.
- Other search services can and do rely on Bing for traditional web links.
DuckDuckGo, for example, states that it largely sources its traditional links and images from Bing while mixing in its own crawler and other sources.
So a host-level suppression in Bing can become a downstream suppression across parts of the search ecosystem.
The support problem: when everything routes through a chatbot
Even if Bing’s block started as an automated classification error, the story becomes damning when the appeal path looks like this:
- Your traffic falls off a cliff.
- You file a ticket.
- You get routed to an automated assistant.
- You receive vague policy language.
- You are told to “work directly with Microsoft,” without a clear human point of contact.
That’s a governance failure. Search engines are not just websites; they are discovery utilities. If they can silently exclude huge communities, there needs to be a credible escalation path—especially when user safety is involved.
Modern search companies love to talk about transparency. Bing’s own webmaster communications emphasize how indexing, crawling, and AI answers are changing, and how site owners can adapt. But transparency isn’t only about guidelines; it’s about debuggability. When a platform can’t tell you what is wrong (which pages? which patterns? which rule?), you can’t fix it.
The phishing angle: suppressing the real thing makes impersonation easier
Drake’s warning about copycats is important because it reveals a predictable dynamic:
- If the real Neocities pages are suppressed, their reputation and link graph can’t help them rank.
- Lookalike domains can target “brand” queries and harvest confused users.
Even if Bing eventually deranks one suspicious result, the underlying incentive remains: once you block the authentic host, you lower the bar for impostors.
Search engines usually justify harsh anti-spam measures by arguing they keep users safe. But a blunt block can do the opposite—especially when it’s applied to a legitimate host that users trust.
What Neocities (and other small hosts) can do next
Neocities can’t fully control how a search engine classifies it. But there are practical moves that reduce the odds of being treated as a bulk-abuse platform:
- Strengthen abuse reporting and transparency
  - Publish clear anti-phishing / anti-malware policies.
  - Show how quickly takedowns happen.
- Improve machine-readable signals
  - Encourage users to add basic metadata (titles, descriptions, canonical URLs where appropriate).
  - Provide a clean sitemap strategy for subdomains.
- Lean into “good citizen” protocols
  - Tools like IndexNow (supported by Bing and others) exist to make updates and removals easier for crawlers.
  - That won’t solve trust scoring by itself, but it can reduce confusion and lag.
- Educate users about impersonation
  - If Bing is suppressing real pages, a simple “never enter your password on lookalike domains” notice becomes urgent.
- Diversify discovery
  - If search becomes unreliable, communities fall back on RSS, webrings, newsletters, and direct links. That’s not nostalgic; it’s resilient.
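For the “good citizen” point above: IndexNow is just a JSON POST to a shared endpoint, per the public spec at indexnow.org. A minimal sketch, with the host, key, and URLs as placeholders (a real key must be a text file you host at the site so the engine can verify ownership):

```python
import json
import urllib.request

# Sketch of an IndexNow submission. The host, key, and URLs are
# placeholders for illustration; a real key is a file you host
# (e.g. https://<host>/<key>.txt) that the search engine fetches
# to verify you control the site.

def build_indexnow_request(host: str, key: str, urls: list[str]) -> urllib.request.Request:
    payload = {
        "host": host,
        "key": key,
        "keyLocation": f"https://{host}/{key}.txt",
        "urlList": urls,  # both updated and deleted URLs are submitted here
    }
    return urllib.request.Request(
        "https://api.indexnow.org/indexnow",  # shared endpoint; participating engines share submissions
        data=json.dumps(payload).encode("utf-8"),
        headers={"Content-Type": "application/json; charset=utf-8"},
        method="POST",
    )

req = build_indexnow_request(
    "example.neocities.org",
    "0123456789abcdef",
    ["https://example.neocities.org/", "https://example.neocities.org/updates.html"],
)
# urllib.request.urlopen(req) would actually send it; omitted here.
```

For a host like Neocities, batching user-page updates into submissions like this won’t override a trust-score block, but it does give crawlers a timely, machine-readable account of what changed and what was removed.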
None of this is a substitute for Bing fixing the underlying block. But it’s realistic: small web communities survive by building redundancy.
What Microsoft should do (if it cares about web quality)
If Bing wants to position itself as a serious search index in an AI era, it needs to handle edge cases like Neocities better—not worse.
A reasonable playbook would include:
- A human escalation channel for bulk delistings affecting large numbers of legitimate sites.
- Specific examples when citing policy violations (even if only via a secure portal).
- Safer handling of brand queries during enforcement actions to reduce impersonation risk.
- A posture that distinguishes “weird” from “malicious.”
If the web is going to be flooded with synthetic content, we should expect search engines to prioritize human creativity. Neocities is exactly the kind of place that should benefit from that shift.
Bottom line
The Neocities/Bing episode isn’t just a fight over indexing. It’s a preview of an internet where discovery is controlled by a few opaque systems, and where appeals can get trapped in automation.
If we want a web that remains diverse—full of personal pages, niche expertise, and genuine oddness—then search engines need mechanisms that don’t accidentally bulldoze the small web while chasing spam.