Internet cloud providers are dumping “problematic” websites


Recently, the internet infrastructure firm Cloudflare dropped the website 8chan (also known as Infinity Chan) as a customer, according to Wired.

This is reportedly in response to the website housing “numerous posts and manifestos linked to horrific mass shootings in the United States and around the world.” For example, the alleged killer in the El Paso shooting reportedly posted his manifesto on 8chan.

Cloudflare provides infrastructure services, such as protection against denial-of-service attacks, that kept 8chan online. 8chan is a largely unmoderated online forum and image board where people can post photos and text.

While removing 8chan may seem like a good solution at first, experts say this approach has been tried before with other “problematic” websites and forums, which often end up resurfacing later anyway. That’s why some observers say dropping 8chan from one company’s servers won’t necessarily keep it from popping up on another. The Wired article also raises the concern that Cloudflare may be unintentionally setting a precedent for other companies to remove websites they personally consider controversial.

Cloudflare CEO Matthew Prince had this to say to Wired regarding the removal of 8chan:

8chan has been on our radar for a long time as a problematic user… But we have a responsibility, which is much beyond “We terminate sites we don’t like.” I’m nervous about whether we’ve made the right decision, and I’m nervous about how this will set precedent in the future.

Matthew Prince – Cloudflare CEO

Prince also told Wired that removing 8chan solves the problem for Cloudflare itself, but doesn’t address the issue of how hate brews online. You can read the rest of Wired’s story here: Cloudflare Ditches 8chan.




2 thoughts on “Internet cloud providers are dumping “problematic” websites”

  1. This is dangerous territory. How can a website or provider claim they have no responsibility for content, and thus immunity from repercussions, if they make decisions on what/who stays and what/who doesn’t? They can’t claim they are “just a platform” with no editorial control over content; making those decisions could nullify their immunity.

    If they let person A say something (even with the disclaimer that it does not necessarily represent their views) but silence person B (who’s saying essentially the same thing but in a different or more forceful way) and person C (who’s disagreeing with person A), they ARE making editorial decisions and value choices, and they are no longer “just a platform.”
