This doesn't surprise me at all. I run a reasonably popular non-porn, submissions-based blog and immediately after the ban was implemented, our numbers tanked. Submissions dropped from 25-35 per day to around 10-20, while the number of notes (likes+reblogs+replies) per post has dropped from 600-800 to 200-400.
Unfortunately, we still see about the same total number of spambots and fake blogs in our notes. So at least from my own anecdotal experience, the ban did nothing except drive away human users.
A lot of our followers asked if/when we would move to another platform, but unfortunately for them (and us), Tumblr is the only major blog platform we know of which supports a curated, moderated submissions-based blog. Reddit is the closest runner-up in that it allows submissions and can be moderated, but for all intents and purposes isn't curatable. I suspect that if (when) Tumblr goes under, it's going to take my blog with it, which is disappointing.
The main thing Tumblr offers that I haven't seen anywhere else is curated submissions. Like the sister reply said, on Tumblr, you can allow anyone to submit posts to your blog (including anonymous users and users without Tumblr accounts). You can set restrictions, such as "only text post or photo+caption submissions", and provide tags for submitters to choose from (e.g., on my blog submitters can choose the tag for the TTRPG they're using).
The key differentiator is the curation. Only those submissions which my mod team approves get posted to the blog, unlike Reddit where all submissions get posted first and then up/downvoted accordingly. I think (?) Reddit might have a way to curate, but it's not feasible at the scale of Reddit and a subreddit equivalent of the blog would quickly overwhelm the mod team and have to become uncurated. Considering the number of low-quality submissions we actively curate out, the result would be a significantly degraded experience. Which is not to say a transfer to Reddit is impossible, but it would definitely be An Effort, and one I'm not sure I'm willing to put in for what's been a fun hobby project for the last seven years.
Asks are another Tumblr feature that I haven't seen in other places. An ask is a question submitted to the blog mods, which the mods can reply to either publicly or privately, and which are visually differentiated on users' dashboards. On Reddit, a question to the mods would look exactly like any other post, and the mods' answer could easily be lost in the sea of general comments on the post.
There are also lesser features that Tumblr has that I've never seen on other platforms. Tagging is a big part of Tumblr culture and something I've had a lot of fun with on my blog. Reblogs and replies are very different forms of interaction than retweets/@'s. Fun coincidences like "dash did a thing" (where two unrelated posts show up together on a user's dashboard in a way that's amusing) don't appear to be a thing on other platforms for various reasons. The visual layout and formatting of Tumblr posts on the dashboard tends to be more readable than Reddit or Twitter.
These are all little things, but they're little things that have built up the personality of the blog over the years. I'd rather close the blog with its personality intact than watch it slowly die as yet another subreddit plagued by low-quality posts, insufficient moderation, and sameness.
(This sounds like I'm down on Reddit, which is not the case at all - I like Reddit, just not as a possible host for my blog.)
* Reddit's AutoModerator has a settings option where every submission and/or comment must first be approved by a moderator. AutoModerator supports relatively complex moderation rules (see the sketch after this list). https://www.reddit.com/wiki/automoderator/full-documentation
You can also restrict posting to approved submitters, or write rules defining who is allowed to submit.
* Questions to moderators on Reddit should be sent as messages to the moderators (modmail). They show up in a completely different place, where moderators can reply and discuss.
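For illustration, a rule along these lines (a sketch based on the linked documentation, untested against a live subreddit) would hold every submission from a non-approved user in the modqueue until a moderator approves it:

```
---
# Hold all submissions for manual review unless the author is an
# approved submitter ("contributor") of the subreddit.
type: submission
author:
    is_contributor: false
action: filter
action_reason: "New submission held for curation"
---
```

The "filter" action removes the post but keeps it in the modqueue, so the workflow is review-then-approve rather than post-then-remove.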
Automod is what I was thinking of when I said I thought Reddit has some tools to do this. The problem, like I mentioned above, is one of scale. Automod is not capable of making judgments about post quality; it can't do anything to enforce a "must be something someone said" rule, for example. Nor is it capable of flagging the many, many variants of the same low-effort submission we frequently get (not without a lot of false positives, at least). That's what my blog's curation team does, and what takes the most time. We can manage the blog's former peak submission rate of 25-35 per day, but given the nature of Reddit I expect that number would skyrocket, overwhelming the mod team.
Modmail isn't public, is it? It's only viewable to the sub's mods. If it isn't public, then it is not equivalent to asks.
Like I said in the original post, a transfer to Reddit is not impossible, but it would be a significant effort, and it would definitely kill the blog's personality. Even if the curation team comes along, even if the submission numbers don't skyrocket beyond what the team is capable of handling, Reddit lacks a lot of other small features that give the blog personality. If/when Tumblr goes under, I'd prefer to allow my blog to end gracefully rather than trying to port it over to a platform that isn't quite a match.
There are a lot of subreddits configured to only allow text posts. One of the subreddits I frequent is configured to delete your post if it's an image post without a top-level comment by the poster, since the subreddit rules require all image posts to have text descriptions (to guide conversation and mitigate low-quality posts).
Automod is an automatic moderator; unfortunately, what we really need is a curator.
Tags are in the same general family as flair, but they're distant cousins of each other. Reddit generally expects one flair per post, and for flair to be short. My blog uses a minimum of three tags per post, with around 95% of posts having four to seven tags. A couple of those could go away due to the difference in how flair and tags are used, but that would still be at least two flairs per post on most posts. It also still doesn't solve for very long tags.
For all we Tumblr users love to mock the blue hellsite, it really does do some neat and very unique things. I'm still holding out hope that it manages to stabilize somehow and stick around, because I've tried all the other social media sites and none of them has kept my interest.
I don't want to restrict the subreddit, for one. I have around 295k followers on Tumblr; I can't hand-approve all of them. Plus, restricting the sub would keep new people from finding it.
Plus what the other commenter said about formatting and poor workflow. The mods can just barely handle the current workload of reviewing ~30-40 posts per day, choosing 20 to post, and adding tags as needed. If we had to open an email, then open a link (and hope the link was functional and not malicious), then review the submission, then copy the submission, then open the "submit a post" dialogue, then paste, then format, then post... yeah, that's way too much.
There's an IFTTT workflow for sending Tumblr posts to Reddit, but unfortunately the API it uses doesn't preserve any formatting, so you end up with a giant text blob. I don't think it could handle image posts. If that worked, I'd happily start sending things to a subreddit.
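If IFTTT mangles formatting, a small script against the two APIs could preserve it. A rough sketch (untested; the credentials, blog name, and subreddit below are placeholders) that converts Tumblr's HTML post bodies to Reddit markdown:

```python
# pip install pytumblr praw html2text
import pytumblr
import praw
import html2text

# Placeholder credentials; register an app with each service for real ones.
tumblr = pytumblr.TumblrRestClient("consumer_key", "consumer_secret",
                                   "oauth_token", "oauth_secret")
reddit = praw.Reddit(client_id="...", client_secret="...",
                     user_agent="xpost/0.1", username="...", password="...")

# Pull recent text posts from a hypothetical blog and re-submit them,
# converting the HTML body to markdown so formatting survives.
for post in tumblr.posts("myblog.tumblr.com", type="text", limit=5)["posts"]:
    markdown = html2text.html2text(post["body"])
    reddit.subreddit("myblog_mirror").submit(
        title=post.get("title") or "Untitled", selftext=markdown)
```

Image posts would still need separate handling, since Reddit treats link/image submissions differently from text posts.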
Restricted subreddit means that only approved users can post, but anyone can view. Private means that only approved users can view.
Another option is to set all posts by non-approved users to be automatically flagged (hidden to non-moderators), and moderators can un-flag the posts. Which, I think, is the workflow you're looking for? All you have to do is set the spam filter to "All" in the subreddit settings.
You still run into the problem that a lot of people might not want to use reddit, and you'll just end up fragmenting the people following your blog and lose a lot of readers.
The flag workflow is the closest I've seen on any platform to Tumblr's submissions, for sure.
The concern about people not wanting to use reddit is definitely a big part of this. Reddit has a very different culture than Tumblr, and I agree that it's likely I'd fragment my readers and lose most of them. As much as I love my blog, I don't think the effort of trying to make the switch to reddit is worth the harm and disruption it would cause.
On top of it, they weren't entirely truthful about how the block would be implemented. A couple of text-only Tumblrs I followed had long been voluntarily marked adult, and the announcement made it sound like they'd be in the clear.
Not only were they not, the implementation was to simply turn on the safe-mode filter and remove the setting to turn it off, while still allowing the adult content on subscribers' dashboards. So they had no idea they were blocked from the public until someone told them, since subscribers could still interact from that one page.
I expect it to continue to drop as more realize this.
I've seen a post showing that you can change the value of the checkbox behind the UI element using "inspect element" and then post the form back. The server side doesn't validate the value, so you can disable the flag.
This is probably patched now but if not it might show you how much they care about this.
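As a generic sketch of the class of bug (the endpoint and field name here are hypothetical, not Tumblr's actual form): if the server never re-validates what it receives, any client can send a value the UI no longer exposes.

```python
import requests

# Hypothetical settings endpoint and field name, for illustration only.
# The checkbox was removed from the UI, but nothing stops a client from
# posting the old value back if the server doesn't validate it.
requests.post(
    "https://example.com/settings/save",
    cookies={"session": "..."},       # an authenticated session cookie
    data={"safe_mode": "false"},      # the value the UI no longer offers
)
```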
Verizon is just a terrible company all around, and they certainly don't know how to run a big social network. Everyone was predicting this when the acquisition happened, and it's come to pass exactly as the critics said it would.
Verizon may well be OK with killing Tumblr. It's far outside their core competency, and unlike their core competency, it's hard to monetize.
You're claiming that everyone is legally being forced into ceasing distribution of pornography. Clearly this isn't happening. It's only Tumblr that is doing this, suspiciously soon after being acquired. There are no indications this would have otherwise happened.
> You're claiming that everyone is legally being forced into ceasing distribution of pornography. Clearly this isn't happening. It's only Tumblr that is doing this, suspiciously soon after being acquired
Tumblr was acquired over five years ago.
But moreover, it's not only Tumblr that has banned pornography or nudity, drastically altered their content policies, or even shut down altogether in the last twelve months since SESTA passed. They happen to be one of the biggest, but I have literally lost track of how many other sites have responded in a similarly drastic way.
SESTA includes incredibly onerous penalties, including jail time, for even accidental noncompliance. On the other hand, there is no legal penalty for being overly cautious. So unsurprisingly, most websites choose the less risky and scary option.
As soon as SESTA passed, Tumblr's days were numbered. Any potential buyer could see that as easily as Verizon could. (And even if they couldn't, Verizon would have to disclose it during due diligence or else they would be opening themselves up to a massive lawsuit).
> But a former staff engineer, who recently left Tumblr and asked to remain anonymous for professional reasons, tells Vox that the NSFW ban was “in the works for about six months as an official project,” adding that it was given additional resources and named “Project X” in September, shortly before it was announced to the rest of the company at an all-hands meeting. “[The NSFW ban] was going to happen anyway,” the former engineer told me. “Verizon pushed it out the door after the child pornography thing and made the deadline sooner,” but the real problem was always that Verizon couldn’t sell ads next to porn.
It’s not an American thing; most advertisers also refuse to advertise next to content that references extreme violence (think ISIS beheadings) or gambling content. I worked on the ads team at a company, and we constantly had to tweak our filters to ensure ads didn’t get served next to “adult” content.
> But a former staff engineer, who recently left Tumblr and asked to remain anonymous for professional reasons, tells Vox that the NSFW ban was “in the works for about six months as an official project,
That was (according to reports) because of the prevalence of child pornography on their platform, which they apparently didn't have the tools or staff to police. A blanket ban on pornographic content was cheaper and/or easier.
I believe they're using some automated filtering (see e.g. https://www.wired.com/story/tumblr-porn-ai-adult-content/). I assume it's easier to train an AI to recognise pornography generally, rather than a specific kind -- and now that I think about it, the process of training an AI to recognise child porn sounds extremely unpleasant and legally dubious.
> I believe they're using some automated filtering (see e.g. https://www.wired.com/story/tumblr-porn-ai-adult-content/). I assume it's easier to train an AI to recognise pornography generally, rather than a specific kind -- and now that I think about it, the process of training an AI to recognise child porn sounds extremely unpleasant and legally dubious.
Quite the opposite - it's much easier to allow pornography and ban only child pornography than it is to create an automated system to detect pornography.
PhotoDNA makes it easy to find matches, which can then be used to uncover the networks of people posting child pornography, which are then added back to the database. It doesn't require any computer vision at all.
By contrast, banning all pornography does require computer vision of some sort, and that's much more difficult, as evidenced by how terrible the new Tumblr NSFW content detector is.
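PhotoDNA itself is proprietary, but the shape of the approach can be sketched with an open perceptual-hash library (the hash value and threshold below are made up for illustration): known images are hashed once, and every upload is just compared against the list, with no classifier involved.

```python
# pip install pillow imagehash
from PIL import Image
import imagehash

# Hashes of known images, as distributed by a clearinghouse database.
# (Hypothetical entry; real deployments use PhotoDNA hashes, not phash.)
known_hashes = {imagehash.hex_to_hash("d1d1b1b1c8c8e0e0")}

def matches_known_image(path, max_distance=8):
    """True if the upload is a near-duplicate of any known hash."""
    h = imagehash.phash(Image.open(path))
    # Small Hamming distances survive re-encoding, resizing, and minor
    # crops, so reposts of a known image still match.
    return any(h - known <= max_distance for known in known_hashes)
```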
Maybe I'm misunderstanding, but PhotoDNA seems to be a tool for identifying reposts and edits of already known illegal images -- those that have previously been found and had their hashes entered into the database. So if Tumblr had a problem with original content (which could include underage users posting explicit pictures of themselves, as well as the worse things we think of when we hear the phrase 'child porn'), I don't think that would help.
edit: I found this quote from an article published last November:
"In its updated statement, Tumblr said that while every image uploaded to the platform is “scanned against an industry database of child sexual abuse material” to filter out explicit images, a “routine audit” discovered content that was absent from the database, allowing it to slip through the filter."
So it looks like they might have already been using PhotoDNA.
> So it looks like they might have already been using PhotoDNA.
Of course they were working with NCMEC (and therefore running PhotoDNA against NCMEC's hash database); it would have been legal suicide for them not to. They were doing this for years, before they were even acquired by Yahoo.
> Maybe I'm misunderstanding, but PhotoDNA seems to be a tool for identifying reposts and edits of already known illegal images -- those that have previously been found and had their hashes entered into the database. So if Tumblr had a problem with original content (which could include underage users posting explicit pictures of themselves, as well as the worse things we think of when we hear the phrase 'child porn'), I don't think that would help.
How many people do you think have Tumblr accounts where they post only new, never-before-seen pornography of underage children, which has never been posted on any other blog before, and never once post a single photo that has been previously identified as child pornography?
Of those, how many do you think are able to ensure that none of their followers ever repost/reblog that photo on any other blog which also happens to contain at least one other photo that's been previously identified as child pornography?
Of those, how many do you think are able to ensure that none of their followers have been previously identified as highly-connected nodes in the underground networks dedicated to sharing child pornography, and therefore pretty much exclusively post child pornography or follow people who they believe are likely to post child pornography?
Of those, how many do you think are able to ensure that nobody who ever sees one of those photos ever decides to download it and upload it as an attachment to an unsent email in their Gmail drafts folder, or put it in a private Dropbox folder (shared with nobody), or send it through any of the many "cloud" services which also actively monitor for child pornography and do the same sorts of graph analysis to identify people who are using their services to store or share it?
Again, once you understand how these underground networks work, and once you realize that this is mostly a problem of social graph analysis and not image recognition/classification, you realize that it's very easy to solve.
Remember: not just new to Tumblr, but new to everyone who is working with NCMEC/ICMEC (which means every large and not-so-large company in the entire world that hosts user-provided content).
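As a toy sketch of the graph-analysis step (hypothetical node names and a made-up threshold; real systems are far more sophisticated): accounts whose neighborhoods are dense with already-flagged accounts get surfaced for human review.

```python
# pip install networkx
import networkx as nx

# Follow/reblog graph: nodes are blogs, edges are interactions.
G = nx.Graph()
G.add_edges_from([
    ("blog_a", "blog_b"), ("blog_b", "flagged_1"),
    ("blog_b", "flagged_2"), ("blog_c", "blog_a"),
])
flagged = {"flagged_1", "flagged_2"}  # accounts with known-hash hits

def suspicion(node):
    """Fraction of a blog's neighbors that are already flagged."""
    neighbors = set(G.neighbors(node))
    return len(neighbors & flagged) / len(neighbors) if neighbors else 0.0

# Blogs heavily connected to flagged accounts get reviewed first.
for blog in set(G.nodes) - flagged:
    if suspicion(blog) > 0.5:
        print(blog, "-> queued for review")   # prints: blog_b
```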
You are obviously better informed than me, but if the problem is so easy to solve, and they were already using the relevant tool, why was it still a problem? Are you arguing that they failed through incompetence, or that they were lying/exaggerating as an excuse to ban all pornography for other reasons?
> Are you arguing that they failed through incompetence, or that they were lying/exaggerating as an excuse to ban all pornography for other reasons?
Yes, SESTA was the real reason that they banned pornography. SESTA imposed way too much liability on them to be able to support it.
It had nothing to do with child porn. They are still liable for child porn. They still have to work with NCMEC and go through the same process for identifying, reporting, and removing child porn. None of that has changed.
So, if I understand correctly, this means that by banning all pornography in the name of preventing child porn, they'll actually be harming efforts to uncover child pornography networks. This is unfortunate.
> So, if I understand correctly, this means that by banning all pornography in the name of preventing child porn, they'll actually be harming efforts to uncover child pornography networks. This is unfortunate.
Unfortunate, and also wholly predictable. This was exactly why advocacy groups of former victims of sex trafficking opposed SESTA in the first place, as did even the DOJ: they both recognized that it would ultimately make it harder to identify and rescue victims of sex trafficking.
Interesting, because it seems like it will pick up drawings for free, since it doesn’t distinguish between photos and drawings (Photos on iOS, for example, will pick up line drawings of people). I actually thought it would be a harder problem to exclude the drawings, based on some of these facial ID programs.
I don't believe it was a convenient lie. I personally reported A LOT of posts and entire blogs for very very very obviously inappropriate underage content, yet they still remained. Tumblr was quickly becoming a cesspool they couldn't clean.
Every photo site since the beginning of user uploads has had this problem. This isn’t an issue if you decide to invest the resources into fixing it. That means hire people to look at photos. It means hooking in to the NCMEC / FBI’s database of known hashes. It’s buying (not even building) a content filter.
These things exist. Tumblr decided not to bother, which forced them to panic and nuke the site from orbit.
> Every photo site since the beginning of user uploads has had this problem. This isn’t an issue if you decide to invest the resources into fixing it. That means hire people to look at photos. It means hooking in to the NCMEC / FBI’s database of known hashes. It’s buying (not even building) a content filter. These things exist. Tumblr decided not to bother, which forced them to panic and nuke the site from orbit.
Tumblr had an entire team that did literally all of those things, for years. They did not "decide not to bother".
Did they really bother though? I mean, I’m sure they have people employed, but something is seriously mismanaged. How did they let it get so bad? This should have never gotten to the point that another company had to say, “WTF is with all the child porn?”
Saying you have a team working on something, but not giving them appropriate resources and not taking the problem seriously, is the same as not bothering. It’s the same as saying, “We take your concerns seriously”, and then throwing the complaint in the trash. Their actions betray their true feelings.
I imagine it would take careful and meticulous observation to differentiate between legal porn and almost-legal porn. Purging the platform of porn entirely would streamline this process and could potentially be automated. Banning regular porn has a greater-than-zero bearing on moderating child porn, and you're being disingenuous in pretending otherwise.
> I imagine it would take careful and meticulous observation to differentiate between legal porn and almost-legal porn. Purging the platform of porn entirely would streamline this process and could potentially be automated. Banning regular porn has a greater-than-zero bearing on moderating child porn, and you're being disingenuous in pretending otherwise.
Why do people keep repeating this?
No, it's much more difficult to ban all pornography than it is to ban only child porn. One requires analyzing the content of photos and the other doesn't.
> No, it's much more difficult to ban all pornography than it is to ban only child porn. One requires analyzing the content of photos and the other doesn't.
Since it's easier to spot pornography in general than to differentiate between very disturbing types of pornography, the workload is lessened and overall less stressful. Not to mention the amount of work AI could automate. That's your point, right?
> Since it's easier to spot pornography in general than to differentiate between very disturbing types of pornography, the workload is lessened and overall less stressful. Not to mention the amount of work AI could automate. That's your point, right?
No, detecting child pornography by matching photos to known datasets and then extrapolating rings dedicated to sharing child porn from the social graph is already a solved problem. NCMEC already provides the tools to do this. You don't need to do any image analysis.
There are no effective tools for detecting porn more generally. That's a much harder problem.
According to a Tumblr spokesperson, "we work collaboratively with [...] partners like NCMEC to actively monitor content uploaded to the platform. Every image uploaded to Tumblr is scanned against an industry database of known child sexual abuse material, and images that are detected never reach the platform."
Evidently that wasn't sufficient to stop Apple removing them from the App Store, though.
You're blatantly wrong. Training a classifier to determine if a previously unseen image contains nudity is easier than training a classifier to determine if a previously unseen image contains a nude 18 year old or a nude 17 year old. The latter task is virtually impossible, which should make it obvious to you that tech like PhotoDNA is not classifying previously unseen photographs.
> You're blatantly wrong. Training a classifier to determine if a previously unseen image contains nudity is easier than training a classifier to determine if a previously unseen image contains a nude 18 year old or a nude 17 year old.
Why is it so hard to believe that there exists a solution to this that's much easier than training an image classifier?
As I've explained elsewhere, multiple times, this is a solved problem. There's no image analysis required.
> You missed that part. That requires classification, either by a human or by an algorithm. There is no way around it.
I did not miss anything. "Previously unseen" only matters if your primary means of identifying people who share child porn is based on analyzing the content of images.
I'm very aware of how these systems work, not just at Tumblr but in the industry at large at other, larger companies. I'm not just speculating here; this is literally how NCMEC works and what they expect of companies that host user-generated content.
For years Tumblr neglected to address the issue of teenagers who haven't been trafficked or otherwise gone missing posting pictures of themselves that did not exist in any database. That is the issue they addressed by banning all pornography: it removes any ambiguity over whether any particular never-before-seen image is permitted or banned. NCMEC is out of scope here. They do not have magical powers that allow them to distinguish between an 18 year old and a 17 year old uploading pictures of themselves.
You may not like it, I can tell you're very upset at tumblr banning pornography, but this is the reality. Throw around as many of your qualifications as you like, it changes nothing.
> NCMEC is out of scope here. They do not have magical powers that allow them to distinguish between an 18 year old and a 17 year old uploading pictures of themselves.
NCMEC doesn't need those "magical powers" because NCMEC does not care about the latter. That's not their mandate. Hacker News loves to talk about age-of-consent and child pornography in the context of 17 year olds, but that's simply not relevant here. That's not the issue that Tumblr was facing pressure from.
> Throw around as many of your qualifications as you like, it changes nothing.
It sounds like you're so convinced in your own beliefs that even direct, firsthand knowledge of the inside situation won't change your mind, in which case you're right - there's no point in continuing this conversation.
You may not like that I know what actually went on here, and I can tell you're very upset that the actual facts about how NCMEC works and why Tumblr internally chose to make their decision don't support your narrative, but that's the reality. Whether you choose to believe it or not, it changes nothing.
And nor do you, it seems, but Tumblr now does. They've been forced to. You may not like it, but that's the reality of how Tumblr is now operating. Tumblr can no longer turn a blind eye to any form of child pornography, and is no longer turning a blind eye to any form of child pornography.
Tumblr didn't ban all pornography for shits and giggles. They did it because given the current legal and social landscape it was their only economically viable option.
But you already know how stupid and lazy this idea is.
Tumblr may be a company. But it was also a community. Real relationships were made on there. I wasn't big on Tumblr but it meant the whole world to thousands of people growing up.
Verizon could make its decisions in the context of running a company, or in the context of running a community. They chose the former, because as a money machine they have no interest in spending capital to be the bearers of a meaningful online community which facilitates the formation of meaningful connections between people.
You missed the point. Verizon only values cheapness over duty to their community precisely because of what I just said. They want to run a company, not a community.
The same thing happened when Yahoo was running the show, the bleed-out was just slower because they were more timid about making sweeping changes to Tumblr's ecosystem.
Instead of Yahoo or Verizon realizing that they weren't fit to run a community and divesting the company to more competent leadership, as money machines they were unable to recognize the human element behind these companies and ripped them to shreds trying to scrounge up pocket change.
It's foolish to miss my point twice and then insinuate I'm the one being foolish. I expect no one to run Tumblr as a charity. I never stated any expectations, only explanations.
Tumblr disintegrated because they lost sight of their community, first with the Yahoo acquisition, and again with the Verizon merger. And now it's useless to everyone, both the users and the people who thought they could squeeze it dry for a few bucks. There's nothing to argue with. That's simply the facts. You're arguing a straw man and you don't even have a point, you're just playing a poor devil's advocate.
The absurd reach is pretending that a company shouldn't take every measure possible in preventing its platform from hosting some of the most abusive and criminal content on the planet. What makes you think they're not more moral than the other platforms? What makes you think this has anything to do with morality at all?
>...shouldn't take every measure possible in preventing its platform from hosting some of the most abusive and criminal content on the planet.
If you want to genuinely prevent the most abusive and criminal content on the planet, then - as the comment you're replying to suggests - the only sure way to prevent it is to not allow it to be hosted, connected to, or shared anywhere across the internet. But since you can't ensure that, we default to the internet being the medium that makes it possible, so...
Then they should shut down Tumblr. Every platform that hosts UGC has to police that platform to remove child pornography, full stop. If they don't want to moderate their platform, then they should shut the whole thing down. They have to "subject their employees to having to police child pornography" regardless of the fact that they host legal pornography.
Twitter has plenty of legal porn without butting heads with this issue. Facebook doesn't allow porn and still has to deal with the child pornography issue.
Taking the stance of "we banned all porn because of child porn" is lazy and deflects responsibility and agency of their actions. They banned porn because the suits higher up said they didn't like it, full stop. Trying to save face with the community and blame child porn is dishonest when every platform, from 4chan to Facebook, has to police and report child porn.
What about violence? What about torture photos? Child porn is just one part of objectionable content, and of content that is illegal in many jurisdictions in the markets they serve. You still need people to review that, even if only as part of the appeals process.
I suspect the real story here was budget cuts, and moderation was a big cost center. They did a half-assed job and got shitty results. The users are not going to return. I suspect they will be completely shut down or kicked back and forth between buyers indefinitely, like Myspace. These sorts of consumer web brands never seem to come back after a screw-up, and I doubt Tumblr is going to be an exception.
There is a lot to be said for the Berkshire Hathaway way of just managing companies for the long term, not only when there isn't any growth but even as they are shrinking.
I think they are acting on a reasonable fear that the reputation of the brand could be tarnished by their placing of ads on the page, which could be seen as a tacit endorsement of the content.
Let’s not forget that this is not about whether to spend money or not - it’s about where to spend money. Advertisers have countless options for where and how to advertise, so it makes sense to shift money to more boring, safe platforms where the ROI is almost identical anyway.
The lack of a convenient way of anonymously paying is probably the root cause of that. There's cash, which is a pretty convenient way of anonymously paying. There's no such convenient way to pay for stuff on the Internet. Emphasis on convenient.
It's not absurd, it's really the opposite - it's common sense.
Ads are an important means for companies to get the word out for stuff you might want and they're in more places than the internet.
The industry suffers from inefficiencies, surely, but it's still net beneficial.
Someday we might have fewer ads, and better ads that we really want to see.
If you ever get into a position where you have a company, and you run ads, and you see how it affects your business, you might have a different view. For example, I'm helping with a small new business making a very cool niche product - they'd be dead without FB ads because only FB offers the kind of targeting necessary (just basic demographics really). None of us are big fans of FB but the business would not exist without them.
Businesses that have more scale and bigger budgets can hit demos a little easier, but there's only two real games in town for many businesses: G and FB.
And of course, we would never, ever advertise near porn, not for a second. In fact - if we felt that 'Tumblr' became well known or synonymous with porn, we wouldn't advertise there. As far as G, well, they have porn, but for whatever reason, it doesn't seem to affect their brand.
> And of course, we would never, ever advertise near porn, not for a second. In fact - if we felt that 'Tumblr' became well known or synonymous with porn, we wouldn't advertise there. As far as G, well, they have porn, but for whatever reason, it doesn't seem to affect their brand.
I don't understand how the first and last sentence are compatible, unless you're saying that you'd never advertise near a brand known for porn, and don't care otherwise? That's not a very principled stance, and it really does seem wastefully harmful to a lot of sites.
Having your ads next to porn is very different from having your ads inside a technology that may show porn.
For example, advertisers do not want their ads shown before a youtube clip if it contains porn.
Google is not seen as a 'porn brand' because ostensibly they are just technology - they help you find stuff.
Tumblr had a problem in that a considerable portion of their content was porn.
If it happens that Google develops this popular attribution, i.e. they become known as a 'porn brand', then people would stop advertising there - but I don't think this will ever be the case.
FYI - Google also has some content controls, they don't have ads for porn, they don't promote it etc.
Again, this is very easy social math. The 'proof' is not in any of my statements, rather, it's the consistent application by basically every ad agency: they don't want their ads with porn. But they're ok with their ads on a site that may happen to show it if there isn't a branding concern and there obviously is not with Google.
I realize that advertising provides actual value. It just seems that it became too much. It went from something that should be an add-on to the main income source for some of the biggest companies in the world. That's way too much power in the hands of advertising companies.
That's why I said the "I build something neat that people wanna use, and when I have enough users I exploit the shit out of them" (aka ad-based) business model needs to die.
There's data on fraud and chargeback rates for adult-content-related credit card payments. They're much higher than for other goods and services. Major payment processors have blanket bans on such transactions, and will often refuse to do business with companies that don't have such bans. And you can't buy ads if you can't pay for them.
I've wondered about this as well. Studies are difficult to find. The ostensible answer is the morals/ethics of advertisers, which, if not already bordering on an oxymoron, also makes no sense given that these same advertisers still happily run all the ads they can in e.g. Saudi Arabia. And they're certainly well aware that if 62% of men acknowledge watching porn, then that simply means 38% of men lie about watching porn. So it's not even cultural concerns, as might explain e.g. Saudi Arabia.
But I think there's probably a very simple answer. People in the mood for porn are going to be substantially focused on the content they're after, rather than getting distracted by your ad. Contrast this against something like a cat pic. Ah, that's cute -- then 2 seconds later your mind and attention is wandering, ideally right on over to their ad. The porn ends up being a much worse value proposition than the cat pic, or more generally than any rapidly consumed, low-effort, low-intensity content.
I think the exact same thing can also explain why advertisers prefer to avoid 'controversial' content. I expect the advertisers ostensibly kowtowing to social media outrage mobs are mostly just using them as useful idiots: the mobs point out controversial, and likely highly immersive, content. Pulling out not only gets the advertiser a few social media white knight points, but also generally results in some degree of controversy, which helps spread their brand name, effectively working as free advertising. You might even get things such as people deciding to follow you on social media as a means of virtue signaling. This points to another irony that I think supports the above hypothesis: advertisers are more than fine with controversy when they themselves are the controversy. See e.g. Gillette, which intentionally planned an advertising campaign exclusively around fanning the flames of a culture war.
I think the most real reason is that it's hard to develop a targeted ad profile of a user who is doing everything in their power to make sure this particular interest of theirs is isolated from every other aspect of their life. It's just not a productive use of ad money.
> As of March 1, 2019, Tumblr hosts over 459 million blogs. As of January 2016, the website had 555 million monthly visitors.
The numbers quoted in the article are barely more charitable. Basically it's about one monthly visitor per blog... which one would assume means the blog owners periodically check what their sites look like.
Was anyone actually using Tumblr for anything but porn nowadays?
> Was anyone actually using Tumblr for anything but porn nowadays?
What I've encountered there most frequently is blogs with "atmospheric" photos and images. I always thought that the most frequent use for Tumblr is the same as for Pinterest: collecting pics on the same theme, "moodboards".
Of course I've seen nudes there, but almost no real porn, i.e. no photos of sexual intercourse. I've rarely seen blogs entirely of nudes, they were mostly posted amongst other "atmospheric" pics.
> The number of monthly visitors per active, non-spam blog was, and remains, probably much greater.
For sure, but it doesn't seem to be stopping Tumblr from boasting that it has 461M blogs. Even if we charitably assume that as many as 0.1% of those are run by real, actual humans, that figure makes Tumblr look like a small platform with not many users -- particularly if, as you've noted, there are still tons of porn bots on it.
You might be right, but consider how 10% would mean their typical sites have 10 users per month on average. I'm struggling to imagine ~50 million people running a blog that no one -- not even their friends and family -- is visiting.
Even 1% active non-spam blogs, meaning 100 visitors per site per month on average, seems unlikely to me.
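Rough numbers, using the figures quoted upthread and assuming every visit lands on an active blog:

```python
monthly_visitors = 555e6  # January 2016 figure quoted above
total_blogs = 461e6       # March 2019 figure quoted above

for active_fraction in (0.001, 0.01, 0.10):
    per_blog = monthly_visitors / (total_blogs * active_fraction)
    print(f"{active_fraction:.1%} active -> ~{per_blog:.0f} visitors/blog/month")
# 0.1% active -> ~1204 visitors/blog/month
# 1.0% active -> ~120 visitors/blog/month
# 10.0% active -> ~12 visitors/blog/month
```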
Keep in mind that every single logged-in user has their own blog, whether they use it or not, whether anyone reads it or not. So, theoretically, "an average of one visitor per blog" would also be entirely consistent with "every single Tumblr view represents an active Tumblr user", because the number of people who visit Tumblr would thereby equal the number of blogs.
No shit! Like plenty of users I don't visit this place anymore as my blog is now empty of content (NSFW nude art blog).
But even before the ban, content creators (photographers in my case) were posting less and less, focusing instead on Instagr.am networking and Patreon.
Tumblr has a fantastic community of cartoonists, I have actually asked some cartoonists to put their designs on other platforms just so I can buy a t-shirt with some of their cartoons. Of course it is just a platform to showcase the free stuff and then pull people to their patreon, which makes perfect sense. Free blog hosting with X million teenagers.
> Advertisers want to be excluded from placements next to adult content
I feel that this is maybe Tumblr's ad team failing, and they could have argued around it. When using the Dashboard, the UI design felt very much like every post on that screen was there because you chose it to be there, so even if you injected ads into it, there was never the feeling that ads were next to pornography posts: your feed was only ever full of pornography if you actively followed people who posted it.
It was all manually curated by you, not an algorithmic system like Twitter, Giphy, Imgur, etc.
30% is a significant drop. It's really unheard of. MySpace and Digg didn't have such ridiculous drops.
And keep in mind that this could be the start of an overall downward trend. If 30% leave, then activity on the platform crumbles. Some of the remaining users notice the decline in activity and so they leave. Rinse, repeat. Once it reaches a critical mass, you can't stop the decline and tumblr is finished.
Or it could be a temporary decline, and a new plateau is reached at which Tumblr can survive.
So now we have a number for the porn traffic. The filter isn't perfect and porn is still there. I wouldn't be surprised if the real number approaches 40%.
But apparently Tumblr is fine with it, because surely they knew how much traffic it was driving? Or else someone did a truly shitty job with their website statistics.
I can only assume they expect the brand value to rise enough that better advertising profits will offset the loss of the porn traffic. I guess that's the harder number to estimate, and I have a feeling they're S.O.L., because all blogging platforms are struggling in the days of big services like Instagram and Facebook, and those don't even have to try to reverse a brand association with porn... Tumblr can pitch staying out of those big social networks as a win for privacy and for not selling your private life, but the problem is that literally billions of people are on those services. They offer unparalleled reach.
I'd wager that more than 30% of content consumed, and probably 60+% on Insta, is adult content. The difference here is that Tumblr didn't have the resources to mitigate the issue, so they chose a blunt cut-off.
From what I've seen by following photographers and artists, sometimes with "risque" content, Instagram is successfully strict about it. Sometimes people get away with female nipples minimally censored, but anytime that's involved you run a very high risk that your photograph will be taken down, or the whole account banned if you keep being "into" that in your feed. I think a lot depends on how many complaints Instagram is receiving. They're also OK with butts. Blood/gore/violence is pretty taboo, even in comic-book form. @sirjoancornella, for example, has had trouble with Instagram. That's about as adult as Instagram has let itself become, from what I can tell over the years.
Does Instagram put much effort into mitigating adult content? I can think of a shit ton of risque stuff on Instagram. If you're not logged in, you have to click a button saying you want to see adult content.
That points back to July 2018. It's not quite a smoking gun, as their GDPR banner went up in late May, ahead of the 25th May deadline; even back then people were complaining that their tumblr stats were tumbling.
Tumblr is just lazy nowadays. There are many active fandoms, though. The spark has been long gone since Yahoo got their old shaggy, greedy fingers on it. Pornography aside, it is a great place to host a blog: nice templates, support for many types of media. The disgusting part is the 3-4 pages of other filthy fingers (ads-trackers) that welcome you every time you go to <insert-name>.tumblr.com
> The disgusting part is the 3-4 pages of other filthy fingers (ads-trackers) that welcome you every time you go to <insert-name>.tumblr.com
I moved 1500+ posts (drawings, own content) from Tumblr to Jekyll just to avoid that page. It seems designed to repel visitors permanently. People cannot reasonably opt out of 300+ ads-trackers each time they check a single blog. Another great opportunity missed; as a competitor to WordPress and other blogging platforms, Tumblr will follow the path of Posterous.
More than a few comments here lay the blame on "puritanical Americanism" and, as an American, I'm wondering where these so-called Puritans are when it comes to gay wedding cakes, homeschooling, home-based cottage industries, Christian expression, and gun rights.
The problem with Tumblr is that they could never figure out how to make a profit from it. Yahoo just made things worse by giving full access to so many advertisers and trackers. Think Cambridge Analytica x50, but at least they documented it.
30 percent in three months is huge for a site this large; it's well beyond the critical mass needed to cause a permanent downward trend. The loss of one third of the community will have an impact on the rest, causing others to leave, and so on.
To be fair, I think it was more a joke about the acronym being overloaded. LGBT on its own is already longer than it needs to be, with two or three ways of saying 'attracted to people of the same gender or sex', but it then gets extended with two Qs (queer and questioning), two As (asexual and allies), a P for pansexual (which then causes arguments over whether bisexual people can be attracted to trans people), and more. To add to that, there are more distinctions which should arguably be included (such as being homoromantic vs homosexual, or genderfluidity/genderqueerness as distinct from being trans, or even being transgender vs transexual(?)). In short, I think GSM is a great suggestion, with the only possible problem being that at some point GSM people may not be a minority, so the term becomes a misnomer.
> LGBT on its own is already longer than it needs to be, with two or three ways of saying 'attracted to people of the same gender or sex'
Sure and that's on purpose. Your perception of it is not the idea this acronym is trying to push. It's not a label to say "homosexual" or "I like people of the same gender as my own".
It's about unifying the lesbian, gay, bisexual, and transgender communities under one umbrella. Individually they have different values, lifestyles, and challenges. Together they have a stronger voice and can tackle shared issues. They are very different identities.
As other communities grow bigger, they are welcomed into the acronym. That's the very point of it, and that's why it often gains letters.
Moving past the profanity and stupidity: there are so many fandoms on Tumblr. Literally thousands (if not millions) of Harry Potter, Supernatural, Twilight, Divergent, and many more fans have found refuge/a place to express themselves.
I know that "the internet is for porn", but, well, not ALL of it, and especially not all of Tumblr is serving pornography.
Those "free speech defenders" invariably defend right-wing, white supremacist, fringe science or conspiracy content they feel is being purged by some leftist agenda secretly controlling all social media.
Unsurprisingly, they're not going to defend a site which has become synonymous with SJWs, liberals and feminists.
Every one of those bullet points is an issue which predominantly affects women versus men. Implying that an equal number of men are stalked and harassed online and sent unsolicited nude pictures is simply incorrect.
Male safety in this regard is not of equal importance because male risk is so low in proportion as to be practically nonexistent.
IIRC, Tumblr users tend more often to be women, so it makes sense to advertise features towards women for a site whose intent is to draw in former Tumblr users. But... the site doesn't say those features are only available for women, so given that they're available for men as well, I don't see what the concern is.