YouTube AI deletes war crimes evidence as 'extremist material'

(middleeasteye.net)

674 points | by jacobr 2420 days ago

55 comments

  • Hasknewbie 2420 days ago
    Youtube's response regarding one of these videos documenting abuses (emphasis mine):

    > "we've determined that your video does violate our Community Guidelines and have upheld our original decision. We appreciate your understanding."

    Can someone explain to me why corporations, when interacting with customers regarding complaints/appeals, so often seem to have "don't forget to add insult to injury" as one of their mottos? Does that kind of patronizing tone sound polite to the ears of a PR drone?

    • viridian 2420 days ago
      If I remember correctly, the exact same message was delivered to Jordan Peterson a couple of weeks ago, before he sat down with the Google memo guy. He was in the middle of a Bible lecture series when Google banned his account and sent the exact same "we've determined that your video does violate our Community Guidelines and have upheld our original decision. We appreciate your understanding." message.

      It seems tone-deaf, especially since in cases like these there is no understanding to appreciate. Google will not tell you what you did to violate policy, only that they checked to ensure they found you guilty, and then they snub you further with the HR speak. It's maddening.

      • cvsh 2420 days ago
        The worst part of the information age is arbitration by unreasonable and impenetrable algorithms rather than humans with the capacity to make a judgement call when the rules clearly don't account for the situation at hand.

        A transparent appeals process staffed by humans who can at least deliver a rationale, including what rule you broke, should be required by law. There's irreparable reputational damage associated with an algorithm libelously labeling something "extremist content" that isn't.

        • viridian 2420 days ago
          I think the bigger issue is that certain companies have near monopolies in their spaces to start with. For plebs like myself, youtube is really the only viable option I have to distribute video media if I hope to build an audience. The fact that you effectively can't mount an alternative to facebook, youtube, etc due to network effects is the larger disease, and this is one of many symptoms.
          • the8472 2420 days ago
            I think the only solution is a distributed and decentralized web.

            Distributed hosting of static content is a sorta-solved problem. But curating, linking and discoverability (which require mutating content) are a lot harder due to the trust anchor problem.

            • stephen82 2420 days ago
              Your suggestion exists and has a name: bitchute. I will paste here what they have for "About" at the end of their main page:

                 BitChute is a peer to peer content sharing platform. 
                 Our mission is to put people and free speech first. 
                 It is free to join, create and upload your own content to share with others.
              
              Feel free to read more about it in their FAQ. I really want to stop using YouTube and use this instead.

              I hope they make it.

              https://www.bitchute.com/

              • vidarh 2420 days ago
                The big challenge here is that almost everyone has a "one step too far" when it comes to what type of content we are willing to tolerate, and/or what type of content we may get in trouble for hosting even unintentionally.

                That makes it tricky for solutions that "put people and free speech first" to succeed: they've basically painted a giant target on themselves, and even many people who sympathise in principle end up worried about the bits and pieces that step over their personal line.

                Figuring out a reasonable solution to this, I think, will be essential to getting more widespread adoption of platforms like these.

                • the8472 2420 days ago
                  I think the problem is the expectation of people that someone else do the filtering for them. I.e. "I don't want to see this kind of content" leads to "someone else should remove it from all the sites I visit". Which obviously leads to conflicting requirements once you have more than one person and those people disagree on what they want to see and don't want to see.

                  The only reasonable solution is to host everything, modulo requirements by law, and give users the tools to locally filter out content en masse.
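
                  A minimal sketch of what such local, user-chosen filtering could look like (Python; every name and tag here is hypothetical):

                      # Hypothetical client-side filter: the network hosts
                      # everything; each user locally subscribes to blocklists
                      # published by curators they happen to trust.
                      subscribed_blocklists = [
                          {"gore", "spam"},   # curator A's list
                          {"spam", "scams"},  # curator B's list
                      ]
                      blocked = set().union(*subscribed_blocklists)

                      def visible(item):
                          """Show an item only if none of its tags are blocked."""
                          return not (item["tags"] & blocked)

                      feed = [
                          {"title": "cat video", "tags": {"cats"}},
                          {"title": "junk", "tags": {"spam"}},
                      ]
                      print([i["title"] for i in feed if visible(i)])  # ['cat video']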

                  In a decentralized system you also skip the law requirements, since you cannot enforce multiple incompatible jurisdictions at the platform level; individual users will be responsible for enforcing them on their own nodes, similar to how all you can do when accidentally encountering child porn is to clear your cache.

                  • vidarh 2420 days ago
                    The problem on these distributed platforms is not filtering what people see, but filtering what people host or allow to transit their network connections.

                    > In a decentralized system you also skip the law requirements, since you cannot enforce multiple incompatible jurisdictions at the platform level; individual users will be responsible for enforcing them on their own nodes, similar to how all you can do when accidentally encountering child porn is to clear your cache.

                    But that's the thing: You don't skip it. You spread it to every user. They both have to deal with whether or not they are willing to host the material and whether or not it is even legal for them.

                    How many of us sympathise with the idea of running a Tor exit node, for example, but avoid it because we're worried about the consequences?

                    These platforms will always struggle with this unless they provide ways for people to feel secure that the content that is hosted on their machines is content they don't find too offensive, and/or that traffic that transit their networks is not content they find too offensive.

                    Consider e.g. darknet efforts like cjdns, which are basically worthless because their solution to this was to require that people find "neighbours" they can convince to let them connect. Which basically opens the door to campaigns to have groups you disapprove of disconnected by harassing their neighbours and their neighbours' neighbours, just the same as you can go to network providers on the "open" internet.

                    • the8472 2420 days ago
                      First of all, not all p2p networks operate like Tor. For example bittorrent and ipfs only host content you look at. So hosters could largely self-select the content they replicate.

                      Secondly, there are several tiers of content. a) stuff that is illegal to host b) stuff that is not illegal but that you find so objectionable that you don't even want to host it c) stuff that you don't like but doesn't bother you too much d) stuff you actually want to look at. I posit that a) and b) are fairly small fractions and the self-selection mechanism of "things that I looked at" will reduce that fraction even further.

                      And even if you are on a network where you randomly host content you never looked at, encryption can provide you some peace of mind (of the obliviousness kind), because you cannot possibly know, or be expected to know, what content you're hosting. Add onion routing and the person who hosts something can't even be identified. If Viewer A requests something (blinded) through Relay B from Hoster C, then B cannot know what it is forwarding and C cannot know what it is hosting. If neither you nor others can know what flows through or is stored on your node, it would be difficult to mount pressure against anyone to disconnect.
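
                      As a toy illustration of that obliviousness argument (a sketch only, using the Python cryptography package; this is not any real network's protocol): the hoster stores ciphertext it cannot read, the relay forwards opaque bytes, and only the viewer holds the content key.

                          from cryptography.fernet import Fernet

                          # The content key travels out of band (e.g. embedded in
                          # the share link); Relay B and Hoster C never see it.
                          key = Fernet.generate_key()
                          f = Fernet(key)

                          # A publisher stores ciphertext under an opaque id on C.
                          hoster_c = {"blob-42": f.encrypt(b"some video bytes")}

                          def relay_b(blob_id):
                              # B forwards opaque bytes; with padding and onion
                              # layers it also wouldn't learn who asked for them.
                              return hoster_c[blob_id]

                          # Viewer A, who holds the key, recovers the plaintext.
                          assert f.decrypt(relay_b("blob-42")) == b"some video bytes"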

                      For the illegal content, especially in oppressive environments, you could install a Voluntary Compliance(tm) government blocklist on public-facing nodes and still opt to run an internal node in your network that uses encrypted connections to retrieve things hosted in other countries you're not supposed to see.

                      ----

                      Anyway, back to filtering for decentralized content hosting. I think once you have a network it is a matter of managing expectations. You can't make content magically disappear. Platforms like youtube, twitter, facebook etc. have raised the false expectation that you can actually make things go away by appealing to The Authority and it will be forever gone. In reality things continue to exist, they just move into some more remote corners of the net. Once expectations become more aligned with reality again and people know they can only avoid looking at content but not make it non-existent things boil down to being able to filter things out at the local level.

                      • rmc 2418 days ago
                        > And even if you are on a network where you randomly host content you never looked at, encryption can provide you some peace of mind ... If neither you nor others can know what flows through or is stored on your node, it would be difficult to mount pressure against anyone to disconnect.

                        I think you misunderstand the objection. Yes, encryption can mean you cannot be prosecuted for "hosting"/"transmitting" some objectionable stuff, since you can prove that you had no idea (at least that's the theory).

                        However, some want to be able to "vote with their wallets" (well, "vote with their bandwidth"). They don't want to assist in the transmission of some content; they want that content to be hard to find, and slow and unreliable. They have the right to freedom of association and don't want to associate with those groups. Encryption cannot guarantee that I won't help transmit $CONTENT.

                      • vidarh 2420 days ago
                        > First of all, not all p2p networks operate like Tor. For example bittorrent and ipfs only host content you look at. So hosters could largely self-select the content they replicate.

                        I'm aware of that, but then you suffer the problem of people wanting deniability.

                        > Secondly, there are several tiers of content. a) stuff that is illegal to host b) stuff that is not illegal but that you find so objectionable that you don't even want to host it c) stuff that you don't like but doesn't bother you too much d) stuff you actually want to look at. I posit that a) and b) are fairly small fractions and the self-selection mechanism of "things that I looked at" will reduce that fraction even further.

                        That's true, but those sets pretty much only need to be non-zero to threaten people's willingness to use such a network.

                        Further, unless there is stuff in a), and stuff that falls into b) for other people, that you want to look at, such a network has little value to most of us, even though we might recognise that it is good if such a network exists for the sake of others.

                        This creates very little incentive for most to actively support such systems unless those systems also deal with content that we are likely to worry about hosting/transmitting.

                        > For the illegal content, especially in oppressive environments, you could install a Voluntary Compliance(tm) government blocklist on public-facing nodes and still opt to run an internal node in your network that uses encrypted connections to retrieve things hosted in other countries you're not supposed to see.

                        That's an interesting thought. Turning the tables, and saying "just tell us what to block". That's the type of ideas that I think it is necessary to explore. It needs to be extremely trouble-free to run these types of things, because to most the tangible value of accessing censored content is small, and the intangible value of supporting liberty is too intangible for most.

                        > Anyway, back to filtering for decentralized content hosting. I think once you have a network it is a matter of managing expectations. You can't make content magically disappear. Platforms like youtube, twitter, facebook etc. have raised the false expectation that you can actually make things go away by appealing to The Authority and it will be forever gone. In reality things continue to exist, they just move into some more remote corners of the net. Once expectations become more aligned with reality again and people know they can only avoid looking at content but not make it non-existent things boil down to being able to filter things out at the local level.

                        This, on the other hand, I fear is a generational thing. As in, I think it will take at least a generation or two, probably more. The web has been around for a generation now, and in many respects the expectations have gone the other way - people have increasingly come to be aware of censorship as something possible, and are largely not aware of the extent of the darker corners of the net.

                        Centralisation and monitoring appear to be of little concern to most regular people. People increasingly opt for renting access to content collections where there is no guarantee content will stay around, instead of ensuring they own a copy, and so keep making themselves more vulnerable, because to most, censorship is something that happens to other people.

                        And this both means that most people see little reason to care about a fix to this problem and have an attitude that give them little reason to be supportive of a decentralised solution that suddenly raises new issues to them.

                        Note that I strongly believe we need to work on decentralised solutions. But I worry that no such solution will gain much traction unless we deal with the above issues in ways that removes the friction for people of worrying about legality and morality, and that provides very tangible benefits that gives them a reason to want it even if they don't perceive a strong need on their own.

                        E.g. Bittorrent gained the traction it has in two ways: through copyright infringement and separately by promising a lower cost way of distributing large legitimate content fast enough. We need that kind of thinking for other types of decentralised content: At least one major feature that is morally inoffensive and legal that attracts people who don't care if Facebook tracks them or Youtube bans a video or ten, to build the userbase where sufficient pools of people can form for various type of content to be maintained in a decentralised but "filtered" manner. Not least because a lot of moral concerns disappear when people feel they have a justification for ignoring them ("it's not that bad, and I need X")

                        I genuinely believe that getting this type of thing to succeed is more about hacking human psychology than about technical solutions.

                        Maybe it needs a two-pronged attack - e.g. part of the problem is that the net is very much hubs and spokes, so capacity very much favours centralisation. Maybe what we need is to work on hardware/software that makes meshes more practical - at least on a local basis. Even if you explicitly throw overboard "blind" connection sharing, perhaps you could sell people on boxes that shares their connections in ways that explicitly allows tracking (so they can reliably pass the blame for abuse) to increase reliability and speed, coupled with content-addressed caching on a neighbourhood basis.

                        Imagine routers that establish VPN to endpoints and bonds your connection with your neighbours, and establishes a local content cache of whitelisted non-offensive sites (to prevent a risk of leaking site preferences in what would likely be tiny little pools).

                        Give people a reason to talk up solutions that flattens the hub/spoke, and use that as a springboard to start to make decentralisation of the actual hosting more attractive.

                    • mirimir 2419 days ago
                      > But that's the thing: You don't skip it. You spread it to every user. They both have to deal with whether or not they are willing to host the material and whether or not it is even legal for them.

                      > How many of us sympathise with the idea of running a Tor exit node, for example, but avoid it because we're worried about the consequences?

                      Tor isn't the best example, because exits don't cache anything. So mainly, exit operators get complaints. And the exit IPs end up on block lists. Operators don't typically get prosecuted. Maybe they get raided, however, so it's prudent to run exit relays on hosted servers.

                      Freenet is the better example. The basic design has nodes relaying stuff for other nodes. In an extremely obscure and randomized way. Also, keys are needed to access stored material.

                      However, nodes see IPs of all their peers. Investigators have used modified clients to identify nodes that handle illegal material. So users get busted. There is "plausible deniability". But it's not so plausible when prosecutors have experts that bullshit juries. So users typically plea bargain. Or, if they use encrypted storage, they get pressed for passwords. Like that guy in Philadelphia.

                      • vidarh 2419 days ago
                        It doesn't matter if operators get prosecuted or not. What matters is whether people in general see running exit nodes as somewhat risky. Unless there is a reasonable perceived payoff, even a very minor perceived cost will be enough to stunt the growth of such a network severely.

                        Same goes for freenet and the like.

                        • mirimir 2419 days ago
                          True. I don't run Tor exits from home.
                    • tripzilch 2419 days ago
                      While I don't disagree with your argument per se (not sure if I quite agree either, though), note that avoiding a decentralised platform because of being "worried about the consequences" is not necessarily the same thing as worrying that "the content (..) is content they don't find too offensive".

                      The first includes both legal and moral considerations, the second only moral ones.

                      My consideration of whether to share, part of the time, some slice of my home Internet connection bandwidth as a Tor exit node is almost entirely a legal one (I admit that time/effort may play a role too). I'd consider the moral aspect too, but I wouldn't have to think long to decide that (for me personally) the trade-offs are worth it (I could explain why and how, but I don't want to derail the discussion in that direction).

                      In fact I'd argue this goes for anyone, in some sense. Even if their underlying reasons align with the legal considerations (and they thus don't run one), it's a moral judgement. (In the worst case, there exist people who equate moral judgement with legality.)

                  • djrogers 2420 days ago
                    > I.e. "I don't want to see this kind of content" leads to "someone else should remove it from all the sites I visit".

                    I don’t think that’s always the mindset. Isn’t it reasonable for people to have the mindset of “I want to go to sites that don’t have content that I find objectionable”? This way websites can decide which group of people they want to cater to.

                    • the8472 2420 days ago
                      The context of the discussion is large websites acting as platforms. Their users are bound to have conflicting views about what's "objectionable". So when the moderation mechanism is deletion instead of letting users just filter then the website has to preferentially treat one group instead of being a platform for everyone.
                      • chii 2419 days ago
                        But the deeper issue is that some people don't want certain content to even exist, and won't be satisfied with just a filter (even though they can't tell the difference between a filter and deletion).
                  • averagewall 2420 days ago
                    I don't think Youtube censors extremist content because "I" don't want to see it. It's because "I" don't want anyone else to see it! There's no use me filtering my own videos if my goal is to limit what other people see.
                    • chii 2419 days ago
                      This is at the heart of the issue: should platforms bow down and allow some users to dictate what others are allowed to see? Or should a platform remain neutral towards everything (which means potential backlash)?
                  • likelynew 2420 days ago
                    > modulo requirements by law

                    Which law?

                    • bryanrasmussen 2418 days ago
                      German law against Pro-Nazi stuff would be an obvious example. Then child pornography. 'Normal porn' in some cultures...

                      The thing is, legal requirements generally do not just let you clear your cache of the offending content; the company is not allowed to show it at all.

              • rmc 2418 days ago
                Warning: it has this channel https://www.bitchute.com/channel/whitepower/ which is full of anti-semitic nazi stuff. It's visible on the front page listed above. You may or may not want to visit that link.

                Channels aren't open to everyone, so it looks like they have manually allowed that?

              • the8472 2420 days ago
                Webtorrent does not meet the distributed requirement since webrtc needs signalling servers.

                Plus the discovery component is still hosted on websites subject to the networking effect.

            • chongli 2420 days ago
              > I think the only solution is a distributed and decentralized web.

              I love the idea but one problem: who pays for it? It's a special case of the co-operative vs corporation problem. Without an individual's starting capital, how do you get off the ground?

            • flamedoge 2420 days ago
              Webtorrent is another torrent-based option for media sharing.
          • ocdtrekkie 2420 days ago
            Agreed. If competition exists, customer service is one of the angles competitors can improve on to try to gain customers. For instance, one of the top reasons I prefer FastMail over Gmail is the real customer service I get.
        • Retra 2420 days ago
          That's not the worst part at all! Anyone could file a complaint about a mistaken algorithm; the companies are run by people, and they'll have to address their customers' concerns. The worst part is that humans use these algorithms to justify and enforce their own shitty decisions; to say it is "out of their hands" due to some inane, manufactured inconvenience.

          >There's irreparable reputational damage associated with an algorithm libelously labeling something "extremist content" that isn't.

          The damage is definitely repairable. You admit the algorithm is flawed and you tolerate exceptions to it.

          • ScottBurson 2420 days ago
            I think they were talking about the content producer's reputation.
        • erikpukinskis 2420 days ago
          This seems like a false dichotomy. There is nothing preventing a human staff from declining to provide an explanation, nor is there any reason a machine learning algorithm couldn't summarize its reasons.

          The issue isn't AI vs human, it's transparent vs opaque.

        • anigbrowl 2420 days ago
          The opacity is by design. It's notable that such behavior by an individual would typically be considered psychologically abusive. It's one reason I talk about power relations frequently; we are in the throes of automating them, and given the impact of technology on other spheres of human activity we should be wary of what sort of social relations we are baking in.

          Perhaps the Graph should be public domain. Perhaps too we are heading towards a world where reputation and legal identity are subject to casual destruction but there's no real barrier to starting over, much like when you die in a videogame.

        • dunkelheit 2419 days ago
          I think the notion that machine learning algorithms are "unreasonable and impenetrable" is seen as a huge PR boon by these companies as it shifts the responsibility away from actual humans. So they try hard to promote it.

          The fact is that there is always a human in the loop. Without human supervision these algorithms deliver a small but significant portion of incredibly stupid results. So an actual human has to sit down, analyze these results one by one and decide what to do (in some cases just hardcoding the "correct" answer). The general public must be educated about this stuff so that responsibility is not muddled.

          • jjoonathan 2419 days ago
            Yep. The usual incarnation of the scapegoat is "policy." It sounds much better to blame a byzantine rulebook (which is the perfect tool for diffusion-of-responsibility) than to reveal that the strategists have decided to throw a subset of customers under the bus. In the case of monopoly, sometimes it's not even a subset.

            Incidentally, this also explains why there is zero interest in making rulebooks available, concise, searchable, etc. All of these would improve fairness, but rulebooks are actually an instrument of power, not of fairness, so existing power structures will typically oppose any such changes.

        • AmIFirstToThink 2419 days ago
          I mean, why not wait till it is deemed illegal by the authorities and due process? How can you take away someone's freedom of speech?

          It's not like the AI has an absolute idea of what 'extremist' content is; it's just enforcing someone's idea of what it is. AI is trained on data, and whoever labeled that data is who is really winning here.

          • ucaetano 2419 days ago
            > How can you take away someone's freedom of speech?

            Nobody is taking away your freedom of speech by deleting your video.

            Nobody is mandated to provide you with a vehicle or medium for your speech.

            • AmIFirstToThink 2418 days ago
              Anyone can deny any service to anyone for any reason?

              Are you sure you are sold on the right outcome of the baker/LGBT wedding-cake case? How about a pharmacist not presenting correct/all options based on their theology?

              How about a publicly traded corporation? Do they have a mandate to treat people equally? If they are picking a political viewpoint and removing customers because of it, what makes you think that their hiring practices are fair?

              Google has a religion now; it has been baptized in the religion of the intolerant left. Google is now theocratic: it will not allow blasphemous talk that challenges its religion.

              Google claimed to champion net neutrality: don't open the packet, they said to the ISPs. They want to resist opening the TCP/IP packet, but when it comes to the content of videos, they want to play God. TCP/IP packet or video, let the legal system take its course, let the authorities tell you to ban something, and don't play God on a platform that is valuable because of the sum number of people on it. YouTube is a social network; its value comes from people participating in it. Treat the people equally and be a neutral steward of the platform technology; don't push ideology. Anyway, Google has damaged its image too much now. It will never be seen with the same affection again, at least not by me.

        • likelynew 2420 days ago
          I am in communities where I am fine with content being removed or penalized without a rationale being delivered (HN included). I definitely don't want government intervention everywhere.
        • Buge 2420 days ago
          These takedowns were done by humans. The videos were flagged by AI for human review.
          • gaius 2419 days ago
            > The videos were flagged by AI for human review

            The training parameters of that "AI" were set by a human too. Someone said to it "here are a bunch of videos that I PERSONALLY THINK are to be banned, learn from that".

        • tankenmate 2420 days ago
          I think the biggest problem is that, unlike with a judicial decision, the thought process (including any highlighted pros and cons) that accompanies the decision is entirely missing.
        • ingenuous2 2420 days ago
          This has always existed. We call it bureaucracy.
          • cvsh 2420 days ago
            Bureaucracies are staffed with humans who have the capacity to make a judgement call on how to interpret the rules, and to bend their letter to serve their spirit, or the interests of public relations, or just common sense.
            • detaro 2420 days ago
              Do we know how much YouTube involves humans in the process? I wouldn't be surprised if appeals go to a human who clicks the "yeah, nope" button.
            • ingenuous2 2417 days ago
              I disagree. Have you ever been to the CA DMV with a photo of your license (that you lost/had stolen)? I have. They told me I needed a copy of my lost license to get a new one, or else the man couldn't validate my identity.

              Bureaucratic hell, as defined by Harry Harrison in the Stainless Steel Rat series, is the definition of humans as automata.

        • nnfy 2420 days ago
          This isn't just some AI gone wrong. YouTube has had an agenda for years. At the very least, the people with power to undo these bans are complacent; there is simply no way that YouTube staff are unaware of the gradual crackdowns. Even content producers will mention demonetization occasionally.

          Nothing major online happens by accident.

        • ubernostrum 2420 days ago
          > The worst part of the information age is arbitration by unreasonable and impenetrable algorithms rather than humans

          The thing is, people only tend to notice it when it affects them personally (either they are the victim of the algorithm, or someone they know/like/support is). The world has long worked on irrational biases, which now are being used as the training data for decision-making systems which are subsequently declared to be "objective" because people believe an algorithm can't be biased. And increasingly, the mark of privilege is having access to a system -- applications, interviews, customer service, even courts -- which will use human judgment instead of an unreviewable algorithm.

          For more on the topic I suggest the book Weapons of Math Destruction by Cathy O'Neil.

        • pmoriarty 2420 days ago
          To be fair, the process needs to be much more than merely transparent. It needs to be independent.

          That means that it needs to be done by an entity outside of Google itself, and not in any way associated with or influenced by Google.

          • hossbeast 2420 days ago
            Google is a company. You want video hosting to be run by the government?
            • cloakandswagger 2420 days ago
              - The services provided by Google and Facebook have an unassailable majority of market share and are relied on by a huge number of people.

              - They enjoy a de facto monopoly and are protected by the extreme cost, risk and time involved in building competing services.

              - Finally, they have a potential for abuse (say, with selective censorship or politically biased algorithms) that could essentially curb the Constitutional rights of individuals.

              If these points sound familiar it's because they're frequently used when arguing for the nationalization of a private company. Since I think that's (currently) out of reach, I believe regulating Facebook, Google, et al as public utilities to be the next best thing.

              • briandear 2420 days ago
                Trusting government to regulate Facebook? No way. I lived in China; I have seen how that story plays out.

                Some people really do have a naïve trust in government. Free markets are the answer. Who has actually made a legitimate attempt to compete with Google or Facebook? What VCs are investing in Facebook alternatives?

                MySpace was unstoppable – until it wasn’t. Yahoo owned search – until it didn’t. Perhaps there ought to be more bold entrepreneurship rather than calls for regulation.

                Sounds to me that people are ok with just giving up and giving Facebook the win.

                Don’t like Facebook’s dominance? Then challenge it. Don’t cop out and just let the government take control.

                History is littered with great companies toppled by better ideas and execution.

                • CamperBob2 2420 days ago
                  > No way. I lived in China; I have seen how that story plays out.

                  It's naive to think that any one form of human organization, be it governmental or corporate, is somehow less corruptible than another. You're right to be on your guard against governmental abuses, but don't take your eye off the other balls in play.

                  > History is littered with great companies toppled by better ideas and execution.

                  What we've seen lately are instances where one company topples another and proceeds to commit the same abuses, only more effectively and at wider scale. When Facebook replaced MySpace, were its users really that much better off? Which company had fewer rules and enforced fewer content guidelines? When one company dominates the market and locks it up with network effects, what incentive does that leave them to play well with others?

                  • thaumasiotes 2420 days ago
                    > It's naive to think that any one form of human organization, be it governmental or corporate, is somehow less corruptible than another.

                    You don't think any organizations have ever been any more corruptible than any other organizations?

                    • CamperBob2 2420 days ago
                      Not once they reach a certain size, no. It turns into a pointless exercise in moral relativism. Joe brings up the Soviets, Jane counters with the East India Company. Bob rants about Trump, Betsy pulls up the Wikipedia article on Union Carbide. Sally complains about police brutality, Sam lectures her on the history of the Pinkerton Agency. Hank sticks up for the UAW, Mary criticizes the Teamsters.

                      None of these organizations should have been trusted implicitly to do the right thing for society at large. The burden of proof rests decisively with those who want us to believe that Google and Facebook are somehow different.

                      • PeanutCurry 2419 days ago
                        All of these organizations operated at their zeniths in different time periods, under different governments, cultural norms, and a variety of other factors. The problem with the conversation you're describing isn't moral relativism, or that everyone in it has equally valid points; it's that none of the participants seem capable of isolating which elements of those organizations worked or failed, without endorsing or criticizing the organization as a whole, while still presenting a cogent argument.
                      • thaumasiotes 2420 days ago
                        We could save a lot of money and time by replacing the Supreme Court and Congress with one person each. Do you think there might be any disadvantages?
                • pmoriarty 2420 days ago
                  "Trusting government to regulate Facebook? No way. I lived in China; I have seen how that story plays out."

                  Are you really claiming that every government that tries to regulate corporations is going to wind up like China?

                  You know there are lots of governments around the world that regulate corporations, and most aren't anything like China.

                  "Some people really do have a naïve trust in government. Free markets are the answer."

                  Some people really do have a naive trust in free markets.

              • ucaetano 2420 days ago
                And please explain: how would you regulate them? They have nothing in common with how utilities work, so none of the utility regulatory models would work.

                What are you actually proposing?

            • CamperBob2 2420 days ago
              > Google is a company. You want video hosting to be run by the government?

              I'm with you. I don't want that, but at some point I expect to lose the argument. Google will cut their own throats with their smug "We investigated our decision and found it to be correct" pronouncements.

              The fact is, any sufficiently dominant corporation is indistinguishable from a government. The more a company like Google behaves like a bureaucratically-hidebound public utility, the harder it will be to argue that it shouldn't be regulated like one.

            • geocar 2420 days ago
              Yes.

              An alternative is to classify Google as a common carrier, exempting them from the DMCA but preventing them from censoring or even throttling traffic. However, given that their business model is built around sponsorship, it is unclear how to also protect the advertisers' interests. Trying to get government shackles onto Google simply seems too tricky.

              It seems much easier to simply run Google with tax dollars and no advertising.

              • dom0 2420 days ago
                Google and Facebook are bigger than a single country now.
                • geocar 2419 days ago
                  This is a strange way to define "bigger".

                  Google and Facebook combined have a total market cap (sum of all shares) of around 1.2 trillion dollars, which would represent only a 5% increase in tax revenue for America to simply buy all the shares.

                  However, the Government doesn't need to turn a profit: Google and Facebook combined spend only around $200 million per year on R&D and operating expenses, which would be a rounding error on the tax budget.

            • squarefoot 2420 days ago
              There is very little difference between a company and a government when the first can lobby (read: bribe) the second.

              I for one would welcome my files being hosted by the government, if we lived in a world where democracy weren't more utopian than flying pink unicorns; as a citizen I would have a slight chance of being respected and listened to, because I'd be a part of that, albeit a very small one, whilst with a company you have zero chance unless you're a stockholder or work there in some high rank. That's something to keep in mind next time they want to brainwash people about how good the privatization of public property is.

              We're slowly but steadily going towards a future where governments will first be owned by corporations, then will cease to exist or be relegated to a purely PR role (think about the royal families in nations still having them). That will likely mark the start of the worst period humanity will ever live in.

              • ekianjo 2420 days ago
                Governments are not owned by corporations in any way. Governing bodies have way more budget and powers, such as the use of military and the police when they see it fit. Of course companies try to influence or lobby officials but in the end governments are not owned by anyone, and the more officials you have the more unlikely it is to bribe them all.
            • pyrale 2419 days ago
              AT&T, Verizon, and Comcast are also companies. But Google and FB campaigned to have them labeled as utilities and regulated accordingly.

              Let us simply label Google (search, yt, news) and FB as utilities and regulate them too.

            • pmoriarty 2420 days ago
              It doesn't have to be the hosting itself that's independent, but it would be an improvement if there was an independent body to which you could appeal.
            • the8472 2420 days ago
              How about by a non-profit?
      • chongli 2420 days ago
        Whenever I see these stories, it reminds me of the phrase "the lights are on but nobody's home." Google gives the impression that nobody actually works there. That the whole thing is just an automated system and that sometimes people fall through the cracks.
      • ryan-allen 2419 days ago
        His channel was reinstated, which... was nice, I guess.

        I have watched hours upon hours of his videos (I've been a fan since long before his PC controversy; I love personality theory and the twist he adds to it).

        I'm pretty centre left as far as I'm concerned and his videos do not in any way promote anything nasty. He's completely upstanding. I have no idea why they'd ban his channel unless there was a coordinated flagging.

      • dillweed 2419 days ago
        It's a strange corporate culture, that's for sure.
      • autokad 2420 days ago
        To me, this is further evidence that Google needs to be put under regulation.
      • dublinclontarf 2419 days ago
        I'm hoping that with the rollout of GDPR this will come to an end.
      • lanevorockz 2420 days ago
        Wasn't Google's message to Jordan Peterson "Stop making sense, we have an agenda to pass"?
    • monksy 2420 days ago
      There are 2 uses for that language.

      1. It's to add insult to injury while attempting to soften the blow

      2. It's an attempt to deny that they have power to do what they did.

      For the insult to injury: this is a technique that falls under "thinking past the sale" (http://blog.dilbert.com/post/129433801521/thinking-past-the-...). The context before the "understanding" part has basically put you in the frame of: "I've done something bad and now I'm being punished." The closing phrase "We appreciate your understanding." then puts you in the position of understanding that they (plural people) would like for you to understand their decision.

      In short, you're no longer in a position where you can really fight back directly on the issue at hand. You're reminded that you are fighting against an organization if you disagree. It's predatory and manipulative.

      The second part: it's a manipulative attempt to prevent you from attributing ill will to the offending party. They're attempting to "soften the blow" because they appear to be reaching out.

      In both of these possibilities, the biggest issue is that it's incredibly manipulative, and it's much more insulting once you notice it. The organizations and people who use those statements should be doomed to be constantly rejected, in this same passive-aggressive manner, in everything they do and want for the rest of their lives. It's incredibly anti-social behavior.

      Unfortunately, we don't have social punishments for shitty behaviors like this.

      One last note: this is the equivalent of the "apology" "I'm sorry that you feel that way" or "I'm sorry this didn't turn out the way you had hoped."

    • bobdole1234 2420 days ago
      It's because every reason you give people is something they can use to sue you or harass your employees if they don't agree with the wording.

      If you say nothing, there's nothing to grab on to.

      It's like online dating, everything you say is something that someone will dislike about you.

    • cabalamat 2420 days ago
      > Does that kind of patronizing tone sound polite to the ears of a PR drone?

      I don't know, but you're right it is very common, and infuriating.

      "Your call is important to us. Please hold."

    • rdtsc 2420 days ago
      I think a passive-aggressive "fuck you" is an important element because it provides enjoyment and some satisfaction to the corporate drones you are interacting with.

      I think allowing low-level minions with sadistic tendencies to express that sadism via a kthxbye-but-actually-fu here and there is used to reward them for an otherwise boring and unfulfilling task. In this case it is a bit indirect, because it is the developer who wrote the code, not necessarily a clerk at the counter or a customer service representative in a call center.

      I don't remember the incident; I think it was when someone was fired after some public incident at one of the tech conferences (PyCon, I believe), where the HR person commenting on the firing said something ridiculous like "we reached out to the developer and told them we'd be letting them go". Which I remembered because it sounded like such a massive and rude passive-aggressive "fuck you".

    • alexryan 2418 days ago
      I don’t think that Larry and Sergey yet fully grasp the consequences of allowing an echo chamber to form within google. One of the consequences of living in an echo chamber is that we rob ourselves of the feedback that we need to make good decisions. For example, many of us on the left who only listen to those who think like us are blissfully unaware of the severity of the storm that is building on the right and the consequences for the business.

      Google’s whole future business model seems to be based on getting deeper into our lives, into our homes, into our vehicles and gathering ever more data about us so they can more effectively help others to market to us. Many of us have been completely okay with that in past because we trust Google. They’ve worked hard to earn that trust. But with the public shaming and firing of James Damore, the blacklisting of “non believers”, the demonetizing and deleting of YouTube videos that violate the “Code of Conduct”, etc. the bonds of trust have been shattered. And once trust has been shattered, it is nearly impossible to re-build.

    • pjscott 2420 days ago
      > Does that kind of patronizing tone sound polite to the ears of a PR drone?

      Yes, it does. And they're not entirely wrong. Consider this video, in which the Uber CEO talks to someone like a real person instead of "respectfully" brushing him off:

      https://www.youtube.com/watch?v=vW50dnWVfGU

      It was a PR disaster. If he'd just ignored what's-his-name as an insignificant peasant, nothing bad would have come of it.

    • tehwebguy 2420 days ago
      The big secret is that YouTube, one of the most important sites in the history of the internet, has had a bunch of people who just don't care.

      Word is at some point it became a dumping ground for Google employees who transferred in because they wanted an easy job where they could use the amenities of the YouTube offices in San Bruno.

    • OzzyB 2420 days ago
      > We appreciate your understanding.

      It's their way of saying that they know you won't find their decision popular but they hope you won't pursue it any further.

    • lugg 2419 days ago
      It comes down to the second thing I remember learning in high-school-level programming: always use neutral language in your program text, and don't make jokes.

      Google does have a habit of making jokes in their software (the Chrome crash page comes to mind), so I don't really find it very surprising to see this kind of message.

      To be clear, it's really only an insult when you don't know why your content was removed. If you know why, and can see their side, it makes sense and wouldn't seem so insulting. But in a case like this, it comes off as insulting because of the actual situation.

      Removing the text altogether and simply stating the facts would have sent the intended message perfectly.

      And finally, it was likely just a developer, not a PR drone.

    • sharmi 2419 days ago
      All the responses are ascribing a nefarious ulterior motive to that piece of text. It could well be true, as these orgs have enough resources and data to A/B test it to hell.

      Or, it could be that, long long ago, that phrase actually conveyed sincerity that won customers.

      Soon, that became the next big customer retention tactic to apply. Next, it became cliché. Now it is just grating on our senses to hear the fake sincerity echoing from these huge organizations all around.

    • mirimir 2420 days ago
      So I'm wondering whether this HN exposure will be enough for some sensible humans to review these takedowns. I mean, is it embarrassing enough? I recall that Facebook reacted rather quickly after its "AI" took down videos inappropriately.
    • snerbles 2420 days ago
      Internal studies probably correlate such language with reduced pushback from users.
      • mcguire 2420 days ago
        The wall of jello defense. Works well.
    • tripzilch 2419 days ago
      It's somewhat similar to emailing someone a request and signing off with "thanks in advance", implying not just the request but the expectation of honouring it, too.
    • QAPereo 2420 days ago
      In the past it's the kind of tone that would inspire me to greatly escalate the issue, and I doubt that I'm alone.
  • jacobr 2420 days ago
    Also see this Twitter thread: https://twitter.com/EliotHiggins/status/896358097320636416

    > Ironically, by deleting years old opposition channels YouTube is doing more damage to Syrian history than ISIS could ever hope to achieve

    > Also gone are the dozens of playlists of videos from Syria I created, including dozens of chemical attacks in playlists by date

    > Keep in mind in many cases these are the only copies of the videos, and in some the channel owner will have died, so nothing can stop it

    • giancarlostoro 2420 days ago
      That is a little insane... Makes me wish there were some sort of website that archived specific YouTube videos marked as historical, as criminal evidence, or under some similar qualification (as long as they don't abuse copyright), just to make sure that if places like YouTube delete them, they can remain. Or a service that uploads to multiple video streaming sources at once (though I imagine these might violate the TOS of YouTube for w/e reason).

      It's kind of sad to see video evidence being deleted by YouTube. It would be nice if they allowed some option for political videos like these to be downloaded with metadata (upload date to YouTube, YouTuber username, etc.), especially if the original uploader was killed, so anyone can reupload them elsewhere.

      Another case where I wish TPB had made their own YouTube clone already. I'm sure they would not have taken down these sorts of videos.

      I wonder where Wikileaks is in these sorts of cases. Do they download these sorts of videos? If not, why not? It seems right up their alley. I don't always agree with them, nor do I digest their content, but at the very least, for a site like theirs it would make sense to archive YouTube and other politically sensitive videos, no?

      • aiyodev 2420 days ago
        Not to pick on you, because this seems to be a popular opinion, but I would just like to point out the insanity of what you just wrote.

        "as long as they don't abuse copyright"

        Would we delete videos of the liberation of concentration camps if there was Nickelback music playing in the background? This just demonstrates how successful media companies have been in distorting the true purpose of copyright laws: to promote science, art, and culture for the public's benefit. It does not exist for fairness or personal gain. Copyright laws should be changed to better reflect this. Nobody should be able to silence any information that has a public benefit.

        • Spivak 2420 days ago
          Like, Nickelback is in the original footage, or someone overlaid the track on top of the footage? I would imagine if this ever really happened it would be fine in the former case, and muted in the latter.

          You're forgetting that 'promote...' means giving the creator control of that content for the purpose of limiting access and making money. Promote in the sense that it becomes possible to actually sell artistic works like commodities. And then, once the value has been extracted, the public can do what they please with it.

          • oh_sigh 2420 days ago
            Not quite on the level of concentration camp liberation videos, but wasn't there a video of a 2-year-old dancing that had like a 20-second clip of some popular song playing on the radio in very low quality in the background, and it got deleted for copyright infringement?

            I think this was the video, but it is obviously back up now: https://www.youtube.com/watch?v=N1KfJHFWlhQ

          • tripzilch 2419 days ago
            > Like, Nickelback is in the original footage, or someone overlaid the track on top of the footage? I would imagine if this ever really happened it would be fine in the former case, and muted in the latter.

            While I agree that would be the most sensible course of action, if this actually happened right now, the whole video would be deleted in both cases, automatically (assuming it's clear enough to trigger detection in the first case).

            • softawre 2418 days ago
              Nope, today they mute the audio.
          • josinalvo 2420 days ago
            And we just need 140 years to extract it! :P
        • giancarlostoro 2419 days ago
          I only mentioned the copyright thing because whatever site archives YouTube videos shouldn't focus on archiving all of YouTube, just specific segments of it, though it might not be a bad idea to archive a good chunk of YouTube to avoid losing internet history / real-world history. Copyright would be highly unlikely to be a factor if these videos were archived at politically targeted / history-preserving archive sites, though.
      • icebraining 2420 days ago
        A YouTube clone is expensive. TPB only hosts simple HTML pages (they no longer have torrent files, and even those were just a few KBs), not files of hundreds of MBs or more.

        As others said, the Internet Archive may be a good option for these videos. I wouldn't mind writing a system for backing them up to archive.org, but I'm not sure how I would detect them. Marking those videos requires the user to know they should be marked, which just moves the question to how they would know.
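
        The backup half, at least, could be a small script; detection is the open question. A rough sketch, assuming the real youtube-dl and internetarchive Python packages (the identifier scheme and metadata are made up, and archive.org credentials must already be configured):

            import youtube_dl                   # pip install youtube-dl
            from internetarchive import upload  # pip install internetarchive

            def backup(video_url):
                # Fetch the video plus its metadata sidecar (.info.json).
                opts = {"outtmpl": "%(id)s.%(ext)s", "writeinfojson": True}
                with youtube_dl.YoutubeDL(opts) as ydl:
                    info = ydl.extract_info(video_url)
                # Mirror both files to archive.org under a made-up naming scheme.
                vid = info["id"]
                upload("youtube-backup-" + vid,
                       files=[vid + "." + info["ext"], vid + ".info.json"],
                       metadata={"title": info["title"], "originalurl": video_url})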

        • 4ad 2419 days ago
          > A YouTube clone is expensive.

          A YouTube clone that uses a clone of YouTube's infrastructure is expensive, but what about a distributed p2p YouTube clone?

          Obviously it's hard to quantify, as it doesn't exist yet, but I think it's technologically feasible.

          • afarrell 2419 days ago
            Meaning, you just have data stored on people's hard drives?

            That would be more expensive because

            - You have a much higher failure rate of the storage media as people say "I'm running out of hard drive space. What should I get rid of?"

            - You need to recruit those people to give up a resource that (unlike the spare compute cycles that SETI uses) they are likely currently using.

            - You have to convince people to trust you to put arbitrary video content on their hard drives. Therefore, you need to have some process for deciding what video content is objectionable enough that you won't store it.

            • 4ad 2419 days ago
              > You have a much higher failure rate of the storage media as people say "I'm running out of hard drive space. What should I get rid of?"

              Not really, because data is replicated between many people. People can delete it and other people would still have it.

              > You need to recruit those people to give up a resource that (unlike the spare compute cycles that SETI uses) they are likely currently using.

              It's no different than people seeding torrents or people just using ipfs. Simply accessing the system would transparently increase availability. In fact, ipfs is probably suited as-is.

              > You have to convince people to trust you to put arbitrary video content on their hard drives. Therefore, you need to have some process for deciding what video content is objectionable enough that you won't store it.

              No, data would become more accessible as people consume it. People only need to understand how the system works; they don't need to trust "me" (whoever you're referring to as "you" in your post).

              As I said, IPFS probably works as-is.
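
              For example, a minimal sketch of that as-is idea, shelling out to the go-ipfs CLI (assumes a local ipfs daemon; the gateway URL is just one way to reach the file):

                  # Publish a video on IPFS and hand out a gateway URL.
                  import subprocess

                  def publish(path):
                      out = subprocess.check_output(["ipfs", "add", "-q", path])
                      cid = out.decode().strip().splitlines()[-1]  # hash of the added file
                      return "https://ipfs.io/ipfs/" + cid

                  def mirror(cid):
                      # Anyone who pins the hash keeps the video available,
                      # even after the original uploader's node disappears.
                      subprocess.check_call(["ipfs", "pin", "add", cid])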

              • afarrell 2418 days ago
                > because data is replicated between many people

                Yes, so you are designing a distributed system with a higher failure rate of the underlying media. That means you need more replication, so you need more people to donate space.

          • lightedman 2419 days ago
            "what about a distributed p2p YouTube clone"

            It can't, for the same reason Bitcoin and its ilk can't scale. Decentralization imposes bandwidth and storage overhead that grows super-linearly with every additional peer on the network. Technology can't keep pace, period.

      • pmlnr 2419 days ago
        > Makes me wish there was some sort of website that archived specific YouTube videos marked as historical or criminal evidence or some sort of qualification

        Make one. Add it as a Warrior project to Archive Team; save the videos and re-upload them if they are deleted.

        Unfortunately we all need to participate, because https://medium.com/message/never-trust-a-corporation-to-do-a...

      • LoSboccacc 2419 days ago
        > Makes me wish there was some sort of website

        Can't people just use LiveLeak for that kind of video?

        • giancarlostoro 2419 days ago
          I guess a lack of education about online video hosting is also a factor; most people in other parts of the world will mainly know the most popular websites, and the smaller ones (by comparison) might never have been heard of by these people.
    • sondr3 2420 days ago
      People really need to download videos from YouTube if they are that important; not doing so is, in my opinion, reckless. YouTube is not a service you can expect to actually archive important videos: if I look through my favorites or other playlists, a ton of the videos are deleted.

      Use youtube-dl, download the videos to your own server and back them up yourself. Yes, this is awful and sucks on so many levels, but please, please, back up data.
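
      A minimal sketch using youtube-dl's own Python API (the playlist URL is a placeholder); the download_archive option makes repeated runs fetch only what's new:

          import youtube_dl

          opts = {
              "outtmpl": "backup/%(id)s.%(ext)s",     # keep the YouTube ID in the filename
              "download_archive": "backup/done.txt",  # skip videos already saved on re-runs
          }
          with youtube_dl.YoutubeDL(opts) as ydl:
              ydl.download(["https://www.youtube.com/playlist?list=PL..."])  # placeholder URL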

    • PeanutCurry 2420 days ago
      In my opinion the important thing to remember is that YouTube could easily do something about this. There seems to be a trend in a lot of tech-centric companies of saving money by automating away actual content review as well as customer interaction. But humans are entirely capable of reviewing content and giving detailed explanations for why something has been judged a certain way, especially if they work alongside an automated system that simplifies the process for them.

      I realize that minimizing human labor is a big part of how these sorts of business models achieve their profitability, but problems like this aren't going to go away as long as that's the norm. And I don't just mean poor explanations for policy decisions either. The core issue is bigger than that imo.

      The information age has privatized a lot of the modern 'public' social/cultural spaces. For nations that value both freedom of speech and the preservation of historically/culturally significant speech, this is problematic. It reduces not only the public's ability to express itself but also its ability to look back on old expressions and learn about the history or cultural paradigms behind them.

      This isn't really supposed to be a rant at Google specifically. They're just the topic at hand, so they're the easy punching bag. In general, customer service aside, I think they do good work and, more importantly, they exercised the necessary foresight and resources to develop their products into what they are today. I'm by no means implying we should socialize social media... no pun intended. But I do think there needs to be more discourse about how these trends will affect the future of speech and historical censorship. Right now it's just a modern problem in its infancy, but decades from now people who want to see visceral content depicting firsthand experiences of events like those happening in Syria, or the Arab Spring, are going to be getting censored history. What if China started pressuring foreign companies, via benevolent coercion such as financial incentives, to implement systems that made finding information about Tiananmen Square more difficult? The privatized nature of these platforms makes this sort of attack easier as well. I don't have a good solution, but that's why I think there needs to be more dialogue about the future of online media in general and what direction we want to steer it in.

      • Eridrus 2419 days ago
        Try some back-of-the-envelope math to see if it is possible. Start with 300 hours of video uploaded every minute.

        I calculated the cost to be $6bn/yr assuming the fully loaded cost of a full-time reviewer is $20k, which dwarfs the revenue YouTube brings in.

        So please, lay out a plan that actually works with the economics of YouTube.
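
        For reference, one way the arithmetic can pencil out (all inputs are assumptions, not YouTube's figures):

            hours_per_year = 300 * 60 * 24 * 365      # ~158M hours of video uploaded/yr
            review_rate = 2000                        # hours one reviewer can watch per year at 1x
            reviewers = hours_per_year / review_rate  # ~79,000 people
            print(reviewers * 20000 / 1e9)            # ~1.6 (billion USD/yr)

        And that ~$1.6bn is a floor: slower-than-real-time review, double review and overhead push it toward the $6bn estimate.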

        • Faark 2419 days ago
          What about paid arbitrators? The bad incentives I can imagine are less bad than the current system. E.g. for a DMCA dispute, each party could give YT maybe $5 to have a look at the case; the winner gets their share back. If the case isn't obvious, both get it back and a court has to decide. Non-payment means yielding. For YT vs. user, YT would only pay in case their algorithm is wrong. Those cases would need some third-party arbitrator, though...
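
          A toy sketch of that deposit scheme (amounts and outcomes illustrative only):

              DEPOSIT = 5  # USD each side pays for a human look at the case

              def resolve(claimant_paid, respondent_paid, verdict):
                  # verdict is "claimant", "respondent" or "unclear"
                  if not claimant_paid:
                      return "respondent wins by default"  # non-payment means yielding
                  if not respondent_paid:
                      return "claimant wins by default"
                  if verdict == "unclear":
                      return "both deposits refunded; a court decides"
                  return verdict + " wins and is refunded; the loser's deposit pays for the review"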
    • zeep 2420 days ago
      It's ridiculously easy for corporations and governments to censor the web (but for some reason, they keep saying that once something is posted online, it will stay there forever)... which is why I use youtube-dl for the videos that I really want to keep...
    • carvalho 2420 days ago
      Why wouldn't YouTube pass any extremist material on to the FBI before making it unavailable to the wider public?
  • AdmiralAsshat 2420 days ago
    It was folly to think that YouTube would be a safe place to document war crimes. YouTube is a distribution channel, not a preservation channel. Its ease of use certainly makes it an attractive option to upload things quickly, but anything of historical significance should have the video raws immediately turned over to a human rights organization for preservation.
    • ballenf 2420 days ago
      Youtube used to be a distribution channel but it slowly became an ad delivery tool with content along for the ride. Like 99% of other free (and even some paid) internet sites.

      Either way, totally agree that it's a tragic situation.

      The only sympathy I have for Google is that trying to separate the good vs. "evil" (as in "Don't be evil.") content is a monumental task that machine learning will probably never be capable of performing. So they're left with the choice of spending an inordinate amount on human review and detailed research or just making wildly over-broad removals.

      I'd rather they left up more rather than less, but they tried that approach and it almost lost them every major advertiser. Continuing down that road could have cost the whole platform its viability. Maybe some would like that outcome, but these historical videos would be just as lost.

      Maybe we'll see the pendulum swing back in an effort to reach a more reasonable middle ground.

      • wbl 2420 days ago
        It's hopeless even for people. Is that footage of ISIS burning a prisoner alive evidence of war crimes or propaganda? How about both! Ban child nudity and depictions of cruelty to children and you've just banned a Pulitzer Prize-winning photograph from the Vietnam War.
      • nxc18 2420 days ago
          Google has an inordinate amount of money to spend on an inordinate amount of content review. Deleting history and evidence of war crimes is Evil; good thing Google has abandoned their earlier aspirations.
        • jptman 2420 days ago
          I think you are underestimating the amount of video that gets uploaded to Youtube. Smaller sites may be better at stuff like this but that's solely because they don't have quite the amount of content. Hundreds of hours of videos are uploaded per minute. They may eventually have AI good enough to do a better job, but this is an unprecedented amount of content to review.
          • CaptSpify 2420 days ago
            So?

            If YouTube can't handle the load, then they shouldn't claim that they can. At the absolute least, they need a usable appeals process. If they can't do that, then they need to own up to it and stop allowing anyone to upload anything.

            • koide 2420 days ago
              Sorry, but why? Google owes nobody anything. If you have war crime evidence or other important content to publish, upload it to YouTube and all the other video sites, keeping archive.org and other human rights organizations in the loop. You can put the press in the mix as well.
              • CaptSpify 2420 days ago
                Because it's a shitty thing to do, and we shouldn't encourage shitty behavior? Just because they want to take the cheap and lazy route doesn't mean that we can't criticize them for it.
            • fastball 2419 days ago
              Why? Own up to what?

              YouTube is a company that is beholden to advertisers. YouTube wants/pays for videos that they can put ads in front of. If your content is not the type of content YouTube can wrap in ads, and you need longevity for your videos, YouTube is not the platform for you. YouTube never claimed to be an everlasting video storage space for all your video needs, so I'm not sure why you're expecting that of them.

              • Kequc 2419 days ago
                It's in part because that is what Google has done in the past. They give you unlimited space to store your photos, your email, and your data; why wouldn't you expect to be able to store video?

                Video has become politicised because it is a popular medium for political topics and one which can be rapidly produced and consumed. Advertisers are perhaps influencing Google's decisions on this, but they are equally political. The debacle with the diversity memo is one example of biases inside the company. There have been many more examples over the last few years; one is across-the-board censorship of conservative commentators.

                It goes on at Facebook, Twitter and so on. So we have to wait for competitors to turn up; how many years will that take? Is that a responsible route, considering the foothold these companies have?

                • fastball 2418 days ago
                  Yes, absolutely, you have to wait for a competitor to turn up. Or you can build a platform yourself; it's not difficult to do, especially in 2017. Hosting is cheap, video codecs are open source, and there are a lot of companies willing to negotiate advertising partnerships. YouTube is a private business, fuck your privilege.

                  ALSO: I don't know about you, but this so-called "censoring" of conservative commentators doesn't seem to have worked. I see more young people skewing right/centre than I did 10 years ago. You can call it censorship if you want (because sure, that's probably the best term), but implying that a company selectively hosting content is the same thing as limiting the free exercise of speech is absurd.

      • ikeboy 2420 days ago
        Leave them up, but remove ads?
      • davidreiss 2419 days ago
        > The only sympathy I have for Google is that trying to separate the good vs. "evil" (as in "Don't be evil.") content is a monumental task that machine learning will probably never be capable of performing

        That's an impossibility because evil depends on perspective.

        This is why we have free speech rights in America. If we let others limit our free speech based on their perception of evil, then atheist speech and LGBT speech would be banned. Pornography would be banned, along with "offensive" music. Hell, books like Huckleberry Finn would be banned.

        This is why we cherish free speech rights in America (or we used to) and why we have the saying "You have the right to be offended".

    • osteele 2420 days ago
      One issue with YouTube is archival status. Another is provenance.

      At some point it will be as easy to create fake videos as fake text. It is unrealistic to expect that people who aren't information-literate about text will be literate about video, but I hope there's a way for journalists to move away from YouTube by then.

      • Too 2419 days ago
        This is already happening.

        I was googling for baking videos the other day when all of a sudden most of the hits I got were auto-generated videos uploaded to YouTube. They just had some panning stock photos in the background with text scrolling on top and super-generic music; they must be really simple to generate.

        There were hundreds, maybe even thousands, of them from various accounts, all conveniently linked to each other, making more of them appear as relevant in the rightmost column.

    • eloisius 2420 days ago
      I doubt they're grieving lost footage. It's loss of access to that distribution channel.

      Documentation is pointless if it can't be distributed and used to effect change.

    • ComodoHacker 2420 days ago
      >should have the video raws immediately turned over to a human rights organization for preservation

      You're talking like every human rights organization has some magical means of preservation other than uploading to YouTube.

      • PeterisP 2420 days ago
        Something like a computer with a hard drive, or a USB memory stick?
      • fastball 2419 days ago
        If it's so important then maybe, yes, these orgs should have some means other than YouTube.

        If it's so important, these orgs should maybe build their own platform that is purely about video longevity.

        Expecting a private company to host content they don't want to host is silly.

    • Mangalor 2420 days ago
      Where else should they be stored?
      • sologoub 2420 days ago
        How about a public S3 bucket curated by a non-profit?

        Tools exist already to upload to S3 from practically every device, especially Android.
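
        The upload side really is a few lines with boto3 (bucket and key names below are invented):

            import boto3

            s3 = boto3.client("s3")
            s3.upload_file(
                "incident.mp4",                # local recording
                "video-evidence-example",      # the non-profit's public bucket
                "syria/2017-08/incident.mp4",  # key under a curated prefix
            )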

        Curation would be much harder, but there is a lot of money in philanthropy and I could see some deep pockets contributing to that.

        YouTube is fine as a means of promoting and raising awareness (the world should know about these things), but it's not rational to expect a corporation (i.e. the people within it) to act in any interest other than its own.

        Seems like this could be an interesting infrastructure non-profit for YC to fund.

      • Sir_Cmpwn 2420 days ago
        archive.org
        • osteele 2420 days ago
          I hope we're in a world where that can happen.

          At the least, we probably need “upload video to archive.org” mobile apps to make this as useful to journalists in the field as YouTube currently is.

          If the Archive grows as a journalistic distribution channel, it might then face YouTube's issues of copyrights, piracy, and other criminal use. However, the Archive could apply goals that are more compatible with journalism than YouTube can. Maybe sufficient philanthropic support could make this possible.

      • moogly 2420 days ago
        LiveLeak
  • cisanti 2420 days ago
    I have (had) a channel with videos about missing people, their last sightings on CCTV, etc. The parents of a missing person even embedded one of the CCTV videos on their site. They emailed me asking if I still have the video, because they need it.

    YouTube banned the whole channel for extremist/hateful content. Probably some of the videos/titles led the AI to classify the footage as extreme or some sort of glorification.

    I appealed via some form, but I don't even bother anymore.

    I hope YouTube as a video platform (not streaming) gets a serious competitor.

  • Iv 2420 days ago
    During the Arab Spring I suspected many police violence videos would be deleted from YouTube. I downloaded them to my server and posted the links everywhere for people to mirror them. Not a single person has yet.

    I have been amazed at how little importance people put on this kind of video. You have video evidence of crimes with faces appearing clearly. It can take 5 to 10 years for such events to calm down enough to reach a point where crimes can be prosecuted.

    And it is hard to blame YouTube for that. They are considered the channel for Lady Gaga and silly cat videos. Hell, I know 3-year-old toddlers who browse YouTube unsupervised.

    In many places YouTube is criticized for promoting violence and extremism by leaving these videos up. I feel bad for them; they are between a rock and a hard place.

    I just hope the censored videos are not totally deleted from their servers. They should have someone reviewing criminal videos and keeping them at the disposal of judicial authorities, but even that opens a whole can of worms: do you answer only to US authorities (who do not care about war crimes in other countries)? Do you obey all world authorities, including Saudi and Chinese ones?

    Anyway, that's YouTube's problem, not ours. Simply put, helping prosecute war crimes is not part of YouTube's mission, so do not trust them with it. To anyone who feels this is important content: use youtube-dl and keep backups. Make torrents of it, share it around, make sure it does not disappear.

    And when some NGO finally realizes that this content is precious, pump up your upload bandwidth and fill their servers.

    • flamedoge 2420 days ago
      YouTube isn't the only place to upload videos. Check out WebTorrent or BitChute for p2p-based video.
  • mnm1 2420 days ago
    Such AI, coupled with the inflexible policies of companies like Google and Amazon, is already starting to be a problem and will only get worse as it's deployed more broadly. Accounts are closed without recourse for invalid reasons and their owners treated like violators. Short of a law requiring explanations and an appeals process, I don't see this situation ever getting better. Yet another reason not to trust these companies or use their services that require creating accounts and agreeing to their bullshit TOS.
    • tenpies 2419 days ago
      > Such AI coupled with the inflexible policies of companies like Google and Amazon

      I would argue that there is a key difference in customer support which makes me much more confident in Amazon than Google.

      Google has non-existent customer support for the public and virtually non-existent customer support for paid customers. If something goes wrong with your Google product your best bet - even as a paid customer - is to contact someone you know at Google. Going through the official channels is a waste of time.

      Amazon, on the other hand, will bend over backwards to make sure you're satisfied - even if it loses money in that transaction. Refund decisions are mostly automated at this point, although human support for both vendors and buyers is there if needed.

      • mnm1 2419 days ago
        I specifically listed Amazon because they close accounts with no appeal and no reason given. Just because you've had a good experience doesn't mean that's universally so.
  • nnq 2419 days ago
    Maybe people should get their shit together and realize that true free speech includes allowing videos that seek to recruit people into despicable organizations to be available! Yeah, even Hitler had a right to say what he thought, and it's a good thing he had it, despite the consequences that ensued.

    The problem that needs to be solved is how to educate people into not being lured into those organizations DESPITE having access to those materials... This kind of censorship is just as STUPID as banning drugs like heroin and cocaine (instead of just making them unavailable to children, or without a "license") or the "war on drugs".

    Imho the problem comes from the fact that corporations try too hard to be "democratic" about things and "please the majority". But this is not a good idea: sometimes a majority of 99% is against freedom, and they are wrong, despite being the 99%. And the majority should be opposed and freedom protected even when the cost is someone's blood. For me personally, there are these words from my native country's national anthem: "life in freedom or death [for all]"... and I will sure as hell fight, die or kill for them.

    • notahacker 2419 days ago
      True free speech also entails allowing private companies not to be compelled to host what they deem to be extremist material at their own expense (whilst simultaneously being expected to take down copyright infringements, because whilst people are prepared to defend to the death certain groups' right to incite genocide, an unauthorised Bieber video is clearly going too far).

      That remains true even if their algorithm or human criteria for determining what is and isn't extremist material sucks.

    • Quanttek 2419 days ago
      Agree to disagree. This isn't really an issue of a lack of absolute free speech; it has much more to do with AI gone bad. Under any reasonable definition of free speech (i.e. a limited one) it would've been wrong to remove footage documenting war crimes; hell, even under a highly restricted one.

      Freedom of speech is not inherently the highest value there possibly is. You would have to be able to defend something like "Yeah, even Hitler had a right to say what he thought and it's a good thing he had it, despite the consequences that ensued": in my opinion, I would rather restrict the freedom of one genocidal maniac than see the death of 85 million people.

      Using your definition of freedom of speech we could easily justify not outlawing murder: "We should just educate people not to murder each other instead of banning it." Maybe banning can actually have a chilling effect on, e.g., hate speech or heroin abuse? (While I'm for banning heroin, I advocate providing services that ensure safe consumption (e.g. needle dispensaries, consumption rooms) and help prevent (further) abuse, instead of jailing users.)

      • nnq 2419 days ago
        > any reasonable definition of free speech (i.e. limited)

        First, "limited free speech" is not "free speech" anymore. Second, you're just not going to be able to "define" things anymore as you automate and replace with AIs more and more processes (the definition will more am more start to be "the practical implementation of the machine learning filtering algorithm and the choice of training data", anything else will be "approximations" since you won't be able to prove things about these statistical algorithms much). The choice will be either (a) full freedom + massive investment in mechanisms to manage the negative consequences of this freedom (let's start with education first, not only making it free for all at all levels, but also forcing people be given free paid time to educate themselves, not worked to dumbness 8+hrs/day and then expecting them to differentiate real-news from fake-news...) or (b) give up freedom and live in a "well managed totalitarian system" with "freedom for distractions and sex only" or some other deal like that.

        > we could easily justify not outlawing murder

        No, there's a clear criterion: reversibility! If I say something incredibly hate-inciting, I can be proven wrong, and I can even retract my words and say "I changed my mind" later; that should be OK. If I murder someone, that can't be undone; even if I say I changed my mind about murdering him, he's still dead, and I've still proven that I'm capable of murder (imho all people are, but that's a different discussion...).

      • robertlagrant 2418 days ago
        'Using your definition of freedom of speech we could easily justify not outlawing murder: "We should just educate people not to murder each other instead of banning it."'

        Freedom of speech is a very high value because sometimes extremist views (such as "we should rebel against the British") are a good idea, yet if there are anti-extremism laws, this sort of idea would never be surfaced, because extremism can be defined however the enforcer chooses to define it.

        Why do you think this also applies to murder?

    • ss248 2419 days ago
      You are right, but I don't think that's actually possible. It's not easy to educate the general population. How would you teach critical thinking if most people just don't want to learn it?

      The current solution is to just babysit the general population by essentially censoring information.

      • pas 2419 days ago
        Most people as in they don't want their kids to learn it, or they themselves don't want to learn it?

        Because kids usually don't want to go to school, yet they do. And luckily, with a good curriculum, the school system could teach critical thinking with significant efficacy. (And it could be added to the Common Core.)

    • noiv 2419 days ago
      It seems to me history stays on replay until we eventually understand that freedom is more important than whatever risks are attached.
    • davidreiss 2419 days ago
      > Imho the problem comes from the fact that corporations try to hard to be "democratic" about things and "please the majority".

      Corporations aren't trying to please the majority. They don't care about the majority. Besides, the majority wants free speech.

      Most Americans don't want jobs being sent to China, but the elite do. Which side do you think corporations chose?

      YouTube and the rest of social media are censoring because the WSJ, the NYTimes, etc. have been pressuring them to. And the WSJ, the NYTimes and the traditional media don't represent the "majority"; they are the mouthpiece of the elite.

      Think about it: for nearly 10 years social media was highly "pro-free speech". Then the WSJ, the NYTimes, etc. did hit pieces against social media and put pressure on them, and all of a sudden it's relentless censorship.

  • alexandercrohde 2420 days ago
    To me, if you want to regulate controversial opinions, you have to err strongly to the side of too-open.

    Remember, before the Declaration of Independence our founding fathers were terrorists/rebels. I don't mean this as a snappy, hollow comparison. I'm saying that, fundamentally, you can't distinguish between a US soldier recruitment video and an ISIS soldier recruitment video without applying a moral context. How would an AI ever do this? And even if it could, whose moral retelling is the right one?

    Better in my mind to stay out of the censorship game altogether and promote a forum that is inherently structured in a format that incentivizes accuracy over emotion.

    • LoSboccacc 2419 days ago
      >US soldier recruitment video and an ISIS soldier recruitment video

      Somehow I doubt US recruitment videos feature Englishmen being decapitated as a job perk.

    • lerpa 2420 days ago
      Exactly, but in a world where people want Facebook and others to have "truth-checking authorities", I don't expect things to get much better.
  • 013a 2420 days ago
    YouTube is buckling under its own size. They're discovering what should have been obvious to anyone: the sheer amount of content entering their centralized system is impossible to moderate in any fair way. The only ways they can manage are to (A) prioritize quality moderation for the channels which are more popular, and (B) enforce the most bland, vanilla experience possible.

    They need to moderate because they are centralized and their revenue demands it. We, as a society, need to create a better option. Not just another YouTube, but a seamless decentralized solution.

    • LoSboccacc 2419 days ago
      They need to metamoderate: let people tag videos by content (nudity, violence, crime, death...) and, as soon as a video missing a tag gets flagged, close the channel.

      Users that enable viewing of certain tags can't complain then, and Google only needs enough legalese at the point where the content is enabled.

      They're already doing this to an extent with mature content, so there's that.
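
      A toy sketch of that rule (the tag names are from above; everything else is invented):

          REQUIRED_TAGS = {"nudity", "violence", "crime", "death"}

          def handle_flag(video_tags, flagged_as):
              # A flag only bites when the uploader failed to self-tag.
              if flagged_as in REQUIRED_TAGS and flagged_as not in video_tags:
                  return "close channel"
              return "flag dismissed"

          def visible_to(video_tags, viewer_enabled_tags):
              # Viewers only see content whose tags they opted into.
              return video_tags <= viewer_enabled_tags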

  • AmIFirstToThink 2420 days ago
    Why not create a setting that allows users to see YouTube either as sanitized by their AI or in full?

    Allow people to choose a content level, just like they choose a security level in browser settings.

    1. Legal content. May include content that violates YouTube's content policy but is legal in the USA or the viewer's country. Maximum freedom of speech and maximum chance of seeing content you may find offensive.

    2. YouTube content policy met. Content that is legal and meets YouTube's content policy.

    3. Legal, meets YouTube's content policy, and meets a certain org's taste. Like when you pick a charity to donate to when you shop on smile.amazon.com, you can select the org whose bubble you want to live in: ADL, Focus on Family, Skeptics, etc. The org bans content and it is only banned for people who opt into that blacklist on YouTube.

    4. When a user is not logged in they get the AI-filtered list, but can still select the "all legal" or "all that meets content policy" filters. All other bubbles are available to logged-in users only.

    Advertisers can opt into certain bubbles if they want, or opt out of certain content, e.g. content deemed inappropriate by the AI.
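
    Mocked up, the setting could be as simple as this (a toy sketch; the field names are invented):

        def visible(video, level, org=None):
            if level == 1:    # anything lawful in the viewer's country
                return video["legal"]
            if level == 2:    # lawful and within YouTube's content policy
                return video["legal"] and video["meets_policy"]
            # level 3: additionally honor the chosen org's blacklist
            return (video["legal"] and video["meets_policy"]
                    and org not in video["banned_by"])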

    How does that sound, YouTube?

    Don't the government security agencies want to know who is watching extremist content and who is not interested in it? How would we know who the extremists are if they fall back to person-to-person, in-person communication?

    • colejohnson66 2420 days ago
      Because YouTube doesn't care about the videos; it cares about the advertisers. You can't be a proponent of free speech (extremist propaganda included) while trying to please advertisers. Also, in today's political climate, people seem to want anything that disagrees with them labeled as hate speech.
      • AmIFirstToThink 2419 days ago
        The advertisers would find it suitable to chase a bubble, e.g. greens, nationalists, globalists, feminists, religious groups, etc.

        I think a good, strong case can be made to advertisers that their ad will only be served to people opting into a certain bubble. Or the reverse of that, i.e. show my ads to all people except those who are in this bubble. Inclusion lists and exclusion lists.

        • fastball 2419 days ago
          That only works if all your "bubbles" are palatable bubbles.

          What advertiser is going to want to advertise in front of ISIS/Neo-Nazi videos?

          • AmIFirstToThink 2419 days ago
            Why, the store that sells Nazi paraphernalia on Amazon, of course. People who sell Nazi books.

            Jokes aside, it doesn't matter if advertisers choose not to be associated with certain 'unpalatable' bubbles; that's perfectly fine.

            You seem to be concerned about extremist videos making money; I don't care about that at all. I just want all videos to be available unless the legal system demands their removal after due process. Present sanitized content by default and all content if explicitly requested, where an action from the user says they want to see potentially offensive content.

            Advertisers should be allowed to choose the channels that they advertise on. Some may choose to advertise on the default channel before content is flagged. Let the buyer make the decision. Why is YouTube letting certain vocal ad buyers decide for the entire ad market?

            • fastball 2418 days ago
              1. Every video YouTube hosts on its platform costs YouTube money.

              2. YouTube recoups the cost and makes a tidy profit from ad revenue.

              3. Videos you can't put ads in front of can't be monetized.

              4. Videos that can't be monetized still cost YouTube to process, host and serve.

              5. As such, every non-monetizable video hurts YouTube's bottom line.

              6. Why would YouTube want that?

              • AmIFirstToThink 2418 days ago
                YouTube decided that it will demonetize videos. They preferred to do that in response to complaints from some advertisers, rather than giving advertisers tools to avoid or select certain channels or topics. A concept very similar to AdWords: BanWords! Put the advertiser in control of where the ad gets shown or, specifically, where it doesn't get shown. Demonetization was an ideological decision by Google; it was a political decision.

                >As such, every non-monetizable video hurts YouTube's bottom line.

                Even a non-monetized video is still eyes on the screen for Google. Netflix claims to compete with books and libraries for its viewers' time. Yeah, Google may not show an ad on that video, but the next one in the autoplay/recommended list is still going to ring the cash register. It's actually better for Google, because they may very well show ads yet not pay the demonetized content creators, whose videos act as leads into YouTube. These political videos are what send readers to YouTube from non-YouTube sites, which is much more valuable to Google, and they get away without paying anything for that. A leading video is much more valuable to Google than a video in the autoplay list. Don't pay for leads but get paid for ads on subsequent videos: a very nice win-win business model Google developed for itself.

    • Buge 2420 days ago
      The whole point of the censorship is to stop Youtube from being a tool used for terrorist recruitment. If the terrorists can just check the box "all content" then it's useless.
      • AmIFirstToThink 2419 days ago
        Terrorist induction videos would clearly fall into the not-legal category. I think the NSA could easily stand up resources to scan new videos as they arrive and tag them as legal or not legal. Or we can demand a judge's decision before taking away someone's freedom of speech, as the Constitution clearly indicates.

        And I have a hunch that the three-letter government agencies charged with keeping us safe would rather know who is interested in terrorist induction videos, and track them, than just remove the videos and let opinions fester at the isolated individual level (lone wolves).

        • Buge 2419 days ago
          So you're saying that YouTube should only ban illegal videos and allow any video that is legal? So YouTube should allow porn, gore, etc.?

          Freedom of speech only says the government can't stop your speech. It doesn't say that private organizations have to provide you a platform. YouTube also has freedom of speech. They have the freedom to filter and compile videos that they like and show only those. It's also not YouTube's responsibility to optimize their site to help the NSA track terrorists.

          If I create a website that allows people to upload videos, it's perfectly fine for me to filter those videos and only show the ones I like. It's my website after all, I am allowed to control what is on it.

          • AmIFirstToThink 2418 days ago
            >So youtube should allow porn, gore, etc?

            It had, for ages.

            >Youtube also has freedom of speech. They have the freedom to filter and compile videos that they like and only show those.

            Sure they do. I wish they were honest about their political ideology before they touted the platform for all to come and participate.

            >It's my website after all, I am allowed to control what is on it.

            Sure you are. I just wish you had advertised as such before content creators invested time and money into the platform, creating a user base for you. That's a bait and switch, no different from an Apache-to-AGPL license change on version 27.8.3 of your successful GitHub project.

            A private entity, a publicly traded corporation and a government are three different levels of permissible discriminatory behavior. A private person or business can employ any discriminatory practice they see fit, as you yourself say. A government is held to the highest standard of equality for all. A publicly traded corporation is somewhere in between the government and a private citizen.

    • davidreiss 2419 days ago
      > Why not create a setting that allows user to see YouTube as sanitized by their AI or all content?

      Because it's about controlling narrative, controlling propaganda and giving the "media/news" space back to traditional media.

      That's what people have been asking of sites like Reddit and HN for years now: give people the option to view both the raw threads (uncensored) and the moderated ones (censored). But neither is interested or has indicated they will do it. Instead, on Reddit at least, there is more and more censorship.

      > Advertisers can opt into certain bubble if they want, or opt out of certain content e.g. content deemed inappropriate by the AI?

      Do you really think advertisers care? Do you really think corporate America cares? They don't have morals. China is a brutal dictatorship, and yet advertisers and corporations have no problem doing business with China.

      It's simply a matter of control. Who gets to decide what you and I see? Do the masses get to decide for themselves and control what they see, or does a small group of elites? As usual, the elites won and they get to decide.

  • userbinator 2420 days ago
    Yes, I could see how that classifies as "extremist material", but that's no reason to delete them...

    IMHO the gradual increase of (self-)censorship on the popular internet is worrying: one of the most compelling things about the internet as it existed was that, from the safety of your own home, you could see and experience things that would otherwise be impossible to access. Now it seems it has turned into a massively commercialised effort to "curate" content so that it doesn't offend anyone, which only results in more profits for advertisers.

    • cat199 2420 days ago
      The old internet is still there; you just have to not be too lazy to host your own content.
  • Alex3917 2420 days ago
    Since my understanding is that covering up a war crime is itself a war crime under the complicity doctrine, could Google executives get charged for this in The Hague?
    • avip 2420 days ago
      I urge you to report this inhumane case and follow up with a "Show HN" (I filed an ICC case, then THIS happened).

         otp.informationdesk@icc-cpi.int
    • ocdtrekkie 2420 days ago
      I am not an expert in international law, but I believe you would still have to prove intent. It's hard to prove intent for an algorithm that is simply incapable of understanding the significance of its actions.
      • 7373737373 2420 days ago
        Deferring decisions to an algorithm doesn't absolve the owner of responsibility for its actions. If the consequences are unknown, why should the owner be allowed to use it?
        • PeterisP 2419 days ago
          For things that are crimes only if the intent is there, it certainly does absolve the owner from responsibility. There's no duty to preserve everything that might be evidence in some manner (because, really, everything might be) - if evidence gets destroyed as a byproduct of normal operations, that's not prohibited.
        • fastball 2419 days ago
          Because if the consequences are unknown, there is no intent, and intent is necessary for complicity.

          If Google/people at Google made an algorithm to intentionally delete criminal evidence, that would qualify. Having an algorithm that deletes lots of things and happens to delete evidence does not.

        • CaptSpify 2420 days ago
          Couldn't that also imply that someone else would need to review the algorithm to verify its intent?
      • justinclift 2420 days ago
        The parent post mentions they've been reviewed by a human (appeals process) and been rejected anyway.

        That should get past any "it woz the algorithm that did it!" arguments about intent.

      • tauntz 2420 days ago
        If I wrote an algorithm for a self-driving car that just drives in a straight line at high speed, I'd still be responsible if the car killed somebody, and I couldn't just say "whoops, it was just the silly algorithm", even though I didn't specifically intend to kill anybody.
    • cema 2420 days ago
      Unlikely, but an interesting argument.
    • dmurray 2420 days ago
      Google executives are overwhelmingly US citizens. The ICC has no jurisdiction over them.
      • meric 2420 days ago
        Do some of them have dual citizenship?
        • walshemj 2420 days ago
          A lot of the H-1Bs will have.
      • mrout 2420 days ago
        The International Criminal Court has universal jurisdiction
    • gambiting 2420 days ago
      I'm willing to bet that in their hundreds and hundreds of pages of terms and conditions there is a paragraph saying that by using their services you give up your right to sue Google for war crimes.
      • leereeves 2420 days ago
        Even if there were, such contracts have been tried before and the courts simply ignore them.

        A contract can't override criminal law.

  • mtgx 2420 days ago
    I remember when I used to like - no, love - almost anything Google did.

    That seems like such a long time ago. Since then, with every such event, my attitude has shifted to being mostly hostile towards Google.

    Google should never have entered the "content game" and should have remained a neutral search and distribution (YouTube) platform. Once it went down the path of being a content company, it started "compromising" in all sorts of ways that were terrible for its users.

    I wonder if the higher-ups have even noticed this change in attitude towards them, and if they have, they've probably decided that making money is more important, even if they become the Comcast of the internet (the most hated company).

  • monocasa 2420 days ago
    Have they checked with YouTube to see if the files are actually deleted?

    Like just because their gateway won't give you access to it doesn't necessarily mean that the bits have been scrubbed on the back end.

    Also: here's a project to archive this information.

    https://media.ccc.de/v/33c3-7909-syrian_archive

    • mustacheemperor 2420 days ago
      Unfortunately many of the original uploaders have since died in the war, and the deleted playlists were the only place the videos were visibly accessible.
      • monocasa 2420 days ago
        Sure, but I'm saying that in a 'documentation of war crimes' context YouTube might allow a little peek behind the veil if the videos aren't deleted but hidden.
        • mustacheemperor 2420 days ago
          Agreed. I think the larger problem highlighted here is that in this case "YouTube" is just the faceless algorithm making these decisions, and accessing the judgement of a real human being is nearly impossible for the average end user.
          • PeterisP 2419 days ago
            They quite likely have some humans verify the content; however, the criteria most likely are quite straightforward (depictions of deaths violate their guidelines).

            They don't provide the judgement because that only invites attempts at explanation and negotiation. They don't want to spend time on a careful review of every contested video; they want to make a usually accurate final judgement with the minimum time investment possible (e.g. 5 seconds per video), and they don't want to spend resources reading all kinds of reasoning and appeals, so they don't. And it is their right to do so: they can choose completely arbitrarily which videos to host on their site and which not.

            • mustacheemperor 2418 days ago
              Sure, but returning to the original point - by that token, why should YouTube have any obligation or even judicial ability to retain videos like this on "the back end?"

              It is indeed YouTube's right to vaporize any bits at any time. But when they are the leading video platform on the entire web by a huge margin they need to at least adequately present the reality of their content guidelines to users in countries like Syria who are probably not focused on researching alternative video hosts while trying to document chemical weapons attacks.

  • mschuster91 2420 days ago
    Once again, the only hope for customer service seems to be a (social) media shitstorm.

    Seriously, Google, Twitter and FB massively need to ramp up their customer service and stop externalizing the costs of a lack of support onto society. And there are many "costs": people being actively harassed and intimidated, sometimes to the point that they are afraid to leave their house, due to hate speech or doxxing; a loss of historically relevant information, as in this case; people locked out of vital emails or their businesses (e.g. when their Gmail account gets closed due to copyright violations on YouTube)...

    • TuringTest 2420 days ago
      > Google, Twitter and FB massively need to ramp up their customer service

      This is not going to happen; the whole point of their businesses, based on offering massive free online services, is that they are dirt-cheap because they run mostly automatically.

      No, the only way to fix the problem in those juggernauts, and protect the tiny individuals from getting caught and squashed in their wheels, is the mechanism that governments use to protect citizens from the worst effects of bureaucracy: having an ombudsman. A semi-independent service to receive complaints of severe abuse by the main service, and for which the primary goal is protecting users, not reducing costs.

      In some sense, this is how their PR department operates: they'll bring human attention and put in all the required effort to fix an unjust situation, to clean up the image of the company. The difference is that today the unjust situation needs to become a scandal, as you said, whereas an ombudsman would be required to examine all applications (either accepting or rejecting them) as part of their official remit.

      • mschuster91 2420 days ago
        > No, the only way to fix the problem in those juggernauts, and protect the tiny individuals from getting caught and squashed in their wheels, is the mechanism that governments use to protect citizens from the worst effects of bureaucracy: having an ombudsman. A semi-independent service to receive complaints of severe abuse by the main service, and for which the primary goal is protecting users, not reducing costs.

        Yeah, but who would finance the ombudsman? To service a country like Germany, I'd bet it needs around 2,000 FTEs minimum (Facebook alone is opening a new, additional 500-FTE centre right now, and that's just for deleting the worst of the worst hate speech and porn). That's around €5M per month.

        Having it paid for by taxpayers is the true manifestation of cost externalization; having it paid for by the services quickly leads to "do whatever $company wants"; and having it paid for by users leads to service only for those who can afford it, leaving the poorest and most vulnerable out in the rain.

        • TuringTest 2420 days ago
          Maybe a non-profit foundation could assume the task, financed by Google's PR money and wealthy patrons, and having some sort of community representation to take care of the needs of users.

          Modern society is looking a lot like time-compressed feudal eras, with corporations taking the role of noble families; imho the time of powerful independent bourgeois professionals, thriving under the rule of law in nation states, is coming to an end. Maybe we should start looking at the medieval ways of organizing a fair society, at least as the starting points for the new social structures that will be unique to the digital era.

  • tempodox 2406 days ago
    Mis-applying bad so-called “artificial intelligence” is still a prime example of natural stupidity.
  • brndnmtthws 2420 days ago
    If you use YouTube, you are subject to the whims of that private corporation, regardless of whether it's right or wrong.

    They should find a way to host the content somewhere else.

    • emilsedgh 2420 days ago
      Well, almost everything we use these days belongs to a private organization, and we're fully subject to their whims.

      Something is wrong with that.

      • crusso 2420 days ago
        And yet, without those private organizations and the free markets that allowed them to thrive, the whole notion of having all this preserved online by them would be utterly unimaginable. The internet as we know it, capable of delivering all this bandwidth to individuals around the world, wouldn't exist. At best, it would still be a toy in government labs and academia, like it was before it was commercialized.
        • emilsedgh 2420 days ago
          I'm not against privatizing the internet. I'm sure it wouldn't have been this useful if it weren't for private companies.

          I'm just saying the notion that private companies (on the internet or not) have almost zero responsibility and we're subject to their whim is wrong.

          I guess that's where the rule of law is supposed to enter the discussion.

        • leereeves 2420 days ago
          That's hardly certain. The government built the interstate highways, it could have built the Internet as well.

          Privatizing it was a political decision.

        • icebraining 2420 days ago
          > At best, it would still be a toy in government labs and academia, like it was before it was commercialized.

          This is demonstrably untrue. Minitel was already much more than that, despite being designed and implemented by a division of the French Ministry of Posts and Telecommunications.

        • cat199 2420 days ago
          You're conflating the products of an economic system with its organization, which is not to say you are wrong necessarily, but this is not a neutral, objective, or provable position to take.
        • tpallarino 2420 days ago
          I don't see how this would be any different from a library. We need new, good public institutions such as this.
    • WalterSear 2420 days ago
      They are non-technical people posting from a war zone.

      Or were; some are dead now and have taken their videos with them.

  • raverbashing 2420 days ago
    Why are people storing evidence on Youtube again?

    Not blaming the victim, but at this point most of Google's services have not proven to be reliable, especially if you need some kind of thinking human behind a decision.

    • gambiting 2420 days ago
      If you are a rebel in a contested part of the world, submitting videos to YouTube takes literally two clicks on the cheapest Android phone, and then the entire world can watch them. Everything else requires at least a bit of technical expertise, which you might not have.
      • lazyasciiart 2420 days ago
        There is probably someone in that 'entire world watching it' who has the time and technical expertise to help make backups. If nobody else knows about it, then it wasn't going to be used for evidence of anything anyway.
      • raverbashing 2420 days ago
        Correct.

        You could also save it to Google Drive or other "cloud backup" solutions like OneDrive/Dropbox.

        But I guess hindsight is 20/20, and I would probably have trusted YT more than I should have.

        • gambiting 2420 days ago
          If you submit a video to Google Drive and start giving people links, it will be very promptly disabled with a warning that if you want to share it you need to upload it to YouTube. Same with OneDrive/Dropbox. It's fine to share with a few people, but go into the hundreds and it quickly gets shut down.
          • raverbashing 2420 days ago
            Yes, don't share it from those, but you can upload to both
  • Anagmate 2419 days ago
    I feel like YouTube uses its monopoly to create a walled garden focused on (in their own words) advertiser-friendly content.

    The thing is, it makes perfect sense from their side: they will make people angry, but why should they care if those people can't go anywhere else?

    I'm starting to feel that a competitor providing the same quality of service while allowing all kinds of videos has a chance to succeed. It's OK to have children's videos, porn and Syrian war documentation on one platform, as long as you can filter: maybe have some sort of "curiosity" slider with child-safe content at one end, YouTube-like content in the middle and all content at the other end. Also some category toggles... If you're unhappy with the current selection, just take a few minutes of your time and change your preferences.

  • anotherbrownguy 2420 days ago
    Given that all of the videos happen to be anti-ISIS... and YouTube happens to be owned by an evil empire in bed with American military industry which created ISIS... the AI must have figured out that the videos could be a threat to its masters.
  • AmIFirstToThink 2420 days ago
    What did they train the AI on to deem something 'extremist'?

    Should we get to see the training data and labels used?

    Or is this the modern-day equivalent of the credit score algorithm: something that can have a huge impact on lives, but that you are not allowed to know anything about?

    This is bad.

    • nthcolumn 2420 days ago
      We will have to get used to this - people hiding behind their AI's skirts saying 'Wasn't me - she did it'.
      • AmIFirstToThink 2419 days ago
        They are still liable. They unleashed it upon the world.

        Can't wait till they are sued out of their bubble.

  • wyager 2420 days ago
    YouTube is a really horrible service for content creators. For this type of content, you're probably best off with LiveLeak (which, incidentally, seems to be a much better source of breaking news than YouTube these days). Ideally, we'd all switch to LBRY or some sort of IPFS video distribution or something, but that will take time.
    • tpallarino 2420 days ago
      Yeah, wow, an audience of billions having instant access to your content at the click of a button. So terrible for content creators. Most of these content creators wouldn't even exist without YouTube; they'd be working in a cubicle somewhere forwarding memos.
  • dickbasedregex 2420 days ago
    Screw YouTube's automation across the board. It's horrendous and lazy.
    • mtgx 2420 days ago
      Yeah, forget about beating top Go and StarCraft 2 players. How about making the takedown of YouTube videos actually fair for a change?
      • ocdtrekkie 2420 days ago
        Games are still 'easy' in comparison to this sort of topic for AI. Bear in mind, every game they've had an AI play still has a firm set of rules.
  • cyanexttuesday 2420 days ago
    Google increasingly seems like a regular, almost evil corporation.

    I miss the days of "don't be evil".

  • carvalho 2420 days ago
    War crime evidence can also be extremist material. It is often repackaged as propaganda to rile up new troops.

    Give evidence to the courts or the police. Don't upload it to a video entertainment site and expect it to stay up when it skirts their rules.

  • greyman 2419 days ago
    As I understand it, this is the result of Google itself having quite strong political opinions, at least recently. They have profiled themselves as leftist/progressive... their software just enforces this.
  • bedros 2420 days ago
    Very related to this article about Facebook [0].

    Corporations control what information is passed to people and create their own version of reality by blocking what they don't agree with.

    I know it's AI, but it seems that Google's appeals process agrees with the AI's decision.

    People should read Noam Chomsky's Manufacturing Consent; here's an interview about it from 1992 [1].

    [0] https://news.ycombinator.com/item?id=14998081

    [1] https://www.youtube.com/watch?v=AnrBQEAM3rE

  • snakeanus 2420 days ago
    It seems that we really need to find a new distributed/decentralised censorship-resistant way to distribute videos.
  • ajarmst 2420 days ago
    YouTube does not seem to me to be an appropriate medium for "war crimes evidence". Evidence needs documented provenance, chain-of-custody, storage integrity, affidavits, etc etc. Why does this evidence need a high-bandwidth publicly accessible and searchable interface? For what purpose?

    To be honest, if you have evidence of a war crime, I hope your plan to seek justice doesn't depend on Youtube.

  • ajb 2420 days ago
    Douglas Adams's 'peril-sensitive sunglasses' are nearly here.
  • floatingatoll 2420 days ago
    In case it's not already apparent, there's a business opportunity here for someone to automate "set up an S3 bucket and host videos in it" as an app that uses an API key: you simply provide the key, and the app manages your video collection, gives you a UI for it, and charges you a monthly fee.
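
    The core of such an app is genuinely small. A rough sketch with boto3 (bucket name and filenames are hypothetical; assumes AWS credentials are already configured):

        # Upload a video to your own S3 bucket and mint a shareable link,
        # without making the bucket public.
        import boto3

        s3 = boto3.client("s3")
        bucket = "my-video-archive"   # hypothetical bucket

        s3.upload_file("video.mp4", bucket, "videos/video.mp4",
                       ExtraArgs={"ContentType": "video/mp4"})

        # Pre-signed URL, valid for 7 days (the maximum for SigV4).
        url = s3.generate_presigned_url(
            "get_object",
            Params={"Bucket": bucket, "Key": "videos/video.mp4"},
            ExpiresIn=7 * 24 * 3600,
        )
        print(url)

    The hard part is exactly what you describe: wrapping key management, listing, and billing in a UI normal people can use.
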
  • tetromino_ 2419 days ago
    Often there is no difference between war crime evidence and war crime glorification that machine learning could discern. Exactly the same content could be interpreted as "look at us do great things in defense of our noble ideals!" and "look at these monsters do horrific things for no justifiable reason!".

    The difference is in the audience's mindset - which is only partially influenced by the uploader's intentions, and partially by how other pages and channels link to the video and present it, and partially by historical context (the same content can acquire a different interpretation five years down the road). Machine learning cannot be expected to emulate that.

  • dandare 2419 days ago
    I am very concerned about Google using AI to filter hoaxes from search results. The government testing syphilis on the black population, or selling drugs to fund terrorism? Those must clearly be hoaxes, right?
  • redthrowaway 2419 days ago
    One of the most interesting developments in AI will be watching how we respond to human rationality detached from human morality. Programs that optimize for practical outcomes are going to come up with a whole host of solutions that we consider abhorrent, not least because the mere notion that that solution is a practical one riles our sensibilities.
  • chinathrow 2420 days ago
    The revolution will not be televised.
  • bryanrasmussen 2418 days ago
    I find this interesting in comparison with the Google API that detects toxic comments. I suppose we'll be seeing the same sort of situation in comment sections (though less irritating).
  • TheRealPomax 2420 days ago
    To be fair, YouTube is under no obligation to some greater good; it's just a video hosting service. Expecting it to "preserve footage" - and any footage at that - is a strange expectation.
    • cisanti 2420 days ago
      Not an obligation, but their mission statement is "To organize the world’s information and make it universally accessible and useful."

      Guess they need to change that to "information that we and our advertisers agree with."

      Yes, I know they are different companies under Alphabet, but it doesn't matter. Google has become a monster: too big, too powerful.

      • camus2 2420 days ago
        > but their mission statement is "To organize the world’s information and make it universally accessible and useful."

        It's just marketing. If you really want to see what the actual "mission" is, read the TOS. Google, Facebook, Twitter and co like to boast about their humanitarian and humanist stance, the Apple way, when it comes to their relationship with their users, but that's all a lie. The moment the needs of their users don't match their financial interests, all bets are off. The shit-storm triggered by a few outraged online publications and advertisers a few months ago is a demonstration of that fact.

        Independence and freedom of speech online have a price, and "users" are going to find that out the hard way when Google refuses to host their content for political reasons.

        People have already forgotten that the tech to share content online already exists. It's called RSS, and Google, Twitter, Facebook and co want it to go away.

      • dickbasedregex 2420 days ago
        Agreed. Google just grosses me out these days.
    • beejiu 2420 days ago
      An organization should not be free of criticism simply because it does not have an obligation to do or not do something.
      • tentaTherapist 2419 days ago
        Yes, but lots of people are suggesting regulation, which is more than just criticism.
    • Sir_Substance 2420 days ago
      >To be fair, YouTube is under no obligation to some greater good;

      Eeeeeeeeeeh....I don't know about that.

      Lots of corporations today target "owning" a certain aspect of humanity. Facebook "owns social", Google "owns search", and LinkedIn is having a jolly good swing at "owning recruitment". Youtube wanted to "own video", and by and large it has succeeded. I'm not sure they get to hold that position consequence-free, though.

      I'm increasingly of the opinion that companies that manage to pin an entire market implicitly take on a social responsibility, and lots of them are not shouldering it appropriately.

    • jacobr 2420 days ago
      You are of course right in theory, but that's not a good enough justification for letting evidence of war crimes get lost.

      If you film people getting shot at in a demonstration and want to get the word out, chances are you use a popular social network. You might not have any further knowledge, or might not be able (because you are imprisoned, fleeing, or dead, like the majority of Syrians) to put the video elsewhere.

    • mtgx 2420 days ago
      Yes, but the more people realize what an awful platform YouTube is for keeping their videos, the better.
    • tomjen3 2420 days ago
      And we are under no obligation to not hurt their PR over this.
  • StreamBright 2419 days ago
    Torrent-based YouTube alternative when? I think the technology is ready to move all of the content to a distributed system where it cannot be censored.
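
    Most of the plumbing already exists. Creating a torrent for a single video is a few lines with the python-libtorrent bindings (a rough sketch; the tracker URL and filename are examples only):

        # Build a .torrent for one video file, then seed it with any client.
        import libtorrent as lt

        fs = lt.file_storage()
        lt.add_files(fs, "video.mp4")            # example filename

        t = lt.create_torrent(fs)
        t.add_tracker("udp://tracker.opentrackr.org:1337/announce")
        lt.set_piece_hashes(t, ".")              # hash pieces; file is in cwd

        with open("video.torrent", "wb") as f:
            f.write(lt.bencode(t.generate()))

    The unsolved parts are discovery, streaming-grade playback, and keeping enough people seeding; the censorship resistance itself is old technology.
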
  • mirimir 2419 days ago
    So does YouTube want to look like an ISIS supporter? Or at least, that it doesn't approve of criticizing ISIS?
  • ekianjo 2420 days ago
    Is it possible to host such videos on archive.org? Is that a valid option?
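
    For what it's worth, archive.org does accept public uploads, and the official internetarchive Python package can script them (a hedged sketch; the item identifier and metadata are made up, and it assumes credentials were stored via `ia configure`):

        # Upload a video as a new archive.org item.
        from internetarchive import upload

        responses = upload(
            "example-syria-footage-2017",        # hypothetical, globally unique ID
            files=["video.mp4"],
            metadata={
                "title": "Raw conflict footage (example)",
                "mediatype": "movies",
                "collection": "opensource_movies",
            },
        )
        print(responses[0].status_code)          # 200 on success
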
  • devpalmari 2419 days ago
    Hope YT did a soft-delete on those files...
  • AmIFirstToThink 2420 days ago
    And to think they had me convinced that this was not going to happen for a few decades.

    I think YouTube went down pretty fast and without a fight. The ideological takeover of Facebook and Twitter raged on for a few years; YouTube was taken over literally overnight. I remember being appreciative of YouTube just a few days back.

    Guess it's time to cancel my $15 Youtube Red Family membership. Ugh, I really hate ads on YouTube, and I was happy to give my $15 month over month. But I can't fund Youtube anymore given what they are doing. $15 to Youtube, $10 to Netflix, $10 to Amazon: with that $35 a month, I can sponsor a ton of content that I like on Patreon. My subscription list on YouTube is not 35 people long; I think it would work out.

    Never did I think I would type these words... break up Google and Facebook and Amazon.

  • crusso 2420 days ago
    > should be required by law

    If your videos don't pass the algorithm, post them somewhere else rather than reaching for the government hammer.

    Youtube/Google has every right to run their business of posting or denying video content the way they see fit, without justifying it to you, a free user of their service.

    If you think they're making a bad business decision and that there's a need for a video service that gives great explanations when they deny your videos, start such a service.

    • dang 2420 days ago
      We detached this subthread from https://news.ycombinator.com/item?id=14998738 and marked it off-topic.
      • crusso 2420 days ago
        On second thought, nevermind. I realized that I'm just wasting my Saturday here. Have a nice day.
    • icebraining 2420 days ago
      > If you think they're making a bad business decision

      No, that's not what people arguing for a law think, and if you don't understand that, you can't make an effective argument rebutting their position.

      The implied position is that Google is doing something bad for society, even if it's good for business. You may disagree that this is bad for society, or that even if it is, it's still Google's prerogative, but you should at least understand the argument if you want to have a meaningful discussion.

      • jMyles 2420 days ago
        I actually thought crusso did a good job of rebutting that argument given the context of an HN comment that presumably can't stretch on for volumes of political theory.
      • crusso 2420 days ago
        You ignored the first part of my post, where I said: post elsewhere rather than appealing to government authority. My last statement suggested a better idea for a free society than trying to destroy that free society bit by bit with yet more overpowering, overintrusive government.

        > The implied position is that Google is doing something bad for society

        That completely doesn't compute. Google isn't in business to make things better for society. If they were, maybe they should give all their profits to charity, stop hosting cat videos, and instead become some kind of host of national public content? Who said that was their business?

        I mean, why not argue that youporn.com isn't hosting these anti-ISIS videos either? They host videos. They could start hosting anti-ISIS videos to archive them.

        • jMyles 2420 days ago
          I wholeheartedly share your "don't run to government" sentiment.

          However, this sentiment bugs me:

          > That completely doesn't compute. Google isn't in business to make things better for society.

          If we're going to refrain from seeking the violence of state intervention when actors do things we don't like, don't we then have to count on business (and more generally, the market) to do things that we do like?

          Yes, I want Google to be in business to make things better for society. I'll go even further: I want Google to position itself so that its profit depends on it doing good for society. I was proud of Google when it aligned things this way re: VP8 / webm.

          If the violence inherent in the state is undesirable - and I agree that it is - then we need to build a society in which the free market selects for societal good.

          • krapp 2420 days ago
            >If the violence inherent in the state is undesirable - and I agree that it is - then we need to build a society in which the free market selects for societal good.

            That's not a free market though.

            The unfortunate truth is, if you want "societal good" to be guaranteed, (given some definition of that) you have to force it to be so - which is what the violence of the state is intended for. Otherwise markets take the path of least resistance to greatest profit, and that may not be a path which benefits society as a whole.

            • jMyles 2420 days ago
              > That's not a free market though.

              > markets take the path of least resistance to greatest profit

              I assert that a society in which love and compassion are the orders of the day will, in the demands and price signals it sends to the market, not abide by this orthodoxy.

              > if you want "societal good" to be guaranteed, (given some definition of that) you have to force it to be so - which is what the violence of the state is intended for

              The idea that wonderful ends can spring from such unseemly means is not in keeping with what I am able to observe of the universe.

              • krapp 2420 days ago
                >I assert that a society in which love and compassion are the orders of the day will, in the demands and price signals it sends to the market, not abide by this orthodoxy.

                Such a market (and society) would seem contrary to human nature; therein lies the problem. People are willing to tolerate mob violence, rape and slavery to spend a bit less on their smartphones.

                    If men were angels, no government would be necessary. If angels were
                    to govern men, neither external nor internal controls on government
                    would be necessary.
                    --James Madison, Federalist No. 51
        • icebraining 2420 days ago
          Eric Schmidt disagrees with you: "In general, I think our mission is to use technology to really change the world for the better."

          But anyway, you keep trying to argue with me, while I'm only making the point that you should understand the argument. I'm not sure if you're unwilling to try, but in any case I'm not interested.

          I actually sympathize with the position that we shouldn't have laws requiring Google to host this content, by the way.

          • crusso 2420 days ago
            xx
            • icebraining 2420 days ago
              It's not ad hominem because I'm not trying to rebut your argument. In fact, I'm explicitly saying I'm not engaging with the argument you keep making. Your post just underlines that you're not actually listening to the people you're talking to.
    • AmIFirstToThink 2420 days ago
      And the government has the right to break up monopolies.

      I now fully support breaking up Google, Amazon, Facebook and Microsoft; maybe Apple.

      Executives at publicly traded companies have no right to enforce their individual political views as company policy. These are not privately held companies; they are public companies, which are held to higher standards of equal treatment for all.

      What if Bic and Mead said you can't write an opinion we don't like using our pens and notebooks?

      What if the US Postal Service said you can't send a snail mail if it contains references to UPS or FedEx?

      • crusso 2420 days ago
        > And the government has the right to break up monopolies

        Youtube is successful and even dominates the space, but is not a monopoly. There are other video hosting services that can be accessed by anyone. You can start your own if you think there's a need. The fact is that people like Youtube.

        Monopolies have a specific lock on their customers or their supply of a limited resource. That's why people are okay with breaking them up. Not just because they're really popular and successful.

        > What if Bic and Mead said you can't write an opinion we don't like using our pens and notebooks?

        Bic and Mead don't host your content. You buy their tools to create your own content and then publish it yourself, give it out yourself on media you paid for, etc.

        > What if the US Postal Service said you can't send a snail mail if it contains references to UPS or FedEx?

        The USPS is a government-supported entity. Special rules apply to entities that have government laws specifically backing them.

        If a delivery entity opened your packages to evaluate content, that would present an entirely different set of problems for them and their customers - because the content is private. Youtube videos are public. That's the point. Youtube has determined, for its own reasons, that extremist content hurts its business model.

        • AmIFirstToThink 2420 days ago
          >Bic and Mead don't host your content. You buy their tools to create your own content and then publish it yourself, give it out yourself on media you paid for, etc.

          The people who bought into YouTube as a platform, the content creators and their followers, paid into the platform with their time and by watching ads. Remember, when you are not paying for the service, you are the product. The people who adopted the YouTube platform paid for its success. YouTube was a sanctuary when Facebook and Twitter were cracking down on free speech. Now YouTube changing its position is a clear bait and switch. You allowed people to use the platform until you were successful, taking their time and energy to drive your ad revenue, and now you are cracking down on content you don't like. It is true that in this day and age it will be relatively easy to secure funding outside of YouTube ad revenue and host content yourself; startups will start doing just that. YouTube will look back at this moment, when executives' individual political preferences were turned into company policy, destroying shareholder value.

          >Monopolies have a specific lock on their customers or their supply of a limited resource.

          YouTube's changes are a bait and switch on the content creators and consumers who made the platform successful. Demonetizing someone overnight, after years of effort put into the platform and acquiring followers, is extreme power for a corporation to hold.

          >If a delivery entity opened your packages to evaluate content, that would present an entirely different set of problems for them and their customers - because the content is private.

          That's precisely my argument for Net Neutrality. :-) Don't look at the content; I bought data transfer, and the ISP should just forward it on using internet routing rules, just like the Post Office does.

          • crusso 2420 days ago
            > paid into the platform with their time and by watching ads

            The ads pay for the content you watch at the time. They don't pay for a right to dictate the terms of their service in perpetuity. Thus, this whole line of thinking is a non-starter. Further, YT is not required to support whatever business model you build on top of theirs unless they've agreed to a contract of some sort. Do you know of such a contract?

            > destroying shareholder value

            Maybe so, but their rationale at the moment is that extremist videos create a hostile environment for their users. They would rather not perpetually expose their preferred users to the toxicity of extremism. It's arguably a move to increase shareholder value. If you think they're wrong, don't be a shareholder.

            • AmIFirstToThink 2420 days ago
              The content policy has changed following political ideology.

              Now content that does not violate any terms or conditions is also getting removed/demonetized, again following political ideology.

              You are focused on Google being able to do what it is doing. I agree; they are quite within their rights to do so.

              But I see this situation the same as a GitHub project that starts with an Apache license to get adopters, eyes that find and resolve bugs, and eyes that ask for much-wanted features, and then, after becoming successful, switches to AGPL or a proprietary closed-source license selling an Enterprise Edition. That is a slap in the face of the people who gave time to the project.

              Anyone who spends time in the modern software industry, like a user on Hacker News, will clearly see the similarity between the YouTube situation and a license bait-and-switch on successful open source software.

      • macintux 2420 days ago
        When you find something Apple has a monopoly on, you might have a point there.

        The only monopoly they've ever had is on taste, and for a short period of time I suppose you could argue MP3 players.

        • jonex 2420 days ago
          Devices capable of running apps designed for iOS. Selling native apps to iPhone users. Charging users within native apps on iOS.
          • macintux 2420 days ago
            I suggest you re-read https://en.wikipedia.org/wiki/Monopolization.

            Defining a market as "the products Apple sells" for the purpose of determining that Apple has a monopoly on that market is irrational.

            The iPhone is one competitor in the smartphone market. Apple clearly lacks a monopoly.

        • AmIFirstToThink 2420 days ago
          iMessage
      • ComodoHacker 2420 days ago
        >I now fully support breaking up Google, Amazon, Facebook and Microsoft; may be Apple.

        OTOH, only their shitloads of money allow them to resist government demands to hack a user's phone or hand over users' data in their cloud. Every cloud has a silver lining.

        • greenyoda 2420 days ago
          Microsoft, Google, Facebook and Apple were all participants in the NSA's PRISM surveillance program that Edward Snowden disclosed:

          https://en.wikipedia.org/wiki/PRISM_(surveillance_program)#M...

        • thaumasiotes 2420 days ago
          > only their shitloads of money allow them to resist government demands to hack a user's phone

          Their extreme centralization makes it much easier for the government to make those demands. I think you're more right than wrong, though.

        • AmIFirstToThink 2420 days ago
          Governments change, at least every eight years.

          The fiefdoms of AFGAM are there for pretty much life.

    • yeukhon 2420 days ago
      Yes, absolutely! We solved the problem!

      No really, seriously, this idea is so flawed.

      * You don't like the government? Move or start your own government!

      * You don't like your house? Rebuild it or buy another house!

      * You don't like the way Earth is being run? Move or RIP!

      * You don't like the way the school is being run? Move to another town or build your own school!

      * You don't like the fact oil is so expensive? Drill your own oil!

      * You don't like this comment? Flag it or deal with it or build your own HN.

      * You don't like the way anything is done or served? DIY all the way.

      * You don't like your local Starbucks? Run one or go to Dunkin' Donuts.

      * You don't like the way the hospital runs? Run one or don't go.

      Something more extreme?

      * You don't like your parents? Disown your relationship with your parents or make new parents!

      * You don't like your child? Disown your child and make one again.

      Why don't we dedicate our own lives to building things every time we don't like something? Because we have better things to do. We deserve to complain; we have every right to dislike a service, and we don't need to discredit our expectations.

      Users want real-time PvP and exchange/trade in PokemonGo, but after a year they are still not available. So why don't we build a new PokemonGo? Because we would need to rent servers, build the code, maintain the code, and make a deal with the Pokemon rights owners (which is impossible for a small company). So we bend and yield to Google, because we have no better alternative.

      This incident teaches us a few things:

      1. Please do not think an AI/algorithm is smart enough to replace an actual human. Humans carry biases and are sloppy too, but algorithms are just as biased, if not more biased and more sloppy (when dealing with "abnormality" and edge cases), than a human being will ever be. The hype says "AI" will take over... we are nowhere near that.

      2. YT is too popular to listen. Losing a few thousand users means nothing to YT. We will forget about this in 5 hours and go back to doing our own things.

      • crusso 2420 days ago
        Appealing to government authority every time some business does something you don't like should make your list of "so flawed" solutions.

        1. No one said it was, but it's their business. We live in a society that allows you to start up your own business in days if you don't like the offering of another private business. Giving up immediately and suggesting that the wielder of force in the society (government) jump in as a parent is usually a fail.

        2. That still doesn't give you the right to dictate how they run their business.

        There's definitely something going on in our society these days where everyone wants authority to solve all their problems, without acknowledging all of the potential problems that strengthening authority leads to. The article below talks about victimhood, but it mainly tries to illuminate where this appeal-to-authority strain is coming from.

        http://reason.com/blog/2015/09/11/victimhood-culture-in-amer...

        • sirtaj 2420 days ago
          It's a common response even here on HN when companies behave contrary to the social good - "it's their job to make their investors profit while working within the law. Don't like it? Change the law." Well, don't be surprised when they do. Why complain now?
        • anigbrowl 2420 days ago
          Proposing a legal change isn't appealing to government authority. It's asserting that we should formalize a social attitude into a rule. Complaining about government as a paternalistic intervenor ignores the right of people to shape government to work on their behalf, which is the reason governments are instituted to begin with.

          So what that society 'allows' you to start up your own business? The economic and logistical barriers to doing so in competition with a monopolistic incumbent are significant, and overcoming the anticompetitive tactics of incumbents is horrifically wasteful. People have every right to set general standards for how they want business to be conducted and to promulgate those standards by law.

          Rather than warding off government as some remote and malign force, we should be seeking to make the process of governance more responsive to the public interest. The stability and consistency that allows markets to operate is government's primary product.

          • nnnnnande 2420 days ago
            The fact that this is being downvoted is such a shame. It's a good addition to the discussion, and it shouldn't be downvoted just because one does not agree with it politically.

            "People have every right to set general standards for how they want business to be conducted and promulgate those standards by law."

            I think that this in particular is a very important insight with regard to the accountability of multinational corporations. In this case government acts as a way of stipulating conditions of common decency.

          • crusso 2420 days ago
            > Complaining about government as a paternalistic intervenor ignores the right of people to shape government to work on their behalf, which is the reason governments are instituted to begin with.

            Anigbrowl, I'm sure you and I are just going to have to disagree on that one.

            The purpose of the federal government in the USA is to execute its duties in the Constitution within its enumerated powers agreed to by the States that ratified it, along with the amendments that have been agreed upon since. Ultimately, the Constitution was constructed in such a way as to protect individual liberty while minimizing the chance that the government would become too powerful. Mechanisms such as the checks and balances and enumerated government powers were put in there explicitly to constrain those in government and prevent them from thinking that they could shape the government however they wanted without fundamentally altering the Constitution through new amendments.

            Treating the government like it's a parent there to fix all problems for the children has led to failed state after failed state as leaders assume more and more power over their people. Venezuela is the latest in a long line of examples of where the paternalistic-intervenor view of government leads. It's a shame that societies never seem to learn that lesson. My FB feed is full of my more left-leaning friends' laments about how Trump is in control of the American government. But the left still doesn't seem to grasp the fact that Trump isn't as bad as it can get - assuming he doesn't do something that makes things even worse. Keep increasing the power of government and a future incarnation of Trump will really terrify you.

            • anigbrowl 2417 days ago
              Well, I certainly disagree with your rhetorical bait-and-switch here.

              > checks and balances and enumerated government powers were put in there explicitly to constrain those in government and prevent them from thinking that they could shape the government however they wanted without fundamentally altering the Constitution through new amendments.

              I was talking about people, as in the electorate, shaping government to suit the public interest, whether by the passage of laws or by constitutional amendment.

              For some reason you switched that out with 'those in government'. Whether you mean professional politicians or bureaucrats, this is a completely different thing from what I was talking about.

              > Treating the government like it's a parent there to fix all problems for the children has led to failed state after failed state as leaders assume more and more power over their people.

              You're the one asserting that government is being used that way. I say that the regulatory role of government is entirely valid and constitutional and reject your claim of paternalism. Your argument is tantamount to saying that any kind of legislation is questionable because it undermines the constitutional role of government, which is absurd.

              > But the left still doesn't seem to grasp the fact that Trump isn't as bad as it can get - assuming he doesn't do something that makes things even worse. Keep increasing the power of government and a future incarnation of Trump will really terrify you.

              You don't seem to get that the left understands this perfectly well, and indeed better than you. You seem to imagine that dictatorships result from an excessively powerful government whose levers are suddenly seized by a bad actor, switching it to evil overnight. In reality, institutions (both formal and informal) are corroded over time, from within. Hitler had been in power almost a decade before the Nazi government began to implement the 'final solution.' You may wish to reflect that the Nazis never abrogated the Weimar Constitution but maintained it as a legal fig leaf for their totalitarian policies. The US Constitution is similarly capable of exploitation, e.g. via Article III, Section 2.

              Your thesis seems to be 'we must keep government weak so that it can never be used to oppress us' but this is absurd on its face. What keeps governments in line is the granting or withholding of the consent of the governed.

        • quanticle 2420 days ago
          The problem is that YouTube is a monopoly in online video. We wouldn't even be having this discussion if there were other viable providers for online video hosting. However, there aren't. So we're down to either complaining about YouTube blocking videos where it shouldn't or defending YouTube on the grounds that they're a private corporation and they have the right to host or not host whatever they please.

          Moreover, I've noticed a certain ideological favoritism on Hacker News towards Google, Facebook, etc. I strongly suspect that if it weren't YouTube, but rather Comcast that was filtering these videos, the community's reaction would be different.

    • chc 2420 days ago
      "Go start your own business if you disagree" seems like a middlebrow dismissal unless you're actually offering to fund the endeavor.
      • crusso 2420 days ago
        It's a dismissal of the idea that we need to appeal to an authority to solve every problem we see. That idea ignores the very real ways that civil society can solve problems without the use of government force and the myriad unintended consequences that go along with it.

        There are countless guides on the Internet for writing software that can create blogs, host pictures, videos, and other content. You could do it yourself or hire a few people for a relatively small amount of money.

        Start off with a hosting company and if you're successful, get your own colocation facility.

        Build from there.

        (yes, I was an engineering founder of an ISP in the Bay Area and built it up to past the point of being capable of the above - and we did it without outside funding, just our own savings from our jobs: engineers, accountant, support person)

        • anigbrowl 2420 days ago
          But people have legitimate desires as consumers without necessarily wanting to be competitors, and without necessarily waiting for the market to evolve an alternative. You seem like a smart person; surely you know that, while other platforms besides Youtube exist, the network effects that obtain on a large platform have huge commercial value.

          I get that you don't like government force and the unintended consequences that often follow from the creation of laws, but you seem oblivious to the fact that markets have externalities and problems of their own, and are not by themselves sufficient for the operation of a society.

    • tanilama 2420 days ago
      Agree. If Google just wants Youtube to be an entertainment funnel, it has every right to do so.

      If the service you describe existed, it wouldn't survive long. People would begin to swarm such a service with videos rejected from Youtube, which you can foresee would be problematic and messy; in the end it might just turn into a 4chan where you can upload video. Hardly an attractive business.

    • cvsh 2420 days ago
      That addresses one issue posed by AI bans, but not the other issue I mention in the second paragraph.
      • crusso 2420 days ago
        That the ban was libelous? I'm not a lawyer, but being deemed extreme according to their own standards doesn't sound like something you could successfully sue over as libel.

        I think that libel involves written statements with objective specificity that are provably wrong. Subjective statements, especially if only implied by the removal of some videos, would seem to be entirely within the realm of protected opinion.

  • immanuel_huel 2419 days ago
    This was to be expected. All history books are written this way: they are government propaganda, not a documentation of the truth. So history at school is nothing but learning government propaganda.
  • davidreiss 2419 days ago
    Thank you WSJ, NYTimes and the traditional media for pressuring youtube, facebook, reddit and social media to censor.

    People aren't aware that for the past few years, traditional media and social media have been battling behind the scenes over content, narrative and censorship. It was a major war that the public was simply unaware of. Suffice it to say, traditional media won.

    It is amazing how a select group of news organizations and their editors and journalists can use their bully pulpit to intimidate certain industries.

  • pottersbasilisk 2420 days ago
    Perhaps it's time for Google and Youtube to be regulated.
    • carapace 2420 days ago
      "Nationalize Google!" Okay, no, that would be turning the knob to 11.

      I upvoted you because I was like, "Yeah, maybe some regulation might be good." Then I thought about who would be doing the regulating and now I'm much less enthusiastic. :-/

      Still, it does seem like there should be a bit more, uh, community input into how these vast silos are administered.

      • mythrwy 2420 days ago
        The community making its own silo(s) might be another option.

        (Problem is, silo-making by committee has its own set of challenges.)

        • carapace 2420 days ago
          Well, I agree, but then how do you get Joe and Jane User to switch?
    • icebraining 2420 days ago
      Every company is regulated. You have to be more specific than that.
  • GlobalServices 2420 days ago
    Google needs to be broken up. They have too much power.
  • ricky_kutch 2419 days ago
    They don't care; better to lose legit content than advertising dollars.
  • haterswillhate 2420 days ago
    YouTube is starting to die. People, move to different platforms!

    Google, you're really starting to suck more than my socks!

  • superioritycplx 2419 days ago
    This is a side effect of Google employing the ADL, which is only interested in political cleansing. "AI" lets them have plausible deniability.
  • mozumder 2420 days ago
    Can't any prosecutor gain access to those videos via subpoena anyway?
    • megous 2420 days ago
      What are you talking about? They are deleting the entire accounts of people who were filming their cities/districts being bombed by the Russians, Assad, or the US/Coalition (where there may not even be any direct violence against any particular person visible).

      Who will prosecute whom? Primary-source historical material is being removed wholesale by some shitty "AI". The record of recent history is being erased. Researchers who want to put together an account of this history will have a harder job; they are not going to subpoena Google to release material they don't even know Google has. People whose lives were destroyed by a dictator will see YouTube erasing evidence of what happened, oftentimes leaving the regime's propaganda channels untouched. It's a disgrace.

      I actually came here today to try again to post about this issue, after deleting my Google/YouTube account, because I wholeheartedly disagree with this whole fiasco that has been going on for the last month or so. So I'm glad it's finally being discussed.

  • solarkraft 2420 days ago
    YouTube is not a reliable video host, but that's okay; it's a company. Fortunately these videos don't really rely on people finding them via algorithmic recommendations, as they are merely evidence. I don't see a problem, and I completely understand why YouTube (especially as it's trying to be as inoffensive as it can) doesn't want to show war crimes.