
Companies are pulling ads from YouTube to protect their brands


gcubed

Member
Between this and Facebook's child-sexualization problem, it's pretty embarrassing that these companies don't even respond to reports made through their own in-place reporting processes. It's bullshit, and I wouldn't mind it collapsing under the weight of its own bullshit.
 
As of late 2014, 300 hours of content were being uploaded every minute to YouTube. It's just not possible to deal with that volume without the use of ML.

https://www.google.com/amp/expandedramblings.com/index.php/youtube-statistics/amp/

Most of that content has close enough to zero viewers that it's completely unseen, for all intents and purposes. I'm sure advertisers don't really care if their ad gets prerolled on some shit-nothing video no one's actually watching; they just care that Google actually pays a spot of attention to what's getting views and whether or not it's completely abhorrent.

It's significantly less difficult to police high-traffic uploads.
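
Just to put a rough number on that, here's some back-of-the-envelope math. The review speed and shift length are my own assumptions, nothing official; only the 300 hours/minute figure comes from the link above:

```python
# Rough back-of-the-envelope: how many full-time reviewers would it take
# to watch everything uploaded? Assumed numbers, not YouTube's actual ops.

UPLOAD_HOURS_PER_MINUTE = 300       # YouTube's late-2014 figure
MINUTES_PER_DAY = 24 * 60
REVIEWER_HOURS_PER_DAY = 8          # assumption: one full-time shift
REVIEW_SPEED = 1.0                  # assumption: 1 hour of video takes 1 hour to review

uploaded_hours_per_day = UPLOAD_HOURS_PER_MINUTE * MINUTES_PER_DAY
reviewers_needed = uploaded_hours_per_day * REVIEW_SPEED / REVIEWER_HOURS_PER_DAY

print(f"{uploaded_hours_per_day:,.0f} hours uploaded per day")   # 432,000
print(f"~{reviewers_needed:,.0f} full-time reviewers needed")    # ~54,000
```

And that's just to watch each video once, at the 2014 upload rate.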
 

f0lken

Member
How long before Youtubers start bitching because companies don't want to associate with neo-Nazis?



Guy who a lot of people assumed was a good guy based off his video turned out to be a huge racist and white supremacist.

Youtubers will start the "it was all a joke" narrative that PewDiePie pushed a few weeks ago. Why are the bigots the ones whose voices are loudest? :/
 
Good. If YouTube wants to do business the way they are then they need to do a better job of curating content that does not rely on hate speech to build audiences. Let them go elsewhere.
 

FZZ

Banned
Fucking finally

It hurts my brain to think that this "alt-right" hate fucking bullshit is mainly happening because the people at the forefront of it are making money through avenues like YouTube

Twitter should step up next and start banning these fuckers off their platform

when you remove their voice they won't have power

how hard is it for CEOs to understand
 

cDNA

Member
YouTube already has an advertiser-friendly algorithm; a few months ago there was a huge uproar because many YouTubers were having their videos demonetized.
 

Fat4all

Banned
I thought it was going to be about their comments system. That's just as terrible.

They should probably re-think how those work, too.

Or don't work, really.

YouTube already has an advertiser-friendly algorithm; a few months ago there was a huge uproar because many YouTubers were having their videos demonetized.

That was directed more at tags and video titles iirc.

This seems to be aimed at the content of the videos themselves.
 

Haly

One day I realized that sadness is just another word for not enough coffee.
Who wants to bet Google's implementation of this will be horrible?

Look at what's happening with LGBT videos being put on restricted mode.

Well, if they can algorithmically suppress gay culture, I'm sure they could turn that on hate speech in its various forms.

I do not believe for a second that there is "no solution" to this problem. I can be convinced that there is no "perfect" solution but I never expected one in the first place. I do, however, expect a modicum of responsibility and demonstrative action, which has been severely lacking from all the tech giants.
 
If things continue to go this way, lots of good-natured content creators are going to be getting less revenue unless Youtube is willing AND able to go after all of the deplorables. That sounds like a bad time for people barely making it. Fucking Trump. Fucking US. Gah.
 

Fat4all

Banned
Now if only Patreon stepped in to stop haters getting support.

Unfortunately, if people want to give money directly to awful people, they are free to do so. In the same way, we couldn't ask PayPal to refuse donations to them.

The best we could hope for would be to ask Patreon to be a bit more stringent about who they allow on their service, but that would be up to them in the end; they don't answer to advertisers.
 

enzo_gt

tagged by Blackace
This is the most difficult thing. Creating, enforcing, and detecting boundaries for this type of content on such a massive scale is super difficult.

There's this weird perception that Twitter has been twiddling their thumbs about the whole harassment stuff, and while that may be true to some degree, implementing a solution for this kind of stuff is incredibly complex. I don't even know how you'd start. It'd have to be some sort of machine learning application, and even still you run the risk of over-censorship due to error (and the margin of error will still be a significant raw number of users/content) and subsequent backlash.

Part of me believes that a mediating solution by the platform isn't the answer.
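
To make the margin-of-error point concrete, here's a quick sketch with made-up numbers. None of these rates or volumes are real platform figures; they're just assumptions for illustration:

```python
# Sketch of "margin of error at scale": even a very accurate classifier
# flags a huge raw number of innocent items. All numbers are assumptions.

DAILY_ITEMS = 500_000_000        # assumed items posted per day
VIOLATION_RATE = 0.001           # assumed: 0.1% of items actually violate policy
FALSE_POSITIVE_RATE = 0.01       # assumed: 1% of clean items wrongly flagged
FALSE_NEGATIVE_RATE = 0.05       # assumed: 5% of violating items missed

violating = DAILY_ITEMS * VIOLATION_RATE
clean = DAILY_ITEMS - violating

false_positives = clean * FALSE_POSITIVE_RATE   # innocent content censored
missed = violating * FALSE_NEGATIVE_RATE        # abuse that slips through

print(f"{false_positives:,.0f} innocent items flagged per day")  # ~5,000,000
print(f"{missed:,.0f} violating items missed per day")           # ~25,000
```

So even a 99%-accurate system is generating millions of wrong calls a day, and that's where the backlash comes from.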
 
Google says it's committed to working on a resolution

Here's a simple solution: how about you just ban these fucking white supremacists? YouTube isn't obligated to have some moron on there talking about white genocide. Same with Twitter.

Wonder what google is going to do about it.

Probably some extremely convoluted "see no evil, hear no evil" solution that separates hate speech into its own bubble.
 

Somnid

Member
I wonder what's going to happen to people once we come up with better decentralized substitutes? You won't be able to ask or coerce anyone to take down content, and people who normally look to authority figures to police things are probably going to get steamrolled. YouTube has its business to maintain, so I assume it will do something, but I think people in general need to get a little more clever about how they approach these problems, because it's not going to work like this forever.
 
This is the most difficult thing. Creating, enforcing, and detecting boundaries for this type of content on such a massive scale is super difficult.

There's this weird perception that Twitter has been twiddling their thumbs about the whole harassment stuff, and while that may be true to some degree, implementing a solution for this kind of stuff is incredibly complex. I don't even know how you'd start. It'd have to be some sort of machine learning application, and even still you run the risk of over-censorship due to error (and the margin of error will still be a significant raw number of users/content) and subsequent backlash.

Part of me believes that a mediating solution by the platform isn't the answer.

Soon enough it'll be AI policing the whole thing.
 

Goo

Member
Maybe Google can set up an ad marketplace where creators petition for sponsorship based on some type of application and content-sample submission.

Keep the current ad system in place for companies who don't care what they sponsor and have the marketplace for the companies that want to know what type of content will show their ads.
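
Roughly, I'm picturing something as simple as this two-tier setup. Purely hypothetical names here, nothing Google actually offers:

```python
# Hypothetical sketch of the two-tier idea above -- not a real Google/YouTube API.
from dataclasses import dataclass, field
from enum import Enum

class AdTier(Enum):
    GENERAL_POOL = "general"            # current system: advertiser doesn't vet placements
    CURATED_MARKETPLACE = "curated"     # advertiser reviews an application first

@dataclass
class SponsorshipApplication:
    channel_id: str
    content_samples: list[str]          # URLs of representative videos
    category: str                       # e.g. "gaming", "vlog", "education"
    approved_by: set[str] = field(default_factory=set)  # advertisers who vetted it

def eligible_tiers(app: SponsorshipApplication, advertiser_id: str) -> list[AdTier]:
    """A channel always sits in the general pool; it only enters the curated
    marketplace for advertisers that explicitly approved its application."""
    tiers = [AdTier.GENERAL_POOL]
    if advertiser_id in app.approved_by:
        tiers.append(AdTier.CURATED_MARKETPLACE)
    return tiers
```

Advertisers who don't care keep buying the general pool; the ones burned by this story only buy the curated tier.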
 
Considering how many websites have been flooded with an influx of hate speech, harassment, and generally shitty people, sites are going to need to be much more proactive than reactive to nip issues like these in the bud. Thank god for the strict moderation of NeoGAF; a lot of places can't be bothered.
 

Nanashrew

Banned
Here's a simple solution: how about you just ban these fucking white supremacists? YouTube isn't obligated to have some moron on there talking about white genocide. Same with Twitter.



Probably some extremely convoluted "see no evil, hear no evil" solution that separates hate speech into its own bubble.

Agreed, it's the simplest solution. Their hate speech violates the ToS, and these platforms do nothing even when people report it.
 
Shit is about to get real now. Heavens move and planets crack when the money starts talking behind the scenes. I think 2017 is going to be a rough year for youtubers

This isn't just money talking, this is 'the' money talking. Once again people fail to realize that the Wild West days of the internet are over. These companies will allow a little bit of shenanigans to look like they're hands-off, but after a while the hammer comes down.
 

RootCause

Member
Here's a simple solution: how about you just ban these fucking white supremacists? YouTube isn't obligated to have some moron on there talking about white genocide. Same with Twitter.



Probably some extremely convoluted "see no evil, hear no evil" solution that separates hate speech into its own bubble.
I like the first part of the post. The second part is scary.
 

enzo_gt

tagged by Blackace
Soon enough it'll be AI policing the whole thing.
Yeah, this is what makes me fear it. The answer people want is basically the most sophisticated and powerful censorship and social engineering tool the world will have ever seen. Not only do I feel very uncomfortable with the idea of that existing, but also with it being controllable by whoever manages to wield it.

NSA builds a backdoor into Twitter -> detects and censors (or at least hides) dissenting opinions -> total control of the nature of discourse online

Pretty much the quickest and slipperiest path to a totalitarian regime.
 
Considering how many websites have been flooded with an influx of hate speech, harassment, and generally shitty people, sites are going to need to be much more proactive than reactive to nip issues like these in the bud. Thank god for the strict moderation of NeoGAF; a lot of places can't be bothered.

GAF has one of the largest barriers to entry of any forum I've seen. It's really hard to ban-evade here since the approval process takes so long. Strict mods help, but it's really hard to keep people off your platform if they can spend 10 minutes and have another account to post with.
 
Good; now fix your news.google section to prevent Breitbart and Fox News from showing up so we can go back to having factual news.
 

antonz

Member
Google has been on the Moderation Train with its content creators for some time now, telling them to clean up or lose advertising.

It has far more to do with general perception than with hate speech specifically. I've seen content creators who made vlogs talking about past issues with drug dependency get their videos knocked around on the grounds that advertisers don't want their ads played against controversial topics.
 
GAF has one of the largest barriers to entry of any forum I've seen. It's really hard to ban-evade here since the approval process takes so long. Strict mods help, but it's really hard to keep people off your platform if they can spend 10 minutes and have another account to post with.

Hm, this is definitely true. I'm not sure I want other websites to implement similar sign-up restrictions, though, so that would complicate it...
 

The Technomancer

card-carrying scientician
I.e., we absolutely insist on handling everything via automated algorithms rather than actual human oversight.

The scope of the internet, even specific sites like YouTube, has long surpassed anything that would be feasible to maintain through human oversight alone.

To be fair, it'd be incredibly hard to have a person monitor every YouTube video.

You couldn't view every single video in a million years, but if you turned off ads by default and made opting into displaying ads on your video require manual approval, that might work.

Of course Google makes so much money off of all of those marginal videos that get a few hundred views without needing to pay the creators, so I'm sure they're in no rush to do that
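
Something as simple as this kind of state machine would cover the opt-in flow. Hypothetical names, obviously not how YouTube actually works today:

```python
# Minimal sketch of "ads off by default, opting in needs manual review".
# Hypothetical model for the suggestion above, not YouTube's real system.
from enum import Enum, auto

class MonetizationStatus(Enum):
    ADS_OFF = auto()          # default for every new upload
    REVIEW_PENDING = auto()   # creator opted in, waiting on a human reviewer
    ADS_APPROVED = auto()     # a reviewer cleared it for advertising

def request_monetization(status: MonetizationStatus) -> MonetizationStatus:
    """Creator opts in: the video only queues for review, ads stay off."""
    if status is MonetizationStatus.ADS_OFF:
        return MonetizationStatus.REVIEW_PENDING
    return status

def review(status: MonetizationStatus, approved: bool) -> MonetizationStatus:
    """Only a human reviewer's decision can actually turn ads on."""
    if status is not MonetizationStatus.REVIEW_PENDING:
        return status
    return MonetizationStatus.ADS_APPROVED if approved else MonetizationStatus.ADS_OFF
```

The point being that human review only has to cover videos whose creators actually ask for ads, which is a far smaller set than everything uploaded.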
 

UrokeJoe

Member
I hope this turns out well and it's okay that I hate peas.

Honestly it's a move in the right direction, just hope it turns out well.

....
 

TSM

Member
This actually seems like a double-edged sword. Advertisers will probably only want to be associated with "safe" content if they have a say in the process. I'm sure the big guys have no interest in being associated with political speech of any sort.
 

Bgamer90

Banned
This is what happens when you focus too much on services and partnerships to make money instead of greatly supporting the (diverse) users that made your website popular. Reminds me of Twitter's struggles.
 