
Companies are pulling ads from YouTube to protect their brands

Status
Not open for further replies.

RalchAC

Member
Yep. Same people who believe Twitter should be moderated by people. Absolutely insane. GAF is a small forum in terms of active posters, and we have dozens of mods, a pay/work email requirement, and a waiting period.

That is not scalable for these big internet giants.

Also, those cheering this thinking it will only affect bigoted stuff are gonna be in for a rude awakening.

They could have some kind of "trusted reviewers" thing, something like some online stores have. People could engage with a channel and review it by answering a simple questionnaire (8 questions, each marked 1-5 from best to worst, plus a box if you feel like writing something). If a channel gets enough red flags, said channel (and the person behind it) is reviewed manually and it's determined whether the flags are actually true or not.

Then, people who use the system often enough and as intended could get the "Trusted Reviewer" tag. And YouTube could gift 1 month of YouTube RED to people who use the system as it should be used, once they earn X points for doing so.

Have this feature available in those channels that have ads and see how it goes.
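The review-and-flag flow described above could be sketched roughly like this (a purely hypothetical illustration; all names, thresholds, and point values are invented):

```python
# Sketch of the "trusted reviewer" idea: 8-question reviews (1 = best,
# 5 = worst), red flags accumulate until a human review is triggered,
# and good-faith reviewers accrue points toward a "Trusted" tag.
from dataclasses import dataclass

RED_FLAG_SCORE = 4            # an answer of 4 or worse counts as a red flag
FLAGS_BEFORE_MANUAL_REVIEW = 50
POINTS_FOR_TRUSTED_TAG = 100

@dataclass
class Channel:
    name: str
    red_flags: int = 0
    needs_manual_review: bool = False

@dataclass
class Reviewer:
    name: str
    points: int = 0

    @property
    def trusted(self) -> bool:
        return self.points >= POINTS_FOR_TRUSTED_TAG

def submit_review(channel: Channel, reviewer: Reviewer, answers: list[int]) -> None:
    """Apply one 8-question review to a channel and credit the reviewer."""
    assert len(answers) == 8 and all(1 <= a <= 5 for a in answers)
    channel.red_flags += sum(1 for a in answers if a >= RED_FLAG_SCORE)
    if channel.red_flags >= FLAGS_BEFORE_MANUAL_REVIEW:
        channel.needs_manual_review = True   # a human decides from here
    reviewer.points += 1                     # good-faith use accrues points
```

The key design point is that the questionnaire never removes anything on its own; crossing the flag threshold only queues the channel for manual review.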
 

Trokil

Banned
People applauding this forget that if YouTube moves, it also allows companies to be more selective about who they want to support. So this will hurt minorities just as much, because advertisers will argue they are a family business and do not want to be connected to those groups.

And what will happen is, more money for the few successful names, less money for the newcomers and the lesser known youtubers.
 
People applauding this forget that if YouTube moves, it also allows companies to be more selective about who they want to support. So this will hurt minorities just as much, because advertisers will argue they are a family business and do not want to be connected to those groups.

And what will happen is, more money for the few successful names, less money for the newcomers and the lesser known youtubers.

Most advertisers already target based on viewer demographics, and that drives the money through the system. What advertisers evidently do NOT want is to find their ads on a channel saying there is a giant Jewish banking conspiracy, or similar Alex Jones type content. It's mainly about that: ring-fencing off extremism so no CEO flips out when someone asks them, "Do you know your product is appearing under *this* video?"
 

Trokil

Banned
Most advertisers already target based on viewer demographics, and that drives the money through the system. What advertisers evidently do NOT want is to find their ads on a channel saying there is a giant Jewish banking conspiracy, or similar Alex Jones type content. It's mainly about that: ring-fencing off extremism so no CEO flips out when someone asks them, "Do you know your product is appearing under *this* video?"

Reality will hit you hard. As soon as YouTube moves, there will be a lot of changes, and this will include age-restricted content (because think of the children), warnings for difficult content, and all of that.

Remember the copyright system and what a clusterfuck that is now. I can already imagine the warnings before every episode of Jimquisition because of the dildo bat and the language. It will backfire as it always has, also because YouTube will use an automated system and a reporting system. So if Jim is ever going to give another Zelda game a 7 out of 10, guess who is going to get reported for hate speech. And if somebody starts a political channel, guess what will happen.
 
Reality will hit you hard. As soon as YouTube moves, there will be a lot of changes, and this will include age-restricted content (because think of the children), warnings for difficult content, and all of that.

Remember the copyright system and what a clusterfuck that is now. I can already see the warnings before every episode of Jimquisition because of the dildo bat and the language. It will backfire as it always has, also because YouTube will use an automated system and a reporting system. So if Jim is ever going to give another Zelda game a 7 out of 10, guess who is going to get reported for hate speech.
Considering we have racists targeting kids with their channels right now, maybe some restrictions won't be too bad on that.
 

Sulik2

Member
Hey YouTube, instead of focusing all your efforts on bogus DMCA takedowns, maybe you should actually start moderating some of the awful content on your site?
 

Trokil

Banned
Considering we have racists targeting kids with their channels right now, maybe some restrictions won't be too bad on that.

Yes, and parent groups or religious groups will attack LGBT content, people will report anybody talking about religion, political groups will attack the other side. There is a reason why opening this Pandora's box can backfire.
 

Spladam

Member
Good. Fuck the bigots. They can starve.
This doesn't only affect the vanishingly small fraction of YouTube that broadcasts hate speech or racism; this affects all of YouTube.

I boycott every product/company that plays unskippable ads longer than 10 seconds.
Why would we have expectations of getting this content free without advertising? Google spends a lot of money running YouTube; it's not exactly a profitable enterprise. That's why there is only one YouTube, and not a vast array of big-time content providers.

This is going to affect a lot more things than a few alt-right channels.

I'm not sure this news is worth cheering for yet.
Right, I'm puzzled by all the "Good" comments. This affects all the things we watch on YouTube, not to mention the sustainability of even having YouTube. Google has plenty of capital to try to figure things out in the meantime, but there is a cost/profit ratio dictating these things in the end, and developing advanced algorithms to monitor content, much less software that can translate video content into something that can be moderated by algorithms, sounds like an expensive endeavor. Hope they can figure this all out. In the end, they might just have to hire more human content moderators, and we might have to pay for our YouTube, or at the very least expect more commercials.
 

Zocano

Member
We're not disagreeing. I'm just saying that people that think humans can run this are absolutely out of touch.

The rest of the internet isn't GAF. I see this suggestion come up many times: "Get more mods!"

That's not happening on these big sites. They will build an algorithm, and many decent channels will also get fucked in the process.

Yah, a lot of the replies here strike me as incredibly naive; the people behind them just do not comprehend how vast and difficult a problem this is. Yes, there has been progress in AI and ML, but it is a slow process with a lot of trial and error. There are comments here implying it is somehow an easy problem to determine what hate speech is, but guess what: it's not easy. It's not easy at all. Trying to create a machine that can learn the meaning and context of speech well enough to know what hate speech really is is an incredibly dense and difficult task.

A hastily made algorithm will not be clean, it will not work well, and there would be a non-insignificant number of errors (misinterpretations).
 

CTLance

Member
Youtube has shirked its responsibilities for long enough, so even if it's a hot mess to fix now, they won't get any pity from me. They sat on this problem for ages, and now it's finally beginning to bite them in the ass. Couldn't be happening to a better company.

And yes, the incoming amount of videos is daunting, but if you add user reporting and the absolutely disgusting amounts of user tracking and data mining they are capable of, they can pre-filter the complaints until the problem becomes manageable by a bunch of humans.
 
Yes, and parent groups or religious groups will attack LGBT content, people will report anybody talking about religion, political groups will attack the other side. There is a reason why opening this Pandora's box can backfire.
You act like there aren't fake reports already now. Those are the things they have to deal with.

Just because it can be abused doesn't mean the answer is to do nothing. You find a solution where that abuse is kept to a minimum. No system will be perfect, but the current situation isn't either.
 

Mindwipe

Member
Youtube has shirked its responsibilities for long enough, so even if it's a hot mess to fix now, they won't get any pity from me. They sat on this problem for ages, and now it's finally beginning to bite them in the ass. Couldn't be happening to a better company.

And yes, the incoming amount of videos is daunting, but if you add user reporting and the absolutely disgusting amounts of user tracking and data mining they are capable of, they can pre-filter the complaints until the problem becomes manageable by a bunch of humans.

Can they? Nobody, but nobody does this successfully.

I am genuinely and have always been baffled at why GAF has so many people who think it's ever going to work. It's not. Enhanced human moderation only scales if you outsource it to dirt-cheap workers in third-world countries, and then you act surprised when those people have very conservative upbringings and shit all over vulnerable minorities.

Heck, we're seeing Twitter's new "anti-abuse" crackdown, which involves people every time, and it's not touching the racists and nazis on the platform - but meanwhile transgender people who call Trump an asshole, or British people this week who say Fox News' coverage of the London attacks is bollocks, get put on timeout.

These things don't work. They don't work. They're too complicated for human systems.
 
Easy to say, but companies exist to make money. If they need to have thousands of people to moderate content then that will never be a viable solution due to costs.

How much do you think a forum moderator pulls in annually compared to say, a high end programmer for a company like Google?

Have you ever been to one of their campuses?

Hint: they can afford it
 

tokkun

Member
Hey YouTube, instead of focusing all your efforts on bogus DMCA takedowns, maybe you should actually start moderating some of the awful content on your site?

Be prepared for this to have an effect similar to the DMCA takedowns.

People are mad about how Nintendo can make a DMCA claim on anyone who does a Let's Play video of one of their games or features some clip of it in a review. You already have a chilling effect where people don't want to cover Nintendo games because they cannot monetize those videos. The same thing will happen with any video that covers issues of racism, sexism, or gender identity, even from a perspective you agree with. They will run the risk of getting falsely labeled as hate speech, either due to misdetection by some automated algorithm or because malicious users who disagree with the message report them. Pretty soon we will have topics here about how every Feminist Frequency video is getting de-monetized because GamerGaters are coordinating to report them as hate speech. Some people will simply choose to stop talking about any controversial topic to avoid the headaches.

The companies will be happy. Bad users will suffer, but some good users will suffer as well. Maybe you still think this is a worthwhile tradeoff to reduce real hate speech (much as DMCA takedowns reduce real copyright infringement) - I won't judge that. But I hope people realize what it is they are asking for here. If you think this is as simple as only hurting the actual hate speech videos, you are being incredibly naive.
 

dan2026

Member
Why doesn't YouTube just ban the accounts of these bigots and racists?

That would show the ad companies they aren't playing around.
 
The companies will be happy. Bad users will suffer, but some good users will suffer as well. Maybe you still think this is a worthwhile tradeoff to reduce real hate speech (much as DMCA takedowns reduce real copyright infringement) - I won't judge that. But I hope people realize what it is they are asking for here. If you think this is as simple as only hurting the actual hate speech videos, you are being incredibly naive.
You can easily set up filters to throw false reports in the garbage bin, or exclude certain channels that have proven themselves already.
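A crude version of such a filter might look like this (purely illustrative; the thresholds, field names, and credibility heuristic are all invented):

```python
# Weight each report by the reporter's track record, and skip channels
# with a long clean history, so obvious false reports never reach a human.
from dataclasses import dataclass

@dataclass
class Reporter:
    reports_filed: int
    reports_upheld: int     # confirmed valid by a human after review

    @property
    def credibility(self) -> float:
        if self.reports_filed == 0:
            return 0.5      # no history: neutral weight
        return self.reports_upheld / self.reports_filed

@dataclass
class Channel:
    strikes: int            # past confirmed violations
    years_clean: int

def should_escalate(reporter: Reporter, channel: Channel) -> bool:
    """Decide whether a report reaches a human moderator."""
    if channel.strikes == 0 and channel.years_clean >= 3:
        return False        # channel has proven itself: auto-dismiss
    return reporter.credibility >= 0.25   # drop serial false reporters

serial_flagger = Reporter(reports_filed=200, reports_upheld=3)
veteran = Channel(strikes=0, years_clean=5)
print(should_escalate(serial_flagger, veteran))  # False
```

Whether heuristics like these survive coordinated brigading at YouTube's scale is exactly what the rest of the thread disputes.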
 

patapuf

Member
Be prepared for this to have an effect similar to the DMCA takedowns.

People are mad about how Nintendo can make a DMCA claim on anyone who does a Let's Play video of one of their games or features some clip of it in a review. You already have a chilling effect where people don't want to cover Nintendo games because they cannot monetize those videos. The same thing will happen with any video that covers issues of racism, sexism, or gender identity, even from a perspective you agree with. They will run the risk of getting falsely labeled as hate speech, either due to misdetection by some automated algorithm or because malicious users who disagree with the message report them. Pretty soon we will have topics here about how every Feminist Frequency video is getting de-monetized because GamerGaters are coordinating to report them as hate speech. Some people will simply choose to stop talking about any controversial topic to avoid the headaches.

The companies will be happy. Bad users will suffer, but some good users will suffer as well. Maybe you still think this is a worthwhile tradeoff to reduce real hate speech (much as DMCA takedowns reduce real copyright infringement) - I won't judge that. But I hope people realize what it is they are asking for here. If you think this is as simple as only hurting the actual hate speech videos, you are being incredibly naive.

Jep.

The "I don't talk about politics/ controversial topics" attitude so many on this board hate? Expect a lot more of that. There will be a few big channels/news organisations that'll get to do it and the rest will have to constantly deal with bogus claims just like they do with the DMCA stuff now.
 

CTLance

Member
Can they? Nobody, but nobody does this successfully.
[snip]
These things don't work. They don't work. They're too complicated for human systems.
This problem has been brewing ever since YouTube was a much smaller company. They didn't fix it then despite public pressure and common sense, and now it's a nearly insurmountable task. It is entirely their own fault for sitting on this problem for so long. If it is not achievable right now, then tough fucking luck for them. They better die trying, then.

I get the feeling this is kinda like a gun control discussion. Somehow those for more control are expected to deliver a turnkey, prebuilt solution, while the opposition gets to veto the entire thing for every single possible or perceived fault, no matter how remote or minor.

Sure, fixing this mess is a terrifying task, and it will devour countless man-hours and result in countless people getting unfairly accused. However, this shit needed to be fixed ages ago, so they have to make do with the tools available at present, then tweak and enhance this grotesque mess until the result is acceptable.

Because allowing the situation to continue as it is right now is foolish to the extreme. We can't really continue to allow them to stand at the sides and throw their hands into the air helplessly while they happily refine their content ID and ad frameworks to siphon off even more money on the back of those who get wounded and oppressed. There's a limit to everything.

Not so long ago a moderately accurate content ID mechanism for videos on a huge scale like that was a dream within a dream. Yet, here we are now. Where there's a will, there's a way. Sure, the solution right now might be terrible or not work at all, but without looking for a better one we will never get anywhere. Time for YouTube to get cracking, spend some money, do some research, write some bounties, hire some talent. That's how things get done. Not right now. Maybe not in a decade. But soon(TM).
 

Trokil

Banned
I get the feeling this is kinda like a gun control discussion. Somehow those for more control are expected to deliver a lock-and-key prebuilt solution, while the opposition gets to veto the entire thing for every single possible or perceived fault, no matter how remote or minor.

No, it is not.

They are not just taking guns away. They are also taking cars away because people will tell them those are guns, they will take away ducks because some people will tell them those are guns, and they will take away puppies because those are guns as well.

It will be like firing a shotgun from a distance into a crowd and hoping only the bad people will get hit.
 

Saganator

Member
Right, I'm puzzled by all the "Good" comments. This affects all the things we watch on YouTube, not to mention the sustainability of even having YouTube. Google has plenty of capital to try to figure things out in the meantime, but there is a cost/profit ratio dictating these things in the end, and developing advanced algorithms to monitor content, much less software that can translate video content into something that can be moderated by algorithms, sounds like an expensive endeavor. Hope they can figure this all out. In the end, they might just have to hire more human content moderators, and we might have to pay for our YouTube, or at the very least expect more commercials.

I'm fine if the overall quality of YouTube takes a hit. I remember YouTube in its infancy and enjoyed it just as much if not more than I do now with all the "high quality" YouTubers out there. If all of YT needs to take a hit so bigots spreading hate to kids don't have a massive platform with a direct link to young minds, then so be it. I don't know if you have kids, but when you do, you might change your mind.

YouTube has definitely been a factor in the propagation of hate recently; it needs to be reined in for the good of society. If that means fewer highly produced videos from large companies, and more from passionate amateurs, that's just a small price to pay if you ask me.
 

xrnzaaas

Gold Member
And at the same time the public doesn't care that these companies don't give a fuck about their customers, just their money. Johnson & Johnson is probably the best example of the ones mentioned. They were found guilty in the past of knowingly selling products that were unsafe to health.
 

squall23

Member
I.e., we absolutely insist on handling everything via automated algorithms rather than actual human oversight.
This kind of job would be extremely time consuming and near impossible.

There was an AMA from a guy who works for a porn tube site and actually has a video moderation job, and according to him they're already overworked as it is.
 
That already doesn't work with copyright takedowns, and that is a way smaller problem.
So they need to improve it for this. I don't get the argument that they should just do nothing because there might be a chance of some abuse. So we are fine with people making money off racist content, hate speech, and more terrible things?
 

FLAguy954

Junior Member
Maybe Google can set up an ad marketplace where creators petition for sponsorship based on some type of application and content-sample submission.

Keep the current ad system in place for companies who don't care what they sponsor and have the marketplace for the companies that want to know what type of content will show their ads.

This would be a good solution/start, since an auto-moderation approach could easily lead to over-censorship.
 

CTLance

Member
It will be like firing a shotgun from a distance into a crowd and hoping only the bad people will get hit.
Yes, right now they have a shotgun, and after a few test firings they will see that there are some drawbacks to that particular idea.

Next up is the research of
Smart bullets
Better targeting solutions
Alternative weaponry
Pre-shoot measures
Decoys
Drones with tear gas
Non-lethal alternatives like water cannons
Aftercare improvements so that those who get shot survive

They either fire the shotgun now and work from there until they get to a fleet of crowd control satellites that can shave the beard off a guy in the crowd, or all they will ever have will be the shotgun. Honestly, unless they have a shotgun pointed at them themselves, they will never use their own. Because that's how businesses operate. Bullets are expensive, and parts of those about to be shot earn them good money.
 

Mindwipe

Member
So they need to improve it for this. I don't get the argument that they should just do nothing because there might be a chance of some abuse. So we are fine with people making money off racist content, hate speech, and more terrible things?

It's not a chance there'll be some abuse. It's a guarantee it will face significant abuse and, more importantly, will inadvertently attack minorities to a significant extent, as it has in every instance where this has been tried before.

Facebook operates the model you're asking for here. It hasn't made a dent in hate speech on Facebook. It has led to significant censorship issues, and significant discrimination issues.

There is a reason legal frameworks err on the side of free speech to the extent that people can say terrible things: innocent minorities always, always get hurt first under tighter ones. It would be the same for online platforms, and we are rapidly reaching the point where removal from private platforms is, in practice, a bigger infringement of speech than government action. Google and Apple have far more power over global speech than an individual government.
 
This doesn't only affect the vanishingly small fraction of YouTube that broadcasts hate speech or racism; this affects all of YouTube.

Why would we have expectations of having this content free without advertising. Google spends a lot of money running youtube, it's not exactly a profitable enterprise, that's why there is only one youtube, and not a vast array of bigtime content providers.


Right, I'm puzzled by all the "Good" comments. This affects all the things we watch on YouTube, not to mention the sustainability of even having YouTube. Google has plenty of capital to try to figure things out in the meantime, but there is a cost/profit ratio dictating these things in the end, and developing advanced algorithms to monitor content, much less software that can translate video content into something that can be moderated by algorithms, sounds like an expensive endeavor. Hope they can figure this all out. In the end, they might just have to hire more human content moderators, and we might have to pay for our YouTube, or at the very least expect more commercials.

This will nudge them to go in the right direction. YouTube is not going away, Alphabet is a very profitable company and they know the value of YouTube.
 
People applauding this forget that if YouTube moves, it also allows companies to be more selective about who they want to support. So this will hurt minorities just as much, because advertisers will argue they are a family business and do not want to be connected to those groups.

And what will happen is, more money for the few successful names, less money for the newcomers and the lesser known youtubers.

Advertisers have always had this power in every other medium they fund. Always. They now want it from a growing number of social media platforms, or else they won't fund it. PDP spooked Disney, one of the easiest companies to spook. The rise of white supremacy and hate speech has spooked Johnson and Johnson. Now that YouTube personalities have attained the same mindshare as youth-targeted pop idols and sports personalities, they're going to face a growing number of 'welcome to the NFL' moments that remove some of the democratizing effects of a free social video platform that has provided them with a full-time job.

I suppose there's a risk of small-potatoes channels not getting the Johnson and Johnson and Disney accounts, and yeah, YouTube is pretty crappy with how readily it bows to large companies skittish over copyright infringement, so I do share concerns that they won't always intelligently apply whatever new rubric covers speech that frightens big advertisers. But I don't want them to fear failure here.
 
Wait, you mean that letting right-wing reactionary bigots spew their bullshit unfettered while slapping an Applebee's ad on top of it ISN'T a sustainable long-term business strategy?

 

Guileless

Temp Banned for Remedial Purposes
I have here in my hand a list of two hundred and five Youtubers that were known to Google as being offensive who are nevertheless still broadcasting.
 
I thought this was gonna be about how people hate ads playing before their videos so much that it actually makes them not want whatever was being advertised. Probably better that it's this.
 

tokkun

Member
How much do you think a forum moderator pulls in annually compared to say, a high end programmer for a company like Google?

Have you ever been to one of their campuses?

Hint: they can afford it

How about math instead of hints?
https://fortunelords.com/youtube-statistics/

There are 300 hours of new content uploaded per minute. It would take > 50,000 people doing nothing but watching videos just to keep up with the new material, without putting a dent in the 1 billion hours that already exist. For reference, the number of employees at YouTube is estimated at < 1,000.

Five years ago, it was 'only' 60 hours per minute, so it is also growing very fast.
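The arithmetic behind that estimate checks out. A quick sanity check (the upload rate is the post's figure; the 40-hour work week is an assumption):

```python
# 300 hours of content arrive per real-time minute; how many full-time
# moderators would it take just to watch it all once?
CONTENT_HOURS_PER_MINUTE = 300
MINUTES_PER_WEEK = 60 * 24 * 7                 # 10,080

content_hours_per_week = CONTENT_HOURS_PER_MINUTE * MINUTES_PER_WEEK  # 3,024,000
moderator_hours_per_week = 40                  # one full-time moderator

moderators_needed = content_hours_per_week / moderator_hours_per_week
print(moderators_needed)  # 75600.0 -- comfortably above the 50,000 quoted
```

Even watching at 2x speed with spot-checking, the headcount stays in the tens of thousands, which is the crux of the scalability argument.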

You can easily set up filters to throw false reports in the garbage bin, or exclude certain channels that have proven themselves already.

How's that working out for DMCA claims?
 
Honestly, this is a problem that Google has been trying to ignore while it profits hand over fist off YouTubers promoting hate speech. They've been kicking this can down the road for a while now, but it's about to get a whole lot more serious for them.
 
And the law is coming to clean up this dumpster fire. I bet this will end up going a bit too far, and some YouTubers will have to stop cursing and other stuff if they want an ad to play before their video.

This. Do people really find the YouTube bots fair? Because they fucking are not. This will turn into something else, and it will result in "DONATE TO MY PATREON OR I CAN'T DO THIS ANYMORE" e-begging.
 
It's not a chance there'll be some abuse. It's a guarantee it will face significant abuse and, more importantly, will inadvertently attack minorities to a significant extent, as it has in every instance where this has been tried before.

Facebook operates the model you're asking for here. It hasn't made a dent in hate speech on Facebook. It has led to significant censorship issues, and significant discrimination issues.

There is a reason legal frameworks err on the side of free speech to the extent that people can say terrible things: innocent minorities always, always get hurt first under tighter ones. It would be the same for online platforms, and we are rapidly reaching the point where removal from private platforms is, in practice, a bigger infringement of speech than government action. Google and Apple have far more power over global speech than an individual government.
I don't get why this would attack minorities. Will there be some kind of 'minority content creator' checkbox for advertisers suddenly?

Yes, there might be problems with people reporting content that should not be reported. But that is already possible, and it hasn't led to the removal of those videos and such.

How's that working out for DMCA claims?
Why do we pretend the exact same policies would apply here?
 
And at the same time the public doesn't care that these companies don't give a fuck about their customers, just their money. Johnson & Johnson is probably the best example of the ones mentioned. They were found guilty in the past of knowingly selling products that were unsafe to health.

Yeah, but doing the right thing for the wrong reasons is still probably better than doing the wrong thing for any reason.
 

Pie and Beans

Look for me on the local news, I'll be the guy arrested for trying to burn down a Nintendo exec's house.
Anything to curb an era of entertainment where screaming "raaaaape" down a microphone was worth millions but painstaking creative work wasn't worth jackshit.
 
I don't get why this would attack minorities. Will there be some kind of 'minority content creator' checkbox for advertisers suddenly?

Yes, there might be problems with people reporting content that should not be reported. But that is already possible, and it hasn't led to the removal of those videos and such.


Why do we pretend the exact same policies would apply here?

If this ends up really hurting the bottom line, YouTube might stop giving a shit and just double down on removing videos without checking the content. Or this might just hurt the big-name stars who actually make waves, since they're the ones with a significant amount of influence.
 
D

Deleted member 13876

Unconfirmed Member
Not sure how related this is, but just this week I got an email stating a random Binding of Isaac daily video I had uploaded was banned from monetization as the content may be deemed offensive by advertisers.
 

WedgeX

Banned
It'd really be a shame if Google, which has a mere 50 thousand employees, was forced to actually hire people to monitor their traffic rather than algorithms.

What a catastrophe that would be! /s
 

daveo42

Banned
Advertisers are well within their rights to pull ads, and Google should be working to better match appropriate ads to content, but it's not as cut and dried as just banning all the bad shit on the internet. Racism, alt-right views, white power, and all the stuff in that sphere is terrible, and it's disgusting that we are still dealing with these issues... but as it stands, you have to weigh that against freedom of speech. As well, the Marketplace story touches on the fact that it is also a balancing act for Google (and Twitter and several other social media sites) between continued embrace of open speech on the internet and a more moderated approach.
 
Advertisers are well within their rights to pull ads, and Google should be working to better match appropriate ads to content, but it's not as cut and dried as just banning all the bad shit on the internet. Racism, alt-right views, white power, and all the stuff in that sphere is terrible, and it's disgusting that we are still dealing with these issues... but as it stands, you have to weigh that against freedom of speech. As well, the Marketplace story touches on the fact that it is also a balancing act for Google (and Twitter and several other social media sites) between continued embrace of open speech on the internet and a more moderated approach.

People may feel they have a right to say toxic shit online; it doesn't mean Google has an obligation to host it when it impacts and spreads to the rest of the platform. Reddit is so steadfast in its belief in neutrality and free speech that the website is riddled with racism, sexism, misogyny, and hate speech, to the point that a lot of people don't want to be anywhere near it.
 

Fuchsdh

Member
Shit is about to get real now. Heavens move and planets crack when the money starts talking behind the scenes. I think 2017 is going to be a rough year for YouTubers.

Some sort of reckoning was always coming—Youtubers are primarily propped up by ad rates that were always going to fall over time. The smart ones have gone into sponsored media/branded content because there's no way ad revenue is going anywhere but down for individual channels long-term, unless you've got enough annual growth to counteract it.

But yeah, the only way Youtube will be making any changes is if it threatens their bottom-line like this, so once again it's up to selfish corporations to make positive changes in Bizarro World :)
 