I’ve set myself the objective of making one YouTube video per week, for a couple of reasons. One is to add one drop of something positive to the ocean of shitlords and dreck — I’ve complained enough about the toxic nature of YouTube, so I figure I should be making at least a nominal effort to correct it. Another is to challenge myself to learn something new, and video skills are difficult for a non-photogenic and at least initially talentless videographer like me. So I’m tinkering. I hope the videos get better week by week.
Then, as I was exploring various features of the video editor, I learned that some things are not enabled until you switch on monetization. I’m not into making money off this endeavor (although it would be nice), but I turned it on anyway, and then it took 5 or 6 days for the powers-that-be to decide I’m legit, and one video was activated for ads.
Except — and this is what I’m asking about — it was immediately declared “Not suitable for most advertisers”. I was mildly offended! What’s “not suitable” about this video? Is it my lack of style? My laid-back speaking manner? The old-man bags under my eyes? The occasional flash of spiders? Does being boring disqualify one for monetization? That might be it, since it can’t be the content — I see lots of racist/sexist crap on YouTube, which must be acceptable in a way that a geezer talking about genes can’t be.
On the bright side, I still get access to all the shiny video editing features, and they aren’t stuffing ads in, so I guess that’s good. I’d like to know why, though — if it’s just a glimpse of my face that repels advertising, I’ll have to make sure to stick a portrait into every one.
While I’m asking, does anyone have a good tutorial to recommend on using various YouTube features?
John-Henry Beck says
Not an expert.
Something I’ve seen other YouTubers talk about is that the declaration is an automated thing. There’s a high likelihood that, if it’s appealed to a human, it would be re-monetized.
Moggie says
It’s not just you: this is happening all over, to completely inoffensive YouTubers. Google “adpocalypse” for more than you ever wanted to know about why the sky is falling.
I now pledge money via Patreon to a few of my favourite channels, so that they can continue to pay the rent. (I know, I’m privileged in that I can afford that).
Moggie says
Yes, you can appeal demonetisation, but that takes a few days. Apparently, most of a typical video’s earnings come immediately after it’s uploaded, so this is a problem.
=8)-DX says
From what I’ve heard, the thing YouTubers are griping about is that “Not suitable for most advertisers” is now the default, and you have to “appeal” the decision (it should be in the user interface somewhere). I’ve never done anything worth monetising (although I was automatically made a “partner” back in the day), so I can’t give any advice past that. Oh, and from what I’ve looked at, the YT video-editing features are pretty basic; any other program will be better: edit your video offline and then just upload…
If you’re looking for tutorials on video editing, there are thousands of these on… YouTube.
=8)-DX
PZ Myers says
That’s what I was afraid of, that they’d be on youtube. Don’t youtubers ever read a book? They’re really convenient.
Aaron says
I hate using videos for reference when doing something. No table of contents, no quick scan forward or backward to the step I’m on, no way to control the pace to match what I’m doing: just play, pause at a likely point, do stuff, seek back to review, seek forward to the wrong place, seek back again to hopefully the right place, play that instruction, pause, etc., etc., etc.
madtom1999 says
#6 I’ve always meant to do a TED talk about how videos are one of the worst ways of disseminating information.
consciousness razor says
You can dispense with all of that nonsense simply by using different editing software, not just what YouTube itself offers.
If you’re aiming for quality (informative, clear, well-designed, etc.), especially as a beginner, then it won’t help to try to churn out one video per week. You could set a goal to work on it a little every day (whatever you can reasonably handle), but each project takes as long as it takes, since you won’t get anything out of it that you haven’t put in. You’ll start to produce faster and more efficiently, as you learn new skills and become better acquainted with your tools (much like writing a blog), but it isn’t something you can just demand of yourself because you’ve set an arbitrary deadline.
consciousness razor says
I’m afraid they don’t. Your smartphone does everything for you these days, or it claims to. No need to know how any of it works, why you think any of the things you do, etc…. It just happens, then you move on to more important stuff in your life (also on your smartphone).
Anyway, I guess most of what you’d want to know is about how to use the software. Manuals aren’t so great sometimes, but there will at least be something for complicated programs like that. As with your students, if you have a question, there are always other people out there with the same issue, but you may have to hunt for the answers.
If you want an arts course on producing videos, more or less how things look and sound, that’s a whole other story. (There’s also the writing aspect of it, although you’re a decent writer, as well as a professor, and you can carry that experience over without much of a problem.)
Walter says
There is another thing that doesn’t fill me with confidence when it comes to YouTube’s algorithms: I noticed on your channel page that YT suggests similar channels, among them Sargon of Akkad. Just because both of you made videos about atheism, I guess…
Sorry PZ, I didn’t mean to sour your day. :)
Snidely W says
Do advertisers have a choice of where to put their ads? After all, THEY are the paying customer here. I’d guess that they would want to put their ads on videos that fall in the “entertainment” category. You know, like music, comedy, cats, etc. Videos that fall under the general heading of “edumacation” may not be what advertisers are looking for. After all, if one is looking for straight information, one may not be so easily distracted by an ad as someone who is just there to be amused or entertained.
mod prime says
So, in response to extremist videos having adverts on them, many major advertisers threatened to pull out of YouTube, or actually did. On top of this came the negative publicity associated with the whole thing, so Google decided to try to vet videos.
There are way too many videos posted to make this a manual process.
So they use AI.
The AI was set up deliberately to be overeager – to demonetize way more than necessary. It then relied on people contesting the demonetization, and a human agreeing with them, to learn which videos were OK.
But this process on its own was still too much for humans to manage – so they put a filter on which ones humans will look at. And that is based on subscriptions and views. Smaller channels simply get ignored.
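(To illustrate the shape of that loop, here’s a toy sketch in Python. Every name, threshold, and score in it is invented by me for illustration; Google has published nothing about how its actual system works.)

```python
# Toy sketch of an "overeager classifier + gated human appeals" loop.
# Nothing here reflects YouTube's real system; names and thresholds are made up.

from dataclasses import dataclass, field
from typing import List, Tuple

REVIEW_MIN_SUBSCRIBERS = 10_000   # invented gate: smaller channels never reach a human
FLAG_THRESHOLD = 0.3              # deliberately low => the model flags far too much

@dataclass
class Video:
    channel_subscribers: int
    risk_score: float             # pretend this came from some content model
    demonetized: bool = False

@dataclass
class TrainingData:
    # (model score, human said it was actually fine) pairs fed back to the model
    labels: List[Tuple[float, bool]] = field(default_factory=list)

def automated_pass(video: Video) -> None:
    """Overeager first pass: flag anything the model is even mildly unsure about."""
    video.demonetized = video.risk_score > FLAG_THRESHOLD

def appeal(video: Video, human_says_ok: bool, data: TrainingData) -> None:
    """Appeals only reach a human if the channel is big enough; otherwise nothing changes."""
    if video.channel_subscribers < REVIEW_MIN_SUBSCRIBERS:
        return                                   # small channel: the flag silently sticks
    if human_says_ok:
        video.demonetized = False
    data.labels.append((video.risk_score, human_says_ok))  # feedback the model learns from

# Consequence: corrective labels only ever come from large channels, so borderline
# formats common on small channels stay flagged indefinitely.
big = Video(channel_subscribers=500_000, risk_score=0.4)
small = Video(channel_subscribers=2_000, risk_score=0.4)
data = TrainingData()
for v in (big, small):
    automated_pass(v)
    appeal(v, human_says_ok=True, data=data)
print(big.demonetized, small.demonetized, len(data.labels))   # False True 1
```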
So why yours? Impossible to say, as Google doesn’t reveal its criteria, to avoid people trying to beat the system.
However, here are some theories:
1) Image recognition. A good amount of the video is text and images but the rest is some bearded guy talking. The algorithm may flag ‘bearded guys with simple backgrounds talking interspersed with imagery’. This might be because of extremist videos which follow a similar format. And legit channels that do this may not have the numbers to successfully contest this, resulting in reinforcing the AI’s opinion that such videos are worthy of flagging.
2) Closed Caption transcription: the CC transcription picked up the words ‘dead’ {1:46}, ‘Paris’ {2:40}, ‘die’ {3:28}, ‘aborted’ {6:41}. A bit of a stretch. But you also talk about sex in other videos which may be flagged as ‘NSFA’ {not safe for advertisers}
3) Metadata: You are PZ Myers. You talk about freethoughtblogs. It might be that a good number of people that talk about PZ Myers and freethoughtblogs are toxic horrors that the AI thinks advertisers want to stay away from…
4) Controversial topic: Maybe ‘genes’ ‘evolution’ and the like are considered by the AI to be controversial topics…
Incidentally at 4:40 it sounds like you said ‘430 billion years ago’!
consciousness razor says
Well, the main issue here may just be the number of views a channel gets. Educational videos (including some very boring ones) are evidently “suitable,” at least when they come with millions of views. Of course, some advertisers will settle for only tens/hundreds of thousands (or whatever number), and they do try to associate ads with somewhat-related video content so as to target an appropriate audience. At the end of the day, though, there’s still a question of how much exposure that ad will get. There are plenty of education- or science-related things to advertise (such as online courses, etc.), which would be a “suitable” match in terms of content, but somebody has to think there are enough views to justify the expense. So, they’d turn to a more popular channel, perhaps with a very similar audience (since there’s no shortage of people producing the relevant kind of content), to get more bang for their buck.
Raucous Indignation says
“Is it my lack of style? My laid-back speaking manner? The old-man bags under my eyes? The occasional flash of spiders? Does being boring disqualify one for monetization?”
Yes. Very much so. All of that.
Lofty says
Maybe you need a few key words in your video titles, like “tactical” and “the truth” to get the views you need to go commercial.
Marcus Ranum says
Moggie@#3:
Yes, you can appeal demonetisation, but that takes a few days. Apparently, most of a typical video’s earnings come immediately after it’s uploaded, so this is a problem.
If you think that’s an accident, I have a good offer on an F-35 of your very own. It’s very sniny.
Raucous Indignation says
MR @#16: I love the “sniny” ones.
SC (Salty Current) says
I just read this article about kids’ videos on YT and I’m…very troubled.
emergence says
SC @18
My little brother watches a lot of that stuff on the more benign end of the scale. They always seem a little off, but I’ve never seen anything really extreme or disturbing. I think the writer might be exaggerating a bit about just how severe a threat most of this kind of stuff is, but I can imagine that people, either because they’re malicious trolls or because they’re socially-maladjusted weirdos, might put up videos with content that scares little kids.
SC (Salty Current) says
Honestly, I had no idea about any of it. I’m having a combination of Frankfurt School convulsions and fears for my “goddaughter.” But I don’t know about the exaggeration. I’m actually pretty limited in what I watch, and I still get recommendations for far-Right propaganda, so I imagine it’s a similarly wide net for children. Some of the videos in that article, which I didn’t even watch in full, were…I don’t even know how to describe them. Nightmarish.
The larger point about the systemic non-/irresponsibility of it all is the key issue.
consciousness razor says
emergence:
As they made very clear, much of the content is more or less produced (or stolen, automatically generated, etc.) purely to maximize ad revenue, corresponding to the number of eyeballs (or bots) which land on it. That is far too easy to do, with the right (nonsensical) mix of search terms and something which looks like it’s probably supposed to be children’s entertainment.* But it doesn’t need to be scaring children, deliberately or not, in order to be a problem.
The system does not encourage or promote content which is valuable, original or even meaningful. It’s polluted with clickbait trash, by design. Sure, some may want to waste their time with that bullshit, every now and then — although parents ought to have something better for their children — but the rest of us have to go along for the ride too, because of the way it is built. If it isn’t designed for us, then I’d like them to say so: they should put it on the front of the box that it’s supposed to contain garbage which isn’t intended for human consumption, since by some kind of magic this garbage creates income for somebody. If they were just honest about that, and I still had a reliable way to find some non-garbage, then there would probably be less to complain about.
*But of course it could also look vaguely like a news channel, one devoted to atheism/skepticism, science, or whatever it may be. Some adults are unfortunately about as discriminating as toddlers, when it comes to right-wing horseshit, conspiracy theories, and so forth, so similar concerns arise no matter what it may be.
consciousness razor says
I left out a word: that should have been “more or less produced,” above.
Ichthyic says
that… was extremely interesting.
online communications have moved into realms I never could have even imagined.
and no human society has the tools to be able to readily cope with the results.
Ichthyic says
I have. and the author himself points out some of it, how it was easily created automatically with apparently no “intent” involved, and that it isn’t even close to the worst he has seen.
and the parallels with source disconnect and newsfeeds he notes are worth thinking about as well.
I know FB is thinking about them.
emergence says
I’m not saying that the creepy, disturbing, nightmare-inducing stuff isn’t there. I was mostly talking about what I’ve seen my little brother watching. I think my parents are being careful to filter what he watches so none of the creepy shit shows up. I still think he’s better off when he watches stuff that looks like some actual thought went into it. Next to crap with people dressing up in superhero and Disney costumes and acting out inane skits, regular cartoons look like David Attenborough documentaries.
I’m not too keen on how the author brings up the teenagers-and-video-game-violence thing again, though. That feels like a dead horse I thought we’d left behind by now.
methuseus says
@emergence #25:
You’re right about teenagers and video game violence, though it can arguably still desensitize people and make them less empathetic (even that is debatable).
However, he is sort of right about needing to seriously police what kids watch on YouTube. We don’t let our kids watch YT videos without us at all, because one time we left the room for less than 5 minutes and a violent video featuring (I believe) Disney characters came up. That doesn’t even include the fact that, after watching two unboxing videos, our kids were begging us for the toys mentioned in them for weeks afterward; after not letting them watch for a couple of months, there hasn’t been a single begging episode.
These videos can be quite harmful to kids, and automated filters cannot do enough to keep the bad stuff away, given the way YT’s algorithms work.