Algorithms are ruining UX
How the guise of personally curating our digital experience has created a monster.
I only watch a handful of types of videos on YouTube: stuff with synthesizers, how to make ramen noodles, UI tutorials, Carl Sagan and Alan Watts interviews, Trevor Noah and other liberal political comedy clips, NYT’s The Daily, and other investigative journalism podcasts.
For the most part, YouTube’s algorithms stay right in my wheelhouse, minus the occasional right-wing propaganda video that somehow creeps into my feed.
But lately, my feed started getting spammed with recommendations for conservative conversion videos and pro-Trump content. Everything was very negative, alarmist, and dare I say, extreme?
At first, I considered it mildly annoying, but I didn’t think much else of it. I put it in the back of my mind as something to keep my eye on, blocked them, and moved on.
Then came the coup de grâce. I started getting assaulted every time I opened YouTube on my iPhone. This time, with Trump ads at the top of my home screen, disguised as primary content, and no option to disable them.
I started getting anxious whenever I went on Youtube. I began to recognize a change in my behavior when I interacted with the platform.
I was closing and reopening the app as soon as I saw these ads, but they never went away. I found myself looking away from the top of my phone screen and immediately scrolling down so I wouldn’t see these ads.
I felt insulted, and they just straight up pissed me off. I tried to block them with the three-dot menu, but I could only select, “Save to Watch later,” “Save to playlist,” “Share,” and “Why this ad?”
“Like anyone is saving these to watch later,” I thought.
I began to notice that my behavior outside the platform was changing as well. It seems that I’ve caught emotional contagion. It looks as if the strategy of this administration was to make sure everyone catches some type of infection.
But the real question is…
How is this tailored to me?
It’s not. Well, not directly…
Per YouTube/Google’s recommendation, I checked my Ad Settings to see whether ad personalization might have mistaken my interests.
My ad personalization is on, which I knew. Most of the interests listed are pretty generic, but I had previously turned off some misplaced interests that I don’t care to see.
I went through all of the interests that make up my online advertising data.
Nothing stood out that would suggest the reason I was seeing inflammatory Trump ads, so…
I turned to the internet and apparently, I wasn’t alone.
A Google search for Trump + Ads + YouTube turned up a massive number of people asking how to dismiss these ads.
They’re sick of it, and they’ve been given no way to dismiss or flag them. There’s an overwhelming feeling of helplessness that many people are starting to associate with Youtube.
Hey Youtube — User annoyance matters.
The Negativity Bias in a User’s Experience
People remember bad experiences more than good ones. So, in designing the user experience, we need to take extra care to avoid frustrating interactions.
Ad placement, the types of ads shown, and the ability for users to control which ads they see are, and should be, considerations for Google.
If you want us to eat your cookies, the ones that allow ad personalization so advertisers can fine-tune, target, and reach 80% of all internet users, then stop the spam placement. Seriously, give some of the power back to the products… sorry, I meant users.
Anyway, I dug a bit further and it turns out, this was a deep-seated issue regarding Google’s Election Ad defense.
Political ad vs Election ads
Google differentiates between an ad that’s political and an ad for an election; each has its own restrictions, and that’s a good thing.
Also, Google updated its policies regarding Election ads last year. They continue to offer restricted targeting for election ads to the following criteria:
- Geographic location (except radius around a location)
- Age, gender
- Contextual targeting options such as ad placements, topics, keywords against sites, apps, pages, and videos
It's important to note they never offered granular micro-targeting of election ads.
Political content is different and includes ads for political organizations, political parties, political issue advocacy or fundraising, and individual candidates and politicians.
Election ads don’t include ads for products or services, including promotional political merchandise like t-shirts, or ads run by news organizations to promote their coverage of federal election campaigns, candidates, or current elected federal officeholders.
This is an important difference.
If you mention a candidate by name in the text of an election ad, or when you apply to run the ad, Google will flag and disapprove it.
There’s a loophole to the targeting of Election ads. All you have to do is avoid using a candidate’s full or last name.
Search terms are still not considered part of an ad, for the purpose of election enforcement.
Notice there’s nothing about fake or misleading content?
- YouTube will remove videos that advance “false claims related to the technical eligibility requirements” of current candidates and officeholders. YouTube offers as an example a claim that a candidate isn’t eligible for office because of false information about citizenship status requirements.
- The company also said it would remove content that has been manipulated in a way that misleads users and may pose a risk of “egregious harm.” That includes the altered video of House Speaker Nancy Pelosi that was slowed to make her appear as if she was drunkenly slurring her speech.
It seems that Google is being proactive when it comes to curbing very specific electoral deception, i.e., birther conspiracies and deliberately manipulated content, but what about hate speech and the rampant conspiracy theories that gave a voice to QAnon?
YouTube will use its powerful algorithm to prioritize authoritative voices in search results for news and topics prone to misinformation. They will not ban QAnon, but they are banning some white supremacist accounts.
Here are some external sources to learn about how their algorithm works from people much smarter than me:
- How YouTube Drives People to the Internet’s Darkest Corners — WSJ VIDEO — Behind Paywall.
- How an ex-YouTube insider investigated its secret algorithm — The Guardian
- Fake News — Exposure to untrustworthy websites in the 2016 U.S. election — PDF Download
Supporting the 2020 U.S. election
The 2020 election season officially kicks off with the Iowa caucuses today.
Building on our work to support the operations and security of the 2020 U.S. Census, we’re sharing more about what we’re doing to tackle abuse on our platforms, equip campaigns, and help voters.
Tackling threats and abuse
Our Trust and Safety teams span the globe to monitor and disrupt account hijackings, inauthentic activity, disinformation campaigns, coordinated attacks, and other forms of abuse on our platforms on a 24/7 basis.
We take seriously our responsibility to protect our users from harm and abuse, especially during elections.
But why are they up?
Because the only repercussions for breaking these policies are video removal, and non-compliance with political content policies may result in information about your account and political ads being disclosed publicly or to relevant government agencies and regulators.
And who are you gonna report Trump to? The police? The Mayor?
Typically, ads ran a few days before being removed, suggesting they reached the target audience before removal.
So we know:
- Google differentiates between electoral and political ads; the former has restricted targeting.
- Google has proactive policies in place to remove Electoral and select Political Ads that spread misinformation and fake claims.
- If you mention a candidate by name in the text of an election ad, or when you apply to run the ad, Google will flag and disapprove it.
- While Election Ads have restrictions, Political content can include ads for political organizations, political parties, political issue advocacy or fundraising, and individual candidates and politicians.
- Youtube will use its algorithm to prioritize authoritative voices in search results for news and topics prone to misinformation.
- There are no clear long-term repercussions for violating company policy in an Electoral or Political Ad, besides single offense removal.
This is all good to know, but still, why am I seeing these ads?
The Trump campaign has reserved the most expensive digital ad space in the country on Election Day: YouTube’s homepage.
They bought the masthead. Everyone sees them, and we can’t flag the ads in the masthead.
Now, we know why the Election ads are on my homepage, we also know how the campaign gets around targeting and misleading possible voters, but;
If everyone sees these ads, how is the campaign choosing what content to run as these high-profile, unavoidable billboards?
Good question, Josh… The message conveyed in the ads is a result of the best performing content and messages shared and viewed by non-subscribers.
They’ve been flooding the platform with small bits of content, like splitting Trump’s speeches into 28 smaller videos and testing which pieces had the most views.
But with record unemployment and 211,000 Covid deaths, what narrative would keep supporters engaged and energized?
Well, their strategy seems similar to convergent rapid ideation and testing: they make small, shareable hype-memes, mostly out of context, that users can easily share without too much effort on their part.
Let’s have a look:
Notice a pattern?
Not many videos over 1 minute.
Classic Trump one-liners as short video clips on his personal Youtube Account.
How YouTube recommends content
The Trump campaign is taking advantage of the fact that YouTube’s algorithm is more likely to recommend channels that are updated regularly with new content, combined with not making users think too hard about what they’re liking or sharing.
Essentially, they’re doing user testing with the quantitative data they get on every video they upload.
Even if a video gets yanked for a violation, they still get performance data to better inform the big-spending election ad messages we’re seeing on the masthead.
They’re finding the most divisive messages and content that speaks best to voters outside their core support system — And THAT is why we are all seeing these ads.
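The testing loop described above (upload many short clips, measure which ones travel furthest beyond the core audience, and promote the winners) can be sketched in a few lines. This is a hypothetical illustration only: the clip titles, metrics, and scoring formula are all assumptions, not the campaign's or YouTube's actual data or method.

```python
def top_performing_clips(clips, k=2):
    """Rank clips by views weighted by the share of views that came
    from non-subscribers, i.e. reach beyond the core audience.
    A purely illustrative scoring heuristic, not YouTube's."""
    def score(clip):
        non_sub_share = clip["non_subscriber_views"] / clip["views"]
        return clip["views"] * non_sub_share
    return sorted(clips, key=score, reverse=True)[:k]

# Hypothetical performance data for three uploaded clips.
clips = [
    {"title": "Clip A", "views": 500_000, "non_subscriber_views": 450_000},
    {"title": "Clip B", "views": 800_000, "non_subscriber_views": 200_000},
    {"title": "Clip C", "views": 300_000, "non_subscriber_views": 285_000},
]

# Clip B has the most raw views, but A and C reach far more
# non-subscribers, so they "win" the test and get promoted.
winners = top_performing_clips(clips, k=2)
print([c["title"] for c in winners])  # → ['Clip A', 'Clip C']
```

The point of the sketch is the selection pressure: raw view counts matter less than reach outside the existing subscriber base, which is exactly the audience a masthead buy is meant to hit.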
Does it work? The campaign spent millions to spam YouTube’s homepage during the Democratic convention. It drew 40 million views to five new ads, and 93 percent of the watch-time came from non-subscribers — you tell me.
So the question becomes, who can we blame?
Google says it’s against company policies for advertisers to make false claims, but politicians repeatedly violate the policy, continue to have a platform, and see their violations go unpunished.
Do we blame the users? Well, that wouldn’t be a good UX strategy, now would it?
Do we blame the Trump Administration for weaponizing data to further their agenda? They have had over 300 videos pulled, yet there’s no accountability but a slap on the wrist, so why would they stop?
Do we blame Google/Youtube for prioritizing ad generating revenue over the User Experience? I mean that’s really what’s going on here…
Personally, I’m gonna go ahead and blame 2020.
I kid….but, selling the masthead to an Election Ad campaign?
That. Is. Greasy.
Google is banning election ads after November 3rd. With the surge in mail-in voting due to Covid, it might take weeks to tally the votes and announce a winner.
This window could give either candidate a chance to run ads prematurely claiming victory, spreading misinformation about the results.
They are trying. But, as I said before, they’re being ‘Proactive’ — I think they should be ‘Reactive’.
Making an example out of a flagrant violator would stop the misinformation and disinformation on the platform. Well, at least until the next person figures out a new way around it…
Award-winning photographer & digital artist turned UX Designer with over 10 years of experience in content creation and content strategy. From hands-on projects to managing & nurturing creative teams, I am a visual storyteller dedicated to solving complex problems & producing creative solutions.