"He sees you when you're sleepin'
He knows when you're awake
He knows if you've been bad or good
So be good for goodness sake."
"Santa Claus Is Comin' to Town," written in 1934; recorded by Eddie Cantor, Bing Crosby, the Andrews Sisters, and others
Big Brother is watching me.
Big Brother wants to sell me stuff.
OK. I hear those voices. But I still don't get it.
Everything one does online is trackable and tracked. Tuesday morning I visited the Hoka website to get screenshots of shoes for my post on Phil Knight and Nike. By that afternoon I began getting pop-up ads for Hoka shoes when I visited politically oriented websites. Foreign companies don't need spies skulking around in the shadows, nor do they need spy balloons. They could buy whatever data they need from Google, Microsoft, X, or Meta, just like Hoka did.
Google has its search engine, Gmail, and Blogger, the platform that hosts this blog, and it owns YouTube. They know everything. The billionaire ex-wife of a Google founder just joined the RFK, Jr., ticket as the candidate for Vice President. She can legally spend as much money as she wants on the campaign. She opposes vaccinations. She opposes in vitro fertilization. I think she is a conspiratorial nutcase, but she is a billionaire conspiratorial American nutcase. She can legally make all the mischief she wants.
Tech firms know what I buy, where I am, who I talk to, and what I write, including what I delete. I own stock in each of them but I don't control them. I find TikTok interesting and fun. It is essentially the same thing as YouTube, except YouTube has more long-form, educational videos on history, geography, and science. My TikTok feed leans toward movie clips, stand-up comedy routines, political commentary, sleight-of-hand card tricks, and sports. It seems pretty harmless to me. Apparently I watch those short videos to the end, so TikTok feeds me more of them. That doesn't strike me as invasive or manipulative. It strikes me as responsive and attentive -- a bit of Dale Carnegie-style salesmanship: Listen to the customer.
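To make that "responsive and attentive" behavior concrete, here is a minimal sketch of how an engagement-driven feed might rank clips. It is purely illustrative; the video categories, the completion-rate signal, and the scoring rule are my assumptions, not anything TikTok or YouTube has published.

```python
import random
from collections import defaultdict

# Toy model of an engagement-driven feed (illustrative only; not TikTok's
# actual algorithm). Each category's score is the viewer's average
# completion rate so far, so whatever gets watched to the end gets shown more.

CATEGORIES = ["card tricks", "stand-up comedy", "sports", "politics", "cooking"]

watch_history = defaultdict(list)  # category -> list of completion rates (0.0 to 1.0)

def record_view(category, completion_rate):
    """Store how much of a clip the viewer actually watched."""
    watch_history[category].append(completion_rate)

def score(category):
    """Average completion rate, with a small default so new categories still appear."""
    views = watch_history[category]
    return sum(views) / len(views) if views else 0.3

def next_clip():
    """Pick the next category, weighted by past completion rates."""
    weights = [score(c) for c in CATEGORIES]
    return random.choices(CATEGORIES, weights=weights, k=1)[0]

# Simulate a viewer who finishes card-trick clips and skips most of everything else.
for _ in range(50):
    category = next_clip()
    finished = 1.0 if category == "card tricks" else random.uniform(0.1, 0.6)
    record_view(category, finished)

print({c: round(score(c), 2) for c in CATEGORIES})  # "card tricks" drifts toward the top
```

Run it a few times and the card-trick weight climbs while the others stay flat, which is all I mean by the feed being responsive: it amplifies whatever you already watch to the end.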
I hear the criticism that TikTok is too good. It should be American-headquartered companies, not a Chinese-headquartered one, doing a good job. TikTok is winning friends and could influence people -- how American of them. The most dangerous and heavy-handed manipulators of information I observe are Rupert Murdoch and Fox. If TikTok is manipulating that feed by sending me hidden messages in the choice of magicians and commentary that I see, it is very subtle and I don't care. Fox isn't subtle at all.
I consider TikTok to be exactly as dangerous as YouTube. One has a corporate parent in China. The other is headquartered in Silicon Valley. Both are potential time-wasters, but so are cooking shows, basketball tournaments, and crime fiction novels. People should be free to amuse themselves. But shouldn't young people be doing their homework? Sure, but that won't change. People will find what is fun for themselves. TikTok is more interesting than algebra problem sets.
The most dangerous manipulators of the American mind are the Russian government and American billionaires. I don't trust government to ban speech and neither did the writers of the Bill of Rights. I prefer more speech even if some of it is dead wrong.
Maybe I am wrong here. But I am independent.
I will be happy to publish guest posts that attempt to persuade readers of this blog that TikTok is worse than Rupert Murdoch.
[Note: To get daily delivery of this blog to your email go to: https://petersage.substack.com Subscribe. Don't pay. The blog is free and always will be.]
Comments:
I, just for fun, looked at new car prices from a Google search and now I am getting lots of car suggestions. I too like TikTok and think it is no more dangerous than anything else. Let China know I like Biden and hate Trump by what I watch. Let them know I’m a sports guy and watch all the science stuff; not all Americans are science deniers. The world has become smaller and we are integrated, even though some countries are dangerous. I think it’s a mistake to stop the integration.
Ultimately, the most dangerous manipulators of the American mind are those who possess them – or are possessed by them, as the case may be. Whether the influencers are TikTok, Fox “News” or Hollywood, they are simply responding to demand. The cause of all our problems is our own behavior. If the U.S. winds up degenerating into a sleazy autocracy, we’ll have no one to blame but ourselves. As Pogo said, “We have met the enemy and he is us.”
Just a reminder: There is no requirement to be on any social media. Even in this Disinformation Age, one can survive quite well without it. It could even give one time for something really subversive, like reading a book.
If you want to know about subversive influence, read Heather Cox Richardson's Letters From an American post from last night. With due respect to Peter.
The real danger from TikTok is that it allows the Chinese Communist Party to control the content-selection algorithm that directs the attention of so many Americans.
Many young Americans get all of their news from TikTok. I can just imagine the anti-Taiwan slant in that news for the six months prior to a Chinese attempt at a military takeover of Taiwan.
What Michael describes is much like how Russia funneled money into social media ads to promote TFG in the US.
When Bush invaded Iraq I asked my friends, "What if Iraq doesn't want democracy?"
It looks like Americans (TFG supporters at least) would be happy for Russia to control the United States.
“What Michael describes is much like how Russia funneled money into social media ads to promote TFG in the US.”
The difference is, Russia was not in control of the content selection algorithms for Facebook and Twitter.
Fears of the Chinese using TikTok algorithms to influence the election are probably unfounded. They’re undoubtedly well aware that Republicans don’t need any help spreading Trump’s stupid lies, and regardless of any help they had, his deranged BS will only resonate with those who already wallow in it.
The nature and influence specifically of the TikTok platform is over my head, but overall it seems apparent that the Chinese government is a far greater and more influential manipulator of the American mind than the Russian government, especially in the academy, mass media, and the halls of government. The much-bruited Russian social media purchases associated with the 2016 election were, comparatively speaking, a drop in the bucket. Left-leaning folks have transitioned from decades as minimizing apologists for Soviet Russia to being minimizing apologists for the Chinese government today, because the latter is now the international vanguard for socialist or communist conceptions of social justice and centralized collectivism.
Facebook did some experiments a number of years back that showed that subtle changes in the content selection algorithm could produce significant political effects in voters who followed those particular content feeds.
Mike claims that the concern I raised is "probably unfounded." Perhaps he would care to let us know the evidence he has to back up his assertion.
Studies have shown that changing a platform's algorithm substantially changes what people see and how they behave on the site, but doesn't affect their beliefs.
https://www.npr.org/2023/07/27/1190383104/new-study-shows-just-how-facebooks-algorithm-shapes-conservative-and-liberal-bub
Perhaps Michael would care to let us know what evidence he has "that subtle changes in the content selection algorithm could produce significant political effects in voters who followed those particular content feeds."
The quote below is from the article that Mike linked to. The article clearly does not support Mike’s contention that concern about Chinese Communist Party influence on the content selection algorithms of social media is "unfounded."
Liberals indulged themselves in a panic over Russian influence on our politics via social media after the 2016 election. And that was a situation in which Russia was not in control of the content selection algorithm. We have a new election coming up, and however strong Russian influence was, Chinese influence will be stronger thanks to their control of the algorithm.
If Russia was a problem in 2016, China has got to be a worse problem in 2024.
——————————-
The study's short duration and setting — a three-month period ahead of a highly contentious national election — may have been too short to show an impact on beliefs, he added.
The research published on Thursday is the first batch of more than a dozen studies the project took on; further papers are in the works about the impact of political advertising, the spread of disinformation, and other topics.
Ultimately, these studies raise more questions than they answer, said Chris Bail, director of Duke University's Polarization Lab, who was not involved in the research but reviewed the findings.
"We need many, many more studies before we can come up with these types of sweeping statements about Facebook's impact on democracy, polarization, the spread of misinformation, and all of the other very important topics that these studies are beginning to shed light on," he said.
For the record, my actual point is that Chinese manipulation of algorithms isn't likely to change people's beliefs, which is exactly what the article states. Of course more research is needed, but meanwhile Michael provides no research-based evidence contradicting that premise.
Michael provides common-sense logic that anyone with an open mind would be capable of understanding and agreeing with.
Sounds like the stolen election argument: Who needs evidence?
Algorithms encourage people to live in echo chambers of repeated and reinforced media and political content. They don't change people's beliefs; they reinforce them. So, logic would lead us to conclude that no matter who is manipulating the algorithm, it still isn't likely to change people’s beliefs or who they vote for.
Unless, of course, you have some evidence to the contrary besides, "I said so."