Tulsi Gabbard is right – Google is biased


On the night after the first Democratic debates, Tulsi Gabbard was by far the single most Google-searched candidate [1]. It was a perfect opportunity for the new candidate to raise crucial donations and climb in the polls to qualify for the next debates. But that would be too easy. At the height of Tulsi's surge in popularity, Google suspended her advertising account for six hours, drastically limiting her ability to direct newcomers to her campaign website [2]. Tulsi is asking for $50 million to compensate for damages. Google's response? "We have automated systems that flag unusual activity on all advertiser accounts … and we do so without bias toward any party or political ideology." They have an algorithm, and it is unbiased.

The classic argument goes that machine-learning algorithms are mathematical and, by their very nature, neutral and unbiased. But this unchecked theoretical view of Silicon Valley engineers crumbles in reality. In our reality, algorithms reinforce the biases they learn from their training data.

The new invisible hand of modern discourse is the machine-learning algorithm that tech companies use to recommend shopping items, organize social media feeds, and personalize search results. These algorithms start off with a small set of very simple instructions, and programmers then feed them pools of data to learn from on their own. Machine-learning algorithms are good at navigating complexity – much more efficiently than humans. They can quickly skim through large databases and prioritize certain values over others.

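To make that process concrete, here is a minimal sketch of the workflow in Python. The toy review data, the labels, and the choice of scikit-learn are illustrative assumptions of mine, not anything Google or any other company actually runs.

```python
# A minimal sketch of the workflow described above: a few simple instructions,
# then a pool of data the model learns from on its own. The toy data and the
# choice of scikit-learn are illustrative assumptions, not any company's real system.
from sklearn.feature_extraction.text import CountVectorizer
from sklearn.linear_model import LogisticRegression
from sklearn.pipeline import make_pipeline

# "Pool of data": whatever labeled examples the programmers happen to feed in.
texts = ["great product, fast shipping", "loved it, works perfectly",
         "broke after one day", "terrible, want a refund"]
labels = [1, 1, 0, 0]  # 1 = recommend, 0 = don't recommend

# "Small set of very simple instructions": vectorize the text, fit a linear model.
model = make_pipeline(CountVectorizer(), LogisticRegression())
model.fit(texts, labels)

# Everything the model now "knows" comes from that training pool,
# including any skew or bias the pool happened to contain.
print(model.predict(["fast shipping but broke"]))
```
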
Today, algorithms are increasingly entrusted with critical decision-making, including court sentencing, granting loans and benefits, and even hiring for jobs and academic placements. [7] But there is a catch. Much of the development and implementation of algorithms happens in secret. Their formulas are proprietary, and users rarely get to know even the variables that make up their equations. Oftentimes, machine-learning algorithms make decisions whose reasoning not even their developers can explain, and yet they just seem to work. [7]

But mathematics cannot solve everything. The result of machine-learning algorithms is solipsistic homogeneity – a process of finding associations, grouping data into categories, and creating a structure of sameness. The training data is always paramount to any algorithm. If social or political biases exist within that data, the algorithm is most likely going to incorporate them. Oftentimes, it's the historical data that carries a negative social footprint into the automation.

In 2018, Amazon was looking for a way to automate its hiring system. To recruit new engineers more quickly, it developed an artificial intelligence that would scan through past resumes and search for the best candidates on the web. But because the historical data consisted predominantly of male resumes, the AI "learned" that men are preferred over women. The algorithm automatically downgraded all CVs containing the term "women's" or mentioning women-only schools. When Amazon learned about this, they tried to repair the algorithm, but soon found that no matter what they did, it would always find new forms of bias. So they decided to kill the algorithm and return to traditional hiring methods. [11]

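To see how a pattern like that can emerge, here is a deliberately toy reconstruction: a classifier trained on a handful of invented resumes with skewed historical outcomes ends up assigning a negative weight to the token "women" entirely on its own. The data and model below are hypothetical illustrations; Amazon's actual system was never published.

```python
# Illustrative toy example only: invented resumes and labels, not Amazon's data or model.
# The point is that a classifier trained on skewed historical decisions
# learns to penalize gendered terms without ever being told to.
from sklearn.feature_extraction.text import CountVectorizer
from sklearn.linear_model import LogisticRegression

resumes = [
    "software engineer, men's chess club captain",        # historically hired
    "backend developer, fraternity president",            # historically hired
    "software engineer, women's chess club captain",      # historically rejected
    "backend developer, women's coding society lead",     # historically rejected
]
hired = [1, 1, 0, 0]  # biased historical outcomes

vec = CountVectorizer()
X = vec.fit_transform(resumes)
clf = LogisticRegression().fit(X, hired)

# Inspect the learned weight for the token "women": it comes out negative,
# i.e. the model downgrades any resume containing it.
weights = dict(zip(vec.get_feature_names_out(), clf.coef_[0]))
print(weights["women"])
```
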
Similar to Amazon's hiring AI, Google's advertising algorithm also mirrored the cultural biases of historical data. A study found that the system shows ads for high-income jobs to men disproportionately more often than it does to women. [8]

In other cases, users can attempt to feed the algorithm biased information and manipulate its outcome. Not so long ago, Google Search's autosuggest feature relied heavily on user-input data, until users learned how easily the system could be gamed to manipulate its rankings, or simply to troll the search engine with a cesspool of bigotry. So Google decided to drastically interfere with its search algorithm, removing entire dictionaries of non-advertiser-friendly terms. [10]

Artificial intelligence is also used to predict criminal behavior, and judges rely on those predictions to determine their sentencing. But not even this realm is immune to algorithmic bias. One such widely used algorithm flagged African Americans who did not go on to re-offend as higher risk almost twice as often as white Americans, while white Americans who did re-offend were labeled lower risk almost twice as often as African Americans. [9]

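The disparity described above is a gap in error rates between groups: being flagged high risk without re-offending is a false positive, and being labeled low risk before re-offending is a false negative. A quick sketch with invented round numbers (not the actual figures from the study) shows how such a two-to-one gap is computed:

```python
# Invented counts for illustration only -- not the actual figures from the study [9].
# "Flagged higher risk but did not re-offend" is a false positive;
# "labeled lower risk but did re-offend" is a false negative.
def false_positive_rate(flagged_no_reoffense, total_no_reoffense):
    return flagged_no_reoffense / total_no_reoffense

def false_negative_rate(cleared_but_reoffended, total_reoffended):
    return cleared_but_reoffended / total_reoffended

# Hypothetical group A vs. group B, 100 people in each bucket.
fpr_a = false_positive_rate(40, 100)  # 40% of non-re-offenders wrongly flagged high risk
fpr_b = false_positive_rate(20, 100)  # 20% wrongly flagged high risk
fnr_a = false_negative_rate(25, 100)  # 25% of re-offenders wrongly labeled low risk
fnr_b = false_negative_rate(50, 100)  # 50% wrongly labeled low risk

print(fpr_a / fpr_b)  # 2.0 -- twice as many false "high risk" flags for group A
print(fnr_b / fnr_a)  # 2.0 -- twice as many false "low risk" passes for group B
```
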
Machine-learning algorithms are still very weak at understanding the nuances of human language. Under pressure from advertisers, YouTube cracked down on extremist content by automatically flagging and demonetizing videos containing a whole vocabulary of keywords. But the algorithm is not capable of differentiating between content that is truly extremist and content that is educational or merely reporting on it. YouTube's workaround was to give mainstream media an exclusive pass, automatically alienating independent creators and journalists in the process. [12] [13]

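A keyword filter of the kind described has no notion of context. The sketch below, with an invented keyword list and made-up video titles, shows why it flags reporting and education just as readily as the content it was meant to catch:

```python
# Toy sketch of keyword-based flagging; the keyword list and titles are invented.
# A filter like this cannot tell extremist content from journalism about it.
FLAGGED_TERMS = {"terrorism", "shooting", "extremist"}

def should_demonetize(title: str) -> bool:
    words = set(title.lower().split())
    return bool(words & FLAGGED_TERMS)

print(should_demonetize("join the extremist cause"))           # True -- intended target
print(should_demonetize("documentary: surviving a shooting"))  # True -- reporting, flagged anyway
print(should_demonetize("history lecture on terrorism"))       # True -- educational, flagged anyway
```
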
The success of machine-learning algorithms stands and falls on the availability of good data. The catch is that there will always be less information about minorities, which will always lead to a higher likelihood of invalid statistical patterns about minorities. [14] A perfect manifestation of this reality is Amazon's facial recognition tool, which misidentified women as men 19% of the time, and brown and black women as men up to a third of the time. [15] [16]

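One way to see why sparse data about a subgroup translates into worse predictions for that subgroup is to train a model on synthetic data in which one group makes up only a sliver of the examples and follows a different pattern (exaggerated here for clarity). Everything below is invented for illustration:

```python
# Synthetic illustration: when one group is barely represented in the training data,
# the model's error concentrates on that group. All data here is invented.
import numpy as np
from sklearn.linear_model import LogisticRegression

rng = np.random.default_rng(0)

# Majority group: 950 examples; minority group: 50 examples.
# The relationship between feature and label differs between the groups
# (deliberately exaggerated so the effect is easy to see).
x_major = rng.normal(size=(950, 1))
y_major = (x_major[:, 0] > 0).astype(int)
x_minor = rng.normal(size=(50, 1))
y_minor = (x_minor[:, 0] < 0).astype(int)  # opposite pattern

X = np.vstack([x_major, x_minor])
y = np.concatenate([y_major, y_minor])

clf = LogisticRegression().fit(X, y)

print("accuracy on majority group:", clf.score(x_major, y_major))  # high
print("accuracy on minority group:", clf.score(x_minor, y_minor))  # far lower
```
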
The algorithm is not always to blame for all the bias. Sometimes the corporate or organizational interests of its creators can hugely interfere with its delivery. As Google grew to become the dominant search engine worldwide, it slowly began offering more and more services that competed directly with the providers that relied on Google Search to reach their customers. [5 a,b] When the company launched Google Finance, it began prioritizing it over the organic search results for relevant keywords, even though Yahoo Finance held the title of most popular among users. This practice then expanded to Google Health, Google Reviews, Maps, video, travel and bookings, and email. Prioritizing its own products allowed Google to steal up to 34% of the search traffic. That percentage is even higher now, as Google Search offers instant answers and a wider range of Google products that keep users on Google longer and thus generate more ad revenue for the company. [17] [18] [19]

This is not a critique of whether Google, as a private company, should be allowed to push its own products. Rather, it shows yet another vector for bias to sneak into the algorithm, and it shows that the search engine is not as neutral as Google would have you believe. Corporate bias is a powerful factor. And corporate bias is especially important to political insiders.

Longtime Google executive Eric Schmidt has worked hand-in-hand with the Democratic Party, with both the Obama and Hillary Clinton campaigns. There was a lot of effort from Google insiders to get Hillary Clinton elected. This included implementing features that would manipulate the Latino vote in key states, and investing in startups and groups that would support the Clinton campaign with technology, data, and advertising. [20] [21]

Tulsi Gabbard probably doesn't enjoy the same level of insider connection with one of the most influential tech companies in the world. So whether the temporary suspension of her account at a critical moment was just an error of the algorithm or was intentional remains speculation at this point. Had Tulsi had people on her side at Google headquarters, this suspension might never have taken place, or it would have been much shorter.

Google is refusing to answer the crucial questions: What variables triggered the automated system to suspend her account? Was it flagged by the algorithm and then suspended manually, or was the decision made by the algorithm alone? What unusual activity led to the algorithm flagging Tulsi's Ads account? Spending significantly more on Google ads after she became the most searched candidate could only be expected as the most rational move a presidential candidate could make. It is definitely not unusual activity; everybody's strategy would be to capitalize on the search traffic. It's very difficult to understand the reasoning behind suspending her account under these circumstances.

This practice of unaccountable moderation is an industry standard across all major social media platforms [3 a,b]. Routine censorship raids on social media gave the right an argument to accuse Silicon Valley of liberal bias. [4] Whatever the case may be, the presence of bias is undeniable. Algorithms are mathematical, but they can only learn from people. A good step forward would be to admit that the bias exists and to open up the source code of machine-learning algorithms, so that we can study these biases in real time as they arise. Secret development of artificial intelligence by unaccountable tech corporations is a recipe for dystopian control of the information flow and monopolization of Internet markets. Tulsi Gabbard learned this the hard way.

Maurice Vega

100 Responses

  1. You can now follow references appearing at the bottom left corner of the video. All sources are listed in the description. Thank you all for your support!

  2. Democracy is impossible when all information is controlled. A republic is impossible when a ruling class exists above the law!

  3. Yeah, Google and YouTube search absolutely suck now and are essentially useless. All that hard work for nothing. I use DuckDuckGo as my search engine now, although it's not quite what I'm looking for either. And BitChute is the streaming platform I use as an alternative to YouTube; I'm using it about 30% of the time now, since YouTube will not host much of the content that I find interesting and valuable.

    For a wake-up call, in case you are dead asleep, Google "Federal Reserve", "Donald Trump", or "9/11 Conspiracy Theory"… basically anything considered interesting or controversial, and you'll see how garbage the search results are. Nothing valuable shows up, and the results are largely curated and preselected whether you are logged into YouTube or not. Even "Flat Earth" is heavily censored. YouTube also hit most good independent content creators with heavy demonetization, shadow bans, and outright channel deletions while favoring Big Media / TV content. Pathetic; however, superior alternatives are being developed as I type.

  4. Google: "It wasnt us, it was…alghoritm"

    The most childish excuse ive ever heard since kindergarten "graduation." 🙂

  5. 9:09 The biases are not necessarily in the "source code" but the training data and tuning parameters of the model building phase.

  6. Google is crazy with their censorship. Have you noticed that mainstream media is now the top result on YouTube when looking for a topic, not to even start with the Trends section? Their engine is also heavily censored; you can see it with copyrighted adult content over the last two years, where you'll hardly find any full movie streams with their search engine, while DuckDuckGo works just fine.

  7. Wasn't Google bought by the Obamas while they were in office? Google was directly told and/or paid to block her, duh.

  8. Google and YouTube first censored and then blocked around 80% of conservative pro-life channels and Christian Catholic channels in my country. They also block live channels from church ceremonies after one bishop started saying that the LGBT movement is doing bad things to our society by organizing half-naked marches in the streets. We have a lot of liberals in charge, corrupted by the EU, so if someone disagrees they call him a fascist and say he spreads "hate speech." Really? A half-naked woman, or a man with a bra on his head and a dildo in his hand – what the heck is going on?

  9. Know what's really scary? AI making decisions for govt officials. I'm also concerned about self-driving vehicles and their algorithms. If my car is entering a crosswalk and suddenly must decide which person to strike (if there's no other option), will it choose to run over the 80-yr old woman or the 10-yr old child in the crosswalk? Society values youth over old age, so will the algorithm reflect that value? Who's responsible for the accident and subsequent damages? The programmers? The insurance companies? The auto manufacturer?

  10. She will end up like Sanders: drag people into the "Democrats" pit, and after that… bye bye.

    Democracy is a joke.
    #yellowjackets

  11. I knew Google was evil when they started preventing people from accessing their email accounts unless they provide their mobile phone number as well.

  12. "the gene learning algorithms are mathematical, and by their very nature, neutral and unbiased"
    Only a cis hetero valid white man can believe this sentence. You can't learn something without introducing some bias, and minorities tend to learn this the hard way

  13. Lmao you're telling me the AI has male preference for job applications and rates basketball americans as being more at risk? All because of statistical fact? Geez, AI truly is based and redpilled. Maybe we should hand the reins over to it, save us from clown world :^)

  14. It doesn't matter if they did it manually or not. They shit the calf and drowned the cow. It's fucked. Even if they didn't, we can't trust that they didn't, because they have done it plenty of times before.

  15. Everybody should stop using Google and go to DuckDuckGo, or whatever it's called. I haven't switched because, well, honestly, I'm an idiot when it comes to technology. 😳

  16. No they have chosen her as President….rather God has and they are following correct suit by statistically focused Ai driven content exposure methods. In other words it's too early… she needs to be an underdog Bravo I say! AI can provide data, but Silicon valley made the right choice. Slow and steady. Aloha Google.

  17. As I countered the bias of the elitist media about my nation after the revocation of Article 370, my account was banned on Facebook and remains so as I write this comment. These people cannot tolerate anyone countering their narratives. I wish the American people well and hope they will come out on top and see through the lies of the elitists. Tulsi seems an excellent woman with moral integrity; I hope Americans will see past her religion. Another excellent video, keep it up.

  18. Google reportedly cited “problems with billing information or violations of our advertising policies,” then “suspicious behavior in the payment activity in your account,” before reinstating the account.

    https://www.google.com/amp/s/www.theverge.com/platform/amp/2019/7/25/8930373/google-tulsi-gabbard-democratic-candidate-lawsuit-ad-search-ban-political-bias-debates

  19. There's some bias: when I scroll up looking for more "Related" content and Google/YT thinks I'm reviewing something for purchase, that "Related" section pops up every time, flawlessly.

    The opposite happens when I leave your videos and scroll up; you would think the word "related" doesn't exist, because it's never anywhere in sight, without fail.

    "Oh, but the AI directed people right back to his page, that's as related as it gets."

    🤷🏽‍♂️ Sure, why not.

  20. One error: women are not a minority. They are 50% of the population, so that is not an example of minority status leading to poor results in facial recognition. Either the sample data was poorly designed and included too few women, or women are harder for machines to recognize for other reasons.

  21. The thing about facial recognition is that black or dark skin absorbs more light than fair or white skin. When you look at a black sheet of paper you don't notice differences in relief, but if a white sheet of paper has an indentation you notice it instantly.

  22. Here is a guest speaker giving a talk (at Google no less) about how their algorithms are biased / racist etc. . The "algorithm talk" really kicks in at 31:00.
    https://youtu.be/EMUEuMV112E
    The google attendees more-or-less concede.

  23. Though… I agree, the algorithms should be open-sourced. But that will probably never happen… Google et al. will never open their doors. The only way this will happen is for an open-source AI algorithm to be invented… but since it is going to be "the Truth", it will limit profitability. No company will willingly take on less profit… for the Truth. Cheers!

  24. 2:40 Instead of scrapping it, they should've started it from scratch, using equal amounts of male and female resumes to train the model.

  25. The Hated One, can you possibly do a review of AirDroid application used to transfer files from Samsung to Apple? How secure is it really? Is it abusing the trust of customers much like google and facebook? How easy is it to hack after the latest security patch was implemented? Or maybe you can recommend a link or two that covers it?

  26. Machine algorithm bias… naah I don't buy it. We all know that media outlets have their own bias during elections – why would this media corporation be any different? If we want to regulate Google then they all need to be reined in. Problem is – there's no competition for many of these online media. Anti trust laws & anti monopoly laws need to be enforced & Google & Facebook need breaking up… and soon.

  27. Not even the idiots in Washington, DC, have an inkling that something may be afoot with big tech's bid to own the public narrative. I suspect that one day, LONG after Google/Facebook/Amazon have installed their own puppet totalitarian regime, the ousted Washington insiders will have that Eureka moment, saying to themselves, "Maybe we should have tried to do something about this before it was too late."

  28. I think it's great that Democrats are becoming victims of this, since, if anything, they decided not to care about this issue for so long, as long as the targets were their political opponents.

  29. It's not just the bias in the datasets selected; they also allow manual interventions to always override the decisions.
    The algorithms give you what you want, like this video for me… but the algorithms become too people-friendly for the corporate overlords, so they purge them with their "family friendly" content

  30. The bias seems to be a problem, because they had a public talk about it. But they HAVE taught us stochastic gradient descent, learning rate, batch size, and all the nitty-gritty about parameter tuning. Hmm, funny. And in finance, everybody builds their own trading bots in secret and lies to others that they don't use machine learning lol

  31. Fuck Google ain't worth a shit somebody needs to stop their damn ass before they destroy everything we're going to be at War and it's going to be a sad world things don't change soon and ship up are the white worms looking in major trouble major problems fuck Google earn dumbasses destroy go back to the way it was fucking assholes need to get their stinking ass so that all the wrongdoings are doing. They ain't no damn good

  32. I'm curious where you got your statistics for your race-based statements. One, I rarely believe any statistics on that, and two, the more we perpetuate these things the more we perpetuate division. The problem is more about economics than race; if we focused on the real issue, the latter would cease. But people aren't willing to do the hard work of solving real issues because their brains can't easily solve them. That's not an insult; it's a widely researched finding in brain science. Without addressing the root causes, all other problems will persist.

  33. Why would they be biased? I think that the better question actually would be "why wouldn't they be biased". How will people be able to prove it without the algorithm?
    And what makes the matter even worse is that they can't show how everything works because people will 100% exploit and/or hack it.

  34. Somewhat relevant nit pick: You can open the source of your ML algorithm and Google does that in many cases, but you won't be able to reproduce what the neural network does unless you also have the model (made from lots of data), which you're very unlikely to get.
    Even with the correctly trained model it will be hard to understand the decisions that your network makes.

  35. They made their algorithm and they own their actions. If we can't see how it operates, why should we trust something we can't see?

  36. They are doing this on purpose. Be vigilant and don't let your guard down, because it might just get worse.

  37. What's wrong with men being the best? Don't companies want to hire the best? Unless they are hiring women just to have vaginas in the building.

  38. Not fixing it is still a choice anyway, and she is an electoral candidate, not just yet another advertising number.

  39. Is google biased? OF COURSE THEY ARE! The problem is that google/youtube is private property and thus not regulated as a public utility. For this reason, they are not violating any laws. The first amendment restrains the government from impeding on your free expression, not private businesses.

  40. There needs to be an app for cellphones and computers that can make the data appear as if you never watched or used your cellphone or computer.

  41. Don't think twisting the data toward an agenda is unbiased; several email examples already suggest they can twist the results of their algorithms for a certain effect.

  42. I use DuckDuckGo; I haven't used Google in well over 5 years. I haven't shopped at a Walmart in 10 years. And Gabbard is a goddamn kook.
    TRUMP 2020! TRUMP FOREVER!

  43. If the machine says men are the best, who's to argue?
    Give the top jobs to women and watch the world fall, or at least get a whole lot worse.

  44. The top result was still her official website, even though her ad account was suspended. I don't see how that's a great manipulative force. I wouldn't particularly sell it as a good thing, but in effect her campaign saved money by not having to pay for clicks. Further still, if you ask me, people are more likely to click on the first actual result than the ad that shows up.

  45. This is what happens when we pursue the idea of "Kapital Uber Alles!" we get lazy in the procurement of goods and services and seek to martial out labor. The end result: solipsism. A solipsism created by human laziness.
