Facebook can't hide behind algorithms
If Facebook's algorithms were executives, the public would be calling for their heads, such was the ugly incompetence on display this week.
First, the company admitted a "fail" after its advertising algorithm allowed advertisers to target users with anti-Semitic interests.
Then, on Thursday, Mark Zuckerberg said he was handing over details of more than 3,000 adverts bought by groups with links to the Kremlin - purchases made possible by the same advertising algorithms that made Mr. Zuckerberg a multi-billionaire.

Highly embarrassing, you might say - but of course you cannot fire an algorithm. And besides, it was only doing what it was told.
"The algorithms work exactly as they were designed to work," says Siva Vaidhyanathan, professor of media studies at the University of Virginia.
That is what makes this controversy so difficult to solve - it is a crisis that strikes at the core business of the world's largest social network.
Fundamentally flawed
Facebook did not build an enormous advertising business by winning contracts with big companies.
No, its success lies with the little guy: the florist who wants to spend a few dollars targeting locals when the school prom comes around, or the plumber who has just moved to a new area and needs to find work.
Facebook's wild profits - $3.9bn (£2.9bn) between April and June of this year - are built on this automated process. It works out what users like, finds advertisers who want to target those interests, marries the two and takes the money. No humans needed.
But unfortunately, that lack of oversight left the company open to the kind of abuse exposed by ProPublica's investigation into anti-Semitic ad targeting.
"Facebook's algorithms created these categories of anti-Semitic terms," says Professor Vaidhyanathan, author of Anti-Social Network, a book about Facebook due out later this year.
"It is a sign of the absurdity of a system without humans, and of how dangerous such a system can be."
That system will become a little more human in the future. In his nine-minute address, a visibly uncomfortable Mark Zuckerberg said his company would put more human beings in place to prevent political abuse. The day before, the company's chief operating officer said that more humans would help solve the anti-Semitism problem.
"But Facebook cannot hire enough people to sell ads to other people at that scale," says Professor Vaidhyanathan.
"It's the very idea of Facebook that's the problem."
Crazy idea
Mark Zuckerberg is in rough, uncharted waters. And as the "leader" (as he puts it) of the largest community ever created, he has nowhere to look for advice or precedent.
This was most evident on 10 November last year, two days after Donald Trump was elected president of the United States.
Quick as a flash, Mr. Zuckerberg rejected the suggestion that misinformation on Facebook had influenced the vote as a "crazy idea."
That turn of phrase turned out to be Mr. Zuckerberg's biggest mistake to date as chief executive.
After Trump's election victory, Facebook's influence was called into question.
His naivety about his own company's power sparked enormous debate - both internally and externally - and an investigation into the impact of fake news and other abuses was launched.
On Thursday, the 33-year-old appeared to concede that not only had this interference affected the election, but that he had done little to prevent it.
"I wish I could tell you that we will be able to stop all interference," he said.
"But that would not be realistic. There will always be bad people in the world, and we cannot prevent all governments from interfering."
It was a huge turnaround from his position just 10 months earlier.
"It seems to me that he is basically admitting he has not mastered the system he built," says Professor Vaidhyanathan.
It is not surprising, then, that Mr. Zuckerberg "looked like an unlikely young leader speaking to his people in a time of crisis," as the New York Times put it.
Wolves at the door
This is not the first time Facebook's reliance on machines has got it into trouble - and it would be quite unfair to characterize this as a problem affecting only Mr. Zuckerberg's business.
Just in the past week, for example, an investigation by Channel 4 revealed that Amazon's algorithm would readily suggest the components needed to make a homemade bomb, based on what other customers had also bought.
Critics say Big Tech's algorithms are out of control, having gone unchecked for too long.
At least two US senators are backing a new bill that would require social networks with more than one million users to follow new transparency rules for political campaign ads. Mr. Zuckerberg's statement on Thursday - a solemn promise to do better - can be seen as a way of keeping the regulatory wolves from the door. He - and every other tech chief executive - would much rather deal with this in his own way. But Professor Vaidhyanathan ...