The Massive Privacy Failings Aren't Even the Scariest Thing About Facebook | Inc.com

Every product changes its users' behavior in some way; that's practically what it means to be a product. But while Facebook is very good at engineering behaviors that are good for its business--adding more friends, sharing more information with them, spending more time interacting with their content--it doesn't stop there. The company has also induced users who otherwise wouldn't have voted to do so. It got people to become organ donors. Now it's trying to get people to become more active in their local organizations and support their local newspapers.

None of these things sounds terribly sinister. Rather, they're expressions of a vaguely utopian worldview that infects much of Zuckerberg's thinking. Because he thinks "human nature is basically positive," he assumes that if more people express their ideas or vote or volunteer, the results will ipso facto be basically positive. But Zuckerberg himself has said being too "focused on the positive" for the first 10 years of Facebook's existence blinded the company to much of the abuse it was enabling as well as to emergent effects of social media like hyper-polarization.

And Facebook doesn't even stop at modifying behavior. It tinkers with users' thoughts and emotions as well. A notorious "emotional manipulation" study showed that the company could make users feel better or worse by altering the contents of their News Feeds. Recently, Facebook announced it will tweak its algorithms to encourage more "meaningful interactions" between friends because those cause its users to experience positive emotions, whereas passive content consumption leaves them feeling worse afterward.

Francois Chollet, a computer scientist who works on deep learning at Google, believes Facebook's ability to both measure and alter its users' behavior is dangerous, raising the specter of "mass population control." That's because machine learning algorithms--an area where both Google and Facebook have invested heavily, and which Facebook uses to fine-tune the content of each user's News Feed--are highly effective at connecting inputs and outputs in a recursive optimization loop. "The human mind is a static, vulnerable system that will come increasingly under attack from ever-smarter A.I. algorithms that will simultaneously have a complete view of everything we do and believe, and complete control of the information we consume," Chollet tweeted.
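To make the "recursive optimization loop" concrete, here is a minimal, purely illustrative sketch in Python of how a feed ranker might iteratively adjust what it shows based on measured engagement. The category names, the simulated user, and the epsilon-greedy scoring are all hypothetical; this is not Facebook's actual system, just the general shape of the measure-and-adjust loop Chollet is warning about.

```python
import random

# Purely illustrative: a feed ranker that learns, by trial and error,
# which content categories keep a (simulated) user engaged.
# Category names and the user model are hypothetical.

CATEGORIES = ["friends", "news", "outrage", "memes", "ads"]

# The simulated user's true (hidden) engagement probabilities.
TRUE_ENGAGEMENT = {"friends": 0.30, "news": 0.20, "outrage": 0.55,
                   "memes": 0.40, "ads": 0.05}

def user_reacts(category):
    """Stand-in for 'measuring behavior': did the user click, comment, or share?"""
    return random.random() < TRUE_ENGAGEMENT[category]

def run_feed(rounds=10_000, epsilon=0.1):
    shown = {c: 0 for c in CATEGORIES}
    engaged = {c: 0 for c in CATEGORIES}
    for _ in range(rounds):
        # Alter the information consumed: mostly show what has worked so far,
        # occasionally explore something else (epsilon-greedy).
        if random.random() < epsilon or not any(shown.values()):
            choice = random.choice(CATEGORIES)
        else:
            choice = max(CATEGORIES,
                         key=lambda c: engaged[c] / shown[c] if shown[c] else 0.0)
        shown[choice] += 1
        if user_reacts(choice):      # measure the behavior...
            engaged[choice] += 1     # ...and feed it back into the ranker
    return {c: (shown[c], engaged[c]) for c in CATEGORIES}

if __name__ == "__main__":
    print(run_feed())  # over time, the loop converges on whatever maximizes engagement
```

The point of the sketch is that the loop optimizes whatever it can measure--clicks, comments, time spent--not how the user feels afterward, which is exactly the gap between "meaningful interactions" and passive consumption described above.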
