Removing Human Editors Won't Solve Facebook's Bias Problem
Facebook recently announced that it has changed the way it handles a key section of its website. Instead of being curated and edited mostly by human beings, the social network said, its "Trending Topics" feature will now be almost entirely produced and structured by computer algorithms.

This change was driven in part by a controversy that flared up earlier this year, in which human editors who worked on the feature said they were encouraged to exclude certain conservative news sites and prevent them from trending, although Facebook denied this.

In its blog post about the new approach, the social network says it found no evidence of "systemic bias" in the way trending topics were selected, but nevertheless hopes that the change will make the feature "a way for people to access a breadth of ideas and commentary about a variety of topics."

Presumably, Facebook is hoping that handing the feature over to an algorithm will make it easier to defend against these kinds of accusations, because computer code is seen as more objective and/or rational than human beings, and thus not susceptible to bias.

The code that operates Facebook's news feed and trending algorithms, however, isn't some kind of omniscient or ruthlessly objective engine, as technology analysts continually point out. It is designed and programmed by human beings, and in most cases it incorporates the biases of those human programmers.

As it turns out, Facebook isn't actually taking all of the human beings out of the Trending Topics process. The company noted in its post that human editors will still be used to weed out topics that don't refer to actual news events. "For example, the topic #lunch is talked about during lunchtime every day around the world, but will not be a trending topic," it said.

Another example presented itself on Monday, when a fake news story about Fox News host Megyn Kelly appeared in the Trending Topics section and was called out by a number of journalists and other users.

The real point, however, is that simply moving from human editors to algorithms isn't going to change the reality of whether Facebook's news feed and trending algorithms are biased. Whether human beings or computer software choose which items qualify as interesting or newsworthy, that decision automatically excludes certain other things.

Maybe the algorithm and the human editors will both exclude topics like #lunch, but they may also exclude other things, and most users will never know. That creates a potential risk for a social network that seems to want to become a hub for journalistic content.

In the aftermath of the shooting of a black man in Ferguson, Mo., in 2014, the Trending Topics feature showed most users nothing about the event; instead, it showed innocuous posts about the "Ice Bucket" challenge. Was that because most users weren't sharing posts about Ferguson? Facebook would undoubtedly say yes, but the simple fact is that we don't know.

These problems can become recursive as well. Even if Trending Topics faithfully represents what people are actually sharing or interacting with the most, if Facebook's news feed algorithm hides or even excludes certain types of posts (which it routinely does), then those posts will never trend.

The bottom line is that Facebook's programmers, who in a very real sense are also editors, are choosing what we see and when, and that has very real implications, not just for journalism but for society as a whole.