(Advanced Discussion) Is Big Data Spreading Inequality?
Admin  |  14-08-26 19:13


Is Big Data Spreading Inequality?
Social media companies depend on selling information about their users' clicks and purchases to data brokers who match ads to the most receptive individuals. But the Federal Trade Commission and the White House have called for legislation that would inform consumers about the data collected and sold to companies, warning of analytics that have "the potential to eclipse longstanding civil rights protections." Does the collection of data by companies threaten consumers' civil rights?
* match = to pair up with; to be a very close fit / receptive = open and willing to accept / call for ~ = to demand or require ~ / eclipse = to overshadow, to outshine; to obscure / longstanding = having existed for a long time / civil rights = citizens' rights to equal treatment

Discussion question: Does the collection of (personal) data by companies threaten consumers' civil rights?

1. The Dangers of High-Tech Profiling
Big data systems have been used to target minorities. But these systems can be used to help them.

2. A Way Toward Greater Equality
Big data can also advance the interests of minorities and actually fight discrimination.

3. Implement 'Technological Due Process'
Oversight of scoring algorithms would go a long way to ensure their fairness and accuracy for both government and private systems.

4. It Can Be Used for Good in the Community
Prescriptions for our most pressing social issues emerge from the patterns found in the bonanza of collected data points.

5. Losing Out on Jobs
When companies use patterns in large datasets to hire employees, they may unknowingly rely on previous poor decisions.

6. Extending Credit Through Data
Meaningful data such as on-time rent and bill payments, or even payday loan repayments, do not make it into traditional credit bureau data files.


Sample Essay

Big Data Should Be Regulated by 'Technological Due Process'

In our increasingly scored society – where algorithms turn our browsing habits, click patterns, purchases and GPS location data into ratings and predictions of who we are – it is very difficult for those who are mislabeled, or tagged in an undesirable way, to break out of their scoring prisons, in part because they are usually unaware they are being reviewed.

When the government makes important decisions that affect our life, liberty and property, it owes us "due process" – understood as notice of, and a chance to object to, those decisions. Unlike the government, private companies have no obligation to tell us about their scoring systems. Nonetheless, all predictive systems should be subject to fairness requirements that reflect their centrality in people's lives.

Oversight of scoring determinations – a sort of "technological due process" – would go a long way to ensure their fairness and accuracy for both government and private systems. In the case of governmental automated systems, their opacity has frustrated due process guarantees. Some systems like the "no-fly" list adjudicate in secret, while others lack record-keeping audit trails, making review of the law and facts supporting a system's decisions impossible.

When it comes to the scoring systems of private companies, consumers have no inkling that they are being scored as "depression inclined," "poor candidates" or "potentially pregnant" based on their online activity. They do not know their resumes are excluded from talent lists due to their Internet browsing habits.

The best way to ensure the fairness of scoring systems is through routine auditing by an expert agency. Much like the I.R.S. does with taxes, the Federal Trade Commission could randomly select private scoring systems for review on an annual basis. F.T.C. technologists could run expected and unexpected hypothetical scenarios to assess whether algorithmic predictions are statistical proxies for race, gender, religion and disability – thereby cutting down the possibility that the algorithms infringe on civil rights. The ever-present threat of an audit would encourage the adoption of precautions and, perhaps, encourage entities that are building scoring systems to be more mindful of concerns about discrimination and inaccurate predictions based on polluted data.
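
As a rough sketch of how such a proxy audit might work in practice (the scoring function, field names and flagging threshold below are hypothetical illustrations, not anything described in the essay), an auditor could feed a scoring system simulated consumer records and measure whether its outputs track a protected attribute:

import random
import statistics

def score_fn(record):
    # Stand-in for the private scoring algorithm under audit.
    return 0.6 * record["late_payments"] + 0.4 * record["pawn_shop_visits"]

def proxy_audit(records, protected_key, threshold=0.3):
    # Flag the system if its scores correlate with a protected attribute.
    scores = [score_fn(r) for r in records]
    attrs = [float(r[protected_key]) for r in records]
    corr = statistics.correlation(scores, attrs)  # Pearson r (Python 3.10+)
    return corr, abs(corr) > threshold

# Hypothetical audit scenarios: simulated records with a binary
# protected attribute and an input that quietly tracks it.
random.seed(0)
audit_records = []
for _ in range(500):
    group = random.randint(0, 1)
    audit_records.append({
        "protected_group": group,
        "late_payments": random.gauss(2 + group, 1),
        "pawn_shop_visits": random.gauss(1, 0.5),
    })

corr, flagged = proxy_audit(audit_records, "protected_group")
print(f"score/attribute correlation = {corr:.2f}, flagged = {flagged}")

A high correlation would tell the auditor that the score acts as a statistical proxy for the protected attribute even though the attribute itself is never an input to the model.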

Precedent exists for such agency review. In 2008, the F.T.C. invoked its "unfairness authority" against CompuCredit – which marketed credit cards to people with subprime credit – for reducing users' credit limits based on an undisclosed behavioral scoring model that penalized cardholders for certain transactions, including visits to pool halls and pawn shops, and personal counseling appointments.

As the demand for big data grows, we need to pierce the secrecy behind the systems that determine how we fit into society. Hard and fast rules are not the answer: Predictive systems are built on correlations and algorithms that change dynamically. But procedural regularity is essential to prevent "arbitrariness by algorithm."