Should Companies Tell Us When They Get Hacked?
Administrator  |  13-02-25 16:51


The New York Times Company and others have come forward in recent weeks to say that they were hacked. Although hacking is common, it's rare for companies to talk about it — even though doing so could warn customers about compromised data, or alert other businesses to a particular threat. Should companies be required to disclose security breaches?
* come forward = to step forward, to volunteer information / hack (into sth) = to break into (a computer system) / compromise = to expose (data) to risk or unauthorized access / alert (sb to sth) = to make (sb) aware of (sth) / security breach = a failure or violation of security

Should companies disclose security breaches to the media?

1. A National Priority and a Business Priority
Like pollution, an insecure infrastructure is bad for everyone. Disclosing security breaches reintroduces the public's collective interest.
 
2. Disclosure Plays Into Hackers' Hands
By requiring companies to disclose security breaches, we would be requiring them to telegraph their weaknesses to the world.
 
3. Investors Need to Know
It's not that the events aren't happening. It's that you're not being told about them, despite the S.E.C.'s instructions.
 
4. We Need Better Notification Laws
Existing laws focus mainly on whether a breach exposed information, not why it happened. Yet the reason can be critical.
 
5. More Disclosure Is Not Always Better
People may exaggerate the risk of hacking if they suddenly start hearing about a lot of it. Or they may get used to frequent reports and underestimate the risks.
 
6. Improve Digital Hygiene
As a nation we devote few resources to creating a common social understanding of how to keep people and computer systems safe.


Sample Essay

More Disclosure Is Not Always Better

Psychological research offers no clear prediction about the effects of disclosing more about computer hacking. All it has are various relevant behavioral principles whose net impact depends on the circumstances.

One of those principles is that people judge risks by how easy it is to recall and imagine them happening. However, memory and imagination are imperfect guides. As a result, people may exaggerate the risk of hacking if they suddenly start hearing about a lot of it.

Another principle, leading to contrary predictions, is that people get used to frequently reported events. As a result, they may underestimate the risks of hacking, if they hear about it all the time and nothing horrible happens.

Emotions play a role, too – sometimes helpful, sometimes not. So do the social cues that people take away from others' behavior. Are my friends worried? Do they know any more than I do?

Trust in those making the disclosures makes a difference, too. Why are they suddenly telling me about all their hacking problems? Are they trying to work me? What are they still hiding? Are the regulators finally doing their job?

Questions of risk and decision making are complicated, especially with complex, novel, uncertain, technical issues like hacking. Anyone who makes confident predictions is likely to be disappointed – and to disappoint those who trusted them.

Rather than try to predict, why not require disclosures that provide people with the information they need: namely, how big the risk is overall and when they are personally vulnerable?

Such disclosures would need to be standardized, both so that people have a chance to learn about hacking and so that forthright firms are protected from those less inclined to be forthcoming.

Level with people, in a comprehensible way, and trust them to figure out how to reward the firms that provide the most security. That's the best that we can do. Acting as though people can't handle the truth will only breed distrust and undermine the purpose of disclosure.