There is no legal obligation to ship secure code, and most companies survive data breaches without real consequences. All too often, companies decide that security best practices aren't worth the extra resources. And corporate responsibility is often framed narrowly as an obligation to shareholders. But as customers, employees, and community members, don't we want to see more than that?
This talk explores the obligations that companies have to their user base, and the ways that community expectations can lead to stronger security practices. We'll begin by exploring the nature of community and corporate obligation, drawing on traditional philosophical approaches across cultures. Some examples we'll explore:
Even young, scrappy crypto companies will not launch until they have a pen test, referred to as an "audit report." No legislation requires this, but it has become part of the culture. What can we learn from this, and could it encourage adoption of similar practices in the broader startup community? (Is that even desirable?)
It's accepted that social media companies minimize the use of full-time moderators, because moderation is expensive. But this comes at a real psychological cost to users. Companies like Facebook and Twitter failed to stop the spread of a violent viral video on March 14th and 15th, despite requests from authorities in New Zealand and complaints from sensitive customers worldwide. What were Facebook and Twitter's obligations here? How does cost factor in?
This talk offers a thoughtful overview of the security landscape and current events, leaving the audience with a better framework for evaluating corporate obligations and advocating for improved security practices.