We were recently interviewed on the topic of the impact that human-centred analytics might have on personal privacy. Of course, personal privacy was front of mind at the very inception of SWOOP, and we quickly adopted the privacy tag line “SWOOP not SNOOP”. That said, we are also mindful of the power of the insights that can be drawn about individuals from seemingly innocuous data, which presents a very grey area for privacy protection.
In the consumer world we are now aware that when we accept something for free, e.g. the use of a free app or free membership of a rewards program, we are in fact paying with our personal details and our acceptance that we will be ‘advertised to’. Ray Wang from research and advisory firm Constellation Research describes it succinctly: “if you haven’t paid for the product, then you are the product”. We have accepted the benefit and paid with diminished privacy. But there are also situations, it could be argued, where we have not willingly entered into a transaction to trade our privacy. Monitoring systems in shopping malls, and increasingly even corporate offices, can track personal movements. While these are typically installed for security reasons, the data could still be available for more privacy-invading forms of tracking. Being tracked for bathroom, coffee or smoking breaks would certainly fail the ‘pub test’ for privacy invasion, but would likely not breach the letter of the law.
Inside the Enterprise, the idea of tracking ‘digital footprints’ is not new. In fact, email tracking has been an academic pursuit for decades, and even received a boost when the US government ordered the release of failed energy company Enron’s email archives for academic use. In the commercial world, email archives are typically the legal property of the Enterprise, so the individual has little if any right to their own company email archive and, by extension, to their corporate instant messages, social networking, video conferencing, telephone calls and the like. What we have found, however, is that despite their legal rights, most organisations will draw a line if they believe their employees may feel their privacy is being compromised. In fact, as a vendor, we are regularly faced with quite exhaustive security reviews, sometimes taking many months to complete, before our product is allowed to be installed.
Privacy by Design
Organisations’ respect for personal privacy is now increasingly an ethical issue, more so than a legal one. Losing the trust and respect of one’s employees can be far more damaging to a business than even a formal legal breach. We are fortunate at SWOOP to have been associated with the Ethics Centre (formerly St James Ethics Centre), which has helped us navigate some of the grey areas around the ethical use of our analytics solutions. The general advice we have gained is that privacy needs to be part of the initial product design. Some of the principles we designed into SWOOP are:
- Personal privacy needs to be protected. This has led us to secure personal-level analytics to the individual concerned;
- Transparency and simplicity. We have shied away from complex and opaque algorithms, so that, for the most part, end users can see and understand how measures are being derived;
- Personal empowerment. From the very start we have designed SWOOP to be accessed by all staff, and not just management, as is often the case. Our desire is to equip Enterprise staff by showing them explicitly how they are working, so that they are able to change the way they work;
- No exploratory data mining. We collect data with a specific purpose: to identify the types of personal relationships that enhance collaborative performance. In fact, our name SWOOP was inspired by the very specific and targeted relationship data we are seeking: to SWOOP in and SWOOP out.
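To make the ‘transparency and simplicity’ principle concrete, here is a minimal, hypothetical example of the kind of easily explained measure network analysis can produce: a reciprocity ratio over interaction data. The function name and data are invented for illustration and are not SWOOP’s actual algorithm.

```python
# Hypothetical illustration of a transparent collaboration measure.
# This is NOT SWOOP's actual algorithm; metric and data are invented.

def two_way_ratio(sent, received):
    """Share of a person's contacts whose interactions are reciprocated.

    `sent` / `received` are sets of user IDs the person messaged /
    was messaged by. No message content is needed, only who-to-whom.
    """
    contacts = sent | received
    if not contacts:
        return 0.0
    reciprocated = sent & received
    return len(reciprocated) / len(contacts)

# Example: Alex messaged B, C and D, and heard back from B and C only,
# so 2 of Alex's 3 contacts are two-way relationships.
print(two_way_ratio({"B", "C", "D"}, {"B", "C"}))
```

A measure like this can be explained to an end user in one sentence, which is the point of the principle above.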
As we are exposed to similar developments in the industry, we are encouraged to see that others are developing similar privacy principles. For example, IBM’s Marie Wallace has designed a ‘privacy by design’ framework that has been successfully employed inside IBM.
In the consumer world, providers of personal fitness devices such as the popular Fitbit have access to significant personal data that is protected by their privacy policies. However, like many health organisations, they reserve the right to use de-identified data to support research and the development of new health solutions. Analytics data like Fitbit’s is a boon for health researchers, and individual Fitbit users are typically supportive of this use of their data. At SWOOP, all the data that reaches our servers is already de-identified. This is partly because our core analytical technique, Social Network Analysis (SNA), does not require message content or personal information to derive its insights. Where required, re-attaching identifying information happens within the client’s firewall, using browser-level APIs. Like Fitbit, we are also using this de-identified data for the ‘global good’, through collaborative benchmarking research. Through this research we are looking to gain insights into the most productive collaborative online behaviours, for the benefit of the industry as a whole. We have recently signed a research agreement with the University of Sydney’s Digital Disruption Research Group to accelerate this work.
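As an illustration of the de-identification flow described above, here is a minimal Python sketch, assuming salted one-way hashing of user IDs: the pseudonymised network data can leave the organisation for analysis, while the lookup table needed to re-attach identities stays behind the client’s firewall. All function names, salt handling and data shapes here are assumptions for illustration, not SWOOP’s actual implementation.

```python
# Hypothetical sketch of client-side de-identification: interaction records
# are pseudonymised before leaving the organisation, and the mapping needed
# to re-attach real IDs never leaves the client's firewall.
import hashlib


def pseudonymise(user_id: str, salt: str) -> str:
    """Replace a user ID with a salted one-way hash."""
    return hashlib.sha256((salt + user_id).encode()).hexdigest()[:12]


def deidentify_edges(edges, salt):
    """Map (sender, receiver, count) records to pseudonymous IDs.

    Returns the de-identified edges plus the client-held lookup table
    later used to re-attach real IDs behind the firewall.
    """
    lookup = {}
    out = []
    for sender, receiver, count in edges:
        s, r = pseudonymise(sender, salt), pseudonymise(receiver, salt)
        lookup[s], lookup[r] = sender, receiver
        out.append((s, r, count))  # no message content, no personal details
    return out, lookup


edges = [("alice@corp", "bob@corp", 5), ("bob@corp", "carol@corp", 2)]
anon_edges, lookup = deidentify_edges(edges, salt="client-secret")
# anon_edges can be analysed off-site; `lookup` never leaves the client.
```

The key design point is that the network structure SNA needs (who interacts with whom, and how often) survives pseudonymisation intact, so the analytics lose nothing while the server holds no identities.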
So is it worth trading personal privacy for personal benefit?
Our view is that this is not always necessary. We have seen enough to believe that the industry is moving toward a situation where individuals can receive the benefits of human-centred analytics without needing to trade off their own personal privacy. And it is important for this balance to be sustained if we are to achieve the full benefit of the social analytics now available to us.