Investor Insights is an Ocrolus guest blog series featuring prominent voices from the investment community. The series highlights industry trends, funding culture, and observations about the fintech space.
David is a computer scientist, investor, and philanthropist with decades of experience in investment management. He holds a PhD in Computer Science from Stanford University, where his thesis on Natural Language Parsing as Statistical Pattern Recognition was an early and successful attempt to use large-scale data to produce fully automated syntactic analysis of text.
As a venture capital investor in early-stage, data-science-oriented technology start-ups, I see investment opportunities in nearly every industry you could imagine: real estate, human resource management, insurance, supply chain management, shipping, banking, advertising, the list goes on and on. Every one of these industries is abuzz with the same two questions: What data can we collect, and how can we use that data to improve our businesses and increase our profitability?
The fundamental assumption in these inquiries is that every industry has seemingly limitless untapped data sources that contain information that can be used to improve businesses by increasing efficiency, reducing costs, identifying profitable opportunities, and generally improving decision-making. Furthermore, they assume current machine learning technologies, enabled by inexpensive and readily available computer technology, can help extract this information and can be used to implement software solutions to realize these benefits. And in every one of those industries, those assumptions are undeniably true.
It seems like we are entering a utopian age for data science applied to capital enterprises. However, as with all perceived utopias, there is a devious snake and a forbidden fruit lurking in our midst. In this case, the forbidden fruit is human behavioral data and the devious snake is our inclination to use that human behavioral data to benefit our businesses at the expense of the human beings we are supposed to be serving with our products.
Consider the example of personal medical information. In 1996, the United States government enacted a federal law, the Health Insurance Portability and Accountability Act (HIPAA), which encouraged the sharing of medical information across the healthcare and insurance industries. As this information became digitized and easier to share promiscuously, it became clear that HIPAA had created a problem that endangered the welfare of the very people it was intended to help. In response, the government issued the HIPAA Privacy and Security Rules, the first of which took effect in 2003, to protect this digitized data from misuse. The theory behind these rules was that personal data could be monetized by certain parties, including insurers, employers, and hospitals, to improve those parties' profitability at the expense of the patients whose data was being mined.
Fast forward to 2019 and consider the case of human behavioral data (HBD). Devices that collect data about human behavior are everywhere: cell phones track and transmit our location; video cameras record our actions in public and private spaces; credit card companies, banks, and retailers collect and share our spending behavior; social media platforms catalogue nearly every aspect of our lives. Valuable HBD is available to nearly every industry, and it is largely left up to businesses' sense of corporate social responsibility to limit the use and abuse of HBD.
There is no shortage of examples of corporations making ethically questionable choices about the use of HBD to benefit their businesses:
- Facebook and its Cambridge Analytica scandal
- Amazon using its Echo device to record private conversations
- IBM aggregating location data from weather apps to build valuable models to sell to the highest bidders
- Apple, Samsung, Facebook, and Twitter “brain hacking” you, engineering your phone, apps, and social media feeds to get you hooked
The list goes on and on. As these abuses accumulate, one thing is inevitable: eventually, human behavioral data use will have to be regulated.
If this all sounds like the rantings of a data privacy zealot, you wouldn’t be far from the truth. But, ignoring the messenger for a moment, there are significant implications for everyone who invests in businesses whose revenues depend in one form or another on the monetization of human behavioral data.
First, from a corporate social responsibility perspective, we ought to consider the ethical and moral implications of the proposed uses of human behavioral data by any company we invest in. This is a tall order, since the long-term implications of those uses are not always clear. And ethics and morals are always complicated when significant amounts of profits (or losses) are involved. Nonetheless, it is incumbent on us to remember that we create businesses to serve their customers as much as their shareholders, and we need to take both parties into account when making investment and business decisions. And let’s not ignore the employees, whose loyalty to management and willingness to implement our business plans should not lead them into inadvertently violating their personal moral compasses.
Second, and more cynically, investing in companies that intrinsically depend on unethical, immoral, and abusive uses of human behavioral data will likely lead to financial losses in the fullness of time. Just as with HIPAA and private health information in the United States, governments around the world will eventually figure out how to regulate and restrict the use of human behavioral data, making HBD-abusive business models unprofitable and possibly illegal.
Whenever I come upon a company that is likely to be successful and profitable in the current regulatory environment, but appears to be in jeopardy of running afoul of as-yet-unwritten but likely regulations, I give its leadership the same guidance. Do everything you can to run your business in a way that is competitive and operates within the perceived ethical, moral, and legal standards of the day, which is admittedly a very low bar today. But make sure you have a plan for pivoting your business model into compliance with the regulations that will likely limit, restrict, or eliminate the promiscuous use of human behavioral data. And if you can’t figure out a way to pivot away from HBD abuse, then don’t expect savvy investors to consider investing in your business.