We are always on the lookout for potential customers who have a data problem. We listen to their stories and assess whether Sharemind is the best fit for their data needs. Sometimes, these great data stories find us. One such story comes from the plains of the Midwest: Asemio, one of our US partners, is a company committed to social good and data privacy.
Asemio’s Co-founder and Managing Partner, Aaron Bean, took the time to tell us more about his company and why privacy preserving technologies like Sharemind are important.
Tell us a little about Asemio's history.
Asemio was created from a desire to reconcile two worlds. In 2009, after a few decades in technology consulting, I answered a call to public service and joined the United States Peace Corps. While stationed in Kazakhstan for almost three years, I navigated a lot of unfamiliar terrain. Instead of daily standup meetings and software development milestone reviews, much of the work I was doing was establishing human connections. In Kazakhstan, I learned that to be effective in my capacity-development role, I needed to become a part of the community. This lesson in cross-cultural understanding would stay with me when I returned to the States a few years later, and I found myself forced to reconcile the radical extremes of my professional experience. I had experienced the sophisticated infrastructure and profit-driven paradigm of a Fortune 500 company, as well as the entrepreneurial and richly human model of the Peace Corps. I wanted to be a part of a technology organization that married humanitarian and financial goals rather than setting them at odds with each other. When I couldn’t find that organization, I built Asemio.
What types of projects do you look for?
The humanitarian component of Asemio’s mission has led us to collaborate with partners who are focused on community development. Our partners include collective impact backbone organizations, university programs, philanthropic entities, and other community-based organizations.
The services we provide for these partners are split into three camps: implementing new technology systems that increase efficiency for direct service providers; creating innovative systems to support the analysis of integrated data; and assisting communities in developing an integrated data and technology vision for the future.
How did you find Sharemind?
One of the first data integration efforts I participated in after entering the social impact sector aimed to design a foundational architecture for future community data integration work. We assessed advanced community data integration architectures and sought to connect social determinants of health providers into an existing data warehouse. We spent years in discussions with community stakeholders and invested heavily in legal advice and technology. At the end of the project, we had gained insights that could inform possible future plans, but felt that our investment was disproportionate to the resulting data and technology infrastructure. The tension between the need to obtain private data for increased efficacy of analysis and the need to protect the identity of vulnerable populations became evident to us. We asked ourselves, “How can we support a system that enables community analysis of integrated data faster, with fewer dollars, and in a manner that enhances privacy protection for the individuals contributing data?”
Specifically, we were interested in linking records of individuals from disparate systems while protecting their privacy. Our engineers had an instinct that modern cryptosystems could offer a solution. In our research, we identified homomorphic cryptography advances that would allow us to create new technologies that could meet our needs by protecting input values and sharing aggregate results. As we considered the available cryptosystems, we found Sharemind to be set apart by its security protections, industry acclaim, and commercial-ready tooling.
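To make the underlying idea concrete, the sketch below shows additive secret sharing, the kind of building block behind such systems: each sensitive input is split into random-looking shares held by separate parties, so no single party learns the value, yet an aggregate can still be computed and reconstructed. This is a simplified illustration rather than Sharemind’s actual protocol; the three-party setup, modulus, and example values are assumptions made purely for the sake of the example.

```python
import secrets

MODULUS = 2**32  # illustrative choice; a real protocol fixes this up front

def share(value, parties=3):
    """Split an integer into additive shares; each share alone looks random."""
    shares = [secrets.randbelow(MODULUS) for _ in range(parties - 1)]
    last = (value - sum(shares)) % MODULUS
    return shares + [last]

def reconstruct(shares):
    """Only the sum of all shares reveals a value."""
    return sum(shares) % MODULUS

# Two organizations secret-share their sensitive counts (hypothetical numbers).
org_a_shares = share(128)   # e.g. clients served by organization A
org_b_shares = share(75)    # e.g. clients served by organization B

# Each party adds the shares it holds, never seeing the underlying inputs.
aggregate_shares = [(a + b) % MODULUS for a, b in zip(org_a_shares, org_b_shares)]

# Only the reconstructed aggregate (128 + 75 = 203) is ever revealed.
print(reconstruct(aggregate_shares))  # 203
```

In a real deployment, each share would live on a different server operated by an independent party, and the computation would run inside a framework such as Sharemind rather than in a single process.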
How have you integrated privacy enhancing technologies into your projects?
Our first step toward integrating privacy enhancing technologies was to analyze their upstream impact on adherence to HIPAA, FERPA, and 42 CFR Part 2. That initial analysis found both the technological and legal foundations to be sound.
To explore the potential of secure multi-party computation in community analytics, Asemio initiated a pilot project in collaboration with two social impact organizations serving families in Tulsa, Oklahoma. These organizations were interested in using secure multi-party computation to derive insights from shared data without sharing personally identifiable information. Both organizations rely on data-informed decision making, and obstacles to data sharing decrease their effectiveness in serving the community. Both organizations also work with some of the most vulnerable populations in Tulsa. This situation provided an opportunity to test our assumption that privacy preserving technologies could relieve the tension we identified between privacy and data sharing.
The results of the initial pilot were encouraging. The automated privacy preserving analysis of encrypted data produced the same result as a manual analysis on plaintext, indicating that the end result, a count of unique individuals represented in two data sets, was correct. Ensuring correctness in a controlled study was a necessary first step to ascertaining the viability of the technology for larger and more complex applications. We believe that secure multi-party computation has the potential to mitigate many of the risks of sensitive data sharing while preserving many of its benefits, and we have already begun executing a second project and planning a third.
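For reference, the quantity validated in the pilot, a count of unique individuals represented in two data sets, corresponds to the plaintext calculation sketched below: deduplicate linked identifiers across both data sets and count them. In the pilot, this reference value was computed manually on plaintext and compared against the output of the secure computation; the identifiers below are purely hypothetical.

```python
# Hypothetical record identifiers held by two organizations. In the pilot,
# identifiers like these were never pooled in plaintext: the secure
# multi-party computation produced the same count without revealing them.
org_a_ids = {"A103", "B207", "C334", "D415"}   # individuals known to organization A
org_b_ids = {"B207", "D415", "E512"}           # individuals known to organization B

# Count of unique individuals represented across both data sets.
unique_individuals = len(org_a_ids | org_b_ids)
print(unique_individuals)  # 5 (two individuals appear in both data sets)
```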
Continuing from our initial pilot, our second project is funded by Data Across Sectors for Health (DASH), a national program of the Robert Wood Johnson Foundation. For this project, we are partnering with a cross-sector collaborative to apply this innovative analytics technology to the problem of quantifying overlap in the sets of individuals served by providers from multiple sectors in Tulsa. Our goal is to further test our assumptions that using secure multi-party computation will encourage organizations that historically have not been able to share data to do so, simplify governance processes for HIPAA- and FERPA-regulated data, and increase trust in collaborative partnerships.
Where do you see privacy technologies, like Sharemind, having an impact in the US?
Data has become a central component of almost every community-focused strategy, often acting as the linchpin of outcomes-improvement work, systems-level change initiatives, pay-for-success projects, and social impact strategies. Providing a technical answer to increased privacy concerns is paramount.
Several industry-wide indicators make the case that privacy technologies like Sharemind will become more popular in the near future. The World Wide Web Foundation, through its “Contract for the Web,” has called for governments, businesses, and citizens to take a more intentional role in shaping the future of the web. The contract specifically asks companies to “respect consumers’ privacy and personal data.” Facebook is building a “privacy-focused platform” around several of its key products, including WhatsApp, Instagram, and Messenger. The impact of the European Union’s General Data Protection Regulation (GDPR) is starting to reverberate and add to the voices calling for increased clarity with regard to privacy regulations. Alongside efforts at the US federal level, California has already passed the California Consumer Privacy Act, and Washington is evaluating the Washington Privacy Act. The proposed legislation in Washington state mirrors several elements of the GDPR and, if passed, would make its privacy protections some of the strongest in the world.
The combined efforts of industry watchdogs, big business, technology-using citizens, and domestic and international regulators all point to the same place. It is not a matter of if, but rather when and how improved privacy protections will be implemented in the United States.
As our society moves to embrace this industry shift, technologies like Sharemind are poised to give us the new tools we need to architect more privacy preserving applications.
What do you believe is the future of data analysis and how important is protecting people's personal data?
Asemio is committed to the idea that the hard work of building trust must precede investments in data infrastructure. A community must first establish a common vision, build mutual trust, and gather the resources it needs. Then it can implement appropriate community governance models, ethical and legal protections, and algorithms that advance community efforts. When these cultural and governance prerequisites are in place, data systems infrastructure can rise to meet the community’s vision. It follows that privacy protection is paramount, especially when working with vulnerable populations. The conversation must shift from whether we invest more in privacy or in the efficacy of analytical results to how we invest more in privacy guarantees while increasing the efficacy of population-level health analytics.
The Sharemind team is here to help solve your data challenges, so you can write the next chapter in your data story. To find out more about Asemio, visit their website.