Balancing Transparency and Privacy in This Broken World



— by Christian Schnedler

I have spent the majority of my career innovating in the gap between the ever-more-sophisticated tactics used by criminals and terrorists and the staid methods employed by the government agencies charged with maintaining law and order. This journey has led to stints as an entrepreneur, international technology consultant, government employee, and now senior executive. Throughout, I have been guided by a vision -- my “Leaf by Niggle” (Tolkien, 1938), as it were -- that in the life hereafter we will once again walk unclothed and unashamed, with neither the taint of sin nor the self-awareness that proceeded from it interfering with our revelation to one another and to God. Yet this life demands a balance between transparency and privacy: a balance in need of a solution that facilitates a more robust, redemptive dialogue between the government and the governed. My hope is that the result will be transparent conversations focused on the fundamental decisions that must be made and enforced by each society, rather than on sensationalized examples and concepts taken out of context, without any real bearing on the true issues at hand. It is a solution that I have been advancing in the various vocations God has called me to, but one that remains in dire need of support from others willing and able to answer the call.

As a case in point: Americans are demanding greater transparency regarding the use of emerging technologies by the criminal justice community, including facial recognition and artificial intelligence applied to surveillance cameras and other Internet of Things sensors. Most public debates end in polarizing conclusions and unverifiable accusations that such technology facilitates the systemic oppression of at-risk communities. In the face of this public outcry, the criminal justice community’s adoption of these technologies has become unnecessarily opaque and erratic across jurisdictions (stagnant in some and growing unrestrained in others). Not only has this diminished the tangible benefit law enforcement realizes in its fight against crime and terrorism, but it has also pushed the private industry manufacturing this technology to look elsewhere for guidance on how the technology should mature and evolve — including to countries that do not share the societal norms of the United States.

In reality, the criminal justice community is naturally compelled to exercise great care in the selection and operation of technical solutions to the threats posed by criminal and terrorist actors. Law enforcement budgets are limited and procurement is an arduous, Byzantine process. Leaders cannot afford (politically or financially) to make large bets on technology programs with a high risk of failure. This leads to a “late majority” adoption model, with expenditures directed toward technology platforms already proven in the commercial sector. Due care is also taken to ensure that any new innovation conforms to the even more outdated expectations of the courts, including explicit or implicit policies that analogize its methods and work product to prior art in the law enforcement playbook. Yet this working definition of appropriate use is often poorly communicated (e.g., a lack of publicly available policies), inconsistent (e.g., no standards spanning jurisdictions), and untimely (e.g., reliance on common law decisions). This leads external observers, especially those predisposed to question the earnestness of the government, to conclude that criminal justice leaders are out of touch at best and hiding something at worst.

Moreover, the opaque and unregulated nature of the criminal justice system’s use of technology leads to needless waste, broken families, and lives lost. As I once explained to a panel reviewing law enforcement’s use of facial recognition: if a bomb goes off in Midtown Manhattan, the people near the incident will not escape investigation simply because there is no tool to aid the process. Rather than relying on technology to expedite the identification of suspects intent on causing harm, dozens if not hundreds of law enforcement officials spanning multiple government agencies will be brought in to manually review everything they can find leading up to the incident. Countless hours will be spent poring over information, and with each passing second the likelihood of apprehending those responsible before they can do further harm is reduced. The families of those serving in the criminal justice system are likewise materially affected, and the trauma officers experience reviewing the darkest parts of humanity can be lifelong.

The solution I am advancing addresses issues such as this by refocusing and reframing the conversation on the fundamental topics that should be transparently reviewed and agreed upon in light of societal norms and the brokenness of this world.

This solution includes the notion of persons deliberately and authoritatively identifying themselves within a population of people. These persons have individual rights and histories, and together they form broader communities that exhibit emergent properties all their own. As an avowed believer in democracy and the freedoms engendered by the United States Constitution, I believe that the default state of these persons should be one of privacy, as no government or commercial entity has the right to peer through a one-way mirror at its fancy. Yet as a Christian, I understand that the evil of this world perverts such freedoms and leads certain persons to commit unspeakable crimes under this same shroud of privacy.

The solution I am advancing therefore also enables the government’s indisputable identification of individual persons within a population of people. The government, through its agents (who are also persons), must follow transparent processes to obtain the authority to remove the shroud of privacy and identify persons of interest. The shroud of privacy resumes for all other persons. To prevent abuse, the manner in which the identification process was followed must be made transparent as part of the trial. To prevent bias, an aggregated view of the identification process and the courts’ determinations must be made transparent once a verdict has been reached and the investigation closed.
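For readers who find a concrete sketch helpful, here is one hypothetical way the essential records of such a process might be structured in code. Nothing below describes an existing system; every class and field name (CourtOrder, IdentificationEvent, CaseAudit, and so on) is an illustrative assumption of mine, offered only to show how authority, identification, and after-the-fact transparency could be tied together.

```python
# A hypothetical sketch (not an existing system): the minimal records a
# subpoena-gated identification process might keep so that it can later be
# disclosed at trial and audited in aggregate after a verdict.
from dataclasses import dataclass, field
from datetime import datetime
from typing import Optional

@dataclass
class CourtOrder:
    """Transparent authority to lift the shroud of privacy for persons of interest."""
    order_id: str
    issuing_court: str
    case_id: str
    scope: str                      # e.g., "persons within 200m of the incident"
    issued_at: datetime
    expires_at: datetime

@dataclass
class IdentificationEvent:
    """One act of de-anonymization, always tied to the order that authorized it."""
    person_pseudonym: str           # a stable anonymous identifier, never the real identity
    order_id: str
    performed_by: str               # the agent (also a person) who acted
    performed_at: datetime
    re_anonymized_at: Optional[datetime] = None   # privacy resumes for all but persons of interest

@dataclass
class CaseAudit:
    """The record disclosed at trial and aggregated for public review after the verdict."""
    case_id: str
    order: CourtOrder
    identifications: list[IdentificationEvent] = field(default_factory=list)

    def aggregate(self) -> dict:
        # The bias check: how many persons were identified, and how many were
        # restored to anonymity once their relevance was ruled out.
        total = len(self.identifications)
        restored = sum(1 for e in self.identifications if e.re_anonymized_at is not None)
        return {"case_id": self.case_id, "identified": total, "re_anonymized": restored}
```

The design choice this sketch is meant to surface is simple: no identification exists without a reference to the order that authorized it, and the aggregate view is computable from the very records disclosed at trial.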

In other words, the solution I am advancing reconciles the subpoena process employed by the criminal justice system in the United States with the reality of today’s technology-driven world. Technology has lowered the barrier to achieving both true anonymity and complete transparency. Yet the solutions that achieve these ends are themselves shrouded in technical jargon and secrecy, leaving them poorly defined and effectively unregulated. The solution must take a form that is easy to understand and manage. It must also account for the fundamental forces at play, forces that date back to the time of Adam and Eve.

Here’s a glimpse of how the solution I am advancing could work, using the same lightning rod example of facial recognition cited before: 

In the future, we should all go about our lives in complete anonymity to both the governments that govern us and the businesses we patronize. Our governing bodies should have a transparent rubric that defines criminal behavior and that has been translated into technical rules. When a crime occurs that justifies identifying the otherwise-anonymous actors, law enforcement can request a subpoena from the courts that de-anonymizes only those actors in close proximity to the event. As the investigation narrows, anonymity returns to those whose proximity was mere happenstance. Once the case has been presented to a jury and the verdict rendered, the anonymity of those involved also returns as before -- albeit with the guilty receiving an indicator in their anonymized profile. The anonymized records of the population remain generally available, encouraging privacy advocates and academia to continually vet the efficacy of the system and provide feedback for improvement.
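Continuing that hypothetical sketch, the scenario above might flow as follows in code. Again, every function and field name here is an assumption for illustration, not a real system or API; the point is only to show that proximity-gated de-anonymization, the return of anonymity, and an anonymized conviction indicator are straightforward to express once society agrees on the rules.

```python
# A hypothetical continuation of the sketch: how the facial-recognition scenario
# above might flow. All names are illustrative assumptions, not a real API.
from dataclasses import dataclass, field
from math import dist

@dataclass
class AnonymousProfile:
    pseudonym: str                        # all the government or a business ever sees by default
    location: tuple[float, float]         # e.g., a camera-derived position near the incident
    flags: list[str] = field(default_factory=list)   # e.g., a conviction indicator, never a name
    identified: bool = False              # True only while a court order is in force

def deanonymize_near(population, incident, radius, court_order_id):
    """Under a specific court order, lift anonymity only for those close to the event."""
    persons_of_interest = []
    for p in population:
        if dist(p.location, incident) <= radius:
            p.identified = True
            persons_of_interest.append((p.pseudonym, court_order_id))
    return persons_of_interest

def restore_anonymity(population, still_of_interest):
    """As the investigation narrows, anonymity returns to those whose proximity was happenstance."""
    for p in population:
        if p.identified and p.pseudonym not in still_of_interest:
            p.identified = False

def close_case(population, convicted):
    """After the verdict, anonymity returns to all; the guilty carry only an anonymized indicator."""
    for p in population:
        if p.pseudonym in convicted:
            p.flags.append("conviction")
        p.identified = False
```

What survives a closed case in this sketch is exactly what the scenario calls for: an anonymized population record, available for privacy advocates and academia to vet.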

Scenarios such as this, in which humans remain on the loop and guide the criminal justice system in a transparent manner, are no longer mere science fiction. The identities of persons within entire populations are routinely exploited by advertisers, technology companies, and authoritarian regimes around the world. What is missing in Western societies is a commitment to have the tough conversations required to establish a regulatory framework that balances the desire for privacy with the reality of the brokenness of this world. These conversations, implicitly or explicitly, must acknowledge our carnal desire to control which portions of our identity we reveal to God and to our fellow man. They must likewise acknowledge the reality of evil and its propensity for deception. This evil resides in each of us, and its temptations increase with our worldly influence. Those placed in positions of authority must therefore be held especially accountable, in grace and love.

Personally, I am grateful for the Christian community that has supported me on this journey as I strive to strike this balance and innovate solutions that can bring about a more redemptive criminal justice system. Though I harbor no illusion that I will succeed in crafting a perfect solution to such a foundational problem in this imperfect world, I pray daily for guidance and forgiveness as the world marches closer to striking a new balance between transparency and privacy. Through it all, I cling to hope in the revelation God has given me that in the life hereafter we shall once again walk naked and unafraid in the cool light of grace.

This is one of the 2020 CEF Whitepapers. For more information on the Christian Economic Forum, please visit their website.

 
