April 1, 2004
Susan Hart, Financial Economist
Office of Critical Infrastructure Protection and Compliance Policy
U.S. Department of the Treasury
ATTN: FACT Act Biometric Study
via e-mail: email@example.com
Re: Comments of Privacilla.org on Formulating and Conducting a Study of Biometrics and Similar Technologies to Combat Identity Theft
Privacilla.org is pleased to offer these comments on the study, required by the Fair and Accurate Credit Transactions Act, of the use of biometrics and other similar technologies “to reduce the incidence and costs to society of identity theft by providing convincing evidence of who actually performed a given financial transaction.”
Privacilla is a Web-based think-tank devoted to privacy as a public policy issue. The Privacilla site contains hundreds of pages of material about all aspects of the privacy issue, including privacy “fundamentals,” privacy from government, and commercial privacy, including financial, medical, and online privacy.
Privacilla explicitly adopts a free-market, pro-technology perspective. Our belief is that the best policy is to limit the role of government to robust and equal protection of individual rights under law. Limited government allows entrepreneurs in competition with one another to innovate and produce goods and services that best satisfy the wants and needs of the most consumers at the lowest cost. There are other viewpoints, and we encourage full consideration so that they may be rejected on their merits.
Biometric technologies have substantial implications for privacy and so are of more than passing interest to privacy advocates. That being said, however, the privacy implications of biometrics are often assumed or overstated because the technology is “unfamiliar.” The word “unfamiliar” must be placed in quotes because some biometrics are very familiar indeed. The name “biometrics” casts the technologies as highly futuristic and seemingly strange when, in fact, many are commonplace and deeply rooted in human interaction and commerce.
Though we make no particular claim to expertise in biometrics, we offer the following definitional thoughts about the topic from the perspective of the pro-technology privacy advocate. They may serve to help organize the study and place biometrics in a common-sense, demystified setting.
Overview of Biometrics
Students of Greek and Latin word forms know that the term “biometrics” is formed of two concepts: life (bio) and measurement (metric). Biometrics is simply the measurement and use of the unique characteristics of living humans to distinguish one from another. Examining and recording these distinctions can help authenticate people’s claimed identities when they enter into transactions. Biometrics are very useful tools for tying people to their identities and histories. In the commercial world, biometrics can make life much easier for the honest consumer — and very difficult for the dishonest one.
We are aware of two categories of biometrics: physiological and behavioral.
Physiological biometrics measure the distinct traits that people have, usually (but not always or entirely) dictated by their genetics. Examples of physiological biometrics include advanced techniques like DNA analysis, retinal scans, and facial geometry, but also well-known methods like fingerprinting and photography.
It may surprise some to learn that common photography is also a “physiological biometric” technique, but it is. In the early days, chemicals were used to record the photons of light that bounced off a human face, reproducing eye and hair color, facial shape, unique features, and so on. Modern photography records reflected photons digitally, as pixels on a fine grid. Either way, the relatively well understood technology of photography is a “physiological biometric” that need not be as daunting as those big words suggest.
The second category of biometrics is “behavioral.” Behavioral biometrics measure the distinct actions that humans take, which are generally very hard to copy from one person to another. Examples of behavioral biometrics include voice printing and gait analysis, which use computers to analyze the sound created by the human voice box or the movement of a person walking. A more common behavioral biometric is the handwritten signature, used daily by people to formally or informally signify their authorship of a document or assent to an agreement. The name “behavioral biometric” may be intimidating, but the signature is entirely familiar to the average consumer.
What is “New” About the New Biometrics?
In a sense, humans have used biometrics since the dawn of time to identify and distinguish one another. Individuals use rough measurements of each other to recognize one another. When a baby sees or hears the unique biometrics of a parent — the familiar facial structure, voice print, and so on — he or she knows that this person will keep him or her safe. Adults are well trained to record and recall the unique “biometrics” of friends, relatives, acquaintances, and business partners. In this human-to-human context, of course, biometric identifiers and information are recorded by the brain in formats we do not yet understand or know how to mimic.
Two related elements make modern biometric techniques unique.
First, modern biometrics use fully articulated and standardized measurements. The most advanced of them, such as DNA analysis, are widely regarded as proving identity with mathematical certainty, using a scientific process that can be repeated by anyone with proper training. DNA analysis enjoys substantial public confidence in the accuracy of its standards. Fingerprinting also enjoys substantial confidence because of long experience, though perhaps less than DNA. The standards used in other biometric techniques may not yet have gained the confidence of the public. The question of standards and their acceptance goes to whether biometrics are accurate and fair in the commercial context, and whether their use provides due process in the governmental context.
Second, modern biometrics are generally machine readable and recordable. Biometric information, and the results of biometric scans or observations, can be copied easily, shared quickly or widely, combined, and stored for long periods of time without degrading. This is where privacy concerns enter in, in somewhat compounded form: biometrics and biometrically authenticated information tend to be both highly accurate personal information and highly usable personal information. This information will be used in ways that are still evolving and that many people struggle to stay familiar with.
For privacy, technologies like biometrics are essentially neutral. They have no natural bias to protect or to degrade privacy. Rather, the privacy consequences of biometrics depend on the actors involved and the terms of their adoption.
If biometrics are adopted in a consensual process by market participants, the privacy consequences can be minimized and ameliorated in various ways. Consumers can choose whether to use biometrics, balancing the effects on privacy against the benefits of the technology. They can demand terms of adoption that protect the interests they most value. Markets are by far the superior way to facilitate this balancing and negotiation.
Mandated by governments or forced on consumers by policies that interfere with market processes, biometrics may deeply degrade privacy and do away with healthy anonymity. It is even more important in the biometric context that privately developed and held data not be converted wholesale to governmental purposes, where consumers lose their power of choice.
Using Biometrics Well, and Winning Public Acceptance
Breaking biometrics down into its constituent concepts reveals many familiar information-policy issues that the biometrics field must address as it develops.
To gain acceptance, the standards used in new biometrics must be of high quality and be accepted by the public as such. This will provide assurance of accuracy and fairness. Without it, consumers will reject the technologies out of fear that they will be treated unfairly, be confused with others, and so on.
Because it is increasingly digitized, biometric information draws in many of the concerns that consumers have had as they address other digital technologies, such as the Internet. They need to be reassured about how information moves, where it goes, who has control of it, and what assurances they have about security, further sharing, retention, and subsequent use of data.
It is a fool’s errand to predict what concerns will predominate with consumers, and it is something worse to try to speak for, or dictate to, them. But consumers can be expected to adopt new biometrics at a speed they are comfortable with. Experience seems to show that policies or programs designed to tinker with consumer adoption rates can do only a little to either speed or slow adoption of new technologies.
Because of the extraordinary consequences for privacy, government policy should scrupulously avoid any kind of technology “forcing.” Consumers should, and probably will, reject any program that smacks of driving them to use new biometrics before they are fully comfortable with them. Through this study, the Department should work to demystify biometrics, but wherever it deals with biometrics it should make no effort to promote, “incentivize,” or push the technology on an unready public. Doing so would be counterproductive at best and potentially erosive of privacy and civil liberties at worst.
Most likely, biometrics will be adopted along lines of familiarity. Though they may not recognize it as such, consumers are very familiar with biometric technologies like photography, handwriting, DNA, and fingerprinting. The new biometrics that are most likely to gain acceptance are the ones that are most like these old ones, the ones that require the least new absorption of information.
Digital fingerprinting, for example, requires only one conceptual hurdle for the vast majority of consumers: “reading” fingerprints by computer. The electronic signature pads in many stores and in the hands of parcel deliverers are preparing many consumers to accept digital analysis of their signatures as a behavioral biometric. Gait analysis, on the other hand, has never been recognized as a way people identify one another (even though they do). Articulating standards for gait analysis, and then having them applied by computer, would require the consumer to leap over at least two tall conceptual hurdles. It is unlikely to gain acceptance any time soon.
Because biometrics have significant potential consequences for privacy, it is important to raise and address those consequences at every turn and at the earliest stages. Biometrics hold substantial benefits for consumers, too; suppression of identity fraud is one among many. The Department’s study of biometrics could go a long way to demystify biometrics and smooth the way for adoption of these technologies on the terms consumers desire. In this study, or otherwise, the Department should not attempt to force biometric technologies on consumers through the financial services industry.
James W. Harper
See generally, Clyde Wayne Crews Jr., Human Bar Code: Monitoring Biometric Technologies in a Free Society, Cato Policy Analysis No. 452 (Sept. 17, 2002) <http://www.cato.org/pubs/pas/pa-452es.html>.