What are: Fingerprints?

Illustration by Paige Davis, ES ’21, YH Staff

Fingerprints are formed in response to the first environment we know: the womb. They begin to develop in the 10th week of pregnancy; between the 17th and 19th weeks, they have settled into a pattern that — bar erosion by acid or fire, and sometimes even in spite of that — is permanent. A few theories circulate as to how they are produced. One suggests that the basal layer of the epidermis buckles and scrunches, such that the epidermis caves complexly into the dermis, like terrain collapsed over tectonic plates. It is because of the depth of this dermal engraving that fingerprints can grow back even in the aftermath of abrasion. Another theory claims, more lyrically, that they arise from friction as fingers touch the walls of the womb.

Because they are not only genetically but also environmentally determined, shaped by contact, oxygen levels, blood pressure, amniotic fluid composition, the length of the umbilical cord, and so forth, even the fingerprints of identical twins differ. They are among the earliest instances, in an individual’s lifespan, of differentiation by nurture.

In spite of these environmental influences, some studies have suggested a correlation between fingerprints and brain lobes, calling fingerprints “blueprints of cognition.” (When I was a pre-teen, my mother took me and my older brother to have our fingerprints scanned in an evaluation that claimed to predict “in-born” proclivities and level of brain activity. It was the most elaborate personality test I have ever taken.) The information about the brain that fingerprints supposedly reveal “means that specialists have another tool for early diagnosis: our identity is mapped at our fingertips, but also, maybe, our individual fate,” writes Chantel Tattoli, somewhat dramatically, for The Paris Review. While fingerprint science of this sort does not seem to have much scholarly traction, more recent studies have suggested, in a fascinating inversion, that brain activity is so unique to and so stable within each individual that it can serve as a kind of “fingerprint,” one that supposedly predicts intelligence, risk of mental illness, and responsiveness to drugs. As of 2015, researchers could identify one person’s patterns from a set of 126 — a tiny figure compared to the 120 million actual fingerprints housed in the FBI-managed Next Generation Identification Program, which belong, if the math holds (ten fingers apiece), to 12 million people.

Fingerprints have, indeed, long been associated with identity, predating their 19th-century use in forensic science and their 21st-century use in biometrics. Tattoli offers a brief history: “Thumb marks were used as personal seals to close business in Babylonia, and, in 1303, a Persian vizier recounted the use of fingerprints as signatures during the Qin and Han Dynasties.” What is new, of course, is their collection en masse and their deployment for the sake of security. It will come as a surprise to nobody that the use of fingerprints as biodata began as a colonial enterprise. William Herschel takes credit for fingerprinting a Bengali man, Konai, as a form of personal identification. For what it is worth, there is a long history of fingerprinting in India dating from the medieval period; Tattoli claims that Herschel observed and adopted the practice. His appropriation was highly successful — of the accounts I read, Tattoli’s was the only non-scholarly one that did not cite him as fingerprinting’s “inventor.” Herschel demanded fingerprints from contractors as well as from prisoners, in order to lock them into their contracts and prison sentences respectively. Identity morphed into criminal identification, which then morphed into crime-scene identification in 1902, when Alphonse Bertillon became the first person to solve a case on the basis of fingerprint evidence. Fingerprinting became its own science: dactyloscopy, the analysis and classification of prints. The process of fingerprint identification is called individualization — much as the formation of a fingerprint is a kind of individualization in miniature. (Both criminology and dactyloscopy have, at different times, championed the myth of immutable personality types, but there has never been an effort to identify an archetypal “criminal fingerprint,” only a criminal’s fingerprints.)

Since then, fingerprint evidence has been somewhat discredited, despite its emblematic association with crime-scene investigation — high-profile cases of fingerprint misidentification testify against it. Its locus of use has shifted and expanded: fingerprint identification has moved from the realm of criminality to the not-unrelated realms of security and privacy. At the transnational level, biometrics monitor borders. At the civic level, fingerprinting is thought to secure communities: Hartford recently proposed a mandate that called for the fingerprinting of substitute teachers, student interns, and field trip chaperones, despite the weighty cost of $3,750 to the “small town’s tight budget.” And at the level of “private” life, fingerprints unlock high-tech homes and phones — or they did, until Apple replaced its fingerprint-reading home button with Face ID. Faceless situations — the absence of a criminal, for instance — might call for fingerprints, but facial recognition technology is becoming sophisticated enough to phase fingerprints out of the modes of privacy where your biology is the key. We might mourn the shift in biometrics from touch to sight: recent studies revealed that facial recognition technology is (surprise) racially biased, and when Amazon proposed a form of home security based on a doorbell camera, commentators expressed concern that it might open the door to rampant racial profiling. But fingerprinting is far from innocuous when a fingerprint is still, in the context of the FBI’s gargantuan database, a litmus test of whether or not you’ve ever been judged a criminal — and what’s more racially biased than that? Fingerprints will continue to testify to your fitness to enter a country or home freely, to your criminal record or the lack thereof. Biometric technology turns identity into a database charged with political significance: the more information, the better.

The moment my fingerprint touches my home button, as once upon a time it touched the walls of my mother’s womb, is a moment in which an internal world is yielded up to me. “My identity” has produced that world, and gives me sole access to it — unless, as commentators have grimly noted, someone cuts off my thumb. There is, of course, something terrifying about this: about the weaponization of identity and its role in extreme privatization. Why does my identity determine my world as exclusively as it does? Then again, identity — in the sense of subject position — always determines worlds. The fingerprint is just one more instance in which biological fact is given far more world-determining significance than it ought to receive.

Unlike other facts about our biology, we don’t identify with our fingerprints. But they identify us.

