
Secret scores, data used to size up, market to consumers

E. Scott Reckard McClatchy-Tribune

Consumers won access to their credit scores more than a decade ago, after advocates voiced concerns over errors and lending bias.

But most people remain in the dark about hundreds of other data-collection programs still being used to size up consumers and market to them. The secret scores and data are used by employers, utilities, banks, health care providers, debt collectors and others.

Consumers have no way to review them or correct factual errors, say advocates of broader consumer access to the reports. They argue that the consumer protections that apply to credit reports need to be extended to all consumer scores – particularly when they are used for identity checks, fraud prevention, medical histories and profitability predictions.

“What’s to keep the data storers from just making things up?” said University of Maryland law professor Frank Pasquale, whose book, “The Black Box Society: Technologies of Search, Reputation, and Finance,” will be published this fall by Harvard University Press.

“Or cooking the algorithm to increase their profitability, regardless of the underlying data?” Pasquale said. “How are we to be sure some malcontent isn’t just messing with people’s scores?”

A recent Federal Trade Commission hearing examined how the data-collection programs can touch on highly personal matters: scores that predict whether individuals will take their medications; whether consumers are likely to pay a debt if contacted by phone or mail; the degree of a person’s influence over others on the Internet; and whether a customer is pregnant, and if so, when the baby is due.

Some of the scores are used to screen for fraud or to determine whether to grant or deny consumers’ requests for goods and services. But defenders of the practices say most of the data crunching simply helps match consumers to goods and services they want.

The Direct Marketing Association, a trade group for data brokers, has calculated that the industry generates $156 billion in annual revenue, 70 percent of it from companies sharing their records on consumers and groups of individuals with other enterprises. The brokers’ computer programs, which crunch data from public records and private databases, can weigh hundreds or even thousands of factors, experts say, acknowledging that the complexity can appear threatening.

“We realize we have to act responsibly to have consumers’ trust,” said Rachel Thomas, a Direct Marketing Association spokeswoman.

But consumers have little cause for worry, she said.

“In marketing analytics, the worst thing that can happen is a prediction is wrong and the consumer gets an offer for something they’re not interested in,” Thomas said. “We work hard to make sure the information is only used for marketing purposes.”

But privacy advocates say these self-imposed protections are inadequate. There’s no way to tell, they say, whether the marketing data are used to deny consumers a product or service based on, for instance, the neighborhood they live in.

“These scores offer predictions that can become consumers’ destiny, whether they are right or wrong,” said Pam Dixon, director of the San Diego nonprofit World Privacy Forum.

Dixon said the secret scores include:

• Scores predicting which households are likely to pay a debt;

• A job security score that claims to predict future income and capacity to pay;

• “Churn scores” seeking to predict when customers will move their business to another bank, cellphone provider or cable TV service;

• An Affordable Care Act health risk score that creates a relative measure of predicted health care costs for a particular enrollee.

Many of the data-collection companies will provide consumers with their personal information and allow them to correct errors, said Consumer Data Industry Association spokesman Norm Magnuson. More than 40 such companies are listed on a Consumer Financial Protection Bureau Web page, he said.

But the critics note that there are loopholes. There’s no requirement, for instance, that companies disclose scoring based on someone’s neighborhood. That raises the specter of redlining – the illegal practice of turning someone down for a loan or insurance merely because they live in an area deemed high risk.

The solution, critics say, would be for the industry to open its books. Regulators and consumers should be able to see the data to judge whether scores are being used improperly as proxies for such things as race or sex, or are disclosing private information about consumers without their knowledge.