Predictive algorithm under wraps

In declining to release details, UA cites competitive harm

FAYETTEVILLE -- Even as complex mathematical models gain greater influence in government and elsewhere, evidence suggests public entities are refusing to disclose details about algorithms that guide important decisions, experts said.

The University of Arkansas, Fayetteville withheld details of an algorithm used to award student grants as part of a new pilot program. Documents released by UA described it as a data model aimed at predicting the likelihood of students staying in school.
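UA has not released its model, so what follows is purely a generic illustration of the kind of tool described: retention-prediction models at universities are commonly built as statistical classifiers trained on past student records, which then score the probability that a current student will stay enrolled. The sketch below, a simple logistic regression in Python, uses invented column names and data and is not UA's algorithm.

    # Hypothetical illustration only -- UA has not released its model.
    # A retention-prediction tool of the kind described is typically a
    # classifier trained on past student records; every column name and
    # number below is invented for this example.
    import pandas as pd
    from sklearn.linear_model import LogisticRegression

    # Invented historical records: inputs plus a 0/1 "retained" outcome.
    past = pd.DataFrame({
        "high_school_gpa":      [3.9, 2.8, 3.4, 2.5, 3.7, 3.1],
        "first_term_credits":   [15, 12, 15, 9, 14, 10],
        "unmet_need_thousands": [0.0, 8.0, 2.5, 9.5, 1.0, 7.0],
        "retained":             [1, 0, 1, 0, 1, 0],
    })

    features = ["high_school_gpa", "first_term_credits", "unmet_need_thousands"]
    model = LogisticRegression().fit(past[features], past["retained"])

    # Score incoming students: the predicted probability of staying
    # enrolled could then be used to rank candidates for a grant.
    incoming = pd.DataFrame({
        "high_school_gpa":      [3.5, 2.7],
        "first_term_credits":   [14, 11],
        "unmet_need_thousands": [3.0, 8.5],
    })
    print(model.predict_proba(incoming[features])[:, 1])

Which student characteristics actually go into such a model, and how heavily each is weighted, is the kind of detail the university has declined to release.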

In response to a request made under the Arkansas Freedom of Information Act, UA said the model was shielded from public disclosure because releasing it would harm the university's advantage "in competing with other institutions to attract and retain the targeted students."

The university also cited potential licensing opportunities.

"The selection process algorithm constitutes research-based craft knowledge that, taken into consideration as a unique and promising financial aid recruitment tool, would pose a competitive disadvantage to the university if released," the university said in response to the Arkansas Democrat-Gazette's request.

UA "has devoted considerable time and personnel resources to development of the selection process algorithm, which is the subject of internal university research; the requested information represents a valuable asset to the university that may present future research or licensing opportunities."

UA spokesman Mark Rushing said in an email the model was developed by UA's Office of Student Success "based on similar models created by other institutions."

Ellen Goodman, a professor at Rutgers Law School, co-wrote a scholarly article this year examining the response of governments in 23 states to requests for information about specific algorithms.

"The information was withheld in the vast majority of those cases," Goodman said.

She said questions raised by the increasing use of algorithms typically fall into two basic categories: Are they doing what they're supposed to do, and are they fair?

Told of UA's algorithm, Goodman said similar questions could be posed.

"Are they sort of overlooking some kinds of students, and maybe over-rewarding other kinds of students?" Goodman said.

Goodman said data without clear policy guidance "will tend to perpetuate what's worked in the past," making it important for public entities to release details about the algorithms being used.

Nicholas Diakopoulos, an assistant professor of communication studies at Northwestern University, helped create an online database at AlgorithmTips.org of "potentially newsworthy algorithms used by the U.S. government," according to the site.

Diakopoulos said that in 2015 he researched how governments responded to requests for "algorithms used in criminal justice scenarios." He said about 20 percent cited a reason "along the lines of, this is a trade secret" to withhold information.

Told of UA's algorithm, Diakopoulos said that while it's benefiting some, "on the flip side, you have to look at what is lost by the people who don't get access to these things, people who might have benefited from these programs but for the particular way the algorithm uses data."

Even if UA does not release the full algorithm, "I absolutely think they should disclose what the variables are that they use in their model, particularly if they're using anything like race or even gender," he said.

After UA declined to release the algorithm, the Democrat-Gazette asked for "the complete list of student characteristics" used as components of the mathematical model.

In response, UA declined to release such a list, instead listing what are described in UA documents as eligibility requirements for the program. UA stated that "further discussion beyond these aspects, we believe, will compromise the integrity of the program selection algorithm."

RELATED ARTICLE

UA testing use of math model to award aid, offer mentoring: http://www.arkansas…

Metro on 12/03/2017
