
The Decline Of Research At Public Universities Erodes Public Trust

When Marc Edwards, a professor of environmental engineering at Virginia Tech, helped uncover the water contamination crisis in Flint, he did so mostly by burning through his own money.

Although the water contamination crisis in the Michigan city had been going on for quite some time, it took Edwards' testing and FOIA requests to legitimize the crisis in the eyes of the media, which eventually put the controversy in national headlines.

But Edwards' prominent role in exposing these kinds of issues, as he did when he investigated Washington, D.C.'s water supply, has cost him professionally in the past. Even so, he has continued to criticize government agencies.

Edwards’ story is indicative of a larger problem in higher education: The way we currently fund research through public universities is limiting research opportunities that could save lives and improve the health of the public.

As public universities lose state funding, and are increasingly forced to function more like any other business, universities have to shift priorities. CEO-like presidents are hired, poorly paid adjuncts are called upon to fill in the gaps, and schools treat low-income and homeless students as “risks.” But another big problem is that these universities also lose their research missions — and as a country, we suffer for it.

With fewer resources in this area, researchers form fewer partnerships with communities, there is less innovation, and public trust in research falls, Edwards argued. Although 7 in 10 adults say that government funding of scientific research usually pays off in the long term, the share of the public who views U.S. scientific achievements as the best in the world has fallen 11 points since 2009, a 2015 Pew Research Center survey shows.
What went wrong in Flint

The problems started after the city of Flint switched to Flint River water in 2014. Last April, Edwards received a call from a Flint resident who was convinced there was something wrong with her family's tap water. After Edwards tested the water, he decided to look into the matter further.

Virginia Tech researchers then collected and analyzed water samples from 300 Flint homes, testing for lead and finding that the river water was five times more corrosive than other water sources in the area. Edwards said there was too much salt in the water and that the city didn't have a plan to control the corrosion that would result. It was clear that local, state, and even national officials were not doing their jobs. State officials told the EPA a corrosion control plan was in place, and although the EPA was aware of the problem, critics say it did not act aggressively or quickly enough, dragging out the public health crisis.

"Had we not been involved, no one would have even known about it and they still would have been drinking that water," Edwards said. "It's scary for Flint, and it's scary for everyone else who realizes they have to turn over rocks all over the country to find out that agencies had been untrustworthy. It raises the question that if you can't protect us from lead in water and you can't follow an existing law — it creates this massive justified uneasiness that who amongst us is safe?"

As many as 8,657 children in Flint may have been exposed to lead and were recommended for testing. The city and state are still working with the EPA to ensure the drinking water is safe.

According to Edwards, the issues at play in Flint show how the dominant model for how research is chosen and funded in higher education — which he calls a “top-down model” — is hurting the public. Disadvantaged populations, like the residents of Flint, are especially at risk and in turn may lose their confidence in scientific researchers.

This top-down model refers to a scenario in which academics compete for a slice of government funding earmarked for a particular subject. To win this type of funding, their grant proposals must focus on whatever subjects they believe the funding sources will approve.

“Professors are less frequently generating their own ideas from research. Rather, you are responding to this request to put a man on the moon and all of your ideas have to fall into the constraints of those boundaries,” Edwards said. “And if you don’t write those buzzwords in those proposals, and you’re not responding to larger effort, you have little chance of getting funding.”

Edwards said he doesn’t have an issue with this top-down approach as long as it’s balanced with a different “bottom-up model,” in which academics also make unsolicited proposals and have partnerships with community organizations. In his view, Flint proves why bottom-up models are necessary. In that case, a member of the public came to Edwards about a problem, so Edwards and other researchers investigated it — but had he relied on a model in which he first needed to see a demand for investigating water contamination, there wouldn’t have been any evident demand. The water wouldn’t have been tested.

When unsolicited research was more commonplace, and federal agencies responded to those ideas, the public interest was much better served, Edwards argued. Ideas for research could then be generated by "people on the front lines dealing with problems day to day in their life" — like the people drinking tap water in Flint.
How universities’ business approach hurts scientific innovation

As public universities attempt to keep their heads above water with shrinking funding, they increasingly behave like private businesses. This often means measuring faculty performance to quantify what professors give back to the university, an approach that does not foster innovation, some experts say. These analytics become the yardstick by which faculty are judged, counting things like how many patents a professor holds and how many articles they have published in order to gauge their influence on academic discourse.

Manuela Ekowo, a policy analyst with the education policy program at New America who studies the effect of technology on higher education, said there are some important questions to ask about the accuracy and effectiveness of these measures.

“Are they good measures of impact? Which measures are included, which are excluded? Are the metrics gathered accurate? Do any of these metrics get at quality teaching, which is given little, if any attention in most tenure processes?” Ekowo said. “I would argue that quality and effective teaching should be at the heart of any attempt to measure or gauge faculty performance, productivity, or impact.”

Both Edwards and Ekowo agree that when universities rely on these analytics, there is a risk that professors will try to ensure they get the best score by doing things that hurt innovation, such as refusing to share ideas with other professors. That lack of community among researchers could really hurt the development of new ideas that would benefit the public.

“In a similar vein, there’s also the danger that academic analytics might actually breed unhealthy competition between academics as they strive to increase their numbers and their perceived impact and productivity, especially in comparison to perhaps other academics who are working in the same field,” Ekowo said. “If faculty will ‘do anything’ to secure funding for research, will they ‘do anything’ to secure high metrics? There’s a real potential of gaming the system.”

The time and energy spent pursuing patents and copyrights also slows down the process of discovery, Edwards argues.

Rutgers University opposed the use of one common private analytics vendor, Academic Analytics, in part because teaching and service were never considered, and because researchers saw no way the vendor could accurately gauge the influence of their research. They also argued there was little transparency about how the data is used.

Ekowo suggests that if universities received more funding, they could work with private-sector partners instead of having private vendors do all of the analytics work. Because the vendors disclose little about how their analytics work, it is hard for universities to study and critique them to understand how accurate they are. In a partnership, however, universities would have more ownership over the methods. In that way, she says, "the tension between finding the truth, serving the public good, and securing research dollars" could potentially be reduced.
There are fewer federal dollars available for quality research

Less federal money is now going toward scientific research, according to the American Association for the Advancement of Science. Spending on research and development as a share of GDP has been decreasing since the 1970s, and grants from the National Institutes of Health and the National Science Foundation have been stagnant or declining since 2006, according to the AAAS. Between 2010 and 2013, congressional cuts to scientific research represented the largest three-year decrease in spending in 40 years, according to Boston University.

And as the pool of money becomes smaller, both for research funded by the federal government and for the funding universities receive from states, academics become more competitive and more willing to write the proposals they know will secure the funding they need to keep their careers afloat, Edwards argues.

Edwards said that adjunct professors, who have little pay or job security, understandably struggle to advocate for any kind of role in universities' research missions, but he said there is no such excuse for tenured professors, who have more freedom but don't end up using it. He added that researchers rarely admit any shortcomings of their findings, in part due to a culture of salesmanship but also because journals would be less likely to publish their work.

“We view ourselves as the helpless victims here but to some extent we are the problem… We share a large amount of the blame for our declining position in society and for to some extent killing the goose that has been laying the golden egg, which is our relationship with the public,” Edwards said. “By and large, scientific research is way oversold, overblown, overhyped — a used car salesman type pitch. And it’s part of the game we have to play … [The public] get that this is salesmanship. It’s not real,” he said.

He added that he's worried about situations where donors provide money to universities on the condition that their ideological agendas are given a place at the university. For instance, institutes called "freedom centers" that essentially function as conservative think tanks have been backed by the Charles Koch Foundation. Public colleges and universities in Arizona recently received $5 million in funding for these centers, one of which produced research backing Arizona Gov. Doug Ducey's (R) land trust plan.

“To the extent that we let down our guard, ever, and we allow bias to become more than is desirable we can never approach the ideal,” Edwards said.

— source thinkprogress.org By Casey Quinlan
