:
I call this meeting to order.
Welcome to meeting number 112 of the House of Commons Standing Committee on Science and Research.
Today's meeting is taking place in a hybrid format.
I'd like to remind all members of the following points.
Please wait until I recognize you by name before speaking, and all comments should be addressed through the chair. Members, please raise your hand if you wish to speak, whether participating in person or via Zoom. The clerk and I will manage the speaking order as best we can. Those participating by video conference can click on the microphone icon to activate their mic. Please mute yourself when you are not speaking. For interpretation on Zoom, you have the choice at the bottom of your screen of the floor, English or French.
Thank you all for your co-operation.
Pursuant to Standing Order 108(3)(i) and the motion adopted by the committee on Thursday, October 31, 2024, the committee is resuming its study of the impact of the criteria for awarding federal funding for research and excellence in Canada.
It's now my pleasure to welcome our three witnesses.
From Colleges and Institutes Canada, we have Pari Johnston, president and chief executive officer; from U15 Canada's group of Canadian research universities, Dylan Hanley, executive vice-president; and from Universities Canada, Gabriel Miller, president and chief executive officer.
Up to five minutes will be given for your opening remarks, after which we'll proceed with rounds of questions.
Ms. Johnston, I'll start with you. You have the opportunity to make an opening statement of up to five minutes.
I'm Pari Johnston, president and CEO of Colleges and Institutes Canada. On behalf of our 134 member colleges, institutes, polytechnics and CEGEPs, I want to thank the committee for making the time for this study.
When I was here for my appearance on the capstone organization, we talked about the fact that we must ensure that federal research investments lead to real and tangible impacts in the daily lives of Canadians—to results that support economic prosperity and social well-being for all, that drive community and business innovation, and that respond to the biggest challenges we face as a country. To do this, we must reimagine how we currently invest in research and re-evaluate what we value. We must redefine how we assess and award research to move beyond concepts of excellence defined primarily within a university-centred approach.
[Translation]
We also need to focus on the impact, relevance and scope of the research.
Impact‑driven research means first determining the nature of an issue and then designing a research program that brings together all the right partners and end‑users to resolve it, that uses all the research tools available and that implements inclusive assessment criteria with a focus on the application and impact.
Impact‑driven research is helping to build better houses more quickly; to increase drought‑resistant varieties of agricultural crops and find out how to encourage farmers to plant them; and to develop new methods for leveraging genomic tools in local clinics.
This exact type of research is carried out in colleges and institutes.
[English]
Colleges lead partnered, problem-driven and real-time research that generates applied knowledge and de-risks technology development and adaptation. This results in on-the-ground benefits through improved knowledge translation and mobilization, IP staying with the local business partner and greater technology uptake by local partners in priority economic and social sectors.
In 2021-22, our members led more than 8,000 applied research projects, resulting in 6,500 new processes, products, services and prototypes in areas like housing construction, advanced manufacturing, climate-smart agriculture, food production and social innovation. Ninety-nine per cent of our partners are Canadian companies and non-profits, keeping the fruits of our research at home in Canada.
If we want to optimize the impact of federal research investments, the following three recommendations must be actioned.
First, the federal granting agencies and research funders must redefine and rebalance the weighting of the criteria that are currently used to award federal funding to ensure they adequately assess and reward research impact. This includes looking beyond traditional metrics of excellence, such as publication records, citations and other metrics aimed at establishing expertise in academic research. Criteria such as partner uptake of research outputs, capacity of a project to develop new IP or develop a novel application of an existing technology, or policy reports that lead to improved implementation pathways are indicators that speak to research impact.
Second, federal research funders must ensure that merit review committees include representation from a diversity of institution types, end-users, and industry and community partners that are able to provide a more holistic ecosystem perspective on research programs and how to ensure that benefits on the ground have broad reach. Currently, most merit review committees are almost exclusively composed of representatives from universities. To support impact, review committees must include voices from across the research ecosystem, including colleges, end-users and policy-makers familiar with effective implementation and delivery of research results.
Third, it is time for ISED and the federal granting agencies to expand eligibility for colleges and institutes in all existing tri-council programming. Right now, colleges are not eligible as lead applicants for NSERC's alliance program, which is its flagship partnership initiative. In addition, we must address informal barriers, such as not allowing research grants to cover college faculty course release time or to hire replacement faculty to carry out research projects.
Canadians and their communities expect their federal research programs to deliver for them. Enacting these three recommendations will help achieve this.
Thank you.
Thanks to all of the members of the science and research standing committee for the opportunity to appear before you today.
U15 Canada is an association of Canada's leading research universities. Today's study is concerned with the impact of federal funding policies on research excellence in Canada. Promoting excellence is at the very heart of our work at U15, and I'm confident that Canadians should be very proud of the global impact and competitiveness of our research university system, which punches way above its weight internationally, including in value for money.
Taking the University of Toronto as an example, last year the journal Nature ranked it at number two in the world for impact in health research, after Harvard and ahead of even Johns Hopkins, and yet it also educates more students every year than the entire Ivy League combined.
This is only one example of how our leading research universities deliver value to Canada and Canadians at a low cost of entry. U15 universities alone award 160,000 degrees a year, including to the vast majority of doctors and dentists in Canada, developing a crucial pipeline of talent.
We also know that research conducted at our universities has real-world impact, from research at the University of Saskatchewan to protect the Canadian pork industry from the risks of African swine fever, to research at the University of Alberta on carbon capture and storage aimed at enhancing a clean energy future for our country and to leading work on Arctic monitoring at Université Laval that will help track the impact of climate change and enforce our Arctic sovereignty.
Leading research universities also drive innovation. From artificial intelligence to agriculture, partnership between businesses and post-secondary institutions is a defining feature of our R and D system in Canada, with us ranking third in the G7 and in the top 10 in the OECD in the percentage of private sector R and D done in partnership with post-secondary institutions.
It's also important to underscore that our research universities deliver impact that's truly pan-Canadian. Our universities act as hubs of expertise across extensive networks that bring together other post-secondary institutions, research hospitals, innovative industries and community organizations. In 2022-23 alone, just our 15 universities collaborated with over 3,600 different partners and organizations on tri-agency-funded research in nearly every community and riding across this country.
Canadian research excellence has been made possible in part due to a long-standing cross-party consensus.
The Chrétien government brought in the Canadian Institutes of Health Research and launched the Canada Foundation for Innovation and the Canada research chairs program. The Harper government made major investments in the CFI, launched the Canada first research excellence fund—or CFREF—and the Canada excellence research chairs and funded excellence-based scholarships. As well, the current government has made significant investments in investigator-led research, the new frontiers in research fund and crucial programs in quantum, AI and genomics.
CFREF is expressly designed to create globally competitive platforms for Canadian research strengths. These networks have impact across the country and include projects in important evolving areas for Canada, including the links between brain and heart health at the University of Ottawa, climate change and ocean science at Dalhousie and the health and well-being of children at the University of Calgary.
Another example is the Canada excellence research chairs program, which attracts world-leading scholars to Canada, along with their talented teams, to create clusters of excellence and expertise here.
Another major pillar of our excellence system is the Canada research chairs program, which provides funding for universities to hire some of the best and brightest researchers across all areas of research.
To ensure that Canada's research enterprise can continue to flourish, I am going to suggest the following key principles.
Number one is the best ideas. It's important to note that excellence in research rests on the foundation of peer review, wherein experts in the field are the ones to judge which proposals move forward and receive funding.
Number two is strong, healthy institutions. Our world-class research universities are a national strength that, again, all Canadians should be proud of, but we're facing unprecedented financial challenges because of decades of stagnant or declining real government funding and turbulence around international students.
Number three is unlocking impact. Our institutions are working hard to unlock the full potential impact of research in our businesses, communities and society through entrepreneurship programs, connectivity hubs for business and extensive partnerships with governments and non-profits in the social sector.
We certainly can do more, and we should do it together.
Thank you. I look forward to your questions.
:
Good afternoon, Madam Chair and members of the committee.
Thank you very much for inviting me.
I'm Gabriel Miller, the president and CEO of Universities Canada, the national voice of Canada's public universities.
I want to compliment the committee for its decision to undertake this study. I hope it's the start of an ongoing discussion about an issue that's critical to Canada's future.
As our challenges become more complex and our technology more advanced, research excellence must be at the heart of any serious plan to create jobs, increase our prosperity and improve our quality of life. Federal research investments are an essential pillar of our higher education system, a system that gives millions of Canadians a pathway to expand their career opportunities, increase their job security and earn higher salaries that help them pay the rising costs of owning a home and supporting a family.
Our research system is a training ground for the future doctors, engineers and entrepreneurs whom we need to support our economy and meet the needs of our aging population. It's through research that Canada can create the knowledge and develop the talent to fuel innovation and productivity across vital sectors of our economy, whether they are energy, agriculture, manufacturing or arts and culture.
Today I want to leave you with three broad recommendations.
First, we must build on the core principles of Canada's outstanding research system, a system built over decades with the support of successive governments and members from all parties. These principles vary from program to program but almost always include the need to make an original contribution to the research field, the need to provide training opportunities for highly qualified personnel and the need to demonstrate that the project is feasible with the resources available.
Second, I want to recommend that we continue the difficult and often imperfect work of expanding opportunities to more people and communities. What does that mean? It means that we have to help universities of every size in every region to contribute their fullest to the research enterprise. The research security fund should be strengthened so that smaller universities aren't held back by the costs of meeting a growing administrative burden and larger schools aren't slowed down.
It means that we must strengthen the unique role of universities as a forum for independent thought, discussion and discovery, where a broad range of political and ideological perspectives are engaged and explored. No one should be excluded from participating in scientific debate and discovery, conservative, progressive or otherwise. It means that we must continue the fight against racism and discrimination in all their forms and reduce the barriers that have deprived too many people of the opportunity to contribute their abilities and perspectives.
[Translation]
Lastly, we need to support French‑language research, which faces unique challenges when it comes to submitting applications and publishing in French. Systemic barriers remain an issue in French‑language research, including differences in success rates according to language.
We're heartened to see the committee that the government set up to study this issue. A new capstone organization could also promote excellence in French‑language research.
[English]
What I've described is not a narrow conception of diversity. It's a commitment to unleashing the incredible talents of our country. It's a vision we should embrace with passion but with care. We must undertake this work with humility.
Many Canadians have questions, concerns and criticisms about the most effective ways of expanding opportunity and about some of what they see and hear being done in the name of diversity. We need to be listening—"we" in the university sector. We need to be learning and engaging in these discussions. We must be evaluating the tools that are used and prioritizing those that reduce barriers while strengthening research excellence.
I want to close by noting that the upcoming supply votes include important research and graduate scholarship funding for next year. I can't stress enough the importance that this funding will have for graduate students at institutions across Canada and for creating a better foundation to address many of the issues I've discussed today.
Thank you very much.
:
Thanks for the question. It's a good one.
First off, I'll say regarding the dispersal of funds among the councils, just shy of 80% of all new funds go to health research and to natural sciences and engineering, that is, to CIHR and NSERC. The vast majority goes there. I think a lot of it may be partnered outside of here, but the question isn't whether the research is specific to Canada or not.
With regard to the value of research on issues around the globe, it's crucial for us both to be studying things that happen in Canada and things that happen around the globe. My own personal background in graduate research was studying the Middle East and nuclear non-proliferation related to Israel and Iran. Am I bringing it to bear on the work I'm doing now for universities? I'm not sure, but it's important for Canadians to be engaged in issues that are important around the world.
I think what's more interesting is that a lot of research that has taken place in recent years on our history here in Canada has been on aspects of indigenous cultures, local regional cultures, etc., that we may not have understood unless we were engaged in it.
:
The studies you mentioned are very impressive. I would never question that, because it's a very interesting topic to study.
However, I see research grants given out to study Dolly Parton's lyrics. The committee has heard me on that. Shaun, I think, gets a sore stomach when I bring it up. There is a study of the Maidan Museum in Ukraine. The list goes on and on. You'd think there might be a couple, but I would say there are hundreds. It's in the millions of dollars. I've tried to test it, and maybe I have cherry-picked them. However, when I'm in my riding on the weekend and people ask what I'm doing, I say that we're doing this study and that these are some of the things they're studying. They can't believe that their taxpayer dollars are going to those things.
I'm trying to present it in a fair manner. Some of the stuff you're mentioning is absolutely great. People would be proud of that. However, should we take a step back and look at some of these projects we're funding?
However, I don't know how many times we've had people come here and say, “Ben, you cannot challenge the people who make these decisions. It's unhealthy. It's not wise to cherry-pick. We have to look at everything.” Still, I go down the list and I can't believe it.
Here's one. My optometrist said I need bifocals. Do you know what? He's right. I do. I can hardly read this. It says, “Understanding and addressing the mental health impacts of unpaid care work on women in Bogotá, Colombia”. If that were in Canada, I'd say, “Great”. However, this is in Bogotá, Colombia. Why doesn't Colombia pay for that study, and we'll do a study in Canada on unpaid work?
What value is that to the Canadian taxpayer?
:
Thank you, Madam Chair.
Thank you to the witnesses for coming here today. I hope you had the opportunity to review some of the testimony we heard last week because some of my questions will concentrate on what we heard, and I'd like your opinions.
My first question is for Mr. Miller. Last week we heard some witnesses who claimed that, by pursuing equity, diversity and inclusion, you inevitably reduce the research output of an institution or country. Perhaps you could tell us, from your experience, to what extent EDI plays a part in the evaluation of applications for funding, for federal funding? There was a claim that it was akin to affirmative action, which I have always seen as something applied to human resources and hiring of individuals, not to research applications. Could you tell us a little about the weighting of this sort of analysis related to EDI?
:
Through you, Madam Chair, first, I should say I'm not a researcher myself but, of course, I've been working with universities for several years and, now, with Universities Canada for eight months.
What I would say is that, in all of my interactions with the tri-council and with researchers in our community, what has been paramount in all of their work and in their assessment of research projects has been excellence, the potential to really create change inside the discipline, the ability to contribute to the training of graduate students and evaluation on the basis of merit. There's no question that there's been growing interest, discussion and, in some cases, policies around efforts to expand opportunities so that a broader group of Canadians can participate and contribute their perspectives and talent, which, ultimately, is in itself a major contributor to excellence. My observation has been that there is a concerted effort in the community to expand opportunity, but to support merit and excellence, not to compete with it.
:
Listen, I think all of our institutions are committed to the principle of ensuring that all Canadians get a fair shake, both in terms of accessing a university education should they wish to—which is still the most transformative action you can take for your economic future, and we think there are obviously other benefits to it in addition to economic—and participating to the greatest percentage possible specifically in the research enterprise.
With regard to EDI in the last number of years, I think that in our institutions, as in society as a whole, there has been a greater recognition of historic and traditional systemic barriers, and attempts to get rid of those, to open the field and the table to more people and to make sure that they have a fair shake. I think, in the last couple of years, there have been criticisms of some of the specific measurements and programs that have gone on, and this is a pretty recent phenomenon that we've been undertaking with this level of energy. We're open to those criticisms and to making things better inside of our institutions. There are regular programmatic reviews, and not just of these policies; they cover any policies going on across the administration, so I think we are committed to it.
In the research enterprise, again, as my colleague said, I think this is about actually opening up the diversity of viewpoints that are able to come to the table and improve research. We know, in corporate teams, that diversity of perspectives and viewpoints makes things better and leads to better outcomes.
:
I think we need to look at whether these committees are broadly representative of those who are part of the research ecosystem. I think we have seen gaps, to date, in representation of those who are the end-users of research, those who are familiar with policy implementation and those who come from the college system.
Our premise is that excellence, relevance and impact should be part of how we think about the investments we make in research. If you accept that premise, which I think it is incumbent upon us to start considering very actively as we think about our research investments, then it stands to reason that merit review committees should be representative of those in the research ecosystem.
I would say that another opportunity is for us to look at two-stage reviews. How can we look at research, particularly challenge-based research funding, that takes the first scientific and technical stage of review, but then has a second impact review that is also composed of committee members who are broadly representative of those who will be benefiting from the research?
In our view, it's about being more intentional about representation that's inclusive of those who are part of the research ecosystem.
:
Thank you, Madam Chair.
Thanks to the witnesses for coming today.
As members of Parliament, our job is to ensure that the taxpayer is getting value for their tax dollars. If our goal of research is research excellence, where we put our money should pursue excellence as well.
In our next panel, we're going to hear from an organization called Retraction Watch, which has done an excellent job of exposing falsified or poor research. Its work ensures that we can move toward research excellence.
In 2017, the Liberal government put together a fund called the Canada 150 research chairs. Mr. Hanley, your organization, understandably, welcomed that. One of the new research chairs was Jonathan Pruitt, the chair for biological dystopias. He received a federal grant of $350,000 a year for seven years, a total of about $2.45 million in taxpayer dollars. What he did with that money was write a bunch of papers using falsified data. Thanks to Retraction Watch, we know he had 15 papers retracted over three years.
When we're pursuing research excellence, going forward, how do we prevent this type of fraud from happening and what gaps might exist that we need to address to prevent this from happening again?
When I started working with U15—we have a committee of vice-presidents of research—it was not this specific case, but questions around academic integrity, fraud and research ethics were right at the centre there. No institutions have a greater interest in ensuring that the system is robust and keeps that kind of behaviour to a minimum.
I have to admit I'm not familiar with the case you're talking about, so I can't really speak to it specifically, but we are absolutely committed to ensuring that our academics meet standards of rigour and that the studies being put forward are world class.
Again, it underscores the importance of peer review. It doesn't mean that certain things don't slip through the cracks. You heard about a few famous studies in the United States with the same problem. I don't think it's something you can ever entirely eliminate, but we do everything in our power to try to eliminate it, and I think our commitment to it will be steadfast.
I will say that's the case with all of those regulatory requirements. We handle hazardous materials in labs. There's a lot of regulation that needs to be overseen by universities. We think we do it in a way that's excellent, but the point to underscore is the reputations that are on the line when those things go wrong are our reputations, and we believe they're world class, and we do everything we can to protect them as well.
:
It is a challenge and, again, I'm not familiar with the case. I don't know which journals were publishing those various articles, whether they're Canadian journals or not or what adjudication processes were there.
One thing that has been a focus, again, for the vice-presidents of research as they try to manage these things across their institutions is the pressure on researchers to publish and the advent of journals that may have easier publishing processes than others. The question is what we can do to control the proliferation of those journals if their requirements aren't tight enough.
Also, I'll say there's a balance between weighting that and wanting open science and collaboration and not wanting only a few prestigious journals like Nature and Science to control all of the academic currency.
The case you're talking about is clearly egregious, and I'll tell you I'll make a point of looking it up after this committee meeting.
:
Thank you, Madam Chair.
I have a comment that I think will lead into the questions I'm going to ask all three of you.
The last meeting we had was quite interesting. We had researchers who had a particular point of view, a right-of-centre view, and who were somewhat jaded, and some comments were made about minority groups and women not holding their own compared to, say, people like me: men.
I thought, “Should I bring that up?” I do think it's important to have sunlight, as it's the best disinfectant. I disagree with them; they had their right to say what they had to say, although I vehemently disagreed with it.
When I think about the comment about women not measuring up, I think of people like Dr. Cheryl Bartlett at Cape Breton University. I think of Dr. Jane Lewis, Dr. Coleen Moore-Hayes and Dr. Shelley Denny, who's Mi'kmaq.
In one of the comments made at the last session, there was seemingly a discounting of indigenous knowledge and indigenous research in academia and research. I wonder if all three of you can come back to me with a comment on “it just doesn't add up” or “it just doesn't measure up to other types of research.”
We can start anywhere you like.
:
I am happy to take that question first.
It's very important that we have continued, over the last number of years, to invest in indigenous-led research and research on issues that are affecting first nations, Métis and Inuit communities in this country. A number of our members are in remote, rural and northern communities, and they are very dedicated to engaging in problem-based research with their local communities, which are really interested in working together with our applied researchers to discover and determine solutions to problems that they might be facing around food scarcity or environmental change in the community affecting the local water systems.
In my view, it is important to ensure that we have a diversity of teams, including those that are working with first nations, Métis and Inuit communities to solve the problems that they define as important.
I'll turn to Mr. Hanley to talk about this idea of the pressure to publish papers.
We heard statistics about, I think, 22 papers that had bad peer review. I just checked online and there are between two and eight million papers published per year, so 22 is probably fairly insignificant in that overall total. It points to the fact that there's this real pressure on researchers to publish, both for career advancement and to get grant money.
Could you expand on how we measure the impact and excellence of that research when we're funding research in Canada? That's the gist of this study.
Although I did a graduate degree, I'm not a researcher by background or training, as you are.
Again, we take academic fraud extraordinarily seriously. I do think it's a tiny percentage. Especially when you're talking about critical scientific studies, it still is important beyond the reputation of the university. It's critical for the credibility of the entire scientific method. It gives credence to conspiracy theories and all sorts of other odious things in society. I think we do need to stand on guard against it.
With regard to the impact of research across Canada, I will say, to your previous question about the provinces, I think that research and especially some of the excellence-based projects—CFREF is a good example—really serve to bring researchers together across different universities and regions. Yes, there's sort of a home base institution for these projects, but all of them have clusters and partnerships that span the country and bring together researchers across the country who are at the top of their fields, as well as non-profit organizations, businesses and others, in those projects that are really meant as an “own the podium” type of exercise for Canada.
This is really about building platforms up that help us compete on the global stage.
Again, research impacts our lives every day, whether it's cardiac research that saves the lives of Canadians, or research on lipid nanoparticles that you're familiar with from UBC that helped unlock mRNA vaccines, or social research or economic theory, or whatever, that solves problems and gives us new perspectives on issues.
We'll get started again so that we can finish in good time.
This is a brief reminder for those participating by video conference to click on the microphone icon to activate your mic, and please mute yourself when you are not speaking. For interpretation for those on Zoom, you have the choice at the bottom of your screen of the floor, English or French.
It's now my pleasure to welcome, from Polytechnics Canada, Sarah Watts-Rynard, chief executive officer. From Research Canada: An Alliance for Health Discovery, we have Alison Evans, president and chief executive officer. Online, from Retraction Watch, we have Ivan Oransky, co-founder.
We welcome you.
Up to five minutes will be given to each of you for opening remarks after which we will proceed with rounds of questions.
Ms. Watts-Rynard, I invite you to make an opening statement of up to five minutes.
:
Thank you, Madam Chair.
I'm pleased to be back before this committee as you study the criteria being used to award federal research funding.
Polytechnics and institutes of technology have now been engaged in Canada's research ecosystem for more than 20 years. As experts at partner-driven research, these institutions help organizations of all sizes adopt, implement and commercialize new products and processes through applied research. Despite two decades of doing this work, there are a number of barriers to accessing federal research funding. For the purposes of my remarks today, I'll focus on three.
These are minimal access to research support funding, a poor understanding of the salary structure of principal investigators at polytechnics, and adjudication criteria that favour research- and publication-intensive CVs.
Let's start with the first. The federal government invests more than $450 million each year in the research support fund. According to this fund's website, it supports post-secondary institutions to maintain modern labs and equipment, secure research from threats, enable research management and administrative support and meet regulatory and ethical standards. For polytechnics and colleges, this fund is largely beyond reach. In fact, together they share about half of one per cent of the research support fund. The college and community innovation program is excluded from eligibility calculations, and this means there is virtually no funding for administrative support, research security or maintaining labs and equipment. These activities must be funded elsewhere.
Moving on to the second barrier, polytechnics and colleges hire faculty to be in the classroom. While their university counterparts are compensated for spending part of their time on research activities, polytechnic instructors have a full teaching load. As a result, experts drawn from the classroom to participate in research must be backfilled. This wouldn't be a barrier at all if federal research funding programs had faculty release provisions for those who need them. Instead, because programs are built for the university model, my member institutions are actively disadvantaged right from the proposal stage, which means the winning conditions are missing.
Barrier number three drives to the heart of the matter. The vast majority of federal grants are built on an application process geared to individual principal investigators. Applications are often evaluated based on the background of an individual who is preparing the application. For example, it's relatively common in grant competitions to judge the merit of a proposal by the quality, quantity and significance of past experience and publications.
In the polytechnic context, applied research is a team effort. While research projects are often led by faculty members, activity is delivered out of the office of applied research. While this approach does not diminish the quality of the research, it raises challenges to participation in a system that is based around the expertise of a single individual. While peer review and research excellence are absolutely important criteria when awarding federal research funding, they aren't sufficient on their own. The current system has a bias toward research that is done in the same way, by the same kind of researcher, as it was well before polytechnics and colleges had even begun developing their research capacity, and this is quite restrictive.
To fully utilize Canada's ecosystem, the process by which funding is awarded must be reviewed and reconsidered.
Thank you very much for inviting me today, and I look forward to your questions.
:
Thank you, Madam Chair.
[Translation]
Good afternoon, everyone.
Members of the Standing Committee on Science and Research, thank you for inviting me to speak as part of your study on the impact of the criteria for awarding federal funding on research excellence in Canada.
[English]
My name is Alison Evans. I am the CEO of Research Canada, which is an alliance for health discovery and innovation. Our 130-plus members include hospital research institutes, pharmaceutical and life sciences companies, med-tech and AI start-ups, post-secondary institutions, provincial health organizations and health charities. Through Research Canada, we work together and with national partners, stakeholders and governments on shared interests. They include the vision of a vibrant, productive, world-leading health research and innovation system, one where better outcomes are pursued by teams in hospital research institutes and corporate and academic labs and through clinical trials and at the bedside. Such a system is critical if we are going to address the declining health and wealth of Canadians and reassert this country on the global stage.
Our most complex societal challenges increasingly require novel solutions and approaches that bring together many perspectives from diverse domains. Research is more international, collaborative and interdisciplinary. We need to respond by continually improving a research support system that exemplifies excellence and integrity; fosters collaboration amongst researchers and entrepreneurs and institutions and companies; strengthens our ability and capacity to respond to health, environment, economic, demographic, energy, technology and other opportunities and challenges; helps us attract, support and retain top talent; recognizes that knowledge is created by investigator-initiated research today and that this same research will help us drive the mission-driven needs of tomorrow; and takes calculated risks and uses evidence to inform continuous improvement.
We welcome this timely dialogue on how we fund research and today’s conversation about research excellence in all its forms. Of course, it's a broad term, and thus necessitates comprehensive and continual consideration. It encompasses how research is designed, conducted, assessed, funded and used. It's context-specific, and acknowledges that flexible, tailored approaches are required. It adjusts as new evidence comes to light and as science and society evolve.
In Canada, upholding research excellence is an aspiration and responsibility held by many federal granting agencies and other funders. Using independent, competitive, structured merit review processes guides the decision-making. In the case of health, these processes help strengthen our entire “research to impact” pipeline, from discovery, applied, mission-driven and translational research to the study of health care delivery itself; to the implementation of novel and life-saving treatments and processes, including AI, into the health care system; and to our preparedness for future pandemics and other health emergencies.
We're fortunate in Canada to have many who protect and promote research excellence for which Canada is globally renowned. Through the work of the advisory panel on federal research support and those that came before them—granting agencies, governments, other funders and countless stakeholders—we are collectively trying to seize the moment that's before us to modernize our research and innovation system to ensure greater agility, responsiveness and impact for all.
In this changing world, unfortunately, Canada is falling behind. Talent, innovation and competitive gaps are widening between Canada and other advanced economies. Our declining health, prosperity and quality of life must be addressed in new ways. We must use long-standing strengths and our growing prominence in areas like AI, clean energy, biotech and life sciences; our highly educated population; and our approaches to excellence to reassert ourselves globally and drive economic growth, prosperity, job creation and outcomes that matter for all Canadians.
[Translation]
I would be pleased to discuss this further.
I'm now ready to take questions from the committee members.
:
Madam Chair and members of the committee, thank you for the opportunity to present my views on this important issue today.
I'm a co-founder of Retraction Watch, a non-profit news organization based in the U.S., which reports on scientific misconduct and reactions to it by universities, publishers and funding agencies, among other issues. We also maintain the world's most comprehensive database of scholarly retractions for Crossref, another non-profit that acquired the database in 2023.
I'm also a distinguished journalist in residence at New York University's Arthur Carter Journalism Institute and editor-in-chief of The Transmitter, a publication covering neuroscience.
I base my comments on 14 years of reporting and writing about relevant issues at Retraction Watch.
Last year, there were well over 10,000 retractions from the scholarly literature. Of note, just dozens of 2023's 10,000-plus retractions involved researchers affiliated with Canadian universities. While that 10,000 figure was an 88% jump from 2022, the growth reflects an overall trend since the turn of the century.
Increased scrutiny of the literature is largely responsible for that rise, but 2023 revealed that a significant portion of what is published every year—conservative estimates are at least 2%, although it is likely higher—is produced simply to game the metrics that determine career and institutional success.
I wish to quote Dan Pearson, who studies how researchers can engage larger audiences: "Academic publishing is a game. And a lucrative one for those who win."
That gaming is in large part being carried out by what are known as “paper mills”—shady organizations that sell papers to researchers desperate to publish lest their careers perish. They also sell authorships, and our reporting has revealed that some of these companies even bribe editors to publish papers by their clients.
All of this is an entirely predictable response to standard incentives in academia. Universities around the world demand that researchers publish a high volume of papers—as many as possible in prestigious journals. That's because influential international rankings, such as those created by Times Higher Education, prioritize citations, which are, of course, references to a researcher's work in subsequent papers.
Citations are very easy to game, as paper mills know. Because citation counts are an oft-used metric for judging the quality and impact of research, citation cartels ensure that members' citation counts rise. All of this means that there is an uncomfortable truth behind the press releases, advertisements and other material universities and countries use to crow about their high rankings. These rankings are based on a house of cards built with a stacked deck.
With good intentions, it's easy for governments and funding agencies to fall into the same trap. After all, we all rely on heuristics, apparently validated shortcuts, if you will, to make decisions, particularly when faced with a large number of choices, but citation heuristics pave the road to bad behaviour and retractions.
China offers a lesson here. Their publishing incentives have been among the most extreme in the world, and while they do top some impact and innovation rankings, they also top a ranking they probably wish they didn't: more than half of retractions in the world are by authors affiliated with Chinese universities.
I was therefore pleased to learn, as the committee heard from Jeremy Kerr last week, that five major Canadian research funding agencies have signed on to the Declaration on Research Assessment, also known as DORA. Others have suggested that instead of counting papers and citations, funders examine a small selection of papers chosen by researchers being evaluated: In other words, quality over quantity.
Such changes will require effort and resources, but progress in research is worth it.
Thank you for your time. I welcome the opportunity to expand on my comments during the Q and A with members of the committee.
:
Thank you, Madam Chair.
I want to thank our witnesses for being here today.
Mr. Oransky, I'd like to begin my questions with you.
As we heard in your opening statement, Retraction Watch reports on scientific misconduct and retractions by many organizations. I have many questions about many of the things you've said, but one thing I wanted to draw to the committee's attention was an article on your website titled “Psychiatrist in Canada faked brain imaging data in grant application, U.S. Federal Watchdog says”.
I read in Retraction Watch that Romina Mizrahi received grants from the Canadian Institutes of Health Research for nearly three million Canadian dollars and worked on this in the Department of Psychiatry at McGill University, where it appears that none of her papers have been retracted yet.
My question is, by rewarding the pursuit of publishing, how are we ensuring that taxpayer dollars are delivering value?
:
I want to sort of make a pitch in response, if I may, that perhaps as this committee—and, of course, the government—is considering how to look at research, how to examine it and how to assess it, it might also consider some contributions and funding that are specifically delineated for looking at problems in the literature—in other words, sleuthing behaviour, which is what is mostly done right now by volunteers, even though publishers and universities benefit from their work.
If you look, for example, at the Office of Research Integrity, which in the U.S. is responsible for oversight of research at the National Institutes of Health and some other agencies, its budget is about $15 million, versus the NIH's budget of roughly $48 billion, both in U.S. dollars.
I would have you all maybe consider whether or not there is a way to use some of the funding that is now being used to fund research directly, to fund analysis of that research and to actually keep a check on it. I believe the public will be much more confident in what it reads about what it's funding with its tax dollars and what eventually, in many cases and certainly in the case that you mentioned, could contribute, if it's done properly, to better health and better outcomes.
Thank you to our witnesses for being here today as we continue with our study of how we award federal research dollars.
If I can, in the couple of minutes I have, turn my attention to Ms. Evans to talk a bit about Research Canada and health discovery. Your organization is dedicated to advancing health research and health innovation, and you mentioned that you have 130-plus members, I guess through collaborative work and so on with various partners.
In the health field, how do we ensure that those funding decisions are independent from any interference, whether it's political or otherwise? I will put it in the context of the period since the pandemic, or maybe even before that, but we've certainly seen it since. Have you seen a rise in the distrust of researchers and of research generally? What would you say the role of a parliamentarian should be in that? Again, I'm asking questions relevant to your expertise and experience in health, and to your role in that.
:
I'll start in and hopefully get to some points that are important to you.
First of all, I really welcome the fact that we have a parliamentary health research caucus that is non-partisan. It has leaders from all parties. It gives us an opportunity to bring the latest health research and innovation topics to Parliament and to policy-makers. In fact, we consult to hear what some of the most important themes are on the minds of members and of the constituents they represent.
I think it's safe to say that health is on all of our minds, all the time, whether it's our own, our loved ones', our colleagues' or the people we represent. It's a great privilege to be able to bring the latest. I think that the pandemic actually, in many ways, increased Canadians' focus on the importance of health research. We had a very incredible response with mission-driven and rapid response research. The government worked across departments with companies—huge multinational pharmaceutical companies—and with innovation and research hubs at universities all across the country. Every part of this country was involved in that response. We've come out with lessons learned that will make us even stronger the next time.
I would agree with you. Sitting at home when COVID hit, just like most people were sitting in their homes, I was a provincial politician at the time. There were a lot of, not discussions, but phone calls or emails with constituents at the time. Certainly, from reading social media, and some papers that were still left at that time, there was the need for Canada to do more, to do better or to have its own research labs, and we wondered how this could happen and what we could ensure for the next time.
When we move forward a couple of years later, when the pandemic was over and when everybody was out of their homes and their basements, and away from their screens and so on, it seems to me that the conversation then shifted quite a bit. There were still people thinking that, but others were sort of a bit cynical about research, and the discussion, for many, also just shifted.
How would you help us address some of that? Would you say there is any value, or what would be the value in research projects that might contribute more broadly to this type of research?
:
I think that it depends, again, on where these discussions are happening. The discussions I'm involved in every day are looking at things like how artificial intelligence is going to revolutionize the delivery of health care. When we think about the incredible costs of health care delivery, about the budgets of the provinces and about the way we're trying to revolutionize things, there's actually quite a bit of excitement around the science.
I can think of a huge announcement in Ontario, just a week or so ago, between Roche Canada and Invest Ontario, which is going to see more than 250 new jobs created there for clinical research. I can think of researchers we've lost to American universities, and they are really hoping to come back to Canada and hire Canadians into their highly technically advanced manufacturing industries.
I think there's a lot of excitement. We have to balance our questions and our healthy debates by also applauding the excellence and the incredible work that is going on, and by allowing ourselves occasionally to be excited about the future we're building.
I think there is a direct relationship between gaming these metrics, the h-index you mentioned and other similar metrics, and the production of what I would certainly call sloppy research. In other words, it's the overproduction of research.
There is a tension between quantity and quality. I think the system has so overemphasized and so over-rewarded quantity that quality has suffered a great deal. We often say at Retraction Watch that fraud, misconduct and sloppiness are all born of the same mother, and that mother is the pressure to publish.
Now, for some researchers, and I would argue most researchers, that simply means pushing harder and trying to do better work. For some small percentage, although I don't think it's as small a percentage as a lot of researchers, scientists and policy-makers would like to think it is, that means people commit fraud, turn to a paper mill or in some way fudge their results.
I think this is a big driver of problematic research.
I'm going to turn to Mr. Oransky as well, right off the bat.
Coincidentally, today I just got a news notification on my phone from the journal Science about yet another fraud in the science world around peer review. Hackers got in and were pretending to be scientists writing favourable reviews of things. It is a big problem.
As I mentioned in earlier testimony, there are between one and eight million papers published every year. It's an absolute tsunami of papers. I think this has increased dramatically in recent years. You'd probably know exactly that rate. I know colleagues of mine who are just refusing to review papers anymore, because it could end up being all they do.
I guess you've been talking about some of the ways we can get around this and some of the ways we can try to reduce this problem. Part of it, as you say, is this pressure to publish quantity and maybe game the system for the quality.
We've all been talking about DORA here and there. I just wanted to maybe give you some more time to speak to that initiative, how it works and how we perhaps should be using that more than other ways of measuring the quality of science produced.
I want to be clear that while we have reported on DORA and I'm familiar with it, I in no way speak for DORA or any of the signatories.
It's essentially a manifesto. There's actually another one called the Leiden Manifesto, which does something a bit different, but is getting at the same problem. It's looking at what is known as bibliometrics—I think that term came up previously in this hearing—and whether or not that is a good way to measure or to assess research.
I would also note that there are a number of very good bibliometrics scholars in Canada and around the world, but particularly in Canada, Vincent Larivière in Montreal has done a lot of important work in this area. I might commend his work and perhaps his testimony to you in the future, if he hasn't already.
In a short period of time, it's difficult to really go into detail about DORA and others, but the general idea is that other metrics, if need be, or just other ways to assess research—we heard about some of those previously on this panel—should be considered. For example, impact can be measured by whether or not research makes a difference. In other words, it literally has an impact. Has it been cited in policy documents? Has it led to change? Has it led to better outcomes?
This is a very downstream way to measure the impact of research. I would also argue there are other ways to measure whether or not a particular piece of research, or a group of findings in general, has contributed to what we know about the universe, biology or neuroscience. If we need to replace metrics such as the impact factor, and again, we all need heuristics and we all rely on heuristics, those are all ways to do it.
:
Something that blatant, in other words, where we actually see the evidence for it, is still fairly rare. What is much more common and, probably, far more common than anyone would like to admit, is pressure on authors, sometimes even from editors of journals who want to increase their impact factor—something we just heard about, of course. They don't quite come right out and say this, but review recommendations go back to the authors with letters that say, “Well, we would really appreciate it if,” or “It would be better if you cited a paper from our journal,” or something like that.
Then it gets even more complex, and a little harder to track, when you have these “citation cartels”, a phrase I used in my testimony, that people actually organize as citation rings. Again, we don't know exactly how often it happens, but if you were to speak to a bunch of researchers, I doubt that any of them, if they were being honest—and I would like to think they would be—could say that they've never had an experience when someone in some way pressured them to cite their work, whether it's a reviewer, an editor or even someone else.
This, again, is a natural outgrowth, if you will, a completely predictable response. People just respond to incentives, to knowing that you need your h-index to be higher. What's a good way to do that? It's to make sure that you are cited more often.
:
I'll say something that, perhaps, would have been controversial some years ago: There should be more retractions from Canada. I don't mean any disrespect to your great nation. There should be more retractions from the United States of America. I could go on. The fact is that it's good news we're finding them. There are fields and, in fact, journals sometimes, that.... We heard the case earlier, in the previous panel, of Jonathan Pruitt. It's pretty bad news when this misconduct happens. I believe the number of retractions.... I could double-check our database again. What's worse news, though, is how long it took to adjudicate. That's one lesson from that story.
However, here's some good news: A group of researchers from around the world got together and said they don't want people like Jonathan Pruitt to do any more collateral damage than they already have. This led to a lot of retractions, but also to protection for the researchers who were victims of Jonathan Pruitt.
I think all of these stories are complex. I am frequently asked, “What about this field? What about that field? What about this country?” I say that, if there are fewer retractions, it's because people aren't looking. I trust things when I see more retractions. Maybe that's easy for me to say, given my work, but I actually think that's an important way to think about it.
:
That's a good question. I'm going to answer hypothetically, obviously.
I think it's actually trivial, if you want to, for example, fund a large body of research. I'm not even sure you would need the funding. You could do other things to make sure that work is published, cited and eventually ends up in policy documents, guidelines and regulations.
I'm not talking about any particular area or field here, but I think it is not that hard to move a field in a particular direction, based on pushing on publishing levers.
:
Thank you very much, Chair.
I'm going to try to get questions to all of you, but I have five minutes, so we'll see where we go from here.
Ms. Watts-Rynard, I really appreciate your bringing up polytechnics and the difference between how research entities measure university research compared with polytechnic research. Where I'm from, the polytechnic is Nova Scotia Community College. I appreciate your making the distinction about the criteria used and how these need to change.
MP Lobb used an example of research that can help move bricks and construct different things, and I think that's of value. I also think the humanities have value. However, I think they're apples and oranges. I'm not sure what the purpose is behind, for example, studying unpaid work by women in Bogotá, but I could take a guess if I drilled down deeper. It might be to do a comparison between Canada and Bogotá. That's just an assumption on my part. It might be to learn best practices. Again, I'm assuming, because I haven't drilled down on it.
However, I want to go back to the applied research side.
I think there's a huge sandbox for us in terms of applied research. You talked about some of the recommendations, but I want to drill down.
What is the most important recommendation—and we'll have many—that you want to see when you open up our report as it relates to polytechnics getting important investment to help Canadians and—it's okay to say this—the world?
:
If I understood it correctly, unfortunately, I think your question in some ways answers itself, in the sense that the rich become richer and the powerful remain more powerful.
An impact factor, just like anything else or any of the metrics that are used, can be used to essentially cement and consolidate funding and, I could even argue, power—but that is actually another reason why these are such problematic metrics.
Again, researchers at those institutions tend to be cited more often anyway, and then they can just double down on that, so they have no particular incentive. I would credit those institutions that are wealthy in terms of funding and that have signed the declaration or taken other measures, because I think it signals a real intellectual honesty and a willingness to change that might benefit everyone instead of just continuing the Matthew effect.
:
It's a very interesting idea, and I've read some of the coverage of that as well as a few studies of what has happened after that.
I think it's in the early days, so like everything else, we should be empirical. We should ensure that a pilot program is really tested and that we look carefully at the evidence with a clear eye.
On the other hand, we've also seen evidence that peer reviewers—and I'm talking here about grant reviewers, in other words, grant peer reviewers—often don't do any better if you were to measure impact later on, or even citations, as flawed a metric as that is. They don't do any better than a random selection. That, to me, is somewhat troubling. However, it would also argue for perhaps trying that as a system, at least for maybe some percentage. Maybe it's a pilot program.
I also want to recommend the work and the writing of someone named Stuart Buck. He's at the Good Science Project, and he has done a lot of really smart thinking about a lot of these issues, in particular about grant review—
:
Research across all domains is sort of a global endeavour, and researchers collaborate beyond geographic boundaries as they attempt to answer very complicated questions. That is why I mentioned that the way we consider research excellence needs to be broad, and it needs to be tailored and evolving. It changes as science does and as society does.
I'm really pleased in the case of health, when we think about the populations that health research and clinical research in particular are meant to serve, that clinical trials and other types of research think about the people we want to have as beneficiaries and about the impacts we want to have. That's really important.
I think the same principles apply to the kinds of things we have talked about already at this meeting. We need to think about ethics. We need to think about integrity. Openness has come through as being a very important theme today. It's important for people to understand that what they're doing, and their results, will be open and public so that these can be scrutinized and can be shared, and people can build on what's being learned.
We also need to make sure that the knowledge is being translated. This was mentioned by some of my fellow presenters today. There is a need to move ideas through to commercialization and into changes that we then put into the health care system or that are taken up by companies, and there is a need for Canadians to get access to these things in a timely and affordable way.
I hope that answers your question.
Thank you very much to our witnesses. If there's something you wanted to mention and didn't get a chance to cover in your testimony today, you may submit it in writing to the clerk.
I want to thank you for your participation.
I do want to inform the committee that I will be presenting to the House, this Thursday, our report on funding of post-secondary institutions.
Also, Thursday, December 5, is our deadline for submitting our witnesses for the antimicrobial study.
We'll meet again on Thursday.
Is it the will of the committee to adjourn this meeting?
I'm sorry. Yes, Mr. Blanchette-Joncas.