CADTH in Conversation with…

Professor Gillian Leng CBE, Chief Executive of NICE

Episode Summary

This week Gillian Leng CBE, Chief Executive of the National Institute for Health and Care Excellence (NICE), joined Suzanne McGurn for an insightful conversation on what the future holds for the United Kingdom’s influential NICE. Gillian discusses NICE’s new 5-year strategic plan and how it will impact the assessment of new drugs, devices, and diagnostics in the United Kingdom and beyond. She also shares her thoughts on how the pandemic is realigning health system priorities and influencing how NICE supports decision-makers through a period of intense uncertainty. *This episode was recorded with a live audience.

Episode Notes

This week Gillian Leng CBE, Chief Executive of the National Institute for Health and Care Excellence (NICE), joined Suzanne McGurn for an insightful conversation on what the future holds for the United Kingdom’s influential NICE. Gillian discusses NICE’s new 5-year strategic plan and how it will impact the assessment of new drugs, devices, and diagnostics in the United Kingdom and beyond. She also shares her thoughts on how the pandemic is realigning health system priorities and influencing how NICE supports decision-makers through a period of intense uncertainty.

*This episode was recorded with a live audience.

 

Find out more about what CADTH does.

 

Episode Transcription

 

Speaker 1 [00:00:02] Hi, I'm Suzanne McGurn, president and CEO of CADTH. Welcome to CADTH in Conversation with…, a show where I talk to Canadian and global leaders in health care value assessment and health technology management. Today, I'm so pleased to welcome Professor Gillian Leng. Gillian is the chief executive of the National Institute for Health and Care Excellence, better known as NICE. Gillian is widely admired for her contributions to England's National Health Service and in 2011 was named a Commander of the Order of the British Empire. For those unfamiliar with NICE, it is England's national health technology assessment agency. It is globally respected for its health technology appraisals, evidence-based guidelines, scientific methods, and so much more. When it comes to health technology decision-making in Canada, a common question I hear is, what has NICE recommended, or how is NICE handling this issue? Well, today we'll find out. I'll be talking to Gillian about NICE's new five-year strategic plan and what it means for the future of value assessment in a world that has changed dramatically because of COVID-19. 

 

Speaker 2 [00:01:11] Gillian, welcome. Thank you for being my first guest. 

 

Speaker 3 [00:01:15] Pleased to be here. 

 

Speaker 2 [00:01:16] You know, I have a long list of questions for you. The work that your organization does is of interest to us here in Canada and around the world. I'm hoping that we will make it through them all and then we'll open things up to questions from the audience. So let's get started. I have to admit, I feel a kind of kindred spirit with you, as you and I have something uniquely in common. We both became executives of respected health technology agencies amid a global pandemic, you recently passing the one-year mark and mine just around the corner. I guess my first question is that I'm curious about what drew you to this job at this point in your career. And I suspect the first year hasn't been quite as you planned. What would you like to share with us about what that first year has been like?

 

Speaker 3 [00:02:02] Good. Good point indeed. My first year wasn't at all as I planned. Most people taking on a chief executive role will have this first-100-days concept and think about what they're going to do in those first hundred days, and it didn't turn out at all as I planned. However, first of all, you asked me why I took this job at NICE. And the short answer to that is that I really care about using evidence to support best practice in health care. And that goes back a long way, to when I was a junior doctor and I recognized there was quite a lot of variation in the care that was being provided. So I really care about that. I've worked at NICE for many years, and I wanted to be able to take the organization forward both in its work on guidelines and HTA. So that's why I took the job. But the first year was rather surprising in that it was all of the work that we suddenly had to do to respond to the pandemic. And that in many ways has been quite a positive catalyst. Perhaps we'll come on to it shortly, but some of the things that we were asked to do as a result of the pandemic, in terms of guidelines, et cetera, were really positive. 

 

Speaker 2 [00:03:11] Well, I'm certainly glad to know I wasn't the only one whose one-hundred-day plan, as originally outlined, didn't come to fruition. And you alluded to your strategic plan, so maybe let's spend a little bit of time talking about it. I've certainly followed your organization's work during the past year as you've been creating your new strategic plan. No small task, as you've just said, during a pandemic and new to this role. You launched the plan a little over a month ago, virtually, of course, to what I believe was a live audience of about three thousand individuals. And I was one of the Canadians up and watching in the very early AM, just like a royal wedding. I will talk to you more about some of the specifics as we move through the interview this morning, but I wonder if you could share what the highlights of that plan are for you. 

 

Speaker 3 [00:03:58] Very happy to share some of the highlights. The structure of the strategic plan is formed around four pillars: rapid and responsive health technology evaluation; dynamic, living guidelines; uptake of the guidance that we produce, which is obviously absolutely key, because there's no point producing anything if it's not used; and the future use of real world data in the work that we do. Within those four pillars there are lots of different elements, and perhaps one of the highlights is the piece that's about being rapid and responsive, because the reputation that we've built over the last 20 years is about being the gold standard. It's about being robust and solid and reliable, and those things are clearly very, very important. However, with the variety of new technologies, new types of medicines, and digital products that we're expecting to come forward over the next few years, we need to be more flexible and adaptable in the way that we look at those products. That's easy to say, flexible and adaptable, but it's quite a big change to the way we work. And part of working in a different way is about using real world data. When NICE was established 20 years ago or so, most of the research was formal research; that was the way things were done in those days. There has been a gradual, real increase in the availability of real world data since then, and some products, particularly digital ones, inherently collect that data as they're being used. So we need to adapt to be able to take that into account in our work. So that's a couple of things about how we work. But there's also what we do, and I'll just pick out a couple of things. One is to develop further our approach to evaluating digital technologies, because there's such a lot of them coming through, and we need to be able to help the system to see what represents good value for money. We are just opening something we call an Office for Digital Health, which is not a walls-and-a-door office, obviously; however, it is a metaphorical front door, so that people interested in what we're doing around evaluating digital products know where to go and who to ask about how we're taking that forwards. And one key part about adoption of our recommendations is the rapid transition from an assessment of a technology into a guideline. Guidelines tend to be what practitioners use in their day-to-day practice. And if we have a long lag between something being recommended in a piece of technology assessment guidance and it getting into the guideline, it doesn't help uptake. So we want that to be a seamless transition into a guideline that is a living guideline rather than an intermittently updated guideline. 

 

Speaker 2 [00:07:08] I was very intrigued by your strategic plan's emphasis on reshaping relationships with industry, patients, and patient groups. Can you talk a bit more about what you'd like to see these relationships evolve into, and perhaps reflect on whether there was any concern, as part of your strategic development or since your launch, that a more open relationship with these groups could have a negative impact on the robust independence that NICE's work is known for? 

 

Speaker 3 [00:07:37] Both industry stakeholders and patients and patient groups are core to the way we work now. During every appraisal there's engagement with industry and engagement with patient organizations. So that's all good and generally works well. Clearly, sometimes there's tension, depending on the outcome of the assessments that we're doing. However, in relation to the reshaping of that relationship, it's more about having an ongoing strategic relationship with both those groups. So, for instance, we are aware that industry colleagues have a huge amount of knowledge and insight into things that are being developed, things in the pipeline. So to help us make sure we are ready to deal with some of those things, we feel we should have an ongoing strategic relationship with core industry partners to help us understand what's coming down the track and to be ready to address it. That's one example. Industry also does a lot of work with real world data, and if we want to be able to evaluate it, conversations and links with them about what they're thinking, what they might be able to produce, and how it might be shaped are clearly very important. And with patient groups, clearly, again, there are strategic conversations as well as the detailed ones around products that we're actively assessing. And again, there is an element of data and what matters to patients in the way that we might be looking at research, and at data, in the future. At some point in the future, I'd like us to have a facility on our guidance pages where we can take ongoing feedback from patients and patient groups. That's how we're all used to engaging with the Internet, isn't it, really? You can review this, that and the other. It would be good to be able to have some real-time engagement with patients and patient groups. But there's one more particular aspect that I could just mention, and that's a project we're calling NICE Listens. We used to have something called the Citizens Council to get a bit more public perspective on some of the ethical considerations around our work, and that hasn't been in operation for a while now. So we're reinvigorating that engagement with the public, as opposed to specific patient groups, over the next few months. We've got some key questions we want to test, and the first piece is going to be around health inequalities and what the public thinks about that, what they think the ethical considerations are, and, of course, how that might link to NICE's work. So lots of things going on. 

 

Speaker 2 [00:10:22] And Gillian, you talked a little bit about real world evidence already, and it is very much growing in our ecosystem. It features prominently in your strategic plan, almost, you know, as an underpinning of your fourth pillar. It's something that we're really trying to turn our attention to in a thoughtful and proactive way here at CADTH. I guess my question is, what will real world evidence and the lifecycle approach to value assessments look like for NICE in the future? I know it's already a function, similar to others that you do now, but you're envisioning it being something more. Can you talk a little bit about what that would be? 

 

Speaker 3 [00:10:59] The easiest place to start is probably with the current Cancer Drugs Fund that we have here, because that is a way of gathering real world evidence for particular cancer drugs that we then take back and re-evaluate. Now, that's quite a managed approach to real world evidence, by which I mean we do an initial assessment of cancer drugs and, if they look promising but the evidence base isn't complete, they're approved and we recommend data collection as they're being used on real patients, and then we look at that. So that's quite managed. And that managed approach to collecting real world evidence is something we'd like to expand to other scenarios, as we have an Innovative Medicines Fund replacing the Cancer Drugs Fund, for instance, and for medical technologies too. Increasingly, we think that that sort of promising-but-not-proven approach would work well for medical technologies, with managed collection of real world evidence. However, of course, there are a number of scenarios where the real world evidence might be the first thing that we see, rather than something we've particularly asked for in a defined context. So there needs to be quite a bit of work done in the background, which we initiated a year or so ago, to identify what sort of standards and what sort of quality of real world data we would expect in order to be able to consider it. And the interesting question is, when might real world evidence stand in for a randomized controlled trial? There are lots of controversies around that, such as how you take the bias out of the real world data, but these are really important questions, because some of the technologies that are coming through will be inherently collecting that data, and we need to be ready to use it in a robust way. So it's quite exciting and there's lots to do. We've set up a new directorate at NICE called the Science, Evidence and Analytics Directorate, and their main role really is helping us with this, and with the analysis of data that's been collected in all sorts of different ways. Just yesterday I was reading in the news about Google partnering with an international health care provider to collect data from patient records and using that to develop algorithms for patient care. And you begin to think, how does that all come together? What does it mean and how do we assess it? So lots to do. 

 

Speaker 2 [00:13:37] I think your comments about it being both exciting and filled with controversy are key messages. And there's just so much opportunity. The data is there, we've been collecting it for years, and now we have to figure out how to harness it to help us do better assessments and make better recommendations. In your opening remarks, you talked about COVID and the pandemic upending things. The other thing you talked a little bit about was how we do our job. When I first started, at one of the first events I went to, the fundamental question being asked about the methodologies and processes that we all use to do our HTA was about the trade-off between quality and speed. But as I listened to your launch, I actually thought I heard something different. I thought I heard you reflecting on a future where speed and rigor would go hand in hand. From your extensive experience, what should decision-makers expect from us in the very near, hopefully post-COVID world in that sort of quality-and-timeliness environment? 

 

Speaker 3 [00:14:38] It's really important, isn't it? We live in a world where things generally seem to be speeding up. People don't want to wait. They expect to order something online and have it come not just tomorrow, but today; that's certainly what it's like here. So people expect things to happen quickly. And the pandemic very much did speed things up in the UK. It dramatically increased the speed at which we managed to get the randomized controlled trial in place, and then going from the trial evidence, to evaluating it, to getting information out and getting it used was dramatically faster than it would have been in a pre-COVID world. And this was part of a collaboration that is still ongoing, that runs from the researchers through to the regulators to NICE, to assess products and get them into the system. So we want to carry on doing that. We want to keep this compressed timeline. However, it means working harder at the collaboration, and it means thinking about processes so you can do things in parallel rather than in sequence. And you mentioned robustness of process. Clearly, people expect to be able to trust us, and so if we're making headway on speed, we have to be very careful that we are not shaving things off our assessment and putting anybody at risk, 

 

Speaker 2 [00:16:12] but it really is an interesting conundrum. There's so much opportunity for evidence, I'd say generally, in what we've seen. One of the things that you also touched on earlier was the issue of inequities in health care. I think we all know that they've been there, and here in Canada, at least, we've talked about the inequities for years, but the world events of the last year plus the pandemic have certainly put a spotlight on them in a way that we really can't ignore. And I really do feel the conversation is changing. Your strategy emphasizes prioritizing work to reduce health inequities. I guess I'm interested in what that looks like for NICE over the next five years. And where we're often responding to what people bring to us, devices or drugs, how do we make sure that those developers are looking ahead at how to emphasize issues of equity? So no small task. And I know that there are two or three questions coming in about the issue of inequality, so I'm really interested in your vision on this topic. 

 

Speaker 3 [00:17:17] It's a really important topic, isn't it? Aiming to reduce health inequalities through NICE's work is one of our core principles; it's been there for many, many years, and it is reflected in our core methods and processes now. However, with this renewed emphasis on health inequalities because of what we've seen during the pandemic, we want to take another look and see if there's more that we can do. It's also a priority of the UK government, which describes it very clearly as the leveling up agenda. That's an important point to make, because it indicates we want to reduce health inequalities by improvement rather than by reducing everyone to some sort of middle level. So what are we doing to see whether there's more we can do? There are a number of things that we're looking at. Starting at one end, it's checking what topics we're looking at, be that a technology or a guideline: are there things that we might select over others because inherently they are likely to reduce health inequalities? Then there's a complicated bit around our methods and how we might adjust those, or not. For instance, would we ever apply a weighting because something had a positive effect on inequalities? We don't have that in as a factor at the moment, but we might. And then, of course, at the other end, there is how things are put into practice, which we don't have a direct responsibility for. However, we know it clearly can make a difference, because if things that improve health are implemented better in, say, areas of affluence or areas of better health, that will make inequalities worse. So we are working with system partners to think about that aspect of things, how things are implemented. On the bit in the middle, around our methods and processes, that's quite tricky. I mentioned NICE Listens, which is about engaging with the public, but we're also thinking we need to commission a piece of expert research around this, because it's complicated and we don't think there is an obvious answer. It would definitely be worth having some big brains looking at it on our behalf, to see if we might change what we do and how we do it. 

 

Speaker 2 [00:19:41] Certainly one of the side effects of the pandemic is that there has been increased cooperation. I've seen it across Canada and I've seen it internationally. But sometimes that increased cooperation comes along with some confusion as well: what should the public think when we're making different determinations on the same products? And that confusion extends across a wide variety of things, whether it's vaccines or pharmaceuticals or technologies. So in your view, what are the prospects going forward for more alignment and cooperation among agencies? Do you have thoughts about where we should potentially consider pooling our resources or leveraging our relationships differently, in an effort both to respond to the new demands that are being placed on us and to maybe increase the consistency across our various organizations? 

 

Speaker 3 [00:20:34] I think increasing collaboration is going to be really important for the future. I've talked already about the volume of new technologies that we know are in development, whether that's medicines or medtech or digital. There are a lot more things coming through, and I can't see any of us having the capacity to really keep up with all of that without actively collaborating. There are different levels of collaboration, of course. At the pinnacle, I suppose, is a full-blown joint piece of work where things are jointly done and jointly owned. I think in most of the world we won't get to that level, and that's because health technology assessment is inherently about the priorities of local health care systems, how they wish to spend their money, and how the health care systems are organized and funded. All those things come into play, and those things are probably why we've seen the variation in adoption of vaccines. However, beneath that pinnacle of collaboration, there are other things that we really can support, and that's things like the methods that we use: developing some of these difficult ways of looking at digital products, perhaps coming up with more common approaches to evidence assessment, common frameworks, and common submission forms, say, for industry and others to use to make it all more efficient, and possibly sharing some of that data. If we begin to align more around how things are done and how the data is collected, it's much more straightforward to share and use that data. Even if the actual value-based judgment is made more locally, we could at least share some of the underpinning resource. 

 

Speaker 2 [00:22:24] And I'm really looking forward to the opportunities that our organizations will have to work together in the coming months and years. I'm going to turn my attention to some of the questions; there are some great questions that have come in to us. One of them harkens back to your earlier comments about digital evolution. The question that's been asked is, do you think there are critical components of the assessment of digital health technologies that differ from what we traditionally think of for a traditional device? And any thoughts about the digital future? 

 

Speaker 3 [00:22:57] It's really complicated, I think, is the short answer to that, and I won't claim to be an expert. There is an awful lot of work going on at the moment within NICE and with other partners in our health care system. It's complicated because digital products work in all sorts of different ways. The categorization of them has been quite challenging, and that's a piece of work that NICE led a few years ago: to set up a framework for different types of technologies so that, as an HTA body, we won't need to assess all of them. There are those with the highest risk, those that are most closely related to health care, that we need to focus on. But then there are different ways that they work. As I say, they can evolve, they can learn, they can have AI built into them. They can be linked with medicines, they can be linked with monitoring, and they can collect data. So it's complicated because it's not a fixed thing; it's not a chemical entity like a drug. And it was a couple of years ago now, so things will of course have moved on, but two years ago we held a really interesting workshop with a range of academics and, of course, a range of views about the evaluation of digital technologies. Some people said that they were very similar to what we do now and that we should stick with the randomized controlled trial model for those things that were really important, and others said, no, no, no, there are more fluid approaches that we might be able to use, building on real world data. So it's those sorts of thorny issues that we are currently working on. 

 

Speaker 2 [00:24:33] I would agree. It's certainly not the same, but there's a lot of what we've learned over the history of HTA that I think can be brought to bear, and it's a matter of finding out how to make it all work well together. One last question, as we're almost out of time, and it has to do with the lifecycle management of technologies. I think we all recognize that the origins of HTA probably lie at the front end, at the point of access, where it served as an access hurdle. But over the last number of years, as lifecycle management has unfolded, I think the one thorny issue that we all continue to struggle with is late-stage technology: the decommissioning of technologies or pharmaceuticals. I think a lot of countries look to you and the experience you've had to date, and wonder if there's anything you would share, sort of key learnings about being able to manage through the lifecycle, including the decommissioning. 

 

Speaker 3 [00:25:27] Yes, decommissioning. That is the tricky piece of the puzzle, isn't it? There are lots and lots of ways that we've looked at that over the years. One way was to have a specific referral to NICE to look at things that were generally considered not effective and should be decommissioned. So somebody had done some prior thinking and asked NICE, why don't you look at this? Grommets were an example. Lots of time and effort were spent looking at grommets, and at the end of the day there wasn't a black-and-white answer. There was a, well, it's probably not useful in most people, but in this group it is. That's the example that sticks in my mind, but the other examples all came down to a sort of, well, yes, but, and it was quite a complicated piece of work. I'm afraid I'm going to talk about guidelines again, because it's guidelines where we've had more success in identifying things that shouldn't happen, and that's because the guideline gives you the overall picture of care in a particular pathway. So in future, let's hope that by rapidly moving a new technology into the guideline, the group of people that we will need to have as a standing committee for a guideline can say, well, if this bit's coming in here, that means we shouldn't need the other. And it's only by having that group of people, I think, who are experts and think about the overall pathway, that we will really be able to spot those decommissioning elements. It's not always straightforward to embed new technologies in the system and to get change to happen; however, it's also really quite hard to stop doing things. So at the moment we are identifying things to stop through our guidelines program, and then we're working with other parts of the system to help drive through a program of stopping those activities. 

 

Speaker 2 [00:27:24] Gillian, I can't thank you enough for the time you've spent with us today. We're almost out of time and we have to wrap up this conversation, but are there any final comments you'd like to share with the audience before we sign off? 

 

Speaker 3 [00:27:37] Well, thank you for the opportunity. It's been a great conversation. There are lots and lots of key areas of common interest between what we're doing in the U.K. and what's happening in Canada. And it's a collaboration that I'm really pleased to have in place. And I look forward to continuing to work with you. 

 

Speaker 2 [00:27:56] Thanks, and this has been a great opportunity. Thanks, everyone, for joining our podcast, and we look forward to carrying on with further episodes in this series later this year. Thanks, everyone. Have a great day. 

 

Speaker 1 [00:28:08] Thank you. That's a wrap on this episode of CADTH in Conversation with…. Thanks for joining me. And remember, you can listen and subscribe to our show anywhere you find your podcasts. To learn more about CADTH's role in Canadian health care and the work we do, visit our website at cadth.ca. See you next time.