Podcasts


Innovate on demand

The Government of Canada is constantly looking for new ways to train public servants in an innovative and modern way. The launch of a series of podcasts by and for the federal public service, focused on innovation, will allow public servants to learn and share "on demand," at their convenience, about innovative projects and processes. This podcast series will give a voice to the people who are implementing small and large innovations in their workplaces. Join us, either as a listener or as an innovation actor!

E3: Field of Broken Dreams

This episode tackles innovation in the public service by going back to basics. It sheds light on the meaning of innovation and examines the challenges of spreading innovation across government. Are we innovating for a solution before there is even a problem? Co-hosts Natalie and Valeria speak with guest Sean for a reality check on innovation and share their personal experiences with innovation in the public service.

Todd Lyons
Producer

Natalie Crandall
Project Lead, Human Resources Business Intelligence, Innovation and Policy Services

Valeria Sosa
Project Manager, Engagement and Outreach

Sean Turnbull
Project Lead, IN.spire Innovation Hub


Transcript

INNOVATE ON DEMAND - "Field Of (Broken) Dreams"

TODD
I'm Todd Lyons.

NATALIE
I'm Natalie Crandall.

VALERIA
I'm Valeria Sosa.

SEAN
And I'm Sean Turnbull.

TODD
And this is Innovate On Demand podcast.

On this episode of the program, we go back to basics. What does innovation even mean? Is it the magic bullet for any problem that some people make it out to be? Is it something we should even be attempting in the public service, given what's at risk? Or is it the latest shiny thing we feel pressured to chase? We were past due for a frank and fearless conversation. Natalie, Valeria and our guest were happy to oblige.

VALERIA
Welcome, Sean! So Sean, let's just dive right in. Why don't you tell us what you think about innovation?

SEAN
I'm a little exhausted of the term "innovation".

VALERIA
I'm shocked! [laughs]

SEAN
Yeah. Spoiler alert: it's going to be a little bit dark. So, every job I've had in government so far has had the word innovation in it. And I'm a little bit jaded when it comes to government bureaucracy innovation, based on those experiences. I do want to say from the start that I don't want it to be super negative and say people shouldn't try to innovate and that we should just continue along with the status quo. Obviously, there are many examples where we need to try new things and be more adaptive in government.

VALERIA
And you're also not just jaded when it comes to innovation. [laughs]

SEAN
No, no, that's just a general sort of approach to life for me.

VALERIA
[laughs]

SEAN
Truth. But I think with government innovation, we oftentimes try to import ideas and concepts from other places, mostly the private sector, especially the startup sector. So we do things like the cargo cults, where they would build airports out of cardboard boxes and then think that airplanes would land. We think if we build an innovation lab, then innovation will happen. It's the "Field of Dreams" innovation strategy. And (A) I don't think we take seriously the lessons that we could be learning from the private sector, and (B) I don't think that there are necessarily a lot of lessons to be learned from the private sector. We aren't the private sector. And we need to make a strong case about why we need to innovate in a certain field or with a certain tool before we actually do it. Just because it's there to try, and just because it's a new idea, is not good enough. And so that's one of my main takeaways from being in the government innovation space, both in terms of innovating with new interventions to work with the public, but also in terms of innovating within our own Corporate Services, and internal innovation in the way we think about and see the world.

VALERIA
Can you give us a for instance?

SEAN
Sure, blockchain is really interesting. It's a very cool tool. Distributed ledger systems can help solve real problems and are solving some real problems. And so we should know about blockchain, especially from a regulatory perspective. We should understand it. We should add it to our toolbox. But we shouldn't try to shoehorn in blockchain solutions to problems that do not exist. We shouldn't take a problem that can be solved with just an Excel file and try to do a blockchain solution just because it's cool and new. And similarly with design thinking: a very cool way to approach the world, but we can't design-think every single problem. So I do sometimes worry that we fall into the trap of being tool-driven in our innovation, where we see a solution and then we start looking for a problem. And I think the best innovations are ones that are problem-driven. We see a real problem in the world. This is a problem for Canadians. I'm less interested in problems for government employees; those are real problems, too. But when there's a real problem for Canadians that we can fix with an innovative tool, that's the best kind of innovation. And there are real, very cool examples of that, and you guys have probably talked to some people, or will talk to some people, who have great examples of that. But when we see something cool in the world and we just want to try it out, that's when we get into some problems, I think, in terms of not really delivering real value for your organization. You're not really delivering real value for Canadians. And this cuts to the core of what annoys me about it: you don't get focused on outcomes anymore. The innovation itself becomes the outcome. Getting the money out the door, getting the innovation out the door, becomes the outcome, becomes success. And you are right to think that way, because you will get the ADM Award if you get that innovation done. But did you produce any real meaningful outcomes? So that's a lot of it. That's the core of my exhaustion.

NATALIE
So if I were to maybe try and paraphrase, and please correct me if I'm not capturing it: not everything necessarily lends itself to an innovative lens right now. And so maybe we should put a bit of time and thought into where we should be innovating? Would you say that that's fair?

SEAN
Yeah, definitely. I always say -- and this gets back to the definitional question of what is innovation, which is both exhausting but also important to think about -- innovation is trying something new. And when is the best time to try something new? Well, when the old thing isn't working. And what evidence do we have that the old thing isn't working? In any given scenario, our evidence in government ranges from non-existent to kind of crappy. We rarely have good evidence about what we're doing right now. So how can we even presume to think an innovation would be necessary? We don't even know how fast the car is going. Does it make sense to put a new engine in there when you haven't even built a speedometer? And so I think a lot of our innovations are engine-focused innovations: we've got to get going faster. And I always get frustrated that we don't really know our original speed. We have no baseline to compare this to, and those questions aren't even considered necessary, because it's just "Oh, cool. It's an electric engine. Let's get it in there."

NATALIE
So, I have a question for you. I've been thinking a lot about this, thinking of you coming on as our guest Sean, with your background. How do you think that experimentation can actually contribute to innovation within the public service?

SEAN
It makes me very happy that you asked me that question.

VALERIA

[laughs]

SEAN
That's a very strong host move, I would say. So yeah. And that's been part of my innovation journey, too: I started in innovation, and then I ended up for the last year at Treasury Board working on the experimentation file. And it was for that reason. I think experimentation is like Innovation 2.0, or it is a necessary precondition to innovation. Because experimentation -- capital-E experimentation, not when people just use the word experimentation to mean innovation, because it doesn't mean that -- but the rigorous experimentation of testing something: that really opens up a ton of space for you to then say, "Okay, we think this program is causing this change in the world. But we have to do an experiment to test it: test it against another way of working, or test it against nothing at all." Sometimes, maybe our programs are worse than doing nothing. And so once you start thinking in that way, you start questioning your assumptions. Well, we thought sending this letter would get this reaction from Canadians, or we thought this website would get people to click on this. And then you have to think, "Okay, we need to measure this now. We need to compare it to something." And then when you find out it's not working the way you wanted it to, or it is working but you think it could be working better, or it's working but compared to an international benchmark it doesn't look so great, that's when innovation can come in. Because then you've actually created a problem that you can then solve with an innovation. You have a hypothesis you can test. You can say, "Look, we're doing it this way. That led to this level of client satisfaction, or that led to these processing times, whatever your metric is that you want to change in the world. We think doing it with a design thinking approach will shorten the amount of time to process it, or will increase client satisfaction. We think moving to blockchain will help the whole thing run faster, while not sacrificing the timing." So experimentation, rigorous experimentation -- even just thinking it through, not even running a full experiment, but bringing an experimentation mindset -- I think helps better prepare you to have a more meaningful innovation conversation, because it forces you to ask the right questions, to draw your logic model, to interrogate your assumptions. And then that opens up space for meaningful innovation, instead of just slapping an innovation down because it's a new, cool thing to do.

VALERIA
So let me ask you, what do you think would be the next step? Or how could we go about educating people on this front? Because I think we've had this conversation before. I think after hearing you speak on the subject so passionately, so many times, I started observing in people the fuzziness and the blur between innovation and experimentation, and how even entire departments misinterpret these words and their meaning. So what do you think is the solution in terms of working with a common understanding of these terms? Because we've been trying for years, really.

SEAN
I think it's really, really hard. I think there's like a depressing answer, and then a more hopeful one.

VALERIA
[laughs] Depressing first.

SEAN
So, the depressing one... the depressing and insulting one -- but it's insulting to me, so I can say it -- is that I was doing a roundtable on experimentation and someone -- I think it was at Global Affairs -- said we have a "liberal arts problem" in policymaking, where we hire people who don't come from a rigorous empirical field. Ninety percent of those people, ECs, have that background. They weren't engineers. They weren't even psychologists, who at least have to understand stats. If you hire that kind of people, then that's the kind of innovation and experimentation you're going to get. It's going to be liberal-artsy, fuzzy experimentation. So that's the depressing answer: there is maybe a systemic challenge in the way that we educate policymakers and in the design of the programs. Literally going back to high school and then universities -- people are not empirical enough. They don't understand the stats. That's not to say empiricism is the only important thing in the world. But a weakness is an overused strength. And I think we're really strong on the liberal arts ideas and the concepts of philosophy that we need, and the qualitative understanding of the world. We have that in spades. But what we might need is some more quantitative [understanding], and that can be achieved, other than by reforming the education system -- which is kind of a crappy answer to give -- in ways we can pursue at a departmental level, with Free Agents and the PCO Fellowship Initiative and people trying to make changes to rapid staffing. Talent Cloud, maybe, with bringing in new kinds of expertise much more easily than we could before. So that's one way of doing it. But then there's also just the individual level, and that's maybe more the point of your question: how do we convince individuals who might not be so familiar with it? And that one, I don't know what the answer is, other than to say I think we need to be more thorough, experimentally, with the way we do this outreach. Instead of just sending out an email or organizing a workshop, try to be a bit more rigorous in thinking through what change we want to see in the world and testing different approaches to that change. So it's a meta point, but we have to be experimental with the way we support experimentation as well, or else what kind of hypocrites would we be?

NATALIE
I find that really interesting, because I think it'd be fairly easy to extrapolate the same situation around data and data-based decision making right now. The world we live in, and the world we work in, the environment we work in, is changing significantly. So how do we make sure we've got those skill sets? Because I agree, we talk about data all the time, but I don't see anything that lets an individual take that data and say, "How do I know that I've understood this data? How do I know that I'm actually maximizing the use of what's available to actually make better decisions, to better understand the situation, to better determine an experiment, or identify a hypothesis, or whatever it is that you're trying to do?"

SEAN
Yeah, I totally agree. I think data is an overlapping Venn diagram with experimentation and innovation. And we are opening up new frontiers in terms of the level and type of data that we have access to, to the point where now we probably have too much. And so again, I think it goes back to the level of expertise we need in government. We need more data scientists. I push Comms teams a lot on why they don't run more experiments, because they have access to so much data. If you're sending an email out to 10,000 Canadians, you can send two versions, and just randomize which version each person gets, and see which one better leads to the outcome you want. And those kinds of experiments are quite rare in Comms teams, even though they are the easiest to run. If you gave every Comms team a data scientist, you'd see a lot more of these things. But we have no budget for that. And frankly, even if we wanted to, there aren't enough data scientists with some Comms knowledge that you could even hire that many. So again, I do think it's an expertise problem. But then there's also an individual question of what any individual can do to get better at using data. I think some people have started to answer that question. We do have that data strategy that is out, and things are happening around it, I guess. I don't really know.
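
A minimal sketch, in Python, of the kind of randomized email test described above: split the recipient list at random between two versions of an email, then compare the rates at which each version produced the desired outcome (for example, a click). The recipient list, click counts and function names are hypothetical, not from any real Comms system or from the episode.

```python
import math
import random

def assign_versions(recipients, seed=42):
    """Randomly assign each recipient to email version 'A' or 'B'."""
    rng = random.Random(seed)
    return {r: rng.choice(["A", "B"]) for r in recipients}

def two_proportion_ztest(clicks_a, n_a, clicks_b, n_b):
    """Return the z statistic and two-sided p-value for the difference
    in click rates between version A and version B."""
    p_a, p_b = clicks_a / n_a, clicks_b / n_b
    pooled = (clicks_a + clicks_b) / (n_a + n_b)
    se = math.sqrt(pooled * (1 - pooled) * (1 / n_a + 1 / n_b))
    z = (p_a - p_b) / se
    # Two-sided p-value from the standard normal distribution
    p_value = 2 * (1 - 0.5 * (1 + math.erf(abs(z) / math.sqrt(2))))
    return z, p_value

if __name__ == "__main__":
    recipients = [f"person_{i}" for i in range(10_000)]  # hypothetical list
    groups = assign_versions(recipients)
    n_a = sum(1 for v in groups.values() if v == "A")
    n_b = len(recipients) - n_a
    # Hypothetical observed outcomes after the send:
    clicks_a, clicks_b = 412, 480
    z, p = two_proportion_ztest(clicks_a, n_a, clicks_b, n_b)
    print(f"A: {clicks_a}/{n_a}  B: {clicks_b}/{n_b}  z={z:.2f}  p={p:.4f}")
```

Because assignment is randomized, any meaningful difference in click rates can be attributed to the email version rather than to who happened to receive it.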

VALERIA
I think this brings up an important point. I've also seen something [that] doesn't measure what it says it's going to measure, but people are convinced that it does. You know what I mean? I've seen it so many times, where we're like, "Really? I don't think this is actually measuring what you think it is." Yet we accept it. And as you're talking, what I'm thinking about is building diverse teams, and how important that is. Consciously constructing a diverse team so that you have those knowledge gaps filled, and giving that opportunity. And that's part of the experimentation too, figuring out how we can put some rigor behind building those teams, because I believe -- but I don't have any evidence to support it at all -- that the results would be much better.

NATALIE
I feel like you're trying to define a unicorn. Some mythical creature who's both a data scientist and a master storyteller.

SEAN
Yeah, I think we know story is important. And sometimes when I go on this rant, people think...

NATALIE
So this isn't your first time on this rant?

SEAN
No, no...

NATALIE
Excellent. [laughs]

SEAN
It won't come as a surprise to you.

VALERIA
Oh no. Anyone who knows Sean, knows his rants. [laughs]

SEAN
I'm not trying to put myself out of a job. I don't think it should only be the nerds with glasses doing the Moneyball advanced stats of government. We definitely need that qualitative storytelling side of things, and it is more about the diverse teams. But diverse teams are one way to do it. And so we can look at other [ways]. We can learn some lessons about experimentation from other fields as well, and think about how they do it on the medical side of things -- not that we want to reproduce that system necessarily, but we can take lessons from how they do it in the UK, where they have a trial advice panel: a list of academics who are willing to give their time to government to help design experiments. That's a structure that they have built. We can look at the private sector, where they have experimentation teams -- much in the same way we have innovation labs in government -- whose real focus is that they have the data experts, they have the empiricists and they have the people who can do experimental design. They have the ethicists, though that's a separate question, and they can really build meaningful, useful, ethical experiments within their respective organizations and learn from those to improve -- whatever -- the way you sell pet food, or in the UK, the way that government works.

VALERIA
Random example. [laughs]

SEAN
There is. I think it's Petco, or some company like that, that has a real strong experimentation side. It's very impressive. Just to say there's more than one way to do it. And I think at first, before we make the systematic change in the way we educate and hire public servants, it might be more the centralized approach, which has its problems, but I think it is working in some departments. At ESDC you see it working: they do have a team that is innovation- and experimentation-focused and can deliver wraparound services to help people solve a problem, and they're getting some good outcomes from that. And we're seeing other departments trying to replicate that model. The CRA has a great team as well. IRCC. So we do see things happening. It's a question of... It's like any change management. It's got to be from the ground up. It's got to be at the mid-level. You have to incentivize it at your leadership level. There have to be champions for it. There have to be real resources put aside for this stuff. So I think it's like any big problem that we face. If we want to tackle it, we have to tackle it at every level with multiple vectors.

NATALIE
So would you say, then, that it's fair to say that we need to mainstream some of the concepts of innovation, experimentation, data, all sorts of things, within our everyday work culture so that we can utilize them better? Would you say that's fair?

SEAN
Yes. But... I think sometimes we over-mainstream the innovation piece of it. Not everything needs to be an innovation. Some things work. Government is actually good at some things; there's no need to innovate in them. Now, the Foresight guys might argue you've always got to be one step ahead of the problems on the horizon, and that's a fair comment. But I would say -- maybe it's the lens I have because of where I've been working. I've been working in this innovation bubble, so this might not be true if you haven't been in that bubble, or maybe you're not even in the National Capital Region. Maybe there's something about my experience, but I do feel that we've done a pretty good job of convincing people on the innovation side of things. And we still have a lot of work to do on the data and experimentation, even evaluation -- even the rigorous evaluation piece. I think we found a way to hive off evaluation into its own ecosystem that we don't really need to interact with as policy and program people, which defeats the whole purpose of evaluation. And that's its own podcast entirely. But we found a way to take a very useful concept and bureaucratize the usefulness out of it with the way we do evaluation. I hope evaluation people don't get mad at me for saying that. I think a lot of them would agree with it.

NATALIE
I think they'd feel supported. They would like to be much more involved. I think the people who work on evaluations firmly believe that it needs to be an integral part of the entire process, whether it's a program or a project or whatever it is. But if you are not incorporating everything that you learn through your evaluations and tweaking and adjusting, then what's the point of doing it?

SEAN
What's the point? And we talk about building diverse teams. One way we could really quickly do that, and we wouldn't have to change very much at all, is to break apart the evaluation shops, take those people, put them in the program sectors and just say: stop doing these three-to-five-year post-hoc evaluations mandated by Treasury Board on questions of questionable relevance to the program and the people on the ground, and start helping people design programs from the start that are based on existing evidence and help generate new evidence. Those are the two important things. Once we have better evidence, that feeds into the experimentation and innovation pieces, because then we can know this program isn't working the way we thought it would. But right now, evaluation is consulted for half a second before a TB sub goes through, or an MC goes through, and then they're brought in three years later to run the large evaluation. But that's too late. So many decisions have been made. And this goes to data as well. If you're not capturing the right data, it's pretty hard to run an evaluation five years later.

VALERIA
And I think it's not only on a big scale. When you talk about a small scale, it's almost like a muscle -- a brain muscle -- that we need to build within teams, because people just put that aside, and they feel like there isn't any time, that it's a luxury. Evaluation is a luxury. You put something forth -- let's just say you put an initiative in place within your team. You have to talk about it afterwards. Did it work? Did it not work? And it's not a luxury to have that discussion. It's a necessity. Evaluation of anything that we do is a necessity.

NATALIE
Some of these topics come up often. If we want to talk about innovation, if we want to talk about experimentation, if we want to talk about these things, we need to take a step back and take a bit of time to think about things, to be strategic. So we keep coming up to similar points in our conversations. And I guess one question that I would ask is: it's one thing to talk about innovation for those people who consider themselves to be in, and who work in, the innovation space. But our public service employment satisfaction survey this year asked every employee what their views are around innovation, and whether they felt that they were encouraged to innovate in the workplace, which really got me thinking about the average public servant -- and I would consider myself to be exactly in that boat. I'm working in something that's called an innovative space for the first time in my career, but I don't feel like it's the first time that I'm applying an innovation lens. The question that I would ask is: what fundamental changes to our environment, to our work culture, could we make to allow those concepts, and that understanding of what it all means, to become more mainstream in the public service? I don't mean mainstream in the sense that an innovation lens needs to be applied to everything, but more that this kind of articulation is something that's within reach of all public servants.

SEAN
I think that's a really hard question. I think we're wrestling [with it]. And that's part of the problem when you do approach these things by trying to create a centre of excellence: you undermine the mainstreaming side of things, where you say only these people are doing innovation, which, of course, couldn't be further from the truth. And so I think it's really important. I think we don't want to marginalize innovation in the same way we have sadly done with gender-based analysis, where it's something that's nice if you have time for it, where you rewrite the paragraph after you've written most of the MC and you worry about it post-hoc. It's not. It needs to be an integral tool that you use to evaluate ideas from the start. And so all of these things we're talking about -- innovation, experimentation, data concerns, evaluation, gender, reconciliation -- these things are not solved by creating another annex to the Treasury Board submission; they're solved by mainstreaming them into everyone's toolkit. I don't know how to do that. I do wish we were more evidence-based when we tried to do things like that, because there's really good literature out there. Another issue I have is that we are pretty myopic in the way we think about things. We barely even look at past evaluations of programs that are similar to what we do. We almost never look at the wider landscape of evidence for a given intervention or a given field. There's a lot of work out there on how to do change management. There's a lot of work on how to become a stronger organization when it comes to data. It's being generated by governments and NGOs, but by the private sector as well. And so I think we need to be better at taking a breath and saying, what does the literature say on this? Like, a meaningful literature review of how to negotiate change management in these fields. And some of it's the same old stuff: you need resources and champions, and you do need some way of tracking it and incentivizing it. But I think we could be a little bit more innovative in the way we support innovation, and a little more experimental as well, and say, "Look, we know these approaches worked in this situation, let's try them. And let's make sure we're measuring something to know if it's working or not. And let's make sure we check that in a year to see if it's getting better. And if it's not, we'll try something else." Again, that's not a very helpful answer, but that would be the change I would like to see: an evidence-based innovation approach to the way we do business. I haven't seen too much of that so far.

NATALIE
Thank you very much.

VALERIA
Yes. Thank you.

TODD
You've been listening to Innovate On Demand, brought to you by the Canada School of Public Service. Our music is by grapes. I'm Todd Lyons, Producer of this series. Thank you for listening.

E2: Artificial Intelligence Showcase – "The Art of the Possible"

Over the past few months, CSPS has been working collaboratively with a number of federal departments and agencies to explore how Artificial Intelligence (AI) could facilitate the review and analysis of the stock of 2,600 federal regulations, to ensure that they support innovation and growth while protecting the health, safety and well-being of Canadians, and the environment. A recent Request for Proposals process identified 17 successful industry proposals, which the School shared and showcased to federal departments and agencies at a showcase-style event on October 19, 2018.

This Podcast was recorded live on October 19, and features the three panelists from the event:

Benjamin Alarie
Professor, Faculty of Law, University of Toronto, and Co-founder and CEO of Blue J Legal;

Jennifer MacLean
Executive Director, Southern Ontario Smart Computing for Innovation Platform;

Aneeta Bains
Assistant Deputy Minister, Digital Transformation Service Sector, Innovation, Science and Economic Development Canada.

This Podcast highlights a mix of public and private sector views on AI, its current state, and its integration into the realms of digital, agile and innovation within the public sector.

Length: 16:58

Transcript

Artificial Intelligence Showcase - "The Art of the Possible"

Hello and welcome [I guess] to our second episode now of our podcast Innovate on Demand. Where are we today, Laura?

So we are at Bayview Yards, which is a fantastic, innovative venue that we've collaborated with quite often now. I guess I can just give you a brief breakdown of what we're doing here, why we're here, and just some background information, if that works. So a couple of months back, we put out an RFP, which is a request for proposals, for any kind of A.I. algorithm, application or proof of concept that could be applicable to the Government of Canada. When I say that, I mean more specifically the realm of regulations, which not everyone is totally familiar with. So we qualified 17 vendors on our list, and we were super happy with that result. Of those 17, we're going to have kind of a different setup for today. We're going to have almost like a Dragons' Den-style pitch for the top eight. And then we have our other vendors, who have booths, who can provide their proofs of concept and really pitch them to the Government of Canada. It's a really, really neat project to be a part of, and we're really happy. And we're actually joined by some fantastic speakers here today.

People who can help us figure this out. That's great!

Exactly. That's right. So Jean, maybe if you can give them a brief introduction.

Absolutely. So we're joined today by Benjamin Alarie from the University of Toronto. He's also CEO and co-founder of Blue J Legal. Jennifer MacLean, Executive Director of the Southern Ontario Smart Computing for Innovation Platform. And Aneeta Bains, Assistant Deputy Minister, Digital Transformation Service Sector at Innovation, Science and Economic Development Canada. Thanks for joining us, guys.

Our pleasure. Thank you. It's great.

So you guys just came off a panel discussion. How was it? Who argued with who? Do we all agree?

Yeah, I think we mostly disagreed about whether or not AI is able to make morally justified decisions.

OK. So I guess before we do that, for folks like myself who have no idea, what is this AI beast we're talking about, and when is it taking over the world?

And the private sector view? Is it the same?

I don't know. Sometimes when people ask what AI is, I'm inclined to say it's really just applied mathematics or applied statistics. So it's really that fundamental. It's really performing mathematical operations, usually on data, in ways that allow you to leverage that data in novel ways. And so, for me, if you retreat down to that kind of base level, it becomes somewhat less scary, because I'm not as worried when people say, you know, is math taking over the world?

I see.

You know, this podcast is only made available because of mathematical developments and computing power and all those things. But I don't think that we really worry that math is taking over the world. So that's how I like to think about it. Maybe it's just my human need to, you know, want to feel like I'm going to survive. But I think of A.I. as math.

So I would say A.I. has been around for a long time. Sure it's top of mind right now, but it's really just one type of tool that data scientists have available to them. So there's a lot more to data science insights and things that could be done than just AI and it's a really powerful tool that's now made available through technology improvements. So there's a lot that can be done using data science and AI.

Yeah, so just kind of a question for all three of you. From your perspective, what would the advantages be of bringing this -- I mean A.I., this approach -- to the Government of Canada? What do you think about that?

I think that it's time. The Government of Canada is just like any other sector, right?

So we all need to start becoming digitally enabled sectors, and the government is just a little bit behind in that space. There are some fundamental things that we have to do that are still not quite there in terms of how we leverage technology for better use. But specifically, A.I. helps with speed in getting insights into the information that we have, so we can actually bring regulations out to the forefront faster -- so there's the speed that comes with it, and more precision. I mean, that's essentially it, right? Instead of having 100 people in a room crunching data that they are trying to get their hands on -- most of it is not even physically available, it's not structured, it's all over the place -- think of how long that would take versus pushing it through a mathematical process and algorithms, bringing that together, and actually getting some sort of idea much faster. Essentially, that's what it can do. It will get our processes more automated first, to get those insights, and then it'll start driving better insight. And that will give us better outcomes around precision policy making, right? Why not?

And I think that's a great point, touching on something that you had mentioned, Jen, in your panel -- just the idea of even having basic Wi-Fi access. You know, that's not something that everybody has.

So I think that's a really valuable point, and just kind of a starting point. You know, there are basic needs that we need to address first, and then move on and really kind of dive into this world.

So if I understand right, we're not too late as the government.

No, no, I don't think you're ever too late in leveraging amazing innovations. 'Cause I don't like calling AI tech, especially when we are calling it math.

I do not think we are ever too late.

But I do think we need to speed up the benefits pretty quickly, because the world is moving really fast and it's very diverse now. There are a lot more people who have a difference of opinion, and I think we kind of have to provide some sort of social support. Right?

I might just add that the private sector is a little bit ahead of the public sector in adopting artificial intelligence because of competitive forces in the private sector. Competitive forces motivate parties to adopt new technologies to compete, to retain customers, to grow their customer base, to deliver better, more efficient services. What we're going to see -- I think what we're starting to see -- is that there's a different kind of motivation in the public sector, typically, which is related to political will and what the constituents are looking for. And I think we as a society are looking at what's happening in the private sector and saying, oh, wouldn't that be fantastic if we had access to some of the same developments in the public sector, and so that's the vector for pushing for a lot of this. So I agree with everything that was said. I just might say that, you know, it's kind of inevitable that the public sector is slightly lagging the private sector, because you don't have the same incentives, and to get the body politic to wake up and go "why, where is my AI?"... they need to see examples. We all need to witness these examples, and then we'll say, OK, so government, let's do this. And then, when Canadians make the call for that sort of thing, the government has a way of delivering.

So the good news also is that I'm finding -- I am very new to the government, only a year into my role, so I do come from the private sector -- I'm finding that the government actually wants help, and it's trying to find a ton of ways to get it. While in the past it was very streamlined, now it's, you know, help this way, that way, small help, big help, you know, faster. And I think that has created a lot more innovation in how we procure and partner, and we're trusting each other more, so that's like a big, big value proposition.

It's a good segue, actually. Knowing that AI is not the end-all-be-all of solutions, I guess I'd ask each and every one of you -- maybe we can start with you, Jen -- for one example, complex or not so complex, in the actual world today where AI is solving a problem, or a wicked problem.

Very cool. So I can use the example that I cited on the panel as well. We have a research project that we're supporting where a company is working together with a researcher at the University of Windsor to solve border wait times. So, looking at a truck driver coming down Highway 401, when they get to a border crossing to go into the States, what would be the fastest border for them to go through? They don't need to know what the border wait time is right now. They want to know what it will be when they get to that border. So if they're coming through around London, they want to know if they should go through at Sarnia or at Windsor. They can predict what the wait time will be based on traffic patterns on the 401 -- what would be the fastest border for them to get through? It makes transportation much more efficient, makes companies more efficient, makes them on time with their deliveries, rather than kind of throwing it to the winds and just hoping for the best when they get to the border.
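
A minimal sketch, in Python, of the kind of prediction described above: fit a simple model of wait time against upstream Highway 401 traffic for each crossing, then recommend the crossing with the lower predicted wait. The historical data, coefficients and names are purely illustrative, not the research project's actual model.

```python
from dataclasses import dataclass

@dataclass
class LinearWaitModel:
    """wait_minutes is approximated as intercept + slope * traffic_volume
    (fit offline on historical traffic and wait-time data; values are made up)."""
    intercept: float
    slope: float

    def predict(self, traffic_volume: float) -> float:
        return self.intercept + self.slope * traffic_volume

def fit_simple_regression(xs, ys):
    """Ordinary least squares for a single feature, in pure Python."""
    n = len(xs)
    mean_x, mean_y = sum(xs) / n, sum(ys) / n
    numerator = sum((x - mean_x) * (y - mean_y) for x, y in zip(xs, ys))
    denominator = sum((x - mean_x) ** 2 for x in xs)
    slope = numerator / denominator
    return LinearWaitModel(intercept=mean_y - slope * mean_x, slope=slope)

if __name__ == "__main__":
    # Hypothetical historical observations: (401 traffic volume, wait in minutes)
    sarnia_hist = [(800, 12), (1200, 20), (1600, 31), (2000, 44)]
    windsor_hist = [(800, 18), (1200, 24), (1600, 29), (2000, 35)]
    models = {
        "Sarnia": fit_simple_regression(*zip(*sarnia_hist)),
        "Windsor": fit_simple_regression(*zip(*windsor_hist)),
    }
    traffic_now = 1800  # observed upstream volume near London (made up)
    forecasts = {name: m.predict(traffic_now) for name, m in models.items()}
    best = min(forecasts, key=forecasts.get)
    print(forecasts, "-> recommend crossing at", best)
```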

So, a killer application of A.I. at the moment -- I'm inclined to say that, in the realms that I'm familiar with, it really is this paradigm shift of viewing the law as something that you can predict, and saying that this really is something that can be predictable.

And this isn't the end of judges by any stretch. In fact, the paradox is that it's going to make judging more important and more difficult, because if you're able to predict how the easy cases are going to be decided, they're going to be settled out of court, and what's going to be left in court are those challenging cases right on the boundary of yes or no, plaintiff or defendant. And so judging actually becomes very, very important, because those new judgments become precedents that then, you know, become part of the apparatus that we use to predict future cases. So I think it's a very exciting time for access to justice and the transparency of the legal system, and also for the development of the legal system to make it fairer going forward. So that's what gets me excited.

I think my examples are more fun, because I'm not seeing a ton yet in more of the, I'm going to say, focused sort of industries that are much larger and more complex. Those traditional industries are still catching up -- you know, agriculture is trying to catch up, science is trying to catch up. We know the public sector is definitely on the cusp of it. But I always think about retail; retail is my favorite example of AI. We actually have it every day.

We just don't even notice it, right? Our phones are all AI engines. And so my favorite is Amazon, right? You guys have maybe recently heard of Amazon Go. When Amazon bought Whole Foods, they created their first store that is completely AI-driven. It's in Seattle. So basically, you take your mobile device, you go through the front, you scan your I.D. There are no human beings in there. You've maybe seen the ad on YouTube: you scan your I.D., you walk in there with, you know, your basic bag or your reusable plastic bags, you put all of the groceries you want in there, and you leave. That's it. You don't do anything else. Could you imagine? You feel like you're stealing food.

Start the car.

Yeah. That is essentially, to me, the most phenomenal change in creating efficiency and easy consumer access to goods -- and, like, there's no transaction going on -- and that is completely driven by analytics, AI and a form of a trusted framework, which I think is super cool.

That's huge.

That leads out of some really, really interesting points there. And let's begin on the other end of the spectrum: in five years' time, how do you see the regulatory and legal professions changing as a result of AI?

I think it's unambiguously going to be -- so here's another paradox -- it's going to be way deeper. The amounts of regulation are going to be much more significant. But the paradox is that it's going to be much easier to comply. So you're going to have much better tailored regulation, creating fewer friction points for those who are trying to comply. Those regulations are going to be based on better science, a better understanding of what the actual real-world implications of those regulations are. So the regulations are going to be more tailored. And then, on the compliance side, because there are going to be fewer of these pitfalls or gotchas or regulatory gaps, there's going to be technology to actually figure out, in this situation, what do I actually need to do? It's going to be a good news story. It's a bad news story if you think about it from the perspective of "Oh, I'm going to have to read all of these regulations," but that's not going to be the reality for very many people at all; that's going to be true of the engineers who are involved in building the systems. But regulation is going to become very much software-driven. And so it's going to be code in the software sense, not in the current regulatory code sense.
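
A minimal illustration, in Python, of "regulation as code" in the sense described above: a made-up rule expressed as software, so a system can check compliance automatically instead of a person re-reading the text. The rule, threshold and field names are hypothetical, not drawn from any actual regulation.

```python
from dataclasses import dataclass

@dataclass
class Shipment:
    weight_kg: float
    contains_hazardous_goods: bool
    placard_displayed: bool

def requires_placard(s: Shipment) -> bool:
    """Hypothetical rule: hazardous shipments over 500 kg must display a placard."""
    return s.contains_hazardous_goods and s.weight_kg > 500

def is_compliant(s: Shipment) -> bool:
    """A shipment complies if it either needs no placard or displays one."""
    return (not requires_placard(s)) or s.placard_displayed

if __name__ == "__main__":
    print(is_compliant(Shipment(620, True, False)))  # False: placard required but missing
    print(is_compliant(Shipment(300, True, False)))  # True: under the threshold
```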

I think it's tricky, right? I mean, the legal profession is definitely more advanced. Definitely. But I think it's really tricky, and my thought also goes to the three-step process: you've got to get your foundation right in terms of your internal processes and access to data and standards, the trusted framework and the sharing; you've kind of got to get your process information around that so you can get insights; and then, as you start to improve your precision around regulation, you actually have to serve it. So there is a service aspect to it as well. And all of that is driven by digital technology, and of course A.I. But, you know, five years from now, I don't know how advanced we are going to be. I think we'll hopefully be advanced on the foundational pieces, because that is very hard to do and takes a ton of experimentation and failure and risk and success. But then you actually have to serve those regulations so people can comply with them more easily -- you don't even know what half of them are, you can't even find them, right?

As a civil society, we do have some work to be done.

That actually brings up our final question, which is really around -- you know, we're talking about it being math, being technology and so on and so forth, but there's a human side behind that, a whole change management piece. Obviously, you know, we're lagging a bit behind, but it's not too late, so that's good news. But how do we continue to move forward? Do you guys have any practical advice on that change management component for organizations within the public sector, or even private, because ultimately they're the same challenges we're looking at?

I can start on that one. One of the things we've seen when we work with companies that are trying to dip their toe in the water when it comes to A.I. and data science is that oftentimes the data scientists employed at the company understand it really well, but anyone above that really doesn't know how you can use it, how you can best leverage it, and it can be a little bit terrifying sometimes. So enabling those middle and senior managers to actually understand what data science and A.I. can be used for, and what it can't be used for, is a really powerful way of making sure that it's used properly and leveraged properly within companies and within the public sector.

I think a huge onus for change management actually goes to the solution providers. If you just think about what's actually brought computing into being a mass phenomenon, it's actually ease of use.

Right, so the reason why everyone has an iPhone is because an iPhone or an Android device is actually really pretty simple to use. And so, you know, you could try to do the change management thing by teaching everyone to be a coder -- and there are these people who say we need to teach coding in kindergarten, because that's how people are going to learn how to use computers -- or you could say, OK, we could do that, or we could maybe bring the technology to the last mile and actually make it very, very accessible and easy for people to use, to try to get the payoffs that they need in performing their work.

And so I think as someone who thinks a lot about how do we actually make this happen.

A huge amount of it has to turn on the user experience, the user interface, and how we actually make this solve the problem for the user. Because you can architect the greatest algorithm in the world, but if what you're disseminating is a command-line interface with a blinking cursor where you go, "OK, do the machine learning," you're not going to get anywhere in trying to roll that out. You need to make it much easier for people to use and to quickly get the insights that they need to do their job more effectively. So I think the responsibility is really on those who are developing the solutions to make them easier to use.

Well, we wanted to say a huge thank you again to all three of you for taking the time to chat with us today, and we hope you enjoy the rest of the event. Take care. Thank you. Thank you. Thank you. Thanks, Jean.

E1: Innovation – It's a start!

Co-hosts Laura Smith and Jean Cardinal from the Canada School of Public Service kick off this podcast series on innovation in the public service with a quick chat with Neil Bouwer, Vice President, Innovation and Policy Services Branch. This series is targeted at all public servants unwilling to simply accept the status quo and willing to try new and funky things to innovate and improve the lives of Canadians.

Neil Bouwer
Vice President, Innovation and Policy Services Branch

Laura Smith
Communications Advisor, Innovation and Policy Services Branch

Jean Cardinal
Director, Foundation and Specialized Learning

Length: 8:48

Infographics

Podcasts what you should know

Text version of the infographic Podcasts what you should know

This infographic shows what you should know about podcasts. It is divided into three segments:

  • The first segment presents the following explanatory text, which is situated next to a graphic representation of a microphone: Podcasts are digital audio files available on the internet for downloading to a computer or mobile device. Podcasts can be accessed on demand, anytime and anywhere.

  • The second segment lists three interesting facts about podcasts. The first fact is next to a maple leaf and states that more than 10 million Canadian adults have listened to podcasts in the past year. The second fact is next to an image of a hand-held device and states that smartphones are the number one medium for podcast consumption. The third fact is next to an image of headphones and states that people are two times more likely to remember something they heard than something they read.

    The sources for these facts are given as:

    • the Canadian Podcast Listener 2018
    • Nielsen Podcast Insights, 2018
    • "Adult Learning and Retention: Factors and Strategies," 2010, Cultural Orientation Research Centre

    The bulleted list of facts is situated next to a drawing of a microphone surrounded by images of musical notes, sound waves, a hand-held microphone, a globe, a Wi-Fi symbol, headphones, a radio and a play button.

  • The third segment shows in words and pictures where people listen to podcasts: 49% listen at home, 22% listen in the car, 7% listen while walking or working out and 4% listen on public transport.

    The source for this information is given as Edison Research: Infinite Dial 2018.


Podcasts what you should know

Text version of the infographic Podcasts what you should know

This infographic shows what you should know about podcasts. It is divided into three segments:

  • The first segment presents the following explanatory text, which is situated next to a graphic representation of a microphone: Podcasts are digital audio files available on the internet for downloading to a computer or mobile device. Podcasts can be accessed on demand, anytime and anywhere.

  • The second segment has two headings. The heading "Did You Know?" is followed by the explanation that the term podcast comes from the combination of the words "iPod" and "broadcast." Both of these words are graphically represented. The heading "Benefits of Podcasts" is followed by a bulleted list including three benefits: you can customize content by subscribing to shows, episodes and topics relevant to your interests; you can listen on the go 24-7; and podcasts are portable and free.

  • The third segment lists the three primary reasons for listening to podcasts:

    1. to be entertained,
    2. to hear interesting stories, and
    3. to learn something new.

    This bulleted list is situated next to a drawing of a microphone surrounded by images of musical notes, sound waves, a hand-held microphone, a globe, a Wi-Fi symbol, headphones, a radio and a play button.



Features


Gender-based Analysis Plus

Understand the importance of supporting diversity and inclusion in the workplace.

Indigenous Learning Series

Gain insight on Canada's shared history with First Nations, Métis and Inuit Peoples.

Language maintenance tools

Improve your English or French language skills.
