Tuesday, March 4, 2025 - AI is evolving at a rapid rate, and its implications for higher ed are changing in lockstep. So, Michael and Jeff dove back into the topic at the Google Public Sector GenAI Live & Labs Conference with a panel of experts: Ann Kirschner of CUNY and ASU, Pace University’s Marvin Krislov, and Google’s Chris Hein. They discussed the necessity of an institutional AI strategy, the tech’s implications for the future of work, and why university partnerships will be essential to equity in the age of AI. This episode is made with support from Google.
0:00 - Intro
1:40 - Campus-Wide AI Strategy
6:02 - Skills in the Age of AI
9:52 - AI Policy and Faculty Training
11:49 - The Dislocation of Entry Level Jobs
15:33 - Teaching AI
18:39 - Mounting the Liberal Arts Comeback
21:25 - The Future of Academic Research
24:37 - Building Access through Partnership
31:12 - Questionable Assumptions
Jeff Selingo
Hi, Future U listeners, Jeff Selingo here. Generative AI continues to be a topic we'll keep returning to on Future U because it seems to change by the month. Last week, Michael and I hosted a panel discussion about AI and higher ed in the workforce at a Google conference. It was held at Google's New York City headquarters. In the audience were both higher ed leaders as well as city and government leaders. Thank you to Google for their sponsorship of this episode, and now onto the conversation. So welcome everyone. A huge thank you to Google for hosting us today at Gen AI Live and Labs here in New York City. So we're pleased to welcome on the stage today Ann Kirschner, who's university professor at the City University of New York right here in the great city of New York, where she also served recently as interim president at Hunter College and is also a professor of practice at Arizona State University. Welcome, Ann. Chris Hein, who is field CTO at Google Public Sector, where he's the leading voice in applying AI in the public sector. Chris, welcome. And Marvin Krislov, who is president of Pace University, also here in the great city of New York. He's been there since 2017. He was previously president of Oberlin College and vice president and General Counsel at the University of Michigan. Welcome, Marvin.
Michael Horn
All right, let's dive in with a question for all three of you. We'll start with Ann and you can run down the line. I'm just curious to hear all of your take. If colleges need an AI strategy, just like they have an overall strategic plan or master plan for the campus, do they need an AI strategy? And if not, why not? And if so, what does one look like, given how fast this field is moving right now? Ann, let's start with you and go down.
Ann Kirschner
Yes. Is that enough?
Michael Horn
Say more.
Ann Kirschner
You want more? I mean, every organization, initiative, enterprise needs an AI strategy. And it's a moment that calls for a new sense of what leadership is going to be all about. And what that means is it has to start from the top and it has to be integrated across the campus because all of these things interconnect. What you do in the classroom interconnects with what you do in advising, interconnects with what you do in financial aid, it's all of one piece. So while it's true of all organizations that you need a strategy, I think it's particularly true of higher education because that's what we do is we teach people, we create new knowledge. I think I could also say that universities may be the last place where equity is not a four letter word. And so, as we think about the ethics of how AI is going to be applied, I think universities have a special role to play here in figuring out how AI is deployed on campus.
Chris Hein
I also will go with yes and then try to expand on it. So when I look at it, the reason that I think it's so critical to really build out a cohesive plan around this in the higher education space is that the workforce of the future is going to be so dramatically different than the workforce of today. And so because of that, it's incumbent on our higher education institutions to be trying to figure out how do we create curriculum, how do we educate in a way that creates a workforce that can meet the tool set that is coming. And so as we look at AI as this general purpose tool that can be used in a million different ways depending on how you want to implement it, it does create this moment where we need experts like you all in the education space to help us figure out what does come next, what is the best way to do it, how do we do it ethically and safely? While we in the builder space at Google have people that are deeply, deeply thinking about this, that broader community being part of the overall knowledge generation is going to be so critical to getting this right. You know, the last thing I'll mention is that it's also just so important that we're looking at this from the perspective of what are the tools that are best to use for this, how do we think about curriculum differently for what's going to come. You know, we had a brief spell where we thought we were going to train a bunch of prompt engineers. We're past that now. That was a fun six months. But we're really looking to figure out how this is going to be impactful across every segment of our economy moving forward. And education is such a core component of that.
Marvin Krislov
So I'm going to go with, yes, we are doing that right now. And we set up a task force that would consider all the different aspects, both curriculum integration, training. We're very focused on workforce readiness. So of course thinking about our students. But I also want to put in a pitch for thinking about the ethics and the governance as well. And so that is part of the strategy is thinking about not just saying no, but saying how is okay. And this needs to be an ongoing discussion. I think a strategy, like any good strategy, needs to continue to evolve and adapt to the conditions. And so, yes, needs attention and needs to be something that continues to be evolving.
Jeff Selingo
So I want to ask about the broader impact of AI on higher ed. And Chris, let me direct this to you, because one of my jobs in high school was working at AAA making TripTiks, so I could fold a paper map in literally two seconds. And this came up recently. We were in the car and my teenage daughter said, you know, how did people get to places before they had this phone in their car? And I said, well, you would look at a map and you would essentially memorize where you were going, because she thought you would hold the map as you were driving. And it made me realize that recently we've seen some studies on AI technologies that may promote learners' dependence on technology. Right. Just like we now depend on that map telling us where to go, left turn, right turn, instead of memorizing it like we used to. And it may trigger this idea of metacognitive laziness. Right. And so we always have thought of colleges and universities as these places of critical thinking and creativity, human interaction. So how can they continue to be places where students and learners of all ages learn these critical thinking skills even as they use AI in the learning? How do we balance those two things so that we're continuing to produce critical thinkers even at the same time that they're using tools like AI, just like I'm now not using those paper maps anymore?
Chris Hein
Yeah, it's a great point, and I'd love to talk to you later about how to fold those because I could never actually get them back together after. So the way that I would look at this is, first, we have to change a little bit of the stigma that has started in terms of AI is cheating. Right. Because if we default to this status quo of AI is cheating, that's the same as saying that every other technological revolution that has come forward has been a form of cheating, you know, from the good, hard work of actually doing the process. But we can't stop there. We can't just assume, okay, great, let them use all the AI tools and it'll be fine. We have to then start to recognize what makes someone unique and special and better when they've got AI as a tool set. And so this is where I think education has to start to up-level and alleviate some of the burden that currently exists on the educators themselves, because the right way to start to do this is not necessarily, you know, you can't just send someone home and say, write a 10-page paper, and then grade the paper and assume that that's going to show a good cognitive understanding of the details of whatever it is that you're assigning them. You have to be able to have an interactive experience where you're asking them what was interesting, what did you learn as you were doing this with an AI companion as part of that process? Where did you dive a lot deeper than you anticipated as a result of this? Right. In many ways, AI is this continuation of Google's 26-year mission to make the world's information universally accessible and useful. We're just making it incredibly accessible and useful. And so just like Google changed how education worked 25 years ago, we're at that same point.
So I do think it's important and it's critical and we do need to be making sure that we're not letting people get away with just accepting whatever the AI system that they used spat out at them. But I do think that we should assume that that's going to happen and then help them to figure out how can they synthesize, how can they use all the knowledge that we as humans get to have access to to get us to that next step of the knowledge endeavor?
Jeff Selingo
I mean, in many ways though, that requires faculty in particular to rethink how they teach, how they design courses, which, Marvin, brings me to a question for you because I want to discuss the implications when students are embracing AI much more quickly than the faculty are. We see that in every survey right now of faculty and students. So how does a university kind of create a policy when it in some ways could be obsolete in months or weeks because of rapid growth and AI capabilities? Or should the question really be around how does a university recruit or upskill its own faculty or staff and be able to think about how they are trained on this? So is it about creating policies constantly or is it about upskilling the faculty in a larger way?
Marvin Krislov
Well, I think it does involve including faculty and staff in a discussion about how to upskill, how to integrate AI into their teaching and their work. And some of it is also work, as the previous panel talked about. And I think that's part of what we all need to do. And so what we've been doing is training faculty and staff. We're working on a pilot program with an industry leader, and we are being led by internal folks, particularly from our computer science school. So that allows people to feel trust and connection. And yes, students are moving fast, but there are certain things that faculty and staff can do to help students evaluate the quality of the information, evaluate when it's a good time to use AI, as well as, again, going back to the ethical and some of the governance concerns about AI. So I think it needs to be a project that embraces everyone and is led by trusted members of the community.
Jeff Selingo
Okay. The trusted members, I think is probably key.
Ann Kirschner
Right?
Marvin Krislov
Yeah.
Michael Horn
Ann, I want to turn to you for this next question, because there's often speculation, fear mongering, all sorts of things around how AI will threaten to eliminate entire classes of jobs, including entry-level jobs coming out of college in particular. Medical coding, for example, has gotten a lot of speculation recently. It's a big program in many schools in the CUNY system, and there's rampant speculation that the job may get eliminated down the road. There's even a case study that RAND did recently of a large hospital that is already using AI for all of its outpatient billing. Now, regardless, I think, of whether you think medical coding is or isn't going to disappear, or whether it's just actually going to get elevated in a different way, entry-level jobs in general are a really important way that people graduating college start their career and eventually advance to their next job. And these first jobs are also the bread and butter that many colleges and universities sell as part of their ROI. How should colleges prepare for this dislocation, or at least alteration, in what we think of as entry-level jobs?
Ann Kirschner
First, I want to double down on the point that you made before, Chris, that our job is to prepare students for a lifetime of learning and earning. Right? That's our job. The earning part we're a little queasy about. We don't want to be vocational, one of my least favorite words, as if there's something anti-intellectual about students needing jobs. And the truth is that whether it's an Ivy League university or a public university, we have done a lousy job at thinking about how to prepare students for their next jobs. Now, that was okay in another era. When I graduated from college, you could sort of bumble around for a while and look for a job and maybe go to graduate school and then maybe work some more. That was okay. But that world really is over. It's a much more competitive world than it was before. And all of the research, I'm thinking back to a Burning Glass study of a couple of years ago, shows that if you don't get a good first job, you are likely to be detoured in your career for 10 years. So that first job is essential. So the answer is, I mean, there's two answers. One is about curriculum and pedagogy, and the other is about career preparation writ large. On the curriculum and pedagogy piece, we have to learn to teach in different ways, and particularly in a fast-moving field like, let's say, medical billing, we've gotta have partnerships with the private sector, because no faculty member who trained to do that 20 years ago is gonna understand what's current in the curriculum, what students need to know. And the way we teach has to be very different as well. I am back in the classroom for the first time in many decades, and I have required my students to use AI. Not forbidden them, but required them, but also have said to them, disclose, tell me what you're doing, because I want to learn from you. So the curriculum and the pedagogy must change.
But then the other piece of it is what passes for career services, which has always been the Siberia of college campuses. It has to change. We have to have more support for students, more awareness of what the labor market needs, and then a more realistic preparation for them. And again, partnerships with the private sector so that we're training them for real opportunities.
Michael Horn
So you brought up curriculum, which is exactly where I want to go with this next question to you, Marvin. Jeff and I had author and computer scientist Cal Newport on the podcast earlier this season, and he said that universities shouldn't be teaching students about the mechanics, if you will, of AI because we're still too early in the form factor of it. Just to your point, you know, prompt engineering, really hot for six months, all of a sudden not as important. How we interact with these tools is going to continue to evolve. Pace University, though, is certainly all in on AI, at the forefront, as your website says, with, I believe, 39 courses incorporating AI content, related degrees, research labs, et cetera. Tell us why you decided that Pace had to lean into AI as such a core part of the curriculum. And did you go out to industry to develop the curriculum, or how did you develop it?
Marvin Krislov
So I will say that Pace has always been focused on preparing people for the next step. And so thinking about your career, thinking about your job, thinking about skills, thinking about internships is part of the discussion the minute you enter Pace University, at any of our campuses. And so when we saw the important change that's going on with technology and AI, we said we owe it to our students and our faculty to help them navigate this. Now, we know that there are going to be changes in the particular technology and the types of programs there are. But I think fundamentally, we buy in to the notion that colleges and universities need to pay attention to the market and, of course, the whole discussion about the value of a college degree. We want to have people knowledgeable, but we also want them to develop the skills, the critical thinking skills as well, that will allow them to think about AI a decade from now or 20 years from now, if it still exists in the form we think about it. And we have worked with industry partners as well as faculty and staff, some of whom are here today, thank you, to help try to stay on trend, and we will continue to evolve it. But we're all in because it's part of our mission.
Jeff Selingo
So for those of you who don't know Ann, if you read her biography, you have a BA in English and Music, an MA in English. A PhD in English. Right. Yay for the liberal arts. Yes. All of those out there. But, you know, the liberal arts have been losing out to STEM for, like, a decade. I mean, the numbers are shocking if you look at them. You know, we have more bachelor's degrees now awarded in computer science and engineering than we do in all the humanities, for example. Now, since 2019, you know, vocational majors are just winning out. And I think a little bit of that is around the curriculum. A lot of that is around career services. No doubt about it. So how can the liberal arts, or can the liberal arts kind of reassert their role in an AI future?
Ann Kirschner
So, yay for the liberal arts. I like to say that I've never had a job that I was qualified for. And actually, it's true. I never have had a job I was qualified for. And that's because somewhere in the course of my education, I learned how to learn. And so you could sort of throw me in some unfamiliar environment like the National Football League, and I would simply learn whatever it was I needed to learn. I did spend five years at the National Football League. It's true. It's true. You haven't had lunch yet. So you're not believing me.
Jeff Selingo
Well, and if we have enough time, Ann's gonna tell you about her first NFL game. But go.
Ann Kirschner
Right, right, right. So I think we lost our way in the liberal arts. I think we got a little proud of ourselves. Like, only people who had liberal arts degrees love books, and only people who have liberal arts degrees can write, and only people with liberal arts degrees, you know, have a sense of ethics and history. It's just not true. And so we built a moat around the liberal arts, and it's time to dismantle the moat. I have no doubt in my mind that a liberal arts education is not only a source of pure joy and enlightenment but a lifetime of learning. I have no doubt of that. But we're doing a really bad job of telling that story. Prompt engineering is in or out, but understanding the human experience and how to talk to people and how to lead people, that's not out. So how do we draw the connectors between what you learn in the liberal arts? Dismantle that moat. I mean, whole books have been written about it, so it's not as if I'm going to give you the silver bullet for it right now. But it has to do with that sense of partnership with other parts of the university, partnership with the private sector, and storytelling around what it means to be a true lifelong learner in an era where technology and global change are just coming at us so quickly.
Jeff Selingo
And so that bridge has to be built across that moat both ways, though. That includes employers as well.
Ann Kirschner
Absolutely. But you know what, you talk to employers and you look at surveys of what they're looking for. They're looking for liberal arts graduates in the sense that, what do they talk about? They talk about empathy and emotional intelligence and critical thinking and communication skills and problem solving. These are all the verities of a liberal arts education. But we need a rebranding, big time, for how we make those connections.
Jeff Selingo
Yeah, we've been talking a lot about the classroom and academic learning. I want to focus a little bit on research here, Chris. Google recently introduced AI co-scientist, which it dubs a virtual scientific collaborator, to help scientists generate novel hypotheses and research proposals and to accelerate the clock speed of scientific and biomedical discoveries. I will tell you, recently a PhD student in the sciences told me they are freaked out that AI is going to shrink research teams, especially for junior scientists, who of course eventually become senior scientists. Right. So what do you say to those worried about AI eating academic research jobs, in particular given AI co-scientist?
Chris Hein
Yeah, I think it's a natural fear. Right. And that's across so many industries that have that exact same fear, so the PhD students are not alone in that particular fear. But where I think it gets interesting is what it starts to unlock. We've gotten to the point where the amount of deep specialization that's required for so much research that's happening today means you've got to put in 20 years of learning before you can understand so many of the complex different things that interact with each other. Especially when you're in the science and technology frames, you just can't get to that at that early level of your career anymore. But what a co-scientist starts to offer you is the ability to say, as a PhD student, if I'm in molecular biology, I don't necessarily have to understand the chemistry aspect as deeply as I would have yesterday, because the co-scientist can understand that for me. So I don't think that this should be a closing of the aperture where we're seeing less research being done by fewer people. What this should be is a broadening of the aperture where there's more research being done, because each individual person can have that deep understanding of what they're trying to accomplish, and they can then port that and share that with AI collaborators that can do some of those other aspects of the research, build on those hypotheses, give them some good grounding points on where to go next with it, and then be able to use that to broaden the overall scope of knowledge that's coming out from it. So is it something that is a potential short-term kind of thing that you're going to see make things more efficient? Sure. Right. Like that is the way technology often works. Technology starts to create efficiencies, then those efficiencies broaden that scope out. So if we're in a period where there's constriction, right, the grant dollars are harder, we know that there's this chaos going on that is going to create some of those challenges. Can we use an agentic AI system like co-scientist to further the mission that you're on right now while you're in that phase, but then long term, really give that as a tool set that empowers researchers to do so much more with less than what they had available to them in the past.
Jeff Selingo
So it may end up saving us some time and money that we could then put back into the research. So I want to talk a little bit about partnerships. We heard this morning about Carnegie Mellon. Ann and I have academic appointments at Arizona State University, which also has big partnerships with AI providers. I'm kind of curious about how do we ensure that there's not a haves and have-nots situation around AI. When we turned on the Internet back in the 90s, everybody had access at every college and university, equal access. Now obviously you had high-power computing, as we heard earlier, with more access at some universities than others. So is this where colleges may need to partner with each other and with companies? Or is it a case where some universities are just going to have to specialize more and not be all things to all people, so they may not be able to do everything in AI? Maybe they'll have to specialize while there will be some universities that could do everything. Marvin, I'll start with you and then we'll just go down the line the other way. How should we be thinking about partnerships?
Marvin Krislov
So in terms of access for students, one thing that we've done is require every student in their required computer science class to take a six week module on artificial intelligence. And obviously the content is going to evolve. So we are trying to level set so that when someone graduates with a degree as well as having liberal arts curriculum, they will have some experience in artificial intelligence. And actually we're trying to offer it to graduating seniors who might not have had that opportunity. But in terms of research and thinking big picture. Absolutely. Collaboration is the way we have jumped into relationships with Google and others and we're open to doing more as well as other universities. So absolutely, we're on board. It fits what we have done and it fits our philosophy of education which is preparing people for the future.
Jeff Selingo
Right. Chris, what might those relationships look like? What are some of the relationships that Google has now with the universities?
Chris Hein
Yeah, I think there's lots of ways that that can occur. Right. And what we do is we end up looking at that in terms of a spectrum of how much complexity do you want to take on in terms of inventing the future of, like, an AI platform that is going to be used in your higher education. But that's going to be a small, select group that's probably able to invest enough of their own time, resources, staff, faculty, all of those things. And those are places where we're excited. Right. Like, Google desperately wants to be part of those conversations, and so please come talk to us. We would love to have that conversation, but we recognize that's a big investment. Right. Some of that might be that you need the high-performance computing that the previous panelist was talking about, being able to run those scale jobs. That's hard to do. And that's one of those things where you're not necessarily going to keep up with us. Google's going to spend $75 billion building better AI infrastructure this year. Right? That's an insane amount of money and technology that is going to go into the ecosystem. So if you're looking at that, yes, you should be coming to the hyperscalers for those kinds of conversations. But then I think that you need to move across that spectrum where, you know, I have my Bachelor of Arts from a small school in Iowa, Luther College. That's where I started off. And I think all the way from those small university systems and small college systems, they should be looking at what are some of the off-the-shelf tool sets, what are the ways that I can incorporate AI into curriculum, into the things that we're doing and making us special as a college.
Because those things that are going to be done at those large R1s, those should, and this is what's great about living in this space, those will be shared and we're going to be able to work together on those things. And they should be reproducible all the way across, down into the smaller system. So I think, you know, from Google's perspective, we're really excited to be a partner all the way across that spectrum. Like, what can we do to take the incredible research, that $75 billion of capital expenditure and make that available into the entire education ecosystem?
Jeff Selingo
Ann, in another era, universities tried to partner together on online courses. It has worked in some places, hasn't worked in other places. So how do you see this happening going forward? How do you see the partnerships not only with the private sector but between universities?
Ann Kirschner
I don't think universities get an A+ for partnership, either with the private sector or with each other. I think it's something that universities are going to have to learn how to do better. I'm lost in wonder at the seventy-plus billion dollars in one year of spending. I'm lost in wonder at the astonishing, you know, smart thinking that's happening at Pace. It's just not happening all across higher ed. You know, we have a sluggish, sleepy, tired, beleaguered sector. But it's time to wake up. I think we're going to be forced to wake up, and maybe that's just how it has to be. So I think partnerships are the wave of the future. Partnerships with the private sector can take many shapes and forms. We had one at CUNY that still exists, the CUNY 2x program, that partners with Google and others to bring practitioners into the classroom in those areas where it is hard to keep up with curriculum and where it's hard actually to hire computer science faculty because the private sector is so alluring. So that's one form that partnerships can take. But I do start out with a little bit of Debbie Downer here. Did you all see that on Saturday Night Live? It was wonderful, to remember it. Because universities, if you read the mission statement of most universities, they sound exactly the same. Again, Pace, hats off to what you're doing, Marvin, but they all sound like they're trying to do the same thing. This is not going to hold in the next era. We have to be a lot sharper about what makes University X different from University Y. Parents are going to demand that, students are going to demand that, and even faculty and staff are going to demand that. So, A, yes, we need partnerships, and B, we've got to exercise that partnership muscle a lot more aggressively than we're used to.
Michael Horn
This might be a perfect segue into this question, which is the last one of our panel and this podcast. Rebecca Winthrop at the Brookings Institution is helping lead a project right now around AI where they do a pre-mortem. They basically say: if the patient's dead five years from now, knowing what we know, what caused it? Where can we anticipate things going off the rails for AI down the road, and if we can identify those risks now, how might we mitigate them? I want to ask a different version of this to you all. What are the biggest assumptions that we're making right now around AI, where we hope we are right, but we could be dramatically wrong? What are those big areas, whether in speed of adoption, organizational adaptation, impact on jobs, improvement, or something even more dramatic, that you would highlight that we need to be paying attention to, and perhaps putting some mitigation in place, from the university perspective? Why don't we start with Marvin and we'll go down the line for final words on this.
Marvin Krislov
So I think that it's easy to assume that everything's good or everything's harmful. I think like most things in life, AI has both a lot of pluses, but also has some disadvantages, particularly if it's over relied on. And I will make a plug for universities in that there are a lot of free thinkers in universities and if you allow them to be involved and critical, they will help you distinguish between when it is good and when maybe it should not be relied on as well as all the ethical implications as well.
Chris Hein
I think the easiest way to think of this is that we usually assume that technology will continue to take a linear path, and so we expect that those transformations will happen and we'll get better and better. Where I think there's a real danger sitting in front of us is this: I'm very, very bullish on this technology being a positive change in this world, and I think there's a good chance that we will start to squash that because of the fear and because of some of the misuse of the technology itself. And so what I look at there is, when I look across the healthcare space, you know, I have parents that are alive today due to some of the amazing work done in research institutions across the world in cancer research. Right? Like, both of my parents are alive as a result of that. And I look at some of the things that are happening in AI right now as a future state where that can continue to get better and better and better, and we absolutely have to invest in it. And so what worries me is that there's the inappropriate usage of AI that could squash and really tamp down on how much additional investment we can put into this space in the future. And so getting regulation right, getting the ethics and the responsible use correct, but doing it boldly in a way that we can still make that change happen, that's where I think I want to make sure that we're still in lockstep with higher education, to make sure that we're pushing these things out to the boundary in the right way, that we don't get a public backlash against it.
Michael Horn
Ann, final word.
Ann Kirschner
I'm going to go back to that pre-mortem idea, because I think it's really kind of interesting. I think we have to transform the form factor of higher education to make it more flexible, not to say you're an online student, you're an on-ground student, but allow for much more flexibility between them. The stranglehold of majors, of semester structure, of prerequisites, all of that, I think, is gonna have to change. And I think one of the breakthroughs that may change it. You know that scene in The Matrix when Trinity says, I don't know how to fly a helicopter, and they download the pilot program for the helicopter? I think there's going to be changes, and it's going to happen probably in the private sector before it does in higher ed, where there are going to be some breakthroughs about how to learn in an extraordinarily accelerated and yet effective way. And we're going to look at that and we're going to say, whoa, so what we're doing in this area or that area is not going to work in the same way. And that pressure to change our form, I think, and adopt a much more protean form, which really answers to the way learning has to be in the 21st century, I think that's the change we're going to see.
Jeff Selingo
If you're a listener to Future U, you will know that probably every five episodes, Michael talks about mastery-based, competency-based education, and that is a form factor. Right. We right now measure education by time spent in a seat. Perhaps this is one thing that we're not seeing around AI that could help us move much more to mastery-based education.
Michael Horn
And transcend boundaries. So let's leave it there. Just a huge thank you to Marvin, Chris, Ann. A big thank you, of course, to our sponsor Google, and thank you to all of you, and all of you listening, for joining us on Future U.
Jeff Selingo
Thank you.