
Speak to the human Podcast

Jeremy Bassinder on weaving people and digital minds

Guest: Jeremy Bassinder

09/10/25 | 51 mins

AI is changing work fast, but the story is still about people. So what are the opportunities for everyone, from personal development to boardroom strategy? And why do imagination, curiosity and play matter more than ever?

Sarah is joined by Jeremy (Jez) Bassinder, Partner and GenAI Consulting Lead for UK and Ireland at IBM. Jez is excited by the opportunities that GenAI is opening up, but he’s also clear that the very things that make us human matter more than ever.

He shares why and how organisations need to bring together different people with experience across technology, strategy, learning and more to develop cross-functional ways of thinking about AI-enabled organisational change.

Together we explore:

  • The story of the Luddites and what it reveals about human attitudes and fear of change.
  • How imagination, play and inclusivity can help us weave AI into the fabric of work – without losing what makes us human.
  • How organisational strategy needs cross-functional teams, from the top level down, to bring together technical insight with strategic thinking.
  • Why confidence and culture are so critical.
  • Helping people become more confident in their own use of the tools and technologies.
  • Implications of AI for inclusivity and democratisation of technology.
  • What learning and development teams can do to help people experiment safely.
  • What education might look like in an AI-enabled world.

Transcript (AI generated)

[00:00:00] Sarah Abramson: Speak to the Human is a podcast that explores how we build connections with people in their professional work. It's about the human experience at work and about how to foster that connection and belonging to support people and their organisations to flourish. I'm your host, Sarah Abramson, and I'm looking forward to you joining me in hearing from our brilliant guests.

[00:00:21] In this episode I'm talking with Jeremy Bassinder, who is a partner at IBM and GenAI Consulting Lead for the UK and Ireland. Jeremy brings an incredibly well-informed perspective to the conversation around AI, and he emphasises that this current evolution is about people, culture and behaviours as much as technology.

[00:00:42] We explore what the implications are of AI, from shaping board level strategy to identifying appropriate uses and helping people become more confident in their own use of the tools and technologies, as well as how education might change to equip the future workforce. It's great to have Jeremy's experience [00:01:00] and I hope you enjoy the conversation.

[00:01:01] Please do like, subscribe and share the podcast, and as always, it's great to hear from you with feedback and ideas for future guests.

[00:01:09] Hi, Jeremy. It's great to have you joining us on the podcast. I'm really looking forward to hearing your experiences and perspective on what you're seeing happening out there in digital technology and AI for organisations and for all of us as individuals, how things are already starting to change in our working lives and what that might mean for future work.

[00:01:33] You're a partner at IBM and UK and Ireland's generative AI consulting lead. No doubt you're coming across a whole world of stuff that's honestly mind-bending to most of us. So I'm hoping that we can tap into some of your insights in a way that's accessible enough for non-techies like me to understand.

[00:01:51] So I'm fully intending to embrace being the one who asks some stupid questions in this conversation. I hope that's okay. I'll try [00:02:00] to make sure they're not all stupid. But welcome.

[00:02:03] Jeremy Bassinder: Thank you ever so much for having me. It's a delight to be here. I love what you're doing, and this is very, very exciting for me to be part of it.

[00:02:10] Sarah Abramson: Brilliant. Well, let's start with just hearing a bit about the work that you do at IBM and, I guess, what led you into that role as well.

[00:02:19] Jeremy Bassinder: Sure. I'm Jeremy, or Jez Bassinder, as most people know me. I have been in and around technology and people for getting on for nearly 30 years, well, 28 years now.

[00:02:30] And I am a partner in our consulting business, and I focus really on the intersection of people and technology, most recently around generative AI; I've led that part of our business for the past couple of years. My industry of focus is consumer goods, and I'm fascinated by why people buy and what people buy and so on.

[00:02:52] But really I sort of sit in our customer transformation piece, which is about how organisations do more for their customers, and increasingly [00:03:00] how you do that with AI. And yeah, that's me in a nutshell, really. I work with all sorts of clients, from the very large to the quite small, and do all sorts of things, a little bit in retail as well as consumer goods.

[00:03:15] Sarah Abramson: That's brilliant, and I love that you bring in from the beginning that sort of crossover of people and technology. You recently gave a TEDx talk, which I really enjoyed, called Weaving People and Digital Minds, I think. And we should say, it's probably a good time to say, that we've known each other for absolutely ages and we were at university together,

[00:03:37] probably longer ago than either of us would care to admit. But one of the things that we bonded over was both coming from Yorkshire down south to university, and your TEDx talk was based on Huddersfield and the emergence of the Industrial Revolution there: this lovely idea of weaving, of [00:04:00] the warp and weft, in the industrial weaving of fabrics, and yeah,

[00:04:05] the factories in Huddersfield. And you brought this sort of parallel of weaving people and digital minds. It would be great to hear a bit about that, and I guess why you think AI is as much about people as it is about technology.

[00:04:19] Jeremy Bassinder: I mean, so as you say, I grew up in Huddersfield, and Huddersfield is a cloth town in the north of England.

[00:04:27] And it was more affluent than many of the other towns around it because of the cloth industry. It was also the home of some of the Luddites, and actually in Huddersfield the Luddites took someone's life, a mill owner, William Horsfall. And so I find that interesting, that dynamic between the sort of positive side of

[00:04:52] technology and the fear of technology that those men, predominantly, had back then. And there's an irony in all of [00:05:00] that for me, that today we call somebody a Luddite because they don't understand technology. Actually, I think those gentlemen understood technology pretty well at the time.

[00:05:09] What they struggled with was that they could only see the sort of productivity side of it. They couldn't see what might be the new businesses or the new world that might be created by the availability of clothing and the availability of fabric. And certainly they weren't considering that you might weave carbon fibre into the body of an F1 car, for example.

[00:05:32] And so it's those things that I find really interesting: imagining what new businesses and new opportunities for society could be. And really, the technology is like the weft and the humans are like the warp. If you only have one, then you don't have a piece of cloth;

[00:05:52] it falls apart. And actually, I think it's the interweaving of these, people and technology together, that actually makes [00:06:00] wonderful things happen. So that's kind of where the analogy comes from.

[00:06:03] Sarah Abramson: It's fantastic. It's such a great analogy, and it's visual as well; it evokes exactly what you mean, that these things are complicated.

[00:06:13] Yeah, I like that you've used the word imagination, because I think it takes a leap for all of us, really, to go from where we are, how we use the tools that we currently use and the way that we currently work, to thinking: right, well, there's this big thing that's coming and potentially changing everything about how I work.

[00:06:32] Mm-hmm. How do I make sense of that? And I imagine that for the Luddite movement, those people at the time, it was the fear of that, and of how difficult it is to make that leap of imagination, especially if you don't really understand why you should, because you feel that your value is totally embedded and entrenched in the old way of doing things, and you're fearful of how you make that leap.

[00:06:58] So [00:07:00] I guess that's a long way, a very long question, to get us to how you think people are feeling about AI at the moment, and the range of attitudes that you're coming across.

[00:07:12] Jeremy Bassinder: I think there's a variety of different attitudes, going from, almost, your enthusiast, and I'm firmly in that bucket.

[00:07:18] I mean, I'm not a technologist who builds the absolute core models of this thing; I'm much more on the applied end of things. And I suppose my role has two dimensions: one is about how we take this capability to clients to help them change their business, but it's also about how we change ourselves,

[00:07:42] into what a consultant in the future, or a consultant today, is going to look like. So I'm very much in the enthusiast camp. You know, I play with this stuff in my spare time, I geek out on it, I use it to help me with my own blog and that kind of thing. Through to, I suppose, the skeptics, who say, well, it's over-hyped; all it's doing is predicting the next word or what have you. And then you've got the people who are blissfully unaware.

[00:08:13] And those are the people that I worry about the most, because actually I think there's a danger that we're going to leave a bunch of people behind, and this becomes a world where we have digital haves and digital have-nots. Well, I mean, there have always been differences between people who have technology and people who haven't,

[00:08:35] but with the pace at which this is coming, and the way it's becoming mainstream, there is a danger that a generation gets left behind. That's something that I'm trying to avoid, trying to bring everybody on the journey with us. Because this technology has been around, or was first talked about, [00:09:00] from the fifties onwards, really.

[00:09:01] So it's not brand new technology; it's just that it seems to have been democratised recently and put in the hands of everybody, which means people can innovate and come up with new uses of it and so on. And that's what really excites me.

[00:09:17] Sarah Abramson: I think that's really interesting. So I've got a few different questions coming out of that.

[00:09:21] There's a whole kind of inclusivity strand, isn't there, of thinking differently about what inclusivity might mean. And it's up to all of us, I suppose, not to lose the talent and the value brought by people who may not find it easy to make, even if it's not a leap, even gentle steps. Wherever people are on that spectrum of comfort with AI, how do you think we can help them make sense of change, especially change that feels rapid, and navigate it, so that we're doing our best job of trying to bring everybody along?

[00:09:58] Jeremy Bassinder: You have to give people places [00:10:00] to play. Play is really, really important, and I use that word intentionally, actually. You have to give people some ideas of what they could go and play with, and make sure they understand how this stuff works and what it's doing. They don't have to know it at a deeply technical level,

[00:10:18] but just, you know, what happens when you put a request into ChatGPT or Claude or whatever your particular poison is, as it were. And then what it is doing, what it's not doing, how you should use your data with it. And then I think people can start to, and again I'll come back to imagination,

[00:10:39] imagine what the art of the possible might be. And what I've found in my experience, when I've worked with senior leadership teams, is that you've got the technology team on the board, who sort of might pretend that they understand this stuff, and yet secretly they don't quite understand [00:11:00] it. And you've got the business side of the house, who are like: well, we probably don't really understand it, but we're a bit embarrassed because we don't want to look stupid in front of the other side.

[00:11:09] And one of the most amazing things to do is to bring both sides together, start at the grassroots, and then build up from that. What happens is you suddenly have these two sides of the house talking in a common language, because you've managed to bridge the fact that one is embarrassed and the other one's embarrassed, but for different reasons.

[00:11:30] And then you can have a proper conversation, and then people start to come up with ideas of what you might do and how you might apply it.

[00:11:36] Sarah Abramson: Yeah.

[00:11:36] Jeremy Bassinder: So I think all of that then breeds how you go about change.

[00:11:39] Sarah Abramson: And you've already used the word democratisation, and it would be great to get your perspective on whether there's a kind of shift of power and ownership,

[00:11:49] where, as you described, there are a lot of executive-level leaders who are probably feeling very unnerved. They maybe don't feel they understand things [00:12:00] as much as they want to, to be able to make the decisions, but also to feel like they're the people who can set the agenda. Whereas you've got people much more at grassroots level, or at other

[00:12:13] places in an organisational hierarchy, who are playing and doing stuff. Is there some sort of shift of power going on there, where people who aren't the board-level decision makers are gaining an understanding of how things might change, at a deeper level than those leaders have?

[00:12:36] How do we kind of make sense of that?

[00:12:39] Jeremy Bassinder: Yeah, I think there are a few things going on in that. The first is that boards are going to have to have people who understand the technology, and I don't think it's just about how we run our systems. I think we need real

[00:12:55] evangelists who can help the board move forward. And there's an interesting twist, in the sense that quite often the people who understand the technology are quite low down in the organisation, but the people who can ask the right business questions are often quite high up in the organisation.

[00:13:11] So navigating that, I think, is a challenge. The other thing I've observed: when I started this role a couple of years ago, I was expecting that the younger generation were just going to get this and the older generation would struggle. And that's not what I found at all. As you look down the organisation, it's much more about mindset.

[00:13:36] And you can have that mindset at any point. It's almost like moving to an AI-first way of getting work done. So when I start work, I start work in a tool. I don't start writing my notes elsewhere, or necessarily start thinking just on my own. So the [00:14:00] manner in which people can work, I think, changes.

[00:14:03] And those people that embrace that suddenly see this acceleration, which the others don't necessarily get.

[00:14:09] Sarah Abramson: It's fascinating, isn't it? I think some of this is to do with pace, because actually, you know, if we think back to how people were working 20 or 30 years ago, everything we do has transformed.

[00:14:20] Of course it has. Everything we do is different. We're using all sorts of tools and ways of achieving what we do, in ways that we couldn't even have conceived, and we're fine. And obviously some people have more developed skills at using some of those tools than others, but

[00:14:42] I think we get it. That's the world that we live in. You know, my mum comfortably uses her smartphone, and she's fine. But we've got there gradually, and I think it has felt like we've had different things coming in at different times.

[00:14:58] We've had, you know, the [00:15:00] internet, we've had smartphones, we've had this emergence of lots of incredible tools that have transformed the way that we do things. But it feels like AI is happening very suddenly. And even you saying that when you log on, you go straight into a tool. I'm like, I don't do that. It might occur to me at some point in the day that maybe I could try using Copilot or ChatGPT for something, but I still have to make myself think of that.

[00:15:24] So how do we get to thinking about barriers and enablers to help people overcome this feeling of pace and this need to suddenly upskill? It can't be quite that sudden. So how do we help with that?

[00:15:43] Jeremy Bassinder: Well, ChatGPT launched and became, and remains, the fastest-growing consumer product of all time: a hundred million users in two months.

[00:15:51] And it has continued; it's that kind of network effect that's happened with it. And really the maths, the technology [00:16:00] behind it, had been around for quite a while. 2017 is probably the first instance of what's called the transformer model, which is what the T in ChatGPT stands for. But really what happened is someone put a consumer-grade experience on top of brilliant maths.

[00:16:18] Then the world went to play, and then we've seen this acceleration in the past two years, really. And I think the first thing that happened was that organisations said, well, thou shalt not use ChatGPT. But necessity has always been the mother of invention, so employees just went and grabbed their mobile phones and used it on those instead.

[00:16:38] So I think access is the first thing: you've got to give people access, and you've got to help them understand what is safe to do and what's not safe to do, and to make conscious decisions about where they're putting their data and how they're using these things. And then there are the different levels of risk that you might tolerate:

[00:16:56] what you are going to use this stuff for versus what you're not. [00:17:00] Some of that will be legal, some of it will be reputational, and some of it will be operational. What I mean by that is, there will be some things that you can't use this for, for legal reasons, and that's pretty black and white.

[00:17:13] There are then some things that you don't want your brand to be associated with you using it for, and that's much more in the reputational space. And then sometimes this stuff is really expensive, so there are operational reasons why you might not use it in certain ways, and you might use different forms of it.

[00:17:29] And I think as you start building these ideas for people of what you can do and how you can apply it, and then get people to play and build it for themselves, you start to see it being adopted across organisations.
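
The three risk lenses Jez describes, legal, reputational and operational, lend themselves to a simple acceptable-use table. A hypothetical sketch in Python: the lenses come from the conversation, but every example use case below is invented for illustration and is not IBM's actual policy.

```python
# Acceptable-use table keyed by the three risk lenses discussed above.
# The blocked use cases are illustrative assumptions only.
USE_POLICY = {
    "legal": ["processing special-category personal data"],
    "reputational": ["publishing unreviewed customer-facing copy"],
    "operational": ["bulk calls to an expensive frontier model"],
}

def blocked_reason(use_case):
    """Return the risk lens that blocks a use case, or None if allowed."""
    for lens, blocked in USE_POLICY.items():
        if use_case in blocked:
            return lens
    return None

print(blocked_reason("publishing unreviewed customer-facing copy"))  # reputational
print(blocked_reason("drafting internal meeting notes"))             # None
```

Keeping the policy as data rather than prose makes it easy to surface the reason to the user at the moment they try something, which fits the "help people understand what is safe" point above.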

[00:17:45] Sarah Abramson: That's a nice way of combining what you were talking about with play, but also with setting parameters.

[00:17:50] And do you think there's a step where organisations need to help people understand risk? Or how does that bit happen?

[00:17:59] Jeremy Bassinder: As [00:18:00] I said, there are lots of dimensions of risk, but even the very basics: when you put this prompt into this tool, where does that go? I'm going to pull down information from a model that's been trained on some data; do I know whether the organisation that built that model owns that data?

[00:18:21] It's pretty clear that some of these, ChatGPT and others, have been trained on content from across the open internet, and there are various things going on around rights, whether or not they have the rights to scrape all this data and so on, and different countries' legislation is taking different positions on it.

[00:18:38] The Japanese opinion seems to be: well, it's fair game, it's out in the open. You know, in the same way as Picasso looked at Rembrandt's pictures, well, that was inspiration for them; it's fair game. The US seems to be taking a slightly different view of things.

[00:18:56] And so there's the question of what I'm pulling into my organisation [00:19:00] from what these models are trained on, through to what I'm pushing out. I might be putting in company information, I might be putting in personal information, and I need to be conscious of what I'm doing.

[00:19:14] So that's kind of inbound and outbound IP risk, really. And then there's this kind of language of hallucinations. I mean, all these things are doing, really, is predicting the next word, let's say; it could be a number, it could be a postcode, but it's essentially just making a prediction. So it's just statistics.

[00:19:34] And sometimes the statistics point to something that's not the right answer, as it were, and so it hallucinates. Again, we sort of anthropomorphise this stuff and try to give it language that makes it more human in some respects, and I think we've got to be a little bit careful with that, but we probably do have to come up with language that we can use to explain these things to people.

[00:19:59] But [00:20:00] those are the kinds of risks that I see: being careful where I'm putting my data and what I'm sharing, and then making sure that I'm fact-checking it as well.
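
The "just predicting the next word" point can be made concrete with a toy sketch. This is an illustration of weighted sampling from a made-up probability distribution, not a real language model:

```python
import random

# A language model assigns a probability to every candidate next token
# and samples one. Toy distribution for "The capital of France is ...":
next_token_probs = {
    "Paris": 0.80,   # likely and correct
    "Lyon": 0.15,    # plausible but wrong: still sampled ~15% of the time
    "banana": 0.05,  # unlikely
}

def sample_next_token(probs):
    """Sample one token in proportion to its probability."""
    tokens = list(probs)
    weights = [probs[t] for t in tokens]
    return random.choices(tokens, weights=weights, k=1)[0]

# A "hallucination" is nothing more mysterious than the statistics
# occasionally favouring a fluent but wrong continuation.
token = sample_next_token(next_token_probs)
```

The fact-checking point follows directly: because the output is a sample, fluency is no evidence of correctness.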

[00:20:07] Sarah Abramson: Yes. Well, that's a whole area, isn't it: critical thinking, I suppose, making sure that people don't take things at face value.

[00:20:15] Do you think there are good ways we can help people do that, to apply their own knowledge? And other ways to use AI most effectively as a tool in a sort of thinking process? Like, have you seen some good implementations where it adds value at certain points in a workflow or a process, but without losing that critical thinking and that human input?

[00:20:45] Jeremy Bassinder: The way we're thinking about this is that everybody is going to have a set of assistants around them that help them do their work. And assistants that I find useful, you might find useful, because you do something similar to me in certain [00:21:00] areas, so we might share those assistants. We might then connect those assistants to tools in the organisation.

[00:21:07] That could be as simple as, you know, word-processing things or PowerPoint-type stuff, or it might be out to an HR system or something like that. And those connected assistants then almost become like digital workers, and those bits can slot into our workflow to execute things on our behalf. So where do I see people doing really well with this?

[00:21:36] It's where they create something that's useful for them, which they can then put across to others, and then you get this kind of network effect. Some of the best things come from asking the tool itself, asking Copilot how to do X. So, for example, if I'm writing a requirements document, I get it to ask me the questions that I might [00:22:00] need to answer to write a really good requirements document, rather than getting it to just write a requirements document, if you see what I mean. So actually work with it as you would work with another person, to prompt you and to think through things. Obviously there's transcription; lots of people are using it for that, writing meeting notes, that kind of stuff.

[00:22:18] But we're also doing things like writing user stories, or writing personas and getting the model to adopt the persona of the person that you're designing for, so that it will then give you its opinion on what that person might think, and you almost have a digital point of view coming into the design process as well.

[00:22:40] I guess I'm rambling a little bit.
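
The two prompting patterns described here, getting the model to interview you before drafting, and getting it to adopt a user persona, can be sketched as plain prompt strings. A minimal sketch: both prompts are invented examples, and `build_messages` assumes the role/content chat-message shape that most provider APIs accept.

```python
# Pattern 1: have the model ask *you* the questions first, rather than
# drafting the document cold.
elicitation_prompt = (
    "I need to write a requirements document for a new internal tool. "
    "Before drafting anything, ask me the questions a good business "
    "analyst would ask, one at a time, so my answers shape the draft."
)

# Pattern 2: have the model adopt the persona you are designing for,
# so a synthetic point of view enters the design process.
persona_prompt = (
    "Adopt the persona of a busy store manager with little patience "
    "for clunky software. Review the sign-up flow described below and "
    "tell me where you would give up."
)

def build_messages(prompt):
    """Wrap a prompt in the chat-message shape most LLM APIs expect."""
    return [{"role": "user", "content": prompt}]

# These message lists would be sent to whichever assistant you use
# (ChatGPT, Claude, Copilot, ...) via its own client library.
messages = build_messages(elicitation_prompt)
```

The design choice in both patterns is the same: the model contributes structure and a second perspective, while the human keeps the judgement and the facts.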

[00:22:41] Sarah Abramson: No, you're not at all. I think it brings us right back to that point about imagination, because we have the potential to change so many steps in the way that we work. There are two things there, aren't there. One is that you're almost on autopilot with how you work,

[00:22:56] you know, especially [00:23:00] those of us who have been in our careers for a while: you have embedded habits and things that you don't really think about; it's just how you do stuff. And I mean, I am excited about the opportunities of AI, so I'm not averse to using these technologies at all, but

[00:23:17] I have to stop myself to think about how and when I could use them. And that kind of habit, that kind of imagination, I think, is part of this leap that it feels like we need to make. Maybe we don't; maybe it could be baby steps a little bit more. I wonder if one of the things that helps with that is people having the ability to share,

[00:23:37] like seeing examples, seeing how other people are working, just being surrounded by other ways that people are doing stuff. Are you seeing that kind of sharing happening, and some good examples of it?

[00:23:50] Jeremy Bassinder: Hugely, yeah. So I write an internal blog every fortnight where we just find somebody who's doing something interesting, and then we [00:24:00] record a video of them doing it and just share that out. Everything from, you know, writing blog posts to writing code, and code is a huge thing now, where people are

[00:24:15] actually almost creating whole development teams that are doing the creation of apps and so on. So that's pretty fascinating. We've done A/B testing between teams working with generative AI and teams not working with generative AI, and you see 50, 60% productivity gains, and people write better quality too,

[00:24:38] because that's not always something people think about either: they immediately go to the productivity side of things, and yes, you're expecting to get things faster, but you don't necessarily think you're going to get things better as well, which is definitely the case. What sort of things have I seen that are interesting, in how people have [00:25:00] applied this stuff?

[00:25:01] I think my mind has gone completely blank.

[00:25:09] Sarah Abramson: That's okay. I'm picking up on that point about productivity. I mean, we've certainly come across that with people who are talking about AI in terms of efficiencies, and at the moment that's acute, because, you know, we're all under pressure.

[00:25:26] Organisations are all under pressure. Things are quite tricky in the world: there's economic uncertainty, there are shifting markets, there are a lot of things we feel under pressure to navigate, to create efficiencies, all of that. And plus, you know, as soon as you have a financial director on the board who thinks there's a way they can reduce costs, of course they're going to want to.

[00:25:51] So we've come across quite a few people saying, we need to introduce AI to create efficiencies. I'm [00:26:00] not suggesting that's not worth exploring, or isn't valuable, but like you say, it seems that kind of misses the wider value. We can't only be thinking about AI as something that creates efficiencies, rather than something that improves things for us.

[00:26:16] Jeremy Bassinder: Well, we do a lot of work with Wimbledon at IBM, and have for years. Say we're watching the Women's Final in June of next year: we've now done generative commentary, so you could take the highlights, or at some point do it in real time, and be able to do a different commentary for each person that's watching.

[00:26:41] Some people might be interested in the rules of the game, and they could have one commentary. Another person might be interested in the stats, how fast the ball was, and care about that. Another person might just care what the fashion is in the royal box. And there's no other way you can do that as a business, [00:27:00] because you just can't deal with the volume of recording that many versions of the commentary, let alone voicing them with the same person at the same time, in real time.

[00:27:11] And those kinds of things, I think, are applying the art of the possible: where could the value be? And the value is interesting, in that people quite easily get to the business value or the human value, but you've got to connect the two. If I have something that's really valuable but so badly designed that nobody uses it, well, then it doesn't get adopted,

[00:27:37] so it's valueless. If I've got something that's well designed and kind of cool, but it doesn't actually drive any business value, it's just a toy. And so it's this constant looking at what it means for the human, what it means for the business, how you link the two together, and that being what you go and build, that I find fascinating.

[00:27:55] And that's where we go when we're starting to look at what are [00:28:00] the things we should build next for an organisation.

[00:28:02] Sarah Abramson: Yeah. And I guess in terms of the strategy of development, on the tech side and the people side: supporting people to keep building that ability to use the tools, but to think critically about them, alongside the implementation of the technology.

[00:28:20] It feels like we need to think about parallel strategies, really. Is that how you approach things?

[00:28:28] Jeremy Bassinder: I think about things is this, historically when we developed technology, it was a. A thing that happened over a period of time and then you had a starting point of the project, project and then an end point, and then you went on to the next thing.

[00:28:45] I think where we're starting to to go now is these are continuous. So, so your, I dunno, your marketing platform or your, whatever it might be, these things are forever. And so there are [00:29:00] constantly iter and it's not started at one point. And then the project finishes. These are products, these are digital products that you.

[00:29:07] As much about your organisation as the physical products that you, you sell or, or the services that you offer. These, these things are there forever. And therefore you are, you need to build them with that mindset that we're gonna build these things, we're gonna feed them, water them, we're gonna improve them.

[00:29:27] And then look at how people are using them, and make sure that they're still offering the value that they offered on day one at day X in the future.

[00:29:40] Sarah Abramson: Yeah. And there's no point at which you've rolled out a new system and then you train people on it. It's a total break from that way of doing things; it's continuous.

[00:29:48] And with what you were just saying there, can you bring people in, with their insights and what they're actually doing, to the ongoing development of the technology? How can that happen? How can organisations do that kind of thing?

[00:30:04] Jeremy Bassinder: Well, both overtly and covertly. Maybe covertly isn't the nicest way of describing it, but it sort of works.

[00:30:11] So I can overtly ask people what they need: that classic idea that you are not your user. So do proper user research and try and understand people. Even if you think you know what they want, actually go and ask them, look at what they do, try and understand them, and then build for that.

[00:30:30] But the other thing is that you put measurement systems behind the scenes to look at what people are actually doing. When do people struggle on the site? That's probably a point where it needs a bit of redesign. Or which transactions are people asking for most? We have an internal system called AskHR, which sits across all of our HR and

[00:30:54] expenses and travel and all manner of systems, and you just come in through a chat window. When they were first building that, they built all of the analytics on the back end to see what questions people asked. Initially, all it did was serve up what the company policy was, or how you'd use this system to do this or that.

[00:31:14] But over time you then say, well, I'm getting thousands of these requests every day; maybe we should automate that request. And then you can get at things incrementally. And once you start talking to people, you can build personas of people and test what they actually think of the products you're designing, but also build synthetic personas to try and model what digital versions of those people might do.

[00:31:41] Sarah Abramson: I feel like for most of us, this kind of stuff will make a lot more sense in 10 years, when we go, oh yeah, of course, I understand what you're talking about. Whereas now, I'm really conscious that a lot of this sounds very sophisticated and futuristic. I'm sure there are people listening to this, because I talk to people all the time, who are feeling a little bit like: we are not doing much yet.

[00:32:03] How would you advise people who feel like that to break things off a little bit, to make it feel smaller, and to not feel like the technology's already gotten away from them? How do we go step by step, from how things feel normal now to gently venturing into this future?

[00:32:29] Jeremy Bassinder: I think you have to start with something where you have a need. And maybe it's not a need in your work; it might be a need at home. Like I said, I write a cookbook blog, and for me there was a need to start there: how do I automate that a little bit? That's a real thing I can go and play with, and I can just start to make my life easier in that sense.

[00:32:57] And I think

[00:33:00] this thing ain't going away. And so I think it's in everyone's benefit, if that's the right word, to just try and learn a bit about it. And again, I come back to the haves and have-nots. Just go and play, play safely, and then start to ask, well, what is the art of the possible, of what I can do?

[00:33:21] And then, how might I bring that into a work environment? My brother, for example, is social media director for a bank. And so he talks to me about this kind of stuff, as you would expect, in the same way as you have done. And he said, well, we've just got Copilot.

[00:33:38] What could I do with it? And I said, well, why not try and make an assistant that can sound like the bank? And so he's built the bank's tone of voice into an assistant. In his world, what he's doing is taking some of the communications that he would normally write and making sure that they're in the tone of voice for that bank, which he knows perfectly.

[00:34:03] What he's able to do then is give that to others, so that they can write in the right tone of voice first time. Clearly, someone's gonna read through it and make sure it's checked before it goes out the door. But the amount of time it then takes to check, is it on brand, is it the right sort of tone, is much shorter than if

[00:34:24] they didn't have that assistant. And it's early days, and they're playing with it, and it's not necessarily going to go live yet. But he's found it revelatory, really, to be able to apply it in the work that he does, in a marketing context.

[00:34:44] Sarah Abramson: That's really cool, and a great example. What occurs to me is that some people are gonna be more comfortable even just with playing with stuff, and that in itself is almost a skill: being willing to experiment. Some people would like to know what they're doing, and they like to feel like they have the skills to do it.

[00:35:05] So for organisations to upskill employees, what do we need to look at? Is there a role for learning and development teams? Is there something in the culture, or in the way that we empower people, so that it doesn't feel overwhelming, but it feels like everyone can do this, and we need everyone to do it?

[00:35:25] But with a sort of accessible way in?

[00:35:27] Jeremy Bassinder: It's funny, I was talking with our learning and development team about this. I think you have to embed it in every bit of learning. I don't think you should have a standalone "this is the AI training" bit; it should be about how we're applying these tools in whatever it is that we're learning.

[00:35:43] So you can use these tools to generate a project plan and come up with a risk log. In fact, one of the assistants I wrote was about how to write better risks. People are lousy at writing risks, so actually write an assistant to help you write better risks. I'm not saying it will write all the risks.

[00:36:04] I'm saying, you know, take a fairly average human-written risk and give it to the machine, and the machine says, well, what about this? What are the mitigations? What's the timeline on which this is gonna happen? What's the impact? Really be your critical friend. So I think embedding these technologies in every bit of learning and development is really important. And that's the general thing, really. I've done some stuff at schools as well, and people think in the one dimension of, oh, well, it's a technology thing. No, you can apply it to every single subject in the school.

[00:36:38] Whether that's philosophy or economics or art or sport, whatever, there is a way in which you can embed AI and the use of it in there. And I think that's what you have to do in an organisation as well.

[00:36:51] Sarah Abramson: I love that. And I'd like to come back to education in a minute, because it's such an interesting topic.

[00:36:56] I just wanna get your perspective on it. But staying with L&D for a minute: do you think, like you were saying at board level, that we need technologists, or people that understand the technology, to be contributing to strategy development at the very top level? Do we need people with that kind of background in L&D as well?

[00:37:17] Or is it something that L&D teams need to

[00:37:20] sort of reframe for themselves? How do they train themselves to do that?

[00:37:23] Jeremy Bassinder: I mean, people always say, oh, come and help us write an AI strategy. Okay, but surely that's just your strategy?

[00:37:32] Sarah Abramson: Right

[00:37:33] Jeremy Bassinder: I firmly believe that, as we look forward, they're inextricably linked.

[00:37:40] The business strategy and the AI, the technology, strategy are one and the same thing, because you have to be applying technology to what you're doing. We always have; we've just chosen to compartmentalise them in different buckets. But increasingly, you're going to have a single strategy about what we do and how we apply technology in the way we do it.

[00:38:03] So I then think everybody, whether you're L&D, whether you're R&D, whatever you are, has a dimension of what they do that's about how we're gonna apply the latest technology to help us be, yes, more productive, but also more imaginative and more creative.

[00:38:20] Sarah Abramson: Yeah. And again, I think it comes back to that pace thing, doesn't it?

[00:38:23] You know, we don't have a team that does the internet for us.

[00:38:29] Jeremy Bassinder: Yes. That's quite correct.

[00:38:32] Sarah Abramson: But it does feel like a shift, because it's how we work as well, isn't it? It is that kind of playing, that willingness to experiment.

[00:38:42] Jeremy Bassinder: And then taking that to sort of enterprise-grade production levels, you know, that's an

[00:38:51] art in itself. It's not trivial to do this stuff. But I do think you have to start with the imagination.

[00:38:57] Sarah Abramson: Yeah. And that links us right back to what you were talking about at the beginning, with the Luddites and that leap of imagination, but also the fear. The reason the Luddite rebellion kicked off was that people were afraid of losing their jobs.

[00:39:14] It's very interesting, isn't it? So clearly you are someone that is very comfortable with exploring technology and leaning into that whole uncertainty side of things. You're excited about the potential that digital technologies and AI offer us, and you're happy to roll up your sleeves and have a go.

[00:39:35] But clearly a lot of people are anxious about their jobs. They maybe don't have that same confidence in exploration, and they're worried. So clearly there is a role for organisations to think about that people side, the emotional side of things.

[00:39:54] How do we go about that? Is there a sort of reassurance, or is it actually a little bit more brutal than that? Are there just gonna be some people who need to embrace it, or end up getting left behind, even if that's not intended?

[00:40:08] Jeremy Bassinder: Yeah. So what we would say is that those people who use AI will outperform those that don't.

[00:40:17] And that doesn't necessarily mean that AI's gonna take everyone's jobs and that kind of thing. But I do think there is a different type of person coming: one that is applying AI, using it to get work done, and being more productive in so doing. On the fear side of things, I would ask people: are you on top of your backlog?

[00:40:38] Sarah Abramson: Hmm.

[00:40:39] Jeremy Bassinder: If you are on top of your backlog and you are always getting through all of your work, well, okay, maybe then you have got something to worry about. But for most people I ask that question of, there aren't enough hours in the day. So I think you can certainly get on top of that, and then ask: what more could you go after?

[00:40:58] How fast could you get through stuff that maybe is taking you a very long time? You can apply this stuff to get through more in the week. The danger with it, though, is that as a race we're working as much as we've ever worked. You know, when you go back to sort of

[00:41:18] medieval times, the number of hours it took to support your family was way less than 40 hours a week. Now you're lucky if you only work 40 hours a week. We seem to be obsessed with cramming more and more in. And so I think there's a dimension of this that is: as we get more productive and as we use more and more of these tools,

[00:41:44] we carve out enough space for ourselves to do a little bit more, rather than just filling that space.

[00:41:49] Sarah Abramson: You're so right. It's such an irony, isn't it, that the more obsessed we get with productivity, the harder we seem to be working. It's kind of crazy. You made reference there to people changing their ways of thinking, and people coming into the workforce. So I'd like to take you back to what you'd started to talk about with education, and just explore that a little bit, because, in a parallel way to what we were talking about with L&D and with board-level thinking,

[00:42:19] in education there are very set ways of doing things. I've seen with both my kids that there have been some great examples of how they've been set work that uses AI really well, but there's also been, and I know this through other people as well, some sort of shutting down. And that's understandable, because

[00:42:41] how do educationalists get hold of AI in a way that fits with the way that they are required to do things? It's really difficult. But if we put that aside, what do you think would be a really good way, ideally, for educationalists, schools, universities, to start getting better at helping develop the next generation for the ways that they're gonna need to work?

[00:43:07] That kind of curiosity, that willingness to explore, that willingness to play, but still bringing critical thinking skills to the fore in how they work.

[00:43:17] Jeremy Bassinder: Yeah. So I'll give you an example. I did my first lecture on this topic at a university; I won't say which university it was. I got to the end of it, and one of the proctors, those are the people that enforce the rules of exams and that kind of stuff at universities,

[00:43:34] came up to me and said, how do we stop people cheating with this thing? And I said, you've gotta reframe your language. And the same has been said of these kinds of technologies forever. When books were first introduced, people said, whoa, people won't use their brains if it's all written down.

[00:43:52] And we were like, come on. So it has to start with that. Most recently, I did the inset day at my kids' school, for all their teachers, to give them a briefing on AI. Then we broke off: the teaching staff went with the digital director to do some thinking.

[00:44:13] I worked with the admin assistants and with the financial and HR parts of the school. And I heard over lunch, the headmaster said, you know, I was at a conference, and people were debating whether you should use this to generate work and then you critique it, or whether you should write work and then get it to critique it.

[00:44:37] I was like, it's not either/or.

[00:44:39] Sarah Abramson: Yeah, do everything,

[00:44:42] Jeremy Bassinder: Do everything. And I think the way in which we have created exams and so on is really about how much you can retain, not necessarily how much you can apply. And I think, therefore, we're going to have an interesting period, where these technologies have PhD-level capability in every subject under the sun.

[00:45:15] So on the IQ dimension, they already are better than human beings, and not just in one of those domains, in all of those domains at the same time. So then you've gotta say, well, what is it that humans bring to the mix? It's got to be more about the EQ side of things, the emotional quotient, if that's a thing, and the blending of those together.

[00:45:38] And so I think, you know, if you have a low-EQ, low-IQ task, well, that's probably just ripe for automation, right? If it's high EQ and low IQ, there aren't that many tasks in that kind of space, but that's probably well within the domain of human beings. High IQ, low EQ is probably one for the machines.

[00:45:58] And then where you've got the high end of both, I think, is where humans and machines are gonna work together, perhaps most beautifully. And I think, therefore, you've got to give people the skills they need to be able to do those higher-EQ things: working in teams and coming up with ideas together, and systems thinking.

[00:46:21] So, how do things come together in systems, like the system that is London, which is where I'm sitting today? It's built up of all manner of things: the transport system and the electricity and the people flow and all of that. Those are really complicated things to get at, and the city works as an organism in its own right.

[00:46:42] Bringing technology and human beings together to look at that as a problem: those are the sort of skills we need to be giving kids, to even be able to think about that kind of stuff, as well as, you know, mathematics and reading and all the things that we'd expect them to do. I don't think it's an either/or, basically.

[00:46:58] Sarah Abramson: I love it. And I think, well, we ultimately need to find the joy of being alive and working together, right? We need to understand what we gain from working with other people: teamwork, layering on creativity, different ways of working, innovation. And we need to understand how we can use tools to do the things that we enjoy as people, as humans. Ultimately, there is more purpose to work than just getting it done.

[00:47:26] It's like part of how we wanna exist and interact.

[00:47:29] Jeremy Bassinder: Exactly.

[00:47:30] Sarah Abramson: Yeah. Amazing. Oh my gosh, this has just been so cool, and I think we could talk a lot more, but I wanna respect your time.

[00:47:41] I wanna finish with a final question that I ask all of our podcast guests, which is: speaking to you as a human, what's exciting you at the moment? What are you looking forward to, or motivated by, in or out of work?

[00:47:56] Jeremy Bassinder: Yeah. So, in work, my current bit of thinking is around digital eyeballs versus human eyeballs: looking at websites, buying retail, that kind of stuff. That I'm finding

[00:48:12] really interesting, in terms of what it means for how you generate content, how people are gonna buy and purchase in the future, and what the implications are for the industry. That's getting me quite excited. And tomorrow I'm talking to a whole bunch of NEDs about that kind of concept.

[00:48:30] Sarah Abramson: NEDs being?

[00:48:32] Jeremy Bassinder: Non-executive directors, sorry. People who advise boards. In my own life, I write this food blog called The Cookbook Shelves, where I write about a different cookbook on the shelf each week, and we cook from that cookbook, and so on. And I'm getting towards the end of the shelves.

[00:48:51] So I'm excited about what's gonna be the next phase of The Cookbook Shelves, because I've got to the end of the shelf. And I'm just really excited about how this technology is gonna change the way that people work and live. I think it's very exciting to be at the forefront of that, to be using it day in, day out, and to be helping people along the journey, whether you're new to it, old to it, or just starting to learn with it, really.

[00:49:21] Sarah Abramson: That's amazing. Oh gosh, there's so much in there. I've really enjoyed this conversation, thank you so much, and I always enjoy chatting with you. We've explored so much here. It's been great.

[00:49:36] Jeremy Bassinder: I hope I didn't get too technical. I hope we've kept it at a level that will appeal to your audience; I think we did.

[00:49:42] Sarah Abramson: I think so, if I understood all the words that you said, just about; hopefully most people will have done. But if people are interested in finding out more, I think you're on LinkedIn, is that right?

[00:49:55] Jeremy Bassinder: Yep, find me on LinkedIn.

[00:49:56] Sarah Abramson: And you shared your TEDx talk on there as well, so we can share a link to that in the notes.

[00:50:02] Yeah, so,

[00:50:03] Jeremy Bassinder: And if you wanna follow me on Instagram, The Cookbook Shelves, if that's more your thing, you'll find me there.

[00:50:09] Sarah Abramson: Love it. Brilliant. Thanks so much, Jez. Really appreciate it. I hope everyone's enjoyed this conversation as much as I have.

[00:50:18] And if so, please do like, subscribe and share the podcast, and see you soon. Bye for now.

[00:50:25] Sarah Abramson: Hi, Jeremy. It's great to have you joining us on the podcast. I'm really looking forward to hearing your experiences and perspective on what you're seeing happening out there in digital technology and AI, for organisations and for all of us as individuals: how things are already starting to change in our working lives, and what that might mean for future work.

[00:50:48] You're a partner at IBM and the UK and Ireland's generative AI consulting lead. No doubt you're coming across a whole world of stuff that's honestly mind-bending to most of us. So I'm hoping we can tap into some of your insights in a way that's accessible enough for non-techies like me to understand.

[00:51:07] So I'm fully embracing being the one who asks some stupid questions in this conversation. I hope that's okay. I'll try to make sure they're not all stupid. But welcome.

[00:51:19] Jeremy Bassinder: Thank you ever so much for having me. It's a delight to be here. I love what you're doing, and this is very exciting for me to be part of.

[00:51:27] Sarah Abramson: Brilliant. Well, let's start with just hearing a bit about the work that you do at IBM, and I guess what led you into that role as well.

[00:51:35] Jeremy Bassinder: Sure. I'm Jeremy, or Jez Bassinder, as most people know me. I have been in and around technology and people for getting on for nearly 30 years; 28 years now.

[00:51:48] I am a partner in our consulting business, and I focus really on the intersection of people and technology, and most recently on generative AI; I've led that part of our business for the past couple of years. My industry of focus is consumer goods, and I'm fascinated by why people buy and what people buy, and so on.

[00:52:11] But really I sit in our customer transformation piece, which is about how organisations do more for their customers, and increasingly that's about how you do that with AI. That's me in a nutshell, really. I work with all sorts of clients, from the very large to the quite small, and do all sorts of things, a little bit in retail as well as consumer goods.

[00:52:38] Sarah Abramson: That's brilliant, and I love that you bring up from the beginning that sort of crossover of people and technology. You recently gave a TEDx talk, which I really enjoyed, called Weaving People and Digital Minds. And we should say, it's probably a good time to say, that we've known each other for absolutely ages; we were at university together,

[00:53:04] probably further ago than either of us would care to admit. One of the things that we bonded over was both coming from Yorkshire, down south at university. And your TEDx talk was based on Huddersfield and the emergence of the industrial revolution: this lovely idea of weaving, of the warp and weft, in the industrial revolution of weaving fabrics, and

[00:53:34] factories and Huddersfield. And you brought this sort of parallel of weaving people and digital minds. It would be great to hear a bit about that, and I guess why you think AI is as much about people as it is about technology.

[00:53:47] Jeremy Bassinder: So, as you say, I grew up in Huddersfield, and Huddersfield is a cloth town in the north of England.

[00:53:57] And it was more affluent than many of the other towns around it because of the cloth industry. It was also the home of some of the Luddites, and actually in Huddersfield the Luddites took someone's life, a mill owner, William Horsfall. And so I find that interesting, that kind of dynamic between the sort of positive side of

[00:54:29] technology and the fear of technology that those men predominantly had back then. And there's an irony in all of that for me: today we call somebody a Luddite because they don't understand technology. Actually, I think those gentlemen understood technology pretty well at the time.

[00:54:46] What they struggled with was that they could only see the productivity side of it. They couldn't see what might be the new businesses, or the new world, that might be created by the availability of clothing and the availability of fabric. And certainly they weren't considering that you might weave together carbon fibre into the body of an F1 car, for example.

[00:55:11] And so it's those things that I find really interesting: imagining what new businesses and new opportunities for society could be. And really, the technology is like the weft and the humans are like the warp. If you only have one, then you don't have a piece of cloth.

[00:55:32] It falls apart. And actually, I think it's the interweaving of these, people and technology together, that makes wonderful things happen. So that's where the analogy comes from.

[00:55:44] Sarah Abramson: It's fantastic. It's such a great analogy, and it's visual as well; it evokes exactly what you mean, that these things are complicated.

[00:55:54] Yeah. I like that you've used the word imagination, because I think it takes a leap for all of us, really, to go from where we are, how we use the tools that we currently use and the way that we currently work, to thinking: right, well, there's this big thing that's coming and potentially changing everything about how I work.

[00:56:13] How do I make sense of that? And I imagine that for the Luddite movement, those people at the time, it was the fear of that, and how difficult it is to make that leap of imagination, especially if you don't really understand why you should, because you feel that your value is totally embedded and entrenched in the old way of doing things, and you're fearful of how you make that leap.

[00:56:39] So I guess that's a very long question to get us to how you think people are feeling about AI at the moment, and the range of attitudes that you are coming across.

[00:56:53] Jeremy Bassinder: I think there's a variety of different attitudes, that go from almost like your enthusiast, and I'm firmly in that bucket.

[00:57:00] I mean, I'm not a technologist who builds the absolute core models of this thing; I'm much more on the applied end of things. And I suppose my role has two dimensions: one is about how we take this capability to clients to help them change their business, but also how we change ourselves, to be

[00:57:24] just what a consultant in the future, or a consultant today, is going to look like. And so I'm very much the enthusiast. You know, I play with this stuff in my spare time, I geek out on it, I use it to help me with my own blog and that kind of thing. Through to, I suppose, the sceptics, who say, well, you know, it's over-hyped, all it's doing is predicting the next word or what have you. And then you've got the people who are blissfully unaware.

[00:57:58] And I think those are the people that I worry about the most, because I think there's a danger that we're gonna leave a bunch of people behind, and this becomes a world where we have digital haves and digital have-nots. I mean, there have always been differences between people who have technology and people who haven't.

[00:58:23] But I think, given the pace at which this is coming and the way that it's becoming mainstream, there is a danger that a generation gets left behind. That's something that I'm trying to avoid, trying to bring everybody on the journey with us. Because this technology, I mean, it's been around, or was first sort of talked about, from the fifties onwards, really.

[00:58:50] So it's not brand-new technology; it's just that it seems to have been democratised recently and put in the hands of everybody, which means people can innovate and come up with new uses of it, and so on. And that's what really excites me, I think.

[00:59:06] Sarah Abramson: That's really interesting. I've got a few different questions coming out of that.

[00:59:11] There's a whole kind of inclusivity strand, isn't there, of thinking differently about what inclusivity might mean. And it's up to all of us, I suppose, not to lose the talent and the value brought by people who may not find it easy to make, even if it's not a leap, even gentle steps. Wherever people are on that spectrum of comfort with AI, how do you think we can help them make sense of change, especially change that feels rapid, and navigate it so that we're doing our best job of trying to bring everybody along?

[00:59:48] Jeremy Bassinder: You have to give people places to play. Play is really, really important, and I use that word intentionally, actually. You have to give people some ideas of what they could go and play with, and make sure they understand how this stuff works and what it's doing. They don't have to know it at a deeply technical level.

[01:00:10] But just, you know, what happens when you put a request into ChatGPT or Claude or whatever your particular poison is, as it were. What it is doing, what it's not doing, how you should use your data with it. And then I think people can start to look at, and again, I'll come back to imagination:

[01:00:32] Imagine what the art of the possible might be. And what I've found in my experience working with senior leadership teams is that you've got the technology team on the board, and they might sort of pretend that they understand this stuff, and yet secretly they don't quite understand it. And you've got the business side of the house, and they're like, well, we probably don't [01:01:00] really understand it either, but we're a bit embarrassed because we don't wanna look stupid in front of the other side.

[01:01:04] And I think one of the most amazing things to do is to bring both sides together and start grassroots, and then build up from that. And then what happens is you suddenly have these two sides of the house talking in a common language, 'cause you've managed to bridge the fact that one is embarrassed and the other one's embarrassed, but for different reasons.

[01:01:26] And then you can have a proper conversation, and people start to come up with ideas of what you might do and how you might apply it.

[01:01:32] Sarah Abramson: Yeah.

[01:01:32] Jeremy Bassinder: So I think all of that then breeds how you go about change.

[01:01:35] Sarah Abramson: And you've already used the word democratisation, and it'd be great to get your perspective on whether there's a kind of shift of power and ownership going on.

[01:01:45] Where, as you described, there's a lot of executive-level leaders who are probably feeling very unnerved. They maybe don't feel they understand things as much as they want to, to be able to make the decisions, but also to feel [01:02:00] like they are the people that can set the agenda. Whereas you've got people much more at grassroots level, or at other

[01:02:09] places in an organisational hierarchy, who are playing and are doing stuff. Is there some sort of shift of power going on there, where there are people who aren't the board-level decision makers, but who are gaining some understanding of how things might change at a deeper level than those leaders are?

[01:02:32] How do we kind of make sense of that?

[01:02:35] Jeremy Bassinder: Yeah, I think there's a few things going on in that. I think the first thing is that boards are going to have to have people that understand the technology. And I don't think it's just about how we run our systems; I think we need real

[01:02:54] evangelists who can help the board move forward. And there's an [01:03:00] interesting twist, in the sense that quite often the people who understand the technology are quite low down in the organisation, but the people who can ask the right business questions are often quite high up in the organisation. And so navigating that, I think, is a challenge. The other thing that I've observed: I was expecting, when I started this role a couple of years ago, that the younger generation were just gonna get this and the older generation would struggle. And that's not what I found at all. As you look down the organisation, it's much more about mindset.

[01:03:37] And you can have that mindset at any point. It's almost like you're moving to an AI-first way of getting work done. And so when I start work, I start work in a tool. I don't start writing my notes elsewhere, or necessarily start thinking just on my [01:04:00] own. So the manner in which people work, I think, changes.

[01:04:05] And those people that embrace that suddenly see this acceleration, which the others don't necessarily get.

[01:04:11] Sarah Abramson: It's fascinating, isn't it? I think some of this is to do with pace, because actually, you know, if we think back to how people were working 20 or 30 years ago, everything we do has transformed.

[01:04:22] Of course it has. Everything we do is different. We're using all sorts of tools and ways of achieving what we do, in ways we couldn't even have conceived, and we're fine. And obviously some people have more developed skills at using some of those tools than others, but

[01:04:44] I think we get it. That's the world that we live in. You know, my mum comfortably uses her smartphone, and she's fine. But we've got there gradually, and I think it has felt like we've kind of [01:05:00] had different things coming in at different times.

[01:05:01] We've had, you know, the internet, we've had smartphones, we've had this emergence of lots of incredible tools that have transformed the way we do things. But it feels like AI is happening very suddenly. And even you saying that when you log on, you go straight into a tool: I'm like, I don't do that. It might occur to me at some point in the day that maybe I could try using Copilot or ChatGPT for something, but I still have to make myself think of that.

[01:05:27] So how do we get to thinking about the barriers and enablers that help people overcome this feeling of pace and this sort of need to suddenly upskill? It can't be quite that sudden. So how do we help with that?

[01:05:47] Jeremy Bassinder: Well, when ChatGPT launched, it became, and still is, the fastest-growing consumer product of all time: a hundred million users in two months.

[01:05:55] And it has continued; it's that kind of network [01:06:00] effect that's happened with it. And really the maths, the technology behind it, has been around for quite a while. 2017 is probably the first instance of what's called the transformer model, which is what the T in ChatGPT stands for. But really what happened is someone put a consumer-grade experience on top of brilliant maths.

[01:06:25] Then the world went to play, and then we've seen this acceleration in the past two years, really. And I think the first thing that happened was that organisations said, well, thou shalt not use ChatGPT. But necessity has always been the mother of invention, so employees just went and grabbed their mobile phone and used it on that instead.

[01:06:47] So I think access is the first thing you've gotta do. You've gotta give people access, and you've gotta help them understand what is safe to do and what's not safe to do, and make conscious decisions about where [01:07:00] you're putting your data and how you're using these things. And then there's the different levels of risk that you might tolerate.

[01:07:06] You know, what you are gonna use this stuff for versus what you're not. Some of that will be legal, some of it will be reputational, and some of it will be operational. And what I mean by that is, there'll be some things that you can't use this for, for legal reasons, and that's pretty black and white.

[01:07:25] There's then some things that you don't want your brand to be associated with using it for, and that's much more in the reputational space. And then sometimes this stuff's really expensive, so actually there are operational reasons why you might not use it in certain ways, and you might use different forms of it.

[01:07:41] And I think as you start building these ideas for people of what you can do and how you can apply it, and you get people to play and build it for themselves, then you start to see it being adopted across organisations.

[01:07:59] Sarah Abramson: That's a nice [01:08:00] way of combining what you were talking about with play, but also with setting parameters.

[01:08:04] And do you think there's a step where organisations need to help people understand risk? Or how does that bit happen?

[01:08:13] Jeremy Bassinder: As I said, there's lots of dimensions of risk, but even just the very basics: when you put this prompt into this tool, where does that go? I'm gonna pull down information from a model that's been trained on some data; do I know whether the organisation that built that model owns that data?

[01:08:35] It's pretty clear some of these, ChatGPT and others, have been trained on content from across the open internet, and there's various things going on around rights, and whether or not they have the rights to scrape all this data and so on, and different legislatures in different countries are taking different opinions on it.

[01:08:52] The Japanese opinion seems to be, well, it's fair game, it's out in the open; in the same way as [01:09:00] Picasso looked at Rembrandt's pictures, that was inspiration for them, it's fair game. The US seems to be taking a slightly different view of things.

[01:09:11] And so there's the question of what am I pulling into my organisation, from what these models are trained on, to what I'm pushing out. I might be putting in company information, I might be putting in personal information, and I need to be conscious of what I'm doing.

[01:09:31] So that's kind of inbound and outbound IP risk, really. And then there's this kind of language of hallucinations. I mean, all these things are really doing is predicting the next word, let's say; it could be a number, it could be a postcode, but it's essentially just making a prediction. So it's just statistics.

[01:09:52] And so sometimes the statistics point to something that's not the [01:10:00] right answer, as it were, and so it hallucinates. Again, we sort of anthropomorphise this stuff and try and give it language that makes it more human in some respects. I think we've gotta be a little bit careful on that, but we probably have to come up with language that we can use to explain these things to people.

[01:10:19] But those are the kinds of risks that I see around the place: where I'm putting my data, what I'm sharing, and then making sure that I'm fact-checking it as well.
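Jez's point that these models are "just statistics" can be made concrete with a toy sketch. This is an illustrative simplification, not how any real model works; the prompt, tokens, and probabilities below are all invented for illustration, showing how the statistically likeliest continuation can be the factually wrong one.

```python
import random

# Invented next-token distribution for the prompt "The capital of Australia is".
# A model trained on text where Sydney appears more often than Canberra can
# rank the wrong continuation highest: a "hallucination" is just statistics.
next_token_probs = {
    "Sydney": 0.55,    # statistically likely, factually wrong
    "Canberra": 0.35,  # the right answer
    "Melbourne": 0.10,
}

def greedy_next_token(probs):
    """Pick the single most probable continuation (greedy decoding)."""
    return max(probs, key=probs.get)

def sample_next_token(probs, rng=random.Random(0)):
    """Sample a continuation in proportion to its probability."""
    tokens, weights = zip(*probs.items())
    return rng.choices(tokens, weights=weights, k=1)[0]

print(greedy_next_token(next_token_probs))  # → Sydney
```

The same mechanism explains why fact-checking the output matters: the model reports what is probable in its training data, not what is true.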

[01:10:29] Sarah Abramson: Yes. Well, that's a whole area, isn't it: critical thinking, I suppose, making sure that people don't take things at face value.

[01:10:38] Do you think there are good ways we can help people to do that, to apply their own knowledge, and other ways to use AI most effectively as a tool in a sort of thinking process? Like, have you seen some good implementations where it adds [01:11:00] value at certain points in a workflow or in a process, but without losing that critical thinking and that human input?

[01:11:08] Jeremy Bassinder: The way we're thinking about this is that everybody is gonna have a set of assistants around them that helps them do their work. And assistants that I find useful, you might find useful, because you do something similar to me in certain areas, and we might share those assistants. We might then connect those assistants to tools in the organisation. That could be as simple as, you know, word-processing things or PowerPoint-type stuff, or it might be out to an HR system or something like that. And those connected assistants then almost become like digital workers, and those bits can slot into our workflow to execute things on our behalf. So where do I see people doing really well with this? [01:12:00] It's where

[01:12:02] they create something that's useful for them that they can then put across to others, and then you get this kind of network effect. So some of the best things are asking the tool itself, asking Copilot, you know, how to do X. So, for example, if I'm writing a requirements document, get it to ask me the questions that

[01:12:31] I might need to answer to be able to write a really good requirements document, rather than getting it to just write the requirements document, if you see what I mean. So actually work with it as you would work with another person, to prompt you and think through things. Obviously transcription: lots of people are using it for that, writing meeting notes, that kind of stuff.

[01:12:52] But we are doing things like writing user stories or writing personas, and getting [01:13:00] the model to adopt the persona of the person that you're designing for, so that it will then give you its opinion on what that person might think, and you almost have a digital point of view coming into the design process as well.

[01:13:15] I guess I'm rambling a little bit.
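The persona technique described here is typically done by putting the persona into the model's system prompt. A minimal sketch, assuming a simple dictionary of persona fields; the persona, field names, and wording are all invented for illustration, and the resulting string would be sent as the system message to whichever chat model you use.

```python
def build_persona_prompt(persona: dict) -> str:
    """Turn a design persona into a system prompt that asks the model to
    critique a design from that person's point of view."""
    return (
        f"You are {persona['name']}, {persona['age']}, {persona['role']}. "
        f"Your goals: {persona['goals']}. "
        f"Your frustrations: {persona['frustrations']}. "
        "Stay in character and give your honest opinion, as this person, "
        "on any design that is described to you."
    )

# Hypothetical persona, for illustration only
margaret = {
    "name": "Margaret",
    "age": "a 72-year-old retiree",
    "role": "an occasional smartphone user",
    "goals": "check her bank balance without getting lost in menus",
    "frustrations": "small text, jargon, and screens that change layout",
}

system_prompt = build_persona_prompt(margaret)
print(system_prompt)
```

The point is not the string template; it is that the persona, normally a static design artefact, becomes an active voice the team can question during design.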

[01:13:16] Sarah Abramson: No, you're not at all. I think it brings us right back to that point about imagination, because we have the potential to change so many steps in the way that we work. There's two things, aren't there? One is that you're almost on autopilot with how you work.

[01:13:30] You know, especially those of us that have been in our careers for a while, you have embedded habits and things that you don't really think about; it's just how you do stuff. I mean, I am excited about the opportunities of AI, and I'm not averse to using these technologies at all, but

[01:13:51] I have to stop myself to think about how and when I could use them. And that kind of habit, that kind of imagination, I think is part of [01:14:00] this leap that it feels like we need to make. Maybe we don't; maybe it could be baby steps a little bit more. I wonder if one of the things that helps with that is people having the ability to share,

[01:14:13] like seeing examples, seeing how other people are working, just being surrounded by other ways that people are doing stuff. Are you seeing that kind of sharing happening, and some good examples of it?

[01:14:26] Jeremy Bassinder: Hugely, yeah. So I write an internal blog every fortnight where we just find somebody who's doing something interesting, and then we record a video of them doing it and just share that out: everything from, you know, writing blog posts to writing code. And code is a huge thing now, where people are sort of

[01:14:52] actually almost creating whole development teams that are doing [01:15:00] the creation of apps and so on. So that's pretty fascinating. We've done sort of A/B testing between teams working with generative AI and teams not working with generative AI, and you see 50 to 60% productivity gains, and people write better quality,

[01:15:19] 'cause that's not always something people think about either. They immediately go to the productivity side of things, and yes, you're expecting to get things faster, but you don't necessarily think you're gonna get things better as well, which is definitely the case. What sort of things have I seen that are interesting in how people have applied this stuff?

[01:15:45] Um, I think my mind has gone completely blank.

[01:15:52] Sarah Abramson: That's okay. No, I'm picking up that point about productivity. I mean, we've certainly come across [01:16:00] that with people who are talking about AI in terms of the efficiencies, and at the moment that's acute because, you know, we're all under pressure.

[01:16:11] Organisations are all under pressure. Things are quite tricky in the world: there's economic uncertainty, there's shifting markets; there's a lot of things that we feel under pressure to navigate, create efficiencies, all of that. And plus, you know, as soon as you have a financial director on the board who thinks there's a way they can reduce costs, of course they're going to want to.

[01:16:37] So we've come across quite a few people saying we need to introduce AI to create efficiencies, and I'm not suggesting that's not worth exploring or isn't valuable, but it seems, like you say, that that kind of misses the value that we could be getting. You know, we can't only be thinking about AI in terms of efficiency; it should be something that improves [01:17:00] stuff for us.

[01:17:01] Jeremy Bassinder: Well, we do a lot of work with Wimbledon at IBM, and have for years. If we're watching the Women's Final in June of next year, well, we've now done generative commentary: you could take the highlights, or in real time at some point, and be able to do a different commentary for each person that's watching it.

[01:17:32] One person might be interested in the rules of the game, and they could have one commentary. Another person might be interested in the stats, how fast the ball was. Another person might just care what the fashion is in the royal box. There's no other way you can do that as a business, because you just can't deal with the volume of recording that many versions of the commentary, let alone [01:18:00] voicing them with the same person at the same time, in real time.

[01:18:03] And those kinds of things, I think, are then applying the art of the possible: what could the value be? And the value's interesting, in that people quite easily get into the business value or the human value, but you've got to connect the two. So if I have something that's really valuable, but it's so badly designed that nobody uses it, well, then it doesn't get adopted.

[01:18:30] So it's valueless. If I've got something that's well designed and it's kind of cool, but it doesn't actually drive any business value, it's just a toy. And so this constant looking at what it means for the human, what it means for the business, how you link that together, and that's what you should go and build, I find fascinating.

[01:18:50] And that's kind of where we go when we're starting to look at what are the things that we should build next for an organisation.

[01:18:57] Sarah Abramson: Yeah. And I guess in terms of the sort of strategy of [01:19:00] development, on the tech side and the people side, like supporting people to continue building that ability to use the tools, but to think critically about them, alongside implementation of the technology.

[01:19:15] It feels like we need to think about parallel strategies, really. Is that how you approach things?

[01:19:23] Jeremy Bassinder: The way I think about things is this: historically, when we developed technology, it was a thing that happened over a period of time. You had a starting point of the project and then an end point, and then you went on to the next thing.

[01:19:40] I think where we're starting to go now is that these are continuous. So your, I dunno, your marketing platform or whatever it might be, these things are forever, and so they're constantly iterating; it's not started at one point and then the project finishes. These are products, [01:20:00] digital products that say

[01:20:02] as much about your organisation as the physical products that you sell, or the services that you offer. These things are there forever, and therefore you need to build them with that mindset: we're gonna build these things, we're gonna feed them, water them, we're gonna improve them.

[01:20:22] And then look at how people are using them, and make sure that they're still offering, at day X in the future, the value that they offered on day one.

[01:20:36] Sarah Abramson: Yeah. And there's no point at which you've kind of rolled out a new system and then you train people on it; it's just a total breaking of that way of doing things, it's continuous.

[01:20:44] And with what you were just saying there, can you bring people in, with their insights and what they're actually doing, to the ongoing development of the technology? How can that happen? How can organisations [01:21:00] do that kind of thing?

[01:21:01] Jeremy Bassinder: Well, both overtly and covertly. Maybe covertly is not the nicest way of describing it, but it sort of works.

[01:21:08] So I can overtly ask people what they need, and try and do, you know, that classic sort of "you are not your user". So do proper user research and try and understand people; even if you think you know what they want, actually go and ask them, look at what they do, try and understand them, and then build for that.

[01:21:27] But the other thing is that you put measurement systems behind the scenes to look at what people are actually doing. When do people struggle on the site? That's probably a point where it needs a bit of redesign. Or what are the transactions people are asking for most? We have an internal system called Ask HR, which sits across all of our HR and

[01:21:51] expenses and travel and all manner of systems, and you just come in through a chat window. And when they were first building that, [01:22:00] they built all of the analytics on the back end to see what questions people asked. And initially all it did was serve up what the company policy was, or how you'd use this system to do this or that.

[01:22:12] But over time you then say, well, I'm getting thousands of these requests every day; maybe we should automate that request. And then you can get at things incrementally. And then, once you start talking to people, you can build personas of people and test them against what they actually think of the products you're designing, but also build synthetic personas to try and model what digital versions of those people might do.
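The Ask HR pattern described here, log what people ask and then automate the most frequent requests, can be sketched in a few lines. The question log and threshold below are invented for illustration; a real system would mine thousands of logged chat queries per day.

```python
from collections import Counter

# Hypothetical question log from a chat window (in reality, thousands per day)
question_log = [
    "how do I claim travel expenses",
    "what is the parental leave policy",
    "how do I claim travel expenses",
    "how do I book annual leave",
    "how do I claim travel expenses",
    "how do I book annual leave",
]

def automation_candidates(log, threshold=2):
    """Return questions asked more than `threshold` times, most frequent
    first: the incremental automation targets described above."""
    counts = Counter(log)
    return [(q, n) for q, n in counts.most_common() if n > threshold]

print(automation_candidates(question_log))
# → [('how do I claim travel expenses', 3)]
```

Lowering the threshold widens the net, so the same analytics drive both the first quick wins and the later, longer tail of automation.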

[01:22:42] Sarah Abramson: I feel like for most of us, this kind of stuff will make a lot more sense in 10 years, when we go, oh yeah, of course, I understand what you're talking about. Whereas now, I'm really conscious that a lot of this sounds very sophisticated and futuristic. I'm sure there are people listening to this, because I talk to [01:23:00] people all the time, who are feeling a little bit like, we're not doing much yet.

[01:23:04] How would you advise people who feel like that to break things down a little bit, to make it feel smaller, and to not feel like the technology's already gotten away from them? How do we do things step by step, from how things feel normal now to gently venturing into this future?

[01:23:30] Jeremy Bassinder: I think you have to start with something where you have a need, and maybe it's not a need in your work; it might be a need at home. Like I said, I write a cookbook blog, and for me there was a need to ask, how do I automate that a little bit? And actually that's a real thing I can go and play with, and I can just start to make my life easier in that sense. [01:24:00] And I think

[01:24:06] this thing ain't going away. And so I think it's to everyone's benefit, if that's the right word, to just try and learn a bit about it. And again, I come back to the haves and have-nots. I mean, just go and play, play safely, and then start to go, well, what is the art of the possible of what I can do?

[01:24:38] And then, how might I bring that into a work environment? My brother, for example, is social media director for a bank, and so he talks to me about this kind of stuff, as you would expect, in the same way as you have done. And he said, well, we've just got Copilot.

[01:24:58] What could I do with it? [01:25:00] And I said, well, why not try and make an assistant that can sound like the bank? And so he's built the bank's tone of voice into an assistant. In his world, what he's doing is taking some of the communications that he would normally write and making sure that they're in the tone of voice, which he knows perfectly for that bank.

[01:25:24] What he's able to do then is give that to others, so that they can write in the right tone of voice first time. Clearly someone's gonna read through it and make sure it's gone through before it goes out the door, but the amount of time it then takes to check, is it on brand, is it the right sort of tone, is much shorter than if

[01:25:47] they didn't have that assistant. It's early days, they're playing with it, and it's not necessarily going to go live yet, but being able to apply it in the work that you do, he's found [01:26:00] that revelatory really, to be able to see how he can apply it in a marketing context.

[01:26:08] Sarah Abramson: That's really cool, and a great example. I think what occurs to me is that some people are gonna be more comfortable even just with playing with stuff, and that in itself is almost a skill, being willing to experiment. You know, some people would like to know what they're doing, and they like to feel they've got the skills to do it.

[01:26:31] So for organisations to upskill employees, what do we need to look at? Is there a role for learning and development teams? Is there something in the culture, or in the way that we empower people, so that it doesn't feel overwhelming, but it feels like everyone can do this and we need everyone to do it,

[01:26:51] with a sort of accessible way in?

[01:26:54] Jeremy Bassinder: It's funny, I was just talking with our learning and development team. I think you have to [01:27:00] embed it in every bit of learning. So I don't think you should have, this is the AI training bit; it should be, how are we applying these tools in whatever it is that we're learning?

[01:27:13] So you can use these tools to generate a project plan and come up with a risk log. In fact, one of the assistants I wrote was about how to write better risks. People are lousy at writing risks, so actually write an assistant to help you write better risks. I'm not saying it will write all the risks;

[01:27:35] I'm saying, you know, take a fairly average human-written risk and give it to the machine, and the machine says, well, what about this? What are the mitigations? What's the timeline that this is gonna happen in? What's the impact? You know, really sort of be your critical friend. And so I think embedding

[01:27:51] these technologies in every bit of learning and development is really important. And I think that's the general thing, really. I've done some stuff at schools as well, and [01:28:00] people think in the one dimension of, oh, well, it's a technology thing. No, you can apply it to every single subject in the school,

[01:28:11] whether that's philosophy or economics or art or sport. Whatever it is, there is a way in which you can embed AI and the use of it in there. And I think that's what you have to do in an organisation as well.

[01:28:24] Sarah Abramson: I love that. And I'd like to come back to education in a minute, because it's such an interesting topic.

[01:28:29] I just wanna get your perspective on it. But just staying with L&D for a minute: do you think, like you were saying at board level, that we need technologists, or people that understand the technology, to be contributing to strategy development at the very top level? Do we need people with that kind of background in L&D as well?

[01:28:51] Or is it something that L&D teams need to sort of reframe for themselves? How do they train themselves to do that?

[01:28:57] Jeremy Bassinder: I mean, people always sort of [01:29:00] say, oh, come and help us write an AI strategy. Okay, surely that's just your strategy,

[01:29:06] Sarah Abramson: right?

[01:29:07] Jeremy Bassinder: I firmly believe that as we look forward, they're inextricably linked.

[01:29:14] The business strategy and the AI, the technology, strategy are one and the same thing, because you have to be applying technology to what you're doing. And we always have, but we've just chosen to compartmentalise them into different buckets. Increasingly, I think, you're going to have a single strategy about what we do and how we apply technology in the way we do it.

[01:29:38] So I then think everybody, whether you're L&D, whether you're R&D, whatever you are, has a dimension of what you do that covers how we're gonna be applying the latest technology to help us be, yes, more productive, but also more imaginative and more creative.

[01:29:55] Sarah Abramson: Yeah. And again, I think it comes back to that, that pace thing, doesn't it?

[01:29:58] Yeah. You know, we don't have a [01:30:00] team that does the internet for us. Exactly.

[01:30:04] Jeremy Bassinder: Yes. That's quite correct.

[01:30:07] Sarah Abramson: But it does feel like a shift, because it's how we work as well, isn't it? It is that kind of playing, that willingness to experiment,

[01:30:17] Jeremy Bassinder: and then taking that to sort of enterprise-grade production levels, you know, that's an

[01:30:26] art in itself. You know, it's not trivial to do this stuff, but I do think you have to start with the imagination.

[01:30:33] Sarah Abramson: Yeah. And that links us right back to what you were talking about at the beginning, with the Luddites and that leap of imagination, but also the fear. The reason the Luddite rebellion kicked off was that people were afraid of losing their jobs, and

[01:30:49] it's very interesting, isn't it? Clearly you're someone that is very comfortable with exploring technology and leaning into that whole uncertainty side of things; [01:31:00] you're excited about the potential that digital technologies and AI offer us, and you're happy to roll up your sleeves and have a go.

[01:31:10] But clearly a lot of people are anxious about their jobs. They maybe don't have that same confidence in exploration, and are worried. So do you think there's a role for organisations, well, clearly there is a role for organisations, to think about that sort of people, emotional side of things?

[01:31:30] How do we go about that? Is there a sort of reassurance, or is it actually a little bit more brutal than that? Are there just gonna be some people who need to embrace it or end up getting left behind, even if that's not intended?

[01:31:45] Jeremy Bassinder: Yeah, so what we would say is that those people who use AI will outperform those that don't.

[01:31:54] And that doesn't necessarily mean that AI is gonna take everyone's [01:32:00] jobs and that kind of thing. But I do think there is a different type of person coming who is applying AI, using it to get work done, and being more productive in so doing. On the fear side of things, I would ask people a question: are you on top of your backlog?

[01:32:19] Sarah Abramson: Hmm.

[01:32:20] Jeremy Bassinder: So if you are on top of your backlog and you are always getting through all of your work, well, okay, maybe then you have got something to worry about. But for most people I ask that question to, there aren't enough hours in the day. So I think you can certainly get on top of that, and then: what more could you get after?

[01:32:43] How fast could you get through stuff that maybe is taking you a very long time? You can apply this stuff to get through more in the week. The danger with it, though, is that as a race we're [01:33:00] working as much as we've ever worked. When you go back to

[01:33:05] medieval times, the number of hours it took to support your family was way less than 40 hours a week. Now you're lucky if you only work 40 hours a week. We seem to be obsessed with cramming more and more in, so I think there's a dimension of this where, as we get more productive and use more and more of these tools,

[01:33:32] we carve out enough space for ourselves, rather than just cramming in a little bit more.

[01:33:37] Sarah Abramson: You're so right. It's such an irony, isn't it, that the more obsessed we get with productivity, the harder we seem to be working. It's kind of crazy. You made reference there to people changing

[01:33:52] their ways of thinking, and to people coming into the workforce. So I'd like to take you back to what you'd started to talk about with [01:34:00] education, and just explore that a little bit, because, in a parallel way to what we were talking about with L&D and with board-level thinking,

[01:34:11] in education there are very set ways of doing things. I've seen with both my kids that there have been some great examples of how they've been set work that uses AI really well, but there's also been, and I know this through other people as well, some sort of shutting down. And that's understandable, because

[01:34:35] how do educationalists get hold of AI in a way that fits with the way they are required to do things? It's really difficult. But if we put that aside, what do you think, ideally, would be a really good way for educationalists, schools, universities [01:35:00] to start getting better at helping develop the next generation for the ways they're gonna need to work?

[01:35:01] That kind of curiosity, that willingness to explore, that willingness to play, but still bringing critical thinking skills to the fore in how they work.

[01:35:11] Jeremy Bassinder: Yeah. So I'll give you an example. I did my first lecture on this topic at a university, I won't say which one. I got to the end of it, and one of the proctors, those are the people who enforce the rules of exams and that kind of stuff at universities,

[01:35:29] came up to me and said, how do we stop people cheating with this thing? And I said, you've got to reframe your language. The same has been said of these kinds of technologies forever. When books were first introduced, people said, whoa, people won't use their brains if it's all written down.

[01:35:48] And we were like, come on. So it has to start with that. Then, most recently, I did the inset day at my kids' [01:36:00] school, giving all their teachers a briefing on AI. Then we broke off: the teaching staff went with the digital director to do some thinking.

[01:36:11] I worked with the admin assistants and with the financial and HR parts of the school. And over lunch I heard the headmaster say, you know, I was at a conference and people were debating whether you should use this to generate work and then critique it, or whether you should write work and then get it to critique it.

[01:36:40] I was like, it's not either-or.

[01:36:42] Sarah Abramson: Yeah,

[01:36:43] do everything,

[01:36:45] Jeremy Bassinder: Do everything. And I think the way in which we have created exams and so on [01:37:00] is really about how much you can retain, not necessarily how much you can apply. And I think, therefore, we're going to have an interesting period, because these technologies have PhD-level capability in every subject under the sun.

[01:37:20] So on the IQ dimension, they already are better than human beings, and not just in one of those domains, in all of those domains at the same time. So then you've gotta say, well, what is it that humans bring to the mix? It's got to be more about the EQ side of things, the emotional quotient, if that's a thing, and the blending of those together.

[01:37:43] And so I think, if you have a low-EQ, low-IQ task, well, that's probably just ripe for automation, right? If it's high-EQ and low-IQ, there aren't that many tasks in that kind of space, but that's probably well within the domain of human [01:38:00] beings. High-IQ, low-EQ is probably one for these machines.

[01:38:04] And then where you've got both high is where I think humans and machines are gonna work together, perhaps most beautifully. And therefore you've got to give people the skills they need to be able to do those higher-EQ things: working in teams, coming up with ideas together, and systems thinking.

[01:38:29] So how do things come together in systems, like the system that is London, which is where I'm sitting today? It's built up of all manner of things: the transport system, the electricity, the flow of people. Those are really complicated things to get at. And how does the city work as an organism in its own right?

[01:38:50] Bringing technology and human beings together to look at that as a problem, those are the sorts of skills we need to be giving kids, to even be able to think about that kind of stuff, as [01:39:00] well as mathematics and reading and all the things we'd expect them to do. I don't think it's an either-or, basically.

[01:39:06] Sarah Abramson: I love it. And I think we ultimately need to find the joy of

[01:39:10] being alive and working together, right? We need to understand what we gain from working with other people: teamwork, layering on creativity, different ways of working, innovation. And understand how we can use tools to do the things that we enjoy as people, as humans. Ultimately there is more purpose to work than just getting it done.

[01:39:34] It's part of how we want to exist and interact.

[01:39:37] Jeremy Bassinder: Exactly.

[01:39:38] Sarah Abramson: Yeah. Amazing. Oh my gosh, this has just been so cool, and I think we could talk a lot more, but I wanna respect your time. I don't know what you bill by the hour, but I'll get the invoice later, Jez.

[01:39:55] Jeremy Bassinder: Probably not enough.

[01:39:58] Sarah Abramson: But I wanna finish with a final question [01:40:00] that I ask all of our podcast guests, which is, speaking to you as a human,

[01:40:04] what's exciting you at the moment? What are you looking forward to, or motivated by, in or out of work?

[01:40:13] Jeremy Bassinder: Yeah. Well, look, in work, my current bit of thinking is around digital eyeballs versus human eyeballs: looking at websites, buying retail, that kind of stuff. I'm finding it

[01:40:32] really interesting, what that means for how you generate content, how people are gonna buy and purchase in the future, and what the implications are for the industry. That's getting me quite excited. And tomorrow I'm talking to a whole bunch of NEDs about that kind of concept.

[01:40:52] Sarah Abramson: NEDs being?

[01:40:54] Jeremy Bassinder: Non-executive directors, sorry. People who advise boards. In [01:41:00] my own life, I write this food blog called The Cookbook Shelves. I write about a different cookbook on the shelf each week, and we cook from that cookbook. And I'm getting towards the end of the shelves.

[01:41:17] So I'm excited about what the next phase of The Cookbook Shelves is gonna be, because I've got to the end of the shelf. And I'm just really excited about how this technology is gonna change the way that people work and live. I think it's very exciting to be at the forefront of that, to be using it day in, day out, and helping people along the journey, whether you're new to it, old to it, or just starting to learn with it.

[01:41:49] Sarah Abramson: That's amazing. Oh gosh, there's so much in there. I've really enjoyed this conversation, thank you so much, and I always enjoy chatting with you. We've explored so much [01:42:00] here. Thank you, it's been great.

[01:42:04] Jeremy Bassinder: I hope I didn't get too technical, and that we've kept it at a level that appeals to your audience. I think we did.

[01:42:10] Sarah Abramson: I think so. I just about understood all the words you said, and hopefully most people will have done. But if people are interested in finding out more, I think you're on LinkedIn, is that right? Yep.

[01:42:24] Jeremy Bassinder: Find me on LinkedIn. Yep.

[01:42:26] Sarah Abramson: And you shared your TEDx talk on there as well, so we can share a link to that in the notes.

[01:42:33] Jeremy Bassinder: And if you wanna follow me on Instagram, The Cookbook Shelves, if that's more your thing, you'll find me there.

[01:42:41] Sarah Abramson: Love it. Brilliant. Thank you so much, Jez. Really appreciate it. I hope everyone's enjoyed this conversation as much as I have.

[01:42:50] If so, please do like, subscribe and share the podcast, and see you soon. Bye for now.

[01:42:58] [01:43:00] Sarah Abramson: Hi, Jeremy. It's great to have you joining us on the podcast. I'm really looking forward to hearing your experiences and perspective on what you're seeing happening out there in digital technology and AI, for organisations and for all of us as individuals: how things are already starting to change in our working lives, and what that might mean for future work.

[01:43:22] You are a partner at IBM and its UK and Ireland generative AI consulting leader. No doubt you're coming across a whole world of stuff that's honestly mind-bending to most of us, so I'm hoping we can tap into some of your insights in a way that's accessible enough for non-techies like me to understand.

[01:43:40] So I'm fully intending to embrace being the one who asks some stupid questions in this conversation. I hope that's okay. I'll try to make sure they're not all stupid. But welcome.

[01:43:52] Jeremy Bassinder: Thank you ever so much for having me. It's a delight to be here. I love what you're doing, and this is very, very [01:44:00] exciting for me to be part of.

[01:44:01] Sarah Abramson: Brilliant. Well, let's start with just hearing a bit about the work that you do at IBM, and I guess what led you into that role as well.

[01:44:09] Jeremy Bassinder: Sure. I'm Jeremy, or Jez Bassinder as most people know me. I have been in and around technology and people for getting on for 30 years, well, 28 years now.

[01:44:22] I am a partner in our consulting business, and I focus on the intersection of people and technology, and most recently on generative AI; I've led that part of our business for the past couple of years. My industry of focus is consumer goods. I'm fascinated by why people buy and what people buy.

[01:44:45] But really I sit in our customer transformation piece, which is about how organisations do more for their customers, and increasingly that's about how you do that with AI. That's me in a [01:45:00] nutshell. I work with all sorts of clients, from the very large to the quite small, and do all sorts of things in retail as well as consumer goods.

[01:45:11] Sarah Abramson: That's brilliant, and I love that you bring in from the beginning that crossover of people and technology. You recently gave a TEDx talk, which I really enjoyed, called Weaving People and Digital Minds. We should say, and it's probably a good time to say, that we've known each other for absolutely ages; we were at university together,

[01:45:37] probably longer ago than either of us would care to admit. One of the things we bonded over was both coming from Yorkshire down south to university, and your TEDx talk was based on Huddersfield and the emergence of the industrial revolution: this lovely idea of [01:46:00] weaving, of the warp and weft, in the industrial revolution of weaving fabrics, and

[01:46:07] the factories in Huddersfield. And you brought in this parallel of weaving people and digital minds. It would be great to hear a bit about that, and I guess why you think AI is as much about people as it is about technology.

[01:46:21] Jeremy Bassinder: So, as you say, I grew up in Huddersfield, and Huddersfield is a cloth town in the north of England.

[01:46:30] It was more affluent than many of the other towns around it because of the cloth industry. And it was also the home of some of the Luddites; in fact, in Huddersfield the Luddites took someone's life, a mill owner, William Horsfall. And I find that interesting, that dynamic between the [01:47:00] sort of positive side of

[01:47:02] technology and the fear of technology that those men, predominantly, had back then. And there's an irony in all of that for me: today we call somebody a Luddite because they don't understand technology, but actually I think those gentlemen understood the technology pretty well at the time.

[01:47:20] What they struggled with was that they could only see the productivity side of it. They couldn't see what might be the new businesses, or the new world, that might be created by the availability of clothing and fabric. And certainly they weren't considering that you might weave carbon fibre into the body of an F1 car, for example.

[01:47:44] It's those things that I find really interesting: imagining what the new businesses and new opportunities for society could be. And really, the technology is like the [01:48:00] weft and the humans are like the warp. If you only have one, then you don't have a piece of cloth;

[01:48:05] it falls apart. I think it's the interweaving of people and technology together that makes wonderful things happen. So that's where the analogy comes from.

[01:48:18] Sarah Abramson: It's such a great analogy, and it's visual as well; it evokes exactly what you mean, that these things are complicated.

[01:48:28] Yeah. I like that you've used the word imagination, because I think it takes a leap for all of us, really, to go from where we are, how we use the tools we currently use and the way we currently work, to thinking: right, well, there's this big thing that's coming and potentially changing everything about how I work.

[01:48:46] How do I make sense of that? And I imagine that for the Luddite movement, those people at the time, it was the fear of that, and how difficult it is to make that leap of imagination, especially if you [01:49:00] don't really understand why you should, because you feel that your value is totally embedded and entrenched in the old way of doing things, and you're fearful of how you make that leap.

[01:49:12] So, a very long question, to get us to: how do you think people are feeling about AI at the moment, and what range of attitudes are you coming across?

[01:49:27] Jeremy Bassinder: Yeah, I think there's a variety of different attitudes. They go from your enthusiasts, and I'm firmly in that bucket. I mean, I'm not

[01:49:33] a technologist who builds the absolute core models of this thing; I'm much more on the applied end of things. And my role has two dimensions: one is how do we take this capability to clients to help them change their business, but also how do we change ourselves,

[01:49:58] what a consultant of the future [01:50:00], or a consultant today, is going to look like. So I'm very much the enthusiast. I play with this stuff in my spare time, I geek out on it, I use it to help me with my own blog and that kind of thing. Then through to, I suppose, the sceptics, who say, well, it's over-hyped, all it's doing is predicting the next word or what have you. And then you've got the people who are blissfully unaware.

[01:50:32] Those are the people I worry about the most, because I think there's a danger that we're gonna leave a bunch of people behind, and this becomes a world where we have digital haves and digital have-nots. I mean, there have always been differences between people who have technology and people who haven't,

[01:50:57] but with the pace at which this is coming, and the way [01:51:00] it's becoming mainstream, there is a danger that a generation gets left behind. That's something I'm trying to avoid, trying to bring everybody on the journey with us, because this technology has been around, or was first talked about, from the fifties onwards, really.

[01:51:24] So it's not brand-new technology; it's just that it seems to have been democratised recently and put in the hands of everybody, which means people can innovate and come up with new uses of it. And that's what really excites me.

[01:51:40] Sarah Abramson: That's really interesting. I can see a few different questions coming out of that.

[01:51:44] There's a whole inclusivity strand there, isn't there, of thinking differently about what inclusivity might mean. And it's up to all of us, I suppose, not to lose the talent and the value of people who may not find it easy to take even gentle steps, let alone [01:52:00] a leap. Wherever people are on that spectrum of comfort with AI, how do you think we can help them make sense of change, especially change that feels rapid, and navigate it, so that we're doing our best job of trying to bring everybody along?

[01:52:22] Jeremy Bassinder: You have to give people places to play. Play is really, really important, and I use that word quite intentionally. You have to give people some ideas of what they could go and play with, so that they understand how this stuff works and what it's doing. They don't have to know it at a deeply technical level,

[01:52:44] but just, you know, what happens when you put a request into ChatGPT or Claude or whatever your particular poison is, as it were: what it is doing, what it's not doing, how you should use your data with it. [01:53:00] And then I think people can start to look at, and again I'll come back to imagination,

[01:53:06] imagine what the art of the possible might be. What I've found in my experience working with senior leadership teams is that you've got the technology team on the board, and they might pretend they understand this stuff, and yet secretly they don't quite understand it. And you've got the business side of the house, and they're like, well, we probably don't really understand it either, but we're a bit embarrassed because we don't wanna look stupid in front of the other side.

[01:53:38] One of the most amazing things to do is to bring both sides together, start at the grassroots and build up from that. And then what happens is you suddenly have these two sides of the house talking

[01:53:59] [01:54:00] in a common language, because you've managed to bridge the fact that both are embarrassed, but for different reasons. Then you can have a proper conversation, and people start to come up with ideas of what you might do and how you might apply it.

[01:54:05] Sarah Abramson: Yeah.

[01:54:05] Jeremy Bassinder: So I think all of that then breeds how you go about change.

[01:54:09] Sarah Abramson: And you've already used the word democratisation, and it would be great to get your perspective on whether there's a kind of shift of power and ownership going on.

[01:54:18] As you described, there are a lot of executive-level leaders who are probably feeling very unnerved. They maybe don't feel they understand things as much as they want to, to be able to make the decisions, but also to feel like they are the people who can set the agenda. Whereas you've got people much more at the grassroots level, or at other

[01:54:42] places in an organisational hierarchy, who are playing and doing stuff. Is there some sort of shift of power going on, where there are people who aren't the board-level decision-makers, but who are gaining an [01:55:00] understanding of how things might change at a deeper level than those leaders?

[01:55:05] How do we make sense of that?

[01:55:08] Jeremy Bassinder: Yeah, I think there are a few things going on in that. The first is that boards are going to have to have people who understand the technology, and I don't think it's just about how we run our systems. I think we need real

[01:55:28] evangelists who can help the board move forward. And there's an interesting twist, in the sense that quite often the people who understand the technology are quite low down in the organisation, but the people who can ask the right business questions are often quite high up. Navigating that, I think, is a challenge. The other thing I've observed: I was expecting, when I started this role a couple of years ago, that the younger generation were just gonna get this and the older [01:56:00] generation would struggle. And that's not what I found at all. As you look down the organisation, it's much more about mindset.

[01:56:10] You can have that mindset at any point. It's almost like moving to an AI-first way of getting work done. So when I start work, I start in a tool; I don't start writing my notes elsewhere, or necessarily start thinking just on my own. The manner in which people work, I think, changes.

[01:56:39] And those people who embrace that suddenly see this acceleration, which the others don't necessarily get.

[01:56:45] Sarah Abramson: It's fascinating, isn't it? I think some of this is to do with pace, because if we think back to how people were working 20 or 30 years ago, everything we do has been transformed.

[01:56:55] Of course it has. Everything we do is different. We're using all sorts of tools and [01:57:00] ways of achieving what we do, in ways we couldn't even have conceived, and we're fine. Obviously some people have more developed skills at using some of those tools than others, but

[01:57:17] I think we get it. That's the world that we live in. You know, my mum comfortably uses her smartphone, and she's fine. But we've got there gradually, and it has felt like we've had different things coming in at different times.

[01:57:35] We've had the internet, we've had smartphones, we've had this emergence of lots of incredible tools that have transformed the way we do things. But it feels like AI is happening very suddenly. Even you saying that when you log on, you go straight into a tool: I don't do that. It might occur to me at some point in the day that maybe I could try using Copilot or ChatGPT for something, but I still have to make [01:58:00] myself think of that.

[01:58:01] So how do we think about the barriers and enablers here? How do we help people overcome this feeling of pace, this need to suddenly upskill? It can't be quite that sudden. So how do we help with that?

[01:58:20] Jeremy Bassinder: Well, ChatGPT launched and became, and is, the fastest-growing consumer product of all time: a hundred million users in two months.

[01:58:28] And it has continued; it's that kind of network effect. Really, the maths, the technology behind it, had been around for quite a while. 2017 is probably the first instance of what's called the transformer model, which is what the T in ChatGPT stands for. But what happened is that someone put a consumer-grade experience on top of brilliant maths.

[01:58:58] Then the world went to play. [01:59:00] And then we've seen this acceleration over the past two years, really. I think the first thing that happened was that organisations said, thou shalt not use ChatGPT. But necessity has always been the mother of invention, so employees just grabbed their mobile phones and used it on those instead.

[01:59:20] So I think access is the first thing: you've gotta give people access, and you've gotta help them understand what is safe to do and what's not, to make conscious decisions about where they're putting their data and how they're using these things. And then there are the different levels of risk you might tolerate:

[01:59:40] what you are gonna use this stuff for versus what you're not. Some of that will be legal, some of it reputational, and some of it operational. What I mean by that is, there will be some things that you can't use this for, for legal reasons, and that's pretty black and white.

[01:59:59] Then there are some things [02:00:00] that you don't want your brand to be associated with using it for, and that's much more in the reputational space. And sometimes this stuff is really expensive, so there are operational reasons why you might not use it in certain ways, or might use different forms of it.

[02:00:15] And I think as you build these ideas for people, on what you can do and how you can apply it, and then get people to play and build it for themselves, then you start to see it being adopted across organisations.

[02:00:32] Sarah Abramson: That's a nice way of combining what you were talking about with play, but also with having set parameters.

[02:00:37] And do you think there's a step where organisations need to help people understand risk? How does that bit happen?

[02:00:46] Jeremy Bassinder: As I said, there are lots of dimensions of risk, but just take the very basics: when you put this prompt into this tool, where does that go? I'm gonna pull down [02:01:00] information from a model that's been trained on some data; do I know whether the organisation that built that model owns that data?

[02:01:08] It's pretty clear some of these, ChatGPT and others, have been trained on content from across the open internet, and there are various things going on around rights, whether or not they have the rights to scrape all this data and so on, and different countries' legislation is taking different opinions on it.

[02:01:26] The Japanese opinion seems to be, well, it's fair game, it's out in the open: in the same way that Picasso looking at Rembrandt's pictures was inspiration for him, it's fair game. The US seems to be taking a slightly different view of things.

[02:01:44] And so there's the question of what I'm pulling into my organisation, from what these models are trained on, through to what I'm pushing out. I might be putting in company information, I might be putting in [02:02:00] personal information, and I need to be conscious of what I'm doing.

[02:02:04] So that's inbound and outbound IP risk, really. And then there's this kind of language of hallucinations. All these things are really doing is predicting the next word, let's say; it could be a number, it could be a postcode, but it's essentially just making a prediction. It's just statistics.

[02:02:26] And so sometimes the statistics point to something that's not the right answer, as it were, and so it hallucinates. Again, we sort of anthropomorphise this stuff and give it language that makes it more human in some respects. I think we've gotta be a little bit careful about that, but we probably do have to come up with language we can use to explain these things to people.

[02:02:53] But those are the kinds of risks I see around the place: where I'm putting my data, what I'm sharing, and then [02:03:00] making sure I'm fact-checking it as well.

[02:03:02] Sarah Abramson: Yes. Well that's a whole area, isn't it, of, yeah. Critical thinking, I suppose, of, of, um, making sure that people don't take things at face value.

[02:03:11] Do you think there are good ways we can help people to do that, to apply their own knowledge, and other ways to use AI most effectively as a tool in a thinking process? Have you seen some good implementations where it adds value at certain points in a workflow or a process, but without losing that critical thinking and that human input?

[02:03:41] Jeremy Bassinder: The way we're thinking about this is that everybody is going to have a set of assistants around them that helps them do their work. Assistants that I find useful, you might find useful, because you do something similar to me in certain areas, and we might share those assistants. [02:04:00] We might then connect those assistants to tools in the organisation. That could be as simple as word processing or PowerPoint-type things, or it might be out to an HR system or something like that. Those connected assistants then almost become like digital workers, and those bits can slot into our workflow to execute things on our behalf. So where do I see people doing really well with this?

[02:04:35] It's where they create something that's useful for them, which they can then put across to others, and you get this kind of network effect. Some of the best things come from asking the tool itself, asking Copilot how to do X.

[02:05:04] For example, I'm writing a requirements document: get it to ask me the questions I might need to answer to write a really good requirements document, rather than getting it to just write the requirements document, if you see what I mean. So actually work with it as you would work with another person, to prompt you and think through things. And obviously transcription, lots of people are using it for that: writing meeting notes, that kind of stuff.

[02:05:25] But we're also doing things like writing user stories or personas, and getting the model to adopt the persona of the person you're designing for, so that it will give you its opinion on what that person might think, and you almost have a digital point of view coming into the design process as well.

[02:05:48] I guess I'm rambling a little bit.

[02:05:49] Sarah Abramson: No, you're not at all. I think it brings us right back to that point about imagination, because we have the potential to change so many steps in the way we work. [02:06:00] There's two things, aren't there. One is that you're almost on autopilot with how you work, especially those of us who have been in our careers for a while: you have embedded habits, things you don't really think about, it's just how you do stuff. I mean, I am excited about the opportunities of AI, so I'm not averse to using these technologies at all, but I have to stop myself to think about how and when I could use them. And that kind of habit, that kind of imagination, I think is part of this leap it feels like we need to make. Maybe we don't; maybe it could be baby steps a little bit more. I wonder if one of the things that helps with that is people having the ability to share: to see examples, to see how other people are working, just being surrounded by other ways that people are doing stuff. Are you seeing that kind of sharing happening, and some good examples of it?

[02:07:00] Jeremy Bassinder: Hugely, yeah. So I write an internal blog every fortnight where we find somebody who's doing something interesting, record a video of them doing it, and share that out, everything from writing blog posts to writing code. And code is a huge thing now, where people are almost creating whole development teams that are doing the creation of apps and so on. That's pretty fascinating. We've done some A/B testing between teams working with generative AI and teams not working with generative AI, and you see 50 to 60 per cent productivity gains, and people write better quality.

[02:07:53] That's not always something people think about either. They immediately go to the [02:08:00] productivity side of things, and yes, you're expecting to get things faster, but they don't necessarily think they're going to get things better as well, which is definitely the case. What sort of things have I seen that are interesting, in how people have applied this stuff? I think my mind has gone completely blank.

[02:08:26] Sarah Abramson: That's okay. I'll pick up that point about productivity. We've certainly come across that with people who are talking about AI in terms of the efficiencies, and at the moment that's acute because we're all under pressure. Organisations are all under pressure: things are quite tricky in the world, there's economic uncertainty, there are shifting markets, there are a lot of things we feel under pressure to navigate, [02:09:00] to create efficiencies, all of that. Plus, as soon as you have a financial director on the board who thinks there's a way they can reduce costs, of course they're going to want to.

[02:09:10] So we've come across quite a few people saying we need to introduce AI to create efficiencies, and I'm not suggesting that's not worth exploring and isn't valuable, but as you say, that kind of misses the wider value. We shouldn't only be thinking about AI as something that cuts costs; we should be thinking about it as something that improves stuff for us.

[02:09:35] Jeremy Bassinder: Right. We do a lot of work with Wimbledon at IBM, and have for years, and we've now done generative commentary. If we're watching the Women's Final in June of next year, you could take the highlights, or in real time [02:10:00] at some point, and do a different commentary for each person that's watching it.

[02:10:05] Some people might be interested in the rules of the game, and they could have one commentary. Another person might be interested in the stats, how fast the ball was, that kind of thing. Another person might just care what the fashion is in the royal box. There's no other way you can do that as a business, because you just can't deal with the volume of recording that many versions of the commentary, let alone voicing them with the same person at the same time, in real time.

[02:10:37] Those kinds of things, I think, are about applying the art of the possible: what could the value be? And the value is interesting, in that people quite easily get onto the business value or the human value, but you've got to connect the two. If I have something that's really [02:11:00] valuable but so badly designed that nobody uses it, then it doesn't get adopted, so it's valueless. If I've got something that's well designed and kind of cool, but it doesn't actually drive any business value, it's just a toy. So this constant looking at what it means for the human, what it means for the business, and how you link those together, and that's what you should go and build, I find fascinating. And that's where we go when we start looking at what we should build next for an organisation.

[02:11:30] Sarah Abramson: Yeah. And I guess in terms of the strategy of development, on the tech side and the people side, it's about supporting people to keep up that ability to use the tools but to think critically about them, alongside the implementation of the technology. It feels like we need to think about parallel strategies, really. Is that how you approach things?

[02:11:56] Jeremy Bassinder: The way I think about things is this: [02:12:00] historically, when we developed technology, it was a thing that happened over a period of time. You had a starting point for the project and then an end point, and then you went on to the next thing.

[02:12:13] I think where we're starting to go now is that these things are continuous. Your marketing platform, or whatever it might be, these things are forever, so they're constantly iterated; it's not that they start at one point and then the project finishes. These are products, digital products, that say as much about your organisation as the physical products you sell or the services you offer. These things are there forever, and therefore you need to build them with that mindset: we're going to build these things, we're going to feed them, water them, we're going to improve them, [02:13:00] then look at how people are using them and make sure that they're still offering the value they offered on day one at day X in the future.

[02:13:10] Sarah Abramson: Yeah. And there's no point at which you've rolled out a new system and then you train people on it; it's a total break from that way of doing things, it's continuous.

[02:13:18] And with what you were just saying there, can you bring people in, with their insights and what they're actually doing, to the ongoing development of the technology? How can that happen? How can organisations do that kind of thing?

[02:13:34] Jeremy Bassinder: Both overtly and covertly. Maybe 'covertly' isn't the nicest way of describing it, but it sort of works.

[02:13:41] So I can overtly ask people what they need, and do that classic thing: you are not your user. Do proper user research and try to understand people. Even if you think you know what they want, actually go and ask them, look at what they do, try to understand them, and then build [02:14:00] for that.

[02:14:00] But the other thing is that you put measurement systems behind the scenes to look at what people are actually doing. When do people struggle on your site? That's probably a point where it needs a bit of redesign. Or what are the transactions people ask for most? We have an internal system called AskHR, which sits across all of our HR, expenses, travel and all manner of systems, and you just come in through a chat window. When they first built that, they built all of the analytics on the back end to see what questions people asked. Initially, all it did was serve up what the company policy was, or how you use this system to do this or that.

[02:14:45] But over time you then say, well, I'm getting thousands of these requests every day, maybe we should automate that request, and then you can get at things incrementally. And then, [02:15:00] once you start talking to people, you can build personas of people and test them against what they actually think of the products you're designing, but also build synthetic personas to try to model what digital versions of those people might do.

[02:15:15] Sarah Abramson: I feel like for most of us, this kind of stuff will make a lot more sense in 10 years, when we'll go, oh yeah, of course, I understand what you're talking about. Whereas now, I'm really conscious that a lot of this sounds very sophisticated and futuristic, and I'm sure there are people listening to this, because I talk to people all the time who are feeling, we're not doing much yet.

[02:15:38] How would you advise people who feel like that to break things down a little bit, to make it feel smaller, so they don't feel like the technology has already gotten away from them? How do we take things step by step, from how things feel normal now [02:16:00] to gently venturing into this future?

[02:16:03] Jeremy Bassinder: I think you have to start with something where you have a need, and maybe it's not a need in your work; it might be a need at home. Like I said, I write a cookbook blog, and for me there was a need: how do I automate that a little bit? That's a real thing I can go and play with, and I can start to make my life easier in that sense.

[02:16:40] And I think this thing ain't going away. So I think it's in everyone's benefit, if that's the right word, to try to learn a bit about it. Again, I come back to the haves and have-nots. Just go and play, play safely, and then start to ask: what's the art of the possible, of what I can do, and how might I bring that into [02:17:00] a work environment?

[02:17:11] My brother, for example, is social media director for a bank, and so he talks to me about this kind of stuff, as you'd expect, in the same way as you have done. He said, well, we've just got Copilot, what could I do with it? And I said, why not try to make an assistant that can sound like the bank? So he's built the bank's tone of voice into an assistant. In his world, what he's doing is taking some of the communications he would normally write and making sure they're in the tone of voice he knows perfectly for that bank.

[02:18:00] What he's able to do then is give that to others, so that they can write in the right tone of voice first time. Clearly, someone is going to read through it and make sure it's checked before it goes out the door, but the amount of time it then takes to check, is it on brand, is it the right sort of tone, is much shorter than if they didn't have that assistant. It's early days and they're playing with it, and it's not necessarily going to go live yet, but being able to apply it in the work that you do, he's found that revelatory, really, to see how he can apply it in a marketing context.

[02:18:42] Sarah Abramson: That's really cool, and a great example. What occurs to me is that some people are going to be more comfortable just playing with stuff, and that in itself is almost a skill, being willing to experiment. [02:19:00] Some people would like to know what they're doing, and they like to feel like they have the skills to do it.

[02:19:05] So for organisations to upskill employees, what do we need to look at? Is there a role for learning and development teams? Is there something in the culture, or in the way we empower people, so that it doesn't feel overwhelming, but instead feels like everyone can do this, and we need everyone to do it, with a sort of accessible way in?

[02:19:28] Jeremy Bassinder: It's funny, I was talking with our learning and development team about this. I think you have to embed it in every bit of learning. I don't think you should have a 'this is the AI training' bit; it should be about how we're applying these tools in whatever it is that we're learning.

[02:19:46] So you can use these tools to generate a project plan and come up with a risk log. In fact, one of the assistants I wrote was about how to write better risks. People are lousy at [02:20:00] writing risks, so actually write an assistant to help you write better risks. I'm not saying it will write all the risks; I'm saying, take a fairly average human-written risk and give it to the machine, and the machine says, well, what about this? What are the mitigations? What's the timeline? What's the impact? Really be your critical friend.

[02:20:25] So I think embedding these technologies in every bit of learning and development is really important, and I think that's the general thing, really. I've done some stuff at schools as well, and people think in the one dimension of, oh, well, it's a technology thing. No: you can apply it to every single subject in the school, whether that's philosophy or economics or art or sport. There's a way in which you can embed AI, and the use of it, in there, and I think that's what you have to do in an organisation as well.

[02:20:58] Sarah Abramson: I love that, and I'd like to come [02:21:00] back to education in a minute, because it's such an interesting topic and I just want to get your perspective on it. But staying with L&D for a minute: do you think, like you were saying at board level, that we need technologists, or people who understand the technology, to be contributing to strategy development at the very top level? Do we need people with that kind of background in L&D as well? Or is it something that L&D teams need to reframe for themselves? How do they train themselves to do that?

[02:21:31] Jeremy Bassinder: I mean, people all sort of say, oh, come and help us write an AI strategy. Okay, but surely that's just your strategy?

[02:21:39] Sarah Abramson: Right.

[02:21:41] Jeremy Bassinder: I firmly believe that, as we look forward, they're inextricably linked. The business strategy and the AI or technology strategy are one and the same thing, because you have to be applying technology to what you're doing. We always have; we've just [02:22:00] chosen to compartmentalise them in different buckets. But I think, increasingly, you're going to have a single strategy about what we do and how we apply technology in the way we do it.

[02:22:11] So I then think everybody, whether you're in L&D or R&D or whatever, has a dimension of what they do that is about how we're going to apply the latest technology to help us be, yes, more productive, but also more imaginative and more creative.

[02:22:29] Sarah Abramson: Yeah. And again, I think it comes back to that pace thing, doesn't it? You know, we don't have a team that does the internet for us.

[02:22:37] Jeremy Bassinder: Yes. That's quite correct.

[02:22:40] Sarah Abramson: But it does feel like a shift, because it's how we work as well, isn't it? It's that kind of playing, that willingness to experiment.

[02:22:50] Jeremy Bassinder: And then taking that to enterprise-grade production levels, you know, that's an [02:23:00] art in itself. It's not trivial to do this stuff, but I do think you have to start with the imagination.

[02:23:06] Sarah Abramson: Yeah. And that links us right back to what you were talking about at the beginning, with the Luddites and that leap of imagination, but also the fear: the reason the Luddite rebellion kicked off was that people were afraid of losing their jobs.

[02:23:23] It's very interesting, isn't it? Clearly you're someone who is very comfortable with exploring technology and leaning into that whole uncertainty side of things. You're excited about the potential that digital technologies and AI offer us, and you're happy to roll up your sleeves and have a go. But clearly a lot of people are anxious about their jobs. They maybe don't have that same confidence in exploration, and they're worried. So do you think there's a role for organisations, well, clearly there is a [02:24:00] role for organisations, to think about that people, emotional side of things? How do we go about that? Is there a sort of reassurance, or is it actually a little bit more brutal than that? Are there just going to be some people who need to embrace it, or end up getting left behind, even if that's not intended?

[02:24:19] Jeremy Bassinder: Yeah. So what we would say is that those people who use AI will outperform those who don't. That doesn't necessarily mean AI is going to take everyone's jobs and that kind of thing, but I do think there's a different type of person coming who is applying AI and using it to get work done, and being more productive in so doing. On the fear side of things, I would ask people the question: are you on top of your backlog?

[02:24:52] Sarah Abramson: Hmm.

[02:24:54] Jeremy Bassinder: If you are on top of your backlog and you're always getting through all of your work, well, okay, maybe [02:25:00] then you have got something to worry about. But for most people I ask that question, there are not enough hours in the day. So I think you can certainly get on top of that: what more could you get after? How fast could you get through stuff that's maybe taking you a very long time? You can apply this stuff to get through more in the week.

[02:25:38] The danger with it, though, is that as a race we're working as much as we've ever worked. When you go back to medieval times, the number of hours it took to support your family was way less than 40 hours a week. Now you're lucky if you work 40 hours a week. We seem to be obsessed with cramming more and more in, and so I think there's a [02:26:00] dimension of this where, as we get more productive and use more and more of these tools, we should carve out some space for ourselves rather than just doing a little bit more.

[02:26:10] Sarah Abramson: You're so right. It's such an irony, isn't it, that the more obsessed we get with productivity, the harder we seem to be working. It's kind of crazy. You made reference there to people changing their ways of thinking, and to people coming into the workforce, so I'd like to take you back to what you'd started to say about education, and explore that a little bit.

[02:26:45] In a parallel way to what we were talking about with L&D and with board-level thinking, in education there are very set ways of doing things. I've seen with both my kids that there have been some great examples of how they've been set work that uses AI really well, but there's also [02:27:00] been, and I know this through other people as well, some shutting down. And that's understandable, because how do educationalists get hold of AI in a way that fits with the way they're required to do things? It's really difficult. But if we put that aside, what do you think would be a really good way, ideally, for schools and universities to start getting better at helping develop the next generation for the ways they're going to need to work? That curiosity, that willingness to explore, that willingness to play, but still bringing critical thinking skills to the fore in how they work.

[02:27:44] Jeremy Bassinder: Yeah, I'll give you an example. I did my first lecture on this topic at a university, I won't say which one. I got to the end of it, and one of the proctors, those are the people who [02:28:00] enforce the rules of exams and that kind of stuff at universities, came up to me and said, how do we stop people cheating with this thing? And I said, you've got to reframe your language. The same has been said of these kinds of technologies forever. When books were first introduced, people said, whoa, people won't use their brains if it's all written down, and we were like, come on.

[02:28:21] So it has to start with that. Most recently, I did the inset day at my kids' school, giving all their teachers a briefing on AI. Then we broke off: the teaching staff went with the digital director to do some thinking, and I worked with the admin assistants and the financial and HR parts of the school. And I heard over lunch [02:29:00] that the headmaster said, you know, I was at a conference and people were debating whether you should use this to generate work and then critique it, or whether you should write work and then get it to critique it. I was like, it's not either/or.

[02:29:16] Sarah Abramson: Yeah, do everything.

[02:29:18] Jeremy Bassinder: Do everything. And I think the way in which we have created exams and so on is really about how much you can retain, not necessarily how much you can apply. And I think, therefore, we're going to have an interesting period, because these technologies have PhD-level capability in every subject under the sun.

[02:29:53] So on the IQ dimension, they already are better than [02:30:00] human beings, and not just in one of those domains, but in all of those domains at the same time. So then you've got to say, well, what is it that humans bring to the mix? It's got to be more about the EQ side of things, the emotional quotient, if that's a thing, and the blending of those together.

[02:30:17] So I think if you have a low-EQ, low-IQ task, well, that's probably just ripe for automation, right? If it's high EQ and low IQ, there aren't that many tasks in that kind of space, but that's probably well within the domain of human beings. High IQ, low EQ is probably one for these machines. And where you've got high levels of both, I think, is where humans and machines are going to work together, perhaps most beautifully. And I think, therefore, you've got to give people the skills they need to do those higher-EQ things: working in teams, coming up with ideas together, and systems thinking. [02:31:00]

[02:31:02] How do things come together in systems? Like the system that is London, which is where I'm sitting today. It's built up of all manner of things: the transport system, the electricity, the flow of people, all of that. Those are really complicated things to get at, and how the city works as an organism in its own right. Bringing technology and human beings together to look at that as a problem, those are the sorts of skills we need to be giving kids, to even be able to think about that kind of stuff, as well as mathematics and reading and all the things we'd expect them to do. I don't think it's an either/or, basically.

[02:31:39] Sarah Abramson: I love it. And I think, well, we ultimately need to find the joy of being alive and working together, right? We need to understand what we gain from working with other people: teamwork, layering on creativity, different ways of working, innovation. And we need to understand how we can use tools to do [02:32:00] the things we enjoy as people, as humans. Ultimately, there is more purpose to work than just getting it done; it's part of how we want to exist and interact.

[02:32:11] Jeremy Bassinder: Exactly.

[02:32:12] Sarah Abramson: Yeah. Amazing. Oh my gosh, this has just been so cool, and I think we could talk a lot more, but I want to respect your time. I don't know what you bill by the hour, but I'll get the invoice later, Jez.

[02:32:29] Jeremy Bassinder: Probably not enough.

[02:32:31] Sarah Abramson: But I want to finish with a final question that I ask all of our podcast guests, which is: speaking to you as a human, what's exciting you at the moment? What are you looking forward to, or motivated by, either in or out of work?

[02:32:47] Jeremy Bassinder: Yeah. Well, in work, my current bit of thinking is around digital eyeballs versus human eyeballs: looking at websites, buying retail, that kind of stuff. I'm finding that [02:33:00] really interesting, in terms of what it means for how you generate content, how people are going to buy and purchase in the future, and what the implications are for the industry. That's getting me quite excited. And tomorrow I'm talking to a whole bunch of NEDs about that kind of concept.

[02:33:26] Sarah Abramson: NEDs being?

[02:33:27] Jeremy Bassinder: being non-executive directors. Sorry. Yeah. People who are advising boards, um, uh, in, in my own life, um, I'm, uh, I write this food blog called The Cookbook Shelves, um, which, uh, I write. About a different cookbook on the shelf each week, and we cook from that cookbook and, and so on. Um, and I'm getting towards the end of the shelves.

[02:33:50] So I'm excited about what the next phase of The Cookbook Shelves is going to be, because I've got to the end of the shelf. But [02:34:00] yeah, I'm just really excited about how this technology is going to change the way that people work and live. I think it's very exciting to be at the forefront of that, to be using it day in, day out, and to be helping people along the journey, whether you're new to it, old to it, or just starting to learn with it, really.

[02:34:23] Sarah Abramson: That's amazing. Oh gosh, there's so much in there. I've really enjoyed this conversation, thank you so much, and I always enjoy chatting with you. We've explored so much here. It's been great.

[02:34:38] Jeremy Bassinder: I hope I didn't get too technical, and that we've kept it at a level that appeals to your audience. I think we did.

[02:34:44] Sarah Abramson: I think so. I just about understood all the words you said, and hopefully most people will have done. If people are interested in finding out more, I think you're on LinkedIn, is that right?

[02:34:58] Jeremy Bassinder: Yep, find me on LinkedIn.

[02:35:00] Sarah Abramson: And you shared your TEDx talk on there as well, so we can share a link to that in the notes.

[02:35:07] Jeremy Bassinder: Yeah. And if you want to follow me on Instagram, at The Cookbook Shelves, if that's more your thing, you'll find me there.

[02:35:14] Sarah Abramson: Love it. Brilliant. Thank you so much, Jez; really appreciate it. I hope everyone's enjoyed this conversation as much as I have, [02:35:23] and if so, please do like, subscribe and share the podcast. See you soon. Bye for now.
