Amanda Brock is CEO of OpenUK, the UK organisation for the business of Open Technology (open source software, open hardware and open data); elected Board Member, Open Source Initiative; appointed member of the Cabinet Office’s Open Standards Board; Member of the British Computer Society Inaugural Influence Board; Advisory Board Member, KDE, Planet Crust, Sustainable Digital Infrastructure Alliance and Mimoto; Charity Trustee Creative Crieff and GeekZone; and European Representative of the Open Invention Network.
As Head of Research at the ODI, Jared Keller conducts both qualitative and quantitative research across a wide range of issues related to open data, and supports the ODI’s efforts to map complex technological, business, and regulatory landscapes.
He specialises in emerging technologies and their implementation in, and impact on, commercial, public sector, and social contexts. Before joining the ODI, Jared worked as a qualitative analyst for a public consultancy firm in London, and as a researcher of technology futures at Nesta.
As part of the Data Decade, at the Open Data Institute (ODI), we are exploring how data surrounds and shapes our world through 10 stories from different data perspectives. The sixth explores data and technology, and the impact these can have on our lives.
Listen to the podcast now
This is Data Decade, a podcast by the ODI.
Hello, and welcome to Data Decade from the ODI. I’m Emma Thwaites, and in this series, we’re exploring the last 10 years of data, the next decade ahead, and the transformational possibilities for data in the future. We’ve looked at a range of topics so far, from data in arts and culture and how data is shaping our cities, to who looks after data and how to use data responsibly.
But in this episode, we are going to explore data and technology. Data and technology, of course, go hand-in-hand, driving innovation and opening up new possibilities in products and services. Data has been fundamental to the technologies that have emerged in the last 10 years, from AI to digital twins. But the pace at which these advances are made means we often find ourselves in uncharted and sometimes unexpected waters.
These changes have been significant in the last decade. So where might we end up in the next 10 years? Let’s find out more. Welcome to Data Decade.
Emma Thwaites: Really looking forward to this episode, joining me to talk through how data and technology impacts our lives is Amanda Brock, the CEO of OpenUK, and one of my ODI colleagues, Dr Jared Keller, the Head of Research. It’s really nice to have both of you with us.
Jared Keller: Thanks for having me.
Amanda Brock: Great to see you, Emma.
Emma Thwaites: So Amanda, let’s start with you. You’ve got a really interesting background. You started your professional career as a lawyer, and you were a lawyer for 25 years – I guess you’re always a lawyer aren’t you? You don’t stop being a lawyer.
Amanda Brock: Yeah. You can’t get out, but you can recover.
Emma Thwaites: But how did you get into technology? What was that journey?
Amanda Brock: So back in the mid-nineties, I did a Masters in IP and IT law. And I did the first internet law course in the UK and ended up joining Dixons, the high street retailer. And I worked for them for five years, but I joined them to work on something called Freeserve – which I guess some people listening to this podcast might remember – the first free and open ISP back in the day. Working with Dixons, I guess I worked for some time around the internet, around all sorts of different areas of emerging technology coming out of e-business and eCommerce. And in some ways I helped to shape the laws at the time, because we were seeing the law not being able to keep pace with technology as it was evolving.
You know, this was the Wild West back in the day. And there were only a few of us really working at the coalface as lawyers. We would get together. We would discuss what we were doing and how we could make the old laws try to fit that space. But we also did a lot of work lobbying government and trying to change and shape the laws that would regulate the internet going forwards.
Emma Thwaites: That’s really interesting. But what I notice about you – cause we’ve, full disclosure, we’ve known each other for a while – and you are so passionate about open, and passionate about technology. In that sort of professional journey in the 2000s, what really piqued your interest and made you go: “that’s what I’m gonna go for?”
Amanda Brock: I think it was the fact that it was something new and it was the opportunity to try and shape it. And to make it into something. And as a lawyer, a lot of what you do is you implement, you know, you take existing laws, you apply them. It’s very practical. This was really creative. And this was trying to find ways to take the old and make it new, but also to allow that technology to evolve.
And of course, when you’re a lawyer, you really have to be able to describe what you’re working on. You really have to be able to write it down. And that means with technology, you have to understand the products – and the products, of course, are the technology themselves. In this case, the internet.
You know, back then we were doing things like Linking Agreements. We were trying to structure how the web would be in a very regulated, legalistic way that was never gonna survive. But we didn’t know that; we couldn’t see where it was going. We could only see what we had in front of us at the time. And we were already dragging in things like the Sale of Goods Act – which was designed for taking something off the shelf – and trying to digitise it.
We were doing a lot of what would now be called digital transformation, but nobody knew then that that’s what it was.
Emma Thwaites: It’s interesting, ‘cause one of the things that you said to me when we were preparing for this podcast was that if we’d known 20 years ago where we were going, we’d have lobbied for something else.
Amanda Brock: Oh totally. So what you saw then was a very small internet. I mean, you remember that noise it used to make dialling up. That’s nothing like what we got today. Today, the digital access that we have through the internet, primarily because we have connectivity, means that absolutely everything we do is dictated to or happens across the internet. Back then, it was just a small thing you did. You would go and dial up and you would maybe buy something online or you would send an email, you know, and that’s evolved more and more over time.
And what we’ve seen is that it’s evolved. It’s not only that what people use it for is very different, but the rise of social media. Nobody could ever have predicted it. Nobody could have thought that they were gonna get their news on Twitter, or that people would believe that something somebody in another country posted, which then went viral, was actually the truth.
Emma Thwaites: Absolutely. Jared, I’m gonna come to you just for a second here. I wonder whether anything that Amanda has described there chimes with you – that ability, 20 years ago, to predict the future. What were you thinking the development of technology was gonna look like back then?
Jared Keller: Well, do you want me to pretend that I have thoughts from back when I was a very, very young lad?
Emma Thwaites: You’re not that young!
Jared Keller: 20 years ago, I wasn’t having these conversations, let’s just say that. Over the last 10 years I’ve been doing a lot of thinking about the last hundred, let’s say that.
Because I have a degree in history, so I’m doing a lot of thinking about the history of science, the history of technology. And I have loved the way that, like you said in the intro, technology is perceived as always advancing very rapidly. And yet you have history, which is meant to be looking backwards, so why would you study the history of technology? But I think it’s interesting to study, to be able to make comparisons between technologies and different generations, different ages.
So a lot of what was just discussed makes total sense to me. I think there are tons of examples from the history of science of technological advancement outpacing the advancement of the law and of regulation. I’m thinking just from the last 10 years, you could mention self-driving cars, micro-mobility, the fact that there are entire cities that are essentially test beds for “Does this technology work? Let’s see.”
And the fact that that’s allowed to happen, I’d be very interested to see where that goes. Things like e-cigarettes as well. Just the fact that we allow new technologies to be deployed in society without really knowing the consequences of them. But obviously that’s a difficult question for lawyers and for regulators because national governments don’t want to be seen as anti-innovation, but they also don’t want to let technology outpace their understanding of that technology in a way that puts their citizens at risk or puts their markets at risk.
Emma Thwaites: That is super interesting, ’cause I’m a history graduate as well. And I’d really love to just unpick that. ’Cause you said there are lots of examples in history of technologies sort of outpacing law and outpacing where society currently sits, and surprising us in terms of their development. Are there any examples, historical examples, that you can think of, Jared?
Jared Keller: So one example that is interesting within this conversation – it might not speak directly to what you’re talking about – is the development of electrical systems back in the early 1900s. And I make a comparison in my head a lot to the way that social media was rolled out in the early 21st century. So the thing that I find interesting is that if you believe that technologies are a reflection of the people who make them, and that the people who make them embed those technologies with their biases, but also their social norms and their political and social values, then technology is a reflection of the area in which it is created and developed.
And so back in the early 1900s, you had electrical systems being developed in very different parts of the world. And those different parts of the world had different ways of viewing the world and social and political values.
So in the US, you got a highly centralised system that was entirely owned by a corporate entity. So it was a very capitalist approach to how to roll out this new technology. But then in London, you had a lot of public bodies – I think up to like 70 different public bodies – all kind of competing, with a proliferation of different standards and models.
And then in Germany you had a kind of hybrid approach, where the system was owned by a private sector company, but then it was regulated by a public sector organisation. That was meant to make sure that the new private company did things that were in the public interest. And so I compare that to the way that a lot of social media has developed and, I guess, parts of the internet in general.
And I think the difference – I like something you said earlier on about how the pace of change is unprecedented, and I don’t actually know if that’s true, or at least I’m a sceptic, because generations for the last hundred, let’s say 200 years have been saying that the pace of change for them is unprecedented.
So there’s something that is continuous, even if every generation convinces itself that change is unprecedented in their times. But if there is something that I think is unprecedented about today, it’s about the way that technologies can expand across the globe or at least into many, many different regions very, very quickly.
So instead of electrical grids being developed in different regions, you have an entire technology being deployed across most of the globe within a couple of years. And because of that, you don’t have these different regional approaches to that technology that reflect that region’s values. You actually have values from, let’s face it, Silicon Valley and the US government that are then being embedded or drawn with them across the world.
Then what I think, if I had to guess, what I think is about to happen is actually the reverse of what happened in the 1900s, where you had all this proliferation of electrical grids and then eventually they came together and tried to standardise them in some way. I think what we’re seeing is we have this massive monolith – internet, social media companies – that exist across the world.
But those regional values are gonna start to be pushed more and more. So it’s the people who are talking about the four internets, the splinternet. And if you believe that, then I think what’s gonna happen there is, regions are gonna start to reassert their values, their social and political values, to try to refashion these technologies in ways that suit how they see the world.
Which is really interesting just for me in terms of the history of science, but it’s also hopefully really good progress to finally take back control of this technology that for a long time, we didn’t feel like we had control of.
Emma Thwaites: That’s super interesting. There’s a lot to go into there, but I think Amanda’s got some thoughts. Thanks, Jared.
Amanda Brock: Yeah. Yeah, absolutely. So there’s a lot to unpick. I guess if you go back to the .com stuff, you’re right. The pace of change in the last 20 years, we say it’s unprecedented, it’s the tech evolution that I’ve lived through. I do think it’s unprecedented. I think it’s unprecedented because what you’ve got is you don’t have a single isolated development or a single isolated project working on new innovation.
What you have is joined up global collaboration and you’ve got that partly because of openness. And the way that open source – you know, open source software is really my deep expertise, I know about a lot of different stuff in enough detail to be dangerous, but open source software is the one bit that I really know about.
And the communities around that have evolved over those 20 years and they’ve allowed innovation to move at a greater pace. And that’s not immediately apparent to everybody. It becomes more apparent in the last 10 years through things like GitHub and the way that software has been shared. So I think that the first thing I would say is I think that joined-upness and that international nature, diverse global nature of innovation has actually made it go faster.
I think it also creates potentially de facto standards because the standards bodies generally cannot move at the pace that’s needed. And you see software that becomes adopted, that becomes a base point effectively becoming a standard and the sector makes the choice. The people make the choice rather than the standards body.
There’s an awful lot in what you said there, thinking more about the future and this idea that things will become more localised. I think that’s right, and I think it’s wrong at the same time. So geopolitical shift, I think absolutely that the impact that politics has on technology is more than most of us understand.
And that geopolitical shift with bifurcation of the internet in China, that kind of thing of course will localise to some extent. And you see the Europeans calling for, you know, digital and data sovereignty. So there is a pull back and there are local regulations, but that local focus also at the same time has to work with the globalisation.
Because you can’t escape the fact that we’ve globalised. And I think what it does is, to some extent, maybe it is the US tech companies imposing, but I don’t think that’s entirely true. I think you’ve got quite a global diversity there, and I think you see the products reflecting what people want. And as somebody who at one stage did computer buying for Europe – you know, for Europe’s biggest computer retailer – you would have to buy on a localised basis.
You would buy the base or the basic product for the whole of Europe, but then you would have to buy certain things that were localised, because you have different standards in different countries, and people want things in different ways, and you have different keyboards on computers depending on whether you’re in France or the UK, whatever.
So I think there is an element of localisation and I think that’s really good, and it has your culture. But I think this globalisation and this standardisation that we see is important. And I also think going back to what we started with, you know, talking about the internet in the nineties and the last 20 years. What we’ve seen is data, and what we’ve seen is as software and technologies evolved, data is absolutely fundamental and at the heart of everything we do in technology.
And that’s created a gold rush and a land-grab, with people grabbing all the data they possibly can, creating data pools – data pools that they don’t necessarily understand what they want them for – in fact, creating data lakes.
Emma Thwaites: I’m very glad you brought it back to data.
Amanda Brock: I bet you are.
Emma Thwaites: This being the Data Decade podcast and all that. But do you feel that, ’cause I quite often wonder whether – we like to think at the ODI perhaps that 10 years into our lifespan, things have moved on, but actually I quite often think where data’s concerned, we’re still very much at the beginning of the journey. So technology – the wires and the bells and the whistles and whatnot – has moved on.
But actually in terms of understanding, you know, what the potential is for the data, and also developing the technologies that allow us to access, use and share data in ways that are, you know, safe and protect people’s identity and all those kinds of things.
Where do you – you know, you talked about the Wild West a little bit earlier on. Do you think, where data is concerned specifically, that we’re still in the Wild West, or are we kind of coming out of it now and beginning to sort of, you know, learn the ropes a bit, get our heads around it?
Amanda Brock: I think it’s taken us a very long time. And when I look back on implementing the Data Protection Act the first time round – yes, I’m that old! But when we were implementing it the first time round, actually it was done properly by most organisations.
And then over a period of time, we saw slippage. So things like your collection notices becoming prepopulated happened as time went by, and the respect for people’s data slipped away until you got GDPR. And a lot of what GDPR was doing was actually reinforcing what had originally been in place. It wasn’t entirely new.
So do I think that we respect people’s data more today? I think civil society’s pushed back so hard that there is definitely better process around it. I think there is a better understanding that we have to balance between the individual’s privacy and corporate and business or organisational utilisation of data.
And I think governments try to create legislation that has a balance there, but I don’t know that it necessarily works. And I think one of the reasons for that – I’ve already mentioned that you can’t have technology without data – is that the value of most technology transactions relates back to data. And that’s partly possible because of the scale of the technologies we have today, things like database technologies, and the way people’s data can be mined and used, you know, the AI and algorithm developments that allow assessment of people at scale. And the more data these things gobble up, the more business revenue they’re able to generate.
Emma Thwaites: I guess that – Jared I’m coming to you with a very similar question, actually – you know, leading the research function at the ODI as you do, these kinds of questions are very much at the heart of the matter for you.
You made the point earlier that you can’t think back 20 years, so let’s just go back 10 years. I wonder how far you think we’ve come and, you know, what the next developments are likely to be in terms of the intersection, perhaps between data and technology?
Jared Keller: Well, I think – actually, Emma, maybe you’ll enjoy this, I’m gonna make another historical comparison. That is just how my mind works.
My PhD was on a Scottish chap who went to the BBC in the 1940s and had written a book where he was essentially expressing that he had a chip on his shoulder: physicists got all the credit for the industrial revolution, and no one gave credit to the chemists who were actually producing the materials from which those technologies were built.
And I see a similarity with the way we talk about data and digital technologies. A lot of the digital technologies are more visible and therefore people are more aware of those technologies, but we feel like we need to constantly remind people that they’re built on data. Data is the really important part.
So to push that comparison just a little bit further, I’d say it took nearly 100, 150 years for people to realise that the actual collection and extraction of all those chemicals and materials from the earth had negative consequences. So I think the popular retelling of it is that Rachel Carson’s Silent Spring really woke people up to the fact that science and technology were having an impact on the environment.
I don’t think it’s taken as long for us to realise that the collection and extraction of data about us is having negative impacts. I think possibly to the point that we’re talking about, that has sped up, that hasn’t taken us 150 years this time. And to me, if there’s a comparison between Rachel Carson’s Silent Spring waking people up to the need for environmentalism, and something waking people up to the need to talk more about data and how it’s used and how data about us is manipulated, it’s probably pretty obvious, but Cambridge Analytica.
I worked at the ODI before that happened and after, and I had such a hard time explaining to people what I did before Cambridge Analytica happened. But once that happened, it made sense to people. It was a touchstone, something that everyone, many people were aware of – I won’t say everyone. It crystallised for people, I think, why what we’re talking about here is so important, and that has made the community-building, the communication much easier, from my perspective.
If you ask me, where are we going over the next 10 years, I think what I’d say is I hope – this is maybe back to a point earlier – my hope that technology becomes more localised and regional is a hope. It’s not necessarily a prediction. But if I have a hope for the next 10 years, it would be that people, societies, communities are able to take back the question of “what does the future hold?” rather than coming to me and saying, “Jared, what do you think as Head of Research at the ODI, the future holds? What new technologies will be big? What should we invest money in?”, we actually say “well, that’s a question that we need to take to people first, we need to go speak to communities about what they want. We need to make sure that all the research is directed at what people want to happen.”
And that way there’s less of a disconnect between the research and then the people who are actually expected to use the results or the products that are coming out of academia.
Emma Thwaites: Do you think people know what they want to happen? I mean, this is a slightly provocative question, but I think quite often people know what they don’t want. It’s quite often challenging for people to think what they do want.
Amanda Brock: Well, they can’t know. And they can’t know because they don’t know what the opportunity is. I was a lawyer for 25 years. I wouldn’t have been a lawyer if I’d understood the world better, but where I grew up, the kind of jobs that I knew I could do if I went to university were to become a lawyer, a vet or a doctor. Right. So I couldn’t have more imagination because I didn’t have the access to the understanding.
And people generally don’t have the access to the understanding. So I had a very similar experience to what Jared’s just talked about with Cambridge Analytica. For years as I was working in tech, I would go to dinner parties. You know, people would be chit-chatting about what was going on in the world. They’d ask me what I did. I would explain I was a tech lawyer. Their eyes would glaze over.
Then in about 2008, the first smartphone came out from Apple. Rapidly, open source got onto smartphones with Android. And when I mentioned that I worked in open source or I worked in tech, they’d get their phones out and ask me if I knew something about it, and I could talk to ’em for hours about their phones.
So suddenly it was relatable and it was relatable because of that understanding. And I don’t think you can ask people those questions. I think there are questions you can ask them that help shape what they don’t want. And I think it’s easier – what you don’t want is easier to understand, you know, you don’t want the Cambridge Analytica repeated, maybe. That’s something that’s measurable for them and understandable. So I think we do also still have to go back as well. If we go back and look at the sort of stuff I was doing around 2000, I’ve already said, if we’d known then what we know today, we would never have chosen things the way we did.
And if you think about how data is shared, how social media works and how we understand that content: we assume, generally as a population, that that content is true, but there’s no validation of it. There’s no authenticity about it, right? We don’t know that it’s accurate. There’s no source of truth, and there’s no source of truth because of the way the laws are structured – they’re structured around it being something that is posted on a platform, that the platform is a host of, that it isn’t an editor of, that it doesn’t make changes to, that it doesn’t own. So perhaps if we were looking at that now – and you’ve seen the Online Harms Bill sort of come and go recently – we would have a very different view, because we’re in a very different world.
And I think we have to keep the last 20 years, not just the last 10, constantly in mind, as we look at the decisions and we look at the way we take things forward in the next 10. I do think data and software, data and technology now are in a symbiotic relationship. So I don’t think, you know, maybe I’m a Scot with a chip on my shoulder.
I think that what we see now is that the technology goes off and creates what it does, but you can’t really use it without the data – and the data would otherwise just be sat there. And that’s the problem we’ve had through the last 10 years in particular, where people have grabbed all the data they could and created these lakes. They spend a fortune on consultancies to come in and look at, you know, building products around it, but without really understanding what they actually want to use the data for.
Emma Thwaites: And that very much reflects – I mean, years and years ago, anecdotally, not long after I joined the ODI, I went to see the Chief Data Officer for a big FMCG company. And he said that very thing, he said, you know, “Have we got data? Sure. We’ve got, you know, more data than you could imagine, but we don’t know what to do.”
Amanda Brock: No. And you also often don’t know. You don’t know what to do with it, you don’t know what to ask of it because you don’t know what’s there. And it’s very hard because it’s like a needle in the haystack, right? There’s this magic thing within that data that you want to access, but you don’t know how to access it and you don’t know how to access it because you don’t know it’s there.
And I think with the moves in technology now – you know, in the last year or two in particular, we’re seeing so much happening – the data is becoming more relevant, but also what we’re seeing is a lot of duplication. So if you look at the public sector, you see, you know, in healthcare, multiple products being built, different money being spent on the same product multiple times, as people try to work that out. If you wanna go through medical records – well, the pandemic really focused people on this.
There’s a brilliant thing called the QCovid risk calculator. It was an existing algorithm from Oxford University that the NHS used to work out the vaccine rollout: they were able to take that existing technology, take the NHS data that was sitting in GPs’ records, and analyse who should get the vaccine and where the real risks were. And if you compare, I suppose, the test and trace app to how the vaccine rollout went, you know, it’s night and day, and it’s because you’ve got that existing technology and that ability to analyse data that’s been well managed and is very clean, and you know what you’re after.
Emma Thwaites: I love that. That’s a really good place to move on to the final question, actually – that we’ve achieved, or begun to achieve, that symbiosis. So my favourite question with this podcast is always when I ask you to do a bit of crystal ball gazing, and you have sort of touched on this already. But if I could ask you to be a bit of a soothsayer, a predictor of the future: where do you think technology – but maybe technology and data – are going to go next? What does the next 10 years hold?
Amanda Brock: From my perspective, what we see today is that there is a massive uptake in software for our infrastructure, and that’s happened because of digitalisation. So as the national public infrastructure has digitised, it’s become software-defined.
And by being software-defined, these days, that means it’s open source. So I think we will see a huge focus as that understanding comes to light and people try to work out how to curate that open source well, to make sure that it’s well maintained, that it’s well governed. That it’s secure. I think we’ll see a big focus on that and that will mean that governments understand they have to financially contribute.
And I think we’ll see a re-characterisation of open source software, not just a straight take-away from the commons – I think it’ll still be in the commons, obviously – but I think we’ll see it characterised as a digital public good. And I think that data’s interplay in that symbiotic relationship means that we’ll see much more of a focus on both data and software as digital public goods, and funding for that.
Emma Thwaites: Excellent. Thank you. And Jared, over to you, your thoughts for the next 10 years.
Jared Keller: Yeah, I think very similar actually. So not just open source, but thinking about open data, open science, open access, that entire structure, process of research being opened up even more fully. So that it’s not just publicly-funded research published in peer-reviewed academic journals that we’re talking about when we talk about open science, but actually involving industry, government and civil society in that process.
I think that’s what needs to happen. And in terms of the data, that means data is shared not just within academia but with non-academics: researchers working in industry, researchers working in government. ’Cause one thing we’re very aware of at the ODI is that there is research related to data happening everywhere. Since data is everywhere, there’s research about data everywhere.
It’s not as if you can just say “I wanna know about the latest and greatest in research in data, I’ll go to a computer science department.” There could be research about data and its impact on society, how to value it and how to use it happening in an agriculture department, happening in a philosophy department, happening in a civil society organisation or community organisation.
And it’s really hard for those different researchers, those different people, to have visibility of all the other researchers. And because of that, it’s really hard for them to know “where could the data that I have just produced in my research be reused by some other researcher?” It’s a really, really challenging problem.
Academia has some systems set up to track research and make it findable, so that you can share those types of datasets. But it’s really, really hard to do that outside of academia. And so if there’s a challenge, I guess, for the next 10 years, it’s how do we make that research about data and those datasets more findable for people working not just within academia, but in government, industry and civil society as well.
Emma Thwaites: So there’s a lot of work to do…
Jared Keller: Yeah, better get started!
Emma Thwaites: Better get started. Well, that’s all from this episode of Data Decade, and a great insight into how data and technology are shaping and impacting our lives. Interesting thoughts as ever. Thanks again to our guests, Amanda Brock…
Amanda Brock: Thank you for having me, Emma.
Emma Thwaites: And Jared Keller.
Jared Keller: Yeah. Thank you for having me.
Emma Thwaites: So thanks for listening, and what a great insight into how data and technology are shaping and impacting our lives. If you want to find out more about anything that you’ve heard in this episode, just head over to theodi.org, where we continue the conversation around the last 10 years of data and what the next decade has in store for us.
And data and technology is just one of many subjects we’ll be discussing further at the ODI Summit in November. All the details and how to book tickets are on our website. And if you’ve enjoyed listening, please do subscribe for updates. I’m Emma Thwaites, and this has been Data Decade from the ODI.
Data and technology go hand-in-hand, driving innovation and opening up new possibilities in products and services. Together, they’ve been fundamental to the technologies that have emerged in the last 10 years, from AI to digital twins. But the pace at which these advances are made means we soon find ourselves in uncharted – and sometimes unexpected – waters. These changes have been significant in the last decade, so where might we end up in the next 10 years?
The evolution of data and technology
Technology has evolved a huge amount since the days of dial-up internet access 20 years ago. Back then, accessing the internet was just something you did if you needed to send an email or buy something online. Today, however, the technological landscape is entirely different. The digital access and connectivity we have means that almost everything we do happens across, or is dictated by, the internet. Even in recent years, we’ve seen the growing digitalisation and adoption at scale of traditionally-offline activities, like schooling and healthcare, in response to the Covid-19 pandemic.
Data sits at the heart of tech’s evolution. It’s fundamental to technology and the value of almost all technological transactions is generated from data. People have begun to recognise the very real impact that data can have. For example, the Cambridge Analytica scandal thrust the potential harms of data collection into the public consciousness. However, when it comes to people’s understanding of the potential of data – and the ways in which technology can enable the safe sharing of data – we’re still close to the beginning of our journey.
A lot of the digital technologies are more visible and therefore people are more aware of those technologies, but we feel like we need to constantly remind people that they’re built on data. Data is the really important part.
– Jared Keller, Head of Research, ODI
As technology has evolved, legislation has not always been able to keep pace. In the last decade alone, we’ve seen plenty of examples of new technology being developed and deployed within society without the consequences being fully understood: self-driving cars, e-cigarettes, micro-mobility vehicles like e-bikes and electric scooters.
This presents a difficult question for lawyers, regulators and governments to contend with – if technological developments are outpacing legislation, how do we stimulate innovation without putting people and markets at risk?
This dilemma isn’t new. Since the emergence of the Web and surrounding technologies, lawyers and regulators have tried to keep pace with how tech is evolving.
In the dot-com boom I was a lawyer in one of the first ISPs. We were trying to structure how the Web would function and were making up how the laws would work as we went along. We lobbied for legislation in Europe and the UK that is still in place today, but we couldn’t imagine then where digitalisation was going. How could anyone envisage a world where people go to Twitter for their news as a single source of truth, where data is at the heart of technology revenue models?
– Amanda Brock, CEO, OpenUK
Over the last two decades, we’ve seen the introduction of better processes and legislation around data, such as the Data Protection Act and GDPR, and a better understanding of the importance of the balancing act between privacy and innovation. However, there is still some way to go.
Local v global
The pace at which technology is evolving is often called ‘unprecedented’. But is this the case?
Generations for the last 100, let’s say 200, years have been saying that the pace of change for them is unprecedented. So there’s something that is continuous, even if every generation convinces itself that change is unprecedented in their times.
– Jared Keller, Head of Research, ODI
One thing that is certainly unprecedented about today’s technological landscape is the global nature of tech. Technologies are developed, deployed and standardised across the globe at pace. Take, for example, social media – where massive US-based tech giants like Meta, Twitter and Alphabet have global reach. Alongside monolithic internet companies, we also have global communities that have evolved around the idea of joined-up global collaboration and openness.
This has not always been the case. For example, in the early 1900s, electrical systems were being developed in very different parts of the world, all of which had different social and political norms and values, and the resulting systems reflected this. Chicago developed a highly-centralised system entirely owned by a corporate entity. In London, lots of competing public bodies led to the development of different standards and models. Berlin developed something between the two – a privately-owned system regulated by a public body.
As data and technology continue to evolve, what does the next decade hold?
It’s uncertain as to whether tech will continue to globalise in the same way it is now. While globalisation is inescapable, we may see regions re-assert their social and political values and embed them within technologies in ways that fit how they see the world. There’s already discussion around the ‘splinternet’ – the fragmentation or bifurcation of the internet based on regional values including religion, politics, government and national interests – and it’s not just the Europeans who are calling for digital and data sovereignty.
One trend we hope to see in the next decade is people, societies and communities having more control over the future of technology. Diverse communities should be consulted on what they want from technology, and our research and development should be directed by the wants and needs of people. This is, of course, not without its challenges – it can be hard to know what we want for the future, as we don’t always know what the opportunities are. It’s important to make these ideas more tangible – through education, but also through asking people what they don’t want for the future of technology, based on developments made in the past.
We should keep the last 20 years, not just the last 10, constantly in mind, as we look at the decisions and we look at the way we take things forward in the next 10.
– Amanda Brock, CEO, OpenUK
We also hope to see more ‘openness’. With the growing digitalisation of our infrastructure, we hope to see a re-characterisation of open source software and open data as digital public goods – and with that, governments financially contributing to the curation, upkeep and governance of both.
Beyond open source software and open data, the process of research should be opened up beyond academia and peer-reviewed academic journals to actually involve industry, government and civil society in the process. Data – and research about it – is everywhere. In the next decade, we hope to tackle the question of how we make datasets and research about data findable and accessible for people not just within academia, but in government, industry and civil society as well.
Research related to data is happening everywhere. It’s not as if you can just say, ‘I want to know about the latest and greatest research about data, I’ll go to a Computer Science department’. There could be research about data and its impact on society, how to value it and how to use it happening in an Agriculture department, happening in a Philosophy department, happening in a civil society organisation or a community organisation.
– Jared Keller, Head of Research, ODI