The Hoover Institution Program on the US, China, and the World hosted Digital Authoritarianism and Strategies to Promote a Democratic Digital Future, on Monday, April 28, 2025 from 4:00 – 5:30 PM PT in the Shultz Auditorium, George P. Shultz Building. 

The People's Republic of China is collecting and analyzing unprecedented volumes of data from both public and private sources, within and beyond its borders, for social control. It is leveraging advanced data-centric technologies such as artificial intelligence, neuro and immersive technologies, quantum computing, and digital currencies to enhance and export its authoritarian governance model. This has led to an erosion of privacy, personal freedoms, and a climate of fear and self-censorship within the PRC. As the PRC exports its technologies to other countries, these authoritarian practices may spread globally. What are the most effective strategies for democratic societies to prevent the misuse of emerging technologies for surveillance and control by authoritarian regimes? How can we effectively track and monitor the global spread of data-centric authoritarian practices? What approaches can democratic governments and civil society adopt to develop and promote privacy-preserving solutions that offer viable alternatives to authoritarian methods, while ensuring accountability, transparency, and the protection of human rights? How can we engineer democratic values into the architectures of our technology platforms? In this event, our panel will examine the unique aspects of the PRC’s approach to digital authoritarianism and the opportunities for a democratic response.

>> Larry Diamond: Okay, good afternoon, everyone who is physically here in the George Shultz Auditorium and everyone who is joining us virtually for this program, which is going to in part be discussing virtual technologies of various kinds. I'm Larry Diamond. I'm a senior fellow here at the Hoover Institution, working with Glenn Tiffert, Elizabeth Economy, and Frances Hisgen on the US, China, and the World Project, which is one of the co-sponsors of this event with the National Endowment for Democracy on Digital Authoritarianism and Strategies to Promote a Democratic Digital Future.

I want to say this event is inspired by a paper authored by one of our speakers that I highly recommend to you. The title of the paper is a little bit different. It's called Data-Centric Authoritarianism: How China's Development of Frontier Technologies Could Globalize Repression. And if you just type into Google "Data-Centric Authoritarianism" and the author, Valentin Weber, who you'll be hearing from soon and who wrote this remarkable and invaluable paper, or type in "Data-Centric Authoritarianism" and "National Endowment for Democracy," or any combination like that, you will get this remarkable paper that you can read for free.

We're very pleased not only that Valentin is with us, but also the Vice President for Studies and Analysis of the National Endowment for Democracy, Chris Walker, who will speak after me briefly and welcome you.

And Beth Kerley, who's the Senior Program Officer for Technology and Democracy at the National Endowment for Democracy, who is going to join us on this podium and who helped to conceptualize and contribute to this report. I want to just make a few substantive comments before I turn it over to Chris.

This event engages three broad themes in the relationship between technology and democracy. First, the progress of technology: what is the pace of it? Second, the competition in technological development between autocracies and democracies, and more specifically between the US and China. And third, the diffusion of technologies, and therefore potentially the diffusion of technologies of control and repression along with the technology itself.

There are interesting debates about the paces at which these are moving. Valentin's paper talks about different technologies: digital forms of surveillance, digital and biometric, and AI enhancement of digital surveillance, which may be moving at a different pace perhaps than some of the further-out technologies like quantum computing and neuro and immersive technologies.

But we'll hear, and then you'll make your own judgments. The second point: obviously, we're at a kind of different moment now than perhaps when this project was conceptualized. We've long since entered an era when leadership in the development of critical technologies has become contested, not only between different democracies, but between democracies and autocracies, and more especially between the People's Republic of China and the United States.

And the rapidly emerging context, which I think we really need to remain deeply mindful of, both from a technological standpoint and from a democracy and geopolitical standpoint, is that China is surging into the lead in some areas of technological innovation, and it has the potential at least to achieve broader dominance if US policy shifts play out in certain ways.

This is my opinion now; I won't attribute it to anybody else for the moment. If US policy shifts result in, one, diminished funding for research and development in the US in science and engineering, and, two, an emerging environment of fewer visas for the foreign talent that we heavily depend on in the development of these technologies and in basic discovery, that risks undermining US leadership in many of these realms and, frankly, further advantaging authoritarian competitors, especially the People's Republic of China.

There are very few things that worry me more than this right now. All of this is happening (my fifth and penultimate point) at a moment when the competition between democracies and autocracies is intensifying and when the global recession of democracy that's been underway for nearly two decades is deepening. (Mike McFaul, the director of our Freeman Spogli Institute and our colleague as well at the Hoover Institution, who has just finished a book on this, has walked into the room.)

And the final point, which we're going to hear discussed here, is that the transfer of digital technologies of surveillance and control from the world's most powerful autocracy, which I call a neo-totalitarian system (in my view, that is not a misplaced characterization of the People's Republic of China), to less technologically sophisticated countries, including some that are formally, and in other cases formerly, democratic, is accelerating at the same time.

And as the balance of power shifts in the world, if that shift continues, democracies will have less leverage to monitor and restrain this transfer, and they may wind up having less will to aid democrats in these countries who are struggling to push back against these technological intrusions and to defend their rights.

So with that, I introduce you to the vice president of the most important organization in the United States that helps people around the world to defend their rights, Chris Walker of the National Endowment for Democracy.
>> Chris Walker: So thank you very much, Larry, for that. Let me just say a few more words to build on some of the ideas that Larry shared. I'd suggest that this event is dedicated to discussing a truly important digital challenge to democracy.

And the discussion is an outgrowth of work we've been doing with wonderful partners here at Stanford, including a workshop we held last year. I wanna take a moment to recognize Larry, Frances Hisgen, Glenn Tiffert, and Elizabeth Economy, who's a member of our board. I'd also like to thank Eileen Donahoe for her leadership.

She's a member of our board, our vice chairman, who's here with us as well, and she's done so much on these issues over the years. It's really helped us to situate our thinking at NED. Right now, this work is helping us situate the current dynamics around the proliferation of China's authoritarian technologies.

And to look at how emerging technologies, from digital currencies to quantum computing, may transform the way authoritarians collect, process, and make use of digital data. Today, the world confronts diametrically opposed visions of the future of freedom. Fifteen years ago, many commentators assumed that liberation technology would lead to a world where the free flow of information broke down walls of authoritarian censorship.

And that people facing repression would find new opportunities to connect and organize, and that autocrats would find themselves on the back foot. China's trajectory in the intervening years shows us otherwise. In a system that closely entangles homegrown tech champions and the party-state, a resurgent Communist Party is leveraging the power of data-driven technologies to subject citizens to pervasive surveillance and maintain a closed digital ecosystem that's censored and suffused with state propaganda.

And to identify new population-level methods to reward favored and penalize disfavored behavior. Rather than empowering people and fostering open debate, technology functions as a lever to keep the powerful on top, establish narrative control, and close down ever more space, online and off, for manifestations of independent civic life.

While techno-totalitarianism as practiced in the PRC may for now be unique, China's global economic, technological, and political clout creates vectors for authoritarian practices to spread globally. Companies like Huawei provide the pipes, such as 5G networks, through which information flows, and supply governance packages such as "safe cities" that shape how local officials understand, and in some cases repress, their people.

The technologies and training offered by PRC vendors bolster the practice of authoritarianism elsewhere in the world. Meanwhile, the secretive and often politically inflected deals that surround PRC tech exports provide an opportunity for the CCP to extend a shadowy web of influence over foreign political and economic elites.

Fellow autocracies from Russia to Iran to Belarus are leveraging the power of AI surveillance to crack down on dissent, while Venezuela's ruling party has turned a PRC-sourced digital ID system into a powerful instrument of dictatorial power. In the UN and global technical standard-setting bodies, the PRC is showing its resolve to act as a norm shaper rather than a norm taker, strategically occupying key leadership positions and advancing norms antithetical to basic democratic values, from public participation to free expression.

Over the past 15 years, the consolidation of the PRC techno authoritarian model and the avalanche of democratic backsliding around the globe have offered a stark reminder to all of us that we shouldn't rest on easy assumptions about where the world is headed or unduly limit our vision of the possible risks.

Therefore, the report authored by Valentin Weber, which Larry commended to everyone and which we released earlier this year, offers a glimpse of just how pervasive the dragnet of surveillance and manipulation threatens to become in view of emerging technologies that will enable authoritarian actors to break into not only currently encrypted data but also the basic privacy of our thoughts.

For its part, the National Endowment for Democracy is proud to lend its support to the courageous people around the world who are working to shed light on secretive cross-border deals, outsmart increasingly sophisticated authoritarian censors, and put technology to work in the service of principles like transparency and public engagement.

And as we move into a quickly evolving, heavily contested digital future, the democratic community can and must put forward a vision, an alternative to the authoritarian one that's on offer from the CCP. So with that, let me introduce the panel very briefly. I'm gonna start with Charles Mok.

He's a research scholar at the Global Digital Policy Incubator of the Cyber Policy Center at Stanford University (that's a mouthful), a member of the Board of Trustees of the Internet Society, and a board member of the International Center for Trade Transparency and Monitoring. Charles served as an elected member of the Legislative Council of the Hong Kong Special Administrative Region, representing the Information Technology functional constituency for two terms, from 2012 to 2020.

Valentin Weber, who's the author of the report that's the basis for this discussion, is a Senior Research Fellow with the German Council on Foreign Relations. His research covers the intersection of cybersecurity, artificial intelligence, quantum technologies, and technological spheres of influence. Valentin Weber is also a China Foresight Associate at LSE IDEAS, the foreign policy think tank of the London School of Economics and Political Science.

And he holds a PhD in Cybersecurity from the University of Oxford. Finally, let me introduce my colleague Beth Kerley, who's a Senior Program Officer with NED's International Forum for Democratic Studies. She's an editor of and contributor to the Forum's series of publications on emerging tech and democracy, including the report we're discussing, Data-Centric Authoritarianism.

So with that, I present the three speakers on our panel, and welcome again to everyone.
>> Beth Kerley: All right, so thanks Chris and Larry, and thanks to all of you for being here. Appreciate the Hoover Institution hosting this discussion of our very timely and important report. So for the moderated portion of this discussion, there are kind of three broad things I want to cover.

The first is this concept of data-centric authoritarianism: what it is, how it looks in China, and how it's spreading. The second is how frontier technologies are potentially changing the game. And the last is going to be what we can do about it. So starting at the conceptual level, the title of the paper is Data-Centric Authoritarianism.

What's that idea all about? What features of the authoritarian system that we see in the PRC, the laws, practices, and bureaucratic institutions, make data important, and make it make sense to think of the very different technologies discussed here as contributing to the same political project?
>> Valentin Weber: Thank you so much, Beth.

Thank you to the National Endowment for Democracy for making this report possible, and also to the Hoover Institution for organizing this event and bringing us all together. So, Data-Centric Authoritarianism: the report is really about the role of data in China's surveillance state. For a long time, China has relied on information on people, on informants, on people in the security apparatus reporting on dissidents and so on.

But really, in the early 2000s, there was this drive for digitization, for putting up CCTV cameras and so on. And throughout the last two decades, data became central to the CCP's idea of how it wants to govern. So, a kind of scientific idea of using data every day to predict people's behavior, to see in real time where people are, to decide whether or when the police should crack down on a protest, because the police can't be everywhere, right?

If it's just a small protest, you might decide not to. And that's really what the data gives you, that insight as to what to do. And what is really a change, I guess, and what is the core concept of this paper, is that these frontier emerging technologies are at this time profoundly changing again what the Chinese surveillance state is.

And so I looked at four technologies. One is quantum technologies, and especially quantum computing, which is projected to break current encryption in the next five to ten years or so. And here the idea is really that the CCP would get access to data that is currently protected. Let's say you use the Tor browser in China, which is difficult, but if you use it, you're protected by encryption.

In the future, that wouldn't be possible if the encryption hasn't been upgraded to post-quantum cryptography. The second technology is AI. And here the CCP has used it to make sense of very, very complex systems, of millions of people's behavior, and to look at the patterns.

And I think one of the core points of the report here is that the masses of data that the CCP is processing are growing and growing, and also that control is becoming more centralized. So we have at the moment these kinds of command centers where police use data from cities.

And there is already a development where it started in cities and is now reaching the provincial level, where provinces can see what's happening in the cities within them. And so the current estimate is that, on their screens, the Chinese security services can see what around 20% of the population is doing, over 200 million people.

So that's how centralized control has already become, and that's because of AI. The third technology is really the metaverse and neurotechnologies. The metaverse is mostly AR and VR, and neurotechnologies can be invasive or non-invasive, looking at thoughts. And that's a technology which gives the CCP access to new data. In the 20th century, they weren't able to look at thoughts, but now it's getting easier, by looking at things like pupil dilation and things similar to it.

And lastly, the final frontier technology that the report looks at is digital currency. Here again it's especially about what China did with its central bank, which instituted the e-CNY, the digital yuan, and which can potentially centralize control over financial data within China. And the core of the report is not just to look at these technologies separately, but to look at what all of them together could do in transforming the surveillance state and shaping data in a way that was previously not possible.

Making it more centralized within China, giving new access to new data, and also giving access to data that might previously have been protected from the prying eyes of the surveillance state.
>> Beth Kerley: Thanks, Valentin. So running through those comments: this idea of centralizing data, right, creating a single locus of control based in part on instruments that are already there, particularly the cameras that blanket physical space in China.

There's also, of course, the question of digital surveillance, which maybe Charles could say a bit more about, and then integrating that using AI. So there's a foundation of already existing surveillance tech and the potential for these new capacities to augment it. But before we get too deep into the specific areas of tech development, Charles, I want to turn to you. You've done a lot of different work looking at the model of digital authoritarianism that we see in the PRC, and particularly how it's projected outward.

So is there anything you want to add on the goals of this pervasive web of digital control and how we see it being promoted globally, for instance in technical standard-setting bodies?
>> Charles Mok: Yeah, thank you, Beth, and thanks, Valentin, for the report. So to me, if you look at the regime in China and how they look at various aspects of their society, including technology, business, culture, education, everything, it all has to serve the party.

The party is the only goal. So to me, when I look at how they have viewed technology ever since the Internet was introduced to China, probably around the same time as for the rest of the world, with the commercialization of the Internet in the early 90s, I think very quickly they figured out that they had to find ways to control it. Rather than refuse to let it come into China, they allowed it to come in, but they had to make sure that they could control it, make sure that it would serve the party.

So that's when you see all these concepts and developments, from the Golden Shield Project to the Great Firewall, which is more of a passive way to censor the Internet coming in. But then again, very quickly, I think before people talked about big data, they figured out the importance of data.

So they started to collect all these data before they knew how to analyze them, before they had the technology to be able to analyze them. That, I think, evolved into the mechanism and the philosophy in China of adopting further control through surveillance, and also of using these mechanisms and technologies as more of a propaganda tool that they can take advantage of.

So if we look at the ways they are trying to control and work in the technical standards community, I think this is very similar to the philosophy China has had toward technology over the last several decades: as it comes into China, they adopt it, and they find ways to control it and to control its development.

So I would say that, number one, they are very well planned, but they don't necessarily look for immediate perfection. They don't need to have the whole strategy mapped out, but they can be very quick to adapt, because they also realize that they are in a developing mode, especially in the beginning.

So they don't mind, as we say in Chinese, feeling the stones while crossing the river: they figure it out as they move along. Now, when they deal with the international technical standards community, there are two different types of organizations. You have the more top-down, intergovernmental, nationally controlled ones, such as the ITU, the International Telecommunication Union, and the working groups under it.

And then you also have another type of standards organization, such as the Internet Engineering Task Force, the IETF, and the IEEE and so on. These are more multi-stakeholder, bottom-up technical societies. In both, China is trying to exert more and more control through participation, because particularly in the latter, the multi-stakeholder and open technical societies, they are free to participate.

So they provide resources, and they participate at a high level and frequency. And of course, looking particularly at the UN organizations, they use the same strategy and participate at a very high level there too. So for the last 20 years or more, I would say, they have been increasing this level of participation.

And in the last several years, I think they are trying to change the mode of operation of these organizations in a couple of ways. First of all, they create and propose new standards that fit their philosophy of future technologies, which we would describe as adopting elements of surveillance into the technology.

But to them, this is all about creating a safer and more secure network environment, right? Because obviously today we have so many problems with crimes and scams and so on on the Internet, and so they're trying to figure out ways to help law enforcement tackle these issues. So that's number one.

And number two, the way they are trying to change the system, particularly the standards organization system, is that they want to change the governance aspect of it. They are strategically trying to shift some of these technology standards organizations, and the work that they do, from the multi-stakeholder organizations over to the intergovernmental United Nations organizations, where they feel they have a better chance of controlling and influencing the outcome through governments.

And through other friendly governments, or the Belt and Road countries and so on, through Chinese influence. So I think these are the two typical ways that, over the last 10 or more years, China has been gradually trying to increase its level of influence in standards bodies around the world.


>> Beth Kerley: Thanks, Charles. I recognize some points from a conversation that we held last fall as well with our colleagues at the Center for International Media Assistance, looking at the challenge facing civil society advocates and supporters of democracy more broadly in attempting to engage on standards in this new environment.

It strikes me that in both of the stories you tell, there's a kind of progression from a more reactive, defensive mode to a more assertive one, right? So from tech as, okay, this is dangerous, how do we stop the free flow of information, to tech as, hey, this is a map of our population that we can leverage for control, this is really cool.

From trying to shift dynamics and proposals within standard-setting bodies that are maybe set up in ways that don't work so well for the CCP, to actually trying to change the way they're run to shut out nongovernmental voices. And then there's the question of the dissemination of these norms.

So the standard-setting bodies are one vector for that, right? You can have authoritarian ideas about surveillance baked into your standard for smart cities. Or there's the case everyone is familiar with, the so-called New IP, which was proposed and rejected a few years ago and would have increased centralized state control over the Internet.

But another piece of this question that's really important is what happens on the ground. Because in China, of course, you have a very well-established physical, human authoritarian infrastructure: lots of investment in internal security, this idea of grid management, all of it combined with particular technologies. And now these technologies are being exported all over the world, to very different settings.

And there's a pretty robust debate among scholars about the extent to which you can see PRC-like systems take root elsewhere. So, Valentin, you engage with that a little bit in the paper. Tell us about that debate and where you come down.
>> Valentin Weber: Sure, yeah. So the debate is really about whether the Chinese model, or the Chinese approach to surveillance, can also be implemented elsewhere.

And the main argument against it is that you can export the surveillance gear, you can export CCTV cameras, but the police on the ground in different countries won't be able to use it, because what China has is a very sophisticated security infrastructure, with organizations that have large funding and are very, very sophisticated.

And so on that premise, you could maybe export it to Iran, Russia, and so on, because those are countries which also have very sophisticated security organizations. But most other countries, developing countries, wouldn't be able to take China's approach, because you would export the technology and they just wouldn't be able to use it.

But I think on the ground there's a very different reality. I would make a bet now and say that Chinese surveillance tech can be found in every country across the world. Whether it's being used in public infrastructure or by the state is a different question, but it's found everywhere.

And even the poorest countries, developing countries, or countries like Venezuela, do import the technology. Even if they can't afford everything, they have a priority, and that's regime security. Because of regime security, they will spend on those smart cities and so on. And even if they don't have the money, China will find a way of giving them the surveillance gear in exchange for, let's say, oil.

There was a deal between Ecuador and the PRC where Ecuador gave the PRC oil and got surveillance gear in exchange. So really, everyone, let's say, can afford it. But then again, there is the question of whether they can implement it if they don't have a sophisticated police force, and that is really where the private companies, the Chinese tech giants, come into play.

We saw in Uganda that Huawei was training local police officers to use its gear and to get access to dissidents' phones. So we can really see that even though these countries might not be immediately capable of doing it themselves, they'll get the support from Chinese companies to get access to those phones.

So the real lesson here is that these countries won't be able to completely copy China. What China did is really remarkable; it has its own tech giants such as Tencent, Baidu, and so on. A country like Uganda won't be able to do that, but it will be able to buy the gear, and it will get help from China to implement it.

And China will do everything it can to support that. So I think the reality on the ground already shows that the surveillance gear is diffusing, and also that the model is being exported and imitated abroad, quite successfully.
>> Beth Kerley: And I think that also brings us to a really important point that it's not just the technologies that are going abroad, right?

It's often advisors, trainings, and so forth that can also export certain ideas about how the technology ought to be used. You mentioned the role of companies, and I want to turn to Charles with one final question on this segment. So, any responses to that? And also, given the ups and downs we've seen in the CCP's relationship with some of the tech giants over recent years, from a raft of regulatory crackdowns to a seeming warm-up in the past couple of months, the thinking being, okay, private companies are going to help us with the AI race, so maybe we need them.

What does that mean for the role of private companies from the PRC in this export of digital authoritarianism?
>> Charles Mok: Well, before that, actually, I wanna respond a little bit to what Valentin was saying about how China exports the model and so on.

I recall that two years ago I wrote a paper on the Great Firewall's development over the last 20 years in China. And one of the things in my report was that China's Great Firewall model is very difficult to export. I said exactly the same thing: that it takes huge resources, including a huge amount of human resources, in order to make it work.

At the time, we were referencing the example of Cambodia, because that country also wanted to use Chinese technology to implement its own great firewall. And it hasn't been successful, because of the lack of the same kind of infrastructure, including control over the telecom companies, which they don't have in Cambodia the way China does with its state-owned company structure and so on.

But having said that, I think the reality right now is that China doesn't have to export the whole thing. They have figured out that if these countries are not as capable as China, they can still use, as you mentioned, companies such as Huawei and others to fill in the gaps.

So these companies would play a role in the scheme of things for China around the world, and this partly answers your question as well, in terms of providing training to these countries. They call them cybersecurity trainings, they call them anti-cybercrime trainings, which is what everyone would need, right?

And if a country doesn't have those resources and capabilities, it might as well outsource them. An important part of it is that controlling this infrastructure in these other countries would benefit China: if we worry about back doors and so on, if they really exist, then China would get access to the data, whether through the back door or the front door, whatever; they get access to the data too. So it works to China's advantage no matter what.

But back to the question about the role of the companies and the relationship warming up since the crackdown on the commercial tech sector in China, which started about two or three years ago: I actually don't think the situation has changed that much in terms of the relationship between China's tech companies and the government and the party.

Because to me, it's all in the family. It's as if the children were not very obedient and forgot the values of the party, so I have to spank them a little bit and they cry. Then they come back and become good kids again; they are obedient again.

They obey. And the global situation has changed as well. Remember that in 2020, when they were cracking down on companies like Didi and Alibaba, those companies were perceived to be getting out of the Party's control, and the sanctions and so on that we have been talking about for the last two years hadn't really started yet.

So tech geopolitics has changed between that time and the last two years as well. I think, out of practicality, and because these companies are coming back into the fold of the party, this is a time when the Chinese government actually needs these companies.

The government shapes them as well; we've been talking about moving them from soft tech to hard tech. So in many ways, companies that were focused on making quick money by creating games and so on are now making chips, AI, and other hard tech that the government believes is more important for the country's future, or the party's future.

So now, as before, the government makes sure to exert the right amount of control so that these companies serve the party's needs. I think that really hasn't changed, but the relationship is warming up a little bit because, compared to when these companies were first cracked down on three years ago, the global situation has changed.

So the party right now does need these companies, both for real technology development and advancement and for propaganda purposes as well. I mean, think of the ways they have been using the success of DeepSeek domestically.
>> Beth Kerley: Thanks.

So that's a good explanation of why it makes sense to think of both the public and private layers of the PRC tech apparatus as pursuing common goals, particularly around this idea of security, defined as regime security, right? And another point that came out in there that I think is really important to keep in mind is that when we're looking at the proliferation of PRC tech from a democracy point of view, there are at least two different risk angles.

One, which we've been focusing on, is about potentially reinforcing authoritarian practices in the importing states: local law enforcement learns how to use facial recognition to identify protesters, and so on. But the other angle is that a lot of the data may be going back to the PRC and being used to train AI or for other problematic purposes.

Valentin, any comments on the private companies angle before we move on to the four technologies?
>> Valentin Weber: Yeah, just a quick one. I think there was a lot of public debate about, as you said, companies misbehaving and so on, but privately everyone knew who was in charge, because there are laws in the PRC like the 2017 National Intelligence Law, which requires companies to share all data if requested, and even to share data proactively with the government. And there are so many levers of power the government can use to make life difficult for companies. It was always clear that they have to stay in line with the party, whether at home or abroad.

So I think that's more of a public display of "we're privately owned, we're independent" and so on. But if you look at the power relationship between the party and the companies, it's very clear, and it's always been the same: the party is in charge, and if it needs to, it will impose its power and its will on the private companies.


>> Beth Kerley: Thanks, so we've got a picture at this point of a pretty pervasive web of authoritarian institutions spanning the public and private sector, spanning online and offline, and this is already in operation and already being exported. So to what extent is the development of the frontier technologies studied in this report going to make a difference, right?

For instance, if we can assume that the CCP probably already has a pretty high level of access to people's information from Alipay and WeChat, how much of a difference would the transition to the e-CNY, China's digital currency, make? And in general, how big do you see the impact from the changing tech landscape itself being?


>> Valentin Weber: Yeah, it's a very good question. And the question is really, why does China do what it does? Why does it want a central bank digital currency? What's the advantage if it has access to the data anyway? So let's say you're the state. Currently you have banks, credit card companies, and digital payment apps such as Alipay or WeChat Pay, and you get your information from them; you inquire with them.

And so in that case, you really have a bit of friction there. You need to get the data, you need to inquire, and you might not be sure they give you everything. But by instituting a central bank digital currency, the e-CNY, you do away with that layer.

So you'd have direct access to the financial data of citizens across the country. But with the e-CNY, it's interesting, because it hasn't really gotten much adoption. People already have the digital payment apps and are accustomed to them, and even though the party offered incentives to get people onto the digital currency, adoption has been very slow, because people just don't see the point.

And here it's actually an interesting case where the state wishes to move people in that direction, but because the digital payment apps are so convenient at the moment, people are sticking with them. But to put it very briefly, it would really do away with a lot of friction.

It would give direct access to financial data, and more centralized control over it. That data is really crucial because it tells you so much about people: where they spend money, where they are at a certain moment. So it's a really powerful way of understanding citizens better and controlling them more effectively.
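The friction Valentin describes, inquiry-based access through intermediaries versus direct access via a central ledger, can be illustrated with a toy model. Everything below is a hypothetical sketch for illustration; the class names, data, and behavior are invented and do not model any real payment system:

```python
# Toy contrast between (a) a state querying private payment intermediaries
# and (b) a central bank digital currency (CBDC) ledger the issuer reads
# directly. All names and data are hypothetical.
from dataclasses import dataclass, field

@dataclass
class Payment:
    payer: str
    payee: str
    amount: float

@dataclass
class PaymentProvider:
    """Private intermediary (e.g. a payment app) holding its own records."""
    name: str
    records: list = field(default_factory=list)

    def pay(self, payer, payee, amount):
        self.records.append(Payment(payer, payee, amount))

    def respond_to_inquiry(self, user):
        # The state must ask each provider separately; this is the
        # "friction" layer: per-provider requests, uncertain completeness.
        return [p for p in self.records if p.payer == user]

@dataclass
class CBDCLedger:
    """Central ledger: every transaction is visible to the issuer."""
    records: list = field(default_factory=list)

    def pay(self, payer, payee, amount):
        self.records.append(Payment(payer, payee, amount))

    def direct_lookup(self, user):
        # No intermediary: the issuer queries its own ledger in one step.
        return [p for p in self.records if p.payer == user]

# Intermediated path: two providers, two separate inquiries to assemble a view.
a = PaymentProvider("AppA")
b = PaymentProvider("AppB")
a.pay("alice", "shop1", 10.0)
b.pay("alice", "shop2", 5.0)
via_inquiry = a.respond_to_inquiry("alice") + b.respond_to_inquiry("alice")

# CBDC path: one ledger, one query.
ledger = CBDCLedger()
ledger.pay("alice", "shop1", 10.0)
ledger.pay("alice", "shop2", 5.0)
direct = ledger.direct_lookup("alice")

print(len(via_inquiry), len(direct))  # 2 2 -- same view, but no inquiries needed
```

Both paths recover the same transactions; the difference is structural: the CBDC path removes the per-provider inquiry step entirely, which is the point made above.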


>> Beth Kerley: So that's one example where authoritarian practices are already quite deeply entrenched, but the system of control could become even faster, more pervasive if you take away the things that get in the way. Charles, any thoughts on this question of how much of a difference do frontier technologies make, and perhaps in particular the recent advances we've been seeing in AI from China?


>> Charles Mok: Well, on the central bank digital currency example: in addition to the individual-level usage Valentin has been talking about, having a digital wallet for the renminbi and so on, there's also the aspect of cross-country transfer of digital currency, which would enable the CCP and China to bypass the global banking system, something they have always been trying to do.

Let me see, I forget which country, but I was just reading news that China very recently signed an agreement with yet another country, I forget whether it was in Asia, to do this kind of digital currency transfer.

So this is actually not just working at the personal or local level. Not to mention that they can use this to effect immediate control over people's finances, in addition to holding onto their data obtained from Alipay or WeChat Pay and so on.

But I always think that at some point, if the government really believes it needs to increase control to that particular level, it could work with these companies and just convert Alipay into a nationally controlled system if it chooses to. But they probably think this isn't the time to do it yet.

They don't need to. But eventually, if they really want to, they can push it. What I mean is that Alipay isn't an obstacle; in the end, they control it, and they can even nationalize the whole thing. So I think China is playing a long game here, carrying out an experiment in using a CBDC to achieve its financial aims.

And let's not forget that when China first started on the idea of implementing the e-renminbi, the central bank digital currency, it was actually inspired by Facebook, because Facebook was trying to create its own digital currency on its platform at the time, and it was not successful because of opposition from the US Congress and so on.

Then China picked up the idea and got the system running in two years. They can learn and adapt; that's what I was trying to say.
>> Beth Kerley: And a question to both of you, sort of building on that. Larry at the outset mentioned we've seen a shift in the situation over the past year, in part due to apparent PRC leadership in some of these areas of tech development.

So one of the things that's been on a lot of people's minds in the tech community lately is, of course, DeepSeek. And I think that shifted the narrative a bit on advanced AI, and generative AI in particular. In the immediate post-ChatGPT fervor, you could see a lot of commentary that this particular type of AI maybe is not well suited to authoritarian systems, because censorship limits the availability of training data, or because they'll be scared of it because it's unpredictable.

And so whereas the PRC is very good at biometric surveillance AI, for instance, the free world has an advantage in generative AI. But DeepSeek has clearly called that narrative into question. So how are we thinking about the implications of DeepSeek for the surveillance state and the PRC authoritarian system in general?

How important is that going to be? Are they gonna succeed in threading the needle of employing gen AI to the max while maintaining a high level of censorship? Starting with Valentin.
>> Valentin Weber: Yeah, DeepSeek was really a game changer, and interestingly, the CCP didn't see it, or at least in public doesn't see it, as a threat.

We can see now thousands of police officers being trained in how to use DeepSeek. Already, a few weeks afterwards, there were training sessions on how to use it for writing police reports, and how to use it to query large volumes of video footage very effectively. So they're really embracing it.

And poorer provinces especially are now also deploying it, because it's open source and cheaper; provinces that weren't able to roll it out previously can do it now. So it's really making AI even more widespread than it used to be.

In terms of censorship particularly, could it be a danger when censorship isn't working, when DeepSeek, the model, isn't censoring the content it should be? I think it could really go two ways. Recently we did see that people can get DeepSeek to say things it shouldn't, and you can get around it.

But I think ways are already being developed to put that in check. And it's interesting that even here in the US, I think it's Anthropic that is working on AI that is able to check that another AI is behaving. So solutions are already being put in place to make AI do what it should be doing.

And I think China will of course also do that if it's in its interest to hold regime security. So I think technologically it's possible that, you know, censorship will be upheld, but at the same time it could also go very, very wrong because AI is becoming increasingly autonomous.

And we saw that AI can go a little bit rogue. There are cases where AI has disabled the oversight mechanisms it had, or diffused to places it wasn't supposed to go. And I think there's a particular danger there, especially as China is already experimenting heavily with agentic AI, which is able to execute decisions on behalf of the CCP.

Right, it's not just giving analysis; it's already able to act. And that's gonna be a real danger if those AI agents are not understood properly and if they're acting in a direction which might not be in the interest of the CCP.

So I think at the moment it's still very much an open question; it could be a bit of an unpredictable gamble that the CCP is taking. But they're really embracing DeepSeek as much as they can for the public security sector at the moment, and I really see a danger there.


>> Beth Kerley: And just quickly, before we move on, because you recently had a piece in the Journal of Democracy on this, and I think it's a bit different from how most people think about DeepSeek and generative AI: what do AI advances mean in the security context specifically, and in particular the use of agentic systems?

What could that look like on the ground?
>> Valentin Weber: Sure, I'll give you one example. One case is where you could query hundreds or thousands of hours of video footage to ask, how did this car behave in the last 24 hours? Just that: pose the query and get an answer.

But in terms of agentic AI, I would say the most advanced I've seen currently is that in cities, AI agents are already working together, coordinating with each other to implement orders. Let's say you would have an agent in the commercial field and an agent in the transport field.

The transport one would be monitoring or operating traffic cameras, traffic lights, and so on, whereas the commercial one would have a lot of data on the commercial field. And the idea, as it's conceptualized in China at the moment, is that these AI agents would work together, and that there would be a "super AI agent", that's what they call it, coordinating the field agents toward a certain goal. So say there was a protest, and you tell the super AI agent: let's prevent that in the future. The super AI agent would then coordinate the field AI agents to prevent it.

It would tell one AI agent, identify all the people who were in that protest, and another AI agent, tell me where those people were in the last 24 hours. And together they would execute decisions: an AI agent could, let's say, reach out to all the contacts of the people who were in that protest and tell them they shouldn't be talking to those people. So it's really automating everything, basically automating control, in a way that until recently was unimaginable.
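The coordination pattern Valentin describes, a "super" agent decomposing a goal into sub-tasks for specialized field agents and merging their answers, can be sketched as follows. The agents here are plain functions over invented toy data; nothing below models a real system, and the names are hypothetical:

```python
# Toy sketch of hierarchical agent coordination: a coordinating agent
# chains two specialized "field" agents and merges their outputs.
# All data and agent names are hypothetical illustrations.

def video_agent(location):
    # Field agent 1: who appeared at a given location? (toy index)
    footage_index = {"plaza": ["person_a", "person_b"]}
    return footage_index.get(location, [])

def movement_agent(people):
    # Field agent 2: where was each named person recently? (toy history)
    location_history = {
        "person_a": ["plaza", "station"],
        "person_b": ["plaza", "market"],
    }
    return {p: location_history.get(p, []) for p in people}

def super_agent(goal_location):
    # The coordinating agent decomposes the goal into two sub-tasks:
    # first identify people at the location, then trace their movements,
    # feeding one agent's output into the next.
    people = video_agent(goal_location)
    traces = movement_agent(people)
    return {"participants": people, "recent_locations": traces}

result = super_agent("plaza")
print(result["participants"])  # ['person_a', 'person_b']
```

The point of the sketch is only the structure: the super agent never touches raw footage or histories itself, it routes sub-queries and combines results, which is why adding more field agents scales the pattern.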
>> Beth Kerley: Thanks for that picture. So a lot there to be concerned about, both if it does what the CCP expects and potentially also if it doesn't.

Charles, thoughts on AI and DeepSeek?
>> Charles Mok: Well, when we think about DeepSeek from our perspective, we tend to think about all the potential issues: privacy, security, data leakage, disinformation, or, even from China's perspective, as you mentioned, Valentin, that they want to censor but the censorship is imperfect and makes mistakes and so on.

But I think from China's perspective, they've been there before: the whole Internet was like that, and they figured out a way to make it work for them. So to China these are minor issues they have to fix on the way to complete control, and I think they are still confident they will succeed. To them, the most important thing at the moment is the race to adoption.

So just as you mentioned, they're adopting it in big ways in all kinds of aspects of society: business, the economy, manufacturing, education, the court system, the military, and so on. But I think the danger here is that they really haven't focused at all on the safety and security aspects.

And that might come back to bite them in the end, but we're not sure yet. Of course, at the moment I think they're more keen on making sure they win this race to adopt, so they can have a leg up on their competitors, the Western countries, and so on.

So this is what they're focusing on at the moment. But I don't know how it will turn out in the end, whether some of these issues will come back to haunt them.
>> Beth Kerley: All right, thanks for that. So perhaps turning to something that will play into the answer to that question, democratic responses.

There are a couple of different angles to this: one, what we can do to improve our efforts specifically to counter the proliferation of authoritarian technologies and practices, including from the PRC, and then also the possibility of offering something different. So first I want to give you both a chance to offer a few suggestions on the first front.

What are one or two strategies that you think would be particularly effective in countering the proliferation of digital practices and digital norms that stem from authoritarian systems? Whether among researchers, in standard-setting bodies, or on the ground elsewhere, how can we raise our game in that area? Starting with Valentin.


>> Valentin Weber: Yeah, so what can we do? I guess reining in technology diffusion will be very difficult. We have seen a coordinated effort to do that, and we still end up with Chinese surveillance tech everywhere. In very authoritarian regimes it's difficult to prevent the misuse, but I have seen positive things elsewhere.

In hybrid regimes it's a bit easier, or in swing states or more democratic-leaning countries. One such case, I think, was in the Philippines, where a smart city was prevented because an opposition senator raised it in parliament.

It was blocked on national security grounds. In the Philippines, even the military prevented another smart city because it was in a strategic location in the north of the country. So we can see already that that can be a factor.

In Mauritius too, I think, there was a very interesting parliamentary debate about Chinese smart cities, and that was very good because it was public. So opposition politicians are very important, and especially regional coalitions of opposition politicians who face a similar challenge with Chinese tech.

Bringing those people together so they can exchange practices and knowledge can be very good. I've been part of one such meeting of opposition politicians in ASEAN countries, where they exchanged ideas on how to face Chinese surveillance tech, and I think that was really good.

Another one is highlighting Chinese surveillance tech and bringing it out of the shadows. Often people just don't know that surveillance tech is present. In Belgrade there's a very good project highlighting where CCTV cameras are, taking them out of the shadows.

They created a map of all Chinese CCTV cameras deployed in the city, and I think that can again bring civil society together. You saw it recently also in protests against the government. It's really bringing people together and also focusing their attention on those issues.
>> Beth Kerley: So coordinating at the political level and making the public more aware.

Charles, countering CCP digital authoritarianism?
>> Charles Mok: First, about the standard-setting bodies: I think democracies have to reiterate the importance of supporting multi-stakeholderism, the bottom-up standard-setting process, really put resources into the effort, and prevent the attempt by some countries to move standard setting away from the multi-stakeholder, bottom-up professional organizations such as the IETF and IEEE, over to ones that are controlled by governments, such as United Nations and ITU working groups.

So that is the first one. The other thing I'm very worried about is that democracies are increasingly losing the moral high ground when we talk about these issues of digital authoritarianism. Part of it is because of cybercrime, which ironically is in large part created by countries such as China.

But right now every country is scared about the impact of cybercrime and scams on its citizens. So you see the knee-jerk reaction from many governments is to say that we need to give law enforcement more power: backdoors to encryption, and legal power to get data more easily from platforms, Internet companies, telcos, and so on.

That is to me a very dangerous trend. Fortunately, there are still some countries standing on firmer ground, saying that encryption is important for protecting everyone's privacy, and that we should not take the short-sighted approach of giving law enforcement the secret key, because soon enough that secret key will be accessible to the criminals as well.

So some countries or groups of countries, such as the EU or France, apparently are still sticking to that principle a bit better. The US only woke up to it because of Salt Typhoon, the hacking from China into US telcos; then they said that encryption is important.

They didn't say it because they believe privacy is important, so the US position, to me, is still a little bit uncertain. My worry is that democracies are increasingly losing the high ground in this debate about security versus personal rights and privacy.

And that's the second worry that I have. So finally, echoing what Valentin said as well: educating people about the importance of their own privacy and the risks of using these Chinese technologies and apps. People were so happy that they could download a DeepSeek app and play around with it, without considering or looking at the terms of use and so on.

Obviously, nobody reads those. But maybe they should really get worried just knowing that it's a Chinese app; nobody did, so DeepSeek is one example. The other worry I have is that even in this country, for example, when there was news about TikTok being shut down imminently, you saw a lot of young people, users, flocking over to Xiaohongshu, the Little Red Book or RedNote app. That to me is deeply worrying as well, because it almost seems like these users were jumping from one Chinese app to another simply as a protest action against a potential US ban.

And I asked why; they should realize that this is not a safe thing for them to do, but it obviously never crossed their minds. So I think a lot more education and awareness about the dangers and risks of using these Chinese technologies is needed.


>> Beth Kerley: And one piece that I want to throw in there. You spoke a good deal about cybercrime and the risks from government approaches, and one landmark on a lot of people's minds is the recent adoption of a UN convention against cybercrime, which raised a lot of concern among human rights defenders because of some vague provisions that could potentially be weaponized by authoritarian governments, provisions that basically define cybercrime as saying things online that the government is not too fond of. Valentin, I know you also had some thoughts on the encryption piece.
>> Valentin Weber: Yeah, absolutely. About the cybercrime convention, just very briefly: it's maybe more about preventing even worse things from happening, because if the West and democratic countries weren't engaged in it, there would be a cybercrime convention without them, and it would probably have been even worse. It's not an ideal outcome, that's true, but that's where we are in the international space, and it's very difficult. Regarding encryption, we're at a very crucial time at the moment, because people are still arguing over whether there should be end-to-end encryption or not.

We saw recently in the UK end-to-end encryption being basically banned, and that's a very bad thing. It goes back to what you said: we've got to lead by example. And it's not just for our own national security, because it prevents China from hacking us or getting easy access to data.

It's really not just about national security but also privacy; here they're on the same side of the argument, and it benefits both. And again, coming back to the report, we're transitioning into an era where quantum computing might be able to break encryption in the next five to ten years or so.

That's gonna be a very crucial moment, a time of transition, which is also very dangerous, because we will very soon have widely implemented post-quantum cryptography. And there will again be the temptation to insert backdoors into that post-quantum cryptography to give law enforcement access. We need to resist inserting any backdoor government access there.

So we've got to be careful now and plead for encryption that is not subverted by the government, because China is gonna create post-quantum cryptography that has backdoors in it. We've got to have an alternative to that and really argue for not subverting it.


>> Beth Kerley: And a last question from me before we open this up, so please be thinking of your questions for our distinguished speakers. We were just speaking about the negative side of the response, how to oppose the diffusion of digital authoritarianism from the PRC. What are one or two recommendations you would make for affirmative responses from the democratic community, to offer a more rights-respecting vision of tech development?

And I'd be particularly curious about your thoughts on the potential role of privacy-enhancing technologies like federated machine learning, and whether that could help to reduce the authoritarian affordances in frontier tech. Starting with Valentin, and if we could be quick, so we can leave some time for our audience.


>> Valentin Weber: Yeah, sure. One thing I would say is that we've got to keep innovating, because if we have the first-mover advantage, we can really shape the standards out there, not only in international institutions but also on the ground. If you export technology, you're also setting how things look on the ground.

You can set the ethical boundaries there. Also, lead by example, as I said on encryption. And if we want to create a democratic digital ecosystem, the opposite of an authoritarian ecosystem, I think at the moment we have very isolated democratic technologies.

We do have Signal, which is end-to-end encrypted and very much in the Internet-freedom direction, and we may have other technologies, but there is no overarching ecosystem yet; it's small islands of democratic tech. And I think the US government has done great things in providing initial financing for a lot of these technologies; the Tor Project, Signal, and others came about with US government financing.

They developed from there. But that should definitely be continued, to create a broader digital ecosystem, with perhaps a more strategic vision of how to build a holistic system rather than small products that fulfill certain purposes, such as messaging.
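Beth's question raised federated machine learning as one privacy-enhancing technique. As a rough illustration of the core idea, federated averaging, clients train on their own data locally and share only model parameters, never raw data. The sketch below uses a toy one-parameter model and synthetic data; it is an assumption-laden illustration of the averaging step, not any production federated-learning stack:

```python
# Minimal federated averaging (FedAvg) sketch: each client runs a local
# gradient step on its private data; a server averages only the resulting
# parameters. Toy model y = w * x with synthetic data.

def local_update(w, data, lr=0.1):
    # One gradient-descent step on squared error, using only this
    # client's private (x, y) pairs. Raw data never leaves the client.
    grad = sum(2 * (w * x - y) * x for x, y in data) / len(data)
    return w - lr * grad

def federated_average(client_weights):
    # The server sees parameters only, not the underlying data.
    return sum(client_weights) / len(client_weights)

# Two clients with private datasets drawn from the true relation y = 2x.
clients = [
    [(1.0, 2.0), (2.0, 4.0)],
    [(3.0, 6.0), (1.0, 2.0)],
]

w = 0.0
for _ in range(50):  # communication rounds
    updates = [local_update(w, data) for data in clients]
    w = federated_average(updates)

print(round(w, 2))  # converges to 2.0 without pooling any raw data
```

The design point is the data flow: only `w` crosses the network each round, which is what makes this family of techniques a candidate building block for the less data-centralizing ecosystem discussed here.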
>> Beth Kerley: That's an interesting thought: moving from DemTech as almost an exception to DemTech as normalized.

Charles, a democratic digital ecosystem in two minutes or less?
>> Charles Mok: Yeah, I'll follow on from Valentin's point about continuing to innovate. But of course, if you want to innovate, you have to support research and support smaller companies and so on. That is what's needed in this country, as well as maybe in Europe and other democratic nations.

It's also very important for governments to take the lead in adopting these technologies. If there are privacy-enhancing technologies, is the government adopting them, or is it just focusing on consolidating the influence of the big technology firms?

That is also needed. They need policies that support not just innovation, but innovation by smaller players. And of course, as you said, we have good, privacy-enhancing applications such as Signal.

But then again, it just takes a couple of careless users, and then people think these are not good technologies, which is a shame.
>> Beth Kerley: All right, thanks. So from both of you, this idea that innovation can actually help get us to a better place in terms of pro-democratic technologies, and that a pro-democratic digital ecosystem doesn't have to be in opposition to innovation.

Questions from the room or from our audience online? Please raise your hand; we've got some microphones in the back. I'll try to take a couple at once and then give our panelists a chance to respond. Over there, please.


>> Speaker 6: First, thank you very much for this enlightening discussion. I had a question for Valentin. In your paper, you outlined seven critical steps for democracies or near-democracies to take, and I think you touched on some of those during the panel discussion. But I was curious about something in the paper.

Are you basically dooming the autocratic world to following in the footsteps of China and its technology and know-how? The steps you were outlining seemed kind of defensive, things for the democracies to do, but I didn't see anything there that was offensive, for us to prevent or even roll back what happens in authoritarian regimes.

So I was curious if that's something that you guys could touch on.
>> Beth Kerley: Thanks. That is an intriguing question. Anything else from the room? I just want to collect a couple to make sure that we get you in under the wire. Yes, please, at the table in the back.


>> Speaker 7: Thank you for the great panel. My question follows on the last comment Charles made, about the case of TikTok, for instance. I think that's a very pragmatic example for realistically discussing the idea of digital democracy. When it comes to US policies, how should we rethink democracy?

Does democracy itself still mean what it meant before, in, let's say, the 1940s and 1950s? The democratic ground that the US created through the notion of consumerism and through the creation of media, of democratic media. But today, given the situation we have at hand, does that still work, or should we rethink and redefine those democratic environments?


>> Beth Kerley: Thank you very much. And did we have one more? Yes, the gentleman in the middle of the room.
>> Speaker 8: Thank you for the talk. I'm curious about the open-source tools that are available. How do they compare with the state-of-the-art surveillance that China has to offer?

My background is on Iran, and in Iran there are ride-sharing apps, there's a thriving digital economy, and the government does have access to a lot of that data. So I'm curious: in a country like that, in a context like that, do you really need access to state-of-the-art technology, or can you impose that kind of authoritarian regime just with these open-source tools?


>> Beth Kerley: All right, thanks very much. We've got three great questions. So: are there any strategies for actually going on the offensive in authoritarian settings? What do we think about when we're thinking about the intersection of tech and democracy, and how are we thinking about democracy? And here I'd be particularly curious if you have any thoughts on the use of digital deliberation platforms, which is something we've talked about in previous forums at NED. And then the comparison between PRC tech and open-source tools.

So please answer the questions that feel most relevant to your experience, starting with Valentin.
>> Valentin Weber: Sure. Going on the offensive, yeah, it's a good point, because otherwise we just have a whack-a-mole game and we're always one step behind. So I would say the most sustainable approach is just gaining market share, doing more to promote democratic alternatives in those countries.

But especially in authoritarian settings, I think there are things being done at the moment to give people the tools to reach information, right? Anti-censorship tools, proper VPNs, not Chinese VPNs, which are government-surveilled. So there are things being done to give people access to information.

So these things are being done, which is good, and they should be done further. Beyond that, I wouldn't know; maybe you have some suggestions, which would be interesting, for giving more access there. But it's really about giving people access to information, and that's what's currently being done, which is quite interesting.

Regarding open source, I would say even DeepSeek is open source. But there are other open-source technologies out there, and authoritarian governments will use whatever they have at hand. If it's cheap, if it's not as good but it serves the purpose they want to achieve, they're going to use it, and they're not going to be too picky.

But I think on the democratic side, open-source tools are very much encouraged to proliferate and get out there in a very cost-efficient way. So, just to make the point that on both sides, democratic and authoritarian, there is sometimes this open-source element.

And it really depends on how it was conceived in the beginning by the producers and developers.
>> Beth Kerley: Yeah, the fact that DeepSeek is open source, and that it's part of the PRC digital ecosystem, is, I think, a bit of a jolt for those of us who have been used to thinking of open source almost as an inherent part of the pro-democratic digital movement, right?

Which it still is in many cases, but people can design open-source tools with different intentions in mind. Charles, any responses on these questions?
>> Charles Mok: Well, going on the offensive is an intriguing thought, but I can only try to answer the three questions by linking them all together. I think the best thing we can do is still hold on to the values behind the way we believe these systems should be designed: that they are rights-enhancing, that they protect people's privacy, and that they are designed, as much as possible, to protect people from disinformation and so on, all those important values.

My biggest worry is that right now democracies are acting more and more like authoritarians. So how do you go on the offensive like that, when you're becoming like them? We are becoming like them. So going back to basics, maybe defense is the best offense, if we really hold on to our values, if we can turn the current tide.

And thinking back to the 90s or the early 2000s: why were there so many people in China at that time trying to circumvent the firewall to reach information outside, and today they don't anymore? Maybe I'm too idealistic, but I still think that holding on to our values, and defense, will turn out to be the best offense for us.

And there are a few things we can do in concrete ways, such as supporting the development of anti-censorship technologies and circumvention tools. But unfortunately, some of the funding that we, this country, provided to the global community to develop these technologies is no longer available.

And the other thing about open source is that I think China does realize the issues, the concerns it should have about open-source technologies. But at this point, I think they are less worried about it backfiring on themselves than they are intent on disrupting the US model of AI. So they chose to do this.

That's why I said China is not a one-sided thing. They really can adapt to different circumstances, and to the different purposes and requirements of the moment, in deciding what to do.
>> Beth Kerley: Thanks very much. And yeah, I do think there is something to that, even if it's now considered passé, right, to recapture the techno-optimism of the 90s, when people thought of technology as inherently liberating, and that original vision of the Internet.

But I do think there's something deeper behind that impulse, something that speaks to a much more enduring thing we see in so many settings around the world: the human instinct for freedom. So it's worth thinking, and there's a lot of thinking to do, about the different strategies we can deploy to get back to that vision, where tech is not a lever for centralized control by those who know best, let alone something that itself knows better than us.

It's a system that gives everybody the right to speak up and have a free voice in their own destiny and in the destiny of their country. Do we have any additional questions online?
>> Speaker 9: A question here from the online audience. While I know it's still early days, I would be curious to hear from the panel if they can speak at all to what the policies of the current US administration might be, as far as we've seen, on the issue set that we've been discussing tonight.


>> Beth Kerley: All right, thanks. Any more questions? Last call: speak now or catch us in the hallway afterwards. All right, since we just have a few minutes left, I'll broaden that a little: what do we think the prospects are in the coming years for both US and global responses to the digital authoritarian challenge? And any closing words that you'd like to leave our guests here at Stanford with, starting with Valentin?


>> Valentin Weber: Yeah, I'll start. Okay, two thoughts, one also on the US administration's policy. I think one interesting example is encryption, where we saw that the current US government is really supporting end-to-end encryption, which is interesting. And there could be a broader coalition on that front with lots of European countries.

And so I think that's one of the issues that could be brought forward. And again, on our vision of what we should do: I think we often have our own conception of what people across the world want, shaped by our own conception of freedom and of what we want, but we often have to think more about what they want.

Going back to the question on going on the offensive: let's say an offensive tactic would be providing information to people under authoritarian governments and bringing that information there. But again, there we have to think like them and ask: what information are they even interested in? What's the demand side?

And that goes back to what Charles said. People often are not interested in what's happening on the West Coast, in West Coast politics, or in Germany, or somewhere else. They're really interested in what's happening on the ground in their own locality. If there was an environmental disaster, they want to know what's happening there.

So if we think about sharing that information, we need to get information on those kinds of events to them. Then there will also be a demand side to the supply that we can potentially provide with anti-censorship tools and things like that.

So I think that's the broader view. We really need to see things as they see them, and not pursue our own conceptions of what freedom might mean on the other side of the world.
>> Beth Kerley: And I'll just foot-stomp that emphasis on anti-censorship technologies as an area where really fascinating technological things are going on right now: the use of satellite tech for anti-censorship, the use of AI for anti-censorship.

So it's a potential opportunity to innovate for democracy. Charles, last words.
>> Charles Mok: Okay. I hope that democracies can develop more coherent strategies and positions around our values. Right now my worry is that policies are too concerned with fighting for technology or AI leadership, with commercial incentives, and so on.

We still have to remember that technologies can be neutral, and that values are often the most important thing shaping our policies. So maybe we could learn a few things from the PRC, in the sense that we should pursue our policies with persistence, according to our values. That is what they do, and we don't do it as well as they do.

Different values, of course.
>> Beth Kerley: All right, thank you, Charles, and so thanks to both of you for this rich and wide-ranging conversation. And thanks again to Hoover for hosting us today. Valentin Weber's report, Data-Centric Authoritarianism, can be found on our website at NED. Thank you all for joining us, and have a fantastic rest of your day.


ABOUT THE SPEAKERS

Beth Kerley is senior program officer with the National Endowment for Democracy’s International Forum for Democratic Studies. She is editor of and contributor to the Forum's series of publications on emerging technologies and democracy, including most recently Data-Centric Authoritarianism and Leveraging AI for Democracy: Civic Innovation on the New Digital Playing Field. She was previously associate and online editor at the Journal of Democracy, and holds a PhD in History from Harvard University.

Charles Mok is a Research Scholar at the Global Digital Policy Incubator of the Cyber Policy Center at Stanford University, a member of the Board of Trustees of the Internet Society, and a board member of the International Centre for Trade Transparency and Monitoring. Charles served as an elected member of the Legislative Council in the Hong Kong Special Administrative Region, representing the Information Technology functional constituency, for two terms from 2012 to 2020. In 2021, he founded Tech for Good Asia, an initiative to advocate positive use of technology for businesses and civil communities. As an entrepreneur, Charles co-founded HKNet in 1994, one of the earliest Internet service providers in Hong Kong, which was acquired by NTT Communications in 2000. He was the founding chair of the Internet Society Hong Kong, honorary president and former president of the Hong Kong Information Technology Federation, former chair of the Hong Kong Internet Service Providers Association, and former chair of the Asian, Australasian and Pacific Islands Regional At-Large Organization (APRALO) of ICANN. Charles holds a BS in Computer and Electrical Engineering and an MS in Electrical Engineering from Purdue University.

Valentin Weber is a senior research fellow with the German Council on Foreign Relations. His research covers the intersection of cybersecurity, artificial intelligence, quantum technologies, and technological spheres of influence. Weber is also a China Foresight Associate at LSE IDEAS, the foreign policy think tank of the London School of Economics and Political Science. He holds a PhD in cyber security from the University of Oxford. Most recently, Weber published an online exclusive with the Journal of Democracy, Why DeepSeek Is So Dangerous.

Larry Diamond is the William L. Clayton Senior Fellow at the Hoover Institution, the Mosbacher Senior Fellow in Global Democracy at the Freeman Spogli Institute for International Studies (FSI), and a Bass University Fellow in Undergraduate Education at Stanford University. He is the founding co-editor of the Journal of Democracy and has written extensively on democratic development worldwide. At Hoover, he co-leads the Project on Taiwan in the Indo-Pacific Region and participates in the Program on the US, China, and the World. At FSI, he is among the core faculty of the Center on Democracy, Development, and the Rule of Law.

Christopher Walker is Vice President for Studies and Analysis at the National Endowment for Democracy, an independent, nonprofit, grant-making foundation supporting freedom around the world. Walker oversees the multidimensional department responsible for NED’s analytical and research efforts. Prior to joining NED, Walker was Vice President for Strategy and Analysis at Freedom House. He has been at the forefront of the thought leadership on modern authoritarian influence, including through the exertion of sharp power, a concept he and his colleagues developed. Walker is co-editor (with Larry Diamond and Marc F. Plattner) of Authoritarianism Goes Global: The Challenge to Democracy (2016), co-editor (with Jessica Ludwig) of Sharp Power: Rising Authoritarian Influence (2017), and a co-editor with William J. Dobson and Tarek Masoud of Defending Democracy in an Age of Sharp Power (2023).
