The views and opinions expressed or implied in WBY are those of the authors and should not be construed as carrying the official sanction of the Department of Defense, Air Force, Air Education and Training Command, Air University, or other agencies or departments of the US government or their international equivalents.

Wild Blue Yonder on the Air - Ep. 20 - Capt. Jesse Lubove, Dr. Anita Kellogg, and Dr. Ben Krueger on China's Investment in the Data Economy

  • By Capt. Jesse Lubove, Dr. Anita Kellogg, and Dr. Ben Krueger


Capt. Jesse Lubove: Good morning. I'm Jesse Lubove, Captain in the US Air Force, and I'll be leading today's panel on China's Investment in the Data Economy. Thank you, Dr. Sankey and Air University for hosting this panel. Thank you again, Dr. Kellogg and Dr. Krueger for being here. I'm excited about this discussion.

Just a little background on who's on our panel today. Dr. Anita Kellogg earned her Ph.D. in Political Science from the University of California, Los Angeles (UCLA) and is currently a Brzezinski Post-Doctoral Fellow in National Security and Foreign Policy at Johns Hopkins University. She's writing a book that examines the relationship between economic interdependence and military conflict and focuses on the ability of the business community to influence national security policymaking. We also have Dr. Ben Krueger, who earned his Ph.D. in Communication from the University of Maryland and is currently a teaching assistant professor of Communication Studies at the University of Nevada-Reno. His research focuses on U.S. political discourse from the 1960s to the present.

Again, I'm Jesse Lubove. I got my Master's in Public Policy from the University of Maryland, where I focused on the intersection of economic and national security policy. I currently work at the Air Force Technical Applications Center in Florida, where I support nuclear-treaty monitoring and verification efforts. For this topic, there is, I think, a growing consensus among U.S. policymakers that China is and will continue to be the United States' primary geopolitical threat.

However, that relationship is incredibly complex. A big reason why is the tremendous economic ties between these two countries. That economic relationship is no longer limited to physical goods; it extends to the tech sector, one of the key drivers of worldwide economic growth in the past decade. My research on this topic started as a project I worked on while at Squadron Officer School in April of this year. I was interested in how China's growing investment in acquiring and exploiting internet data for commercial purposes could impact national security. Most of us have probably become aware of this with the rise of TikTok, which is owned by a Chinese-based company. As I was doing my research, I came across the term "data economy," which I thought was tremendously helpful for understanding this growing field. The data economy refers to the sector where companies acquire, analyze, and use data from the internet for economic gain.

Most of us are probably familiar with how companies like Google keep track of what we search and then use that information to sell personalized advertisements. Most of us are willing to give up that data in exchange for free internet services. However, as I was doing my research, I came across other examples of how that data is acquired and sold that most of us are not aware of and might not be as okay with. That lack of awareness of how countries like China are using that data to their advantage has, I think, economic and national security consequences for the U.S. That's the background for the panel today.

First, I'm going to hand it over to Dr. Kellogg who will discuss trends in the U.S.-China relationship and how the economic relationship between these two countries impacts national security. Next, I'll cover some specific examples of Chinese investments in internet data and the national security implications of those investments. Then Dr. Krueger will cover how to best communicate technical findings, like the ones presented today, to the general public and policymakers. We'll finish up with a little discussion. Unless there's anything else, with that, over to you, Dr. Kellogg.

Dr. Anita Kellogg: It is such a pleasure to be here. As I was reading this really excellent piece, it made me think that this issue is not just about the international relationship between the U.S. and China, but also has a lot to do with policymaking more generally. We often think of policymaking in China as a completely top-down process, but there is actually a lot of autonomy at the local level. It's really interesting to see the policies, particularly economic policies, that emerge there and are sometimes not in accordance with the central government. There's a reciprocal relationship: the central committee really depends on the local governments to implement policy, and of course the local officials depend on the central government for promotions. But even when it comes to foreign economic policymaking, we see a lot of initiative at the provincial level.

In the conflict between China and Japan over the Senkaku Islands in 2012, there was a large trade boycott of Japan. What we see is that the boycott only lasted four or five months, in part because the provincial governments were losing foreign direct investment from Japan, which certain provinces relied on heavily for employment, which in turn ties to the economic performance by which the central government judges them. You saw a lot of trade coordination: these provincial governments reached out to Japanese business delegations as well as talking to the central government. In fact, foreign direct investment (FDI) is a really interesting point here, because the central government would like less reliance on FDI, but the provincial governments very much encourage FDI and make a lot of these decisions with foreign businesses themselves.

Now, of course, when we're talking specifically about national security issues, that's where you see the central government being very, very effective. You have the case in 2010 between China and Japan, where China blocked exports of rare-earth minerals to Japan for several months. In the end, though, this was another case where they did not manage to keep the embargo in place for very long, and it's disputed what overall effect it had and whether it was simply part of an increase in prices that had already been the subject of a World Trade Organization (WTO) protest that August. But we do see more central intervention in those cases, and we see more central intervention since Xi took office and has tried to rein in some of this loose local policymaking on a lot of different levels.

The tech sector in particular is one example of this. As most of you are probably aware, there has been a tech crackdown over the past year or so. In U.S. headlines we usually think of it as the disappearance from public view of Jack Ma, the head of Alibaba, but mostly it has been about regulations. A lot of these regulations have been about protecting privacy and data. One thing to note is that much of the public, and many businesses, are actually quite happy with some of these regulations. The public themselves appreciate stronger privacy protections. For businesses, regulation is one of the most common trade barriers we talk about: it makes it more difficult for foreign companies to comply and compete, which gives Chinese companies advantages in the domestic market.

Of course, the other side of this for the public is that the increased data improves Chinese AI capabilities and thus the ability to identify them at every level. But, in general, this is typical of what they expect from their central government; reception has been mixed, but more positive than we often portray it in the West. U.S. policymaking, obviously, is very different. One of the big differences is simply the size of lobbying groups and the amount of money they put in. One of the interesting things about the tariff trade war is that, of course, there was a lot of lobbying by U.S. businesses. On the other hand, when we talk about a specific sector or a very specific set of goods, unlike a full trade war or total trade barriers, you have a lot of conflicting interests. But one interesting result was that once those tariffs were put in place, exporters' FDI in China went way up, so that instead of facing those barriers to trade, they were making those products in China and could sell them as part of the domestic market.

One thing that's really relevant too is data protection. The European Union (EU) has passed a lot of data protection regulations, and yet the U.S. does not have a federal program. Part of this is simply that U.S. interest groups are much larger than in other countries; the lobbying industry, for example, is twice the size of the EU's. In the EU, lobbying is usually done at the EU level, because that is where these regulations are set rather than at the national level. In the U.S., you have these lobbying interests both in terms of the money that Chinese tech firms have put into lobbying in the United States and in terms of the data companies themselves who are collecting this data. Although there have definitely been calls to regulate this, those efforts have gotten nowhere. It's important to distinguish this from regulating content; this is the attempt to get regulations for privacy and data.

There are some regulations at the state level, as you mentioned, and California, where I've lived for a long time, is one of them. The dynamics of California policymaking, though, in terms of environmental regulations and others, are very different from those of the national government. It's also really difficult for the U.S. to have public-private coordination on national security issues. In a different study I've been doing, you have the case of oil in World War II, and you would think it would be natural for those companies to be willing to work with the government. But, in actuality, the government had to pass laws and regulations that were very strict, but also very time limited, in order to get that cooperation on national security.

The difficulty you have here, I think, lies largely in domestic policy: not just what the Chinese government is lobbying for in the United States, but also the whole process of U.S. interest groups and all their lobbying as well. Obviously, you have the factors of size and money. These are definitely things that go into formulating policy, and they are why it is harder for the U.S. than for China to implement policies that affect private companies. I hope that was a brief overview that sets the tone for your paper. I'd be really happy to answer any questions or go into more detail on these topics.

Capt. Lubove: Well, thank you so much, Dr. Kellogg. That was perfect. It provided a lot of context on the unique challenges of enacting any federal data privacy legislation in the U.S. Yes, I definitely want to keep building on this. I'll discuss some specific concerns about Chinese investments in the tech and data sphere, the two main national security concerns I see with those investments, and then finally offer some focus areas for where policy should look going forward. As mentioned in the intro, TikTok is probably how most of us became aware of China's growing reach in the tech and data sector. TikTok has over 100 million active users in the U.S. If you hang out with young people today, you probably know that they spend a lot more time on TikTok than on Facebook or Twitter.

If you just read TikTok's terms of service, you can see that they are able to collect detailed biometric data (things like your face and fingerprints), demographic data, detailed location data, and detailed information about your device. But it's not just TikTok; another popular Chinese-owned app in the U.S. is WeChat. In 2020, when the Trump administration tried to essentially force the sale of TikTok and WeChat or ban them, WeChat was also covered under that executive order. WeChat gathers similar levels of data from its users, and it can read any message sent through the app. Then you have Alibaba, which has a cloud computing platform like Amazon's, with over 4 million global customers that have at times included large U.S. companies like Ford and IBM. They're currently under investigation by the U.S. Commerce Department over whether the Chinese government could gain access to private employee data and intellectual property when that data is stored on Alibaba's cloud platform.

I think for most of us, those practices probably seem pretty similar to what U.S. tech companies do; maybe not ideal, but still not surprising. However, in my research I came across another way that Chinese companies can and do collect data on Americans, one that receives a lot less scrutiny. There's a whole, really unregulated industry of third-party actors, sometimes referred to as data brokers, who buy data or sometimes acquire their own. They basically compile data from lots of different sources: from the mobile apps on your phone, from weather apps, from your grocery store's loyalty program, from websites you visit. They compile all that information to provide more useful tools for marketing, but it can also be used for things like intelligence.

There are almost no U.S. government regulations on data brokers. They can be used to bypass some of the limited ways the U.S. does regulate the collection of data. For instance, in 2019, the Committee on Foreign Investment in the U.S., or CFIUS, ordered a Beijing-based tech company to divest its majority stake in Grindr, a popular gay-dating app in the U.S. CFIUS was concerned that the Chinese government could use the app to acquire sensitive data on Americans; the app holds really detailed information that most people would not want out in public, like HIV status, dating habits, and location history. In 2020, that Beijing-based tech company sold its majority stake in Grindr to a U.S.-based investment group. However, the Norwegian government later released a report showing that Grindr was still selling data to 18 different third-party companies. Many of those were data brokers, including Tencent, the owner of WeChat and an incredibly large Chinese tech company. Even though CFIUS forced the sale of Grindr, it could not stop Grindr itself from selling that data to other third parties.

In 2019, The New York Times published a really interesting article about how they were able to use data from data brokers to track individual Secret Service agents protecting the president at the time. This information is really cheap, and almost anyone can acquire it. This is not unique to Chinese companies. Most Americans are willing to accept the trade-offs and give up their data in exchange for free internet services, but I think what is concerning is the close relationship some Chinese tech companies have with the Chinese government. Dr. Kellogg brought up some of the regulations in place in China. Some of them actually do a pretty good job of protecting individual privacy; they have some data regulations, which is more than the U.S. has at this point.

However, other parts of their data protection and national intelligence laws are a little more concerning. They require Chinese companies to help develop domestic surveillance technologies, censor politically sensitive content, and assist with criminal and national security investigations. The U.S. government can also require tech companies to assist with national security or criminal investigations. However, the U.S. process is more transparent, and there is more of an established appeal process, which, as far as I'm aware, does not exist to the same extent in China. You have all these legal ways that China acquires data on Americans, which they can supplement with illegal methods, such as their hacking of the Office of Personnel Management, where they stole over 4 million government employee records, including security-clearance information. The list of hacks goes on and on, from Equifax, with the financial information of, I think, nearly 150 million Americans, to the Marriott hotel chain, where they stole the data of over 500 million guests.

It's probably not surprising that China is illicitly acquiring data as well as investing in the tech and data sphere; it's an incredibly important part of the global economy. However, China's seemingly unrestricted access to U.S. data has, I think, two main national security implications. First, Chinese companies and the Chinese government can use their strength in the data economy to shape the information environment to China's advantage. Most are probably aware of China's Great Firewall, which heavily restricts what Chinese citizens can view on the internet within China. However, there are concerning elements of some of that censorship extending outside of China. For instance, WeChat has blocked the messages of former Chinese citizens and pro-democracy activists now living in the U.S. who use WeChat to message Americans. Their accounts, even though they are American citizens in the U.S., have been blocked and suspended because they post about things like Tiananmen Square.

In another instance, on TikTok in 2019, Vice reported that when they uploaded videos critical of the Chinese government with the tag "Xinjiang," the region where the Uyghurs in China live, TikTok allowed the videos to be posted outside of China, but when a user searched for the term "Xinjiang," those videos didn't show up; only innocuous tourist videos did. That's a more subtle version of censorship, and it's harder to detect. The user might not even recognize how their voice is being restricted. Another area of concern I see is artificial intelligence. The same tech companies that are huge in the data economy are also major leaders in AI. How this works is that having large, high-quality data sets allows people to better train and develop artificial intelligence software, which creates a self-reinforcing cycle: as companies gather more data, it increases their advantages in AI.

For instance, a lot of TikTok's popularity is due to its very addictive AI recommendation algorithm. Because the app is so popular, it continues to gather data on more users, which it can use to increase the effectiveness of that algorithm. I don't think it's necessarily a national security risk in itself that people spend a lot of time on TikTok. Where I do become concerned is that a lot of those AI algorithms and their underlying software, for Chinese companies and for American companies as well, can be used for domestic security purposes as well as national security purposes. For instance, Alibaba has a feature that uses AI facial recognition to authorize mobile payments. Alibaba also advertised how it uses that same software as part of China's smart surveillance cities. They even mentioned how they were able to use the software to distinguish between the faces of Uyghur and Han Chinese to assist Chinese authorities in the repression of Uyghurs in Xinjiang.

Much of China's use of AI and data is currently focused on ensuring internal stability, as in Xinjiang. However, that same information technology is being exported as part of China's smart cities and can be used to identify U.S. personnel operating in other countries. For instance, in 2020 Foreign Policy reported that China used stolen data to uncover CIA officers operating in Africa and Europe. That threat will only continue to grow as China improves its AI technology, gathers more data, and continues to export that technology abroad. As I wrap up, Dr. Kellogg did such a good job discussing why it's so difficult to enact policy in this area, but I think there are two areas I would primarily focus on that might be less controversial and less susceptible to lobbying efforts.

One is providing more transparency about what those third-party actors, those data brokers, do. Before I did this research, I was unaware of that industry, how powerful it is, and how much data it holds. When we say yes to the terms of service, we do give up that data, but most of us never read those terms. There's probably some room for legislation to at least provide more transparency or some basic restrictions there. The second area is to at least limit the data that can be sold by third-party companies to foreign companies, especially companies in places like China and Russia. The U.S. already has a robust export control regime in areas like missile technology. I think on national security grounds you could make a similar argument for data, particularly the sensitive data that apps like Grindr collect.

China passed a data privacy law in 2021 that emphasized the national security risks of data. It imposes strict restrictions on how foreign companies can move the data of Chinese citizens out of China. I'm not necessarily suggesting a law similar to China's, but unless the U.S. government takes action, I do think China will continue to benefit from unrestricted access to U.S. data while restricting access to similar data on its own citizens. Those worrying trends in the information environment and artificial intelligence will continue to grow. With that, I'll hand it over to Dr. Krueger for your discussion.

Dr. Ben Krueger: Good morning, thank you. By way of quick background, I should say that rather than being a media systems theorist or a media regulatory specialist, my research focuses on the study of public discourse and how it circulates within the public sphere. In other words, I look at the language that surrounds political issues and how people in specific communities engage with that language to accomplish specific objectives. Captain Lubove's paper draws our attention to a significant communication problem with clear ramifications for both U.S. national security and U.S. domestic politics. There are two themes I want to address in my comments today. First, I want to address the implications of this research for how we understand the media information environment in the contemporary United States. Second, I want to address the translational piece, that is, how we explain these findings in a way that makes them more impactful to the general U.S. public and to policymakers.

With that in mind, let me dive in. The first general theme to note is that issues of data privacy and social media represent a growing problem within the global media landscape, especially within the United States. Dr. Kellogg and Captain Lubove have already addressed the policymaking and national security implications of this in quite a bit of detail. I want to talk for a moment about the implications for how we think about communication and media. First, let me begin by noting that there has unfortunately been a tendency in some media research to treat social media apps as simply an extension of earlier technological innovations, like televised cable news and internet news sites, and to say, well, the research we've done on those previous innovations in communication still applies. That misses the fact that there are some pretty transformational changes taking place here that fundamentally change the way people interface with mass media in daily life, related to the micro-targeting of audiences and the ability of private corporations to track what users are doing with these apps.

This brings me to the central problem posed by social media apps, which is what José van Dijck of the University of Amsterdam and other researchers have described as a blurring between the public and personal spheres of deliberation. In other words, I am no longer just a private citizen reading a newspaper in the privacy of my own home. By having a Facebook page or a TikTok account, I'm entering the public sphere on some level and, as a result, making myself subject to scrutiny. I suspect this doesn't come as a surprise to anyone on this panel. Yet the issue is that, as a society, we aren't good at recognizing the implications of these blurred spheres.

Just as an example, my undergraduate students here at the University of Nevada very much want to believe that what they're doing on social media is private to them. They're usually quite shocked when I point out that I can find their accounts and point to things they are doing. Why is this? I don't have a very clear answer, but I do have one speculative idea that I want to share. In 1982, G. Thomas Goodnight, who is now a professor in the Annenberg School at the University of Southern California, wrote an influential essay about the personal, technical, and public spheres of argument, in which he makes the point that any time we find ourselves involved in an argumentative controversy related to public policy, there are implications at three different levels.

The first level would be what Goodnight calls the personal sphere, how people make sense of an issue at an individual level around the dinner table, if you will. The technical sphere, how communities of experts adjudicate issues using the specialized knowledge of their field. Finally, the public sphere, how society at large uses social and political processes to try to adjudicate these issues or disputes. One of the points that Goodnight made over 40 years ago now is that the technical and public spheres will always compete with each other for dominance. Technical experts will often have solutions in mind for particular problems, but those have to be translated in a way that the public will listen to. Public leaders don't always want to listen to what the technical experts have to say. 

With regards to where we are now, I think we have a very clear technical understanding of the fact that data privacy on social media apps is a huge problem and that something needs to be done about it, but because of the blurring of the public and personal spheres, I think what we're seeing is a knee-jerk reaction from some quarters that says that regulating this data would somehow be government overreach. We have to do a better job of selling the need for data privacy to the general public and to policymakers in particular. This brings me to the second thing I want to address today, which is how we can do a better job of translating technical findings about data privacy in a way that the U.S. public and policymakers will understand. I don't have any magic solutions here. If I did, I would probably be somewhere other than academia and making way more money than I do in higher education, but I do have a few ideas drawn from the study of rhetoric or the art of persuasion that I want to share. 

The first recommendation is that we need to use a rhetorical concept that communication theorists call presence; in other words, we need to make the urgency of the problem front and center in the minds of listeners. To do this, we need to remember that we are speaking to non-experts. This requires that we abandon technical language when explaining the problem and instead use rhetorical devices that are recognizable to general audiences. There are two rhetorical devices I think are useful to this study. The first is metaphor. Can we compare data security policy to some other policy that general audiences are already familiar with? Captain Lubove, you briefly mentioned HIPAA and medical privacy in your paper. I wonder if we might use other forms of privacy, specifically medical privacy and personal financial privacy, as a starting point for talking about the need for privacy on social media, and use that as an entry point, if you will, for gaining the attention of general audiences.

The other area where I think that we have room to communicate technical findings to general audiences is through visual images. Like the saying goes, a picture is worth a thousand words. Is there some way to create a compelling visual image or graphic that compares U.S. data policy to Chinese data policy? There have been some interesting studies done in argumentation theory, which have found that in terms of creating rhetorical presence, that is making a problem or issue front and center in the minds of audiences, visual images get the job done faster than verbal language does. For the purposes of brevity, I'll stop here but I look forward to discussing the issue further in the discussion section of this panel. 

Capt. Lubove: Thank you so much, Dr. Krueger. That was, I think, an incredibly helpful way to frame it, particularly on how to increase the impact of some of these findings. I have a few questions, but before that, I wanted to open it up to Dr. Kellogg and Dr. Krueger, if either of you had anything you wanted to touch on further or any questions of your own. I'll start with Dr. Kellogg, please.

Dr. Kellogg: Yeah, as I was listening to both of you, particularly Dr. Krueger, one of the things that struck me is that the recommendation seems so straightforward: at least you could have export controls on the sale of data to foreign companies in countries like China, and wrap that up, somewhat problematically, in the rhetoric of anti-China policy in general. But when it comes down to export control policies, usually some portion of domestic business benefits. Just as I talked about Chinese tech companies benefiting from a lot of the regulations the Chinese government has implemented, gaining market share and even building themselves up to be internationally competitive, the United States simply putting controls on who companies can sell data to limits profits rather than benefiting any part of the industry. The government would have to create some sort of buy-in. I think that's the problem, and one that would certainly have to be overcome.

Just in general, another problem with data regulation, if you talk to anyone in the tech industry, and of course this goes to the rhetoric, is that they're very aware of all these problems. But even though these issues come up in the news, and Facebook lets you see all the information it gathers on you and how it's used, most people are not actually engaging with that. The culture in the U.S. is so vehemently against any government control of information and centralized information, and in general really favors private enterprise. This is why, as I said before, the comparison should not just be to China but also to the EU, because there you have a democratic capitalist system, and you can still see the difference in cultures and how people view data privacy. Definitely in trade, thinking about who benefits is a huge problem. Like I said, companies don't cooperate purely for national interests. They are always going to prioritize profits. That was just what I was thinking.

Dr. Krueger: Yeah. If I can jump in here for a minute, this highlights an important rhetorical problem, which is that in the United States the general public thinks of social media companies like Facebook or Twitter or TikTok as public utilities, or thinks of them as neutral, and they're not. They have a specific agenda to make profit, which means that they're often not transparent about their data practices. I think countering the perception that these companies are neutral is an important part of how we address this issue.

Capt. Lubove: Yeah. In my research, I also looked at some of the EU's data-privacy regulations. I went back and forth on how useful that model is for the U.S. to follow. Where I lean is this: because the EU is such a large market, most U.S. tech companies have a pretty big presence there, and if we don't create data regulations of our own, U.S. tech companies will be forced to play by the de facto global rules, which will be the EU's and China's, without having much of a say in shaping those rules and norms. I think it's a trade-off similar to the world I work in now, arms-control treaties, where legislation places restrictions on you, but when you're not at the table, you lose a lot of power in setting the path forward that everybody has to play by.

I did have a question too, because for me it's natural to frame these issues through a national-security lens. In some ways I think that might be helpful, because countering China is an area of bipartisan consensus today, when there aren't many examples of that. However, sometimes I don't know if it's the best framing, because national-security issues, particularly ones like this that get technical and in the weeds and focus largely on potential future harm, aren't always salient to voters. Voters in general tend not to vote on foreign-policy issues. I'd be curious to get both of your takes on whether there might be a better way to frame this.

Dr. Kellogg: I just wanted to clarify: I have been following the EU regulations. What I meant was understanding the process of how those regulations get made, and why the EU has this stricter and more centralized approach across a broader space of countries, which would be even harder than across the States. It definitely goes to the culture and some of the obstacles that we have in the U.S. I meant, more like, we could have more regulations, but why don't we have more of them like the EU? The thing is, who's going to be willing to pay, right?

If companies are losing money, the government has actually been very unwilling to put investment directly into building different capabilities, such as rare-earth minerals and their processing. It's been very difficult to get funding. I just think your obstacles really come down to money. One way of phrasing it is to focus on these companies as private enterprises with their own profits, but the struggle is that people have tried to do that, and have been trying with Facebook and Twitter, and have not been successful. I hate to say this, because while it's not necessarily that salient for voters, and, like I said, I have a lot of problems with the anti-China rhetoric and its trade consequences, I do think that might be the most successful approach: simply limiting the selling of the data. I'm not sure there's much else you'll really get regulations for, but I think you might be a lot more successful there, particularly if you frame it in that way, just to get popular buy-in.

Dr. Krueger: Related to the national-security frame, I would say that people are not typically going to vote on things like arms-control policy or international policies that seem very abstract to them. But maybe rather than abandoning the national-security frame entirely, we pivot what is in that frame a little, to make it not just national security but security in general. Even if people are not going to vote on arms-control policy, people are worried. People do pay attention to things they perceive as threatening their own personal security. Again, as I mentioned earlier, I think there are opportunities related to personal privacy, with things like health privacy or personal financial privacy, that could potentially interface with the national-security frame if you can show how the two are related.

Capt. Lubove: That's very helpful. Dr. Kellogg, one of the reasons my paper focused on the export-control angle is that most of the largest data brokers operating in the U.S. are U.S. companies, and they sell to other U.S. companies. That gets at what you effectively talked about: who would be losing with these restrictions. Of the 18 companies that Grindr sold data to, only one was Chinese. I think there would be ways where the economic impact, particularly for some of those data brokers, wouldn't be severe.

It would also be a way to get relatively bipartisan consensus, because there's a lot of hesitancy toward strong, federal, EU-style data-privacy regulations in the U.S. Instead we are getting different state-by-state regimes, and I think the social media companies are already recognizing that they would like some federal legislation. There are drawbacks when every state gets to make its own regulations on very different things, from what Florida and Texas might focus on to what California focuses on, and having to comply with 50 different regulatory frameworks becomes increasingly difficult.

Dr. Kellogg: Just one point. You said that only one of those companies was Chinese, but the problem is more whom the data brokers are selling to, right? They could all be American, but they're selling to overseas markets. How do you substitute that money or get buy-in in that sense?

Capt. Lubove: Yes. What I meant is that Grindr was selling its data to 18 different data brokers, and one of those, Tencent, was the only Chinese company; Tencent could then have sold it on to hundreds more Chinese companies. You do lose that revenue, but the initial app that makes money by selling data to third-party actors would presumably still have 17 of its 18 customers if you limited only Chinese companies. I'm not saying it's perfect. They would probably lose some money, but it seems like the least intrusive way to limit some of the worst consequences in an industry that lacks a lot of transparency.

Dr. Kellogg: Yeah. I just think there has to be pressure; they have to feel like they're benefiting, maybe from the public, where they actually get some public-relations benefit. But the data brokerages in particular, I think, are difficult, because most people don't even realize that it's not just the apps collecting data. The apps sell that data to data brokers who then, if I'm understanding your argument right, sell it on a larger scale to different governments. Attitudes, even my own, are also sometimes more lackadaisical because of hacking. When Equifax was hacked, it's like, what do you do, right? They had all this information, they're this big credit bureau, and it was impossible to protect that information anyway.

Of course, it's not just China hacking; it's Russia as well. I just think those are some of the things that make this harder to overcome, but mostly I think it's the money. Whether through the PR recognition they think they'll benefit from or direct compensation, even a small loss of money has to be offset if you're going to have effective policy. I think this is great. You have made such a straightforward argument that it was convincing to me from the very beginning. I think this argument should be out there, and I really do hope it has an effect. All I'm trying to contribute is some basic knowledge of trade and the difficulties of regulation, but I absolutely agree with it. I think it might just need some creative thinking on how to get buy-in from the industries.

Capt. Lubove: That's incredibly helpful. I think so too; it's why the bulk of my paper was spent on identifying some of the threats. It's always tough coming up with realistic policy recommendations, because a lot of the time there's a reason regulations haven't been put in place already. Dr. Krueger, did you have anything else to add before we wrap up?

Dr. Krueger: Nothing that I can think of. There are a lot of very important policy details that you are teasing out here that I don't have the policymaking expertise to comment on, but I think that, as Dr. Kellogg said, you make a persuasive case that there should be regulation of what happens to this data, and of data brokers in particular.

Capt. Lubove: Thank you. I appreciate that. I just want to say thank you again, both of you, I really enjoyed this, and your insights were invaluable.

Dr. Kellogg: I hope this has a public distribution. I'm not sure how exactly this process works, but presenting your argument in a way the public sees is, I think, really important. I hope it gets that dissemination.

Capt. Lubove: Thank you again for this lovely panel discussion.

Dr. Kellogg: Thanks. It was such a great honor to be here and to participate in this. Thank you.

Dr. Krueger: Thank you. It was good to meet you all.
