Sunday, September 3, 2023, 11:16 PM
Posted by Administrator
False claims and disinformation, especially in a social media-driven society, have become major problems with potentially severe consequences. Thanks to an initial seed grant from the Data Institute for Societal Challenges and the Oklahoma Aerospace and Defense Innovation Institute, researchers at OU and collaborating institutions have received a projected $599,947 from the National Science Foundation’s Secure and Trustworthy Cyberspace program to study false claim attacks.
Kash Barker, Ph.D., principal investigator and the John A. Myers Professor in the School of Industrial and Systems Engineering, Gallogly College of Engineering, is leading a team of researchers examining indirect attacks targeting infrastructure systems via unwitting users.
“We’ve seen an increase in the number of incidents of false claims in recent years, and studies suggest that a majority of online users tend to be initially fooled by fake news,” Barker said. “A potentially over-the-horizon problem could occur when these incidents are weaponized by an adversary against America’s infrastructure networks.”
Disinformation can be weaponized to disrupt underlying cyber-physical systems, human lives and economic productivity. Recent examples include tweets that trigger spikes in gasoline prices and false social media posts reporting impending water pumping station shutdowns due to cold temperatures. In these scenarios, chaos is caused because people, not systems or devices, are “hacked.”
“Certain utility companies are now using demand response management systems that allow consumers to play a role in the operation of the electric grid by shifting or reducing their usage,” Barker said. “You can imagine a situation where an adversary sends out information claiming that the electric company is giving away free power during the hottest hours of the day and enticing customers to use as much power as they’d like. This would likely overload the grid’s capacity and cause major problems.”
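The overload scenario Barker describes can be illustrated with a toy back-of-the-envelope calculation. All numbers below (grid capacity, baseline demand, customer count, per-customer surge) are hypothetical placeholders chosen for illustration and are not drawn from the study:

```python
# Toy illustration of the "free power" false-claim scenario.
# All figures are hypothetical, chosen only to show the mechanism.
GRID_CAPACITY_MW = 1000.0    # assumed peak capacity of a small regional grid
BASELINE_DEMAND_MW = 950.0   # assumed normal demand during the hottest hours

def peak_demand(fooled_fraction, surge_per_customer_mw, customers):
    """Demand if a fraction of customers believe the claim and raise usage."""
    return BASELINE_DEMAND_MW + fooled_fraction * customers * surge_per_customer_mw

# Suppose 100,000 customers, 30% fooled, each adding 2 kW (0.002 MW) of load.
demand = peak_demand(fooled_fraction=0.30, surge_per_customer_mw=0.002,
                     customers=100_000)
print(f"Peak demand: {demand:.0f} MW (capacity {GRID_CAPACITY_MW:.0f} MW)")
print("Overload!" if demand > GRID_CAPACITY_MW else "Within capacity")
```

With these assumed numbers, a modest per-household surge pushes total demand to 1,010 MW, past the 1,000 MW capacity: the attack needs no access to grid control systems, only to the customers.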
To combat these weaponized false claims, the researchers will examine the information layer (social media platforms, individual user interactions, etc.) and the physical layer (utilities, transportation networks, and other critical infrastructure). The two layers are intrinsically linked but are also separately vulnerable to potential attacks.
“We can imagine a weaponized false claim attack through the information layer that causes humans to respond in a way that adversely alters the performance of the physical layer,” Barker said. “To combat these attacks and ensure secure cyber-physical systems, we must be able to offer a plan for integrating our research with the educational mission at our universities.”
Barker’s team will work with industry partners throughout this project to bring real-world insights into the research and disseminate findings. Additionally, they anticipate providing outreach activities for undergraduate and graduate students pursuing cyber-physical systems education and research. They are also planning educational offerings for the broader community.
Barker’s co-principal investigators are Andrés González, Ph.D., an assistant professor in the School of Industrial and Systems Engineering; Elena Bessarabova, Ph.D., an associate professor of communication in the Dodge Family College of Arts and Sciences; and Sridhar Radhakrishnan, Ph.D., a professor and the interim associate dean for partnerships for the Gallogly College of Engineering. In 2022, the group received a supply chain research seed grant for their project, “Securing Critical Networks from Weaponized Disinformation Attacks: Initial Surveys,” a precursor to this NSF-awarded research. John Jiang, Ph.D., a professor and the OG&E Endowed Chair Professor in the School of Electrical and Computer Engineering, while not involved in the seed grant, is also assisting on the NSF project, as are collaborators from Rutgers University, Stevens Institute of Technology, and Washington University in St. Louis.