Information Warfare, Radicalisation, and Internal Security: External Influence and Domestic Risk
Information warfare has become a defining feature of modern conflict, driven by rapid advances in artificial intelligence and digital communication. It poses serious risks to both national security and international stability: adversaries use information and communication technology to interfere with an opponent’s decision-making, shape public opinion, compromise democratic processes, and destabilise governments through AI-powered tools deployed on social media platforms.
A particularly alarming development is the expanding strategic role of AI in modern warfare, spanning autonomous weapon systems, intelligence analysis and logistics, and cyber operations.[1] While AI is an inevitable reality of the modern technological era, it raises ethical questions about accountability, bias, and upholding democratic values in the digital age. Non-kinetic warfare brings its own security challenges: AI-powered tools can generate and promulgate misinformation and deepfakes that manipulate public opinion and undermine trust in institutions. Cybersecurity concerns around data privacy and surveillance also demand credible solutions, because external actors can exploit information ecosystems to intensify domestic radicalisation and threaten internal security. The vast amounts of data collected by social media platforms, combined with AI-driven analytics, open the door to information warfare, cryptographic warfare, psychological operations, and electromagnetic offensives.
Algorithmic bias is an emerging problem that can reinforce existing social inequities: AI systems learn from the data they are fed, so biased data produces biased, unfair, or discriminatory results. A comprehensive framework is needed to improve cybersecurity and digital literacy and to implement counter-propaganda strategies, backed by international cooperation, in order to reduce the intricate risks of AI-driven information warfare. Governments, intelligence agencies, and non-state actors increasingly use platforms like Facebook, Twitter, and WhatsApp to spread false information, influence political narratives, and create conflict. Automated bot accounts and organised troll groups help spread harmful and divisive content.[2] For example, China’s “50 Cent Army” is known for posting large volumes of pro-government messages on social media.
Meanwhile, Russia’s Internet Research Agency has conducted social media manipulation on a global scale. Such manipulation can also affect military operations: false information and fake news about ongoing conflicts can shift public opinion, affect recruitment, and even mislead military personnel. During the 2020 India-China military standoff in Ladakh, many false stories spread across digital platforms, complicating diplomatic and strategic responses. Information warfare has transformed internal security by blurring the line between external aggression and domestic unrest, as foreign influence amplifies existing societal cleavages.
Information Warfare
In American studies of non-kinetic strategies and hybrid conflict, Frank G. Hoffman argues that hybrid warfare creates strategic inconsistencies and multi-layered pressure by blending conventional military force with irregular tactics, cyber tools, and information operations.[3] Expanding on this concept, British strategic studies scholar Lawrence Freedman shows how warfare has progressively shifted toward the psychological and informational domains, where influence and perception management are as critical as battlefield dominance.[4] Information warfare can thus be framed as a strategic fusion of military, cyber, and cognitive instruments aimed at shaping political and social outcomes.
Radicalisation
Radicalisation is best understood as a process rather than a single event, unfolding in digital environments where online echo chambers reinforce one-sided narratives and intensify group polarisation. It is a gradual progression in which the use of violence comes to be morally justified on grounds of perceived injustice.[5] Identity politics leads individuals to view political and social issues through the lens of religious, ethnic, or national identity, while social media algorithms prioritise emotionally charged, engaging content, amplifying it and deepening identity-based divisions.[6]
State and non-state actors weaponise narratives to destabilise adversaries without direct military confrontation. The digital ecosystem, above all social media, mobilises grievances, whether real or constructed, and portrays them as collective injustices against a particular group or community, so that personal frustration turns into collective outrage as these actors strategically curate and propagate compelling narratives.[7] Psychologically, such narratives foster an “us versus them” mindset through victimhood framing, symbolic imagery, and selective storytelling. These extreme viewpoints are amplified through identity-based narratives on social media, making contemporary radicalisation deeply intertwined with digital communication structures.
Internal Security
National security threats emerge at the junction of external influence and internal weakness. The United Nations Office on Drugs and Crime has noted that online platforms have become an important channel for recruiting extremists and spreading their ideologies, making it difficult to distinguish foreign influence from domestic vulnerability as the driver of radicalisation.[8] The Red Fort bombing incident that came to light on the 10th of November exposed an online radicalisation strategy and a new ‘white-collar’ terror module, signalling an alarming shift in Pakistan’s cross-border terror strategy: handlers operating from Pakistan and elsewhere are adapting digital means to groom highly educated professionals for anti-India activities.[9] False narratives combined with external manipulation lead to governance paralysis and institutional erosion, gradually weakening the credibility of the judiciary, media independence, electoral bodies, and the legitimacy of law enforcement.
Digital Platforms as Battlespace
For governments around the world, digital platforms have become a battleground, and social media applications the strategic terrain where fake accounts serve as cyber troops spreading anti-government propaganda. Bot networks, built on fake, highly automated accounts designed to mimic human behaviour online, are mainly used to intensify false narratives or spread political dissent within a nation.[10] Cyborg accounts, which blend automation with human curation, are another strategy, adopted as social media companies aggressively take down accounts associated with cyber-troop activity.
The digital battlespace has become even more complicated with deepfakes and AI-generated content: fabricated video, audio, and text can look highly realistic, making it harder for ordinary citizens to distinguish authentic from manipulated information.
According to the U.S. Senate Intelligence Committee (2019), during the 2016 U.S. elections, the Internet Research Agency used fake social media accounts, targeted ads, and identity-based messaging to amplify racial and ideological divisions; these efforts were designed to manipulate public opinion and undermine democratic processes.[11]
Transnational extremist ecosystems serve as echo chambers bringing together online communities, diaspora organisations, and groups with similar ideologies. They spread popular narratives about perceived victimhood, historical injustice, or existential danger, which have the power to radicalise opinions and provide justification for the use of extremist tactics.[12]
Indian and Global Case Studies of Information Conflict
The 2025 India-Pakistan information conflict offers a recent and vivid illustration of how countries wage information warfare in the digital era. Following the terror incident in Jammu & Kashmir, the conflict extended beyond the battlefield into the digital space. Recycled war footage, coordinated hashtag campaigns, and AI-generated deepfake content flooded social media with unverified reports of military action, misleading claims, and doctored visuals, producing psychological and narrative warfare alongside conventional military operations.[13]
In the global arena, the Russia-Ukraine conflict demonstrates how information ecosystems can influence sanctions, military aid, and public opinion worldwide. Often described as a full-scale ‘TikTok war’, it has seen both state and non-state actors spread manipulated videos and misleading battlefield claims across platforms like Telegram, X (formerly Twitter), and TikTok, alongside reported attempts at cyberattacks on government websites and communication systems. To establish narrative control, President Volodymyr Zelenskyy used frequent video addresses and social media diplomacy to maintain global support, while Russian state media framed the war as a defensive “special military operation.”[14]
Technological Factors and Governance Gaps
While foreign actors may initiate influence campaigns, their success largely depends on pre-existing technological, institutional, and social fault lines within a state. Rapid digitalisation has created a vacuum that external actors can exploit. Encrypted platforms provide end-to-end messaging that makes communication anonymous, while pseudonymous accounts and the dark web help hostile actors conceal their identities, making digital ecosystems fertile ground for radicalisation.[15] Vulnerable cybersecurity infrastructure exposes governance gaps: interagency coordination is weak, digital forensic capacity is limited, and legal frameworks struggle to keep pace with evolving digital tools.
Policy & Strategic Responses
Countering systemic disinformation campaigns requires a multi-layered strategy: promoting media literacy, digital civic training, and counter-narrative campaigns that build societal resilience; undertaking capacity-building measures that strengthen cyber defence systems, data transparency mechanisms, and inter-agency coordination; and advancing global cooperation and rules-based digital governance through efforts like the Global Initiative for Information Integrity to address emerging concerns about information warfare.[16]
Efforts to combat information warfare and radicalisation create a core democratic dilemma: counter-radicalisation policies may blur the line between violent extremism and legitimate political dissent, and measures designed to protect national security can simultaneously undermine civil liberties. Ethical governance therefore requires proportionality, transparency, and legal safeguards.
Conclusion
Twenty-first-century warfare is both spatial and cognitive in this dynamic strategic environment. Developing “narrative resilience” is just as important as fortifying physical boundaries; territorial defence and kinetic capacity are no longer sufficient for internal security. Media literacy, institutional legitimacy, and the ability to counter hostile narratives without compromising democratic liberties are all essential components of societal resilience. Networked propaganda shows how both domestic and international actors exploit digital platforms to deepen societal division and undermine democratic confidence. Radicalisation today is heavily shaped by digital means: grievance formation and identity polarisation are accelerated by cross-border ideological networks, algorithm-driven content propagation, and online echo chambers.
Power in the twenty-first century is exercised not only through force but also through persuasion and perception control. Security depends as much on societal trust and shared narratives as on military prowess. As war spreads into the information sphere, nations must acknowledge that protecting borders is insufficient without protecting minds. Information warfare has increasingly blurred the distinction between domestic and foreign hostile actors: in an era of digital connectivity, such actors can penetrate the digital ecosystem directly, and narrative-building propaganda, ranging from psychological manipulation to disinformation campaigns, makes social media platforms strategically contested terrain.
References:
[1] Singh, Prithvi, and Harsh Kumar Sinha. “AI-Powered Information Warfare: The Strategic Impact of Social Media on Modern Conflict Dynamics.” ResearchGate. January 2025. Accessed February 24, 2026. https://www.researchgate.net/publication/393776465_AI_-_Powered_Information_Warfare_The_Strategic_Impact_of_Social_Media_on_Modern_Conflict_Dynamics
[2] Kumar, Pramod. “Social Media Manipulation: A Strategic Tool in Contemporary Information Warfare.” New Man International Journal of Multidisciplinary Studies (NMIJMS) 12, no. 2 (February 2025). Published in the International Conference on Challenges to India’s National Security. Accessed February 24, 2026. https://www.researchgate.net/publication/391450228_Social_Media_Manipulation_A_Strategic_Tool_in_Contemporary_Information_Warfare
[3] Hoffman, Frank G. Conflict in the 21st Century: The Rise of Hybrid Wars. Arlington, VA: Potomac Institute for Policy Studies, December 2007. Accessed February 24, 2026. https://www.potomacinstitute.org/images/stories/publications/potomac_hybridwar_0108.pdf.
[4] Honig, J. A. S. Review of The Future of War, by Lawrence Freedman. International Journal of Nuclear Security 5, no. 1 (2019): Article 8. https://doi.org/10.7290/ijns050108.
[5] Sunstein, Cass R. #Republic: Divided Democracy in the Age of Social Media. Princeton, NJ: Princeton University Press, 2017. https://www.researchgate.net/publication/324548791_Cass_R_Sunstein_Republic_Divided_Democracy_in_the_Age_of_Social_Media_Princeton_NJ_Princeton_University_Press_2017_Pp_xi310_2995
[6] Dahlgren, Peter M. “A Critical Review of Filter Bubbles and a Comparison with Selective Exposure.” Nordicom Review 42, no. 1 (2021): 15–33. https://doi.org/10.2478/nor-2021-0002.
[7] Neumann, Peter R. Countering Violent Extremism and Radicalisation that Lead to Terrorism: Ideas, Recommendations, and Good Practices from the OSCE Region. Report, Organization for Security and Co-operation in Europe, September 28, 2017. https://cdn.osce.org/sites/default/files/f/documents/1/2/346841.pdf.
[8] United Nations Office on Drugs and Crime. UNODC launches report to assist Member States in countering the use of the Internet for terrorist purposes. Vienna: United Nations, October 22, 2012. https://www.unodc.org/unodc/en/frontpage/2012/October/unodc-launches-report-to-assist-member-states-to-counter-the-use-of-the-internet-for-terrorist-purposes.html?ref=fs4.
[9] “Telegram, VPNs and AI Videos: How Pakistani Handlers Trained Doctors from ‘White Collar’ Module; Radicalisation Began in 2019.” The Times of India, November 24, 2025, sec. Delhi News. https://timesofindia.indiatimes.com/city/delhi/encrypted-apps-vpns-ai-videos-white-collar-jem-module-used-telegram-youtube-to-train-doctors-in-delhi-blast-case-radicalisation-began-online-in-2019/articleshow/125532370.cms.
[10] Bradshaw, Samantha, and Philip N. Howard. The Global Disinformation Order: 2019 Global Inventory of Organized Social Media Manipulation. Working Paper 2019.3. Oxford, UK: Project on Computational Propaganda, Oxford Internet Institute, University of Oxford, 2019. https://demtech.oii.ox.ac.uk/wp-content/uploads/sites/12/2019/09/CyberTroop-Report19.pdf.
[11] United States Senate Select Committee on Intelligence. 2020. Report of the Select Committee on Intelligence United States Senate on Russian Active Measures Campaigns and Interference in the 2016 U.S. Election. S. Rpt. 116-290. Washington, DC: U.S. Government Publishing Office. Accessed February 25, 2026. https://www.intelligence.senate.gov/2020/08/18/publications-report-select-committee-intelligence-united-states-senate-russian-active-measures/.
[12] Institute for Strategic Dialogue. The Networks and Narratives of Anti-Refugee Disinformation in Europe. London: Institute for Strategic Dialogue, July 1, 2021. https://www.isdglobal.org/wp-content/uploads/2021/07/The-networks-and-narratives-of-anti-migrant-discourse-in-Europe.pdf
[13] Singha, Anand. “Operation Social Media: The India-Pakistan War Wasn’t Just Over Borders.” The Economic Times, June 7, 2025. Accessed February 26, 2026. https://economictimes.indiatimes.com/news/india/operation-social-media-india-pakistan-cyberwar-misinformation-fake-news-social-media-battle/articleshow/121688632.cms.
[14] Bergengruen, Vera. “Inside the Kremlin’s Year of Ukraine Propaganda.” TIME, August 31, 2023. Accessed February 26, 2026. https://time.com/6257372/russia-ukraine-war-disinformation/.
[15] Tucker, Joshua A., Andrew Guess, Pablo Barberá, Cristian Vaccari, Alexandra Siegel, Sergey Sanovich, Denis Stukal, and Brendan Nyhan. Social Media, Political Polarization, and Political Disinformation: A Review of the Scientific Literature. William and Flora Hewlett Foundation, March 2018. https://www.hewlett.org/wp-content/uploads/2018/03/Social-Media-Political-Polarization-and-Political-Disinformation-Literature-Review.pdf.
[16] United Nations. “Annex I: Global Digital Compact.” Pact for the Future, United Nations, accessed February 26, 2026. https://www.un.org/pact-for-the-future/en/annex-i-global-digital-compact
(The views expressed are those of the author and do not represent the views of CESCUBE)
Photo by Adi Goldstein on Unsplash