500+ Computer Science Research Topics

Computer Science is a constantly evolving field that has transformed the world we live in today. With new technologies emerging every day, research opportunities abound, whether your interest lies in artificial intelligence, machine learning, cybersecurity, data analytics, or computer networks. In this post, we delve into some of the most interesting and important research topics in Computer Science, from advances in programming languages to the development of cutting-edge algorithms, and survey the trends and innovations shaping the field's future. Whether you are a student or a professional, read on to discover some of the most exciting research topics in this dynamic and rapidly expanding field.

Computer Science Research Topics are as follows:

  • Using machine learning to detect and prevent cyber attacks
  • Developing algorithms for optimized resource allocation in cloud computing
  • Investigating the use of blockchain technology for secure and decentralized data storage
  • Developing intelligent chatbots for customer service
  • Investigating the effectiveness of deep learning for natural language processing
  • Developing algorithms for detecting and removing fake news from social media
  • Investigating the impact of social media on mental health
  • Developing algorithms for efficient image and video compression
  • Investigating the use of big data analytics for predictive maintenance in manufacturing
  • Developing algorithms for identifying and mitigating bias in machine learning models
  • Investigating the ethical implications of autonomous vehicles
  • Developing algorithms for detecting and preventing cyberbullying
  • Investigating the use of machine learning for personalized medicine
  • Developing algorithms for efficient and accurate speech recognition
  • Investigating the impact of social media on political polarization
  • Developing algorithms for sentiment analysis in social media data
  • Investigating the use of virtual reality in education
  • Developing algorithms for efficient data encryption and decryption
  • Investigating the impact of technology on workplace productivity
  • Developing algorithms for detecting and mitigating deepfakes
  • Investigating the use of artificial intelligence in financial trading
  • Developing algorithms for efficient database management
  • Investigating the effectiveness of online learning platforms
  • Developing algorithms for efficient and accurate facial recognition
  • Investigating the use of machine learning for predicting weather patterns
  • Developing algorithms for efficient and secure data transfer
  • Investigating the impact of technology on social skills and communication
  • Developing algorithms for efficient and accurate object recognition
  • Investigating the use of machine learning for fraud detection in finance
  • Developing algorithms for efficient and secure authentication systems
  • Investigating the impact of technology on privacy and surveillance
  • Developing algorithms for efficient and accurate handwriting recognition
  • Investigating the use of machine learning for predicting stock prices
  • Developing algorithms for efficient and secure biometric identification
  • Investigating the impact of technology on mental health and well-being
  • Developing algorithms for efficient and accurate language translation
  • Investigating the use of machine learning for personalized advertising
  • Developing algorithms for efficient and secure payment systems
  • Investigating the impact of technology on the job market and automation
  • Developing algorithms for efficient and accurate object tracking
  • Investigating the use of machine learning for predicting disease outbreaks
  • Developing algorithms for efficient and secure access control
  • Investigating the impact of technology on human behavior and decision making
  • Developing algorithms for efficient and accurate sound recognition
  • Investigating the use of machine learning for predicting customer behavior
  • Developing algorithms for efficient and secure data backup and recovery
  • Investigating the impact of technology on education and learning outcomes
  • Developing algorithms for efficient and accurate emotion recognition
  • Investigating the use of machine learning for improving healthcare outcomes
  • Developing algorithms for efficient and secure supply chain management
  • Investigating the impact of technology on cultural and societal norms
  • Developing algorithms for efficient and accurate gesture recognition
  • Investigating the use of machine learning for predicting consumer demand
  • Developing algorithms for efficient and secure cloud storage
  • Investigating the impact of technology on environmental sustainability
  • Developing algorithms for efficient and accurate voice recognition
  • Investigating the use of machine learning for improving transportation systems
  • Developing algorithms for efficient and secure mobile device management
  • Investigating the impact of technology on social inequality and access to resources
  • Machine learning for healthcare diagnosis and treatment
  • Machine Learning for Cybersecurity
  • Machine learning for personalized medicine
  • Cybersecurity threats and defense strategies
  • Big data analytics for business intelligence
  • Blockchain technology and its applications
  • Human-computer interaction in virtual reality environments
  • Artificial intelligence for autonomous vehicles
  • Natural language processing for chatbots
  • Cloud computing and its impact on the IT industry
  • Internet of Things (IoT) and smart homes
  • Robotics and automation in manufacturing
  • Augmented reality and its potential in education
  • Data mining techniques for customer relationship management
  • Computer vision for object recognition and tracking
  • Quantum computing and its applications in cryptography
  • Social media analytics and sentiment analysis
  • Recommender systems for personalized content delivery
  • Mobile computing and its impact on society
  • Bioinformatics and genomic data analysis
  • Deep learning for image and speech recognition
  • Digital signal processing and audio processing algorithms
  • Cloud storage and data security in the cloud
  • Wearable technology and its impact on healthcare
  • Computational linguistics for natural language understanding
  • Cognitive computing for decision support systems
  • Cyber-physical systems and their applications
  • Edge computing and its impact on IoT
  • Machine learning for fraud detection
  • Cryptography and its role in secure communication
  • Cybersecurity risks in the era of the Internet of Things
  • Natural language generation for automated report writing
  • 3D printing and its impact on manufacturing
  • Virtual assistants and their applications in daily life
  • Cloud-based gaming and its impact on the gaming industry
  • Computer networks and their security issues
  • Cyber forensics and its role in criminal investigations
  • Machine learning for predictive maintenance in industrial settings
  • Augmented reality for cultural heritage preservation
  • Human-robot interaction and its applications
  • Data visualization and its impact on decision-making
  • Cybersecurity in financial systems and blockchain
  • Computer graphics and animation techniques
  • Biometrics and its role in secure authentication
  • Cloud-based e-learning platforms and their impact on education
  • Natural language processing for machine translation
  • Machine learning for predictive maintenance in healthcare
  • Cybersecurity and privacy issues in social media
  • Computer vision for medical image analysis
  • Natural language generation for content creation
  • Cybersecurity challenges in cloud computing
  • Human-robot collaboration in manufacturing
  • Data mining for predicting customer churn
  • Artificial intelligence for autonomous drones
  • Cybersecurity risks in the healthcare industry
  • Machine learning for speech synthesis
  • Edge computing for low-latency applications
  • Virtual reality for mental health therapy
  • Quantum computing and its applications in finance
  • Biomedical engineering and its applications
  • Cybersecurity in autonomous systems
  • Machine learning for predictive maintenance in transportation
  • Computer vision for object detection in autonomous driving
  • Augmented reality for industrial training and simulations
  • Cloud-based cybersecurity solutions for small businesses
  • Natural language processing for knowledge management
  • Machine learning for personalized advertising
  • Cybersecurity in supply chain management
  • Cybersecurity risks in the energy sector
  • Computer vision for facial recognition
  • Natural language processing for social media analysis
  • Machine learning for sentiment analysis in customer reviews
  • Explainable Artificial Intelligence
  • Quantum Computing
  • Blockchain Technology
  • Human-Computer Interaction
  • Natural Language Processing
  • Cloud Computing
  • Robotics and Automation
  • Augmented Reality and Virtual Reality
  • Cyber-Physical Systems
  • Computational Neuroscience
  • Big Data Analytics
  • Computer Vision
  • Cryptography and Network Security
  • Internet of Things
  • Computer Graphics and Visualization
  • Artificial Intelligence for Game Design
  • Computational Biology
  • Social Network Analysis
  • Bioinformatics
  • Distributed Systems and Middleware
  • Information Retrieval and Data Mining
  • Computer Networks
  • Mobile Computing and Wireless Networks
  • Software Engineering
  • Database Systems
  • Parallel and Distributed Computing
  • Human-Robot Interaction
  • Intelligent Transportation Systems
  • High-Performance Computing
  • Cyber-Physical Security
  • Deep Learning
  • Sensor Networks
  • Multi-Agent Systems
  • Human-Centered Computing
  • Wearable Computing
  • Knowledge Representation and Reasoning
  • Adaptive Systems
  • Brain-Computer Interface
  • Health Informatics
  • Cognitive Computing
  • Cybersecurity and Privacy
  • Internet Security
  • Cybercrime and Digital Forensics
  • Cloud Security
  • Cryptocurrencies and Digital Payments
  • Machine Learning for Natural Language Generation
  • Cognitive Robotics
  • Neural Networks
  • Semantic Web
  • Image Processing
  • Cyber Threat Intelligence
  • Secure Mobile Computing
  • Cybersecurity Education and Training
  • Privacy Preserving Techniques
  • Cyber-Physical Systems Security
  • Virtualization and Containerization
  • Machine Learning for Computer Vision
  • Network Function Virtualization
  • Cybersecurity Risk Management
  • Information Security Governance
  • Intrusion Detection and Prevention
  • Biometric Authentication
  • Machine Learning for Predictive Maintenance
  • Security in Cloud-based Environments
  • Cybersecurity for Industrial Control Systems
  • Smart Grid Security
  • Software Defined Networking
  • Quantum Cryptography
  • Security in the Internet of Things
  • Natural language processing for sentiment analysis
  • Blockchain technology for secure data sharing
  • Developing efficient algorithms for big data analysis
  • Cybersecurity for internet of things (IoT) devices
  • Human-robot interaction for industrial automation
  • Image recognition for autonomous vehicles
  • Social media analytics for marketing strategy
  • Quantum computing for solving complex problems
  • Biometric authentication for secure access control
  • Augmented reality for education and training
  • Intelligent transportation systems for traffic management
  • Predictive modeling for financial markets
  • Cloud computing for scalable data storage and processing
  • Virtual reality for therapy and mental health treatment
  • Data visualization for business intelligence
  • Recommender systems for personalized product recommendations
  • Speech recognition for voice-controlled devices
  • Mobile computing for real-time location-based services
  • Neural networks for predicting user behavior
  • Genetic algorithms for optimization problems
  • Distributed computing for parallel processing
  • Internet of things (IoT) for smart cities
  • Wireless sensor networks for environmental monitoring
  • Cloud-based gaming for high-performance gaming
  • Social network analysis for identifying influencers
  • Autonomous systems for agriculture
  • Robotics for disaster response
  • Data mining for customer segmentation
  • Computer graphics for visual effects in movies and video games
  • Virtual assistants for personalized customer service
  • Natural language understanding for chatbots
  • 3D printing for manufacturing prototypes
  • Artificial intelligence for stock trading
  • Machine learning for weather forecasting
  • Biomedical engineering for prosthetics and implants
  • Cybersecurity for financial institutions
  • Machine learning for energy consumption optimization
  • Computer vision for object tracking
  • Natural language processing for document summarization
  • Wearable technology for health and fitness monitoring
  • Internet of things (IoT) for home automation
  • Reinforcement learning for robotics control
  • Big data analytics for customer insights
  • Machine learning for supply chain optimization
  • Natural language processing for legal document analysis
  • Artificial intelligence for drug discovery
  • Computer vision for object recognition in robotics
  • Data mining for customer churn prediction
  • Autonomous systems for space exploration
  • Robotics for agriculture automation
  • Machine learning for predicting earthquakes
  • Natural language processing for sentiment analysis in customer reviews
  • Big data analytics for predicting natural disasters
  • Internet of things (IoT) for remote patient monitoring
  • Blockchain technology for digital identity management
  • Machine learning for predicting wildfire spread
  • Computer vision for gesture recognition
  • Natural language processing for automated translation
  • Big data analytics for fraud detection in banking
  • Internet of things (IoT) for smart homes
  • Robotics for warehouse automation
  • Machine learning for predicting air pollution
  • Natural language processing for medical record analysis
  • Augmented reality for architectural design
  • Big data analytics for predicting traffic congestion
  • Machine learning for predicting customer lifetime value
  • Developing algorithms for efficient and accurate text recognition
  • Natural Language Processing for Virtual Assistants
  • Natural Language Processing for Sentiment Analysis in Social Media
  • Explainable Artificial Intelligence (XAI) for Trust and Transparency
  • Deep Learning for Image and Video Retrieval
  • Edge Computing for Internet of Things (IoT) Applications
  • Data Science for Social Media Analytics
  • Cybersecurity for Critical Infrastructure Protection
  • Natural Language Processing for Text Classification
  • Quantum Computing for Optimization Problems
  • Machine Learning for Personalized Health Monitoring
  • Computer Vision for Autonomous Driving
  • Blockchain Technology for Supply Chain Management
  • Augmented Reality for Education and Training
  • Natural Language Processing for Sentiment Analysis
  • Machine Learning for Personalized Marketing
  • Big Data Analytics for Financial Fraud Detection
  • Cybersecurity for Cloud Security Assessment
  • Artificial Intelligence for Natural Language Understanding
  • Blockchain Technology for Decentralized Applications
  • Virtual Reality for Cultural Heritage Preservation
  • Natural Language Processing for Named Entity Recognition
  • Machine Learning for Customer Churn Prediction
  • Big Data Analytics for Social Network Analysis
  • Cybersecurity for Intrusion Detection and Prevention
  • Artificial Intelligence for Robotics and Automation
  • Blockchain Technology for Digital Identity Management
  • Virtual Reality for Rehabilitation and Therapy
  • Natural Language Processing for Text Summarization
  • Machine Learning for Credit Risk Assessment
  • Big Data Analytics for Fraud Detection in Healthcare
  • Cybersecurity for Internet Privacy Protection
  • Artificial Intelligence for Game Design and Development
  • Blockchain Technology for Decentralized Social Networks
  • Virtual Reality for Marketing and Advertising
  • Natural Language Processing for Opinion Mining
  • Machine Learning for Anomaly Detection
  • Big Data Analytics for Predictive Maintenance in Transportation
  • Cybersecurity for Network Security Management
  • Artificial Intelligence for Personalized News and Content Delivery
  • Blockchain Technology for Cryptocurrency Mining
  • Virtual Reality for Architectural Design and Visualization
  • Natural Language Processing for Machine Translation
  • Machine Learning for Automated Image Captioning
  • Big Data Analytics for Stock Market Prediction
  • Cybersecurity for Biometric Authentication Systems
  • Artificial Intelligence for Human-Robot Interaction
  • Blockchain Technology for Smart Grids
  • Virtual Reality for Sports Training and Simulation
  • Natural Language Processing for Question Answering Systems
  • Machine Learning for Sentiment Analysis in Customer Feedback
  • Big Data Analytics for Predictive Maintenance in Manufacturing
  • Cybersecurity for Cloud-Based Systems
  • Artificial Intelligence for Automated Journalism
  • Blockchain Technology for Intellectual Property Management
  • Virtual Reality for Therapy and Rehabilitation
  • Natural Language Processing for Language Generation
  • Machine Learning for Customer Lifetime Value Prediction
  • Big Data Analytics for Predictive Maintenance in Energy Systems
  • Cybersecurity for Secure Mobile Communication
  • Artificial Intelligence for Emotion Recognition
  • Blockchain Technology for Digital Asset Trading
  • Virtual Reality for Automotive Design and Visualization
  • Natural Language Processing for Semantic Web
  • Machine Learning for Fraud Detection in Financial Transactions
  • Big Data Analytics for Social Media Monitoring
  • Cybersecurity for Cloud Storage and Sharing
  • Artificial Intelligence for Personalized Education
  • Blockchain Technology for Secure Online Voting Systems
  • Virtual Reality for Cultural Tourism
  • Natural Language Processing for Chatbot Communication
  • Machine Learning for Medical Diagnosis and Treatment
  • Big Data Analytics for Environmental Monitoring and Management
  • Cybersecurity for Cloud Computing Environments
  • Virtual Reality for Training and Simulation
  • Big Data Analytics for Sports Performance Analysis
  • Cybersecurity for Internet of Things (IoT) Devices
  • Artificial Intelligence for Traffic Management and Control
  • Blockchain Technology for Smart Contracts
  • Natural Language Processing for Document Summarization
  • Machine Learning for Image and Video Recognition
  • Blockchain Technology for Digital Asset Management
  • Virtual Reality for Entertainment and Gaming
  • Natural Language Processing for Opinion Mining in Online Reviews
  • Machine Learning for Customer Relationship Management
  • Big Data Analytics for Environmental Monitoring and Management
  • Cybersecurity for Network Traffic Analysis and Monitoring
  • Artificial Intelligence for Natural Language Generation
  • Blockchain Technology for Supply Chain Transparency and Traceability
  • Virtual Reality for Design and Visualization
  • Natural Language Processing for Speech Recognition
  • Machine Learning for Recommendation Systems
  • Big Data Analytics for Customer Segmentation and Targeting
  • Cybersecurity for Biometric Authentication
  • Artificial Intelligence for Human-Computer Interaction
  • Blockchain Technology for Decentralized Finance (DeFi)
  • Virtual Reality for Tourism and Cultural Heritage
  • Machine Learning for Cybersecurity Threat Detection and Prevention
  • Big Data Analytics for Healthcare Cost Reduction
  • Cybersecurity for Data Privacy and Protection
  • Artificial Intelligence for Autonomous Vehicles
  • Blockchain Technology for Cryptocurrency and Blockchain Security
  • Virtual Reality for Real Estate Visualization
  • Natural Language Processing for Question Answering
  • Big Data Analytics for Financial Markets Prediction
  • Cybersecurity for Cloud-Based Machine Learning Systems
  • Artificial Intelligence for Personalized Advertising
  • Blockchain Technology for Digital Identity Verification
  • Virtual Reality for Cultural and Language Learning
  • Natural Language Processing for Semantic Analysis
  • Machine Learning for Business Forecasting
  • Big Data Analytics for Social Media Marketing
  • Artificial Intelligence for Content Generation
  • Blockchain Technology for Smart Cities
  • Virtual Reality for Historical Reconstruction
  • Natural Language Processing for Knowledge Graph Construction
  • Machine Learning for Speech Synthesis
  • Big Data Analytics for Traffic Optimization
  • Artificial Intelligence for Social Robotics
  • Blockchain Technology for Healthcare Data Management
  • Virtual Reality for Disaster Preparedness and Response
  • Natural Language Processing for Multilingual Communication
  • Machine Learning for Emotion Recognition
  • Big Data Analytics for Human Resources Management
  • Cybersecurity for Mobile App Security
  • Artificial Intelligence for Financial Planning and Investment
  • Blockchain Technology for Energy Management
  • Virtual Reality for Cultural Preservation and Heritage
  • Big Data Analytics for Healthcare Management
  • Cybersecurity in the Internet of Things (IoT)
  • Artificial Intelligence for Predictive Maintenance
  • Computational Biology for Drug Discovery
  • Virtual Reality for Mental Health Treatment
  • Machine Learning for Sentiment Analysis in Social Media
  • Human-Computer Interaction for User Experience Design
  • Cloud Computing for Disaster Recovery
  • Quantum Computing for Cryptography
  • Intelligent Transportation Systems for Smart Cities
  • Cybersecurity for Autonomous Vehicles
  • Artificial Intelligence for Fraud Detection in Financial Systems
  • Social Network Analysis for Marketing Campaigns
  • Cloud Computing for Video Game Streaming
  • Machine Learning for Speech Recognition
  • Augmented Reality for Architecture and Design
  • Natural Language Processing for Customer Service Chatbots
  • Machine Learning for Climate Change Prediction
  • Big Data Analytics for Social Sciences
  • Artificial Intelligence for Energy Management
  • Virtual Reality for Tourism and Travel
  • Cybersecurity for Smart Grids
  • Machine Learning for Image Recognition
  • Augmented Reality for Sports Training
  • Natural Language Processing for Content Creation
  • Cloud Computing for High-Performance Computing
  • Artificial Intelligence for Personalized Medicine
  • Virtual Reality for Architecture and Design
  • Augmented Reality for Product Visualization
  • Natural Language Processing for Language Translation
  • Cybersecurity for Cloud Computing
  • Artificial Intelligence for Supply Chain Optimization
  • Blockchain Technology for Digital Voting Systems
  • Virtual Reality for Job Training
  • Augmented Reality for Retail Shopping
  • Natural Language Processing for Sentiment Analysis in Customer Feedback
  • Cloud Computing for Mobile Application Development
  • Artificial Intelligence for Cybersecurity Threat Detection
  • Blockchain Technology for Intellectual Property Protection
  • Virtual Reality for Music Education
  • Machine Learning for Financial Forecasting
  • Augmented Reality for Medical Education
  • Natural Language Processing for News Summarization
  • Cybersecurity for Healthcare Data Protection
  • Artificial Intelligence for Autonomous Robots
  • Virtual Reality for Fitness and Health
  • Machine Learning for Natural Language Understanding
  • Augmented Reality for Museum Exhibits
  • Natural Language Processing for Chatbot Personality Development
  • Cloud Computing for Website Performance Optimization
  • Artificial Intelligence for E-commerce Recommendation Systems
  • Blockchain Technology for Supply Chain Traceability
  • Virtual Reality for Military Training
  • Augmented Reality for Advertising
  • Natural Language Processing for Chatbot Conversation Management
  • Cybersecurity for Cloud-Based Services
  • Artificial Intelligence for Agricultural Management
  • Blockchain Technology for Food Safety Assurance
  • Virtual Reality for Historical Reenactments
  • Machine Learning for Cybersecurity Incident Response
  • Secure Multiparty Computation
  • Federated Learning
  • Internet of Things Security
  • Blockchain Scalability
  • Quantum Computing Algorithms
  • Explainable AI
  • Data Privacy in the Age of Big Data
  • Adversarial Machine Learning
  • Deep Reinforcement Learning
  • Online Learning and Streaming Algorithms
  • Graph Neural Networks
  • Automated Debugging and Fault Localization
  • Mobile Application Development
  • Software Engineering for Cloud Computing
  • Cryptocurrency Security
  • Edge Computing for Real-Time Applications
  • Natural Language Generation
  • Virtual and Augmented Reality
  • Computational Biology and Bioinformatics
  • Internet of Things Applications
  • Robotics and Autonomous Systems
  • Explainable Robotics
  • 3D Printing and Additive Manufacturing
  • Distributed Systems
  • Parallel Computing
  • Data Center Networking
  • Data Mining and Knowledge Discovery
  • Information Retrieval and Search Engines
  • Network Security and Privacy
  • Cloud Computing Security
  • Data Analytics for Business Intelligence
  • Neural Networks and Deep Learning
  • Reinforcement Learning for Robotics
  • Automated Planning and Scheduling
  • Evolutionary Computation and Genetic Algorithms
  • Formal Methods for Software Engineering
  • Computational Complexity Theory
  • Bio-inspired Computing
  • Computer Vision for Object Recognition
  • Automated Reasoning and Theorem Proving
  • Natural Language Understanding
  • Machine Learning for Healthcare
  • Scalable Distributed Systems
  • Sensor Networks and Internet of Things
  • Smart Grids and Energy Systems
  • Software Testing and Verification
  • Web Application Security
  • Wireless and Mobile Networks
  • Computer Architecture and Hardware Design
  • Digital Signal Processing
  • Game Theory and Mechanism Design
  • Multi-agent Systems
  • Evolutionary Robotics
  • Quantum Machine Learning
  • Computational Social Science
  • Explainable Recommender Systems
  • Artificial Intelligence and its applications
  • Cloud computing and its benefits
  • Cybersecurity threats and solutions
  • Internet of Things and its impact on society
  • Virtual and Augmented Reality and its uses
  • Blockchain Technology and its potential in various industries
  • Web Development and Design
  • Digital Marketing and its effectiveness
  • Big Data and Analytics
  • Software Development Life Cycle
  • Gaming Development and its growth
  • Network Administration and Maintenance
  • Machine Learning and its uses
  • Data Warehousing and Mining
  • Computer Architecture and Design
  • Computer Graphics and Animation
  • Quantum Computing and its potential
  • Data Structures and Algorithms
  • Computer Vision and Image Processing
  • Robotics and its applications
  • Operating Systems and their functions
  • Information Theory and Coding
  • Compiler Design and Optimization
  • Computer Forensics and Cyber Crime Investigation
  • Distributed Computing and its significance
  • Artificial Neural Networks and Deep Learning
  • Cloud Storage and Backup
  • Programming Languages and their significance
  • Computer Simulation and Modeling
  • Computer Networks and their types
  • Information Security and its types
  • Computer-based Training and eLearning
  • Medical Imaging and its uses
  • Social Media Analysis and its applications
  • Human Resource Information Systems
  • Computer-Aided Design and Manufacturing
  • Multimedia Systems and Applications
  • Geographic Information Systems and their uses
  • Computer-Assisted Language Learning
  • Mobile Device Management and Security
  • Data Compression and its types
  • Knowledge Management Systems
  • Text Mining and its uses
  • Cyber Warfare and its consequences
  • Wireless Networks and their advantages
  • Computer Ethics and its importance
  • Computational Linguistics and its applications
  • Autonomous Systems and Robotics
  • Information Visualization and its importance
  • Geographic Information Retrieval and Mapping
  • Business Intelligence and its benefits
  • Digital Libraries and their significance
  • Artificial Life and Evolutionary Computation
  • Computer Music and its types
  • Virtual Teams and Collaboration
  • Computer Games and Learning
  • Semantic Web and its applications
  • Electronic Commerce and its advantages
  • Multimedia Databases and their significance
  • Computer Science Education and its importance
  • Computer-Assisted Translation and Interpretation
  • Ambient Intelligence and Smart Homes
  • Autonomous Agents and Multi-Agent Systems

About the author

Muhammad Hassan

Researcher, Academic Writer, Web developer


Grad Coach

Research Topics & Ideas: CompSci & IT

50+ Computer Science Research Topic Ideas To Fast-Track Your Project

Finding and choosing a strong research topic is the critical first step when it comes to crafting a high-quality dissertation, thesis or research project. If you’ve landed on this post, chances are you’re looking for a computer science-related research topic, but aren’t sure where to start. Here, we’ll explore a variety of CompSci & IT-related research ideas and topic thought-starters, including algorithms, AI, networking, database systems, UX, information security and software engineering.

NB – This is just the start…

The topic ideation and evaluation process has multiple steps. In this post, we’ll kickstart the process by sharing some research topic ideas within the CompSci domain. This is the starting point, but to develop a well-defined research topic, you’ll need to identify a clear and convincing research gap, along with a well-justified plan of action to fill that gap.

If you’re new to the oftentimes perplexing world of research, or if this is your first time undertaking a formal academic research project, be sure to check out our free dissertation mini-course, which covers the process of writing a dissertation or thesis from start to end. Also consider signing up for our free webinar that explores how to find a high-quality research topic.

Overview: CompSci Research Topics

  • Algorithms & data structures
  • Artificial intelligence (AI)
  • Computer networking
  • Database systems
  • Human-computer interaction
  • Information security (IS)
  • Software engineering
  • Examples of CompSci dissertations & theses

Topics & Ideas: Algorithms & Data Structures

  • An analysis of neural network algorithms’ accuracy for processing consumer purchase patterns
  • A systematic review of the impact of graph algorithms on data analysis and discovery in social media network analysis
  • An evaluation of machine learning algorithms used for recommender systems in streaming services
  • A review of approximation algorithm approaches for solving NP-hard problems
  • An analysis of parallel algorithms for high-performance computing of genomic data
  • The influence of data structures on optimal algorithm design and performance in Fintech
  • A survey of algorithms applied in internet of things (IoT) systems in supply-chain management
  • A comparison of streaming algorithm performance for the detection of elephant flows
  • A systematic review and evaluation of machine learning algorithms used in facial pattern recognition
  • Exploring the performance of a decision tree-based approach for optimizing stock purchase decisions
  • Assessing the importance of complete and representative training datasets in agricultural machine learning-based decision making
  • A comparison of deep learning algorithm performance for structured and unstructured datasets with “rare cases”
  • A systematic review of noise reduction best practices for machine learning algorithms in geoinformatics
  • Exploring the feasibility of applying information theory to feature extraction in retail datasets
  • Assessing the use case of neural network algorithms for image analysis in biodiversity assessment

Topics & Ideas: Artificial Intelligence (AI)

  • Applying deep learning algorithms for speech recognition in speech-impaired children
  • A review of the impact of artificial intelligence on decision-making processes in stock valuation
  • An evaluation of reinforcement learning algorithms used in the production of video games
  • An exploration of key developments in natural language processing and how they impacted the evolution of chatbots
  • An analysis of the ethical and social implications of artificial intelligence-based automated marking
  • The influence of large-scale GIS datasets on artificial intelligence and machine learning developments
  • An examination of the use of artificial intelligence in orthopaedic surgery
  • The impact of explainable artificial intelligence (XAI) on transparency and trust in supply chain management
  • An evaluation of the role of artificial intelligence in financial forecasting and risk management in cryptocurrency
  • A meta-analysis of deep learning algorithm performance in predicting and preventing cyber attacks in schools

Topics & Ideas: Networking

  • An analysis of the impact of 5G technology on internet penetration in rural Tanzania
  • Assessing the role of software-defined networking (SDN) in modern cloud-based computing
  • A critical analysis of network security and privacy concerns associated with Industry 4.0 investment in healthcare
  • Exploring the influence of cloud computing on security risks in fintech
  • An examination of the use of network function virtualization (NFV) in telecom networks in Southern America
  • Assessing the impact of edge computing on network architecture and design in IoT-based manufacturing
  • An evaluation of the challenges and opportunities in 6G wireless network adoption
  • The role of network congestion control algorithms in improving network performance on streaming platforms
  • An analysis of network coding-based approaches for data security
  • Assessing the impact of network topology on network performance and reliability in IoT-based workspaces

Topics & Ideas: Database Systems

  • An analysis of big data management systems and technologies used in B2B marketing
  • The impact of NoSQL databases on data management and analysis in smart cities
  • An evaluation of the security and privacy concerns of cloud-based databases in financial organisations
  • Exploring the role of data warehousing and business intelligence in global consultancies
  • An analysis of the use of graph databases for data modelling and analysis in recommendation systems
  • The influence of the Internet of Things (IoT) on database design and management in the retail grocery industry
  • An examination of the challenges and opportunities of distributed databases in supply chain management
  • Assessing the impact of data compression algorithms on database performance and scalability in cloud computing
  • An evaluation of the use of in-memory databases for real-time data processing in patient monitoring
  • Comparing the effects of database tuning and optimization approaches in improving database performance and efficiency in omnichannel retailing

Topics & Ideas: Human-Computer Interaction

  • An analysis of the impact of mobile technology on human-computer interaction prevalence in adolescent men
  • An exploration of how artificial intelligence is changing human-computer interaction patterns in children
  • An evaluation of the usability and accessibility of web-based systems for CRM in the fast fashion retail sector
  • Assessing the influence of virtual and augmented reality on consumer purchasing patterns
  • An examination of the use of gesture-based interfaces in architecture
  • Exploring the impact of ease of use in wearable technology on geriatric users
  • Evaluating the ramifications of gamification in the Metaverse
  • A systematic review of user experience (UX) design advances associated with Augmented Reality
  • Comparing end-user perceptions of natural language processing algorithms for automated customer response
  • Analysing the impact of voice-based interfaces on purchase practices in the fast food industry

Topics & Ideas: Information Security

  • A bibliometric review of current trends in cryptography for secure communication
  • An analysis of secure multi-party computation protocols and their applications in cloud-based computing
  • An investigation of the security of blockchain technology in patient health record tracking
  • A comparative study of symmetric and asymmetric encryption algorithms for instant text messaging
  • A systematic review of secure data storage solutions used for cloud computing in the fintech industry
  • An analysis of intrusion detection and prevention systems used in the healthcare sector
  • Assessing security best practices for IoT devices in political offices
  • An investigation into the role social media played in shifting regulations related to privacy and the protection of personal data
  • A comparative study of digital signature schemes adoption in property transfers
  • An assessment of the security of secure wireless communication systems used in tertiary institutions

Topics & Ideas: Software Engineering

  • A study of agile software development methodologies and their impact on project success in pharmacology
  • Investigating the impacts of software refactoring techniques and tools in blockchain-based developments
  • A study of the impact of DevOps practices on software development and delivery in the healthcare sector
  • An analysis of software architecture patterns and their impact on the maintainability and scalability of cloud-based offerings
  • A study of the impact of artificial intelligence and machine learning on software engineering practices in the education sector
  • An investigation of software testing techniques and methodologies for subscription-based offerings
  • A review of software security practices and techniques for protecting against phishing attacks from social media
  • An analysis of the impact of cloud computing on the rate of software development and deployment in the manufacturing sector
  • Exploring the impact of software development outsourcing on project success in multinational contexts
  • An investigation into the effect of poor software documentation on app success in the retail sector

CompSci & IT Dissertations/Theses

While the ideas we’ve presented above are a decent starting point for finding a CompSci-related research topic, they are fairly generic and non-specific. So, it helps to look at actual dissertations and theses to see how this all comes together.

Below, we’ve included a selection of research projects from various CompSci-related degree programs to help refine your thinking. These are actual dissertations and theses, written as part of Master’s and PhD-level programs, so they can provide some useful insight as to what a research topic looks like in practice.

  • An array-based optimization framework for query processing and data analytics (Chen, 2021)
  • Dynamic Object Partitioning and replication for cooperative cache (Asad, 2021)
  • Embedding constructural documentation in unit tests (Nassif, 2019)
  • PLASA | Programming Language for Synchronous Agents (Kilaru, 2019)
  • Healthcare Data Authentication using Deep Neural Network (Sekar, 2020)
  • Virtual Reality System for Planetary Surface Visualization and Analysis (Quach, 2019)
  • Artificial neural networks to predict share prices on the Johannesburg stock exchange (Pyon, 2021)
  • Predicting household poverty with machine learning methods: the case of Malawi (Chinyama, 2022)
  • Investigating user experience and bias mitigation of the multi-modal retrieval of historical data (Singh, 2021)
  • Detection of HTTPS malware traffic without decryption (Nyathi, 2022)
  • Redefining privacy: case study of smart health applications (Al-Zyoud, 2019)
  • A state-based approach to context modeling and computing (Yue, 2019)
  • A Novel Cooperative Intrusion Detection System for Mobile Ad Hoc Networks (Solomon, 2019)
  • HRSB-Tree for Spatio-Temporal Aggregates over Moving Regions (Paduri, 2019)

Looking at these titles, you can probably pick up that the research topics here are quite specific and narrowly focused, compared to the generic ones presented earlier. This is an important thing to keep in mind as you develop your own research topic. That is to say, to create a top-notch research topic, you must be precise and target a specific context with specific variables of interest. In other words, you need to identify a clear, well-justified research gap.

Fast-Track Your Research Topic

If you’re still feeling a bit unsure about how to find a research topic for your Computer Science dissertation or research project, check out our Topic Kickstarter service.

CS 261: Research Topics in Operating Systems (2021)

Some links to papers are links to the ACM’s site. You may need to use the Harvard VPN to get access to the papers via those links. Alternate links will be provided.

Meeting 1 (1/26): Overview

Operating system architectures

Meeting 2 (1/28): Multics and Unix

“Multics—The first seven years”, Corbató FJ, Saltzer JH, and Clingen CT (1972)

“Protection in an information processing utility”, Graham RM (1968)

“The evolution of the Unix time-sharing system”, Ritchie DM (1984)

Additional resources

The Multicians web site for additional information on Multics, including extensive stories and Multics source code.

Technical: The Multics input/output system, Feiertag RJ and Organick EI, for a description of Multics I/O to contrast with Unix I/O.

Unix and Multics, Tom Van Vleck.

… I remarked to Dennis that easily half the code I was writing in Multics was error recovery code. He said, "We left all that stuff out. If there's an error, we have this routine called panic(), and when it is called, the machine crashes, and you holler down the hall, 'Hey, reboot it.'"

The Louisiana State Trooper Story

The IBM 7094 and CTSS

This describes the history of the system that preceded Multics, CTSS (the Compatible Time Sharing System). It also contains one of my favorite stories about the early computing days: “IBM had been very generous to MIT in the fifties and sixties, donating or discounting its biggest scientific computers. When a new top of the line 36-bit scientific machine came out, MIT expected to get one. In the early sixties, the deal was that MIT got one 8-hour shift, all the other New England colleges and universities got a shift, and the third shift was available to IBM for its own use. One use IBM made of its share was yacht handicapping: the President of IBM raced big yachts on Long Island Sound, and these boats were assigned handicap points by a complicated formula. There was a special job deck kept at the MIT Computation Center, and if a request came in to run it, operators were to stop whatever was running on the machine and do the yacht handicapping job immediately.”

Using Ring 5, Randy Saunders.

"All Multics User functions work in Ring 5." I have that EMail (from Dave Bergum) framed on my wall to this date. … All the documentation clearly states that system software has ring brackets of [1,5,5] so that it runs equally in both rings 4 and 5. However, the PL/I compiler creates segments with ring brackets of [4,4,4] by default. … I found each and every place CNO had fixed a program without resetting the ring brackets correctly. It started out 5 a day, and in 3 months it was down to one a week.

Bell Systems Technical Journal 57(6) Part 2: Unix Time-sharing System (July–August 1978)

This volume contains some of the first broadly-accessible descriptions of Unix. Individual articles are available on archive.org. As of late January 2021, you can buy a physical copy on Amazon for $2,996. Interesting articles include Thompson on Unix implementation, Ritchie’s retrospective, and several articles on actual applications, especially document preparation.

Meeting 3 (2/2): Microkernels

“The nucleus of a multiprogramming system”, Brinch Hansen P (1970).

“Toward real microkernels”, Liedtke J (1996).

“Are virtual machine monitors microkernels done right?”, Hand S, Warfield A, Fraser K, Kotsovinos E, Magenheimer DJ (2005).

Supplemental reading

“Improving IPC by kernel design”, Liedtke J (1993). Article introducing the first microbenchmark-performant microkernel.

“Are virtual machine monitors microkernels done right?”, Heiser G, Uhlig V, LeVasseur J (2006).

“From L3 to seL4: What have we learnt in 20 years of L4 microkernels?”, Elphinstone K, Heiser G (2013).

Retained: Minimality as key design principle. Replaced: Synchronous IPC augmented with (seL4, NOVA, Fiasco.OC) or replaced by (OKL4) asynchronous notification. Replaced: Physical by virtual message registers. Abandoned: Long IPC. Replaced: Thread IDs by port-like IPC endpoints as message destinations. Abandoned: IPC timeouts in seL4, OKL4. Abandoned: Clans and chiefs. Retained: User-level drivers as a core feature. Abandoned: Hierarchical process management. Multiple approaches: Some L4 kernels retain the model of recursive address-space construction, while seL4 and OKL4 originate mappings from frames. Added: User-level control over kernel memory in seL4, kernel memory quota in Fiasco.OC. Unresolved: Principled, policy-free control of CPU time. Unresolved: Handling of multicore processors in the age of verification. Replaced: Process kernel by event kernel in seL4, OKL4 and NOVA. Abandoned: Virtual TCB addressing. … Abandoned: C++ for seL4 and OKL4.

Meeting 4 (2/4): Exokernels

“Exterminate all operating system abstractions”, Engler DR, Kaashoek MF (1995).

“Exokernel: an operating system architecture for application-level resource management”, Engler DR, Kaashoek MF, O’Toole J (1995).

“The nonkernel: a kernel designed for the cloud”, Ben-Yehuda M, Peleg O, Ben-Yehuda OA, Smolyar I, Tsafrir D (2013).

“Application performance and flexibility on exokernel systems”, Kaashoek MF, Engler DR, Ganger GR, Briceño HM, Hunt R, Mazières D, Pinckney T, Grimm R, Jannotti J, Mackenzie K (1997).

Particularly worth reading is section 4, Multiplexing Stable Storage, which contains one of the most overcomplicated designs for stable storage imaginable. It’s instructive: if your principles end up here, might there be something wrong with your principles?

“Fast and flexible application-level networking on exokernel systems”, Ganger GR, Engler DR, Kaashoek MF, Briceño HM, Hunt R, Pinckney T (2002).

Particularly worth reading is section 8, Discussion: “The construction and revision of the Xok/ExOS networking support came with several lessons and controversial design decisions.”

Meeting 5 (2/9): Security

“EROS: A fast capability system”, Shapiro JS, Smith JM, Farber DJ (1999).

“Labels and event processes in the Asbestos operating system”, Vandebogart S, Efstathopoulos P, Kohler E, Krohn M, Frey C, Ziegler D, Kaashoek MF, Morris R, Mazières D (2007).

This paper covers too much ground. On the first read, skip sections 4–6.

Meeting 6 (2/11): I/O

“Arrakis: The operating system is the control plane” (PDF), Peter S, Li J, Zhang I, Ports DRK, Woos D, Krishnamurthy A, Anderson T, Roscoe T (2014)

“The IX Operating System: Combining Low Latency, High Throughput, and Efficiency in a Protected Dataplane”, Belay A, Prekas G, Primorac M, Klimovic A, Grossman S, Kozyrakis C, Bugnion E (2016) — read Sections 1–4 first (return to the rest if you have time)

“I'm Not Dead Yet!: The Role of the Operating System in a Kernel-Bypass Era”, Zhang I, Liu J, Austin A, Roberts ML, Badam A (2019)

  • “The multikernel: A new OS architecture for scalable multicore systems”, Baumann A, Barham P, Dagand PE, Harris T, Isaacs R, Peter S, Roscoe T, Schüpbach A, Singhania A (2009); this describes the Barrelfish system on which Arrakis is based

Meeting 7 (2/16): Speculative designs

From least to most speculative:

“Unified high-performance I/O: One Stack to Rule Them All” (PDF), Trivedi A, Stuedi P, Metzler B, Pletka R, Fitch BG, Gross TR (2013)

“The Case for Less Predictable Operating System Behavior” (PDF), Sun R, Porter DE, Oliveira D, Bishop M (2015)

“Quantum operating systems”, Corrigan-Gibbs H, Wu DJ, Boneh D (2017)

“Pursue robust indefinite scalability”, Ackley DH, Cannon DC (2013)

Meeting 8 (2/18): Log-structured file system

“The Design and Implementation of a Log-Structured File System”, Rosenblum M, Ousterhout J (1992)

“Logging versus Clustering: A Performance Evaluation”

  • Read the abstract of the paper; scan further if you’d like
  • Then poke around the linked critiques

Meeting 9 (2/23): Consistency

“Generalized file system dependencies”, Frost C, Mammarella M, Kohler E, de los Reyes A, Hovsepian S, Matsuoka A, Zhang L (2007)

“Application crash consistency and performance with CCFS”, Sankaranarayana Pillai T, Alagappan R, Lu L, Chidambaram V, Arpaci-Dusseau AC, Arpaci-Dusseau RH (2017)

Meeting 10 (2/25): Transactions and speculation

“Rethink the sync”, Nightingale EB, Veeraraghavan K, Chen PM, Flinn J (2006)

“Operating system transactions”, Porter DE, Hofmann OS, Rossbach CJ, Benn E, Witchel E (2009)

Meeting 11 (3/2): Speculative designs

“Can We Store the Whole World's Data in DNA Storage?”

“A tale of two abstractions: The case for object space”

“File systems as processes”

“Preserving hidden data with an ever-changing disk”

More, if you’re hungry for it

  • “Breaking Apart the VFS for Managing File Systems”

Virtualization

Meeting 14 (3/11): Virtual machines and containers

“Xen and the Art of Virtualization”, Barham P, Dragovic B, Fraser K, Hand S, Harris T, Ho A, Neugebauer R, Pratt I, Warfield A (2003)

“Blending containers and virtual machines: A study of Firecracker and gVisor”, Anjali, Caraza-Harter T, Swift MM (2020)

Meeting 15 (3/18): Virtual memory and virtual devices

“Memory resource management in VMware ESX Server”, Waldspurger CA (2002)

“Opportunistic flooding to improve TCP transmit performance in virtualized clouds”, Gamage S, Kangarlou A, Kompella RR, Xu D (2011)

Meeting 16 (3/23): Speculative designs

“The Best of Both Worlds with On-Demand Virtualization”, Kooburat T, Swift M (2011)

“The NIC is the Hypervisor: Bare-Metal Guests in IaaS Clouds”, Mogul JC, Mudigonda J, Santos JR, Turner Y (2013)

“vPipe: One Pipe to Connect Them All!”, Gamage S, Kompella R, Xu D (2013)

“Scalable Cloud Security via Asynchronous Virtual Machine Introspection”, Rajasekaran S, Ni Z, Chawla HS, Shah N, Wood T (2016)

Distributed systems

Meeting 17 (3/25): Distributed systems history

“Grapevine: an exercise in distributed computing”, Birrell AD, Levin R, Schroeder MD, Needham RM (1982)

“Implementing remote procedure calls”, Birrell AD, Nelson BJ (1984)

Skim: “Time, clocks, and the ordering of events in a distributed system”, Lamport L (1978)

Meeting 18 (3/30): Paxos

“Paxos made simple”, Lamport L (2001)

“Paxos made live: an engineering perspective”, Chandra T, Griesemer R, Redstone J (2007)

“In search of an understandable consensus algorithm”, Ongaro D, Ousterhout J (2014)

  • Adrian Colyer’s consensus series links to ten papers, especially:
  • “Raft Refloated: Do we have consensus?”, Howard H, Schwarzkopf M, Madhavapeddy A, Crowcroft J (2015)
  • A later update from overlapping authors: “Paxos vs. Raft: Have we reached consensus on distributed consensus?”, Howard H, Mortier R (2020)
  • “Understanding Paxos”, notes by Paul Krzyzanowski (2018); includes some failure examples
  • One-slide Paxos pseudocode, Robert Morris (2014)

Meeting 19 (4/1): Review of replication results

Meeting 20 (4/6): Project discussion

Meeting 21 (4/8): Industrial consistency

“Scaling Memcache at Facebook”, Nishtala R, Fugal H, Grimm S, Kwiatkowski M, Lee H, Li HC, McElroy R, Paleczny M, Peek D, Saab P, Stafford D, Tung T, Venkataramani V (2013)

“Millions of Tiny Databases”, Brooker M, Chen T, Ping F (2020)

Meeting 22 (4/13): Short papers and speculative designs

“Scalability! But at what COST?”, McSherry F, Isard M, Murray DG (2015)

“What bugs cause production cloud incidents?”, Liu H, Lu S, Musuvathi M, Nath S (2019)

“Escape Capsule: Explicit State Is Robust and Scalable”, Rajagopalan S, Williams D, Jamjoom H, Warfield A (2013)

“Music-defined networking”, Hogan M, Esposito F (2018)

  • Too networking-centric for us, but fun: “Delay is Not an Option: Low Latency Routing in Space”, Handley M (2018)
  • A useful taxonomy: “When Should The Network Be The Computer?”, Ports DRK, Nelson J (2019)

Meeting 23 (4/20): The M Group

“All File Systems Are Not Created Equal: On the Complexity of Crafting Crash-Consistent Applications”, Pillai TS, Chidambaram V, Alagappan R, Al-Kiswany S, Arpaci-Dusseau AC, Arpaci-Dusseau RH (2014)

“Crash Consistency Validation Made Easy”, Jiang Y, Chen H, Qin F, Xu C, Ma X, Lu J (2016)

Meeting 24 (4/22): NVM and Juice

“Persistent Memcached: Bringing Legacy Code to Byte-Addressable Persistent Memory”, Marathe VJ, Seltzer M, Byan S, Harris T

“NVMcached: An NVM-based Key-Value Cache”, Wu X, Ni F, Zhang L, Wang Y, Ren Y, Hack M, Shao Z, Jiang S (2016)

“Cloudburst: stateful functions-as-a-service”, Sreekanti V, Wu C, Lin XC, Schleier-Smith J, Gonzalez JE, Hellerstein JM, Tumanov A (2020)

  • Adrian Colyer’s take

Meeting 25 (4/27): Scheduling

  • “The Linux Scheduler: A Decade of Wasted Cores”, Lozi JP, Lepers B, Funston J, Gaud F, Quéma V, Fedorova A (2016)

Cyber risk and cybersecurity: a systematic review of data availability

  • Open access
  • Published: 17 February 2022
  • Volume 47, pages 698–736 (2022)

  • Frank Cremer
  • Barry Sheehan (ORCID: orcid.org/0000-0003-4592-7558)
  • Michael Fortmann
  • Arash N. Kia
  • Martin Mullins
  • Finbarr Murphy
  • Stefan Materne

Cybercrime is estimated to have cost the global economy just under USD 1 trillion in 2020, indicating an increase of more than 50% since 2018. With the average cyber insurance claim rising from USD 145,000 in 2019 to USD 359,000 in 2020, there is a growing necessity for better cyber information sources, standardised databases, mandatory reporting and public awareness. This research analyses the extant academic and industry literature on cybersecurity and cyber risk management with a particular focus on data availability. From a preliminary search resulting in 5219 cyber peer-reviewed studies, the application of the systematic methodology resulted in 79 unique datasets. We posit that the lack of available data on cyber risk poses a serious problem for stakeholders seeking to tackle this issue. In particular, we identify a lacuna in open databases that undermine collective endeavours to better manage this set of risks. The resulting data evaluation and categorisation will support cybersecurity researchers and the insurance industry in their efforts to comprehend, metricise and manage cyber risks.


Introduction

Globalisation, digitalisation and smart technologies have escalated the propensity and severity of cybercrime. Whilst it is an emerging field of research and industry, the importance of robust cybersecurity defence systems has been highlighted at the corporate, national and supranational levels. The impacts of inadequate cybersecurity are estimated to have cost the global economy USD 945 billion in 2020 (Maleks Smith et al. 2020 ). Cyber vulnerabilities pose significant corporate risks, including business interruption, breach of privacy and financial losses (Sheehan et al. 2019 ). Despite the increasing relevance for the international economy, the availability of data on cyber risks remains limited. The reasons for this are many. Firstly, it is an emerging and evolving risk; therefore, historical data sources are limited (Biener et al. 2015 ). It could also be due to the fact that, in general, institutions that have been hacked do not publish the incidents (Eling and Schnell 2016 ). The lack of data poses challenges for many areas, such as research, risk management and cybersecurity (Falco et al. 2019 ). The importance of this topic is demonstrated by the announcement of the European Council in April 2021 that a centre of excellence for cybersecurity will be established to pool investments in research, technology and industrial development. The goal of this centre is to increase the security of the internet and other critical network and information systems (European Council 2021 ).

This research takes a risk management perspective, focusing on cyber risk and considering the role of cybersecurity and cyber insurance in risk mitigation and risk transfer. The study reviews the existing literature and open data sources related to cybersecurity and cyber risk. This is the first systematic review of data availability in the general context of cyber risk and cybersecurity. By identifying and critically analysing the available datasets, this paper supports the research community by aggregating, summarising and categorising all available open datasets. In addition, further information on datasets is attached to provide deeper insights and support stakeholders engaged in cyber risk control and cybersecurity. Finally, this research paper highlights the need for open access to cyber-specific data, without price or permission barriers.

The identified open data can support cyber insurers in their efforts on sustainable product development. To date, traditional risk assessment methods have been untenable for insurance companies due to the absence of historical claims data (Sheehan et al. 2021). These high levels of uncertainty mean that cyber insurers are more inclined to overprice cyber risk cover (Kshetri 2018). Combining external data with insurance portfolio data therefore seems to be essential to improve the evaluation of the risk and thus lead to risk-adjusted pricing (Bessy-Roland et al. 2021). This argument is also supported by the fact that some re/insurers reported that they are working to improve their cyber pricing models (e.g. by creating or purchasing databases from external providers) (EIOPA 2018). Figure 1 provides an overview of pricing tools and factors considered in the estimation of cyber insurance based on the findings of EIOPA (2018) and the research of Romanosky et al. (2019). In this context, the term cyber risk is used broadly to cover all types of cyber risk and their potential impacts.

Figure 1: An overview of the current cyber insurance informational and methodological landscape, adapted from EIOPA (2018) and Romanosky et al. (2019)

Besides the advantage of risk-adjusted pricing, the availability of open datasets helps companies benchmark their internal cyber posture and cybersecurity measures. The research can also help to improve risk awareness and corporate behaviour. Many companies still underestimate their cyber risk (Leong and Chen 2020 ). For policymakers, this research offers starting points for a comprehensive recording of cyber risks. Although in many countries, companies are obliged to report data breaches to the respective supervisory authority, this information is usually not accessible to the research community. Furthermore, the economic impact of these breaches is usually unclear.

As well as the cyber risk management community, this research also supports cybersecurity stakeholders. Researchers are provided with an up-to-date, peer-reviewed literature of available datasets showing where these datasets have been used. For example, this includes datasets that have been used to evaluate the effectiveness of countermeasures in simulated cyberattacks or to test intrusion detection systems. This reduces a time-consuming search for suitable datasets and ensures a comprehensive review of those available. Through the dataset descriptions, researchers and industry stakeholders can compare and select the most suitable datasets for their purposes. In addition, it is possible to combine the datasets from one source in the context of cybersecurity or cyber risk. This supports efficient and timely progress in cyber risk research and is beneficial given the dynamic nature of cyber risks.

Cyber risks are defined as “operational risks to information and technology assets that have consequences affecting the confidentiality, availability, and/or integrity of information or information systems” (Cebula et al. 2014). Prominent cyber risk events include data breaches and cyberattacks (Agrafiotis et al. 2018). The increasing exposure and potential impact of cyber risk have been highlighted in recent industry reports (e.g. Allianz 2021; World Economic Forum 2020). Cyberattacks on critical infrastructures are ranked 5th in the World Economic Forum's Global Risk Report. Ransomware, malware and distributed denial-of-service (DDoS) attacks are examples of the evolving modes of cyberattack. One example is the ransomware attack on the Colonial Pipeline, which shut down the 5500-mile pipeline system that delivers 2.5 million barrels of fuel per day and critical liquid fuel infrastructure from oil refineries to states along the U.S. East Coast (Brower and McCormick 2021). These and other cyber incidents have led the U.S. to strengthen its cybersecurity and introduce, among other things, a public body to analyse major cyber incidents and make recommendations to prevent a recurrence (Murphey 2021a). Another example of the scope of cyberattacks is the ransomware NotPetya in 2017. The damage amounted to USD 10 billion, as the ransomware exploited a vulnerability in the Windows operating system that allowed it to spread autonomously across networks worldwide (GAO 2021). In the same year, the ransomware WannaCry was launched by cybercriminals. The cyberattack on Windows software took user data hostage in exchange for Bitcoin cryptocurrency (Smart 2018). The victims included the National Health Service in Great Britain. As a result, ambulances were redirected to other hospitals because of failing information technology (IT) systems, leaving people in need of urgent assistance waiting. The attack is estimated to have caused around 19,000 cancelled treatment appointments and losses of GBP 92 million (Field 2018). Throughout the COVID-19 pandemic, ransomware attacks increased significantly, as working-from-home arrangements increased vulnerability (Murphey 2021b).

Besides cyberattacks, data breaches can also cause high costs. Under the General Data Protection Regulation (GDPR), companies are obliged to protect personal data and safeguard the data protection rights of all individuals in the EU area. The GDPR allows data protection authorities in each country to impose sanctions and fines on organisations they find in breach. “For data breaches, the maximum fine can be €20 million or 4% of global turnover, whichever is higher” (GDPR.EU 2021). Data breaches often involve a large amount of sensitive data that has been accessed without authorisation by external parties, and are therefore considered important for information security due to their far-reaching impact (Goode et al. 2017). A data breach is defined as a “security incident in which sensitive, protected, or confidential data are copied, transmitted, viewed, stolen, or used by an unauthorized individual” (Freeha et al. 2021). Depending on the amount of data, the extent of the damage caused by a data breach can be significant, with the average cost of a breach involving more than 50 million records estimated at USD 392 million (IBM Security 2020).
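
To make the fine cap quoted above concrete, the short sketch below computes the GDPR maximum fine for a few hypothetical turnover figures; the helper function name and the turnover values are illustrative assumptions, not part of the regulation or of this study.

```python
# Worked example of the GDPR maximum-fine rule quoted above: the cap is the
# higher of EUR 20 million or 4% of annual global turnover. The turnover
# figures are invented for illustration; actual fines depend on many factors.
def gdpr_max_fine(annual_global_turnover_eur: float) -> float:
    """Upper bound of a GDPR data-breach fine."""
    return max(20_000_000.0, 0.04 * annual_global_turnover_eur)

for turnover in (100e6, 400e6, 2e9):  # EUR 100m, 400m and 2bn turnover
    print(f"turnover {turnover / 1e6:6.0f}m EUR -> maximum fine "
          f"{gdpr_max_fine(turnover) / 1e6:.0f}m EUR")
```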

This research paper reviews the existing literature and open data sources related to cybersecurity and cyber risk, focusing on the datasets used to improve academic understanding and advance the current state-of-the-art in cybersecurity. Furthermore, important information about the available datasets is presented (e.g. use cases), and a plea is made for open data and the standardisation of cyber risk data for academic comparability and replication. The remainder of the paper is structured as follows. The next section describes the related work regarding cybersecurity and cyber risks. The third section outlines the review method used in this work and the process. The fourth section details the results of the identified literature. Further discussion is presented in the penultimate section and the final section concludes.

Related work

Due to the significance of cyber risks, several literature reviews have been conducted in this field. Eling ( 2020 ) reviewed the existing academic literature on the topic of cyber risk and cyber insurance from an economic perspective. A total of 217 papers with the term ‘cyber risk’ were identified and classified in different categories. As a result, open research questions are identified, showing that research on cyber risks is still in its infancy because of their dynamic and emerging nature. Furthermore, the author highlights that particular focus should be placed on the exchange of information between public and private actors. An improved information flow could help to measure the risk more accurately and thus make cyber risks more insurable and help risk managers to determine the right level of cyber risk for their company. In the context of cyber insurance data, Romanosky et al. ( 2019 ) analysed the underwriting process for cyber insurance and revealed how cyber insurers understand and assess cyber risks. For this research, they examined 235 American cyber insurance policies that were publicly available and looked at three components (coverage, application questionnaires and pricing). The authors state in their findings that many of the insurers used very simple, flat-rate pricing (based on a single calculation of expected loss), while others used more parameters such as the asset value of the company (or company revenue) or standard insurance metrics (e.g. deductible, limits), and the industry in the calculation. This is in keeping with Eling ( 2020 ), who states that an increased amount of data could help to make cyber risk more accurately measured and thus more insurable. Similar research on cyber insurance and data was conducted by Nurse et al. ( 2020 ). The authors examined cyber insurance practitioners' perceptions and the challenges they face in collecting and using data. In addition, gaps were identified during the research where further data is needed. The authors concluded that cyber insurance is still in its infancy, and there are still several unanswered questions (for example, cyber valuation, risk calculation and recovery). They also pointed out that a better understanding of data collection and use in cyber insurance would be invaluable for future research and practice. Bessy-Roland et al. ( 2021 ) come to a similar conclusion. They proposed a multivariate Hawkes framework to model and predict the frequency of cyberattacks. They used a public dataset with characteristics of data breaches affecting the U.S. industry. In the conclusion, the authors make the argument that an insurer has a better knowledge of cyber losses, but that it is based on a small dataset and therefore combination with external data sources seems essential to improve the assessment of cyber risks.

Several systematic reviews have been published in the area of cybersecurity (Kruse et al. 2017; Lee et al. 2020; Loukas et al. 2013; Ulven and Wangen 2021). In these papers, the authors concentrated on a specific area or sector in the context of cybersecurity. This paper adds to this extant literature by focusing on data availability and its importance to risk management and insurance stakeholders. With a priority on healthcare and cybersecurity, Kruse et al. (2017) conducted a systematic literature review. The authors identified 472 articles with the keywords ‘cybersecurity and healthcare’ or ‘ransomware’ in the databases Cumulative Index of Nursing and Allied Health Literature, PubMed and Proquest. Articles were eligible for this review if they satisfied three criteria: (1) they were published between 2006 and 2016, (2) the full-text version of the article was available, and (3) the publication was a peer-reviewed or scholarly journal. The authors found that technological development and federal policies (in the U.S.) are the main factors exposing the health sector to cyber risks. Loukas et al. (2013) conducted a review with a focus on cyber risks and cybersecurity in emergency management. The authors provided an overview of cyber risks in communication, sensor, information management and vehicle technologies used in emergency management and showed areas for which there is still no solution in the literature. Similarly, Ulven and Wangen (2021) reviewed the literature on cybersecurity risks in higher education institutions. For the literature review, the authors used the keywords ‘cyber’, ‘information threats’ or ‘vulnerability’ in connection with the terms ‘higher education’, ‘university’ or ‘academia’. A similar literature review with a focus on Internet of Things (IoT) cybersecurity was conducted by Lee et al. (2020). The review revealed that qualitative approaches focus on high-level frameworks, and quantitative approaches to cybersecurity risk management focus on risk assessment and quantification of cyberattacks and impacts. In addition, the findings presented a four-step IoT cyber risk management framework that identifies, quantifies and prioritises cyber risks.

Datasets are an essential part of cybersecurity research, underlined by the following works. Ilhan Firat et al. ( 2021 ) examined various cybersecurity datasets in detail. The study was motivated by the fact that with the proliferation of the internet and smart technologies, the mode of cyberattacks is also evolving. However, in order to prevent such attacks, they must first be detected; the dissemination and further development of cybersecurity datasets is therefore critical. In their work, the authors observed studies of datasets used in intrusion detection systems. Khraisat et al. ( 2019 ) also identified a need for new datasets in the context of cybersecurity. The researchers presented a taxonomy of current intrusion detection systems, a comprehensive review of notable recent work, and an overview of the datasets commonly used for assessment purposes. In their conclusion, the authors noted that new datasets are needed because most machine-learning techniques are trained and evaluated on the knowledge of old datasets. These datasets do not contain new and comprehensive information and are partly derived from datasets from 1999. The authors noted that the core of this issue is the availability of new public datasets as well as their quality. The availability of data, how it is used, created and shared was also investigated by Zheng et al. ( 2018 ). The researchers analysed 965 cybersecurity research papers published between 2012 and 2016. They created a taxonomy of the types of data that are created and shared and then analysed the data collected via datasets. The researchers concluded that while datasets are recognised as valuable for cybersecurity research, the proportion of publicly available datasets is limited.

The main contributions of this review and what differentiates it from previous studies can be summarised as follows. First, as far as we can tell, it is the first work to summarise all available datasets on cyber risk and cybersecurity in the context of a systematic review and present them to the scientific community and cyber insurance and cybersecurity stakeholders. Second, we investigated, analysed, and made available the datasets to support efficient and timely progress in cyber risk research. And third, we enable comparability of datasets so that the appropriate dataset can be selected depending on the research area.

Methodology

Process and eligibility criteria

The structure of this systematic review is inspired by the Preferred Reporting Items for Systematic Reviews and Meta-Analyses (PRISMA) framework (Page et al. 2021 ), and the search was conducted from 3 to 10 May 2021. Due to the continuous development of cyber risks and their countermeasures, only articles published in the last 10 years were considered. In addition, only articles published in peer-reviewed journals written in English were included. As a final criterion, only articles that make use of one or more cybersecurity or cyber risk datasets met the inclusion criteria. Specifically, these studies presented new or existing datasets, used them for methods, or used them to verify new results, as well as analysed them in an economic context and pointed out their effects. The criterion was fulfilled if it was clearly stated in the abstract that one or more datasets were used. A detailed explanation of this selection criterion can be found in the ‘Study selection’ section.

Information sources

In order to cover a complete spectrum of literature, various databases were queried to collect relevant literature on the topic of cybersecurity and cyber risks. Due to the spread of related articles across multiple databases, the literature search was limited to the following four databases for simplicity: IEEE Xplore, Scopus, SpringerLink and Web of Science. This is similar to other literature reviews addressing cyber risks or cybersecurity, including Sardi et al. ( 2021 ), Franke and Brynielsson ( 2014 ), Lagerström (2019), Eling and Schnell ( 2016 ) and Eling ( 2020 ). In this paper, all databases used in the aforementioned works were considered. However, only two studies also used all the databases listed. The IEEE Xplore database contains electrical engineering, computer science, and electronics work from over 200 journals and three million conference papers (IEEE 2021 ). Scopus includes 23,400 peer-reviewed journals from more than 5000 international publishers in the areas of science, engineering, medicine, social sciences and humanities (Scopus 2021 ). SpringerLink contains 3742 journals and indexes over 10 million scientific documents (SpringerLink 2021 ). Finally, Web of Science indexes over 9200 journals in different scientific disciplines (Science 2021 ).

A search string was created and applied to all databases. To make the search efficient and reproducible, the following search string with Boolean operators was used in all databases: cybersecurity OR cyber risk AND dataset OR database. To ensure uniformity of the search across all databases, some adjustments had to be made for the respective search engines. In Scopus, for example, the Advanced Search was used, and the field code ‘TITLE-ABS-KEY’ was integrated into the search string. For IEEE Xplore, the search was carried out with the search string in the Command Search over ‘All Metadata’. In the Web of Science database, the Advanced Search was used. The special feature of this search was that it had to be carried out in individual steps. The first search was carried out with the terms cybersecurity OR cyber risk with the field tag Topic (TS=) and the second search with dataset OR database. Subsequently, these two searches were combined, which then delivered the articles for review. For SpringerLink, the search string was used in the Advanced Search under the category ‘Find the resources with all of the words’. A sketch of the resulting per-database query strings is shown after the list below. Applying this search string yielded 5219 studies. According to the eligibility criteria (period, language and only scientific journals), 1581 studies were identified in the databases:

Scopus: 135

Springer Link: 548

Web of Science: 534
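
The following sketch shows how the per-database query strings described above might be assembled for reproducibility; the operator grouping and the helper function names are assumptions, not part of the original search protocol, so the exact syntax should be checked against each database's documentation before reuse.

```python
# Minimal sketch of the per-database query strings described above.
TOPIC_TERMS = ['"cybersecurity"', '"cyber risk"']
DATA_TERMS = ['"dataset"', '"database"']

def or_group(terms):
    """Join terms with OR and wrap them in parentheses."""
    return "(" + " OR ".join(terms) + ")"

def generic_query():
    # Plain Boolean string for IEEE Xplore ('All Metadata') and SpringerLink
    # ('Find the resources with all of the words').
    return f"{or_group(TOPIC_TERMS)} AND {or_group(DATA_TERMS)}"

def scopus_query():
    # Scopus Advanced Search with the TITLE-ABS-KEY field code.
    return f"TITLE-ABS-KEY({generic_query()})"

def wos_queries():
    # Web of Science Advanced Search: two topic (TS=) searches that are
    # combined afterwards, mirroring the two-step process described above.
    return [f"TS={or_group(TOPIC_TERMS)}", f"TS={or_group(DATA_TERMS)}"]

if __name__ == "__main__":
    print(generic_query())
    print(scopus_query())
    print(wos_queries())
```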

An overview of the process is given in Fig. 2. After combining the results from the four databases and removing duplicates, 854 articles remained.

Figure 2: Literature search process and categorisation of the studies

Study selection

In the final step of the selection process, the articles were screened for relevance. Due to a large number of results, the abstracts were analysed in the first step of the process. The aim was to determine whether the article was relevant for the systematic review. An article fulfilled the criterion if it was recognisable in the abstract that it had made a contribution to datasets or databases with regard to cyber risks or cybersecurity. Specifically, the criterion was considered to be met if the abstract used datasets that address the causes or impacts of cyber risks, and measures in the area of cybersecurity. In this process, the number of articles was reduced to 288. The articles were then read in their entirety, and an expert panel of six people decided whether they should be used. This led to a final number of 255 articles. The years in which the articles were published and the exact number can be seen in Fig.  3 .
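
As an illustration of the abstract-screening criterion described above, the sketch below implements a keyword pre-filter that keeps only abstracts mentioning both a cyber term and a dataset/database term; the actual screening was performed manually by the authors and an expert panel, so the regular expressions and the sample abstracts are assumptions for illustration only.

```python
# Keyword pre-filter approximating the abstract-screening criterion described
# above: keep an article only if its abstract mentions both a cyber term and a
# dataset/database term. The real screening was done manually (and by an expert
# panel), so the patterns and sample abstracts are illustrative assumptions.
import re

CYBER = re.compile(r"\bcyber ?security\b|\bcyber ?risk\b", re.IGNORECASE)
DATA = re.compile(r"\bdata ?sets?\b|\bdatabases?\b", re.IGNORECASE)

def passes_abstract_screen(abstract: str) -> bool:
    """True if the abstract signals use of a cyber risk/cybersecurity dataset."""
    return bool(CYBER.search(abstract)) and bool(DATA.search(abstract))

abstracts = {
    "a1": "We evaluate a cybersecurity intrusion detection model on the NSL-KDD dataset.",
    "a2": "A qualitative study of security awareness training in small firms.",
}
kept = [key for key, text in abstracts.items() if passes_abstract_screen(text)]
print(kept)  # -> ['a1']: only a1 mentions both a cyber term and a dataset
```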

Figure 3: Distribution of studies

Data collection process and synthesis of the results

For the data collection process, various data were extracted from the studies, including the names of the respective creators, the name of the dataset or database and the corresponding reference. It was also determined where the data came from. In the context of accessibility, it was determined whether access is free, controlled, available for purchase or not available. It was also determined when the datasets were created and the time period referenced. The application type and domain characteristics of the datasets were identified.
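
A record structure along the following lines could hold the fields extracted for each dataset; the class and field names are illustrative assumptions and not the paper's actual extraction-table schema.

```python
# Illustrative record structure for the per-dataset information extracted above.
from dataclasses import dataclass, field
from enum import Enum
from typing import Optional

class Access(Enum):
    FREE = "free"
    CONTROLLED = "controlled"
    PURCHASE = "available for purchase"
    UNAVAILABLE = "not available"

@dataclass
class DatasetRecord:
    creators: list[str]                     # names of the respective creators
    name: str                               # name of the dataset or database
    reference: str                          # corresponding reference / citing study
    origin: str                             # where the data came from
    access: Access                          # accessibility category
    created_year: Optional[int] = None      # N/A when it could not be determined
    period_covered: Optional[str] = None    # e.g. "2008-2019"
    application_type: Optional[str] = None  # e.g. "intrusion detection"
    domain: Optional[str] = None            # e.g. "IoT", "healthcare", "insurance"
    use_cases: list[str] = field(default_factory=list)  # use cases in the literature
```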

Results

This section analyses the results of the systematic literature review. The previously identified studies are divided into three categories: datasets on the causes of cyber risks, datasets on the effects of cyber risks and datasets on cybersecurity. The classification is based on the intended use of the studies. This system of classification makes it easier for stakeholders to find the appropriate datasets. The categories are evaluated individually. Although complete information is available for a large proportion of datasets, this is not true for all of them. Accordingly, the abbreviation N/A has been inserted in the respective fields to indicate that this information could not be determined at the time of submission. The term ‘use cases in the literature’ in the following and supplementary tables refers to the application areas in which the corresponding datasets were used in the literature. The areas listed there refer to the topic area on which the researchers conducted their research. Since some datasets were used interdisciplinarily, the listed use cases in the literature are correspondingly longer. Before discussing each category in the next sections, Fig. 4 provides an overview of the number of datasets found and their year of creation. Figure 5 then shows the relationship between studies and datasets in the period under consideration. Figure 6 shows the distribution of studies, their use of datasets and their creation date. The number of datasets used is higher than the number of studies because the studies often used several datasets (Table 1).

Figure 4: Distribution of dataset results

Figure 5: Correlation between the studies and the datasets

Figure 6: Distribution of studies and their use of datasets

Most of the datasets are generated in the U.S. (up to 58.2%). Canada and Australia rank next, with 11.3% and 5% of all the reviewed datasets, respectively.

Additionally, to make the datasets more valuable to the cyber insurance industry, an assessment of the applicability of each dataset has been provided for cyber insurers. This ‘Use Case Assessment’ covers the use of the data in different analyses, in the calculation of cyber insurance premiums, and in the design of cyber insurance contracts or additional customer services. Because direct hyperlinks may change over time, references point to the main websites (the nearest stable resource point) for longevity. In addition, the links to the main pages contain further information on the datasets and the different versions related to the operating systems. The references were chosen in such a way that practitioners get the best overview of the respective datasets.

Cause datasets

This section presents selected articles that use the datasets to analyse the causes of cyber risks. The datasets help identify emerging trends and allow pattern discovery in cyber risks. This information gives cybersecurity experts and cyber insurers the data to make better predictions and take appropriate action. For example, if certain vulnerabilities are not adequately protected, cyber insurers will demand a risk surcharge leading to an improvement in the risk-adjusted premium. Due to the capricious nature of cyber risks, existing data must be supplemented with new data sources (for example, new events, new methods or security vulnerabilities) to determine prevailing cyber exposure. The datasets of cyber risk causes could be combined with existing portfolio data from cyber insurers and integrated into existing pricing tools and factors to improve the valuation of cyber risks.

A portion of these datasets consists of several taxonomies and classifications of cyber risks. Aassal et al. ( 2020 ) propose a new taxonomy of phishing characteristics based on the interpretation and purpose of each characteristic. In comparison, Hindy et al. ( 2020 ) presented a taxonomy of network threats and the impact of current datasets on intrusion detection systems. A similar taxonomy was suggested by Kiwia et al. ( 2018 ). The authors presented a cyber kill chain-based taxonomy of banking Trojans features. The taxonomy built on a real-world dataset of 127 banking Trojans collected from December 2014 to January 2016 by a major U.K.-based financial organisation.

In the context of classification, Aamir et al. ( 2021 ) showed the benefits of machine learning for classifying port scans and DDoS attacks in a mixture of normal and attack traffic. Guo et al. ( 2020 ) presented a new method to improve malware classification based on entropy sequence features. The evaluation of this new method was conducted on different malware datasets.

To reconstruct attack scenarios and draw conclusions based on the evidence in the alert stream, Barzegar and Shajari (2018) used the DARPA2000 and MACCDC 2012 datasets for their research. Giudici and Raffinetti (2020) proposed a rank-based statistical model aimed at predicting the severity levels of cyber risk. The model used cyber risk data from the University of Milan. In contrast to the previous datasets, Skrjanc et al. (2018) used the older KDD99 dataset to monitor large-scale cyberattacks using a Cauchy clustering method.

Amin et al. ( 2021 ) used a cyberattack dataset from the Canadian Institute for Cybersecurity to identify spatial clusters of countries with high rates of cyberattacks. In the context of cybercrime, Junger et al. ( 2020 ) examined crime scripts, key characteristics of the target company and the relationship between criminal effort and financial benefit. For their study, the authors analysed 300 cases of fraudulent activities against Dutch companies. With a similar focus on cybercrime, Mireles et al. ( 2019 ) proposed a metric framework to measure the effectiveness of the dynamic evolution of cyberattacks and defensive measures. To validate its usefulness, they used the DEFCON dataset.

Due to the rapidly changing nature of cyber risks, it is often impossible to obtain all information on them. Kim and Kim ( 2019 ) proposed an automated dataset generation system called CTIMiner that collects threat data from publicly available security reports and malware repositories. They released a dataset to the public containing about 640,000 records from 612 security reports published between January 2008 and 2019. A similar approach is proposed by Kim et al. ( 2020 ), using a named entity recognition system to extract core information from cyber threat reports automatically. They created a 498,000-tag dataset during their research (Ulven and Wangen 2021 ).

Within the framework of vulnerabilities and cybersecurity issues, Ulven and Wangen ( 2021 ) proposed an overview of mission-critical assets and everyday threat events, suggested a generic threat model, and summarised common cybersecurity vulnerabilities. With a focus on hospitality, Chen and Fiscus ( 2018 ) proposed several issues related to cybersecurity in this sector. They analysed 76 security incidents from the Privacy Rights Clearinghouse database. Supplementary Table 1 lists all findings that belong to the cyber causes dataset.

Impact datasets

This section outlines selected findings of the cyber impact dataset. For cyber insurers, these datasets can form an important basis for information, as they can be used to calculate cyber insurance premiums, evaluate specific cyber risks, formulate inclusions and exclusions in cyber wordings, and re-evaluate as well as supplement the data collected so far on cyber risks. For example, information on financial losses can help to better assess the loss potential of cyber risks. Furthermore, the datasets can provide insight into the frequency of occurrence of these cyber risks. The new datasets can be used to close any data gaps that were previously based on very approximate estimates or to find new results.

Eight studies addressed the costs of data breaches. For instance, Eling and Jung ( 2018 ) reviewed 3327 data breach events from 2005 to 2016 and identified an asymmetric dependence of monthly losses by breach type and industry. The authors used datasets from the Privacy Rights Clearinghouse for analysis. The Privacy Rights Clearinghouse datasets and the Breach level index database were also used by De Giovanni et al. ( 2020 ) to describe relationships between data breaches and bitcoin-related variables using the cointegration methodology. The data were obtained from the Department of Health and Human Services of healthcare facilities reporting data breaches and a national database of technical and organisational infrastructure information. Also in the context of data breaches, Algarni et al. ( 2021 ) developed a comprehensive, formal model that estimates the two components of security risks: breach cost and the likelihood of a data breach within 12 months. For their survey, the authors used two industrial reports from the Ponemon institute and VERIZON. To illustrate the scope of data breaches, Neto et al. ( 2021 ) identified 430 major data breach incidents among more than 10,000 incidents. The database created is available and covers the period 2018 to 2019.

With a direct focus on insurance, Biener et al. (2015) analysed 994 cyber loss cases from an operational risk database and investigated the insurability of cyber risks based on predefined criteria. For their study, they used data from the company SAS OpRisk Global Data. Similarly, Eling and Wirfs (2019) looked at a wide range of cyber risk events and actual cost data using the same database. They identified cyber losses and analysed them using methods from statistics and actuarial science. Using a similar reference, Farkas et al. (2021) proposed a method for analysing cyber claims based on regression trees to identify criteria for classifying and evaluating claims. Similar to Chen and Fiscus (2018), the dataset used was the Privacy Rights Clearinghouse database. Within the framework of reinsurance, Moro (2020) analysed index-based information technology activity, using data from a Symantec dataset, to examine whether index-based parametric reinsurance coverage could be offered to cedants.

Paté-Cornell et al. ( 2018 ) presented a general probabilistic risk analysis framework for cybersecurity in an organisation to be specified. The results are distributions of losses to cyberattacks, with and without considered countermeasures in support of risk management decisions based both on past data and anticipated incidents. The data used were from The Common Vulnerability and Exposures database and via confidential access to a database of cyberattacks on a large, U.S.-based organisation. A different conceptual framework for cyber risk classification and assessment was proposed by Sheehan et al. ( 2021 ). This framework showed the importance of proactive and reactive barriers in reducing companies’ exposure to cyber risk and quantifying the risk. Another approach to cyber risk assessment and mitigation was proposed by Mukhopadhyay et al. ( 2019 ). They estimated the probability of an attack using generalised linear models, predicted the security technology required to reduce the probability of cyberattacks, and used gamma and exponential distributions to best approximate the average loss data for each malicious attack. They also calculated the expected loss due to cyberattacks, calculated the net premium that would need to be charged by a cyber insurer, and suggested cyber insurance as a strategy to minimise losses. They used the CSI-FBI survey (1997–2010) to conduct their research.
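
As a rough illustration of the pricing logic described for Mukhopadhyay et al. (2019), the sketch below fits a gamma severity distribution to hypothetical loss data, combines it with an assumed attack probability, and derives an expected annual loss and an indicative premium; all figures, the attack probability and the loading factor are invented for illustration and are not taken from the study.

```python
# Rough sketch of the pricing logic described above: severity distribution +
# attack probability -> expected loss -> premium. Numbers are illustrative only.
import numpy as np
from scipy import stats

rng = np.random.default_rng(0)

# Hypothetical per-attack loss amounts (in USD thousands)
losses = rng.gamma(shape=2.0, scale=150.0, size=200)

# Fit a gamma severity distribution to the observed losses (location fixed at 0)
shape, loc, scale = stats.gamma.fit(losses, floc=0)
expected_severity = shape * scale          # mean of the fitted gamma

p_attack = 0.35                            # assumed probability of an attack per year
expected_annual_loss = p_attack * expected_severity

net_premium = expected_annual_loss         # pure (net) premium equals expected loss
gross_premium = net_premium * 1.25         # illustrative 25% expense/risk loading

print(f"fitted mean severity: {expected_severity:8.1f}k USD")
print(f"expected annual loss: {expected_annual_loss:8.1f}k USD")
print(f"net / gross premium:  {net_premium:8.1f}k / {gross_premium:.1f}k USD")
```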

In order to highlight the lack of data on cyber risks, Eling ( 2020 ) conducted a literature review in the areas of cyber risk and cyber insurance. Available information on the frequency, severity, and dependency structure of cyber risks was filtered out. In addition, open questions for future cyber risk research were set up. Another example of data collection on the impact of cyberattacks is provided by Sornette et al. ( 2013 ), who use a database of newspaper articles, press reports and other media to provide a predictive method to identify triggering events and potential accident scenarios and estimate their severity and frequency. A similar approach to data collection was used by Arcuri et al. ( 2020 ) to gather an original sample of global cyberattacks from newspaper reports sourced from the LexisNexis database. This collection is also used and applied to the fields of dynamic communication and cyber risk perception by Fang et al. ( 2021 ). To create a dataset of cyber incidents and disputes, Valeriano and Maness ( 2014 ) collected information on cyber interactions between rival states.

To assess trends and the scale of economic cybercrime, Levi ( 2017 ) examined datasets from different countries and their impact on crime policy. Pooser et al. ( 2018 ) investigated the trend in cyber risk identification from 2006 to 2015 and company characteristics related to cyber risk perception. The authors used a dataset of various reports from cyber insurers for their study. Walker-Roberts et al. ( 2020 ) investigated the spectrum of risk of a cybersecurity incident taking place in the cyber-physical-enabled world using the VERIS Community Database. The datasets of impacts identified are presented below. Due to overlap, some may also appear in the causes dataset (Supplementary Table 2).

Cybersecurity datasets

General intrusion detection

General intrusion detection systems account for the largest share of countermeasure datasets. For companies or researchers focused on cybersecurity, the datasets can be used to test their own countermeasures or obtain information about potential vulnerabilities. For example, Al-Omari et al. ( 2021 ) proposed an intelligent intrusion detection model for predicting and detecting attacks in cyberspace, which was applied to dataset UNSW-NB 15. A similar approach was taken by Choras and Kozik ( 2015 ), who used machine learning to detect cyberattacks on web applications. To evaluate their method, they used the HTTP dataset CSIC 2010. For the identification of unknown attacks on web servers, Kamarudin et al. ( 2017 ) proposed an anomaly-based intrusion detection system using an ensemble classification approach. Ganeshan and Rodrigues ( 2020 ) showed an intrusion detection system approach, which clusters the database into several groups and detects the presence of intrusion in the clusters. In comparison, AlKadi et al. ( 2019 ) used a localisation-based model to discover abnormal patterns in network traffic. Hybrid models have been recommended by Bhattacharya et al. ( 2020 ) and Agrawal et al. ( 2019 ); the former is a machine-learning model based on principal component analysis for the classification of intrusion detection system datasets, while the latter is a hybrid ensemble intrusion detection system for anomaly detection using different datasets to detect patterns in network traffic that deviate from normal behaviour.

Agarwal et al. (2021) used three different machine learning algorithms in their research to find the most suitable for efficiently identifying patterns of suspicious network activity. The UNSW-NB15 dataset was used for this purpose. Kasongo and Sun (2020), with a feed-forward deep neural network (FFDNN), Keshk et al. (2021), with a privacy-preserving anomaly detection framework, and others also used the UNSW-NB15 dataset as part of intrusion detection systems. The same dataset and others were used by Binbusayyis and Vaiyapuri (2019) to identify and compare key features for cyber intrusion detection. Atefinia and Ahmadi (2021) proposed a deep neural network model to reduce the false positive rate of an anomaly-based intrusion detection system. Fossaceca et al. (2015) focused in their research on the development of a framework that combined the outputs of multiple learners in order to improve the efficacy of network intrusion detection, and Gauthama Raman et al. (2020) presented a search algorithm based on support vector machines to improve the detection performance and false alarm rate of intrusion detection techniques. Ahmad and Alsemmeari (2020) targeted extreme learning machine techniques due to their good capabilities in classification problems and handling huge data. They used the NSL-KDD dataset as a benchmark.
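
For context, the sketch below shows the general shape of the supervised experiments these studies run on labelled flow datasets such as UNSW-NB15 or NSL-KDD: split the labelled records, train a classifier, and report per-class detection metrics. The file name, column names and model choice are placeholders rather than any particular study's setup.

```python
# General shape of a supervised intrusion-detection experiment on a labelled
# flow dataset. Real datasets need dataset-specific preprocessing (categorical
# encoding, feature selection); "flows.csv" and "label" are placeholders.
import pandas as pd
from sklearn.ensemble import RandomForestClassifier
from sklearn.metrics import classification_report
from sklearn.model_selection import train_test_split

df = pd.read_csv("flows.csv")                            # hypothetical labelled flow records
X = df.drop(columns=["label"]).select_dtypes("number")   # numeric features only
y = df["label"]                                          # e.g. 0 = normal, 1 = attack

X_train, X_test, y_train, y_test = train_test_split(
    X, y, test_size=0.3, stratify=y, random_state=42)

clf = RandomForestClassifier(n_estimators=200, random_state=42)
clf.fit(X_train, y_train)

# False alarms are the key operational concern for intrusion detection, so the
# per-class precision/recall report is more informative than accuracy alone.
print(classification_report(y_test, clf.predict(X_test)))
```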

With reference to prediction, Bakdash et al. ( 2018 ) used datasets from the U.S. Department of Defence to predict cyberattacks by malware. This dataset consists of weekly counts of cyber events over approximately seven years. Another prediction method was presented by Fan et al. ( 2018 ), which showed an improved integrated cybersecurity prediction method based on spatial-time analysis. Also, with reference to prediction, Ashtiani and Azgomi ( 2014 ) proposed a framework for the distributed simulation of cyberattacks based on high-level architecture. Kirubavathi and Anitha ( 2016 ) recommended an approach to detect botnets, irrespective of their structures, based on network traffic flow behaviour analysis and machine-learning techniques. Dwivedi et al. ( 2021 ) introduced a multi-parallel adaptive technique to utilise an adaption mechanism in the group of swarms for network intrusion detection. AlEroud and Karabatis ( 2018 ) presented an approach that used contextual information to automatically identify and query possible semantic links between different types of suspicious activities extracted from network flows.

Intrusion detection systems with a focus on IoT

In addition to general intrusion detection systems, a proportion of studies focused on IoT. Habib et al. (2020) presented an approach for converting traditional intrusion detection systems into smart intrusion detection systems for IoT networks. To enhance the process of diagnostic detection of possible vulnerabilities within an IoT system, Georgescu et al. (2019) introduced a method that uses a named entity recognition-based solution. With regard to IoT in the smart home sector, Heartfield et al. (2021) presented a detection system that is able to autonomously adjust the decision function of its underlying anomaly classification models to a smart home’s changing condition. Another intrusion detection system was suggested by Keserwani et al. (2021), which combined Grey Wolf Optimization and Particle Swarm Optimization to identify various attacks on IoT networks. They used the KDD Cup 99, NSL-KDD and CICIDS-2017 datasets to evaluate their model. Abu Al-Haija and Zein-Sabatto (2020) provided a comprehensive development of a new intelligent and autonomous deep-learning-based detection and classification system for cyberattacks in IoT communication networks that leverages the power of convolutional neural networks, abbreviated as IoT-IDCS-CNN (IoT-based Intrusion Detection and Classification System using Convolutional Neural Network). To evaluate the development, the authors used the NSL-KDD dataset. Biswas and Roy (2021) recommended a model that identifies malicious botnet traffic using novel deep-learning approaches such as artificial neural networks, gated recurrent units and long short-term memory models. They tested their model with the Bot-IoT dataset.

With a more forensic background, Koroniotis et al. (2020) submitted a network forensic framework, which described the digital investigation phases for identifying and tracing attack behaviours in IoT networks. The suggested work was evaluated with the Bot-IoT and UNSW-NB15 datasets. With a focus on big data and IoT, Chhabra et al. (2020) presented a cyber forensic framework for big data analytics in an IoT environment using machine learning. Furthermore, the authors mentioned different publicly available datasets for machine-learning models.

A stronger focus on mobile phones was exhibited by Alazab et al. (2020), who presented a classification model that combined permission requests and application programme interface calls. The model was tested with a malware dataset containing 27,891 Android apps. A similar approach was taken by Li et al. (2019a, b), who proposed a reliable classifier for Android malware detection based on factorisation machine architecture and extraction of Android app features from manifest files and source code.

Literature reviews

In addition to the different methods and models for intrusion detection systems, various literature reviews on the methods and datasets were also found. Liu and Lang ( 2019 ) proposed a taxonomy of intrusion detection systems that uses data objects as the main dimension to classify and summarise machine learning and deep learning-based intrusion detection literature. They also presented four different benchmark datasets for machine-learning detection systems. Ahmed et al. ( 2016 ) presented an in-depth analysis of four major categories of anomaly detection techniques, which include classification, statistical, information theory and clustering. Hajj et al. ( 2021 ) gave a comprehensive overview of anomaly-based intrusion detection systems. Their article gives an overview of the requirements, methods, measurements and datasets that are used in an intrusion detection system.

Within the framework of machine learning, Chattopadhyay et al. ( 2018 ) conducted a comprehensive review and meta-analysis on the application of machine-learning techniques in intrusion detection systems. They also compared different machine learning techniques in different datasets and summarised the performance. Vidros et al. ( 2017 ) presented an overview of characteristics and methods in automatic detection of online recruitment fraud. They also published an available dataset of 17,880 annotated job ads, retrieved from the use of a real-life system. An empirical study of different unsupervised learning algorithms used in the detection of unknown attacks was presented by Meira et al. ( 2020 ).

New datasets

Kilincer et al. (2021) reviewed different intrusion detection system datasets in detail. They had a closer look at the UNSW-NB15, ISCX-2012, NSL-KDD and CIDDS-001 datasets. Stojanovic et al. (2020) also provided a review on datasets and their creation for use in advanced persistent threat detection in the literature. Another review of datasets was provided by Sarker et al. (2020), who focused on cybersecurity data science as part of their research and provided an overview from a machine-learning perspective. Avila et al. (2021) conducted a systematic literature review on the use of security logs for data leak detection. They recommended a new classification of information leaks, which uses the GDPR principles, identified the most widely used publicly available datasets for threat detection, and described the attack types in the datasets and the algorithms used for data leak detection. Tuncer et al. (2020) presented a bytecode-based detection method consisting of feature extraction using local neighbourhood binary patterns. They chose a byte-based malware dataset to investigate the performance of the proposed local neighbourhood binary pattern-based detection method. With a different focus, Mauro et al. (2020) gave an experimental overview of neural-based techniques relevant to intrusion detection. They assessed the value of neural networks using the Bot-IoT and UNSW-NB15 datasets.

Another category of results in the context of countermeasure datasets is those that were presented as new. Moreno et al. (2018) developed a database of 300 security-related accidents from European and American sources. The database contained cybersecurity-related events in the chemical and process industry. Damasevicius et al. (2020) proposed a new dataset (LITNET-2020) for network intrusion detection. The dataset is a new annotated network benchmark dataset obtained from a real-world academic network, and it presents real-world examples of normal and under-attack network traffic. With a focus on IoT intrusion detection systems, Alsaedi et al. (2020) proposed new benchmark IoT/IIoT datasets for assessing intrusion detection system-enabled IoT systems. Also in the context of IoT, Vaccari et al. (2020) proposed a dataset focusing on message queue telemetry transport protocols, which can be used to train machine-learning models. To evaluate the performance of machine-learning classifiers, Mahfouz et al. (2020) created a dataset called Game Theory and Cybersecurity (GTCS). A dataset containing 22,000 malware and benign samples was constructed by Martin et al. (2019). The dataset can be used as a benchmark to test algorithms for Android malware classification and clustering techniques. In addition, Laso et al. (2017) presented a dataset created to investigate how data and information quality estimates enable the detection of anomalies and malicious acts in cyber-physical systems. The dataset contained various cyberattacks and is publicly available.

In addition to the results described above, several other studies were found that fit into the category of countermeasures. Johnson et al. ( 2016 ) examined the time between vulnerability disclosures. Using another vulnerabilities database, Common Vulnerabilities and Exposures (CVE), Subroto and Apriyana ( 2019 ) presented an algorithm model that uses big data analysis of social media and statistical machine learning to predict cyber risks. A similar databank but with a different focus, Common Vulnerability Scoring System, was used by Chatterjee and Thekdi ( 2020 ) to present an iterative data-driven learning approach to vulnerability assessment and management for complex systems. Using the CICIDS2017 dataset to evaluate the performance, Malik et al. ( 2020 ) proposed a control plane-based orchestration for varied, sophisticated threats and attacks. The same dataset was used in another study by Lee et al. ( 2019 ), who developed an artificial security information event management system based on a combination of event profiling for data processing and different artificial network methods. To exploit the interdependence between multiple series, Fang et al. ( 2021 ) proposed a statistical framework. In order to validate the framework, the authors applied it to a dataset of enterprise-level security breaches from the Privacy Rights Clearinghouse and Identity Theft Center database. Another framework with a defensive aspect was recommended by Li et al. ( 2021 ) to increase the robustness of deep neural networks against adversarial malware evasion attacks. Sarabi et al. ( 2016 ) investigated whether and to what extent business details can help assess an organisation's risk of data breaches and the distribution of risk across different types of incidents to create policies for protection, detection and recovery from different forms of security incidents. They used data from the VERIS Community Database.

Datasets that have been classified into the cybersecurity category are detailed in Supplementary Table 3. Due to overlap, records from the previous tables may also be included.

Discussion

This paper presented a systematic literature review of studies on cyber risk and cybersecurity that used datasets. Within this framework, 255 studies were fully reviewed and then classified into three different categories. Then, 79 datasets were consolidated from these studies. These datasets were subsequently analysed, and important information was selected through a filtering process. This information was recorded in a table and enhanced with further information as part of the literature analysis. This made it possible to create a comprehensive overview of the datasets. For example, each dataset contains a description of where the data came from and how the data has been used to date. This allows different datasets to be compared and the appropriate dataset for the use case to be selected. This research certainly has limitations, so our selection of datasets cannot necessarily be taken as a representation of all available datasets related to cyber risks and cybersecurity. For example, literature searches were conducted in four academic databases and only found datasets that were used in the literature. Many research projects also used old datasets that may no longer consider current developments. In addition, the data are often focused on only one observation and are limited in scope. For example, the datasets can only be applied to specific contexts and are also subject to further limitations (e.g. region, industry, operating system). In the context of the applicability of the datasets, it is unfortunately not possible to make a clear statement on the extent to which they can be integrated into academic or practical areas of application or how great this effort is. Finally, it remains to be pointed out that this is an overview of currently available datasets, which are subject to constant change.

Due to the lack of datasets on cyber risks in the academic literature, additional datasets on cyber risks were integrated as part of a further search. The search was conducted on the Google Dataset Search portal. The search term used was ‘cyber risk datasets’. Over 100 results were found. However, due to the low significance and verifiability, only 20 selected datasets were included. These can be found in Table 2 in the “Appendix”.

The results of the literature review and datasets also showed that there continues to be a lack of available, open cyber datasets. This lack of data is reflected in cyber insurance, for example, as it is difficult to find a risk-based premium without a sufficient database (Nurse et al. 2020 ). The global cyber insurance market was estimated at USD 5.5 billion in 2020 (Dyson 2020 ). When compared to the USD 1 trillion global losses from cybercrime (Maleks Smith et al. 2020 ), it is clear that there exists a significant cyber risk awareness challenge for both the insurance industry and international commerce. Without comprehensive and qualitative data on cyber losses, it can be difficult to estimate potential losses from cyberattacks and price cyber insurance accordingly (GAO 2021 ). For instance, the average cyber insurance loss increased from USD 145,000 in 2019 to USD 359,000 in 2020 (FitchRatings 2021 ). Cyber insurance is an important risk management tool to mitigate the financial impact of cybercrime. This is particularly evident in the impact of different industries. In the Energy & Commodities financial markets, a ransomware attack on the Colonial Pipeline led to a substantial impact on the U.S. economy. As a result of the attack, about 45% of the U.S. East Coast was temporarily unable to obtain supplies of diesel, petrol and jet fuel. This caused the average price in the U.S. to rise 7 cents to USD 3.04 per gallon, the highest in seven years (Garber 2021 ). In addition, Colonial Pipeline confirmed that it paid a USD 4.4 million ransom to a hacker gang after the attack. Another ransomware attack occurred in the healthcare and government sector. The victim of this attack was the Irish Health Service Executive (HSE). A ransom payment of USD 20 million was demanded from the Irish government to restore services after the hack (Tidy 2021 ). In the car manufacturing sector, Miller and Valasek ( 2015 ) initiated a cyberattack that resulted in the recall of 1.4 million vehicles and cost manufacturers EUR 761 million. The risk that arises in the context of these events is the potential for the accumulation of cyber losses, which is why cyber insurers are not expanding their capacity. An example of this accumulation of cyber risks is the NotPetya malware attack, which originated in Russia, struck in Ukraine, and rapidly spread around the world, causing at least USD 10 billion in damage (GAO 2021 ). These events highlight the importance of proper cyber risk management.

This research provides cyber insurance stakeholders with an overview of cyber datasets. Cyber insurers can use the open datasets to improve their understanding and assessment of cyber risks. For example, the impact datasets can be used to better measure financial impacts and their frequencies. These data could be combined with existing portfolio data from cyber insurers and integrated with existing pricing tools and factors to better assess cyber risk valuation. Although most cyber insurers have sparse historical cyber policy and claims data, they remain too small at present for accurate prediction (Bessy-Roland et al. 2021 ). A combination of portfolio data and external datasets would support risk-adjusted pricing for cyber insurance, which would also benefit policyholders. In addition, cyber insurance stakeholders can use the datasets to identify patterns and make better predictions, which would benefit sustainable cyber insurance coverage. In terms of cyber risk cause datasets, cyber insurers can use the data to review their insurance products. For example, the data could provide information on which cyber risks have not been sufficiently considered in product design or where improvements are needed. A combination of cyber cause and cybersecurity datasets can help establish uniform definitions to provide greater transparency and clarity. Consistent terminology could lead to a more sustainable cyber market, where cyber insurers make informed decisions about the level of coverage and policyholders understand their coverage (The Geneva Association 2020).

In addition to the cyber insurance community, this research also supports cybersecurity stakeholders. The reviewed literature can be used to provide a contemporary, contextual and categorised summary of available datasets. This supports efficient and timely progress in cyber risk research and is beneficial given the dynamic nature of cyber risks. With the help of the described cybersecurity datasets and the identified information, a comparison of different datasets is possible. The datasets can be used to evaluate the effectiveness of countermeasures in simulated cyberattacks or to test intrusion detection systems.

Conclusion

In this paper, we conducted a systematic review of studies on cyber risk and cybersecurity databases. We found that most of the datasets are in the field of intrusion detection and machine learning and are used for technical cybersecurity aspects. The available datasets on cyber risks were relatively less represented. Due to the dynamic nature and lack of historical data, assessing and understanding cyber risk is a major challenge for cyber insurance stakeholders. To address this challenge, a greater density of cyber data is needed to support cyber insurers in risk management and researchers with cyber risk-related topics. With reference to ‘Open Science’ FAIR data (Jacobsen et al. 2020), mandatory reporting of cyber incidents could help improve cyber understanding, awareness and loss prevention among companies and insurers. Through greater availability of data, cyber risks can be better understood, enabling researchers to conduct more in-depth research into these risks. Companies could incorporate this new knowledge into their corporate culture to reduce cyber risks. For insurance companies, this would have the advantage that all insurers would have the same understanding of cyber risks, which would support sustainable risk-based pricing. In addition, common definitions of cyber risks could be derived from new data.

The cybersecurity databases summarised and categorised in this research could provide a different perspective on cyber risks and thereby enable the formulation of common definitions in cyber policies. The datasets can help companies that address cybersecurity and cyber risk as part of their risk management to assess their internal cyber posture and cybersecurity measures. The paper can also help improve risk awareness and corporate behaviour, and it provides the research community with a comprehensive overview of peer-reviewed datasets and other available datasets in the area of cyber risk and cybersecurity. This approach is intended to support the free availability of data for research. The complete tabulated review of the literature is included in the Supplementary Material.

This work provides directions for several paths of future work. First, there are currently few publicly available datasets for cyber risk and cybersecurity. The older datasets that are still widely used no longer reflect today's technical environment; moreover, they can often only be used in one context, and the scope of their samples is very limited. It would be of great value if more publicly available datasets reflected current environmental conditions. This could help intrusion detection systems account for current events and thus achieve higher detection rates, and collecting larger samples with broader contextual information could compensate for the shortcomings of the older datasets. Another area of research could be the integrability and adaptability of cybersecurity and cyber risk datasets. For example, it is often unclear to what extent datasets can be integrated into, or adapted to, existing data. For cyber risk and cybersecurity research, it would be helpful to know what requirements must be met to use the datasets appropriately, and whether datasets designed for one purpose can be modified for the other. Finally, the ability for stakeholders to identify machine-readable cybersecurity datasets would be useful, because it would allow even clearer delineations and comparisons between datasets. Due to the lack of publicly available datasets, concrete benchmarks often cannot be applied.
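
To make the integration question concrete, the sketch below maps two hypothetical intrusion datasets with different column names onto a shared schema before combining them. All column names and values are invented for illustration and do not correspond to any dataset discussed in the review.

# Illustrative sketch of the integration problem raised above: two datasets
# describe similar network flows but use different column names, so they must
# be mapped onto a shared schema before they can be compared or combined.
# All column names and values are hypothetical.

import pandas as pd

# Hypothetical excerpts from two differently labelled datasets
dataset_a = pd.DataFrame({
    "dur": [1.2, 0.4], "src_bytes": [300, 50], "attack_cat": ["dos", "normal"],
})
dataset_b = pd.DataFrame({
    "flow_duration": [2.1], "fwd_bytes": [120], "label": ["portscan"],
})

# Shared schema: target column -> source column in each dataset
SCHEMA = {
    "duration":   {"a": "dur",        "b": "flow_duration"},
    "bytes_sent": {"a": "src_bytes",  "b": "fwd_bytes"},
    "label":      {"a": "attack_cat", "b": "label"},
}

def to_shared_schema(df: pd.DataFrame, source: str) -> pd.DataFrame:
    """Rename one dataset's columns to the shared schema (source is 'a' or 'b')."""
    mapping = {cols[source]: target for target, cols in SCHEMA.items()}
    return df.rename(columns=mapping)[list(SCHEMA)]

combined = pd.concat(
    [to_shared_schema(dataset_a, "a"), to_shared_schema(dataset_b, "b")],
    ignore_index=True,
)
print(combined)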

Average cost of a breach of more than 50 million records.

Aamir, M., S.S.H. Rizvi, M.A. Hashmani, M. Zubair, and J. Ahmad. 2021. Machine learning classification of port scanning and DDoS attacks: A comparative analysis. Mehran University Research Journal of Engineering and Technology 40 (1): 215–229. https://doi.org/10.22581/muet1982.2101.19 .

Aamir, M., and S.M.A. Zaidi. 2019. DDoS attack detection with feature engineering and machine learning: The framework and performance evaluation. International Journal of Information Security 18 (6): 761–785. https://doi.org/10.1007/s10207-019-00434-1 .

Aassal, A. El, S. Baki, A. Das, and R.M. Verma. 2020. An in-depth benchmarking and evaluation of phishing detection research for security needs. IEEE Access 8: 22170–22192. https://doi.org/10.1109/ACCESS.2020.2969780 .

Abu Al-Haija, Q., and S. Zein-Sabatto. 2020. An efficient deep-learning-based detection and classification system for cyber-attacks in IoT communication networks. Electronics 9 (12): 26. https://doi.org/10.3390/electronics9122152 .

Adhikari, U., T.H. Morris, and S.Y. Pan. 2018. Applying Hoeffding adaptive trees for real-time cyber-power event and intrusion classification. IEEE Transactions on Smart Grid 9 (5): 4049–4060. https://doi.org/10.1109/tsg.2017.2647778 .

Agarwal, A., P. Sharma, M. Alshehri, A.A. Mohamed, and O. Alfarraj. 2021. Classification model for accuracy and intrusion detection using machine learning approach. PeerJ Computer Science . https://doi.org/10.7717/peerj-cs.437 .

Agrafiotis, I., J.R.C.. Nurse, M. Goldsmith, S. Creese, and D. Upton. 2018. A taxonomy of cyber-harms: Defining the impacts of cyber-attacks and understanding how they propagate. Journal of Cybersecurity 4: tyy006.

Agrawal, A., S. Mohammed, and J. Fiaidhi. 2019. Ensemble technique for intruder detection in network traffic. International Journal of Security and Its Applications 13 (3): 1–8. https://doi.org/10.33832/ijsia.2019.13.3.01 .

Ahmad, I., and R.A. Alsemmeari. 2020. Towards improving the intrusion detection through ELM (extreme learning machine). CMC Computers Materials & Continua 65 (2): 1097–1111. https://doi.org/10.32604/cmc.2020.011732 .

Ahmed, M., A.N. Mahmood, and J.K. Hu. 2016. A survey of network anomaly detection techniques. Journal of Network and Computer Applications 60: 19–31. https://doi.org/10.1016/j.jnca.2015.11.016 .

Al-Jarrah, O.Y., O. Alhussein, P.D. Yoo, S. Muhaidat, K. Taha, and K. Kim. 2016. Data randomization and cluster-based partitioning for Botnet intrusion detection. IEEE Transactions on Cybernetics 46 (8): 1796–1806. https://doi.org/10.1109/TCYB.2015.2490802 .

Al-Mhiqani, M.N., R. Ahmad, Z.Z. Abidin, W. Yassin, A. Hassan, K.H. Abdulkareem, N.S. Ali, and Z. Yunos. 2020. A review of insider threat detection: Classification, machine learning techniques, datasets, open challenges, and recommendations. Applied Sciences—Basel 10 (15): 41. https://doi.org/10.3390/app10155208 .

Al-Omari, M., M. Rawashdeh, F. Qutaishat, M. Alshira’H, and N. Ababneh. 2021. An intelligent tree-based intrusion detection model for cyber security. Journal of Network and Systems Management 29 (2): 18. https://doi.org/10.1007/s10922-021-09591-y .

Alabdallah, A., and M. Awad. 2018. Using weighted Support Vector Machine to address the imbalanced classes problem of Intrusion Detection System. KSII Transactions on Internet and Information Systems 12 (10): 5143–5158. https://doi.org/10.3837/tiis.2018.10.027 .

Alazab, M., M. Alazab, A. Shalaginov, A. Mesleh, and A. Awajan. 2020. Intelligent mobile malware detection using permission requests and API calls. Future Generation Computer Systems—the International Journal of eScience 107: 509–521. https://doi.org/10.1016/j.future.2020.02.002 .

Albahar, M.A., R.A. Al-Falluji, and M. Binsawad. 2020. An empirical comparison on malicious activity detection using different neural network-based models. IEEE Access 8: 61549–61564. https://doi.org/10.1109/ACCESS.2020.2984157 .

AlEroud, A.F., and G. Karabatis. 2018. Queryable semantics to detect cyber-attacks: A flow-based detection approach. IEEE Transactions on Systems, Man, and Cybernetics: Systems 48 (2): 207–223. https://doi.org/10.1109/TSMC.2016.2600405 .

Algarni, A.M., V. Thayananthan, and Y.K. Malaiya. 2021. Quantitative assessment of cybersecurity risks for mitigating data breaches in business systems. Applied Sciences (switzerland) . https://doi.org/10.3390/app11083678 .

Alhowaide, A., I. Alsmadi, and J. Tang. 2021. Towards the design of real-time autonomous IoT NIDS. Cluster Computing—the Journal of Networks Software Tools and Applications . https://doi.org/10.1007/s10586-021-03231-5 .

Ali, S., and Y. Li. 2019. Learning multilevel auto-encoders for DDoS attack detection in smart grid network. IEEE Access 7: 108647–108659. https://doi.org/10.1109/ACCESS.2019.2933304 .

AlKadi, O., N. Moustafa, B. Turnbull, and K.K.R. Choo. 2019. Mixture localization-based outliers models for securing data migration in cloud centers. IEEE Access 7: 114607–114618. https://doi.org/10.1109/ACCESS.2019.2935142 .

Allianz. 2021. Allianz Risk Barometer. https://www.agcs.allianz.com/content/dam/onemarketing/agcs/agcs/reports/Allianz-Risk-Barometer-2021.pdf . Accessed 15 May 2021.

Almiani, M., A. AbuGhazleh, A. Al-Rahayfeh, S. Atiewi, and A. Razaque. 2020. Deep recurrent neural network for IoT intrusion detection system. Simulation Modelling Practice and Theory 101: 102031. https://doi.org/10.1016/j.simpat.2019.102031 .

Alsaedi, A., N. Moustafa, Z. Tari, A. Mahmood, and A. Anwar. 2020. TON_IoT telemetry dataset: A new generation dataset of IoT and IIoT for data-driven intrusion detection systems. IEEE Access 8: 165130–165150. https://doi.org/10.1109/access.2020.3022862 .

Alsamiri, J., and K. Alsubhi. 2019. Internet of Things cyber attacks detection using machine learning. International Journal of Advanced Computer Science and Applications 10 (12): 627–634.

Alsharafat, W. 2013. Applying artificial neural network and eXtended classifier system for network intrusion detection. International Arab Journal of Information Technology 10 (3): 230–238.

Amin, R.W., H.E. Sevil, S. Kocak, G. Francia III., and P. Hoover. 2021. The spatial analysis of the malicious uniform resource locators (URLs): 2016 dataset case study. Information (switzerland) 12 (1): 1–18. https://doi.org/10.3390/info12010002 .

Arcuri, M.C., L.Z. Gai, F. Ielasi, and E. Ventisette. 2020. Cyber attacks on hospitality sector: Stock market reaction. Journal of Hospitality and Tourism Technology 11 (2): 277–290. https://doi.org/10.1108/jhtt-05-2019-0080 .

Arp, D., M. Spreitzenbarth, M. Hubner, H. Gascon, K. Rieck, and C.E.R.T. Siemens. 2014. Drebin: Effective and explainable detection of android malware in your pocket. In NDSS 14: 23–26.

Ashtiani, M., and M.A. Azgomi. 2014. A distributed simulation framework for modeling cyber attacks and the evaluation of security measures. Simulation 90 (9): 1071–1102. https://doi.org/10.1177/0037549714540221 .

Atefinia, R., and M. Ahmadi. 2021. Network intrusion detection using multi-architectural modular deep neural network. Journal of Supercomputing 77 (4): 3571–3593. https://doi.org/10.1007/s11227-020-03410-y .

Avila, R., R. Khoury, R. Khoury, and F. Petrillo. 2021. Use of security logs for data leak detection: A systematic literature review. Security and Communication Networks 2021: 29. https://doi.org/10.1155/2021/6615899 .

Azeez, N.A., T.J. Ayemobola, S. Misra, R. Maskeliunas, and R. Damasevicius. 2019. Network Intrusion Detection with a Hashing Based Apriori Algorithm Using Hadoop MapReduce. Computers 8 (4): 15. https://doi.org/10.3390/computers8040086 .

Bakdash, J.Z., S. Hutchinson, E.G. Zaroukian, L.R. Marusich, S. Thirumuruganathan, C. Sample, B. Hoffman, and G. Das. 2018. Malware in the future forecasting of analyst detection of cyber events. Journal of Cybersecurity . https://doi.org/10.1093/cybsec/tyy007 .

Barletta, V.S., D. Caivano, A. Nannavecchia, and M. Scalera. 2020. Intrusion detection for in-vehicle communication networks: An unsupervised Kohonen SOM approach. Future Internet . https://doi.org/10.3390/FI12070119 .

Barzegar, M., and M. Shajari. 2018. Attack scenario reconstruction using intrusion semantics. Expert Systems with Applications 108: 119–133. https://doi.org/10.1016/j.eswa.2018.04.030 .

Bessy-Roland, Y., A. Boumezoued, and C. Hillairet. 2021. Multivariate Hawkes process for cyber insurance. Annals of Actuarial Science 15 (1): 14–39.

Bhardwaj, A., V. Mangat, and R. Vig. 2020. Hyperband tuned deep neural network with well posed stacked sparse AutoEncoder for detection of DDoS attacks in cloud. IEEE Access 8: 181916–181929. https://doi.org/10.1109/ACCESS.2020.3028690 .

Bhati, B.S., C.S. Rai, B. Balamurugan, and F. Al-Turjman. 2020. An intrusion detection scheme based on the ensemble of discriminant classifiers. Computers & Electrical Engineering 86: 9. https://doi.org/10.1016/j.compeleceng.2020.106742 .

Bhattacharya, S., S.S.R. Krishnan, P.K.R. Maddikunta, R. Kaluri, S. Singh, T.R. Gadekallu, M. Alazab, and U. Tariq. 2020. A novel PCA-firefly based XGBoost classification model for intrusion detection in networks using GPU. Electronics 9 (2): 16. https://doi.org/10.3390/electronics9020219 .

Bibi, I., A. Akhunzada, J. Malik, J. Iqbal, A. Musaddiq, and S. Kim. 2020. A dynamic DL-driven architecture to combat sophisticated android malware. IEEE Access 8: 129600–129612. https://doi.org/10.1109/ACCESS.2020.3009819 .

Biener, C., M. Eling, and J.H. Wirfs. 2015. Insurability of cyber risk: An empirical analysis. The   Geneva Papers on Risk and Insurance—Issues and Practice 40 (1): 131–158. https://doi.org/10.1057/gpp.2014.19 .

Binbusayyis, A., and T. Vaiyapuri. 2019. Identifying and benchmarking key features for cyber intrusion detection: An ensemble approach. IEEE Access 7: 106495–106513. https://doi.org/10.1109/ACCESS.2019.2929487 .

Biswas, R., and S. Roy. 2021. Botnet traffic identification using neural networks. Multimedia Tools and Applications . https://doi.org/10.1007/s11042-021-10765-8 .

Bouyeddou, B., F. Harrou, B. Kadri, and Y. Sun. 2021. Detecting network cyber-attacks using an integrated statistical approach. Cluster Computing—the Journal of Networks Software Tools and Applications 24 (2): 1435–1453. https://doi.org/10.1007/s10586-020-03203-1 .

Bozkir, A.S., and M. Aydos. 2020. LogoSENSE: A companion HOG based logo detection scheme for phishing web page and E-mail brand recognition. Computers & Security 95: 18. https://doi.org/10.1016/j.cose.2020.101855 .

Brower, D., and M. McCormick. 2021. Colonial pipeline resumes operations following ransomware attack. Financial Times .

Cai, H., F. Zhang, and A. Levi. 2019. An unsupervised method for detecting shilling attacks in recommender systems by mining item relationship and identifying target items. The Computer Journal 62 (4): 579–597. https://doi.org/10.1093/comjnl/bxy124 .

Cebula, J.J., M.E. Popeck, and L.R. Young. 2014. A Taxonomy of Operational Cyber Security Risks Version 2 .

Chadza, T., K.G. Kyriakopoulos, and S. Lambotharan. 2020. Learning to learn sequential network attacks using hidden Markov models. IEEE Access 8: 134480–134497. https://doi.org/10.1109/ACCESS.2020.3011293 .

Chatterjee, S., and S. Thekdi. 2020. An iterative learning and inference approach to managing dynamic cyber vulnerabilities of complex systems. Reliability Engineering and System Safety . https://doi.org/10.1016/j.ress.2019.106664 .

Chattopadhyay, M., R. Sen, and S. Gupta. 2018. A comprehensive review and meta-analysis on applications of machine learning techniques in intrusion detection. Australasian Journal of Information Systems 22: 27.

Chen, H.S., and J. Fiscus. 2018. The inhospitable vulnerability: A need for cybersecurity risk assessment in the hospitality industry. Journal of Hospitality and Tourism Technology 9 (2): 223–234. https://doi.org/10.1108/JHTT-07-2017-0044 .

Chhabra, G.S., V.P. Singh, and M. Singh. 2020. Cyber forensics framework for big data analytics in IoT environment using machine learning. Multimedia Tools and Applications 79 (23–24): 15881–15900. https://doi.org/10.1007/s11042-018-6338-1 .

Chiba, Z., N. Abghour, K. Moussaid, A. Elomri, and M. Rida. 2019. Intelligent approach to build a Deep Neural Network based IDS for cloud environment using combination of machine learning algorithms. Computers and Security 86: 291–317. https://doi.org/10.1016/j.cose.2019.06.013 .

Choras, M., and R. Kozik. 2015. Machine learning techniques applied to detect cyber attacks on web applications. Logic Journal of the IGPL 23 (1): 45–56. https://doi.org/10.1093/jigpal/jzu038 .

Chowdhury, S., M. Khanzadeh, R. Akula, F. Zhang, S. Zhang, H. Medal, M. Marufuzzaman, and L. Bian. 2017. Botnet detection using graph-based feature clustering. Journal of Big Data 4 (1): 14. https://doi.org/10.1186/s40537-017-0074-7 .

Cybersecurity & Infrastructure Security Agency (CISA). 2020. Cost of a cyber incident: Systematic review and cross-validation. https://www.cisa.gov/sites/default/files/publications/CISA-OCE_Cost_of_Cyber_Incidents_Study-FINAL_508.pdf .

D’Hooge, L., T. Wauters, B. Volckaert, and F. De Turck. 2019. Classification hardness for supervised learners on 20 years of intrusion detection data. IEEE Access 7: 167455–167469. https://doi.org/10.1109/access.2019.2953451 .

Damasevicius, R., A. Venckauskas, S. Grigaliunas, J. Toldinas, N. Morkevicius, T. Aleliunas, and P. Smuikys. 2020. LITNET-2020: An annotated real-world network flow dataset for network intrusion detection. Electronics 9 (5): 23. https://doi.org/10.3390/electronics9050800 .

De Giovanni, A.L.D., and M. Pirra. 2020. On the determinants of data breaches: A cointegration analysis. Decisions in Economics and Finance . https://doi.org/10.1007/s10203-020-00301-y .

Deng, L., D. Li, X. Yao, and H. Wang. 2019. Retracted Article: Mobile network intrusion detection for IoT system based on transfer learning algorithm. Cluster Computing 22 (4): 9889–9904. https://doi.org/10.1007/s10586-018-1847-2 .

Donkal, G., and G.K. Verma. 2018. A multimodal fusion based framework to reinforce IDS for securing Big Data environment using Spark. Journal of Information Security and Applications 43: 1–11. https://doi.org/10.1016/j.jisa.2018.10.001 .

Dunn, C., N. Moustafa, and B. Turnbull. 2020. Robustness evaluations of sustainable machine learning models against data Poisoning attacks in the Internet of Things. Sustainability 12 (16): 17. https://doi.org/10.3390/su12166434 .

Dwivedi, S., M. Vardhan, and S. Tripathi. 2021. Multi-parallel adaptive grasshopper optimization technique for detecting anonymous attacks in wireless networks. Wireless Personal Communications . https://doi.org/10.1007/s11277-021-08368-5 .

Dyson, B. 2020. COVID-19 crisis could be ‘watershed’ for cyber insurance, says Swiss Re exec. https://www.spglobal.com/marketintelligence/en/news-insights/latest-news-headlines/covid-19-crisis-could-be-watershed-for-cyber-insurance-says-swiss-re-exec-59197154 . Accessed 7 May 2020.

EIOPA. 2018. Understanding cyber insurance—a structured dialogue with insurance companies. https://www.eiopa.europa.eu/sites/default/files/publications/reports/eiopa_understanding_cyber_insurance.pdf . Accessed 28 May 2018

Elijah, A.V., A. Abdullah, N.Z. JhanJhi, M. Supramaniam, and O.B. Abdullateef. 2019. Ensemble and deep-learning methods for two-class and multi-attack anomaly intrusion detection: An empirical study. International Journal of Advanced Computer Science and Applications 10 (9): 520–528.

Eling, M., and K. Jung. 2018. Copula approaches for modeling cross-sectional dependence of data breach losses. Insurance Mathematics & Economics 82: 167–180. https://doi.org/10.1016/j.insmatheco.2018.07.003 .

Eling, M., and W. Schnell. 2016. What do we know about cyber risk and cyber risk insurance? Journal of Risk Finance 17 (5): 474–491. https://doi.org/10.1108/jrf-09-2016-0122 .

Eling, M., and J. Wirfs. 2019. What are the actual costs of cyber risk events? European Journal of Operational Research 272 (3): 1109–1119. https://doi.org/10.1016/j.ejor.2018.07.021 .

Eling, M. 2020. Cyber risk research in business and actuarial science. European Actuarial Journal 10 (2): 303–333.

Elmasry, W., A. Akbulut, and A.H. Zaim. 2019. Empirical study on multiclass classification-based network intrusion detection. Computational Intelligence 35 (4): 919–954. https://doi.org/10.1111/coin.12220 .

Elsaid, S.A., and N.S. Albatati. 2020. An optimized collaborative intrusion detection system for wireless sensor networks. Soft Computing 24 (16): 12553–12567. https://doi.org/10.1007/s00500-020-04695-0 .

Estepa, R., J.E. Díaz-Verdejo, A. Estepa, and G. Madinabeitia. 2020. How much training data is enough? A case study for HTTP anomaly-based intrusion detection. IEEE Access 8: 44410–44425. https://doi.org/10.1109/ACCESS.2020.2977591 .

European Council. 2021. Cybersecurity: how the EU tackles cyber threats. https://www.consilium.europa.eu/en/policies/cybersecurity/ . Accessed 10 May 2021

Falco, G. et al. 2019. Cyber risk research impeded by disciplinary barriers. Science 366 (6469): 1066–1069.

Fan, Z.J., Z.P. Tan, C.X. Tan, and X. Li. 2018. An improved integrated prediction method of cyber security situation based on spatial-time analysis. Journal of Internet Technology 19 (6): 1789–1800. https://doi.org/10.3966/160792642018111906015 .

Fang, Z.J., M.C. Xu, S.H. Xu, and T.Z. Hu. 2021. A framework for predicting data breach risk: Leveraging dependence to cope with sparsity. IEEE Transactions on Information Forensics and Security 16: 2186–2201. https://doi.org/10.1109/tifs.2021.3051804 .

Farkas, S., O. Lopez, and M. Thomas. 2021. Cyber claim analysis using Generalized Pareto regression trees with applications to insurance. Insurance: Mathematics and Economics 98: 92–105. https://doi.org/10.1016/j.insmatheco.2021.02.009 .

Farsi, H., A. Fanian, and Z. Taghiyarrenani. 2019. A novel online state-based anomaly detection system for process control networks. International Journal of Critical Infrastructure Protection 27: 11. https://doi.org/10.1016/j.ijcip.2019.100323 .

Ferrag, M.A., L. Maglaras, S. Moschoyiannis, and H. Janicke. 2020. Deep learning for cyber security intrusion detection: Approaches, datasets, and comparative study. Journal of Information Security and Applications 50: 19. https://doi.org/10.1016/j.jisa.2019.102419 .

Field, M. 2018. WannaCry cyber attack cost the NHS £92m as 19,000 appointments cancelled. https://www.telegraph.co.uk/technology/2018/10/11/wannacry-cyber-attack-cost-nhs-92m-19000-appointments-cancelled/ . Accessed 9 May 2018.

FitchRatings. 2021. U.S. Cyber Insurance Market Update (Spike in Claims Leads to Decline in 2020 Underwriting Performance). https://www.fitchratings.com/research/insurance/us-cyber-insurance-market-update-spike-in-claims-leads-to-decline-in-2020-underwriting-performance-26-05-2021 .

Fossaceca, J.M., T.A. Mazzuchi, and S. Sarkani. 2015. MARK-ELM: Application of a novel Multiple Kernel Learning framework for improving the robustness of network intrusion detection. Expert Systems with Applications 42 (8): 4062–4080. https://doi.org/10.1016/j.eswa.2014.12.040 .

Franke, U., and J. Brynielsson. 2014. Cyber situational awareness–a systematic review of the literature. Computers & security 46: 18–31.

Freeha, K., K.J. Hwan, M. Lars, and M. Robin. 2021. Data breach management: An integrated risk model. Information & Management 58 (1): 103392. https://doi.org/10.1016/j.im.2020.103392 .

Ganeshan, R., and P. Rodrigues. 2020. Crow-AFL: Crow based adaptive fractional lion optimization approach for the intrusion detection. Wireless Personal Communications 111 (4): 2065–2089. https://doi.org/10.1007/s11277-019-06972-0 .

GAO. 2021. CYBER INSURANCE—Insurers and policyholders face challenges in an evolving market. https://www.gao.gov/assets/gao-21-477.pdf . Accessed 16 May 2021.

Garber, J. 2021. Colonial Pipeline fiasco foreshadows impact of Biden energy policy. https://www.foxbusiness.com/markets/colonial-pipeline-fiasco-foreshadows-impact-of-biden-energy-policy . Accessed 4 May 2021.

Gauthama Raman, M.R., N. Somu, S. Jagarapu, T. Manghnani, T. Selvam, K. Krithivasan, and V.S. Shankar Sriram. 2020. An efficient intrusion detection technique based on support vector machine and improved binary gravitational search algorithm. Artificial Intelligence Review 53 (5): 3255–3286. https://doi.org/10.1007/s10462-019-09762-z .

Gavel, S., A.S. Raghuvanshi, and S. Tiwari. 2021. Distributed intrusion detection scheme using dual-axis dimensionality reduction for Internet of things (IoT). Journal of Supercomputing . https://doi.org/10.1007/s11227-021-03697-5 .

GDPR.EU. 2021. FAQ. https://gdpr.eu/faq/ . Accessed 10 May 2021.

Georgescu, T.M., B. Iancu, and M. Zurini. 2019. Named-entity-recognition-based automated system for diagnosing cybersecurity situations in IoT networks. Sensors (switzerland) . https://doi.org/10.3390/s19153380 .

Giudici, P., and E. Raffinetti. 2020. Cyber risk ordering with rank-based statistical models. AStA Advances in Statistical Analysis . https://doi.org/10.1007/s10182-020-00387-0 .

Goh, J., S. Adepu, K.N. Junejo, and A. Mathur. 2016. A dataset to support research in the design of secure water treatment systems. In CRITIS.

Gong, X.Y., J.L. Lu, Y.F. Zhou, H. Qiu, and R. He. 2021. Model uncertainty based annotation error fixing for web attack detection. Journal of Signal Processing Systems for Signal Image and Video Technology 93 (2–3): 187–199. https://doi.org/10.1007/s11265-019-01494-1 .

Goode, S., H. Hoehle, V. Venkatesh, and S.A. Brown. 2017. USER compensation as a data breach recovery action: An investigation of the sony playstation network breach. MIS Quarterly 41 (3): 703–727.

Guo, H., S. Huang, C. Huang, Z. Pan, M. Zhang, and F. Shi. 2020. File entropy signal analysis combined with wavelet decomposition for malware classification. IEEE Access 8: 158961–158971. https://doi.org/10.1109/ACCESS.2020.3020330 .

Habib, M., I. Aljarah, and H. Faris. 2020. A Modified multi-objective particle swarm optimizer-based Lévy flight: An approach toward intrusion detection in Internet of Things. Arabian Journal for Science and Engineering 45 (8): 6081–6108. https://doi.org/10.1007/s13369-020-04476-9 .

Hajj, S., R. El Sibai, J.B. Abdo, J. Demerjian, A. Makhoul, and C. Guyeux. 2021. Anomaly-based intrusion detection systems: The requirements, methods, measurements, and datasets. Transactions on Emerging Telecommunications Technologies 32 (4): 36. https://doi.org/10.1002/ett.4240 .

Heartfield, R., G. Loukas, A. Bezemskij, and E. Panaousis. 2021. Self-configurable cyber-physical intrusion detection for smart homes using reinforcement learning. IEEE Transactions on Information Forensics and Security 16: 1720–1735. https://doi.org/10.1109/tifs.2020.3042049 .

Hemo, B., T. Gafni, K. Cohen, and Q. Zhao. 2020. Searching for anomalies over composite hypotheses. IEEE Transactions on Signal Processing 68: 1181–1196. https://doi.org/10.1109/TSP.2020.2971438

Hindy, H., D. Brosset, E. Bayne, A.K. Seeam, C. Tachtatzis, R. Atkinson, and X. Bellekens. 2020. A taxonomy of network threats and the effect of current datasets on intrusion detection systems. IEEE Access 8: 104650–104675. https://doi.org/10.1109/ACCESS.2020.3000179 .

Hong, W., D. Huang, C. Chen, and J. Lee. 2020. Towards accurate and efficient classification of power system contingencies and cyber-attacks using recurrent neural networks. IEEE Access 8: 123297–123309. https://doi.org/10.1109/ACCESS.2020.3007609 .

Husák, M., M. Zádník, V. Bartos, and P. Sokol. 2020. Dataset of intrusion detection alerts from a sharing platform. Data in Brief 33: 106530.

IBM Security. 2020. Cost of a Data breach Report. https://www.capita.com/sites/g/files/nginej291/files/2020-08/Ponemon-Global-Cost-of-Data-Breach-Study-2020.pdf . Accessed 19 May 2021.

IEEE. 2021. IEEE Quick Facts. https://www.ieee.org/about/at-a-glance.html . Accessed 11 May 2021.

Jaber, A.N., and S. Ul Rehman. 2020. FCM-SVM based intrusion detection system for cloud computing environment. Cluster Computing—the Journal of Networks Software Tools and Applications 23 (4): 3221–3231. https://doi.org/10.1007/s10586-020-03082-6 .

Jacobs, J., S. Romanosky, B. Edwards, M. Roytman, and I. Adjerid. 2019. Exploit prediction scoring system (epss). arXiv:1908.04856

Jacobsen, A. et al. 2020. FAIR principles: Interpretations and implementation considerations. Data Intelligence 2 (1–2): 10–29. https://doi.org/10.1162/dint_r_00024 .

Jahromi, A.N., S. Hashemi, A. Dehghantanha, R.M. Parizi, and K.K.R. Choo. 2020. An enhanced stacked LSTM method with no random initialization for malware threat hunting in safety and time-critical systems. IEEE Transactions on Emerging Topics in Computational Intelligence 4 (5): 630–640. https://doi.org/10.1109/TETCI.2019.2910243 .

Jang, S., S. Li, and Y. Sung. 2020. FastText-based local feature visualization algorithm for merged image-based malware classification framework for cyber security and cyber defense. Mathematics 8 (3): 13. https://doi.org/10.3390/math8030460 .

Javeed, D., T.H. Gao, and M.T. Khan. 2021. SDN-enabled hybrid DL-driven framework for the detection of emerging cyber threats in IoT. Electronics 10 (8): 16. https://doi.org/10.3390/electronics10080918 .

Johnson, P., D. Gorton, R. Lagerstrom, and M. Ekstedt. 2016. Time between vulnerability disclosures: A measure of software product vulnerability. Computers & Security 62: 278–295. https://doi.org/10.1016/j.cose.2016.08.004 .

Johnson, P., R. Lagerström, M. Ekstedt, and U. Franke. 2018. Can the common vulnerability scoring system be trusted? A Bayesian analysis. IEEE Transactions on Dependable and Secure Computing 15 (6): 1002–1015. https://doi.org/10.1109/TDSC.2016.2644614 .

Junger, M., V. Wang, and M. Schlömer. 2020. Fraud against businesses both online and offline: Crime scripts, business characteristics, efforts, and benefits. Crime Science 9 (1): 13. https://doi.org/10.1186/s40163-020-00119-4 .

Kalutarage, H.K., H.N. Nguyen, and S.A. Shaikh. 2017. Towards a threat assessment framework for apps collusion. Telecommunication Systems 66 (3): 417–430. https://doi.org/10.1007/s11235-017-0296-1 .

Kamarudin, M.H., C. Maple, T. Watson, and N.S. Safa. 2017. A LogitBoost-based algorithm for detecting known and unknown web attacks. IEEE Access 5: 26190–26200. https://doi.org/10.1109/ACCESS.2017.2766844 .

Kasongo, S.M., and Y.X. Sun. 2020. A deep learning method with wrapper based feature extraction for wireless intrusion detection system. Computers & Security 92: 15. https://doi.org/10.1016/j.cose.2020.101752 .

Keserwani, P.K., M.C. Govil, E.S. Pilli, and P. Govil. 2021. A smart anomaly-based intrusion detection system for the Internet of Things (IoT) network using GWO–PSO–RF model. Journal of Reliable Intelligent Environments 7 (1): 3–21. https://doi.org/10.1007/s40860-020-00126-x .

Keshk, M., E. Sitnikova, N. Moustafa, J. Hu, and I. Khalil. 2021. An integrated framework for privacy-preserving based anomaly detection for cyber-physical systems. IEEE Transactions on Sustainable Computing 6 (1): 66–79. https://doi.org/10.1109/TSUSC.2019.2906657 .

Khan, I.A., D.C. Pi, A.K. Bhatia, N. Khan, W. Haider, and A. Wahab. 2020. Generating realistic IoT-based IDS dataset centred on fuzzy qualitative modelling for cyber-physical systems. Electronics Letters 56 (9): 441–443. https://doi.org/10.1049/el.2019.4158 .

Khraisat, A., I. Gondal, P. Vamplew, J. Kamruzzaman, and A. Alazab. 2020. Hybrid intrusion detection system based on the stacking ensemble of C5 decision tree classifier and one class support vector machine. Electronics 9 (1): 18. https://doi.org/10.3390/electronics9010173 .

Khraisat, A., I. Gondal, P. Vamplew, and J. Kamruzzaman. 2019. Survey of intrusion detection systems: Techniques, datasets and challenges. Cybersecurity 2 (1): 20. https://doi.org/10.1186/s42400-019-0038-7 .

Kilincer, I.F., F. Ertam, and A. Sengur. 2021. Machine learning methods for cyber security intrusion detection: Datasets and comparative study. Computer Networks 188: 16. https://doi.org/10.1016/j.comnet.2021.107840 .

Kim, D., and H.K. Kim. 2019. Automated dataset generation system for collaborative research of cyber threat analysis. Security and Communication Networks 2019: 10. https://doi.org/10.1155/2019/6268476 .

Kim, G., C. Lee, J. Jo, and H. Lim. 2020. Automatic extraction of named entities of cyber threats using a deep Bi-LSTM-CRF network. International Journal of Machine Learning and Cybernetics 11 (10): 2341–2355. https://doi.org/10.1007/s13042-020-01122-6 .

Kirubavathi, G., and R. Anitha. 2016. Botnet detection via mining of traffic flow characteristics. Computers & Electrical Engineering 50: 91–101. https://doi.org/10.1016/j.compeleceng.2016.01.012 .

Kiwia, D., A. Dehghantanha, K.K.R. Choo, and J. Slaughter. 2018. A cyber kill chain based taxonomy of banking Trojans for evolutionary computational intelligence. Journal of Computational Science 27: 394–409. https://doi.org/10.1016/j.jocs.2017.10.020 .

Koroniotis, N., N. Moustafa, and E. Sitnikova. 2020. A new network forensic framework based on deep learning for Internet of Things networks: A particle deep framework. Future Generation Computer Systems 110: 91–106. https://doi.org/10.1016/j.future.2020.03.042 .

Kruse, C.S., B. Frederick, T. Jacobson, and D. Kyle Monticone. 2017. Cybersecurity in healthcare: A systematic review of modern threats and trends. Technology and Health Care 25 (1): 1–10.

Kshetri, N. 2018. The economics of cyber-insurance. IT Professional 20 (6): 9–14. https://doi.org/10.1109/MITP.2018.2874210 .

Kumar, R., P. Kumar, R. Tripathi, G.P. Gupta, T.R. Gadekallu, and G. Srivastava. 2021. SP2F: A secured privacy-preserving framework for smart agricultural Unmanned Aerial Vehicles. Computer Networks . https://doi.org/10.1016/j.comnet.2021.107819 .

Kumar, R., and R. Tripathi. 2021. DBTP2SF: A deep blockchain-based trustworthy privacy-preserving secured framework in industrial internet of things systems. Transactions on Emerging Telecommunications Technologies 32 (4): 27. https://doi.org/10.1002/ett.4222 .

Laso, P.M., D. Brosset, and J. Puentes. 2017. Dataset of anomalies and malicious acts in a cyber-physical subsystem. Data in Brief 14: 186–191. https://doi.org/10.1016/j.dib.2017.07.038 .

Lee, J., J. Kim, I. Kim, and K. Han. 2019. Cyber threat detection based on artificial neural networks using event profiles. IEEE Access 7: 165607–165626. https://doi.org/10.1109/ACCESS.2019.2953095 .

Lee, S.J., P.D. Yoo, A.T. Asyhari, Y. Jhi, L. Chermak, C.Y. Yeun, and K. Taha. 2020. IMPACT: Impersonation attack detection via edge computing using deep Autoencoder and feature abstraction. IEEE Access 8: 65520–65529. https://doi.org/10.1109/ACCESS.2020.2985089 .

Leong, Y.-Y., and Y.-C. Chen. 2020. Cyber risk cost and management in IoT devices-linked health insurance. The Geneva Papers on Risk and Insurance—Issues and Practice 45 (4): 737–759. https://doi.org/10.1057/s41288-020-00169-4 .

Levi, M. 2017. Assessing the trends, scale and nature of economic cybercrimes: Overview and issues. Crime, Law and Social Change 67 (1): 3–20. https://doi.org/10.1007/s10611-016-9645-3 .

Li, C., K. Mills, D. Niu, R. Zhu, H. Zhang, and H. Kinawi. 2019a. Android malware detection based on factorization machine. IEEE Access 7: 184008–184019. https://doi.org/10.1109/ACCESS.2019.2958927 .

Li, D.Q., and Q.M. Li. 2020. Adversarial deep ensemble: evasion attacks and defenses for malware detection. IEEE Transactions on Information Forensics and Security 15: 3886–3900. https://doi.org/10.1109/tifs.2020.3003571 .

Li, D.Q., Q.M. Li, Y.F. Ye, and S.H. Xu. 2021. A framework for enhancing deep neural networks against adversarial malware. IEEE Transactions on Network Science and Engineering 8 (1): 736–750. https://doi.org/10.1109/tnse.2021.3051354 .

Li, R.H., C. Zhang, C. Feng, X. Zhang, and C.J. Tang. 2019b. Locating vulnerability in binaries using deep neural networks. IEEE Access 7: 134660–134676. https://doi.org/10.1109/access.2019.2942043 .

Li, X., M. Xu, P. Vijayakumar, N. Kumar, and X. Liu. 2020. Detection of low-frequency and multi-stage attacks in industrial Internet of Things. IEEE Transactions on Vehicular Technology 69 (8): 8820–8831. https://doi.org/10.1109/TVT.2020.2995133 .

Liu, H.Y., and B. Lang. 2019. Machine learning and deep learning methods for intrusion detection systems: A survey. Applied Sciences—Basel 9 (20): 28. https://doi.org/10.3390/app9204396 .

Lopez-Martin, M., B. Carro, and A. Sanchez-Esguevillas. 2020. Application of deep reinforcement learning to intrusion detection for supervised problems. Expert Systems with Applications . https://doi.org/10.1016/j.eswa.2019.112963 .

Loukas, G., D. Gan, and T. Vuong. 2013. A review of cyber threats and defence approaches in emergency management. Future Internet 5: 205–236.

Luo, C.C., S. Su, Y.B. Sun, Q.J. Tan, M. Han, and Z.H. Tian. 2020. A convolution-based system for malicious URLs detection. CMC—Computers Materials Continua 62 (1): 399–411.

Mahbooba, B., M. Timilsina, R. Sahal, and M. Serrano. 2021. Explainable artificial intelligence (XAI) to enhance trust management in intrusion detection systems using decision tree model. Complexity 2021: 11. https://doi.org/10.1155/2021/6634811 .

Mahdavifar, S., and A.A. Ghorbani. 2020. DeNNeS: Deep embedded neural network expert system for detecting cyber attacks. Neural Computing & Applications 32 (18): 14753–14780. https://doi.org/10.1007/s00521-020-04830-w .

Mahfouz, A., A. Abuhussein, D. Venugopal, and S. Shiva. 2020. Ensemble classifiers for network intrusion detection using a novel network attack dataset. Future Internet 12 (11): 1–19. https://doi.org/10.3390/fi12110180 .

Maleks Smith, Z., E. Lostri, and J.A. Lewis. 2020. The hidden costs of cybercrime. https://www.mcafee.com/enterprise/en-us/assets/reports/rp-hidden-costs-of-cybercrime.pdf . Accessed 16 May 2021.

Malik, J., A. Akhunzada, I. Bibi, M. Imran, A. Musaddiq, and S.W. Kim. 2020. Hybrid deep learning: An efficient reconnaissance and surveillance detection mechanism in SDN. IEEE Access 8: 134695–134706. https://doi.org/10.1109/ACCESS.2020.3009849 .

Manimurugan, S. 2020. IoT-Fog-Cloud model for anomaly detection using improved Naive Bayes and principal component analysis. Journal of Ambient Intelligence and Humanized Computing . https://doi.org/10.1007/s12652-020-02723-3 .

Martin, A., R. Lara-Cabrera, and D. Camacho. 2019. Android malware detection through hybrid features fusion and ensemble classifiers: The AndroPyTool framework and the OmniDroid dataset. Information Fusion 52: 128–142. https://doi.org/10.1016/j.inffus.2018.12.006 .

Mauro, M.D., G. Galatro, and A. Liotta. 2020. Experimental review of neural-based approaches for network intrusion management. IEEE Transactions on Network and Service Management 17 (4): 2480–2495. https://doi.org/10.1109/TNSM.2020.3024225 .

McLeod, A., and D. Dolezel. 2018. Cyber-analytics: Modeling factors associated with healthcare data breaches. Decision Support Systems 108: 57–68. https://doi.org/10.1016/j.dss.2018.02.007 .

Meira, J., R. Andrade, I. Praca, J. Carneiro, V. Bolon-Canedo, A. Alonso-Betanzos, and G. Marreiros. 2020. Performance evaluation of unsupervised techniques in cyber-attack anomaly detection. Journal of Ambient Intelligence and Humanized Computing 11 (11): 4477–4489. https://doi.org/10.1007/s12652-019-01417-9 .

Miao, Y., J. Ma, X. Liu, J. Weng, H. Li, and H. Li. 2019. Lightweight fine-grained search over encrypted data in Fog computing. IEEE Transactions on Services Computing 12 (5): 772–785. https://doi.org/10.1109/TSC.2018.2823309 .

Miller, C., and C. Valasek. 2015. Remote exploitation of an unaltered passenger vehicle. Black Hat USA 2015 (S 91).

Mireles, J.D., E. Ficke, J.H. Cho, P. Hurley, and S.H. Xu. 2019. Metrics towards measuring cyber agility. IEEE Transactions on Information Forensics and Security 14 (12): 3217–3232. https://doi.org/10.1109/tifs.2019.2912551 .

Mishra, N., and S. Pandya. 2021. Internet of Things applications, security challenges, attacks, intrusion detection, and future visions: A systematic review. IEEE Access . https://doi.org/10.1109/ACCESS.2021.3073408 .

Monshizadeh, M., V. Khatri, B.G. Atli, R. Kantola, and Z. Yan. 2019. Performance evaluation of a combined anomaly detection platform. IEEE Access 7: 100964–100978. https://doi.org/10.1109/ACCESS.2019.2930832 .

Moreno, V.C., G. Reniers, E. Salzano, and V. Cozzani. 2018. Analysis of physical and cyber security-related events in the chemical and process industry. Process Safety and Environmental Protection 116: 621–631. https://doi.org/10.1016/j.psep.2018.03.026 .

Moro, E.D. 2020. Towards an economic cyber loss index for parametric cover based on IT security indicator: A preliminary analysis. Risks . https://doi.org/10.3390/risks8020045 .

Moustafa, N., E. Adi, B. Turnbull, and J. Hu. 2018. A new threat intelligence scheme for safeguarding industry 4.0 systems. IEEE Access 6: 32910–32924. https://doi.org/10.1109/ACCESS.2018.2844794 .

Moustakidis, S., and P. Karlsson. 2020. A novel feature extraction methodology using Siamese convolutional neural networks for intrusion detection. Cybersecurity . https://doi.org/10.1186/s42400-020-00056-4 .

Mukhopadhyay, A., S. Chatterjee, K.K. Bagchi, P.J. Kirs, and G.K. Shukla. 2019. Cyber Risk Assessment and Mitigation (CRAM) framework using Logit and Probit models for cyber insurance. Information Systems Frontiers 21 (5): 997–1018. https://doi.org/10.1007/s10796-017-9808-5 .

Murphey, H. 2021a. Biden signs executive order to strengthen US cyber security. https://www.ft.com/content/4d808359-b504-4014-85f6-68e7a2851bf1?accessToken=zwAAAXl0_ifgkc9NgINZtQRAFNOF9mjnooUb8Q.MEYCIQDw46SFWsMn1iyuz3kvgAmn6mxc0rIVfw10Lg1ovJSfJwIhAK2X2URzfSqHwIS7ddRCvSt2nGC2DcdoiDTG49-4TeEt&sharetype=gift?token=fbcd6323-1ecf-4fc3-b136-b5b0dd6a8756 . Accessed 7 May 2021.

Murphey, H. 2021b. Millions of connected devices have security flaws, study shows. https://www.ft.com/content/0bf92003-926d-4dee-87d7-b01f7c3e9621?accessToken=zwAAAXnA7f2Ikc8L-SADkm1N7tOH17AffD6WIQ.MEQCIDjBuROvhmYV0Mx3iB0cEV7m5oND1uaCICxJu0mzxM0PAiBam98q9zfHiTB6hKGr1gGl0Azt85yazdpX9K5sI8se3Q&sharetype=gift?token=2538218d-77d9-4dd3-9649-3cb556a34e51 . Accessed 6 May 2021.

Murugesan, V., M. Shalinie, and M.H. Yang. 2018. Design and analysis of hybrid single packet IP traceback scheme. IET Networks 7 (3): 141–151. https://doi.org/10.1049/iet-net.2017.0115 .

Mwitondi, K.S., and S.A. Zargari. 2018. An iterative multiple sampling method for intrusion detection. Information Security Journal 27 (4): 230–239. https://doi.org/10.1080/19393555.2018.1539790 .

Neto, N.N., S. Madnick, A.M.G. De Paula, and N.M. Borges. 2021. Developing a global data breach database and the challenges encountered. ACM Journal of Data and Information Quality 13 (1): 33. https://doi.org/10.1145/3439873 .

Nurse, J.R.C., L. Axon, A. Erola, I. Agrafiotis, M. Goldsmith, and S. Creese. 2020. The data that drives cyber insurance: A study into the underwriting and claims processes. In 2020 International conference on cyber situational awareness, data analytics and assessment (CyberSA), 15–19 June 2020.

Oliveira, N., I. Praca, E. Maia, and O. Sousa. 2021. Intelligent cyber attack detection and classification for network-based intrusion detection systems. Applied Sciences—Basel 11 (4): 21. https://doi.org/10.3390/app11041674 .

Page, M.J. et al. 2021. The PRISMA 2020 statement: An updated guideline for reporting systematic reviews. Systematic Reviews 10 (1): 89. https://doi.org/10.1186/s13643-021-01626-4 .

Pajouh, H.H., R. Javidan, R. Khayami, A. Dehghantanha, and K.R. Choo. 2019. A two-layer dimension reduction and two-tier classification model for anomaly-based intrusion detection in IoT backbone networks. IEEE Transactions on Emerging Topics in Computing 7 (2): 314–323. https://doi.org/10.1109/TETC.2016.2633228 .

Parra, G.D., P. Rad, K.K.R. Choo, and N. Beebe. 2020. Detecting Internet of Things attacks using distributed deep learning. Journal of Network and Computer Applications 163: 13. https://doi.org/10.1016/j.jnca.2020.102662 .

Paté-Cornell, M.E., M. Kuypers, M. Smith, and P. Keller. 2018. Cyber risk management for critical infrastructure: A risk analysis model and three case studies. Risk Analysis 38 (2): 226–241. https://doi.org/10.1111/risa.12844 .

Pooser, D.M., M.J. Browne, and O. Arkhangelska. 2018. Growth in the perception of cyber risk: evidence from U.S. P&C Insurers. The Geneva Papers on Risk and Insurance—Issues and Practice 43 (2): 208–223. https://doi.org/10.1057/s41288-017-0077-9 .

Pu, G., L. Wang, J. Shen, and F. Dong. 2021. A hybrid unsupervised clustering-based anomaly detection method. Tsinghua Science and Technology 26 (2): 146–153. https://doi.org/10.26599/TST.2019.9010051 .

Qiu, J., W. Luo, L. Pan, Y. Tai, J. Zhang, and Y. Xiang. 2019. Predicting the impact of android malicious samples via machine learning. IEEE Access 7: 66304–66316. https://doi.org/10.1109/ACCESS.2019.2914311 .

Qu, X., L. Yang, K. Guo, M. Sun, L. Ma, T. Feng, S. Ren, K. Li, and X. Ma. 2020. Direct batch growth hierarchical self-organizing mapping based on statistics for efficient network intrusion detection. IEEE Access 8: 42251–42260. https://doi.org/10.1109/ACCESS.2020.2976810 .

Rahman, Md.S., S. Halder, Md. Ashraf Uddin, and U.K. Acharjee. 2021. An efficient hybrid system for anomaly detection in social networks. Cybersecurity 4 (1): 10. https://doi.org/10.1186/s42400-021-00074-w .

Ramaiah, M., V. Chandrasekaran, V. Ravi, and N. Kumar. 2021. An intrusion detection system using optimized deep neural network architecture. Transactions on Emerging Telecommunications Technologies 32 (4): 17. https://doi.org/10.1002/ett.4221 .

Raman, M.R.G., K. Kannan, S.K. Pal, and V.S.S. Sriram. 2016. Rough set-hypergraph-based feature selection approach for intrusion detection systems. Defence Science Journal 66 (6): 612–617. https://doi.org/10.14429/dsj.66.10802 .

Rathore, S., and J.H. Park. 2018. Semi-supervised learning based distributed attack detection framework for IoT. Applied Soft Computing 72: 79–89. https://doi.org/10.1016/j.asoc.2018.05.049 .

Romanosky, S., L. Ablon, A. Kuehn, and T. Jones. 2019. Content analysis of cyber insurance policies: How do carriers price cyber risk? Journal of Cybersecurity (oxford) 5 (1): tyz002.

Sarabi, A., P. Naghizadeh, Y. Liu, and M. Liu. 2016. Risky business: Fine-grained data breach prediction using business profiles. Journal of Cybersecurity 2 (1): 15–28. https://doi.org/10.1093/cybsec/tyw004 .

Sardi, A., A. Rizzi, E. Sorano, and A. Guerrieri. 2021. Cyber risk in health facilities: A systematic literature review. Sustainability 12 (17): 7002.

Sarker, I.H., A.S.M. Kayes, S. Badsha, H. Alqahtani, P. Watters, and A. Ng. 2020. Cybersecurity data science: An overview from machine learning perspective. Journal of Big Data 7 (1): 41. https://doi.org/10.1186/s40537-020-00318-5 .

Scopus. 2021. Factsheet. https://www.elsevier.com/__data/assets/pdf_file/0017/114533/Scopus_GlobalResearch_Factsheet2019_FINAL_WEB.pdf . Accessed 11 May 2021.

Sentuna, A., A. Alsadoon, P.W.C. Prasad, M. Saadeh, and O.H. Alsadoon. 2021. A novel Enhanced Naïve Bayes Posterior Probability (ENBPP) using machine learning: Cyber threat analysis. Neural Processing Letters 53 (1): 177–209. https://doi.org/10.1007/s11063-020-10381-x .

Shaukat, K., S.H. Luo, V. Varadharajan, I.A. Hameed, S. Chen, D.X. Liu, and J.M. Li. 2020. Performance comparison and current challenges of using machine learning techniques in cybersecurity. Energies 13 (10): 27. https://doi.org/10.3390/en13102509 .

Sheehan, B., F. Murphy, M. Mullins, and C. Ryan. 2019. Connected and autonomous vehicles: A cyber-risk classification framework. Transportation Research Part a: Policy and Practice 124: 523–536. https://doi.org/10.1016/j.tra.2018.06.033 .

Sheehan, B., F. Murphy, A.N. Kia, and R. Kiely. 2021. A quantitative bow-tie cyber risk classification and assessment framework. Journal of Risk Research 24 (12): 1619–1638.

Shlomo, A., M. Kalech, and R. Moskovitch. 2021. Temporal pattern-based malicious activity detection in SCADA systems. Computers & Security 102: 17. https://doi.org/10.1016/j.cose.2020.102153 .

Singh, K.J., and T. De. 2020. Efficient classification of DDoS attacks using an ensemble feature selection algorithm. Journal of Intelligent Systems 29 (1): 71–83. https://doi.org/10.1515/jisys-2017-0472 .

Skrjanc, I., S. Ozawa, T. Ban, and D. Dovzan. 2018. Large-scale cyber attacks monitoring using Evolving Cauchy Possibilistic Clustering. Applied Soft Computing 62: 592–601. https://doi.org/10.1016/j.asoc.2017.11.008 .

Smart, W. 2018. Lessons learned review of the WannaCry Ransomware Cyber Attack. https://www.england.nhs.uk/wp-content/uploads/2018/02/lessons-learned-review-wannacry-ransomware-cyber-attack-cio-review.pdf . Accessed 7 May 2021.

Sornette, D., T. Maillart, and W. Kröger. 2013. Exploring the limits of safety analysis in complex technological systems. International Journal of Disaster Risk Reduction 6: 59–66. https://doi.org/10.1016/j.ijdrr.2013.04.002 .

Sovacool, B.K. 2008. The costs of failure: A preliminary assessment of major energy accidents, 1907–2007. Energy Policy 36 (5): 1802–1820. https://doi.org/10.1016/j.enpol.2008.01.040 .

SpringerLink. 2021. Journal Search. https://rd.springer.com/search?facet-content-type=%22Journal%22 . Accessed 11 May 2021.

Stojanovic, B., K. Hofer-Schmitz, and U. Kleb. 2020. APT datasets and attack modeling for automated detection methods: A review. Computers & Security 92: 19. https://doi.org/10.1016/j.cose.2020.101734 .

Subroto, A., and A. Apriyana. 2019. Cyber risk prediction through social media big data analytics and statistical machine learning. Journal of Big Data . https://doi.org/10.1186/s40537-019-0216-1 .

Tan, Z., A. Jamdagni, X. He, P. Nanda, R.P. Liu, and J. Hu. 2015. Detection of denial-of-service attacks based on computer vision techniques. IEEE Transactions on Computers 64 (9): 2519–2533. https://doi.org/10.1109/TC.2014.2375218 .

Tidy, J. 2021. Irish cyber-attack: Hackers bail out Irish health service for free. https://www.bbc.com/news/world-europe-57197688 . Accessed 6 May 2021.

Tuncer, T., F. Ertam, and S. Dogan. 2020. Automated malware recognition method based on local neighborhood binary pattern. Multimedia Tools and Applications 79 (37–38): 27815–27832. https://doi.org/10.1007/s11042-020-09376-6 .

Uhm, Y., and W. Pak. 2021. Service-aware two-level partitioning for machine learning-based network intrusion detection with high performance and high scalability. IEEE Access 9: 6608–6622. https://doi.org/10.1109/ACCESS.2020.3048900 .

Ulven, J.B., and G. Wangen. 2021. A systematic review of cybersecurity risks in higher education. Future Internet 13 (2): 1–40. https://doi.org/10.3390/fi13020039 .

Vaccari, I., G. Chiola, M. Aiello, M. Mongelli, and E. Cambiaso. 2020. MQTTset, a new dataset for machine learning techniques on MQTT. Sensors 20 (22): 17. https://doi.org/10.3390/s20226578 .

Valeriano, B., and R.C. Maness. 2014. The dynamics of cyber conflict between rival antagonists, 2001–11. Journal of Peace Research 51 (3): 347–360. https://doi.org/10.1177/0022343313518940 .

Varghese, J.E., and B. Muniyal. 2021. An Efficient IDS framework for DDoS attacks in SDN environment. IEEE Access 9: 69680–69699. https://doi.org/10.1109/ACCESS.2021.3078065 .

Varsha, M.V., P. Vinod, and K.A. Dhanya. 2017. Identification of malicious android app using manifest and opcode features. Journal of Computer Virology and Hacking Techniques 13 (2): 125–138. https://doi.org/10.1007/s11416-016-0277-z .

Velliangiri, S., and H.M. Pandey. 2020. Fuzzy-Taylor-elephant herd optimization inspired Deep Belief Network for DDoS attack detection and comparison with state-of-the-arts algorithms. Future Generation Computer Systems—the International Journal of Escience 110: 80–90. https://doi.org/10.1016/j.future.2020.03.049 .

Verma, A., and V. Ranga. 2020. Machine learning based intrusion detection systems for IoT applications. Wireless Personal Communications 111 (4): 2287–2310. https://doi.org/10.1007/s11277-019-06986-8 .

Vidros, S., C. Kolias, G. Kambourakis, and L. Akoglu. 2017. Automatic detection of online recruitment frauds: Characteristics, methods, and a public dataset. Future Internet 9 (1): 19. https://doi.org/10.3390/fi9010006 .

Vinayakumar, R., M. Alazab, K.P. Soman, P. Poornachandran, A. Al-Nemrat, and S. Venkatraman. 2019. Deep learning approach for intelligent intrusion detection system. IEEE Access 7: 41525–41550. https://doi.org/10.1109/access.2019.2895334 .

Walker-Roberts, S., M. Hammoudeh, O. Aldabbas, M. Aydin, and A. Dehghantanha. 2020. Threats on the horizon: Understanding security threats in the era of cyber-physical systems. Journal of Supercomputing 76 (4): 2643–2664. https://doi.org/10.1007/s11227-019-03028-9 .

Web of Science. 2021. Web of Science: Science Citation Index Expanded. https://clarivate.com/webofsciencegroup/solutions/webofscience-scie/ . Accessed 11 May 2021.

World Economic Forum. 2020. WEF Global Risk Report. http://www3.weforum.org/docs/WEF_Global_Risk_Report_2020.pdf . Accessed 13 May 2020.

Xin, Y., L. Kong, Z. Liu, Y. Chen, Y. Li, H. Zhu, M. Gao, H. Hou, and C. Wang. 2018. Machine learning and deep learning methods for cybersecurity. IEEE Access 6: 35365–35381. https://doi.org/10.1109/ACCESS.2018.2836950 .

Xu, C., J. Zhang, K. Chang, and C. Long. 2013. Uncovering collusive spammers in Chinese review websites. In Proceedings of the 22nd ACM international conference on Information & Knowledge Management.

Yang, J., T. Li, G. Liang, W. He, and Y. Zhao. 2019. A Simple recurrent unit model based intrusion detection system with DCGAN. IEEE Access 7: 83286–83296. https://doi.org/10.1109/ACCESS.2019.2922692 .

Yuan, B.G., J.F. Wang, D. Liu, W. Guo, P. Wu, and X.H. Bao. 2020. Byte-level malware classification based on Markov images and deep learning. Computers & Security 92: 12. https://doi.org/10.1016/j.cose.2020.101740 .

Zhang, S., X.M. Ou, and D. Caragea. 2015. Predicting cyber risks through national vulnerability database. Information Security Journal 24 (4–6): 194–206. https://doi.org/10.1080/19393555.2015.1111961 .

Zhang, Y., P. Li, and X. Wang. 2019. Intrusion detection for IoT based on improved genetic algorithm and deep belief network. IEEE Access 7: 31711–31722.

Zheng, M., H. Robbins, Z. Chai, P. Thapa, and T. Moore. 2018. Cybersecurity research datasets: taxonomy and empirical analysis. In 11th {USENIX} workshop on cyber security experimentation and test ({CSET} 18).

Zhou, X., W. Liang, S. Shimizu, J. Ma, and Q. Jin. 2021. Siamese neural network based few-shot learning for anomaly detection in industrial cyber-physical systems. IEEE Transactions on Industrial Informatics 17 (8): 5790–5798. https://doi.org/10.1109/TII.2020.3047675 .

Zhou, Y.Y., G. Cheng, S.Q. Jiang, and M. Dai. 2020. Building an efficient intrusion detection system based on feature selection and ensemble classifier. Computer Networks 174: 17. https://doi.org/10.1016/j.comnet.2020.107247 .

Open Access funding provided by the IReL Consortium.

Author information

Authors and affiliations

University of Limerick, Limerick, Ireland

Frank Cremer, Barry Sheehan, Arash N. Kia, Martin Mullins & Finbarr Murphy

TH Köln University of Applied Sciences, Cologne, Germany

Michael Fortmann & Stefan Materne

Corresponding author

Correspondence to Barry Sheehan .

Ethics declarations

Conflict of interest.

On behalf of all authors, the corresponding author states that there is no conflict of interest.

Additional information

Publisher's note.

Springer Nature remains neutral with regard to jurisdictional claims in published maps and institutional affiliations.

Supplementary Information

Below is the link to the electronic supplementary material.

Supplementary file 1 (PDF 334 kb)

Supplementary file 2 (DOCX 418 kb)

Rights and permissions

Open Access This article is licensed under a Creative Commons Attribution 4.0 International License, which permits use, sharing, adaptation, distribution and reproduction in any medium or format, as long as you give appropriate credit to the original author(s) and the source, provide a link to the Creative Commons licence, and indicate if changes were made. The images or other third party material in this article are included in the article's Creative Commons licence, unless indicated otherwise in a credit line to the material. If material is not included in the article's Creative Commons licence and your intended use is not permitted by statutory regulation or exceeds the permitted use, you will need to obtain permission directly from the copyright holder. To view a copy of this licence, visit http://creativecommons.org/licenses/by/4.0/ .

About this article

Cremer, F., Sheehan, B., Fortmann, M. et al. Cyber risk and cybersecurity: a systematic review of data availability. Geneva Pap Risk Insur Issues Pract 47, 698–736 (2022). https://doi.org/10.1057/s41288-022-00266-6

Download citation

Received: 15 June 2021

Accepted: 20 January 2022

Published: 17 February 2022

Issue Date: July 2022

DOI: https://doi.org/10.1057/s41288-022-00266-6

  • Cyber insurance
  • Systematic review
  • Cybersecurity
Information technology articles from across Nature Portfolio

Information technology is the design and implementation of computer networks for data processing and communication. This includes designing the hardware for processing information and connecting separate components, and developing software that can efficiently and faultlessly analyse and distribute this data.

Latest Research and Reviews


Towards improving aspect-oriented software reusability estimation

  • Aws A. Magableh
  • Hana’a Bani Ata
  • Adnan Rawashdeh


Basketball technique action recognition using 3D convolutional neural networks

  • Jingfei Wang
  • Carlos Cordente Martínez


Integrated photonic neuromorphic computing: opportunities and challenges

Neuromorphic photonics is an emerging computing platform that addresses the growing computational demands of modern society. We review advances in integrated neuromorphic photonics and discuss challenges associated with electro-optical conversions, implementations of nonlinearity, amplification and processing in the time domain.

  • Nikolaos Farmakidis
  • Harish Bhaskaran


Design and application of virtual simulation teaching platform for intelligent manufacturing

  • Pengfei Zheng
  • Junkai Yang


A novel model for relation prediction in knowledge graphs exploiting semantic and structural feature integration

  • Jianliang Yang


A data decomposition-based hierarchical classification method for multi-label classification of contractual obligations for the purpose of their governance

  • Amrita Singh
  • Preethu Rose Anish
  • Smita Ghaisas


News and Comment


Behavioral health and generative AI: a perspective on future of therapies and patient care

There have been considerable advancements in artificial intelligence (AI), specifically with generative AI (GAI) models. GAI is a class of algorithms designed to create new data, such as text, images, and audio, that resembles the data on which they have been trained. These models have been recently investigated in medicine, yet the opportunity and utility of GAI in behavioral health are relatively underexplored. In this commentary, we explore the potential uses of GAI in the field of behavioral health, specifically focusing on image generation. We propose the application of GAI for creating personalized and contextually relevant therapeutic interventions and emphasize the need to integrate human feedback into the AI-assisted therapeutics and decision-making process. We report the use of GAI with a case study of behavioral therapy on emotional recognition and management with a three-step process. We illustrate image generation-specific GAI to recognize, express, and manage emotions, featuring personalized content and interactive experiences. Furthermore, we highlighted limitations, challenges, and considerations, including the elements of human emotions, the need for human-AI collaboration, transparency and accountability, potential bias, security, privacy and ethical issues, and operational considerations. Our commentary serves as a guide for practitioners and developers to envision the future of behavioral therapies and consider the benefits and limitations of GAI in improving behavioral health practices and patient outcomes.

  • Emre Sezgin


Rate-splitting multiple-access-enabled V2X communications

An article in IEEE Transactions on Wireless Communications proposes solutions for interference management in vehicle-to-everything communication systems by leveraging a one-layer rate-splitting multiple-access scheme.


The dream of electronic newspapers becomes a reality — in 1974

Efforts to develop an electronic newspaper providing information at the touch of a button took a step forward 50 years ago, and airborne bacteria in the London Underground come under scrutiny, in the weekly dip into Nature's archive.


Autonomous interference-avoiding machine-to-machine communications

An article in IEEE Journal on Selected Areas in Communications proposes algorithmic solutions to dynamically optimize MIMO waveforms to minimize or eliminate interference in autonomous machine-to-machine communications.

Combining quantum and AI for the next superpower

Quantum computing can benefit from the advancements made in artificial intelligence (AI) holistically across the tech stack — AI may even unlock completely new ways of using quantum computers. Simultaneously, AI can benefit from quantum computing leveraging the expected future compute and memory power.

  • Martina Gschwendtner
  • Henning Soller
  • Sheila Zingg


How scientists are making the most of Reddit

As X wanes, researchers are turning to Reddit for insights and data, and to better connect with the public.

  • Hannah Docter-Loeb


Top 400 Information Technology Research Topics – Full Guide!

The field of IT is progressive and ever-changing due to the rapid development of hardware, software, and networking technologies. The demand for innovative research in IT has also continued to rise as businesses and organizations embrace digital systems and data-driven solutions. 

Understanding the salient areas of study in IT will help professionals keep up with changes that arise and enable organizations to leverage emerging technologies effectively. 

Cybersecurity, artificial intelligence, cloud computing, and big data analytics have all emerged as central areas of IT research. These fields shape the modern technology landscape, creating immense possibilities for boosting productivity, raising efficiency, and improving competitiveness across sectors.

Companies that want to navigate the complexities of today's digital age and exploit new technological advances therefore need to examine some of the latest IT research topics.

Understanding Information Technology Research


In the world of technology, research is the compass that helps us navigate constant change. Information Technology (IT) research spans fields such as computer science, software engineering, data analytics, and cybersecurity.

IT research involves systematic inquiry to advance knowledge, problem-solving, and innovation. This includes conducting rigorous experiments and analyzing results to unveil new theories or approaches that improve technologies or bring breakthroughs.

Interdisciplinarity is at the core of IT research, with collaboration cutting across disciplines. Whether AI is used to reinforce cybersecurity or big data analytics is applied in healthcare, collaboration leads to solutions to complex problems.

IT research also changes rapidly as technology advances, so researchers need to stay up to date to make meaningful contributions.

Ethics play a central role in ensuring technology is deployed responsibly: researchers grapple with privacy, security, bias, and equity issues so that technology benefits society.

Publications and conferences enable the dissemination of findings, which fuels further innovation and speeds up progress across the field.

Understanding IT research is vital for leveraging technology to address societal challenges and foster positive change.


Picking the Right Topic to Research: The Key to Finding New Things 

In the ever-changing world of information technology, choosing the right research topic is like plotting a course before a journey. It is a decision that determines where your hard work will go and how much your findings could mean.

Fitting with Industry Moves and Issues

Finding a research topic that fits current industry moves and big issues is important. By staying informed on the latest happenings and problems in the technology field, you can ensure your research stays useful and helps solve real-world troubles.

Growing Fresh Ideas and Practical Uses

Choosing a research topic that generates fresh ideas and practical applications is crucial. Your findings should not just add to academic discussion but also lead to real solutions that can be applied in practice, pushing technology forward and making work smoother.

Sparking Mind Curiosity and Excitement

Selecting a research topic that sparks your curiosity and excitement is essential. When you dive into an area that truly fascinates you, the research journey becomes more engaging, and your drive to uncover big insights is stronger.

Finding Gaps and Unexplored Areas

Finding gaps in existing knowledge or unexplored areas in the technology landscape can lead to big discoveries. Entering uncharted spaces can uncover fresh insights and meaningfully advance the field.

Considering Potential Wide Effect and Growth

Considering your research topic's potential breadth of impact and scalability is crucial. Will your findings have far-reaching effects across industries? Can your solutions grow and adapt to address changing challenges? Evaluating these questions can help you prioritize research areas with the greatest potential for impact.

By carefully choosing the right research topic, you can open the door to discoveries, push technology forward, and contribute to the constant evolution of the technology information landscape.

Top 400 Information Technology Research Topics

The list of the top 400 information technology research topics is organized into different categories. Let’s examine it. 

Artificial Intelligence (AI) and Machine Learning (ML)

  • Easy AI: Explaining and Using
  • Group Learning: Getting Better Together
  • AI in Health: Diagnosing and Helping
  • Robots Learning on Their Own
  • Being Fair with Computers
  • Talking to Computers in Normal Language
  • AI Fighting Bad Guys on the Internet
  • AI Driving Cars: How Safe Is It?
  • Sharing What We’ve Learned with Other Machines
  • AI in Schools: Computers Learning About You

Cybersecurity and Encryption

  • Trusting Computers: How to Stay Safe
  • Keeping Secrets Safe with Fancy Math
  • Secret Codes Computers Use: Safe or Not?
  • Spy Games: Watching Out for Bad Stuff
  • Keeping Secrets, Even from Friends
  • Your Body as Your Password: Is It Safe?
  • Fighting Against Computer Ransomers
  • Keeping Your Secrets Secret, Even When Sharing
  • Making Sure Your Smart Stuff Isn’t Spying on You
  • Insuring Against Computer Bad Luck

Data Science and Big Data

  • Sharing Secrets: How to Be Safe
  • Watching the World in Real-Time
  • Big Data: Big Computers Handling Big Jobs
  • Making Data Pretty to Look At
  • Cleaning Up Messy Data
  • Predicting the Future with Numbers
  • Finding Patterns in Connected Dots
  • Keeping Your Secrets Safe in Big Data
  • Sharing Our Secrets Without Telling Anyone
  • Helping the Planet with Numbers

Cloud Computing

  • Computers Without a Home: Where Do They Live?
  • Keeping Computers Close to Home
  • Moving Our Stuff to New Homes
  • Juggling Many Clouds at Once
  • Making Computers That Live in the Cloud
  • Keeping Clouds Safe from Bad Guys
  • Keeping Clouds Safe from Sneaky Spies
  • Making Sure Clouds Do What They’re Supposed To
  • Computers Need Energy Too!
  • Making the Internet of Things Even Smarter

Internet of Things (IoT)

  • Smart Stuff Everywhere: How Does It Work?
  • Watching Out for Bad Stuff in Smart Things
  • Smart Stuff: Is It Safe?
  • Taking Care of Smart Toys
  • Making Smart Things That Don’t Need Batteries
  • Making Smart Factories Even Smarter
  • Smart Cities: Making Cities Better Places to Live
  • Your Clothes Can Be Smart, Too!
  • Helping Farmers with Smart Farming
  • Keeping Secrets Safe in Smart Stuff

Human-Computer Interaction (HCI)

  • Magic Glasses: How Do They Work?
  • Making Computers Easy to Use
  • Making Computers for Everyone
  • Talking to Computers with Your Hands
  • Making Sure Computers Are Nice to People
  • Talking to Computers with Your Voice
  • Playing with Computers, You Can Touch
  • Trusting Computers to Drive for Us
  • Computers That Understand Different People
  • Making Computers That Read Our Minds

Software Engineering

  • Making Computers Work Together Smoothly
  • Building Computers from Tiny Pieces
  • Playing Games to Make Computers Better
  • Making Sure Computers Work Right
  • Making Old Computers New Again
  • Making Computers Like to Exercise
  • Making Computers Easier to Understand
  • Building Computers with Blueprints
  • Making Sure Computers Don’t Get Sick
  • Sharing Computer Secrets with Everyone

Mobile Computing

  • Keeping Phones Safe from Bad Guys
  • Making Apps for Every Kind of Phone
  • Keeping Phones Safe in the Cloud
  • Finding Your Way with Your Phone
  • Paying with Your Phone: Safe or Not?
  • Checking Your Health with Your Phone
  • Seeing the World Through Your Phone
  • Wearing Your Phone on Your Wrist
  • Learning on the Go with Your Phone
  • Making Phones Even Smarter with Clouds

Networking and Communications

  • Making Sure Computers Can Talk to Each Other
  • Making Computers Work Together Without Wires
  • Making the Internet Faster for Everyone
  • Getting More Internet Addresses for More Computers
  • Cutting the Internet into Pieces
  • Making the Internet Even More Invisible
  • Talking to Computers with Light
  • Making Sure Tiny Computers Talk to Each Other
  • Sending Messages Even When It’s Hard
  • Making the Radio Smarter for Computers

Bioinformatics and Computational Biology

  • Reading Your DNA with Computers
  • Making Medicine Just for You
  • Meeting the Microscopic World with Computers
  • Building Computer Models of Living Things
  • Finding New Medicine with Computers
  • Building Computer Models of Tiny Machines
  • Making Family Trees for Living Things
  • Counting Germs with Computers
  • Making Big Lists of Living Things
  • Making Computers Think Like Brains

Quantum Computing

  • Making Computers Better at Some Math Problems
  • Keeping Computers Safe from Small Mistakes
  • Making Computers Even Harder to Spy On
  • Making Computers Learn Faster with Quantum Tricks
  • Making Fake Worlds for Computers to Explore
  • Building Computers from Super-Cold Stuff
  • Making Computers Cold to Think Better
  • Making Computers Think Like Chemists
  • Making the Internet Even Safer with Computers
  • Showing Off What Computers Can Do Best

Green Computing

  • Saving Energy with Computers
  • Using Wind and Sun to Power Computers
  • Making Phones Last Longer Without Plugging In
  • Making Computers Kinder to the Planet
  • Recycling Old Computers to Save the Earth
  • Computers That Care About Their Trash
  • Saving Energy in Big Rooms Full of Computers
  • Making Computers Save Energy and Work Faster
  • Counting the Trash from Computers
  • Making Computers Kinder to the Planet’s Air

Information Systems

  • Making Computers Work Together in Big Companies
  • Making Computers Remember Their Friends
  • Making Computers Share What They Know
  • Making Computers Smart About Money
  • Making Computers Send Presents to Their Friends
  • Helping Computers Make Big Decisions
  • Making Government Computers Talk to Each Other
  • Making Computers Count Likes and Shares
  • Assisting computers to Find What You Asked For
  • Assisting companies to Keep Their Friends Happy

Semantic Web and Linked Data

  • Making Computers Understand Each Other Better
  • Making Computers Talk About Themselves
  • Making the Internet More Friendly for Computers
  • Helping Computers Find What They Need
  • Making Computers Smarter by Talking to Each Other
  • Making Computers Friends with Different Languages
  • Making Computers Understand Different Ideas
  • Making Computers Think Like Us
  • Making Computers Smarter About Old Stuff
  • Making Computers Share Their Secrets Safely

Social Computing and Online Communities

  • Making Friends on the Internet
  • Getting Good Suggestions from the Internet
  • Making Computers Work Together to Solve Problems
  • Learning from Your Friends on the Internet
  • Stopping Fake News on the Internet
  • Knowing How People Feel on the Internet
  • Helping Each Other on the Internet During Emergencies
  • Making Sure Computers Are Nice to Everyone
  • Keeping Secrets on the Internet
  • Making the Internet a Better Place for Everyone

Game Development and Virtual Worlds

  • Making Games That Play Fair
  • Letting Computers Make Their Fun
  • Making Fake Worlds for Fun
  • Learning with Games
  • Making the Rules for Fun
  • Watching How People Play Together
  • Seeing Things That Aren’t There
  • Letting Lots of People Play Together
  • Making the Engines for Fun
  • Playing Games to Learn

E-Learning and Educational Technology

  • Making Learning Easy for Everyone
  • Taking Classes on the Internet
  • Learning from Your Computer’s Teacher
  • Learning from What Computers Know
  • Learning Anywhere with Your Computer
  • Making Learning Fun with Games
  • Learning Without a Real Lab
  • Learning with Free Stuff on the Internet
  • Mixing School with Your Computer
  • Making School More Fun with Your Computer

Digital Forensics and Incident Response

  • Solving Computer Mysteries
  • Looking for Clues in Computers
  • Finding Bad Guys on the Internet
  • Looking for Clues on Phones and Tablets
  • Hiding Clues on Computers
  • Helping When Computers Get Sick
  • Solving Mysteries While the Computer Is On
  • Finding Clues on Your Smart Watch
  • Finding Tools for Finding Clues
  • Following the Rules When Solving Mysteries

Wearable Technology and Smart Devices

  • Keeping Healthy with Smart Watches
  • Making Clothes That Talk to Computers
  • Listening to the Earth with Your Shirt
  • Wearing Glasses That Show Cool Stuff
  • Making Your Home Smarter with Your Phone
  • Using Your Body to Unlock Your Phone
  • Helping People Move with Special Shoes
  • Assisting people to See with Special Glasses
  • Making Your Clothes Do More Than Keep You Warm
  • Keeping Secrets Safe on Your Smart Stuff

Robotics and Automation

  • Making Friends with Robots
  • Letting Robots Do the Hard Work
  • Robots That Work Together Like Ants
  • Learning Tricks from People
  • Robots That Feel Like Jelly
  • Helping Doctors and Nurses with Robots
  • Robots That Help Farmers Grow Food
  • Making Cars Without People
  • Teaching Robots to Recognize Things
  • Robots That Learn from Animals

Health Informatics

  • Computers That Help Doctors Keep Track of Patients
  • Sharing Secrets About Your Health with Other Computers
  • Seeing the Doctor on Your Computer
  • Keeping Track of Your Health with Your Phone
  • Making Medicine Better with Computers
  • Keeping Your Health Secrets Safe with Computers
  • Learning About Health with Computers
  • Keeping Health Secrets Safe on the Internet
  • Watching Out for Germs with Computers
  • Making Sure the Doctor’s Computer Plays Nice

Geographic Information Systems (GIS)

  • Watching the World Change with Computers
  • Making Maps on the Internet
  • Seeing the World from Very Far Away
  • Finding Hidden Patterns with Computers
  • Making Cities Better with Computers
  • Keeping Track of the Earth with Computers
  • Keeping Track of Wild Animals with Computers
  • Making Maps with Everyone’s Help
  • Seeing the World in 3D
  • Finding Things on the Map with Your Phone

Knowledge Management

  • Helping Computers Remember Things
  • Making Computers Talk About What They Know
  • Finding Secrets in Big Piles of Data
  • Helping Companies Remember What They Know
  • Sharing Secrets with Computers at Work
  • Making Computers Learn from Each Other
  • Making Computers Talk About Their Friends
  • Making Companies Remember Their Secrets
  • Keeping Track of What Companies Know

Computational Linguistics and Natural Language Processing (NLP)

  • Finding Out How People Feel on the Internet
  • Finding Names and Places in Stories
  • Making Computers Talk to Each Other
  • Making Computers Answer Questions
  • Making Summaries for Busy People
  • Making Computers Understand Stories
  • Making Computers Understand Pictures and Sounds
  • Making Computers Learn New Words
  • Making Computers Remember What They Read
  • Making Sure Computers Aren’t Mean to Anyone

Information Retrieval and Search Engines

  • Finding Stuff on the Internet
  • Getting Suggestions from the Internet
  • Finding Stuff at Work
  • Helping Computers Find Stuff Faster
  • Making Computers Understand What You Want
  • Finding Stuff on Your Phone
  • Finding Stuff When You’re Moving
  • Finding Stuff Near Where You Are
  • Making Sure Computers Look Everywhere for What You Want

Computer Vision

  • Finding Stuff in Pictures
  • Cutting Up Pictures
  • Watching Videos for Fun
  • Learning from Lots of Pictures
  • Making Pictures with Computers
  • Finding Stuff That Looks Like Other Stuff
  • Finding Secrets in Medical Pictures
  • Finding Out If Pictures Are Real
  • Looking at People’s Faces to Know Them

Quantum Information Science

  • Making Computers Learn Faster with Tricks

Social Robotics

  • Robots That Help People Who Have Trouble Talking
  • Robots That Teach People New Things
  • Making Robots Work with People
  • Helping Kids Learn with Robots
  • Making Sure Robots Aren’t Mean to Anyone
  • Making Robots Understand How People Feel
  • Making Friends with Robots from Different Places
  • Making Sure Robots Respect Different Cultures
  • Helping Robots Learn How to Be Nice

Cloud Robotics

  • Making Robots Work Together from Far Away
  • Making Robots Share Their Toys
  • Making Robots Do Hard Jobs in Different Places
  • Making Robots Save Energy
  • Making Robots Play Together Nicely
  • Making Robots Practice Being Together
  • Making Sure Robots Play Fair
  • Making Robots Follow the Rules

Cyber-Physical Systems (CPS)

  • Making Robots Work Together with Other Things
  • Keeping Robots Safe from Small Mistakes
  • Keeping Factories Safe from Bad Guys
  • Making Sure Robots Respect Different People
  • Making Sure Robots Work Well with People
  • Keeping Robots Safe from Bad Guys
  • Making Sure Robots Follow the Rules

Biomedical Imaging

  • Taking Pictures of Inside You with Computers
  • Seeing Inside You with Computers
  • Cutting Up Pictures of Inside You
  • Finding Problems Inside You with Computers
  • Cutting Up Pictures and Putting Them Together
  • Counting Inside You with Pictures
  • Making Pictures to Help Doctors
  • Making Lists from Pictures Inside You
  • Making Sure Pictures of You Are Safe

Remote Sensing

  • Watching Earth from Far Away with Computers
  • Making Pictures of Earth Change
  • Taking Pictures from Very High Up
  • Watching Crops Grow with Computers
  • Watching Cities Grow with Computers
  • Watching Earth Change with Computers
  • Watching Earth from Far Away During Emergencies
  • Making Computers Work Together to See Earth
  • Putting Pictures of Earth Together
  • Making Sure Pictures of Earth Are Safe

Cloud Gaming

  • Playing Games from Far Away
  • Making Games Work Faster from Far Away
  • Keeping Games Safe from Bad Guys
  • Making Sure Everyone Can Play Together
  • Making Games Faster from Far Away
  • Watching People Play Games from Far Away
  • Making Sure Games Look Good from Far Away
  • Watching Games Get More Popular

Augmented Reality (AR)

  • Making Glasses That Show Cool Stuff
  • Making Cool Stuff for Glasses to Show
  • Watching Glasses Follow You
  • Watching Phones Show Cool Stuff
  • Making Cool Stuff to Show with Phones
  • Making Places Even Better with Phones
  • Making Factories Even Better with Glasses
  • Making Places Even Better with Glasses
  • Making Sure Glasses Don’t Scare Anyone

Virtual Reality (VR)

  • Making Glasses That Show Different Worlds
  • Making Glasses That Follow Your Hands
  • Making Therapy Fun with Glasses
  • Making Learning Fun with Glasses
  • Making Glasses That Make Jobs Safer
  • Making Glasses That Show Your Friends
  • Making Sure Glasses Are Friendly
  • Making Glasses That Make Buildings Better
  • Making Sure Glasses Aren’t Scary

Digital Twins

  • Making Computers That Copy the Real World
  • Making People Better with Computers
  • Making Flying Safer with Computers
  • Making Cars Safer with Computers
  • Making Energy Better with Computers
  • Making Buildings Better with Computers
  • Making Cities Safer with Computers
  • Making Sure Computers Copy the Real World Safely
  • Making Computers Follow the Rules

Edge Computing

  • Making Computers Work Faster Near You
  • Keeping Computers Safe Near You
  • Making Computers Work with Far-Away Computers
  • Making Computers Work Fast with You
  • Making Computers Work Together Near You
  • Making Phones Work Faster Near You
  • Making Computers Work Near You
  • Making Computers Work in Busy Places

Explainable AI (XAI)

  • Making Computers Explain What They Do
  • Making Medicine Safer with Computers
  • Making Money Safer with Computers
  • Making Computers Safe to Drive Cars
  • Making Computers Fair to Everyone
  • Making Computers Explain What They Think
  • Making Computers Easy to Understand

Blockchain and Distributed Ledger Technology (DLT)

  • Making Secret Codes Computers Use
  • Making Contracts Computers Can Understand
  • Making Computers Share Secrets Safely
  • Making Money Safe with Computers
  • Making Computers Work Together Nicely
  • Making Computers Keep Secrets Safe
  • Making Computers Work Together Fairly
  • Making Stuff Move Safely with Computers

Quantum Communication

  • Making Computers Talk to Each Other Safely
  • Making Computers Talk to Each Other from Far Away
  • Making Computers Talk to Each Other in Secret
  • Making Money Move Safely with Computers

This list covers a broad spectrum of topics within Information Technology, ranging from foundational concepts to cutting-edge research areas. Feel free to choose any topic that aligns with your interests and expertise for further exploration and study!

Emerging Trends in Information Technology Research

In the rapidly changing world of computer science, keeping up with the latest trends is indispensable. Technology keeps evolving, and so does the research built on it. From clever robots to safeguarding our online information, researchers are always discovering new ways to improve our lives. Let us delve into some of the most exciting trends shaping the field's future.

  • Smart Computers:

Smart computers are a hot area right now. They can learn from experience, recognize patterns, and even understand language much as humans do, which helps in many areas such as healthcare and finance. Researchers are working on making these systems smarter still, so that they can make decisions on their own while remaining fair to everyone.
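
To make the idea of "learning from experience" concrete, here is a minimal, illustrative sketch: it trains a simple classifier on a small handwritten-digit dataset and checks how well it recognizes digits it has not seen. It assumes the scikit-learn library is installed and is not tied to any particular study mentioned here.

    # A minimal sketch (scikit-learn assumed); illustrative only.
    from sklearn.datasets import load_digits
    from sklearn.model_selection import train_test_split
    from sklearn.linear_model import LogisticRegression

    digits = load_digits()                      # small handwritten-digit dataset
    X_train, X_test, y_train, y_test = train_test_split(
        digits.data, digits.target, random_state=0)

    model = LogisticRegression(max_iter=2000)   # "learns from experience" (the training data)
    model.fit(X_train, y_train)
    print("held-out accuracy:", model.score(X_test, y_test))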

  • Fast Computing:

As more devices connect to the Internet, we need ways to process information quickly. Fast computing (often called edge computing) brings processing power closer to where the information comes from, making things quicker and more efficient. Researchers are working out how to improve it further, especially for analyzing real-time data.

  • Keeping Things Safe:

With so much technology around us, keeping our information safe from attackers is essential. Researchers must develop methods to safeguard our data and networks from cyber attackers, and they are also considering how to protect the privacy of our personal information so that only authorized individuals can access it.

  • Fancy Computers:

The next big thing in computing is quantum computers. They can perform certain calculations far faster than ordinary computers can. Researchers are working hard to make practical quantum computing a reality because it could be useful for tasks such as cracking codes and designing new drugs.
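
As a toy illustration of what makes quantum computing different, the sketch below simulates a two-qubit register on an ordinary computer using nothing but NumPy: a Hadamard gate followed by a CNOT produces an entangled Bell state, a resource that classical bits do not have. This is only a hedged, illustrative simulation; real quantum hardware and toolkits work very differently.

    import numpy as np

    # Simulate a 2-qubit register classically: apply a Hadamard to qubit 0 and
    # then a CNOT, producing the entangled Bell state (|00> + |11>) / sqrt(2).
    H = np.array([[1, 1], [1, -1]]) / np.sqrt(2)
    I = np.eye(2)
    CNOT = np.array([[1, 0, 0, 0],
                     [0, 1, 0, 0],
                     [0, 0, 0, 1],
                     [0, 0, 1, 0]])

    state = np.zeros(4)
    state[0] = 1.0                               # start in |00>
    state = CNOT @ (np.kron(H, I) @ state)       # H on qubit 0, then CNOT
    print(np.round(state, 3))                    # [0.707 0. 0. 0.707]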

  • New Ways of Doing Things Together:

Blockchain is an exciting technology that allows us to collaborate without a central authority. Its use in cryptocurrencies is well known, but it has other applications too: it can help us trace where products come from, prove who we are on the internet, and create records and contracts that cannot be changed later on.
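
The "records that cannot be changed later" idea rests on hash chaining: each record commits to the hash of the previous one, so tampering with any earlier record becomes detectable. The following minimal Python sketch (standard library only; the block fields and example data are made up for illustration) shows the principle.

    import hashlib, json

    def make_block(data, prev_hash):
        # Each block commits to the previous block's hash; altering an earlier
        # record would change every later hash, making the tampering visible.
        body = json.dumps({"data": data, "prev": prev_hash}, sort_keys=True)
        block_hash = hashlib.sha256(body.encode()).hexdigest()
        return {"data": data, "prev": prev_hash, "hash": block_hash}

    chain = [make_block("genesis", "0" * 64)]
    chain.append(make_block("shipment 42 left the warehouse", chain[-1]["hash"]))
    chain.append(make_block("shipment 42 delivered", chain[-1]["hash"]))

    # Verify the chain: every block must point at the previous block's hash.
    intact = all(chain[i]["prev"] == chain[i - 1]["hash"] for i in range(1, len(chain)))
    print("chain intact:", intact)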

  • Virtual Reality Adventures:

Virtual Reality (VR) and Augmented Reality (AR) let users step into a completely different world: they create a convincing sense of presence in environments that are not physically real. Researchers are working hard to make VR and AR better so they can be used for learning, training, and entertainment in ever more innovative ways.

In summary, computer science research keeps changing with trends such as smarter AI, edge computing, cybersecurity, quantum computing, blockchain-based collaboration, and immersive VR and AR experiences.

By exploring these trends and developing new ideas, researchers ensure that technology keeps improving and making our lives easier and more exciting.

How can I brainstorm research topics in information technology?

Start by identifying your areas of interest and exploring recent advancements in the field. Consider consulting with mentors or peers for suggestions and feedback.

What are some ethical considerations in AI research?

Ethical considerations in AI research include fairness, transparency, accountability, and privacy. Researchers should ensure their algorithms and models do not perpetuate bias or harm individuals.

How can I stay updated on emerging trends in IT research?

Follow reputable journals, conferences, and online forums dedicated to information technology. Engage with the academic community through discussions and networking events.



Computer Information Systems: Articles & Journals


Find Articles in Research Databases



  • ACM Proceedings
  • ASIS&T Previous five years available free


  • Plunkett Research: analyzes industry trends, including finances, markets, technologies, deregulation, research/development and globalization; also includes industry statistics, company profiles, glossaries, and more


Advanced Search

Sample searches:
  • "critical thinking" AND ("higher education" OR college*)
  • (race OR racial) AND (discriminat* OR prejudic*)
Select limiters (articles from scholarly publications, etc.) on the results screen.

While it doesn't offer some of the sophisticated options available in research databases, Google Scholar can be helpful. See instructions for customizing Google Scholar to provide Full Text @ UHCL links in results.


A peer-reviewed (or refereed ) journal:

  • uses  experts from the same subject field or profession as the author to evaluate a manuscript prior to acceptance for publication
  • has articles that report on research studies or provide scholarly analysis of topics
  • may include book reviews, editorials, or other brief items that are not considered scholarly articles
  • Anatomy of a Scholarly Article: explains key elements from the first and last page of a typical scholarly or academic article (North Carolina State Univ. Libraries)

Peer Review in 3 Minutes

(3:15) Explains the academic publishing process for research articles and scholarly journals, including the quality control process of peer review. North Carolina State Univ. Libraries

White Papers and Industry Publications

  • AISeL (Association for Information Systems Electronic Library) Repository for the ICIS, AMCIS, ECIS, PACIS, BLED, ACIS and MWAIS conference proceedings. Current AIS members can access the full text of all articles. Others can see indexing and selected tables of contents.
  • Bitpipe White papers, research guides, webcasts and case studies on various IT topics such as web services, networking, telecommunication, software development, storage, wireless and more.
  • The Information Technology Professional's Resource Center Provides links to networking related information available on the net, forums for IT professionals to interact, and full-text information and data on a variety of technologies.
  • TechRepublic: Research Library Includes webcasts, white papers, file downloads and more with information on various aspects of technology. Also has blogs and forums.

Get Full Text for an Article

If a database lacks immediate full text, click Find It @ UHCL in results to check for full text from another source. Follow Available Online links in OneSearch to a resource with full text for UH Clear Lake users.


If full text is not found, submit an article request .

  • Request a book, article, or other item (ILLiad Interlibrary Loan logon)


Finding Journal Title Abbreviations

  • Web of Science Journal Title Abbreviations list shows the abbreviations used for journal titles as cited works
  • All that JAS (Journal Abbreviations Sources)


The Top 10 Most Interesting Computer Science Research Topics

Computer science touches nearly every area of our lives. With new advancements in technology, the computer science field is constantly evolving, giving rise to new computer science research topics. These topics attempt to answer various computer science research questions and how they affect the tech industry and the larger world.

Computer science research topics can be divided into several categories, such as artificial intelligence, big data and data science, human-computer interaction, security and privacy, and software engineering. If you are a student or researcher looking for computer research paper topics, this article provides suggestions and examples of computer science research topics and questions.


What Makes a Strong Computer Science Research Topic?

A strong computer science topic is clear, well defined, and easy to understand. It should also reflect the research's purpose, scope, or aim. In addition, a strong computer science research topic avoids abbreviations that are not generally known, though it can include industry terms that are current and generally accepted.

Tips for Choosing a Computer Science Research Topic

  • Brainstorm. Brainstorming helps you develop a few different ideas and find the best topic for you. Some core questions you should ask are: What are some open questions in computer science? What do you want to learn more about? What are some current trends in computer science?
  • Choose a sub-field. There are many subfields and career paths in computer science. Before choosing a research topic, make clear which aspect of computer science the research will focus on. That could be theoretical computer science, contemporary computing culture, or even distributed computing research topics.
  • Aim to answer a question. When you’re choosing a research topic in computer science, you should always have a question in mind that you’d like to answer. That helps you narrow your research aim toward clear, specified goals.
  • Do a comprehensive literature review. When starting a research project, it is essential to have a clear idea of the topic you plan to study. That involves doing a comprehensive literature review to better understand what has already been learned about your topic.
  • Keep the topic simple and clear. The topic should reflect the scope and aim of the research it addresses. It should also be concise and free of ambiguous words. Hence, some researchers recommend limiting the topic to five to 15 substantive words. It can take the form of a question or a declarative statement.

What’s the Difference Between a Research Topic and a Research Question?

A research topic is the subject matter that a researcher chooses to investigate. You may also refer to it as the title of a research paper. It summarizes the scope of the research and captures the researcher’s approach to the research question. Hence, it may be broad or more specific. For example, a broad topic may read, Data Protection and Blockchain, while a more specific variant can read, Potential Strategies to Privacy Issues on the Blockchain.

On the other hand, a research question is the fundamental starting point for any research project. It typically reflects various real-world problems and, sometimes, theoretical computer science challenges. As such, it must be clear, concise, and answerable.

How to Create Strong Computer Science Research Questions

To create substantial computer science research questions, one must first understand the topic at hand. Furthermore, the research question should generate new knowledge and contribute to the advancement of the field. It could be something that has not been answered before or is only partially answered. It is also essential to consider the feasibility of answering the question.

Top 10 Computer Science Research Paper Topics

1. Battery Life and Energy Storage for 5G Equipment

The 5G network is an upcoming cellular network with much higher data rates and capacity than the current 4G network. According to research published in the European Scientific Institute Journal, one of the main concerns with the 5G network is the high energy consumption of 5G-enabled devices. Research on this topic can highlight the challenges and propose solutions that lead to more energy-efficient designs.

2. The Influence of Extraction Methods on Big Data Mining

Data mining has drawn the scientific community's attention, especially with the explosive rise of big data. Many research results show that the extraction methods used have a significant effect on the outcome of the data mining process. A topic like this analyzes existing algorithms and suggests strategies and more efficient algorithms that may help in understanding the challenge or point the way to a solution.
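
As a small, hedged illustration of how the extraction and preprocessing step can change data mining outcomes, the sketch below (scikit-learn assumed) clusters the same dataset twice, once on raw features and once on standardized features, and compares how well each clustering matches the known classes.

    # Illustrative sketch only: the same clustering algorithm can give
    # noticeably different results depending on how features are prepared.
    from sklearn.datasets import load_wine
    from sklearn.preprocessing import StandardScaler
    from sklearn.cluster import KMeans
    from sklearn.metrics import adjusted_rand_score

    X, y = load_wine(return_X_y=True)
    for name, features in [("raw", X), ("scaled", StandardScaler().fit_transform(X))]:
        labels = KMeans(n_clusters=3, n_init=10, random_state=0).fit_predict(features)
        print(name, "agreement with true classes:", round(adjusted_rand_score(y, labels), 2))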

3. Integration of 5G with Analytics and Artificial Intelligence

According to the International Finance Corporation, 5G and AI technologies are defining emerging markets and our world. Through different technologies, this research aims to find novel ways to integrate these powerful tools to produce excellent results. Subjects like this often spark great discoveries that pioneer new levels of research and innovation. A breakthrough can influence advanced educational technology, virtual reality, metaverse, and medical imaging.

4. Leveraging Asynchronous FPGAs for Crypto Acceleration

To support the growing cryptocurrency industry, there is a need to create new ways to accelerate transaction processing. This project aims to use asynchronous Field-Programmable Gate Arrays (FPGAs) to accelerate cryptocurrency transaction processing. It explores how various distributed computing technologies can influence mining cryptocurrencies faster with FPGAs and generally enjoy faster transactions.

5. Cyber Security Future Technologies

Cyber security is a trending topic among businesses and individuals, especially as many work teams go remote. Research like this can span the cyber security and cloud security industries and project future innovations, depending on the researcher's preferences. Another angle is to analyze existing or emerging solutions and present findings that can aid future research.

6. Exploring the Boundaries Between Art, Media, and Information Technology

The fields of computing and media are vast and complex, and they intersect in many ways. Practitioners create images and animations using design technologies such as algorithmic mechanism design, design thinking, design theory, digital fabrication systems, and electronic design automation. This paper aims to define how both fields exist independently and symbiotically.

7. Evolution of Future Wireless Networks Using Cognitive Radio Networks

This research project aims to study how cognitive radio technology can drive evolution in future wireless networks. It will analyze the performance of cognitive radio-based wireless networks in different scenarios and measure its impact on spectral efficiency and network capacity. The research project will involve the development of a simulation model for studying the performance of cognitive radios in different scenarios.
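
By way of illustration only, a toy energy-detection model of spectrum sensing, one building block such a simulation study might include, can be sketched in a few lines of Python (NumPy assumed; the threshold and noise model are arbitrary choices made for this example).

    import numpy as np

    rng = np.random.default_rng(0)

    def channel_busy(n_samples=1000, snr=1.0, occupied=True):
        # Toy energy detector: a secondary user measures average signal energy
        # and compares it with a threshold to decide whether the primary user
        # is transmitting on the channel.
        noise = rng.normal(0, 1, n_samples)
        signal = np.sqrt(snr) * rng.normal(0, 1, n_samples) if occupied else 0.0
        energy = np.mean((noise + signal) ** 2)
        threshold = 1.3   # assumed value, tuned for this toy noise model
        return energy > threshold

    occupied_trials = [channel_busy(occupied=True) for _ in range(1000)]
    idle_trials = [channel_busy(occupied=False) for _ in range(1000)]
    print("detection rate (channel occupied):", np.mean(occupied_trials))
    print("false-alarm rate (channel idle):", np.mean(idle_trials))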

8. The Role of Quantum Computing and Machine Learning in Advancing Medical Predictive Systems

In a paper titled Exploring Quantum Computing Use Cases for Healthcare, experts at IBM highlighted precision medicine and diagnostics as areas likely to benefit from quantum computing. Using biomedical imaging, machine learning, computational biology, and data-intensive computing systems, researchers can build more accurate disease progression prediction systems, disease severity classification systems, and 3D image reconstruction systems vital for treating chronic diseases.

9. Implementing Privacy and Security in Wireless Networks

Wireless networks are prone to attacks, and that has been a big concern for both individual users and organizations. According to the Cybersecurity and Infrastructure Security Agency (CISA), cyber security specialists are working to find reliable methods of securing wireless networks. This research aims to develop a secure and privacy-preserving communication framework for wireless communication and social networks.

10. Exploring the Challenges and Potentials of Biometric Systems Using Computational Techniques

Much discussion surrounds biometric systems, the potential for misuse, and privacy concerns. When exploring how biometric systems can be used effectively, issues such as verification time and cost, hygiene, data bias, and cultural acceptance must be weighed. The paper may critically examine these challenges using computational tools and predict possible solutions.

Other Examples of Computer Science Research Topics & Questions

Computer Research Topics

  • The confluence of theoretical computer science, deep learning, computational algorithms, and performance computing
  • Exploring human-computer interactions and the importance of usability in operating systems
  • Predicting the limits of networking and distributed systems
  • Controlling data mining on public systems through third-party applications
  • The impact of green computing on the environment and computational science

Computer Research Questions

  • Why are there so many programming languages?
  • Is there a better way to enhance human-computer interactions in computer-aided learning?
  • How safe is cloud computing, and what are some ways to enhance security?
  • Can computers effectively assist in the sequencing of human genes?
  • How valuable is SCRUM methodology in Agile software development?

Choosing the Right Computer Science Research Topic

Computer science research is a vast field, and it can be challenging to choose the right topic. There are a few things to keep in mind when making this decision. Choose a topic that you are interested in. This will make it easier to stay motivated and produce high-quality research for your computer science degree .

Select a topic that is relevant to your field of study. This will help you to develop specialized knowledge in the area. Choose a topic that has potential for future research. This will ensure that your research stays relevant and up to date. Typically, coding bootcamps provide a framework that streamlines students’ projects to a specific field, making their search for a creative solution easier.

Computer Science Research Topics FAQ

To start a computer science research project, you should look at what other content is out there. Complete a literature review to know the available findings surrounding your idea. Design your research and ensure that you have the necessary skills and resources to complete the project.

The first step to conducting computer science research is to conceptualize the idea and review existing knowledge about that subject. You will design your research and collect data through surveys or experiments. Analyze your data and build a prototype or graphical model. You will also write a report and present it to a recognized body for review and publication.

You can find computer science research jobs on the job boards of many universities. Many universities have job boards on their websites that list open positions in research and academia. Also, many Slack and GitHub channels for computer scientists provide regular updates on available projects.

There are several hot topics and questions in AI that you can build your research on. Below are some AI research questions you may consider for your research paper.

  • Will it be possible to build artificial emotional intelligence?
  • Will robots replace humans in all difficult cumbersome jobs as part of the progress of civilization?
  • Can artificial intelligence systems self-improve with knowledge from the Internet?



Princeton University


Suggested Undergraduate Research Topics


How to Contact Faculty for IW/Thesis Advising

Send the professor an e-mail. When you write a professor, be clear that you want a meeting regarding a senior thesis or one-on-one IW project, and briefly describe the topic or idea that you want to work on. Check the faculty listing for email addresses.

*Updated April 9, 2024


Parastoo Abtahi, Room 419

Available for single-semester IW and senior thesis advising, 2024-2025

  • Research Areas: Human-Computer Interaction (HCI), Augmented Reality (AR), and Spatial Computing
  • Input techniques for on-the-go interaction (e.g., eye-gaze, microgestures, voice) with a focus on uncertainty, disambiguation, and privacy.
  • Minimal and timely multisensory output (e.g., spatial audio, haptics) that enables users to attend to their physical environment and the people around them, instead of a 2D screen.
  • Interaction with intelligent systems (e.g., IoT, robots) situated in physical spaces with a focus on updating users’ mental model despite the complexity and dynamicity of these systems.

Ryan Adams, Room 411

Research areas:

  • Machine learning driven design
  • Generative models for structured discrete objects
  • Approximate inference in probabilistic models
  • Accelerating solutions to partial differential equations
  • Innovative uses of automatic differentiation
  • Modeling and optimizing 3d printing and CNC machining

Andrew Appel, Room 209

Available for Fall 2024 IW advising only

  • Research Areas: Formal methods, programming languages, compilers, computer security.
  • Software verification (for which taking COS 326 / COS 510 is helpful preparation)
  • Game theory of poker or other games (for which COS 217 / 226 are helpful)
  • Computer game-playing programs (for which COS 217 / 226 are also helpful)
  •  Risk-limiting audits of elections (for which ORF 245 or other knowledge of probability is useful)

Sanjeev Arora, Room 407

  • Theoretical machine learning, deep learning and its analysis, natural language processing. My advisees would typically have taken a course in algorithms (COS423 or COS 521 or equivalent) and a course in machine learning.
  • Show that finding approximate solutions to NP-complete problems is also NP-complete (i.e., come up with NP-completeness reductions a la COS 487). 
  • Experimental Algorithms: Implementing and Evaluating Algorithms using existing software packages. 
  • Studying/designing provable algorithms for machine learning and implementations using packages like scipy and MATLAB, including applications in natural language processing and deep learning.
  • Any topic in theoretical computer science.

David August, Room 221

Not available for IW or thesis advising, 2024-2025

  • Research Areas: Computer Architecture, Compilers, Parallelism
  • Containment-based approaches to security:  We have designed and tested a simple hardware+software containment mechanism that stops incorrect communication resulting from faults, bugs, or exploits from leaving the system.   Let's explore ways to use containment to solve real problems.  Expect to work with corporate security and technology decision-makers.
  • Parallelism: Studies show much more parallelism than is currently realized in compilers and architectures.  Let's find ways to realize this parallelism.
  • Any other interesting topic in computer architecture or compilers. 

Mark Braverman, 194 Nassau St., Room 231

  • Research Areas: computational complexity, algorithms, applied probability, computability over the real numbers, game theory and mechanism design, information theory.
  • Topics in computational and communication complexity.
  • Applications of information theory in complexity theory.
  • Algorithms for problems under real-life assumptions.
  • Game theory, network effects
  • Mechanism design (could be on a problem proposed by the student)

Sebastian Caldas, 221 Nassau Street, Room 105

  • Research Areas: collaborative learning, machine learning for healthcare. Typically, I will work with students that have taken COS324.
  • Methods for collaborative and continual learning.
  • Machine learning for healthcare applications.

Bernard Chazelle, 194 Nassau St., Room 301

  • Research Areas: Natural Algorithms, Computational Geometry, Sublinear Algorithms. 
  • Natural algorithms (flocking, swarming, social networks, etc).
  • Sublinear algorithms
  • Self-improving algorithms
  • Markov data structures

Danqi Chen, Room 412

  • My advisees would be expected to have taken a course in machine learning and ideally have taken COS484 or an NLP graduate seminar.
  • Representation learning for text and knowledge bases
  • Pre-training and transfer learning
  • Question answering and reading comprehension
  • Information extraction
  • Text summarization
  • Any other interesting topics related to natural language understanding/generation

Marcel Dall'Agnol, Corwin 034

  • Research Areas: Theoretical computer science. (Specifically, quantum computation, sublinear algorithms, complexity theory, interactive proofs and cryptography)
  • Research Areas: Machine learning

Jia Deng, Room 423

  •  Research Areas: Computer Vision, Machine Learning.
  • Object recognition and action recognition
  • Deep Learning, autoML, meta-learning
  • Geometric reasoning, logical reasoning

Adji Bousso Dieng, Room 406

  • Research areas: Vertaix is a research lab at Princeton University led by Professor Adji Bousso Dieng. We work at the intersection of artificial intelligence (AI) and the natural sciences. The models and algorithms we develop are motivated by problems in those domains and contribute to advancing methodological research in AI. We leverage tools from statistical machine learning and deep learning to develop methods for learning from data of various modalities arising in the natural sciences.

Robert Dondero, Corwin Hall, Room 038

  • Research Areas:  Software engineering; software engineering education.
  • Develop or evaluate tools to facilitate student learning in undergraduate computer science courses at Princeton, and beyond.
  • In particular, can code critiquing tools help students learn about software quality?

Zeev Dvir, 194 Nassau St., Room 250

  • Research Areas: computational complexity, pseudo-randomness, coding theory and discrete mathematics.
  • Independent Research: I have various research problems related to Pseudorandomness, Coding theory, Complexity and Discrete mathematics - all of which require strong mathematical background. A project could also be based on writing a survey paper describing results from a few theory papers revolving around some particular subject.

Benjamin Eysenbach, Room 416

  • Research areas: reinforcement learning, machine learning. My advisees would typically have taken COS324.
  • Applying RL algorithms to problems in science and engineering.
  • Emergent behavior of RL algorithms on high-fidelity robotic simulators.
  • Studying how architectures and representations can facilitate generalization.

Christiane Fellbaum, 1-S-14 Green

  • Research Areas: theoretical and computational linguistics, word sense disambiguation, lexical resource construction, English and multilingual WordNet(s), ontology
  • Anything having to do with natural language--come and see me with/for ideas suitable to your background and interests. Some topics students have worked on in the past:
  • Developing parsers, part-of-speech taggers, morphological analyzers for underrepresented languages (you don't have to know the language to develop such tools!)
  • Quantitative approaches to theoretical linguistics questions
  • Extensions and interfaces for WordNet (English and WN in other languages),
  • Applications of WordNet(s), including:
  • Foreign language tutoring systems,
  • Spelling correction software,
  • Word-finding/suggestion software for ordinary users and people with memory problems,
  • Machine Translation 
  • Sentiment and Opinion detection
  • Automatic reasoning and inferencing
  • Collaboration with professors in the social sciences and humanities ("Digital Humanities")

Adam Finkelstein, Room 424 

  • Research Areas: computer graphics, audio.

Robert S. Fish, Corwin Hall, Room 037

  • Networking and telecommunications
  • Learning, perception, and intelligence, artificial and otherwise;
  • Human-computer interaction and computer-supported cooperative work
  • Online education, especially in Computer Science Education
  • Topics in research and development innovation methodologies including standards, open-source, and entrepreneurship
  • Distributed autonomous organizations and related blockchain technologies

Michael Freedman, Room 308 

  • Research Areas: Distributed systems, security, networking
  • Projects related to streaming data analysis, datacenter systems and networks, untrusted cloud storage and applications. Please see my group website at http://sns.cs.princeton.edu/ for current research projects.

Ruth Fong, Room 032

  • Research Areas: computer vision, machine learning, deep learning, interpretability, explainable AI, fairness and bias in AI
  • Develop a technique for understanding AI models
  • Design an AI model that is interpretable by design
  • Build a paradigm for detecting and/or correcting failure points in an AI model
  • Analyze an existing AI model and/or dataset to better understand its failure points
  • Build a computer vision system for another domain (e.g., medical imaging, satellite data, etc.)
  • Develop a software package for explainable AI
  • Adapt explainable AI research to a consumer-facing problem

Note: I am happy to advise any project if there's a sufficient overlap in interest and/or expertise; please reach out via email to chat about project ideas.

Tom Griffiths, Room 405

Available for Fall 2024 single-semester IW advising, only

Research areas: computational cognitive science, computational social science, machine learning and artificial intelligence

Note: I am open to projects that apply ideas from computer science to understanding aspects of human cognition in a wide range of areas, from decision-making to cultural evolution and everything in between. For example, we have current projects analyzing chess game data and magic tricks, both of which give us clues about how human minds work. Students who have expertise or access to data related to games, magic, strategic sports like fencing, or other quantifiable domains of human behavior should feel free to get in touch.

Aarti Gupta, Room 220

  • Research Areas: Formal methods, program analysis, logic decision procedures
  • Finding bugs in open source software using automatic verification tools
  • Software verification (program analysis, model checking, test generation)
  • Decision procedures for logical reasoning (SAT solvers, SMT solvers)

Elad Hazan, Room 409  

  • Research interests: machine learning methods and algorithms, efficient methods for mathematical optimization, regret minimization in games, reinforcement learning, control theory and practice
  • Machine learning, efficient methods for mathematical optimization, statistical and computational learning theory, regret minimization in games.
  • Implementation and algorithm engineering for control, reinforcement learning and robotics
  • Implementation and algorithm engineering for time series prediction

Felix Heide, Room 410

  • Research Areas: Computational Imaging, Computer Vision, Machine Learning (focus on Optimization and Approximate Inference).
  • Optical Neural Networks
  • Hardware-in-the-loop Holography
  • Zero-shot and Simulation-only Learning
  • Object recognition in extreme conditions
  • 3D Scene Representations for View Generation and Inverse Problems
  • Long-range Imaging in Scattering Media
  • Hardware-in-the-loop Illumination and Sensor Optimization
  • Inverse Lidar Design
  • Phase Retrieval Algorithms
  • Proximal Algorithms for Learning and Inference
  • Domain-Specific Language for Optics Design

Peter Henderson, 302 Sherrerd Hall

  • Research Areas: Machine learning, law, and policy

Kyle Jamieson, Room 306

  • Research areas: Wireless and mobile networking; indoor radar and indoor localization; Internet of Things
  • See other topics on my independent work  ideas page  (campus IP and CS dept. login req'd)

Alan Kaplan, 221 Nassau Street, Room 105

Research Areas:

  • Random apps of kindness - mobile application/technology frameworks used to help individuals or communities; topic areas include, but are not limited to: first response, accessibility, environment, sustainability, social activism, civic computing, tele-health, remote learning, crowdsourcing, etc.
  • Tools automating programming language interoperability - Java/C++, React Native/Java, etc.
  • Software visualization tools for education
  • Connected consumer devices, applications and protocols

Brian Kernighan, Room 311

  • Research Areas: application-specific languages, document preparation, user interfaces, software tools, programming methodology
  • Application-oriented languages, scripting languages.
  • Tools; user interfaces
  • Digital humanities

Zachary Kincaid, Room 219

  • Research areas: programming languages, program analysis, program verification, automated reasoning
  • Independent Research Topics:
  • Develop a practical algorithm for an intractable problem (e.g., by developing practical search heuristics, or by reducing to, or by identifying a tractable sub-problem, ...).
  • Design a domain-specific programming language, or prototype a new feature for an existing language.
  • Any interesting project related to programming languages or logic.

Gillat Kol, Room 316

  • Research area: theory

Aleksandra Korolova, 309 Sherrerd Hall

  • Research areas: Societal impacts of algorithms and AI; privacy; fair and privacy-preserving machine learning; algorithm auditing.

Advisees typically have taken one or more of COS 226, COS 324, COS 423, COS 424 or COS 445.

Pravesh Kothari, Room 320

  • Research areas: Theory

Amit Levy, Room 307

  • Research Areas: Operating Systems, Distributed Systems, Embedded Systems, Internet of Things
  • Distributed hardware testing infrastructure
  • Second factor security tokens
  • Low-power wireless network protocol implementation
  • USB device driver implementation

Kai Li, Room 321

  • Research Areas: Distributed systems; storage systems; content-based search and data analysis of large datasets.
  • Fast communication mechanisms for heterogeneous clusters.
  • Approximate nearest-neighbor search for high dimensional data.
  • Data analysis and prediction of in-patient medical data.
  • Optimized implementation of classification algorithms on manycore processors.

Xiaoyan Li, 221 Nassau Street, Room 104

  • Research areas: Information retrieval, novelty detection, question answering, AI, machine learning and data analysis.
  • Explore new statistical retrieval models for document retrieval and question answering.
  • Apply AI in various fields.
  • Apply supervised or unsupervised learning in health, education, finance, and social networks, etc.
  • Any interesting project related to AI, machine learning, and data analysis.

Lydia Liu, Room 414

  • Research Areas: algorithmic decision making, machine learning and society
  • Theoretical foundations for algorithmic decision making (e.g. mathematical modeling of data-driven decision processes, societal level dynamics)
  • Societal impacts of algorithms and AI through a socio-technical lens (e.g. normative implications of worst case ML metrics, prediction and model arbitrariness)
  • Machine learning for social impact domains, especially education (e.g. responsible development and use of LLMs for education equity and access)
  • Evaluation of human-AI decision making using statistical methods (e.g. causal inference of long term impact)

Wyatt Lloyd, Room 323

  • Research areas: Distributed Systems
  • Caching algorithms and implementations
  • Storage systems
  • Distributed transaction algorithms and implementations

Alex Lombardi, Room 312

  • Research Areas: Theory

Margaret Martonosi, Room 208

  • Quantum Computing research, particularly related to architecture and compiler issues for QC.
  • Computer architectures specialized for modern workloads (e.g., graph analytics, machine learning algorithms, mobile applications)
  • Investigating security and privacy vulnerabilities in computer systems, particularly IoT devices.
  • Other topics in computer architecture or mobile / IoT systems also possible.

Jonathan Mayer, Sherrerd Hall, Room 307 

Available for Spring 2025 single-semester IW, only

  • Research areas: Technology law and policy, with emphasis on national security, criminal procedure, consumer privacy, network management, and online speech.
  • Assessing the effects of government policies, both in the public and private sectors.
  • Collecting new data that relates to government decision making, including surveying current business practices and studying user behavior.
  • Developing new tools to improve government processes and offer policy alternatives.

Mae Milano, Room 307

  • Local-first / peer-to-peer systems
  • Wide-area storage systems
  • Consistency and protocol design
  • Type-safe concurrency
  • Language design
  • Gradual typing
  • Domain-specific languages
  • Languages for distributed systems

Andrés Monroy-Hernández, Room 405

  • Research Areas: Human-Computer Interaction, Social Computing, Public-Interest Technology, Augmented Reality, Urban Computing
  • Research interests: developing public-interest socio-technical systems. We are currently creating alternatives to gig work platforms that are more equitable for all stakeholders. For instance, we are investigating the socio-technical affordances necessary to support a co-op food delivery network owned and managed by workers and restaurants. We are exploring novel system designs that support self-governance, decentralized/federated models, community-centered data ownership, and portable reputation systems. We have opportunities for students interested in human-centered computing, UI/UX design, full-stack software development, and qualitative/quantitative user research.
  • Beyond our core projects, we are open to working on research projects that explore the use of emerging technologies, such as AR, wearables, NFTs, and DAOs, for creative and out-of-the-box applications.

Christopher Moretti, Corwin Hall, Room 036

  • Research areas: Distributed systems, high-throughput computing, computer science/engineering education
  • Expansion, improvement, and evaluation of open-source distributed computing software.
  • Applications of distributed computing for "big science" (e.g. biometrics, data mining, bioinformatics)
  • Software and best practices for computer science education and study, especially Princeton's 126/217/226 sequence or MOOCs development
  • Sports analytics and/or crowd-sourced computing

Radhika Nagpal, F316 Engineering Quadrangle

  • Research areas: control, robotics and dynamical systems

Karthik Narasimhan, Room 422

  • Research areas: Natural Language Processing, Reinforcement Learning
  • Autonomous agents for text-based games ( https://www.microsoft.com/en-us/research/project/textworld/ )
  • Transfer learning/generalization in NLP
  • Techniques for generating natural language
  • Model-based reinforcement learning

Arvind Narayanan, 308 Sherrerd Hall 

Research Areas: fair machine learning (and AI ethics more broadly), the social impact of algorithmic systems, tech policy

Pedro Paredes, Corwin Hall, Room 041

My primary research work is in Theoretical Computer Science.

 * Research Interest: Spectral Graph theory, Pseudorandomness, Complexity theory, Coding Theory, Quantum Information Theory, Combinatorics.

The IW projects I am interested in advising can be divided into three categories:

 1. Theoretical research

I am open to advise work on research projects in any topic in one of my research areas of interest. A project could also be based on writing a survey given results from a few papers. Students should have a solid background in math (e.g., elementary combinatorics, graph theory, discrete probability, basic algebra/calculus) and theoretical computer science (226 and 240 material, like big-O/Omega/Theta, basic complexity theory, basic fundamental algorithms). Mathematical maturity is a must.

A (non-exhaustive) list of topics of projects I'm interested in:

 * Explicit constructions of better vertex expanders and/or unique neighbor expanders.
 * Constructions of deterministic or random high-dimensional expanders.
 * Pseudorandom generators for different problems.
 * Topics around the quantum PCP conjecture.
 * Topics around quantum error-correcting codes and locally testable codes, including constructions, encoding and decoding algorithms.

 2. Theory-informed practical implementations of algorithms

Very often, great advances in theoretical research are either not tested in practice or not feasible to implement in practice. Thus, I am interested in any project that tries to make theoretical ideas applicable in practice. This includes coming up with new algorithms that trade some theoretical guarantees for a feasible implementation while trying to retain the soul of the original idea; implementing new algorithms in a suitable programming language; and empirically testing practical implementations and comparing them with benchmarks / theoretical expectations. A project in this area doesn't have to be in my main areas of research; any theoretical result could be suitable for such a project.

Some examples of areas of interest:

 * Streaming algorithms.
 * Numerical linear algebra.
 * Property testing.
 * Parallel / distributed algorithms.
 * Online algorithms.

 3. Machine learning with a theoretical foundation

I am interested in machine learning projects that have some mathematical/theoretical component, even if most of the project is applied. This includes topics like mathematical optimization, statistical learning, fairness, and privacy.

One particular area I have recently been interested in is rating systems (e.g., chess Elo ratings) and their applications to the experts problem; a small illustrative sketch follows.
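
To make the rating-system setting concrete, here is a minimal sketch (not tied to any particular project) of the standard Elo update rule; the K-factor of 32 and the starting ratings are arbitrary choices for illustration.

```python
def expected_score(r_a, r_b):
    """Expected score of player A against player B under the Elo model."""
    return 1.0 / (1.0 + 10 ** ((r_b - r_a) / 400.0))

def elo_update(r_a, r_b, score_a, k=32):
    """Return updated ratings after one game; score_a is 1 (win), 0.5 (draw), or 0 (loss)."""
    e_a = expected_score(r_a, r_b)
    new_a = r_a + k * (score_a - e_a)
    new_b = r_b + k * ((1.0 - score_a) - (1.0 - e_a))
    return new_a, new_b

# Example: a 1500-rated player beats a 1700-rated player.
print(elo_update(1500, 1700, 1.0))  # the winner gains roughly 24 rating points
```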

Final Note: I am also willing to advise any project with any mathematical/theoretical component, even if it's not the main one; please reach out via email to chat about project ideas.

Iasonas Petras, Corwin Hall, Room 033

  • Research Areas: Information Based Complexity, Numerical Analysis, Quantum Computation.
  • Prerequisites: Reasonable mathematical maturity. In case of a project related to Quantum Computation a certain familiarity with quantum mechanics is required (related courses: ELE 396/PHY 208).
  • Possible research topics include:

1.   Quantum algorithms and circuits:

  • i. Design or simulation of quantum circuits implementing quantum algorithms.
  • ii. Design of quantum algorithms solving/approximating continuous problems (such as Eigenvalue problems for Partial Differential Equations).

2.   Information Based Complexity:

  • i. Necessary and sufficient conditions for tractability of Linear and Linear Tensor Product Problems in various settings (for example worst case or average case). 
  • ii. Necessary and sufficient conditions for tractability of Linear and Linear Tensor Product Problems under new tractability and error criteria.
  • iii. Necessary and sufficient conditions for tractability of Weighted problems.
  • iv. Necessary and sufficient conditions for tractability of Weighted Problems under new tractability and error criteria.

3. Topics in Scientific Computation:

  • i. Randomness, pseudorandomness, Monte Carlo (MC) and quasi-Monte Carlo (QMC) methods and their applications (finance, etc.)

Yuri Pritykin, 245 Carl Icahn Lab

  • Research interests: Computational biology; Cancer immunology; Regulation of gene expression; Functional genomics; Single-cell technologies.
  • Potential research projects: Development, implementation, assessment and/or application of algorithms for analysis, integration, interpretation and visualization of multi-dimensional data in molecular biology, particularly single-cell and spatial genomics data.

Benjamin Raphael, Room 309  

  • Research interests: Computational biology and bioinformatics; Cancer genomics; Algorithms and machine learning approaches for analysis of large-scale datasets
  • Implementation and application of algorithms to infer evolutionary processes in cancer
  • Identifying correlations between combinations of genomic mutations in human and cancer genomes
  • Design and implementation of algorithms for genome sequencing from new DNA sequencing technologies
  • Graph clustering and network anomaly detection, particularly using diffusion processes and methods from spectral graph theory

Vikram Ramaswamy, 035 Corwin Hall

  • Research areas: Interpretability of AI systems, Fairness in AI systems, Computer vision.
  • Constructing a new method to explain a model / create an interpretable by design model
  • Analyzing a current model / dataset to understand bias within the model/dataset
  • Proposing new fairness evaluations
  • Proposing new methods to train to improve fairness
  • Developing synthetic datasets for fairness / interpretability benchmarks
  • Understanding robustness of models

Ran Raz, Room 240

  • Research Area: Computational Complexity
  • Independent Research Topics: Computational Complexity, Information Theory, Quantum Computation, Theoretical Computer Science

Szymon Rusinkiewicz, Room 406

  • Research Areas: computer graphics; computer vision; 3D scanning; 3D printing; robotics; documentation and visualization of cultural heritage artifacts
  • Research ways of incorporating rotation invariance into computer vision tasks such as feature matching and classification
  • Investigate approaches to robust 3D scan matching
  • Model and compensate for imperfections in 3D printing
  • Given a collection of small mobile robots, apply control policies learned in simulation to the real robots.

Olga Russakovsky, Room 408

  • Research Areas: computer vision, machine learning, deep learning, crowdsourcing, fairness & bias in AI
  • Design a semantic segmentation deep learning model that can operate in a zero-shot setting (i.e., recognize and segment objects not seen during training)
  • Develop a deep learning classifier that is impervious to protected attributes (such as gender or race) that may be erroneously correlated with target classes
  • Build a computer vision system for the novel task of inferring what object (or part of an object) a human is referring to when pointing to a single pixel in the image. This includes both collecting an appropriate dataset using crowdsourcing on Amazon Mechanical Turk, creating a new deep learning formulation for this task, and running extensive analysis of both the data and the model

Sebastian Seung, Princeton Neuroscience Institute, Room 153

  • Research Areas: computational neuroscience, connectomics, "deep learning" neural networks, social computing, crowdsourcing, citizen science
  • Gamification of neuroscience (EyeWire  2.0)
  • Semantic segmentation and object detection in brain images from microscopy
  • Computational analysis of brain structure and function
  • Neural network theories of brain function

Jaswinder Pal Singh, Room 324

  • Research Areas: Boundary of technology and business/applications; building and scaling technology companies with special focus at that boundary; parallel computing systems and applications: parallel and distributed applications and their implications for software and architectural design; system software and programming environments for multiprocessors.
  • Develop a startup company idea, and build a plan/prototype for it.
  • Explore tradeoffs at the boundary of technology/product and business/applications in a chosen area.
  • Study and develop methods to infer insights from data in different application areas, from science to search to finance to others. 
  • Design and implement a parallel application. Possible areas include graphics, compression, biology, among many others. Analyze performance bottlenecks using existing tools, and compare programming models/languages.
  • Design and implement a scalable distributed algorithm.

Mona Singh, Room 420

  • Research Areas: computational molecular biology, as well as its interface with machine learning and algorithms.
  • Whole and cross-genome methods for predicting protein function and protein-protein interactions.
  • Analysis and prediction of biological networks.
  • Computational methods for inferring specific aspects of protein structure from protein sequence data.
  • Any other interesting project in computational molecular biology.

Robert Tarjan, 194 Nassau St., Room 308

  • Research Areas: Data structures; graph algorithms; combinatorial optimization; computational complexity; computational geometry; parallel algorithms.
  • Implement one or more data structures or combinatorial algorithms to provide insight into their empirical behavior.
  • Design and/or analyze various data structures and combinatorial algorithms.

Olga Troyanskaya, Room 320

  • Research Areas: Bioinformatics; analysis of large-scale biological data sets (genomics, gene expression, proteomics, biological networks); algorithms for integration of data from multiple data sources; visualization of biological data; machine learning methods in bioinformatics.
  • Implement and evaluate one or more gene expression analysis algorithm.
  • Develop algorithms for assessment of performance of genomic analysis methods.
  • Develop, implement, and evaluate visualization tools for heterogeneous biological data.

David Walker, Room 211

  • Research Areas: Programming languages, type systems, compilers, domain-specific languages, software-defined networking and security
  • Independent Research Topics:  Any other interesting project that involves humanitarian hacking, functional programming, domain-specific programming languages, type systems, compilers, software-defined networking, fault tolerance, language-based security, theorem proving, logic or logical frameworks.

Shengyi Wang, Postdoctoral Research Associate, Room 216

Available for Fall 2024 single-semester IW, only

  • Independent Research topics: Explore Escher-style tilings using (introductory) group theory and automata theory to produce beautiful pictures.

Kevin Wayne, Corwin Hall, Room 040

  • Research Areas: design, analysis, and implementation of algorithms; data structures; combinatorial optimization; graphs and networks.
  • Design and implement computer visualizations of algorithms or data structures.
  • Develop pedagogical tools or programming assignments for the computer science curriculum at Princeton and beyond.
  • Develop assessment infrastructure and assessments for MOOCs.

Matt Weinberg, 194 Nassau St., Room 222

  • Research Areas: algorithms, algorithmic game theory, mechanism design, game theoretical problems in {Bitcoin, networking, healthcare}.
  • Theoretical questions related to COS 445 topics such as matching theory, voting theory, auction design, etc. 
  • Theoretical questions related to incentives in applications like Bitcoin, the Internet, health care, etc. In a little bit more detail: protocols for these systems are often designed assuming that users will follow them. But often, users will actually be strictly happier to deviate from the intended protocol. How should we reason about user behavior in these protocols? How should we design protocols in these settings?
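
As a toy illustration of this kind of reasoning (hypothetical payoffs, not a model of any real protocol), the sketch below checks whether following an intended protocol remains a best response when a selfish deviation is available.

```python
# Hypothetical payoffs: payoff[(my_action, other_action)] for a two-player setting
# where "follow" is the intended protocol and "deviate" is a selfish alternative.
payoff = {
    ("follow", "follow"): 3,
    ("follow", "deviate"): 0,
    ("deviate", "follow"): 4,
    ("deviate", "deviate"): 1,
}

def best_response(other_action):
    """Return the action that maximizes my payoff against the other player's action."""
    return max(["follow", "deviate"], key=lambda a: payoff[(a, other_action)])

# Even if everyone else follows the protocol, deviating pays more in this toy game,
# so "everyone follows" is not an equilibrium here.
print(best_response("follow"))   # -> "deviate"
```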

Huacheng Yu, Room 310

  • data structures
  • streaming algorithms
  • design and analyze data structures / streaming algorithms
  • prove impossibility results (lower bounds)
  • implement and evaluate data structures / streaming algorithms

Ellen Zhong, Room 314

Opportunities outside the department.

We encourage students to look into doing interdisciplinary computer science research and to work with professors in departments other than computer science. However, every CS independent work project must have a strong computer science element (even if it has other scientific or artistic elements as well). To do a project with an adviser outside of computer science you must have permission of the department. This can be accomplished by having a second co-adviser within the computer science department or by contacting the independent work supervisor about the project and having them sign the independent work proposal form.

Here is a list of professors outside the computer science department who are eager to work with computer science undergraduates.

Maria Apostolaki, Engineering Quadrangle, C330

  • Research areas: Computing & Networking, Data & Information Science, Security & Privacy

Branko Glisic, Engineering Quadrangle, Room E330

  • Documentation of historic structures
  • Cyber physical systems for structural health monitoring
  • Developing virtual and augmented reality applications for documenting structures
  • Applying machine learning techniques to generate 3D models from 2D plans of buildings
  • Contact: Rebecca Napolitano, rkn2 (@princeton.edu)

Mihir Kshirsagar, Sherrerd Hall, Room 315

Center for Information Technology Policy.

  • Consumer protection
  • Content regulation
  • Competition law
  • Economic development
  • Surveillance and discrimination

Sharad Malik, Engineering Quadrangle, Room B224

  • Design of reliable hardware systems
  • Verifying complex software and hardware systems

Prateek Mittal, Engineering Quadrangle, Room B236

  • Internet security and privacy 
  • Social Networks
  • Privacy technologies, anonymous communication
  • Network Science
  • Internet security and privacy: The insecurity of Internet protocols and services threatens the safety of our critical network infrastructure and billions of end users. How can we defend end users as well as our critical network infrastructure from attacks?
  • Trustworthy social systems: Online social networks (OSNs) such as Facebook, Google+, and Twitter have revolutionized the way our society communicates. How can we leverage social connections between users to design the next generation of communication systems?
  • Privacy Technologies: Privacy on the Internet is eroding rapidly, with businesses and governments mining sensitive user information. How can we protect the privacy of our online communications? The Tor project (https://www.torproject.org/) is a potential application of interest.

Ken Norman,  Psychology Dept, PNI 137

  • Research Areas: Memory, the brain and computation 
  • Lab:  Princeton Computational Memory Lab

Potential research topics

  • Methods for decoding cognitive state information from neuroimaging data (fMRI and EEG) 
  • Neural network simulations of learning and memory

Caroline Savage

Office of Sustainability, Phone:(609)258-7513, Email: cs35 (@princeton.edu)

The  Campus as Lab  program supports students using the Princeton campus as a living laboratory to solve sustainability challenges. The Office of Sustainability has created a list of campus as lab research questions, filterable by discipline and topic, on its  website .

An example from Computer Science could include using  TigerEnergy , a platform which provides real-time data on campus energy generation and consumption, to study one of the many energy systems or buildings on campus. Three CS students used TigerEnergy to create a  live energy heatmap of campus .

Other potential projects include:

  • Apply game theory to sustainability challenges
  • Develop a tool to help visualize interactions between complex campus systems, e.g. energy and water use, transportation and storm water runoff, purchasing and waste, etc.
  • How can we learn (in aggregate) about individuals’ waste, energy, transportation, and other behaviors without impinging on privacy?

Janet Vertesi, Sociology Dept, Wallace Hall, Room 122

  • Research areas: Sociology of technology; Human-computer interaction; Ubiquitous computing.
  • Possible projects: At the intersection of computer science and social science, my students have built mixed reality games, produced artistic and interactive installations, and studied mixed human-robot teams, among other projects.

David Wentzlaff, Engineering Quadrangle, Room 228

Computing, Operating Systems, Sustainable Computing.

  • Instrument Princeton's Green (HPCRC) data center
  • Investigate power utilization on a processor core implemented in an FPGA
  • Dismantle and document all of the components in modern electronics. Invent new ways to build computers that can be recycled more easily.
  • Other topics in parallel computer architecture or operating systems

Information Technology Dissertation Topics

Information technology is one of the fastest-moving fields of the twenty-first century, and researchers describe it as undergoing an era of transformation. Yet, despite the hype, many students struggle to settle on a topic for their Information Technology degree.

A comprehensive list of dissertation topics in the field of information systems is provided below so that students can pick a topic that suits their interests and research.

List of IT Dissertation Topics Having Potential for Research

  • A literature analysis on the information quality management framework
  • A comprehensive investigation of the information system hierarchy
  • Big data and business intelligence are essential for sustainable development in organisations: Discuss a UK-based perspective
  • Correlation between Information systems management and risk management infrastructure to achieve business risk resilience
  • Impact of the Coronavirus on the management of X country’s information systems
  • The function of structured versus unstructured data in managing information systems
  • A review of the literature on business intelligence management and information systems
  • Pre- and post-COVID analysis of the impact of information systems on organisational performance
  • Implementing IT governance and managing information systems
  • A descriptive overview of IS strategic planning and management services
  • A review of the literature on international information system security
  • Information systems management historical analysis focusing on the last three decades
  • The part that planning, alignment, and leadership play in information systems management
  • A systematic review of the post-COVID era for information systems management research
  • Difficulties and possible challenges in the International Management of Information systems
  • A thorough analysis of information policy and global information systems management
  • How to handle data management in the era of 5G technologies?
  • Human-computer interaction’s effect on innovations
  • How does machine learning introduce students to more modern career opportunities?
  • Consider the use of molecular information systems in biotechnology
  • How has information technology aided in the processing of natural language?
  • What are the most recent advancements in software engineering and programming languages?
  • An examination of new potential in the robotics industry.
  • What factors should I take into account while buying a bandwidth monitor?
  • How do we develop an efficient clinic management system for intensive care?
  • Reasons why e-waste management solutions should be used worldwide ASAP
  • Motives for why cyberbullying persists in modern communication technologies
  • Interpersonal communication has changed as a result of the development of information technology
  • The effect of 3D printing on medical practice
  • How well do colleges and universities produce qualified computer scientists?
  • Using robots in infectious disease units
  • How ethical hacking has become more harmful
  • Why having specialised financial systems is important
  • What is the best security precaution: A fingerprint or a serial number?
  • How to strengthen patent protection for technical advances?
  • An overview of the many software security measures.

Computer Technology Research Paper Topics

This list of computer technology research paper topics provides 33 potential topics for research papers, along with an overview of the history of computer technology.

1. Analog Computers

Paralleling the split between analog and digital computers, in the 1950s the term analog computer was a posteriori projected onto pre-existing classes of mechanical, electrical, and electromechanical computing artifacts, subsuming them under the same category. The concept of analog, like the technical demarcation between analog and digital computer, was absent from the vocabulary of those classifying artifacts for the 1914 Edinburgh Exhibition, the first world’s fair emphasizing computing technology, and this leaves us with an invaluable index of the impressive number of classes of computing artifacts amassed during the few centuries of capitalist modernity. True, from the debate between ‘‘smooth’’ and ‘‘lumpy’’ artificial lines of computing (1910s) to the differentiation between ‘‘continuous’’ and ‘‘cyclic’’ computers (1940s), the subsequent analog–digital split became possible by the multitudinous accumulation of attempts to decontextualize the computer from its socio-historical use alternately to define the ideal computer technically. The fact is, however, that influential classifications of computing technology from the previous decades never provided an encompassing demarcation compared to the analog– digital distinction used since the 1950s. Historians of the digital computer find that the experience of working with software was much closer to art than science, a process that was resistant to mass production; historians of the analog computer find this to have been typical of working with the analog computer throughout all its aspects. The historiography of the progress of digital computing invites us to turn to the software crisis, which perhaps not accidentally, surfaced when the crisis caused by the analog ended. Noticeably, it was not until the process of computing with a digital electronic computer became sufficiently visual by the addition of a special interface—to substitute for the loss of visualization that was previously provided by the analog computer—that the analog computer finally disappeared.

2. Artificial Intelligence

Artificial intelligence (AI) is the field of software engineering that builds computer systems and occasionally robots to perform tasks that require intelligence. The term ''artificial intelligence'' was coined by John McCarthy at a summer workshop held at Dartmouth in 1956. This two-month workshop marks the official birth of AI, which brought together young researchers who would nurture the field as it grew over the next several decades: Marvin Minsky, Claude Shannon, Arthur Samuel, Ray Solomonoff, Oliver Selfridge, Allen Newell, and Herbert Simon. It would be difficult to argue that the technologies derived from AI research had a profound effect on our way of life by the beginning of the 21st century. However, AI technologies have been successfully applied in many industrial settings, medicine and health care, and video games. Programming techniques developed in AI research were incorporated into more widespread programming practices, such as high-level programming languages and time-sharing operating systems. While AI did not succeed in constructing a computer which displays the general mental capabilities of a typical human, such as the HAL computer in Arthur C. Clarke and Stanley Kubrick's film 2001: A Space Odyssey, it has produced programs that perform some apparently intelligent tasks, often at a much greater level of skill and reliability than humans. More than this, AI has provided a powerful and defining image of what computer technology might someday be capable of achieving.

3. Computer and Video Games

Interactive computer and video games were first developed in laboratories as the late-night amusements of computer programmers or independent projects of television engineers. Their formats include computer software; networked, multiplayer games on time-shared systems or servers; arcade consoles; home consoles connected to television sets; and handheld game machines. The first experimental projects grew out of early work in computer graphics, artificial intelligence, television technology, hardware and software interface development, computer-aided education, and microelectronics. Important examples were Willy Higinbotham’s oscilloscope-based ‘‘Tennis for Two’’ at the Brookhaven National Laboratory (1958); ‘‘Spacewar!,’’ by Steve Russell, Alan Kotok, J. Martin Graetz and others at the Massachusetts Institute of Technology (1962); Ralph Baer’s television-based tennis game for Sanders Associates (1966); several networked games from the PLATO (Programmed Logic for Automatic Teaching Operations) Project at the University of Illinois during the early 1970s; and ‘‘Adventure,’’ by Will Crowther of Bolt, Beranek & Newman (1972), extended by Don Woods at Stanford University’s Artificial Intelligence Laboratory (1976). The main lines of development during the 1970s and early 1980s were home video consoles, coin-operated arcade games, and computer software.

4. Computer Displays

The display is an essential part of any general-purpose computer. Its function is to act as an output device to communicate data to humans using the highest bandwidth input system that humans possess—the eyes. Much of the development of computer displays has been about trying to get closer to the limits of human visual perception in terms of color and spatial resolution. Mainframe and minicomputers used ''terminals'' to display the output. These were fed data from the host computer and processed the data to create screen images using a graphics processor. The display was typically integrated with a keyboard system and some communication hardware as a terminal or video display unit (VDU) following the basic model used for teletypes. Personal computers (PCs) in the late 1970s and early 1980s changed this model by integrating the graphics controller into the computer chassis itself. Early PC displays typically displayed only monochrome text and communicated in character codes such as ASCII. Line-scanning frequencies were typically from 15 to 20 kilohertz—similar to television. CRT displays rapidly developed after the introduction of video graphics array (VGA) technology (640 by 480 pixels in 16 colors) in the mid-1980s and scan frequencies rose to 60 kilohertz or more for mainstream displays; 100 kilohertz or more for high-end displays. These displays were capable of displaying formats up to 2048 by 1536 pixels with high color depths. Because the human eye is very quick to respond to visual stimulation, developments in display technology have tended to track the development of semiconductor technology that allows the rapid manipulation of the stored image.
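
As a rough back-of-the-envelope illustration (not part of the original article), the memory a display mode requires follows directly from its resolution and color depth; 16 colors correspond to 4 bits per pixel.

```python
def framebuffer_bytes(width, height, bits_per_pixel):
    """Uncompressed framebuffer size in bytes for a given display mode."""
    return width * height * bits_per_pixel // 8

# VGA: 640 x 480 with 16 colors (4 bits per pixel) -> 153,600 bytes (~150 KB)
print(framebuffer_bytes(640, 480, 4))
# A high-end CRT mode: 2048 x 1536 at 24-bit color -> roughly 9.4 million bytes
print(framebuffer_bytes(2048, 1536, 24))
```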

5. Computer Memory for Personal Computers

During the second half of the twentieth century, the two primary methods used for the long-term storage of digital information were magnetic and optical recording. These methods were selected primarily on the basis of cost. Compared to core or transistorized random-access memory (RAM), storage costs for magnetic and optical media were several orders of magnitude cheaper per bit of information and were not volatile; that is, the information did not vanish when electrical power was turned off. However, access to information stored on magnetic and optical recorders was much slower compared to RAM memory. As a result, computer designers used a mix of both types of memory to accomplish computational tasks. Designers of magnetic and optical storage systems have sought meanwhile to increase the speed of access to stored information to increase the overall performance of computer systems, since most digital information is stored magnetically or optically for reasons of cost.

6. Computer Modeling

Computer simulation models have transformed the natural, engineering, and social sciences, becoming crucial tools for disciplines as diverse as ecology, epidemiology, economics, urban planning, aerospace engineering, meteorology, and military operations. Computer models help researchers study systems of extreme complexity, predict the behavior of natural phenomena, and examine the effects of human interventions in natural processes. Engineers use models to design everything from jets and nuclear-waste repositories to diapers and golf clubs. Models enable astrophysicists to simulate supernovas, biochemists to replicate protein folding, geologists to predict volcanic eruptions, and physiologists to identify populations at risk of lead poisoning. Clearly, computer models provide a powerful means of solving problems, both theoretical and applied.

7. Computer Networks

Computers and computer networks have changed the way we do almost everything—the way we teach, learn, do research, access or share information, communicate with each other, and even the way we entertain ourselves. A computer network, in simple terms, consists of two or more computing devices (often called nodes) interconnected by means of some medium capable of transmitting data that allows the computers to communicate with each other in order to provide a variety of services to users.
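
As a minimal sketch of this idea (using Python's standard socket library; the port number is an arbitrary choice for the example), two "nodes" on the same machine can exchange data over a TCP connection and provide a trivial echo service.

```python
import socket, threading

# One node listens for connections over the shared medium (here, the loopback interface)...
srv = socket.socket(socket.AF_INET, socket.SOCK_STREAM)
srv.bind(("127.0.0.1", 50007))
srv.listen(1)

def handle_one_client():
    conn, _ = srv.accept()
    with conn:
        data = conn.recv(1024)            # receive a request from the other node
        conn.sendall(b"echo: " + data)    # provide a trivial "service": echo it back

threading.Thread(target=handle_one_client, daemon=True).start()

# ...and the other node connects and exchanges data with it.
with socket.socket(socket.AF_INET, socket.SOCK_STREAM) as cli:
    cli.connect(("127.0.0.1", 50007))
    cli.sendall(b"hello")
    print(cli.recv(1024))   # -> b'echo: hello'
srv.close()
```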

8. Computer Science

Computer science occupies a unique position among the scientific and technical disciplines. It revolves around a specific artifact—the electronic digital computer—that touches upon a broad and diverse set of fields in its design, operation, and application. As a result, computer science represents a synthesis and extension of many different areas of mathematics, science, engineering, and business.

9. Computer-Aided Control Technology

The story of computer-aided control technology is inextricably entwined with the modern history of automation. Automation in the first half of the twentieth century involved (often analog) processes for continuous automatic measurement and control of hardware by hydraulic, mechanical, or electromechanical means. These processes facilitated the development and refinement of battlefield fire-control systems, feedback amplifiers for use in telephony, electrical grid simulators, numerically controlled milling machines, and dozens of other innovations.

10. Computer-Aided Design and Manufacture

Computer-aided design and manufacture, known by the acronym CAD/CAM, is a process for manufacturing mechanical components, wherein computers are used to link the information needed in and produced by the design process to the information needed to control the machine tools that produce the parts. However, CAD/CAM actually constitutes two separate technologies that developed along similar, but unrelated, lines until they were combined in the 1970s.

11. Computer-User Interface

A computer interface is the point of contact between a person and an electronic computer. Today's interfaces include a keyboard, mouse, and display screen. Computer user interfaces developed through three distinct stages, which can be identified as batch processing, interactive computing, and the graphical user interface (GUI). Today's graphical interfaces support additional multimedia features, such as streaming audio and video. In GUI design, every new software feature introduces more icons into the process of computer–user interaction. Presently, the large vocabulary of icons used in GUI design is difficult for users to remember, which creates a complexity problem. As GUIs become more complex, interface designers are adding voice recognition and intelligent agent technologies to make computer user interfaces even easier to operate.

12. Early Computer Memory

Mechanisms to store information were present in early mechanical calculating machines, going back to Charles Babbage’s analytical engine proposed in the 1830s. It introduced the concept of the ‘‘store’’ and, if ever built, would have held 1000 numbers of up to 50 decimal digits. However, the move toward base-2 or binary computing in the 1930s brought about a new paradigm in technology—the digital computer, whose most elementary component was an on–off switch. Information on a digital system is represented using a combination of on and off signals, stored as binary digits (shortened to bits): zeros and ones. Text characters, symbols, or numerical values can all be coded as bits, so that information stored in digital memory is just zeros and ones, regardless of the storage medium. The history of computer memory is closely linked to the history of computers but a distinction should be made between primary (or main) and secondary memory. Computers only need operate on one segment of data at a time, and with memory being a scarce resource, the rest of the data set could be stored in less expensive and more abundant secondary memory.
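
A small illustration (not from the original text) of the point that text and numbers alike reduce to the same zeros and ones, whatever the storage medium:

```python
# Encode a short text as bits, and show a number stored as the same kind of pattern.
text = "Hi"
bits = "".join(f"{byte:08b}" for byte in text.encode("ascii"))
print(bits)               # -> 0100100001101001  (the ASCII codes for 'H' and 'i')

number = 50
print(f"{number:08b}")    # -> 00110010 (a numerical value stored as the same kind of bits)
```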

13. Early Digital Computers

Digital computers were a marked departure from the electrical and mechanical calculating and computing machines in wide use from the early twentieth century. The innovation was of information being represented using only two states (on or off), which came to be known as ‘‘digital.’’ Binary (base 2) arithmetic and logic provided the tools for these machines to perform useful functions. George Boole’s binary system of algebra allowed any mathematical equation to be represented by simply true or false logic statements. By using only two states, engineering was also greatly simplified, and universality and accuracy increased. Further developments from the early purpose-built machines, to ones that were programmable accompanied by many key technological developments, resulted in the well-known success and proliferation of the digital computer.
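
As a hedged illustration of how Boolean true/false logic yields binary arithmetic, the sketch below builds a half adder from just AND and XOR; this is a textbook construction, not something specific to the machines described above.

```python
def half_adder(a, b):
    """Add two bits; return (sum_bit, carry_bit), using only XOR and AND."""
    return a ^ b, a & b

# Enumerate the truth table: two on/off inputs produce a two-bit binary result.
for a in (0, 1):
    for b in (0, 1):
        s, c = half_adder(a, b)
        print(f"{a} + {b} -> carry {c}, sum {s}")
```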

14. Electronic Control Technology

The advancement of electrical engineering in the twentieth century made a fundamental change in control technology. New electronic devices including vacuum tubes (valves) and transistors were used to replace electromechanical elements in conventional controllers and to develop new types of controllers. In these practices, engineers discovered basic principles of control theory that could be further applied to design electronic control systems.

15. Encryption and Code Breaking

The word cryptography comes from the Greek words for ''hidden'' (kryptos) and ''to write'' (graphein)—literally, the science of ''hidden writing.'' In the twentieth century, cryptography became fundamental to information technology (IT) security generally. Before the invention of the digital computer at mid-century, national governments across the world relied on mechanical and electromechanical cryptanalytic devices to protect their own national secrets and communications, as well as to expose enemy secrets. Code breaking played an important role in both World Wars I and II, and the successful exploits of Polish and British cryptographers and signals intelligence experts in breaking the code of the German Enigma ciphering machine (which had a range of possible transformations between a message and its code of approximately 150 million million million, or about 1.5 × 10^20) are well documented.
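
As a toy illustration of ''hidden writing'' (far simpler than Enigma, and not drawn from the original text), a substitution cipher transforms a message under a secret key, here an arbitrary permutation of the alphabet.

```python
import string

# Toy substitution cipher: the key is a secret permutation of the lowercase alphabet.
# Enigma applied a far more complex permutation that changed with every letter typed.
plain_alphabet = string.ascii_lowercase
key = "qwertyuiopasdfghjklzxcvbnm"   # an arbitrary example key

encrypt_table = str.maketrans(plain_alphabet, key)
decrypt_table = str.maketrans(key, plain_alphabet)

ciphertext = "attack at dawn".translate(encrypt_table)
print(ciphertext)                          # -> "qzzqea qz rqvf"
print(ciphertext.translate(decrypt_table)) # -> "attack at dawn"
```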

16. Error Checking and Correction

In telecommunications, whether transmission of data or voice signals is over copper, fiber-optic, or wireless links, information coded in the signal transmitted must be decoded by the receiver from a background of noise. Signal errors can be introduced, for example from physical defects in the transmission medium (semiconductor crystal defects, dust or scratches on magnetic memory, bubbles in optical fibers), from electromagnetic interference (natural or manmade) or cosmic rays, or from cross-talk (unwanted coupling) between channels. In digital signal transmission, data is transmitted as ‘‘bits’’ (ones or zeros, corresponding to on or off in electronic circuits). Random bit errors occur singly and in no relation to each other. Burst error is a large, sustained error or loss of data, perhaps caused by transmission problems in the connecting cables, or sudden noise. Analog to digital conversion can also introduce sampling errors.
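
To make the idea of detecting random bit errors concrete, here is a minimal sketch of single-bit even parity, the simplest error-detecting code; it is an illustrative example rather than a scheme named in the text above.

```python
# Even parity: append one bit so the total number of ones is even. This detects any
# single-bit error but cannot correct it, and it misses errors that flip an even
# number of bits (such as many burst errors).
def add_parity(bits):
    return bits + [sum(bits) % 2]

def check_parity(word):
    return sum(word) % 2 == 0   # True if no odd number of bit flips is detected

word = add_parity([1, 0, 1, 1])
print(check_parity(word))       # -> True (received as transmitted)

word[2] ^= 1                    # a single bit error introduced in transit
print(check_parity(word))       # -> False (error detected)
```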

17. Global Positioning System (GPS)

The NAVSTAR (NAVigation System Timing And Ranging) Global Positioning System (GPS) provides an unlimited number of military and civilian users worldwide with continuous, highly accurate data on their position in four dimensions—latitude, longitude, altitude, and time—through all weather conditions. It includes space, control, and user segments. A constellation of 24 satellites in 10,900-nautical-mile, nearly circular orbits—six orbital planes, equally spaced 60 degrees apart, inclined approximately 55 degrees relative to the equator, and each with four equidistant satellites—transmits microwave signals in two different L-band frequencies. From any point on earth, between five and eight satellites are ''visible'' to the user. Synchronized, extremely precise atomic clocks—rubidium and cesium—aboard the satellites render the constellation semiautonomous by alleviating the need to continuously control the satellites from the ground. The control segment consists of a master facility at Schriever Air Force Base, Colorado, and a global network of automated stations. It passively tracks the entire constellation and, via an S-band uplink, periodically sends updated orbital and clock data to each satellite to ensure that navigation signals received by users remain accurate. Finally, GPS users—on land, at sea, in the air or space—rely on commercially produced receivers to convert satellite signals into position, time, and velocity estimates.

18. Gyrocompass and Inertial Guidance

Before the twentieth century, navigation at sea employed two complementary methods, astronomical and dead reckoning. The former involved direct measurements of celestial phenomena to ascertain position, while the latter required continuous monitoring of a ship’s course, speed, and distance run. New navigational technology was required not only for iron ships in which traditional compasses required correction, but for aircraft and submarines in which magnetic compasses cannot be used. Owing to their rapid motion, aircraft presented challenges for near instantaneous navigation data collection and reduction. Electronics furnished the exploitation of radio and the adaptation of a gyroscope to direction finding through the invention of the nonmagnetic gyrocompass.

Although the Cold War arms race after World War II led to the development of inertial navigation, German manufacture of the V-2 rocket under the direction of Wernher von Braun during the war involved a proto-inertial system, a two-gimballed gyro with an integrator to determine speed. Inertial guidance combines a gyrocompass with accelerometers installed along orthogonal axes, devices that record all accelerations of the vehicle in which inertial guidance has been installed. With this system, if the initial position of the vehicle is known, then the vehicle’s position at any moment is known because integrators record all directions and accelerations and calculate speeds and distance run. Inertial guidance devices can subtract accelerations due to gravity or other motions of the vehicle. Because inertial guidance does not depend on an outside reference, it is the ultimate dead reckoning system, ideal for the nuclear submarines for which they were invented and for ballistic missiles. Their self-contained nature makes them resistant to electronic countermeasures. Inertial systems were first installed in commercial aircraft during the 1960s. The expense of manufacturing inertial guidance mechanisms (and their necessary management by computer) has limited their application largely to military and some commercial purposes. Inertial systems accumulate errors, so their use at sea (except for submarines) has been as an adjunct to other navigational methods, unlike aircraft applications. Only the development of the global positioning system (GPS) at the end of the century promised to render all previous navigational technologies obsolete. Nevertheless, a range of technologies, some dating to the beginning of the century, remain in use in a variety of commercial and leisure applications.
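
As a sketch of the dead-reckoning computation described above (with made-up sample numbers), integrating measured acceleration twice recovers speed and distance run from a known starting point; gravity is assumed to have already been subtracted, as the text notes inertial systems do.

```python
# Dead reckoning along one axis: integrate acceleration to get velocity, then
# integrate velocity to get position, starting from a known initial state.
dt = 0.1                                    # seconds between accelerometer samples
accelerations = [2.0] * 50 + [0.0] * 50     # m/s^2: a 5-second burn, then coasting

position, velocity = 0.0, 0.0               # known initial position and speed
for a in accelerations:
    velocity += a * dt                      # integrate acceleration -> speed
    position += velocity * dt               # integrate speed -> distance run

print(velocity, position)   # roughly 10 m/s and 75 m after 10 seconds
```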

19. Hybrid Computers

Following the emergence of the analog–digital demarcation in the late 1940s, and the ensuing battle between the speedy analog and the accurate digital, the term ‘‘hybrid computer’’ surfaced in the early 1960s. The assumptions held by the adherents of the digital computer, regarding the dynamic mechanization of computational labor to accompany the equally dynamic increase in computational work, were becoming a universal ideology. From this perspective, the digital computer justly appeared to be technically superior. In introducing the digital computer to social realities, however, extensive interaction with the experienced analog computer adherents proved indispensable, especially given that the digital proponents’ expectation of progress by employing the available and inexpensive hardware was stymied by the lack of inexpensive software. From this perspective, however historiographically unwanted it may be by those who hold an essentialist conception of the analog–digital demarcation, the history of the hybrid computer suggests that the computer as we now know it was brought about by linking the analog and the digital, not by separating them. Placing the ideal analog and the ideal digital at the two poles, all computing techniques that combined some features of both fell under ‘‘hybrid computation’’; the designators ‘‘balanced’’ or ‘‘true’’ were reserved for those built with appreciable amounts of both. True hybrids occupied the middle of a spectrum that included: pure analog computers, analog computers using digital-type numerical analysis techniques, analog computers programmed with the aid of digital computers, analog computers using digital control and logic, analog computers using digital subunits, analog computers using digital computers as peripheral equipment, balanced hybrid computer systems, digital computers using analog subroutines, digital computers with analog arithmetic elements, digital computers designed to permit analog-type programming, digital computers with analog-oriented compilers and interpreters, and pure digital computers.

20. Information Theory

Information theory, also known originally as the mathematical theory of communication, was first explicitly formulated during the mid-twentieth century. Almost immediately it became a foundation: first, for the more systematic design and utilization of numerous telecommunication and information technologies; and second, for resolving a paradox in thermodynamics. Finally, information theory has contributed to new interpretations of a wide range of biological and cultural phenomena, from organic physiology and genetics to cognitive behavior, human language, economics, and political decision making. Reflecting the symbiosis between theory and practice typical of twentieth century technology, technical issues in early telegraphy and telephony gave rise to a proto-information theory developed by Harry Nyquist at Bell Labs in 1924 and Ralph Hartley, also at Bell Labs, in 1928. This theory in turn contributed to advances in telecommunications, which stimulated the development of information theory per se by Claude Shannon and Warren Weaver, in their book The Mathematical Theory of Communication published in 1949. As articulated by Claude Shannon, a Bell Labs researcher, the technical concept of information is defined by the probability of a specific message or signal being picked out from a number of possibilities and transmitted from A to B. Information in this sense is mathematically quantifiable. The amount of information, I, conveyed by a signal, S, is inversely related to its probability, P; that is, the more improbable a message, the more information it contains. To facilitate the mathematical analysis of messages, the measure is conveniently defined as I = log2(1/P(S)), and its unit is named the binary digit, or ‘‘bit’’ for short. Thus in the simplest case of a two-state signal (1 or 0, corresponding to on or off in electronic circuits), with equal probability for each state, the transmission of either state as the code for a message would convey one bit of information. The theory of information opened up by this conceptual analysis has become the basis for constructing and analyzing digital computational devices and a whole range of information technologies (i.e., technologies including telecommunications and data processing), from telephones to computer networks.
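As a quick illustration of the definition (the numbers here are the textbook two-state and eight-state cases, not measurements from any real channel):

import math

def bits(p):
    # Information conveyed by a signal whose probability is p: log2(1/p).
    return math.log2(1 / p)

def entropy(probs):
    # Average information per symbol for a source with the given probabilities.
    return sum(p * bits(p) for p in probs if p > 0)

print(bits(0.5))                    # 1.0 bit: either state of a fair on/off signal
print(bits(1 / 8))                  # 3.0 bits: one message picked from eight equally likely ones
print(entropy([0.5, 0.25, 0.25]))   # 1.5 bits per symbol on average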

21. Internet

The Internet is a global computer network of networks whose origins are found in U.S. military efforts. In response to Sputnik and the emerging space race, the Advanced Research Projects Agency (ARPA) was formed in 1958 as an agency of the Pentagon. The researchers at ARPA were given a generous mandate to develop innovative technologies such as communications.

In 1962, psychologist J.C.R. Licklider from the Massachusetts Institute of Technology’s Lincoln Laboratory joined ARPA to take charge of the Information Processing Techniques Office (IPTO). In 1963 Licklider wrote a memo proposing an interactive network allowing people to communicate via computer, but the project did not materialize. In 1966, Bob Taylor, then head of the IPTO, noted that he needed three different computer terminals to connect to three different machines in different locations around the nation. Taylor also recognized that universities working with IPTO needed more computing resources. Instead of the government buying machines for each university, why not share machines? Taylor revitalized Licklider’s idea, securing $1 million in funding, and hired 29-year-old Larry Roberts to direct the creation of ARPAnet.

In 1974, Robert Kahn and Vinton Cerf proposed the first internetworking protocol, a way for datagrams (packets) to be communicated between disparate networks, and they called it an ‘‘internet.’’ Their efforts created the transmission control protocol/internet protocol (TCP/IP). In 1982, TCP/IP replaced the older network control protocol (NCP) on ARPAnet. Other networks adopted TCP/IP, and it became the dominant standard for all networking by the late 1990s.

In 1981 the U.S. National Science Foundation (NSF) created the Computer Science Network (CSNET) to provide universities that did not have access to ARPAnet with their own network. In 1986, the NSF sponsored the NSFNET ‘‘backbone’’ to connect five supercomputing centers. The backbone also connected ARPAnet and CSNET together, and the idea of a network of networks became firmly entrenched. The open technical architecture of the Internet allowed numerous innovations to be grafted easily onto the whole. When ARPAnet was dismantled in 1990, the Internet was thriving at universities and technology-oriented companies. The NSF backbone was dismantled in 1995 when the NSF realized that commercial entities could keep the Internet running and growing on their own, without government subsidy. Commercial network providers worked through the Commercial Internet Exchange to manage network traffic.

22. Mainframe Computers

The term ‘‘computer’’ currently refers to a general-purpose, digital, electronic, stored-program calculating machine. The term ‘‘mainframe’’ refers to a large, expensive, multiuser computer, able to handle a wide range of applications. The term was derived from the main frame or cabinet in which the central processing unit (CPU) and main memory of a computer were kept separate from those cabinets that held peripheral devices used for input and output.

Computers are generally classified as supercomputers, mainframes, minicomputers, or microcomputers. This classification is based on factors such as processing capability, cost, and applications, with supercomputers the fastest and most expensive. All computers were called mainframes until the 1960s, including the first supercomputer, the naval ordnance research calculator (NORC), offered by International Business Machines (IBM) in 1954. In 1960, Digital Equipment Corporation (DEC) shipped the PDP-1, a computer that was much smaller and cheaper than a mainframe.

Early mainframes each filled a large room, cost millions of dollars, and needed a full maintenance staff, partly in order to repair the damage caused by the heat generated by their vacuum tubes. These machines were characterized by proprietary operating systems and connections through dumb terminals that had no local processing capabilities. As personal computers developed and began to approach mainframes in speed and processing power, however, mainframes evolved to support a client/server relationship and to interconnect with systems based on open standards. They have become particularly useful for systems that require reliability, security, and centralized control. Their ability to process large amounts of data quickly makes them particularly valuable for storage area networks (SANs). Mainframes today contain multiple CPUs, providing additional speed through multiprocessing operations. They support many hundreds of simultaneously executing programs, as well as numerous input and output processors for multiplexing devices such as video display terminals and disk drives. Many legacy systems, large applications that have been developed, tested, and used over time, are still running on mainframes.

23. Mineral Prospecting

Twentieth century mineral prospecting draws upon the accumulated knowledge of previous exploration and mining activities, advancing technology, expanding knowledge of geologic processes and deposit models, and mining and processing capabilities to determine where and how to look for minerals of interest. Geologic models have been developed for a wide variety of deposit types; the prospector compares geologic characteristics of potential exploration areas with those of deposit models to determine which areas have similar characteristics and are suitable prospecting locations. Mineral prospecting programs are often team efforts, integrating general and site-specific knowledge of geochemistry, geology, geophysics, and remote sensing to ‘‘discover’’ hidden mineral deposits and ‘‘measure’’ their economic potential with increasing accuracy and reduced environmental disturbance. Once a likely target zone has been identified, multiple exploration tools are used in a coordinated program to characterize the deposit and its economic potential.

24. Packet Switching

Historically the first communications networks were telegraphic—the electrical telegraph replacing the mechanical semaphore stations in the mid-nineteenth century. Telegraph networks were largely eclipsed by the advent of the voice (telephone) network, which first appeared in the late nineteenth century, and provided the immediacy of voice conversation. The Public Switched Telephone Network allows a subscriber to dial a connection to another subscriber, with the connection being a series of telephone lines connected together through switches at the telephone exchanges along the route. This technique is known as circuit switching, as a circuit is set up between the subscribers, and is held until the call is cleared.

One of the disadvantages of circuit switching is that the capacity of the link is often significantly underused because of silences in the conversation, yet the spare capacity cannot be shared with other traffic. Another disadvantage is the time it takes to establish the connection before the conversation can begin. One could liken this to sending a railway engine from London to Edinburgh to set the points before returning to pick up the carriages. What is required is a compromise between the immediacy of conversation on an established circuit-switched connection and the ad hoc delivery of a store-and-forward message system. This is what packet switching is designed to provide.
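A toy sketch of that compromise, assuming nothing about any real protocol: the message is cut into packets that carry a sequence number, may arrive out of order after travelling independently, and are reassembled at the destination.

import random

def packetize(message, size):
    # Split a message into fixed-size packets, each tagged with a sequence number.
    return [{"seq": i, "payload": message[i * size:(i + 1) * size]}
            for i in range((len(message) + size - 1) // size)]

def reassemble(packets):
    # The receiver restores the original order using the sequence numbers.
    return "".join(p["payload"] for p in sorted(packets, key=lambda p: p["seq"]))

packets = packetize("What hath God wrought", size=4)
random.shuffle(packets)             # packets may take different routes and arrive out of order
print(reassemble(packets))          # prints the original message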

25. Personal Computers

A personal computer, or PC, is designed for personal use. Its central processing unit (CPU) runs single-user system and application software, processes input from the user, and sends output to a variety of peripheral devices. Programs and data are stored in memory and attached storage devices. Personal computers are generally single-user desktop machines, but the term has been applied to any computer that ‘‘stands alone’’ for a single user, including portable computers.

The technology that enabled the construction of personal computers was the microprocessor, a programmable integrated circuit (or ‘‘chip’’) that acts as the CPU. Intel introduced the first microprocessor in 1971, the 4-bit 4004, which it called a ‘‘microprogrammable computer on a chip.’’ The 4004 was originally developed as a general-purpose chip for a programmable calculator, but Intel introduced it as part of its Microcomputer System 4-bit, or MCS-4, which also included read-only memory (ROM) and random-access memory (RAM) chips and a shift register chip. In 1972, Intel followed with the 8-bit 8008, then the more powerful 8080 in June 1974. Following Intel’s lead, computers based on the 8080 were usually called microcomputers.

The success of the minicomputer during the 1960s prepared computer engineers and users for ‘‘single person, single CPU’’ computers. Digital Equipment Corporation’s (DEC) widely used PDP-10, for example, was smaller, cheaper, and more accessible than large mainframe computers. Timeshared computers running operating systems such as TOPS-10 on the PDP-10, co-developed by the Massachusetts Institute of Technology (MIT) and DEC in 1972, created the illusion of individual control of computing power by providing rapid access to personal programs and files. By the early 1970s, the accessibility of minicomputers, advances in microelectronics, and component miniaturization created expectations of affordable personal computers.

26. Printers

Printers generally can be categorized as either impact or nonimpact. Like typewriters, impact printers generate output by striking the page with a solid substance. Impact printers include daisy wheel and dot matrix printers. The daisy wheel printer, introduced in 1972 by Diablo Systems, operates by spinning the daisy wheel to the correct character, whereupon a hammer strikes it, forcing the character through an inked ribbon and onto the paper. Dot matrix printers operate by using a matrix, or grid, of small pins to strike an ink-coated ribbon against the paper; the strike of each pin transfers ink to the paper at the point of impact. Unlike daisy wheel printers, dot matrix printers can generate italic and other character types by producing different pin patterns. Nonimpact printers generate images by spraying or fusing ink onto paper or other output media. This category includes inkjet printers, laser printers, and thermal printers. Whether they are inkjet or laser, impact or nonimpact, all modern printers incorporate features of dot matrix technology in their design: they operate by placing dots onto paper or other physical media.
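The dot principle is easy to demonstrate: any character is simply a grid of dots, so a different shape (an italic, say) only requires a different pin pattern. A tiny, made-up glyph, rendered with text characters standing in for pin strikes:

GLYPH_A = [          # 1 = fire the pin, 0 = leave blank
    "01110",
    "10001",
    "10001",
    "11111",
    "10001",
    "10001",
]

for row in GLYPH_A:
    print("".join("#" if pin == "1" else " " for pin in row))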

27. Processors for Computers

A processor is the part of the computer system that manipulates the data. The first computer processors of the late 1940s and early 1950s performed three main functions and had three main components. They worked in a cycle to fetch, decode, and execute instructions, and they were made up of the arithmetic and logic unit, the control unit, and some extra storage components, or registers. Today, most processors contain these components and perform these same functions, but since the 1960s they have developed different forms, capabilities, and organization. As with computers in general, increasing speed and decreasing size have marked their development.
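A minimal sketch of that fetch, decode, and execute cycle, using a made-up four-instruction machine; the instruction names and single accumulator register are illustrative, not any real instruction set.

def run(program, memory):
    regs = {"A": 0}                          # a single accumulator register
    pc = 0                                   # program counter
    while True:
        op, *args = program[pc]              # fetch the next instruction
        pc += 1
        if op == "LOAD":                     # decode and execute
            regs["A"] = memory[args[0]]
        elif op == "ADD":                    # the arithmetic and logic unit's job
            regs["A"] += memory[args[0]]
        elif op == "STORE":
            memory[args[0]] = regs["A"]
        elif op == "HALT":
            return memory

memory = {0: 2, 1: 3, 2: 0}
program = [("LOAD", 0), ("ADD", 1), ("STORE", 2), ("HALT",)]
print(run(program, memory))                  # {0: 2, 1: 3, 2: 5}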

28. Radionavigation

Astronomical and dead-reckoning techniques furnished the methods of navigating ships until the twentieth century, when the exploitation of radio waves, coupled with electronics, met the needs of aircraft with their fast speeds and transformed all navigational techniques. The application of radio to dead reckoning has allowed vessels to determine their positions in all weather by direction finding (known as radio direction finding, or RDF) or by hyperbolic systems. Another use of radio, radar (radio detection and ranging), enables vessels to determine their distance to, or their bearing from, objects of known position. Radionavigation complements traditional navigational methods by employing three frames of reference. First, radio enables a vessel to navigate by lines of bearing to shore transmitters (the most common use of radio), directly analogous to the use of lighthouses for bearings. Second, shore stations may take radio bearings of craft and relay computed positions to them. Third, radio beacons provide aircraft or ships with signals that function as true compasses.

29. Software Application Programs

At the beginning of the computer age, around the late 1940s, the inventors of the intelligent machine were not thinking about applications software, or any software other than that needed to run the bare machine for mathematical calculation. It was only when Maurice Wilkes’ young protégé David Wheeler crafted a tidy set of initial orders for the EDSAC, an early programmable digital computer, that users could string standard subroutines together into a program and have execution jump between them. This was the beginning of software as we know it: programs, beyond the system software that runs the bare hardware, that make the machine do whatever is desired. ‘‘Applications’’ are software other than the system programs that run the actual hardware. Manufacturers always had this software, and as the 1950s progressed they would ‘‘bundle’’ applications with hardware to make expensive computers more attractive. Some programming departments were even placed in the marketing departments.

30. Software Engineering

Software engineering aims to develop the programs that allow digital computers to do useful work in a systematic, disciplined manner that produces high-quality software on time and on budget. As computers have spread throughout industrialized societies, software has become a multibillion dollar industry. Both the users and developers of software depend a great deal on the effectiveness of the development process.

Software is a concept that didn’t even pertain to the first electronic digital computers. They were ‘‘programmed’’ through switches and patch cables that physically altered the electrical pathways of the machine. It was not until the Manchester Mark I, the first operational stored-program electronic digital computer, was developed in 1948 at the University of Manchester in England that configuring the machine to solve a specific problem became a matter of software rather than hardware. Subsequently, instructions were stored in memory along with data.

31. Supercomputers

Supercomputers are high-performance computing devices that are generally used for numerical calculation, for the study of physical systems either through numerical simulation or the processing of scientific data. Initially, they were large, expensive, mainframe computers, usually owned by government research labs. By the end of the twentieth century, they were more often networks of inexpensive small computers. The common element of all of these machines was their ability to perform high-speed floating-point arithmetic (binary arithmetic that approximates real numbers using a fixed number of bits), the basis of numerical computation.
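The fixed number of bits is why floating-point results are approximations rather than exact values, which is easy to see in a two-line illustration:

a = 0.1 + 0.2
print(a)                      # 0.30000000000000004: the nearest representable 64-bit value
print(abs(a - 0.3) < 1e-12)   # True: numerical codes compare with a tolerance, not equality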

With the advent of inexpensive supercomputers, these machines moved beyond the large government labs and into smaller research and engineering facilities. Some were used for the study of social science. A few were employed by business concerns, such as stock brokerages or graphic designers.

32. Systems Programs

The operating systems used in all computers today are a result of the development and organization of early systems programs designed to control and regulate the operations of computer hardware. Early computing machines such as the ENIAC of 1945 were ‘‘programmed’’ manually, by connecting cables and setting switches for each new calculation. With the advent of the stored-program computer in the late 1940s (the Manchester Mark I, the EDVAC, and the EDSAC (electronic delay storage automatic calculator)), the first system programs, such as assemblers and compilers, were developed and installed. These programs performed basic, oft-repeated operations of computer use, including converting programs into machine code, storing and retrieving files, managing computer resources and peripherals, and aiding in the compilation of new programs. With the advent of programming languages, and the dissemination of more computers in research centers, universities, and businesses during the late 1950s and 1960s, a large group of users began developing programs, improving usability, and organizing system programs into operating systems.

The 1970s and 1980s saw a turn away from some of the complications of system software, an interweaving of features from different operating systems, and the development of systems programs for the personal computer. In the early 1970s, two programmers from Bell Laboratories, Ken Thompson and Dennis Ritchie, developed a smaller, simpler operating system called UNIX. Unlike past system software, UNIX was portable and could be run on different computer systems. Due in part to low licensing fees and simplicity of design, UNIX increased in popularity throughout the 1970s. Research at the Xerox Palo Alto Research Center during the 1970s produced the graphical user interface (GUI), the style of system software later adopted for the Apple Macintosh computer. This type of system software filters the user’s interaction with the computer through graphics or icons representing computer processes. In 1985, a year after the release of the Apple Macintosh computer, a GUI was overlaid on Microsoft’s then dominant operating system, MS-DOS, to produce Microsoft Windows. The Microsoft Windows series of operating systems became and remains the dominant operating system on personal computers.

33. World Wide Web

The World Wide Web (Web) is a ‘‘finite but unbounded’’ collection of media-rich digital resources that are connected through high-speed digital networks. It relies upon an Internet protocol suite that supports cross-platform transmission and makes available a wide variety of media types (i.e., multimedia). The cross-platform delivery environment represents an important departure from more traditional network communications protocols such as e-mail, telnet, and the file transfer protocol (FTP) because it is content-centric. It is also to be distinguished from earlier document acquisition systems such as Gopher, designed in 1991, originally as a mainframe program but quickly implemented over networks, and the wide area information server (WAIS), also released in 1991. WAIS accommodated a narrower range of media formats and failed to include hyperlinks within its navigation protocol. Following the success of Gopher on the Internet, the Web quickly extended and enriched the metaphor of integrated browsing and navigation. This made it possible to navigate and peruse a wide variety of media types effortlessly on the Web, which in turn led to the Web’s hegemony as an Internet protocol.

History of Computer Technology

Computer Technology

The modern computer—the (electronic) digital computer in which the stored program concept is realized and hence self-modifying programs are possible—was only invented in the 1940s. Nevertheless, the history of computing (interpreted as the usage of modern computers) is only understandable against the background of the many forms of information processing, as well as the mechanical computing devices that solved mathematical problems, in the first half of the twentieth century. The part these several predecessors played in the invention and early history of the computer may be interpreted from two different perspectives: on the one hand, it can be argued that these machines prepared the way for the modern digital computer; on the other hand, it can be argued that the computer, which was invented as a mathematical instrument, was reconstructed to be a data-processing machine, a control mechanism, and a communication tool.

The invention and early history of the digital computer has its roots in two different kinds of developments: first, information processing in business and government bureaucracies; and second, the use and the search for mathematical instruments and methods that could solve mathematical problems arising in the sciences and in engineering.

Origins in Mechanical Office Equipment

The development of information processing in business and government bureaucracies had its origins in the late nineteenth century, which was not just an era of industrialization and mass production but also a time of continuous growth in administrative work. The economic precondition for this development was the creation of a global economy, which caused growth in production of goods and trade. This brought with it an immense increase in correspondence, as well as monitoring and accounting activities—corporate bureaucracies began to collect and process data in increasing quantities. Almost at the same time, government organizations became more and more interested in collating data on population and demographic changes (e.g., expanding tax revenues, social security, and wide-ranging planning and monitoring functions) and analyzing this data statistically.

Bureaucracies in the U.S. and in Europe reacted in different ways to these changes. While in Europe for the most part neither office machines nor telephones entered offices until 1900, in the U.S. the information-handling techniques of bureaucracies were radically changed in the last quarter of the nineteenth century by the introduction of mechanical devices for writing, copying, and counting data. The rise of big business in the U.S. had caused a growing demand for management control tools, which was fulfilled by a new ideology of systematic management together with the products of the rising office machine industry. Because of a later start in industrialization, the government and businesses in the U.S. were not forced to reorganize their bureaucracies when they introduced office machines. This, together with an ideological preference for modern office equipment, created a market for office machines and led to a far-reaching mechanization of office work in the U.S. In the 1880s typewriters and cash registers became very widespread, followed by adding machines and book-keeping machines in the 1890s. From 1880 onward, the makers of office machines in the U.S. underwent a period of enormous growth, and by 1920 the office machine industry annually generated about $200 million in revenue. In Europe, by comparison, mechanization of office work emerged about two decades later than in the U.S.—both Germany and Britain adopted the American system of office organization and extensive use of office machines for the most part no earlier than the 1920s.

During the same period the rise of a new office machine technology began. Punched card systems, initially invented by Herman Hollerith to analyze the U.S. census of 1890, were introduced. By 1911 Hollerith’s company had only about 100 customers, but after it merged in the same year with two other companies to become the Computing-Tabulating-Recording Company (CTR), it began a tremendous ascent to become the world leader in the office machine industry. CTR’s general manager, Thomas J. Watson, understood the extraordinary potential of these punched-card accounting devices, which enabled their users to process enormous amounts of data largely automatically, rapidly, and at an acceptable level of cost and effort. Due to Watson’s insights and his extraordinary management abilities, the company (by then renamed International Business Machines (IBM)) became the fourth largest office machine supplier in the world by 1928, topped only by Remington Rand, National Cash Register (NCR), and the Burroughs Adding Machine Company.

Origin of Calculating Devices and Analog Instruments

Compared with the fundamental changes in the world of corporate and government bureaucracies caused by office machinery during the late nineteenth and early twentieth century, calculating machines and instruments seemed to have only a minor influence in the world of science and engineering. Scientists and engineers had always been confronted with mathematical problems and had over the centuries developed techniques such as mathematical tables. However, many new mathematical instruments emerged in the nineteenth century and increasingly began to change the world of science and engineering. Apart from the slide rule, which came into popular use in Europe from the early nineteenth century onwards (and became the symbol of the engineer for decades), calculating machines and instruments were only produced on a large scale in the middle of the nineteenth century.

In the 1850s the production of calculating machines, as well as that of planimeters (used to measure the area of closed curves, a typical problem in land surveying), began, though on very different scales. Worldwide, fewer than 2,000 calculating machines were produced before 1880, but more than 10,000 planimeters had been produced by the early 1880s. Various types of specialized mathematical analog instruments were also produced on a very small scale in the late nineteenth century; among them were integraphs for the graphical solution of special types of differential equations, harmonic analyzers for the determination of Fourier coefficients of a periodic function, and tide predictors that could calculate the time and height of the ebb and flood tides.

Nonetheless, in 1900 only geodesists and astronomers (as well as part of the engineering community) made extensive use of mathematical instruments. In addition, the establishment of applied mathematics as a new discipline took place at German universities on a small scale and the use of apparatus and machines as well as graphical and numerical methods began to flourish during this time. After World War I, the development of engineering sciences and of technical physics gave a tremendous boost to applied mathematics in Germany and Britain. In general, scientists and engineers became more aware of the capabilities of calculating machines and a change of the calculating culture—from the use of tables to the use of calculating machines—took place.

One particular problem increasingly encountered by mechanical and electrical engineers in the 1920s was the solution of several types of differential equations that could not be solved analytically. As one important result of this development, a new type of analog instrument, the so-called ‘‘differential analyzer,’’ was invented in 1931 by the engineer Vannevar Bush at the Massachusetts Institute of Technology (MIT). In contrast to its predecessors, several types of integraphs, this machine (later called an analog computer) could be used to solve not only a special class of differential equation but a more general class of differential equations associated with engineering problems. Before the digital computer was invented in the 1940s there was intensive use of analog instruments similar to Bush’s differential analyzer, and a number of machines were constructed in the U.S. and in Europe on the model of Bush’s machine before and during World War II. Analog instruments also became increasingly important in several fields such as the firing control of artillery on warships or the control of rockets. It is worth mentioning that an analog computer could be constructed only for a limited class of scientific and engineering problems; weather forecasting and the problem of shock waves produced by an atomic bomb, for example, required the solution of partial differential equations, for which a digital computer was needed.

The Invention of the Computer

The invention of the electronic digital stored-program computer is directly connected with the development of numerical calculation tools for the solution of mathematical problems in the sciences and in engineering. The ideas that led to the invention of the computer were developed simultaneously by scientists and engineers in Germany, Britain, and the U.S. in the 1930s and 1940s. The first freely programmable, program-controlled automatic calculator was developed by the civil engineering student Konrad Zuse in Germany. Zuse started development work on program-controlled computing machines in the 1930s, when he had to deal with extensive calculations in statics, and in 1941 his Z3, which was based on electromechanical relay technology, became operational.

Several similar developments in the U.S. were in progress at the same time. In 1937 Howard Aiken, a physics student at Harvard University, approached IBM to build a program-controlled calculator, later called the ‘‘Harvard Mark I.’’ On the basis of a concept Aiken had developed from his experiences with the numerical solution of partial differential equations, the machine was built and became operational in 1944. At almost the same time a series of important relay computers was built at the Bell Laboratories in New York following a suggestion by George R. Stibitz. All these developments in the U.S. were spurred by the outbreak of World War II. The first large-scale programmable electronic computer, called the Colossus, was built in complete secrecy in 1943 to 1944 at Bletchley Park in Britain to help break the ciphers produced by the German Lorenz teleprinter machines.

However, it was neither these relay calculators nor the Colossus that proved decisive for the development of the universal computer, but the ENIAC (electronic numerical integrator and computer), which was developed at the Moore School of Engineering at the University of Pennsylvania. Extensive ballistic calculations were carried out there for the U.S. Army during World War II with the aid of the Bush ‘‘differential analyzer’’ and more than a hundred women (‘‘computers’’) working on mechanical desk calculators. Observing that this capacity was barely sufficient to compute the artillery firing tables, the physicist John W. Mauchly and the electronics engineer John Presper Eckert started developing the ENIAC, a digital version of the differential analyzer, in 1943 with funding from the U.S. Army.

In 1944 the mathematician John von Neumann turned his attention to the ENIAC because of his mathematical work on the Manhattan Project (on the implosion mechanism of the atomic bomb). While the ENIAC was being built, von Neumann and the ENIAC team drew up plans for a successor machine in order to remedy the shortcomings of the ENIAC concept, such as its very small memory and the time-consuming reprogramming (actually rewiring) required to change the setup for a new calculation. In these meetings the idea of a stored-program, universal machine evolved: memory would be used to store the program in addition to data, enabling the machine to execute conditional branches and change the flow of the program. The concept of a computer in the modern sense of the word was born, and in 1945 von Neumann wrote the influential ‘‘First Draft of a Report on the EDVAC,’’ which described the stored-program, universal computer. The logical structure presented in this draft report is now referred to as the ‘‘von Neumann architecture.’’ The EDVAC report was originally intended for internal use, but once made freely available it became the ‘‘bible’’ for computer pioneers throughout the world in the 1940s and 1950s. The first full-scale computer featuring the von Neumann architecture to provide a regular computing service operated at the University of Cambridge in the U.K.: in June 1949 the EDSAC (electronic delay storage automatic calculator), built by Maurice Wilkes and designed according to the EDVAC principles, became operational.

The Computer as a Scientific Instrument

As soon as the computer was invented, a growing demand for computers by scientists and engineers evolved, and numerous American and European universities started their own computer projects in the 1940s and 1950s. After the technical difficulties of building an electronic computer were solved, scientists grasped the opportunity to use the new scientific instrument for their research. For example, at the University of Göttingen in Germany, the early computers were used for the initial value problems of partial differential equations associated with hydrodynamic problems from atomic physics and aerodynamics. Another striking example was the application of von Neumann’s computer at the Institute for Advanced Study (IAS) in Princeton to numerical weather forecasts in 1950. As a result, numerical weather forecasts could be made on a regular basis from the mid-1950s onwards.

Mathematical methods have always been of importance for science and engineering, but only the use of the electronic digital computer (as an enabling technology) made it possible to broaden the application of mathematical methods to such a degree that, by the end of the twentieth century, research in science, medicine, and engineering without computer-based mathematical methods had become virtually inconceivable. A number of additional computer-based techniques, such as scientific visualization, medical imaging, computerized tomography, pattern recognition, image processing, and statistical applications, have become of the utmost significance for science, medicine, engineering, and the social sciences. In addition, the computer fundamentally changed the way engineers construct technical artifacts, through computer-based methods such as computer-aided design (CAD), computer-aided manufacture (CAM), computer-aided engineering, control applications, and finite-element methods. The most striking example, however, is the development of scientific computing and computer modeling, which became accepted as a third mode of scientific research complementing experimentation and theoretical analysis. Scientific computing and computer modeling rely on supercomputers as the enabling technology, and these became important tools of modern science, routinely used to simulate physical and chemical phenomena. These high-speed computers became equated with the machines developed by Seymour Cray, who built the fastest computers in the world for many years. The supercomputers he launched, such as the legendary Cray-1 of 1976, were the basis for computer modeling of real-world systems and helped, for example, the defense industry in the U.S. to build weapons systems and the oil industry to create geological models that show potential oil deposits.

Growth of Digital Computers in Business and Information Processing

When the digital computer was invented as a mathematical instrument in the 1940s, it could not have been foreseen that this new artifact would ever be of any importance in the business world. About 50 firms entered the computer business worldwide in the late 1940s and early 1950s, and the computer was reconstructed as a type of electronic data-processing machine that took the place of punched-card technology as well as other office machine technology. Mainly three types of companies built computers in the 1950s and 1960s: newly created computer firms (such as the company founded by the ENIAC inventors Eckert and Mauchly), electronics and control equipment firms (such as RCA and General Electric), and office appliance companies (such as Burroughs and NCR). Despite the fact that the first digital computers were put on the market by a German and a British company, U.S. firms dominated the world market from the 1950s onward, as these firms had the biggest market as well as financial support from the government.

Generally speaking, the Cold War exerted an enormous influence on the development of computer technology. Until the early 1960s the U.S. military and the defense industry were the central drivers of the digital computer expansion, serving as the main market for computer technology and shaping and speeding up the formation of the rising computer industry. Because of the U.S. military’s role as the ‘‘tester’’ of prototype hardware and software, it had a direct and lasting influence on technological developments; at the same time, the spread of computer technology was partly hindered by military secrecy. Even after the emergence of a large civilian computer market in the 1960s, the U.S. military maintained its influence by investing a great deal in computer hardware and software and in computer research projects.

From the middle of the 1950s onwards the world computer market was dominated by IBM, which accounted for more than 70 percent of the computer industry’s revenues until the mid-1970s. The reasons for IBM’s overwhelming success were diverse, but the company had a unique combination of technical and organizational capabilities at its disposal that prepared it perfectly for the mainframe computer market. In addition, IBM benefited from enormous government contracts, which helped it to develop excellence in computer technology and design. The greatest advantage of IBM, however, was without doubt its marketing organization and its reputation as a service-oriented firm, accustomed to working closely with customers to adapt machinery to specific problems, and this key difference between IBM and its competitors persisted right into the computer age.

During the late 1950s and early 1960s, the computer market, consisting of IBM and seven other companies called the ‘‘seven dwarves,’’ was dominated by IBM with its 650 and 1401 computers. By 1960 the market for computers was still small: only about 7,000 computers had been delivered by the computer industry, and at this time even IBM was primarily a punched-card machine supplier, punched-card equipment still being the major source of its income. Only in 1960 did a boom in demand for computers start, and by 1970 the number of computers installed worldwide had increased to more than 100,000. The computer industry was on track to become one of the world’s major industries, and it was totally dominated by IBM.

The outstanding computer system of this period was IBM’s System/360. It was announced in 1964 as a compatible family of computers sharing the same architecture and interchangeable peripheral devices, intended to solve IBM’s problems with a hotchpotch of incompatible product lines (which had created great difficulties in developing and maintaining many different hardware and software products). Although neither the hardware technology nor the systems programming was particularly advanced for the time, the System/360 established a new standard for mainframe computers for decades. Various computer firms in the U.S., Europe, Japan, and even Russia concentrated on copying components and peripherals for the System/360 or tried to build System/360-compatible computers.

The growth of the computer market during the 1960s was accompanied by market shakeouts: two of the ‘‘seven dwarves’’ left the computer business after the first computer recession in the early 1970s, after which the computer market was controlled by IBM and the BUNCH (Burroughs, UNIVAC, NCR, Control Data, and Honeywell). At the same time, an internationalization of the computer market took place, with U.S. companies controlling the world market for computers, which caused considerable fears over loss of national independence in European and Japanese governments and subsequently stirred up national computing programs. While the European attempts to create national champions, as well as the more general attempt to create a Europe-wide market for mainframe computers, failed in the end, Japan’s attempt to found a national computer industry was successful: to this day Japan is the only nation able to compete with the U.S. in a wide array of high-tech computer-related products.

Real-Time and Time-Sharing

Until the 1960s almost all computers in government and business ran batch-processing applications (i.e., the computers were used in much the same way as the punched-card accounting machines they had replaced). In the early 1950s, however, a new mode of computing called ‘‘real time,’’ originally developed for military purposes in MIT’s Whirlwind project, was introduced in the business sector for the first time. The Whirlwind project had been started during World War II with the aim of designing an aircraft simulator by analog methods, and it later became part of a research and development program for SAGE (semi-automatic ground environment), the gigantic computerized anti-aircraft defense system built by IBM in the 1950s.

The demand for this new mode of computing was created by cultural and structural changes in the economy. The increasing number of financial transactions in banks and insurance companies, as well as growing airline travel, made new computer-based information systems necessary, and these ultimately led to new forms of business built on information technology.

The case of the first computerized airline reservation system, SABRE, developed for American Airlines by IBM in the 1950s and finally implemented in the early 1960s, illustrates these cultural and structural changes in the economy. Until the early 1950s, airline reservations had been made manually without any problems, but by 1953 this system was in crisis because increased air traffic and growing flight plan complexity had made reservation costs insupportable. SABRE became a complete success, demonstrating the potential of centralized real-time computing systems connected via a network. The system enabled flight agents throughout the U.S., equipped with desktop terminals, to gain direct, real-time access to the central reservation system running on IBM mainframe computers, while the airline was able to assign appropriate resources in response. SABRE thus offered an effective combination of advantages: better utilization of resources and much greater customer convenience.

Very soon this new mode of computing spread throughout the business and government world and became commonplace in the service and distribution sectors of the economy; for example, bank tellers and insurance account representatives increasingly worked at terminals. On the one hand, structural information problems led managers in this direction; on the other hand, the increasing use of computers as information-handling machines in government and business had brought about the idea of readily accessible, computer-based data retrieval. In the end, more and more IBM customers wanted to link dozens of operators directly to central computers using terminal keyboards and display screens.

In the late 1950s and early 1960s, at the same time that IBM and American Airlines had begun the development of the SABRE airline reservation system, a group of brilliant computer scientists had a new idea for computer usage named ‘‘time sharing.’’ Instead of dedicating a multi-terminal system solely to a single application, they had the computer utility vision of organizing a mainframe computer so that several users could interact with it simultaneously. This vision was to change the nature of computing profoundly, because computing was no longer mediated for naive users by programmers and systems analysts, and by the late 1960s time-sharing computers had become widespread in the U.S.

Particularly important for this development had been the work of J.C.R. Licklider of the Advanced Research Projects Agency (ARPA) of the U.S. Department of Defense. In 1960 Licklider had published a now-classic paper, ‘‘Man–Computer Symbiosis,’’ proposing the use of computers to augment human intellect and creating the vision of interactive computing. Licklider was very successful in translating into action his idea of a network allowing people on different computers to communicate, and he convinced ARPA to start an enormous research program in 1962. Its budget surpassed that of all other sources of U.S. public research funding for computers combined. The ARPA research programs resulted in a series of fundamental advances in computer technology in areas such as computer graphics, artificial intelligence, and operating systems. For example, even the most influential current operating system, the general-purpose time-sharing system Unix, developed in the early 1970s at the Bell Laboratories, was a spin-off of an ambitious operating system project, Multics, funded by ARPA. The designers of Unix avoided complexity through a clear, minimalist approach to software design and created a multitasking, multiuser operating system that became a standard operating system in the 1980s.

Electronic Component Revolution

While the nature of business computing was being changed by new paradigms such as real time and time sharing, advances in solid-state components increasingly became a driving force for fundamental changes in the computer industry, and led to a dynamic interplay between new computer designs and new programming techniques that resulted in a remarkable series of technical developments. The technical progress of the mainframe computer had always run parallel to changes in electronic components. During the period from 1945 to 1965, two fundamental transformations in the electronics industry took place, marked by the invention of the transistor in 1947 and of the integrated circuit in 1958 to 1959. While the first generation of computers, lasting until about 1960, was characterized by vacuum tubes (valves) as switching elements, the second generation used the much smaller and more reliable transistors, which could be produced at a lower price. A new phase was inaugurated when an entire integrated circuit was produced on a chip of silicon in 1961, and when the first integrated circuits were produced for the military in 1962. A remarkable pace of progress in semiconductor innovation, known as the ‘‘revolution in miniature,’’ began to speed up the computer industry. The third generation of computers, characterized by the use of integrated circuits, began with the announcement of the IBM System/360 in 1964 (although this computer system did not use true integrated circuits). The most important effect of the introduction of integrated circuits was not to strengthen the leading mainframe computer systems, but to destroy Grosch’s Law, which stated that computing power increases as the square of its cost. In fact, the cost of computer power fell dramatically over the next ten years.

This became clear with the introduction of the first computer to use integrated circuits on a full scale in 1965: the Digital Equipment Corporation (DEC) offered its PDP-8 computer for just $18,000, creating a new class of computers called minicomputers, small in size and low in cost, and opening up the market to new customers. Minicomputers were mainly used in areas other than general-purpose computing, such as industrial applications and interactive graphics systems. The PDP-8 became the first widely successful minicomputer, with over 50,000 units sold, demonstrating that there was a market for smaller computers. This success of DEC (by 1970 it had become the world’s third largest computer manufacturer) was supported by dramatic advances in solid-state technology. During the 1960s the number of transistors on a chip doubled every two years, and as a result minicomputers became continuously more powerful and less expensive at an inconceivable speed.

Personal Computing

The most striking consequence of the exponential increase in the number of transistors on a chip during the 1960s (as stated by ‘‘Moore’s Law’’: the number of transistors on a chip doubled every two years) was not the lowering of the costs of mainframe computer and minicomputer processing and storage, but the introduction of the first consumer products based on chip technology, such as hand-held calculators and digital watches, in about 1970. The markets in these industries were changed overnight by the shift from mechanical to chip technology, which led to a collapse in prices as well as a dramatic industry shakeout. These episodes only marked the beginning of wide-ranging changes in the economy and society during the last quarter of the twentieth century, leading to a situation in which chips played an essential role in almost every part of business and modern life.

The case of the invention of the personal computer serves to illustrate that developing the microprocessor as the enabling technology was not in itself sufficient to create a new product, and how much new technologies can be socially constructed by cultural factors and commercial interests. When the microprocessor, a single-chip integrated circuit implementation of a CPU, was launched by the semiconductor company Intel in 1971, there was no technical hindrance to producing a reasonably priced microcomputer, but it took six years until the PC emerged as a consumer product. None of the traditional mainframe and minicomputer companies were involved in creating the early personal computer. Instead, a group of computer hobbyists as well as the ‘‘computer liberation’’ movement in the U.S. became the driving force behind the invention of the PC. These two groups were keen on a low-priced kind of minicomputer for use at home for leisure activities such as computer games; more broadly, they held the counterculture vision of freely available, personal access to an inexpensive, information-rich computer utility. When in 1975 the Altair 8800, an Intel 8080 microprocessor-based computer, was offered as an electronic hobbyist kit for less than $400, these two groups began to realize their vision of a ‘‘personal computer.’’ Very soon dozens of computer clubs and computer magazines were founded around the U.S., and these computer enthusiasts created the personal computer by combining the Altair with keyboards, disk drives, and monitors, and by developing standard software for it. Consequently, in only two years, a more or less useless hobbyist kit had been turned into a computer that could easily be transformed into a consumer product.

The computer hobbyist period ended in 1977, when the first standard machines for an emerging consumer mass market were sold. These included products such as the Commodore PET and the Apple II, which included its own monitor, disk drive, and keyboard and came with several basic software packages. Over the next three years, spreadsheet, word processing, and database software were developed, and an immense market for games software evolved. As a result, personal computers became more and more a consumer product for ordinary people, and Apple’s revenues shot to more than $500 million in 1982. By 1980, the personal computer had also become a business machine, and IBM decided to develop its own personal computer, which was introduced as the IBM PC in 1981. It became an overwhelming success and set a new industry standard.

Apple tried to compete by launching its new Macintosh computer in 1984, equipped with a revolutionary graphical user interface (GUI) that set a new standard for user-friendly human–computer interaction. It was based on technology created by computer scientists at the Xerox Palo Alto Research Center in California, who had picked up on ideas about human–computer interaction developed at the Stanford Research Institute and at the University of Utah. Despite the fact that the Macintosh’s GUI was far superior to the MS-DOS operating system of the IBM-compatible PCs, Apple failed to win the business market and remained a niche player with a market share of about 10 percent. The mainstream of the PC market was determined by the companies IBM had chosen as its original suppliers in 1981 for the microprocessor (Intel) and the operating system (Microsoft). While IBM lost its battle with Microsoft for control of the PC operating system market, Microsoft achieved dominance not only of the key market for PC operating systems but also of the key market for office applications during the first half of the 1990s.

In the early 1990s computing again underwent fundamental changes with the appearance of the Internet, and for most computer users networking became an integral part of what it means to have a computer. Furthermore, the rise of the Internet indicated the impending arrival of a new ‘‘information infrastructure’’ as well as of a ‘‘digital convergence,’’ as the coupling of computers and communications networks was often called.

In addition, the 1990s were a period of an information technology boom that was largely driven by Internet hype. For years it seemed to many managers and journalists that the Internet would become not just an indispensable business tool but a miracle cure for economic growth and prosperity. Computer scientists and sociologists likewise began to predict a new "information age," with the Internet as a "technological revolution" reshaping the "material basis" of industrial societies.

The Internet was the outcome of an unusual collaboration within a military–industrial–academic complex that promoted this extraordinary innovation. It grew out of a military network called the ARPAnet, a project established and funded by ARPA in the 1960s. The ARPAnet was initially devoted to supporting data communications for defense research projects and was used by only a small number of researchers in the 1970s. Its further development was driven largely by unplanned forms of use. ARPAnet users were strongly attracted to electronic mail, which rapidly surpassed all other network activities. Another unplanned spin-off was the Usenet (Unix User Network), which started in 1979 as a link between two universities and let users subscribe to newsgroups. Electronic mail became a driving force for the creation of a large number of new proprietary networks funded by the existing computer services industry or by organizations such as the NSF (NSFnet). Because network users wanted email to cross network boundaries, an ARPA project on "internetworking" became the origin of the "Internet": a network of networks linked by layered protocols such as TCP/IP (Transmission Control Protocol/Internet Protocol), which quickly became the de facto standard.

Only after government funding had solved many of the most essential technical problems and shaped many of the Internet's most characteristic features did private-sector entrepreneurs start Internet-related ventures and quickly develop user-oriented enhancements. Even so, the Internet did not get off to a promising start, and it took more than ten years before significant numbers of networks were connected. In 1980 the Internet had fewer than two hundred hosts, and over the next four years the number grew only to about 1,000. Only when the Internet reached the educational and business community of PC users in the late 1980s did it become an important economic and social phenomenon; by 1988 there were over 50,000 hosts. An important and unforeseen side effect of this growth was the Internet's transformation into a new electronic publishing medium. The development that excited the most interest was the World Wide Web, originally created at the CERN high-energy physics laboratory in Geneva in 1989. Soon there were millions of documents on the Internet, and private PC users discovered the joys of surfing it. Firms such as AOL provided low-cost network access and a range of consumer-oriented information services. The Internet boom was further helped by the Clinton–Gore election campaign's talk of an "information superhighway" and by the enthusiastic news reporting on a national information infrastructure in the early 1990s. Even so, many observers were astounded at how fast the number of hosts grew over the following years: from more than 1 million in 1992 to 72 million in 1999.

The overwhelming success of the PC and of the Internet tends to obscure the fact that their arrival marked a branching in computer history, not a simple succession. Mainframe computers, for example, continue to run and remain of great importance to government facilities and the private sector (such as banks and insurance companies), and supercomputers remain of the utmost significance for modern science and engineering. It should also be noted that only a small part of the computing performed today is easily observable: some 98 percent of programmable CPUs are used in embedded systems such as automobiles, medical devices, washing machines, and mobile telephones.


39 Information Systems Dissertation Topics Ideas


As the name suggests, information systems dissertation topics revolve around the information technology sphere of organizations and industries. Information systems research topics span both primary and secondary research, and their complexity varies with the academic level and degree at hand.


Best Information Systems Dissertation Topics Ideas for College Students

Below is an extensive list of information systems thesis topics; browse it to find something that matches your interests and priorities:

  • A historical analysis of information systems management: focus on the past three decades.
  • The role played by leadership, alignment, and planning in the domain of information systems management.
  • Research in information systems management: focus on post-COVID time period.
  • International information systems management: potential challenges and risks involved.
  • Information policy and international information systems management: a systematic analysis.
  • Information systems management and global operations: a review of the literature.
  • Importance of case studies and integrated projects in teaching information systems management.
  • A comparative analysis of practitioners and academicians in the field of information systems management.
  • How information technology supports businesses: the role played by information systems management.
  • Information systems management practices: a descriptive analysis.
  • Information systems management and the public sector: focus on the key issues.
  • Utilization of consumer internet data: ethics in information systems management.
  • Software development: groupware and problem-solving in a correlational analysis.
  • Research in the field of information systems management: focus on new innovations and ideas.
  • Cognition digital twins for personalized information systems of smart cities: Proof of concept
  • Information management systems: comparing private and public organizations in country X.
  • Machine learning-based diagnosis of diseases using the unfolded EEG spectra: toward an intelligent software sensor.
  • Relationship between information systems management and risk management systems: a comparative analysis.
  • Judging the IT department performance in an organization through information systems management.
  • Information systems management graduate school curriculums: a descriptive study.
  • Relationship between organizational learning and information systems management: a systematic analysis.
  • Quality management in the domain of information systems: a descriptive analysis.
  • Management of big data in developing countries of the world: a review of the literature.
  • Strategic information systems management: focus on the role of a balanced scorecard.
  • Delivery of information system: formation of a hypothetical framework.
  • Information quality management framework: a review of the literature.
  • Information systems hierarchy: a systematic analysis.
  • Importance of big data and business intelligence for the sustainable development in organizations: a UK-based approach.
  • Correlation between information systems management and risk management infrastructure to attain business risk resilience.
  • Effects of COVID-19 pandemic on the information systems management of X country.
  • Role of structured versus unstructured data in the domain of information systems management.
  • Business intelligence and information systems management: a review of the literature.
  • Effects of information systems on organizational performance: pre and post COVID analysis.
  • The Determinants of management information systems effectiveness in small-and medium-sized enterprises.
  • IT governance implementation and information systems management.
  • IS strategic planning and management services: a descriptive review.
  • Information system security at international levels: a review of the literature.
  • Developing a hypothetical model for measuring quality in information systems management.
  • The effects of information systems compatibility on firm performance following mergers and acquisitions
  • Implications of Knowledge Organization Systems for Health Information Exchange and Communication during the COVID-19 Pandemic


78 MIS Topics for Presentations and Essays


  • Samsung Company’s Management Information System The scope of Management Information System is defined as, “The combination of human and computer based resources that results in the collection, storage, retrieval, communication and use of data for the purpose of efficient management […]
  • The Role of Management Information System (MIS) in Business The diagram below shows the relationship between the departments and underpins how the manual system which is used to conduct the primary and secondary activities within the departments is related to the performance of each […]
  • Management Information System Implementation in the Bank This conforms to the first principle of change in which a person is adjusted via a change in the system that they work in.
  • Management Information Systems: Making Strategic Decisions The company will create a model of the relationship between all the pieces of information in the group. In this regard, the organization employs MISs in order to complete and integrate a series of elements […]
  • Management Information Systems: Effective Decision-Making and Security Through taking into account the different organizational levels within an organization management information systems are classified into four main types, namely, operational level systems, knowledge level systems, management level systems and strategic level systems. Management […]
  • ABC Company Management Information System Increasing the presence of the firm’s products to specific segments of clients provides the customers with seamless shopping experience in the business’s physical and online stores.
  • Management Information Systems and Enterprise Resource Planning In addition to heavy investment in the staff who left, their departure led to delay in the areas they were in charge of as well as repeating some of the steps already done during the […]
  • Management Information Systems Analysis and Design The progress of this project will be based on a simple definition of a management information system which would be: a computer based system that provides flexible and speedy access to accurate data.
  • Management Information System: Cisco Systems Prior to the implementation of the ERP system, the company’s systems were on the brink of failure. The management of the company understood the need for the company to shift to a new ERP system.
  • Management Information Systems and E-Government In the developing countries, it has been of much surprise to notice that, the failures of e-government project, is a problem that is real and much practical.
  • Management Information Systems: Mitsubishi Motors A management information system is considered as one of the most effective and successful systems that are able to provide the necessary information in order to promote the development and management of any organization in […]
  • Management Information Systems Benefits in Business This has helped this firm to achieve competitive advantage in the market because it is always aware of the needs of its customers. To manage this threat, ABC has discounted and differentiated its products in […]
  • Management Information Systems in Organizational Performance The information system has enabled the organisation to solve problems like inappropriate use of time, increased expenditure, and customer dissatisfaction. Management information system is an important tool that can be used to shift the cost […]
  • Management Information Systems: Socio-Technical Aspect Software: This component stands for programs that are used to operate the MIS, manage data, search and cipher through logs, and other related activities.
  • Health Management Information Systems: Impact on the Technology Implementation Since the beginning of the information systems implementation, the vast majority of spheres have adopted some cutting-edge technologies to increase the effectiveness of their working process.
  • Management Information Systems (MIS) The advances in the evolution of devices and the achievement of a new stage of development critically impacts MIS and creates the basis for the emergence of multiple changes towards the achievement of better outcomes […]
  • Chalhoub Group: Management Information Systems This presentation will focus on one organization in UAE, highlighting how its improved IS/IT systems have helped it register massive profits.
  • Healthcare Management Information Systems: Working Principles For instance, the ministry of health uses the network to disseminate health information to people in all regions and also globally.
  • Healthcare Management Information Systems: An Evaluation In this perspective, the Chief Information Officer survey therefore becomes important for the Health Management Information System industry because it assist health institutions to project current and future informational and technological needs, not mentioning the […]
  • Accounting and Management Information Systems This article is a discussion of the results obtained by Mangiuc in an empirical study that involved both local and foreign companies in Romania.
  • Management Information Systems: Primis Online System at McGraw Hill This paper focuses on the analysis, design and system development elements applied by the Primis team in deployment of the online system at McGraw Hill.
  • Imperial Tobacco. Management Information System – Competitive Forces This means that the management at Imperial Tobacco needs to develop products that can compete with the new products for them to maintain their position in the market.
  • Management Information Systems: Ethics and Career Path The second one is the group of skills necessary to vivificate information, and the last one is meant to reason in a proper.
  • Management Information System and Outsourcing According to these critics, there is a need for some of the currently outsourced services to be performed in the home country.
  • “Management Information Systems” by James O’Brien and George M. Marakas This is a network or sub-network with a high speed that interconnects different types of data storage devices that have associated data servers on behalf of a larger network of users. Through this, data can […]
  • Management Information Systems and Its Impacts As thus, it is the obligation of the employees so see to it that they acquire the necessary knowledge and skills; otherwise, they will be washed out of the company system.
  • Management Information Systems: Efficiency and Collaboration In addition, it is important to stress out that Microsoft Access allows a more flexible retrieval of data even when the volume of data gets high.
  • Fly Dubai Company’s Management Information Systems Data from the company’s website and its associated pilot training website outline the main sources of primary information. Identity refers to the ease that websites explain the nature, history, and values of a company.
  • Relevant Decision Making: Management Information Systems in Organizations In this respect, managers are likely to make wrong decisions, especially, if they are unaware of the inaccuracy of the information provided by the system.
  • Management Information System and Strategic Performance According to his assumption, the higher the demographic diversity in top management team, the greater the contribution of accounting system to strategic performance.
  • Management Information Systems in Corporate Institutions With the invention of personal computers and other information technology tools, the companies had to develop a proper information technology system that would handle the work of the organization and reduce the errors that were […]
  • Types of Management Information Systems in Business Generally, a TPS is used to process the data that is required to update the records about the operations of a business.
  • Management Information Systems: LinkedIn Corporation It highlights how information technology has been used in management, the general operations of the organization as well as how the use of information systems has helped the organization to attain a competitive edge.
  • Management Information Systems and Business Decision-Making The article explains to its audience the importance of promoting and adapting the use of information systems to ensure that managers get the latest information in time.
  • Management Information System in Business The main importance of information system to any modern organization is to store its data and that of its associates and customers in a secure manner.
  • Management Information Systems Major: Courses and Careers Knowledge on Management information systems is vital to institutions on a management height, where it is employed to preserve and build up new techniques for organizing vast amounts of information and helping managers in the […]
  • Management Information System: Operational Efficiency and Decision-Making The customers as well are in a position to be aware of the status of their deliveries by logging in to the company’s website which is updated by the servers throughout.
  • Bespoke Management Information Systems Using Microsoft Access
  • Management Information System in Starbucks: IBM TPS System
  • Logistics Management Information Systems: Functions, Components, Examples
  • The Management Information Systems of Toyota: New Methods and Accomplish Business Goals
  • Management Information Systems for Shipping and Delivery Company
  • Management Information Systems of the Small and Medium Enterprises
  • Management Information Systems in Marketing: Kotler’s Model
  • Barriers to Successful Development of Strategic Management Information System
  • Management Accounting Information System: Auditing and Financial Reporting Modules
  • Warehouse Management Information System: Optimizing the Use of Available Space or Coordinating Tasks
  • Management Information Systems in Hospitals: Accounting for the Control of Doctors
  • Management Information Systems Through User Interface
  • Project Management Information System: Using More Efficiently, Without Getting Overwhelmed With Data
  • Information Management Systems in the Supply Chain
  • How Management Information Systems Affect Working Ethics
  • Human Resource Management System: The Best Tools in 2022
  • Management Information System for Real Estate and Property Management
  • Management Information System: Advantages and Disadvantages
  • The Technology of Information Management System
  • Management Information Systems for Computer-Aided Design
  • Management Information Systems: Enterprise Applications
  • The History of Management Information Systems: Five Eras
  • Management Information System: Development Process With System Development Life Cycle
  • Credit Management Information Systems: A Forward-Looking Approach
  • Common Problems in Management Information Systems
  • Management Information Systems at Rosenbluth Travel: Competitive Advantage in a Rapidly Growing Global Service Company
  • Why Can Management Information Systems Effectiveness Decreases
  • Management Information Systems: The Difference Between Advanced MIS and MI Dashboard
  • Developing Decision Support Capabilities Through the Use of Management Information Systems
  • Using National Education Management Information Systems to Make Local Service Improvements: The Case of Pakistan
  • How Might a Management Information System Be Used in a School
  • The External Organizational Environment and Its Impact on Management Information Systems
  • Management Information Systems: Managing the Digital Firm by Kenneth Laudon, Jane Laudon
  • The Disadvantage of Management Information System: Fraudulent Activities
  • Management Information Systems: Impact on Dairy Farm Profitability
  • Which Country Is Best in Management Information System
  • Management Information Systems Program for Poughkeepsie Children’s Home
  • Relationship Between Management Information Systems and Corporate Performance
  • Management Information Systems: Air Canada Takes off With Maintenix
  • Farm Management Information Systems Planning and Development in the Netherlands



Cybersecurity refers to any technology, measure or practice for preventing cyberattacks or mitigating their impact. 

Cybersecurity aims to protect individuals’ and organizations’ systems, applications, computing devices, sensitive data and financial assets against computer viruses, sophisticated and costly ransomware attacks, and more.

Cyberattacks have the power to disrupt, damage or destroy businesses, and the cost to victims keeps rising. For example, according to IBM's Cost of a Data Breach 2023 report, 

  • The average cost of a data breach in 2023 was USD 4.45 million, up 15% over the last three years;
  • The average cost of a ransomware-related data breach in 2023 was even higher, at USD 5.13 million. This number does not include the cost of the ransom payment, which averaged an extra USD 1,542,333, up 89% from the previous year. 

By one estimate, cybercrime might cost the world economy USD 10.5 trillion per year by 2025.[1]

The expanding information technology (IT) trends of the past few years include:

  • a rise in cloud computing adoption,
  • network complexity,
  • remote work and work from home,
  • bring your own device (BYOD) programs,
  • and connected devices and sensors in everything from doorbells to cars to assembly lines.

All these trends create tremendous business advantages and human progress, but also provide exponentially more opportunities for cybercriminals to attack.

Not surprisingly, a recent study found that the global cybersecurity worker gap (the gap between existing cybersecurity workers and cybersecurity jobs that need to be filled) was 3.4 million workers worldwide.[2] Resource-strained security teams are focusing on developing comprehensive cybersecurity strategies that use advanced analytics, artificial intelligence and automation to fight cyberthreats more effectively and minimize the impact of cyberattacks.


A strong cybersecurity strategy protects all relevant IT infrastructure layers or domains against cyberthreats and cybercrime.

Critical infrastructure security protects the computer systems, applications, networks, data and digital assets that a society depends on for national security, economic health and public safety. In the United States, the National Institute of Standards and Technology (NIST) developed a cybersecurity framework to help IT providers in this area, and the US Department of Homeland Security's Cybersecurity and Infrastructure Security Agency (CISA) provides extra guidance.

Network security prevents unauthorized access to network resources, and detects and stops cyberattacks and network security breaches in progress. At the same time, network security helps ensure that authorized users have secure and timely access to the network resources they need.

Endpoints—servers, desktops, laptops, mobile devices—remain the primary entry point for cyberattacks. Endpoint security protects these devices and their users against attacks, and also protects the network against adversaries who use endpoints to launch attacks.

Application security protects applications running on-premises and in the cloud, preventing unauthorized access to and use of applications and related data. It also prevents flaws or vulnerabilities in application design that hackers can use to infiltrate the network. Modern application development methods, such as DevOps and DevSecOps, build security and security testing into the development process.

Cloud security secures an organization’s cloud-based services and assets—applications, data, storage, development tools, virtual servers and cloud infrastructure. Generally speaking, cloud security operates on the shared responsibility model where the cloud provider is responsible for securing the services that they deliver and the infrastructure that is used to deliver them. The customer is responsible for protecting their data, code and other assets they store or run in the cloud. The details vary depending on the cloud services used.

Information security (InfoSec) pertains to the protection of all of an organization's important information, digital files and data, paper documents, physical media, even human speech, against unauthorized access, disclosure, use or alteration. Data security, the protection of digital information, is a subset of information security and the focus of most cybersecurity-related InfoSec measures.
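
As a small illustration of one data security control, encryption at rest, the sketch below encrypts and decrypts a record with a symmetric key. It is a generic example that assumes the open-source Python cryptography package is installed, not a description of any vendor's product; in practice the key would be kept in a key management service rather than in code.

```python
# Minimal sketch of encrypting a record at rest with a symmetric key.
# Assumes the third-party "cryptography" package (pip install cryptography).
from cryptography.fernet import Fernet

key = Fernet.generate_key()          # in practice: fetched from a key management service
fernet = Fernet(key)

record = b"customer_id=1842;card_last4=0921"
token = fernet.encrypt(record)       # ciphertext that is safe to store on disk
restored = fernet.decrypt(token)     # requires the same key

assert restored == record
print(token[:24], b"...")
```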

Mobile security encompasses various disciplines and technologies specific to smartphones and mobile devices, including mobile application management (MAM) and enterprise mobility management (EMM). More recently, mobile security is available as part of unified endpoint management (UEM) solutions that enable configuration and security management for multiple endpoints—mobile devices, desktops, laptops, and more—from a single console.

Malware—short for "malicious software"—is any software code or computer program that is written intentionally to harm a computer system or its users. Almost every modern cyberattack involves some type of malware.

Hackers and cybercriminals create and use malware to gain unauthorized access to computer systems and sensitive data, hijack computer systems and operate them remotely, disrupt or damage computer systems, or hold data or systems hostage for large sums of money (see Ransomware).

Ransomware is a type of malware that encrypts a victim's data or device and threatens to keep it encrypted—or worse—unless the victim pays a ransom to the attacker. According to the IBM Security X-Force Threat Intelligence Index 2023, ransomware attacks represented 17 percent of all cyberattacks in 2022.

"Or worse" is what distinguishes today's ransomware from its predecessors. The earliest ransomware attacks demanded a single ransom in exchange for the encryption key. Today, most ransomware attacks are double extortion attacks that demand a second ransom to prevent the sharing or publication of the victim's data. Some are triple extortion attacks that threaten to launch a distributed denial of service (DDoS) attack if ransoms aren't paid.

Phishing attacks are email, text or voice messages that trick users into downloading malware, sharing sensitive information or sending funds to the wrong people. Most users are familiar with bulk phishing scams—mass-mailed fraudulent messages that appear to be from a large and trusted brand, asking recipients to reset their passwords or reenter credit card information. But more sophisticated phishing scams, such as spear phishing and business email compromise (BEC), target specific individuals or groups to steal especially valuable data or large sums of money.
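
Automated filters complement user vigilance by scoring messages and links for tell-tale signs. The sketch below applies a few simple heuristics to a URL; the brand watch list, keyword list and scoring weights are invented for illustration, and real filters rely on much richer signals (sender reputation, authentication results, content analysis).

```python
# Toy phishing-URL heuristics; illustrative only, not a production filter.
import ipaddress
from urllib.parse import urlparse

BRANDS = {"paypal", "apple", "microsoft", "amazon"}          # hypothetical watch list
SUSPECT_WORDS = {"verify", "login", "update", "secure", "account"}

def suspicion_score(url: str) -> int:
    parsed = urlparse(url)
    host = parsed.hostname or ""
    score = 0
    # IP-literal hosts are rarely used by legitimate consumer services.
    try:
        ipaddress.ip_address(host)
        score += 2
    except ValueError:
        pass
    labels = host.split(".")
    # Brand name appearing in a subdomain of an unrelated registered domain.
    if any(brand in label for brand in BRANDS for label in labels[:-2]):
        score += 2
    if len(labels) > 4:                                      # unusually deep subdomain chain
        score += 1
    if any(word in parsed.path.lower() for word in SUSPECT_WORDS):
        score += 1
    return score

print(suspicion_score("https://paypal.com.account-verify.example.ru/login/update"))
print(suspicion_score("https://www.example.com/docs"))
```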

Phishing is just one type of social engineering, a class of "human hacking" tactics and attacks that use psychological manipulation to tempt or pressure people into taking unwise actions.

Insider threats are threats that originate with authorized users—employees, contractors, business partners—who intentionally or accidentally misuse their legitimate access, or have their accounts hijacked by cybercriminals. Insider threats can be harder to detect than external threats because they have the earmarks of authorized activity, and are invisible to antivirus software, firewalls and other security solutions that block external attacks.

One of the more persistent cybersecurity myths is that all cybercrime comes from external threats. In fact, according to a recent study, 44% of insider threats are caused by malicious actors, and the average cost per incident for malicious insider incidents in 2022 was USD 648,062.[3] Another study found that while the average external threat compromises about 200 million records, incidents involving an inside threat actor resulted in exposure of one billion records or more.[4]

A DDoS attack attempts to crash a server, website or network by overloading it with traffic, usually from a botnet—a network of multiple distributed systems that a cybercriminal hijacks by using malware and remote-controlled operations.

The global volume of DDoS attacks spiked during the COVID-19 pandemic. Increasingly, attackers are combining DDoS attacks with ransomware attacks, or simply threatening to launch DDoS attacks unless the target pays a ransom.
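
One small building block of DDoS resilience (and of general abuse prevention) is per-client rate limiting at the application edge. The sketch below implements a basic token-bucket limiter; the rate and burst values are arbitrary, and serious volumetric attacks are absorbed further upstream by scrubbing services, anycast and CDNs rather than by application code.

```python
# Minimal token-bucket rate limiter, one bucket per client address.
import time
from collections import defaultdict

RATE = 5.0        # tokens added per second
BURST = 10.0      # bucket capacity

buckets = defaultdict(lambda: {"tokens": BURST, "last": time.monotonic()})

def allow(client_ip: str) -> bool:
    bucket = buckets[client_ip]
    now = time.monotonic()
    bucket["tokens"] = min(BURST, bucket["tokens"] + (now - bucket["last"]) * RATE)
    bucket["last"] = now
    if bucket["tokens"] >= 1.0:
        bucket["tokens"] -= 1.0
        return True
    return False   # request should be dropped or challenged

# Example: a chatty client is throttled once its burst allowance is spent.
results = [allow("203.0.113.7") for _ in range(15)]
print(results.count(True), "allowed,", results.count(False), "rejected")
```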

Despite an ever-increasing volume of cybersecurity incidents worldwide and ever-increasing volumes of learnings that are gleaned from them, some dangerous misconceptions persist.

Strong passwords alone are adequate protection. Strong passwords make a difference. For example, a 12-character password takes 62 trillion times longer to crack than a 6-character password. But because cybercriminals can steal passwords (or pay disgruntled employees or other insiders to steal them), they can't be an organization's or individual's only security measure.
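
The arithmetic behind that comparison is easy to reproduce. The sketch below computes brute-force keyspaces for 6- and 12-character passwords under two assumed character sets; the exact multiplier depends on those assumptions, so treat the output as an order-of-magnitude illustration rather than the report's precise figure.

```python
# Order-of-magnitude comparison of brute-force keyspaces for 6- vs 12-character passwords.
for charset_size, label in [(62, "letters + digits"), (95, "printable ASCII")]:
    small = charset_size ** 6
    large = charset_size ** 12
    print(f"{label:16s} 6 chars: {small:.2e}   12 chars: {large:.2e}   ratio: {large / small:.2e}")
```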

The major cybersecurity risks are well known. In fact, the risk surface is constantly expanding. Thousands of new vulnerabilities are reported in old and new applications and devices every year. Opportunities for human error—specifically by negligent employees or contractors who unintentionally cause a data breach—keep increasing.

All cyberattack vectors are contained. Cybercriminals are finding new attack vectors all the time—including Linux systems, operational technology (OT), Internet of Things (IoT) devices and cloud environments.

‘My industry is safe.’ Every industry has its share of cybersecurity risks, with cyber adversaries exploiting the necessities of communication networks within almost every government and private-sector organization. For example, ransomware attacks are targeting more sectors than ever, including local governments, non-profits and healthcare providers. Threats on supply chains, ".gov" websites, and critical infrastructure have also increased.  

Cybercriminals don't attack small businesses. Yes, they do. For example, in 2021, 82 percent of ransomware attacks targeted companies with fewer than 1,000 employees; 37 percent of companies attacked with ransomware had fewer than 100 employees.[5]

The following best practices and technologies can help your organization implement strong cybersecurity that reduces your vulnerability to cyberattacks and protects your critical information systems without intruding on the user or customer experience.

Security awareness training helps users understand how seemingly harmless actions—from using the same simple password for multiple log-ins to oversharing on social media—increase their own or their organization's risk of attack. Security awareness training combined with well-thought-out data security policies can help employees protect sensitive personal and organizational data. It can also help them recognize and avoid phishing and malware attacks.

Identity and access management (IAM) defines the roles and access privileges for each user, and the conditions under which they are granted or denied their privileges. IAM technologies include multi-factor authentication, which requires at least one credential in addition to a username and password, and adaptive authentication, which requires more credentials depending on context.
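
As a concrete illustration of a second factor, the sketch below shows the core of time-based one-time password (TOTP) enrollment and verification. It assumes the third-party pyotp package is available; a production IAM system would additionally handle secret storage, rate limiting and account recovery.

```python
# Minimal TOTP (time-based one-time password) sketch using pyotp (pip install pyotp).
import pyotp

# Enrollment: generate a per-user secret and share it via a QR code / provisioning URI.
secret = pyotp.random_base32()
totp = pyotp.TOTP(secret)
print("Provisioning URI:", totp.provisioning_uri(name="alice@example.com", issuer_name="ExampleCorp"))

# Login: the user submits the 6-digit code currently shown by their authenticator app.
submitted_code = totp.now()            # stand-in for user input
print("Accepted:", totp.verify(submitted_code, valid_window=1))
```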

Attack surface management (ASM) is the continuous discovery, analysis, remediation and monitoring of the cybersecurity vulnerabilities and potential attack vectors that make up an organization's attack surface. Unlike other cyberdefense disciplines, ASM is conducted entirely from a hacker's perspective rather than the perspective of the defender. It identifies targets and assesses risks based on the opportunities they present to a malicious attacker.
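
A tiny slice of attack surface discovery can be sketched as nothing more than checking which hosts answer on which ports. The host inventory below is hypothetical, and real ASM tools go much further (service fingerprinting, certificate tracking, risk scoring), but the sketch conveys the outside-in perspective.

```python
# Toy external-exposure check: which (host, port) pairs accept TCP connections.
import socket

ASSETS = ["www.example.com", "mail.example.com"]      # hypothetical asset inventory
PORTS = [22, 80, 443, 3389]

def is_open(host: str, port: int, timeout: float = 2.0) -> bool:
    try:
        with socket.create_connection((host, port), timeout=timeout):
            return True
    except OSError:          # covers timeouts, refusals and DNS failures
        return False

for host in ASSETS:
    exposed = [port for port in PORTS if is_open(host, port)]
    print(f"{host}: open ports {exposed}")
```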

Organizations rely on analytics- and AI-driven technologies to identify and respond to potential or actual attacks in progress, because it is impossible to stop all cyberattacks. These technologies can include (but are not limited to) security information and event management (SIEM), security orchestration, automation and response (SOAR), and endpoint detection and response (EDR). Typically, these technologies are used as part of a formal incident response plan.
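
A SIEM correlation rule can be as simple as "many failed logins from one source within a short window." The sketch below applies that rule to a few synthetic events; the event format, threshold and window are invented for illustration, whereas real SIEM platforms normalize events from many sources and enrich them with context.

```python
# Toy SIEM-style correlation rule: flag sources with many failed logins in a 5-minute window.
from collections import defaultdict
from datetime import datetime, timedelta

EVENTS = [  # (timestamp, source_ip, outcome) -- synthetic sample data
    (datetime(2024, 1, 1, 9, 0, s), "203.0.113.7", "FAIL") for s in range(0, 50, 5)
] + [(datetime(2024, 1, 1, 9, 1, 0), "198.51.100.2", "SUCCESS")]

WINDOW = timedelta(minutes=5)
THRESHOLD = 8

failures = defaultdict(list)
for ts, ip, outcome in EVENTS:
    if outcome == "FAIL":
        failures[ip].append(ts)

for ip, times in failures.items():
    times.sort()
    for i, start in enumerate(times):
        in_window = [t for t in times[i:] if t - start <= WINDOW]
        if len(in_window) >= THRESHOLD:
            print(f"ALERT: {ip} had {len(in_window)} failed logins within {WINDOW}")
            break
```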

Disaster recovery capabilities often play a key role in maintaining business continuity in the event of a cyberattack. For example, the ability to fail over to a backup that is hosted in a remote location can enable a business to resume operations quickly following a ransomware attack (and sometimes without paying a ransom).


SIEM (security information and event management) is software that helps organizations recognize and address potential security threats and vulnerabilities before they can disrupt business operations.


Threat management is a process used by cybersecurity professionals to prevent cyberattacks, detect cyber threats and respond to security incidents.


1. Cybercrime threatens business growth. Take these steps to mitigate your risk.

2. Bridging the 3.4 million workforce gap in cybersecurity.

3. 2022 Ponemon Cost of Insider Threats Global Report.

4. Verizon 2023 Data Breach Investigations Report.

5. 82% of Ransomware Attacks Target Small Businesses, Report Reveals.


Cybersecurity

NIST develops cybersecurity standards, guidelines, best practices, and other resources to meet the needs of U.S. industry, federal agencies and the broader public. Our activities range from producing specific information that organizations can put into practice immediately to longer-term research that anticipates advances in technologies and future challenges.

Some NIST cybersecurity assignments are defined by federal statutes, executive orders and policies. For example, the Office of Management and Budget (OMB) mandates that all federal agencies implement NIST’s cybersecurity standards and guidance for non-national security systems. Our cybersecurity activities also are driven by the needs of U.S. industry and the broader public. We engage vigorously with stakeholders to set priorities and ensure that our resources address the key issues that they face. 

NIST also advances understanding and improves the management of privacy risks, some of which relate directly to cybersecurity.

Priority areas to which NIST contributes – and plans to focus more on – include cryptography, education and workforce, emerging technologies, risk management, identity and access management, measurements, privacy, trustworthy networks and trustworthy platforms.



Cybersecurity topics

  • Cryptography
  • Cybersecurity education and workforce development
  • Cybersecurity measurement
  • Identity & access management
  • Privacy engineering
  • Risk Management
  • Securing emerging technologies
  • Trustworthy networks
  • Trustworthy platforms

Projects & programs

  • Exposure Notification: protecting workplaces and vulnerable communities during a pandemic
  • Trustworthy Networks of Things
  • Cryptographic Module Validation Program (CMVP)
  • Cyber-Physical Systems/Internet of Things for Smart Cities

The NIST Cybersecurity Framework organizes cybersecurity outcomes into the functions Identify, Protect, Detect, Respond and Recover, built around a central Govern function.


Trustworthy AI

Our trust in technology relies on understanding how it works. It’s important to understand why AI makes the decisions it does. We’re developing tools to make AI more explainable, fair, robust, private, and transparent.
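
As one concrete example of explainability, the sketch below computes permutation feature importance: each input feature is shuffled in turn, and the resulting drop in accuracy indicates how much the model relies on it. It is a generic illustration assuming scikit-learn is installed, not a depiction of any particular vendor's tooling.

```python
# Permutation feature importance: a simple, model-agnostic explanation technique.
# Assumes scikit-learn is installed (pip install scikit-learn).
from sklearn.datasets import load_breast_cancer
from sklearn.ensemble import RandomForestClassifier
from sklearn.inspection import permutation_importance
from sklearn.model_selection import train_test_split

X, y = load_breast_cancer(return_X_y=True, as_frame=True)
X_train, X_test, y_train, y_test = train_test_split(X, y, random_state=0)

model = RandomForestClassifier(n_estimators=200, random_state=0).fit(X_train, y_train)

# Shuffle each feature 10 times and record the mean drop in test accuracy.
result = permutation_importance(model, X_test, y_test, n_repeats=10, random_state=0)
ranked = sorted(zip(X.columns, result.importances_mean), key=lambda kv: kv[1], reverse=True)
for name, importance in ranked[:5]:
    print(f"{name:30s} {importance:.4f}")
```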

  • AI Testing: We’re designing tools to help ensure that AI systems are trustworthy, reliable and can optimize business processes.
  • Adversarial Robustness and Privacy: We’re making tools to protect AI and certify its robustness, and helping AI systems adhere to privacy requirements.
  • Explainable AI: We’re creating tools to help AI systems explain why they made the decisions they did.
  • Fairness, Accountability, Transparency: We’re developing technologies to increase the end-to-end transparency and fairness of AI systems.
  • Trustworthy Generation: We’re developing theoretical and algorithmic frameworks for generative AI to accelerate future scientific discoveries.
  • Uncertainty Quantification: We’re developing ways for AI to communicate when it's unsure of a decision across the AI application development lifecycle.


Science for Social Good

IBM Science for Social Good partners IBM Research scientists and engineers with academic fellows, subject matter experts from NGOs, public sector agencies, and social enterprises to tackle emerging societal challenges using science and technology.


Building trustworthy AI with Watson

Our research is regularly integrated into Watson solutions to make IBM’s AI for business more transparent, explainable, robust, private, and fair.


Now available: CCNA v1.1 exam topics

Validate your knowledge and skills in network fundamentals and access, IP connectivity, IP services, security fundamentals, and more. Take your IT career in any direction by earning a Cisco Certified Network Associate (CCNA) certification.

CCNA certification


Your career in networking begins with CCNA

Take your IT career in any direction by earning a CCNA. CCNA validates a broad range of fundamentals for all IT careers, from networking technologies to security to software development, proving you have the skills businesses need to meet market demands.

Networking fundamentals

Showcase your knowledge of networking equipment and configuration. Be able to troubleshoot connectivity issues and effectively manage networks.

IP Services

Demonstrate your ability to configure routing for different IP versions and describe the purpose of redundancy protocols. Be able to interpret the components of a routing table.
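
Route selection ultimately comes down to longest-prefix matching: among all routes whose prefix contains the destination, the most specific one wins. The sketch below demonstrates that rule with Python's standard ipaddress module and a made-up route table; it is a teaching aid that ignores administrative distance and metrics, which real routers also weigh.

```python
# Longest-prefix match over a toy routing table (illustrative route entries).
import ipaddress

ROUTES = [
    ("0.0.0.0/0",   "GigabitEthernet0/0", "default route"),
    ("10.0.0.0/8",  "GigabitEthernet0/1", "summary"),
    ("10.1.2.0/24", "GigabitEthernet0/2", "directly connected subnet"),
]

def best_route(destination: str):
    dest = ipaddress.ip_address(destination)
    matches = [(ipaddress.ip_network(prefix), iface, note)
               for prefix, iface, note in ROUTES
               if dest in ipaddress.ip_network(prefix)]
    # The most specific prefix (largest prefix length) wins.
    return max(matches, key=lambda match: match[0].prefixlen)

for ip in ["10.1.2.9", "10.9.9.9", "192.0.2.1"]:
    net, iface, note = best_route(ip)
    print(f"{ip:12s} -> {net} via {iface} ({note})")
```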

Security fundamentals

Understand threats and ways to prevent them. Identify key elements of a security program, like user awareness and training. Demonstrate practical skills like setting up secure access to devices and networks.

Automation and programmability

Understand how automation affects network management, and compare traditional networks with controller-based networking. Leverage APIs, and understand configuration management tools.
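
As a hedged example of driving a network through an API rather than the CLI, the sketch below reads interface data over RESTCONF (RFC 8040) using the requests library. The device address and credentials are placeholders, and the data models actually exposed vary by platform.

```python
# Sketch: read interface configuration from a device's RESTCONF API (RFC 8040).
# The host and credentials below are placeholders, not a real device.
import requests

DEVICE = "https://router.example.net"          # hypothetical device
AUTH = ("admin", "change-me")                  # placeholder credentials
HEADERS = {"Accept": "application/yang-data+json"}

response = requests.get(
    f"{DEVICE}/restconf/data/ietf-interfaces:interfaces",
    auth=AUTH,
    headers=HEADERS,
    timeout=10,
    verify=True,                               # keep TLS verification on in practice
)
response.raise_for_status()

for interface in response.json()["ietf-interfaces:interfaces"]["interface"]:
    print(interface["name"], interface.get("description", ""))
```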


How it works

No formal prerequisites

CCNA is an asset to IT professionals of all experience levels, but learners often benefit from one or more years of experience implementing and administering Cisco solutions.

Example learner profiles

  • Individuals looking to move into the IT field
  • IT professionals looking to stand out in the job market
  • IT professionals looking to enrich their current roles with additional networking skills

To earn the CCNA certification, you’ll need to pass a single required exam.

Getting started


A variety of resources are available to help you study - from guided learning to self-study and a community forum.


Unlock your career potential

Because CCNA covers so many IT fundamentals, it’s a great way to stand out no matter where your career takes you.

Potential roles

Network engineer

Apply a range of technologies to connect, secure, and automate complex networks.

Network administrator

Install, maintain, monitor, and troubleshoot networks and keep them secure.

Help desk administrator

Diagnose and troubleshoot technical issues for clients and employees.

Alumni testimonials


CCNA moved Elvin up the career ladder

"Passing that CCNA exam triggered a chain of events I could never have predicted. First, I was a student, then a teacher, then a Cisco instructor, and I eventually became a Cisco VIP."

Elvin Arias Soto, CloudOps engineer

CCNA, CCDP, CCDA, CCNP, CCIE

Certifications give Kevin instant credibility at work


"People always want to know who they're talking to. They want to know if you’re qualified. Certifications give you instant credibility."

Kevin Brown, CyberOps analyst

CCNA, CyberOps Associate

Ben made a career change with a Cisco certification


"I chose to pursue Cisco certifications because I knew it would put me in the best position to start a career in networking."

Ben Harting, Configuration engineer

Maintain your certification

Your certification is valid for three years. You can renew with Continuing Education credits or retake exams before they expire.

CCNA essentials webinar series

Learn what to expect from the CCNA exam, and chart your path to certification success.

CCNA certification guide

Get familiar with Cisco’s learning environment, find study resources, and discover helpful hints for earning your CCNA.

CCNA Prep Program

Packed with 50+ hours of resources, webinars, and practice quizzes, CCNA Prep On Demand is your ultimate study buddy.

Enhance your learning journey

Stay up to date

Get the latest news about Cisco certifications, plus tools and insights to help you get where you want to go.

CCNA community

Not sure where to begin? Head to the Cisco CCNA community to get advice and connect with experts.


COMMENTS

  1. Journal of Computer Information Systems

    The Journal of Computer Information Systems (JCIS) aims to publish manuscripts that explore information systems and technology research and thus develop computer information systems globally. ... Quickly and easily track the impact your paper makes with the help of Authored Works. Publication office: Taylor & Francis, Inc., 530 Walnut Street ...

  2. 1000 Computer Science Thesis Topics and Ideas

    This section offers a well-organized and extensive list of 1000 computer science thesis topics, designed to illuminate diverse pathways for academic inquiry and innovation. Whether your interest lies in the emerging trends of artificial intelligence or the practical applications of web development, this assortment spans 25 critical areas of ...

  3. Artificial intelligence in information systems research: A systematic

    Artificial intelligence in information systems research: A systematic literature review and research agenda ... implications, and a research agenda for the future. The paper ends with a conclusion and directions for future research. 2. Background and related work. ... AI is the general concept for computer systems able to perform tasks that ...

  4. Information Systems Research

    Information Systems Research is a peer-reviewed journal that seeks to publish the best research in the information systems discipline. INFORMS.org; ... Call for Papers ISR has issued a call for papers for a special issue on Analytical Creativity. ScholarOne will be open to submissions beginning on January 2, 2024.

  5. 500+ Computer Science Research Topics

    Computer Science Research Topics are as follows: Using machine learning to detect and prevent cyber attacks. Developing algorithms for optimized resource allocation in cloud computing. Investigating the use of blockchain technology for secure and decentralized data storage. Developing intelligent chatbots for customer service.

  6. Computer Science Research Topics (+ Free Webinar)

    If you've landed on this post, chances are you're looking for a computer science-related research topic, but aren't sure where to start. Here, we'll explore a variety of CompSci & IT-related research ideas and topic thought-starters, including algorithms, AI, networking, database systems, UX, information security and software ...

  7. Information Systems Research

    Information Systems Research (ISR) is a leading peer-reviewed, international journal focusing on theory, research, and intellectual development for information systems in organizations, institutions, the economy, and society. It is dedicated to furthering knowledge in the application of information technologies to human organizations and their management and, more broadly, to improving ...

  8. CS 261: Research Topics in Operating Systems (2021)

    Unresolved: Principled, policy-free control of CPU time. Unresolved: Handling of multicore processors in the age of verification. Replaced: Process kernel by event kernel in seL4, OKL4 and NOVA. Abandoned: Virtual TCB addressing. …. Abandoned: C++ for seL4 and OKL4.

  9. Cyber risk and cybersecurity: a systematic review of data ...

    Cybercrime is estimated to have cost the global economy just under USD 1 trillion in 2020, indicating an increase of more than 50% since 2018. With the average cyber insurance claim rising from USD 145,000 in 2019 to USD 359,000 in 2020, there is a growing necessity for better cyber information sources, standardised databases, mandatory reporting and public awareness. This research analyses ...

  10. Information technology

    Information technology articles from across Nature Portfolio. Information technology is the design and implementation of computer networks for data processing and communication. This includes ...

  11. Top 400 Information Technology Research Topics

    The list of the top 400 information technology research topics is organized into different categories. Let's examine it. Artificial Intelligence (AI) and Machine Learning (ML) Easy AI: Explaining and Using. Group Learning: Getting Better Together. AI in Health: Diagnosing and Helping. Robots Learning on Their Own.

  12. Articles & Journals

    Useful resources for research in Computer Information Systems. A peer-reviewed (or refereed) journal uses experts from the same subject field or profession as the author to evaluate a manuscript prior to acceptance for publication; publishes articles that report on research studies or provide scholarly analysis of topics; and may include book reviews, editorials, or other brief items that are not themselves peer reviewed.

  13. Computer Science Research Topics

    Computer science research topics can be divided into several categories, such as artificial intelligence, big data and data science, human-computer interaction, security and privacy, and software engineering. If you are a student or researcher looking for computer science research paper topics, this article offers suggestions to help you choose a direction.

  14. 130 Top-Notch Information Technology Research Topics

    A set of 130 information technology research topics and quick writing prompts. The field of information technology is one of the defining developments of the 21st century, and scholars argue that we are living in a technological age; despite this, many students still find it challenging to formulate an information technology research topic.

  15. Undergraduate Research Topics

    Research areas: interpretability of AI systems, fairness in AI systems, and computer vision. Independent work topics include constructing a new method to explain a model (or creating a model that is interpretable by design) and analyzing a current model or dataset to understand the bias within it.

  16. Information systems security research agenda: Exploring the gap between research and practice

    Highlights of the study: topic modeling of information systems security research published between 1990 and 2020; a Delphi study of CISOs to rank-order the most important information systems security concerns; an analysis of the gap between what practitioners consider important and what researchers are currently studying; and a proposed research agenda for information systems security. A minimal illustration of this kind of topic modeling is sketched below.
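    The sketch below shows, in the simplest possible terms, what topic modeling of a paper corpus looks like in practice. It uses scikit-learn's LDA on a handful of made-up abstracts; the corpus, topic count, and vocabulary are illustrative assumptions and have nothing to do with the cited study's actual data or pipeline.

```python
# Minimal topic-modeling sketch (illustrative; not the cited study's pipeline).
from sklearn.feature_extraction.text import CountVectorizer
from sklearn.decomposition import LatentDirichletAllocation

# Tiny made-up "abstracts" standing in for a corpus of security papers.
abstracts = [
    "phishing awareness training reduces employee susceptibility to attacks",
    "intrusion detection systems use machine learning to flag anomalous traffic",
    "security policy compliance depends on organizational culture and sanctions",
    "encryption key management challenges in cloud infrastructure",
    "machine learning models detect malware from network traffic features",
]

# Bag-of-words representation, then a small LDA model.
vectorizer = CountVectorizer(stop_words="english")
X = vectorizer.fit_transform(abstracts)
lda = LatentDirichletAllocation(n_components=2, random_state=0).fit(X)

# Print the top words per topic; in a real study these would be labeled by hand.
terms = vectorizer.get_feature_names_out()
for idx, topic in enumerate(lda.components_):
    top_words = [terms[i] for i in topic.argsort()[-5:][::-1]]
    print(f"Topic {idx}: {', '.join(top_words)}")
```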

  17. 412 Computer Topics for Essays & Research Topics about ...

    A collection of essay ideas and research topics about computers. Sample excerpts observe that, in the past, people considered computers the preserve of scientists, engineers, the army, and the government; that media is a field which has demonstrated the quality and value of computers; and that one category of cybercrime consists of crimes executed using a computer as a weapon.

  18. Information Technology: News, Articles, Research, & Case Studies

    New research on information technology from Harvard Business School faculty on issues including the HealthCare.gov fiasco, online privacy concerns, and the civic benefits of technologies that utilize citizen-created data.

  19. Management Information Systems Research: A Topic Modeling Based

    Management information systems (MIS) is an interdisciplinary field that naturally develops and changes under the influence of other disciplines. This study analyzes MIS through the academic literature on the topic, covering 25,304 articles published in the Scopus database from 2016 to 2021.

  20. Information Technology Dissertation Topics

    A list of IT dissertation topics with research potential, for example: a literature analysis of the information quality management framework; a comprehensive investigation of the information system hierarchy; and a UK-based discussion of why big data and business intelligence are essential for sustainable development in organisations.

  21. Computer Technology Research Paper Topics

    This list provides 33 potential computer technology research paper topics along with an overview article on the history of computer technology. The first entry, analog computers, notes that, paralleling the split between analog and digital computers, the term "analog computer" was projected a posteriori in the 1950s onto pre-existing classes of mechanical and electrical machines.

  22. Expert Systems with Applications

    Expert Systems With Applications is a refereed international journal focused on the exchange of information relating to expert and intelligent systems applied in industry, government, and universities worldwide. The thrust of the journal is to publish original papers dealing with the design, development, testing, implementation, and/or management of such systems.

  23. 39 Best Information Systems Dissertation Topics Ideas

    An extensive list of information systems dissertation topic ideas for college students, intended to help readers find something that matches their interests and priorities, such as a historical analysis of information systems.

  24. 78 Management Information System Topics for Presentation and Essays

    A list of 78 management information system topics for presentations and essays. Example entries include Samsung's management information system, the scope of MIS (defined as "the combination of human and computer based resources that results in the collection, storage, retrieval, communication and use of data for the purpose of efficient management"), and the functions of different types of management information systems.

  25. What Is Artificial Intelligence? Definition, Uses, and Types

    Artificial intelligence (AI) is the theory and development of computer systems capable of performing tasks that historically required human intelligence, such as recognizing speech, making decisions, and identifying patterns. AI is an umbrella term that encompasses a wide variety of technologies, including machine learning and deep learning.

  26. What is Cybersecurity?

    Cybersecurity refers to any technology, measure or practice for preventing cyberattacks or mitigating their impact. Cybersecurity aims to protect individuals' and organizations' systems, applications, computing devices, sensitive data and financial assets against computer viruses, sophisticated and costly ransomware attacks, and more.

  27. Cybersecurity

    NIST develops cybersecurity standards, guidelines, best practices, and other resources to meet the needs of U.S. industry, federal agencies, and the broader public. Its activities range from producing specific information that organizations can put into practice immediately to longer-term research that anticipates advances in technologies and future challenges.

  28. Trustworthy AI

    We need to be able to look inside AI systems, to understand the rationale behind an algorithmic outcome, and even to ask a system how it came to its decision. At IBM Research, we're working on a range of approaches to ensure that AI systems built in the future are fair, robust, explainable, accountable, and aligned with human values. A simple example of probing a model's decisions is sketched below.
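    The sketch below is one small, generic way to "look inside" a trained model: permutation importance from scikit-learn, which measures how much a model's accuracy drops when each input feature is shuffled. The dataset and model are arbitrary stand-ins, and this is just one of many explainability techniques, not IBM Research's specific method.

```python
# Simple model-explanation sketch using permutation importance
# (a generic technique; not IBM Research's specific approach).
from sklearn.datasets import load_breast_cancer
from sklearn.ensemble import RandomForestClassifier
from sklearn.inspection import permutation_importance
from sklearn.model_selection import train_test_split

# Train an ordinary classifier on a standard toy dataset.
data = load_breast_cancer()
X_train, X_test, y_train, y_test = train_test_split(
    data.data, data.target, random_state=0
)
model = RandomForestClassifier(n_estimators=100, random_state=0).fit(X_train, y_train)

# Shuffle one feature at a time and measure how much test accuracy drops;
# a large drop means the model relies heavily on that feature.
result = permutation_importance(model, X_test, y_test, n_repeats=10, random_state=0)
ranked = result.importances_mean.argsort()[::-1]
for i in ranked[:5]:
    print(f"{data.feature_names[i]}: {result.importances_mean[i]:.3f}")
```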

  29. Computer Vision and Image Understanding (CVIU)

    The central focus of this journal is the computer analysis of pictorial information. Computer Vision and Image Understanding publishes papers covering all aspects of image analysis, from the low-level, iconic processes of early vision to the high-level, symbolic processes of recognition and interpretation, and a wide range of topics in the image understanding area is covered.

  30. CCNA (Cisco Certified Network Associate)

    CCNA is Cisco's associate-level certification and a common starting point for a career in networking. It validates a broad range of fundamentals relevant to all IT careers, from networking technologies to security to software development, demonstrating the skills businesses need to meet market demands.