Browsing by Author "Navimipour, Nima Jafari"
Now showing 1 - 20 of 69
Review | Citation Count: 24
Adventures in data analysis: a systematic review of Deep Learning techniques for pattern recognition in cyber-physical-social systems (Springer, 2023)
Amiri, Zahra; Heidari, Arash; Navimipour, Nima Jafari; Unal, Mehmet; Mousavi, Ali

Machine Learning (ML) and Deep Learning (DL) have achieved great success in many textual, auditory, medical-imaging, and visual recognition tasks. Given the importance of ML/DL in pattern recognition owing to its high accuracy, many researchers have proposed solutions for improving pattern recognition performance with ML/DL methods. Because machines need intelligent pattern recognition for image processing, and because big data plays an outstanding role in shaping both modern and classical approaches to pattern recognition, we conducted a thorough Systematic Literature Review (SLR) of DL approaches to big-data pattern recognition. We discuss different research issues and possible paths in which the abovementioned techniques might help materialize the pattern recognition notion. We classified 60 of the most cutting-edge articles addressing pattern recognition issues into ten categories based on the DL/ML method used: Convolutional Neural Network (CNN), Recurrent Neural Network (RNN), Generative Adversarial Network (GAN), Autoencoder (AE), Ensemble Learning (EL), Reinforcement Learning (RL), Random Forest (RF), Multilayer Perceptron (MLP), Long Short-Term Memory (LSTM), and hybrid methods. The SLR method has been used to investigate each category in terms of influential properties such as the main idea, advantages, disadvantages, strategies, simulation environment, datasets, and security issues. The results indicate that most of the articles were published in 2021.
Moreover, important parameters such as accuracy, adaptability, fault tolerance, security, scalability, and flexibility were considered in these investigations.

Article | Citation Count: 30
The applications of machine learning techniques in medical data processing based on distributed computing and the Internet of Things (Elsevier Ireland Ltd, 2023)
Aminizadeh, Sarina; Heidari, Arash; Toumaj, Shiva; Darbandi, Mehdi; Navimipour, Nima Jafari; Rezaei, Mahsa; Talebi, Samira

Medical data processing has grown into a prominent topic in recent decades, with the primary goal of maintaining patient data via new information technologies, including the Internet of Things (IoT) and sensor technologies, which generate patient indexes in hospital data networks. Innovations like distributed computing, Machine Learning (ML), blockchain, chatbots, wearables, and pattern recognition can adequately enable the collection and processing of medical data for decision-making in the healthcare domain. In particular, distributed computing assists experts in the disease-diagnosis process by digesting huge volumes of data swiftly and producing personalized smart suggestions. Meanwhile, the world is confronting the COVID-19 outbreak, so early diagnosis techniques are crucial to lowering the fatality rate. ML systems can aid radiologists in examining the enormous volume of medical images, but they demand a huge quantity of training data that must be unified for processing. Hence, developing Deep Learning (DL) systems confronts multiple issues, such as data collection, quality assurance, knowledge exchange, privacy preservation, administrative laws, and ethical considerations. In this research, we present an inclusive analysis of the most recent studies in distributed-computing platform applications across five categories of platforms: cloud computing, edge, fog, IoT, and hybrid platforms.
We evaluated 27 articles regarding the proposed frameworks, deployed methods, and applications, noting advantages, drawbacks, and applied datasets, and screening for security mechanisms and the presence of Transfer Learning (TL). We found that most recent research (about 43%) used the IoT platform as the environment for the proposed architecture, and most of the studies (about 46%) were published in 2021. In addition, the most popular DL algorithm was the Convolutional Neural Network (CNN), used in 19.4% of the studies. However technology changes, delivering appropriate therapy for patients remains the primary aim of healthcare-associated departments; further studies are therefore recommended to develop more functional architectures based on DL and distributed environments and to better evaluate present healthcare data-analysis models.

Review | Citation Count: 80
Applications of ML/DL in the management of smart cities and societies based on new trends in information technologies: A systematic literature review (Elsevier, 2022)
Heidari, Arash; Navimipour, Nima Jafari; Unal, Mehmet

The goal of managing smart cities and societies is to maximize the efficient use of finite resources while enhancing the quality of life. To establish a sustainable urban existence, smart cities use new technologies such as the Internet of Things (IoT), Internet of Drones (IoD), and Internet of Vehicles (IoV). The data created by these technologies are analyzed to obtain new information for increasing the efficiency and effectiveness of smart societies and cities. Smart traffic management, smart power and energy management, city surveillance, smart buildings, and patient healthcare monitoring are the most common applications in smart cities. Artificial Intelligence (AI), Machine Learning (ML), and Deep Learning (DL) approaches all hold considerable promise for managing automated activities in smart cities.
Therefore, we discuss different research issues and possible research paths in which the aforementioned techniques might help materialize the smart-city notion. The goal of this research is to offer a better understanding of (1) the fundamentals of smart city and society management, (2) the most recent developments and breakthroughs in this field, (3) the benefits and drawbacks of existing methods, and (4) areas that require further investigation and consideration. IoT, cloud computing, edge computing, fog computing, IoD, IoV, and hybrid models are the seven key emerging developments in information technology used in this paper to categorize the state-of-the-art techniques. The results indicate that the Convolutional Neural Network (CNN) and Long Short-Term Memory (LSTM) are the most commonly used ML methods in the publications. The majority of papers concern smart cities' power and energy management. Furthermore, most papers concentrate on improving only one parameter, with accuracy receiving the most attention. In addition, Python is the most frequently used language, appearing in 69.8% of the papers.

Article | Citation Count: 0
The applications of nature-inspired algorithms in Internet of Things-based healthcare service: A systematic literature review (Wiley, 2024)
Amiri, Zahra; Heidari, Arash; Zavvar, Mohammad; Navimipour, Nima Jafari; Esmaeilpour, Mansour

This work addresses the intersection of nature-inspired algorithms and the IoT within the healthcare domain, covering emerging trends and potential synergies between nature-inspired computational approaches and IoT technologies for advancing healthcare services. Our research aims to fill gaps concerning algorithmic-integration challenges, real-world implementation issues, and the efficacy of nature-inspired algorithms in IoT-based healthcare.
We provide insights into the practical aspects and limitations of such applications through a systematic literature review. Specifically, we address the need for a comprehensive understanding of the applications of nature-inspired algorithms in IoT-based healthcare, identifying gaps such as the lack of standardized evaluation metrics and of studies on integration challenges and security considerations. By bridging these gaps, our paper offers insights and directions for future research in this domain, exploring the diverse landscape of nature-inspired algorithms in healthcare. Our chosen methodology is a Systematic Literature Review (SLR) to investigate related papers rigorously. We categorize these algorithms into groups such as genetic algorithms, particle swarm optimization, cuckoo algorithms, ant colony optimization, other approaches, and hybrid methods, employing meticulous classification based on critical criteria. MATLAB emerges as the predominant programming language, used in 37.9% of cases. Our evaluation emphasizes adaptability as the paramount parameter, accounting for 18.4% of considerations. By shedding light on attributes, limitations, and potential directions for future research and development, this review aims to contribute to a comprehensive understanding of nature-inspired algorithms in the dynamic landscape of IoT-based healthcare services. Highlights:
- Providing a complete overview of the current issues associated with nature-inspired algorithms in IoT-based healthcare services.
- Providing a thorough overview of present methodologies for IoT-based healthcare services in research studies.
- Evaluating each study that tailored nature-inspired algorithms from multiple perspectives, such as advantages, restrictions, datasets, security involvement, and simulation settings.
- Outlining the critical aspects that motivate the cited approaches, to enhance future research.
- Illustrating descriptions of certain IoT-based healthcare services used in various studies.

Review | Citation Count: 1
Blockchain Systems in Embedded Internet of Things: Systematic Literature Review, Challenges Analysis, and Future Direction Suggestions (MDPI, 2022)
Darbandi, Mehdi; Al-Khafaji, Hamza Mohammed Ridha; Nasab, Seyed Hamid Hosseini; AlHamad, Ahmad Qasim Mohammad; Ergashevich, Beknazarov Zafarjon; Navimipour, Nima Jafari

Internet of Things (IoT) environments can extensively use embedded devices. Without the participation of consumers, tiny IoT devices will function and interact with one another, but their operations must be reliable and secure from various threats. The introduction of cutting-edge data-analytics methods for linked IoT devices, including blockchain, may lower costs and boost the use of cloud platforms. In a peer-to-peer network such as a blockchain, no one has to be trusted, because each peer is in charge of its own task and there is no central server. Because blockchain is tamper-proof, it is combined with the IoT to increase security. However, the technology is still developing and faces many challenges, such as power consumption and execution time. This article discusses blockchain technology and embedded devices in distant areas where IoT devices may encounter network shortages and possible cyber threats. This study aims to examine existing research while outlining prospective areas of future work on using blockchains in smart settings.
Finally, the efficiency of the blockchain is evaluated through performance parameters such as latency, throughput, storage, and bandwidth. The obtained results show that blockchain technology provides security and privacy for the IoT.

Review
Botnets Unveiled: a Comprehensive Survey on Evolving Threats and Defense Strategies (Wiley, 2024)
Asadi, Mehdi; Jamali, Mohammad Ali Jabraeil; Heidari, Arash; Navimipour, Nima Jafari

Botnets have emerged as a significant internet-security threat, comprising networks of compromised computers under the control of command-and-control (C&C) servers. These malevolent entities enable a range of malicious activities, from denial-of-service (DoS) attacks to spam distribution and phishing. Each bot operates as malicious binary code on a vulnerable host, granting remote control to attackers, who can harness the combined processing power of these compromised hosts for synchronized, highly destructive attacks while maintaining anonymity. This survey explores botnets and their evolution, covering their life cycles, C&C models, communication protocols, detection methods, the unique environments botnets operate in, and strategies to evade detection tools. It analyzes research challenges and future directions related to botnets, with a particular focus on evasion and detection techniques, including methods like encryption and the use of covert channels for detection and the reinforcement of botnets. By reviewing existing research, the survey provides a comprehensive overview of botnets, from their origins to their evolving tactics, and evaluates how botnets evade detection and how to counteract their activities. Its primary goal is to inform the research community about the changing landscape of botnets and the challenges of combating these threats, offering guidance on addressing security concerns effectively by highlighting evasion and detection methods.
The survey concludes by presenting future research directions, including using encryption and covert channels for detection and strategies to strengthen botnets, aiming to guide researchers in developing more robust security measures to combat botnets effectively. In short, this survey dives into botnets, covering life cycles, communication, and evasion tactics, and highlights challenges and future strategies for combating cyber threats.

Article | Citation Count: 3
A cloud database route scheduling method using a hybrid optimization algorithm (Wiley, 2023)
Baghi, Zahra Shokri; Navimipour, Nima Jafari

Cloud computing has emerged as a technology allowing a company to employ computing resources such as applications, software, and hardware over the Internet. Scholars have paid great attention to cloud computing because of its cutting-edge availability, cost reduction, and boundless applications. A cloud database is a data storage site on the web for which the optimal access path must be found, so identifying the ideal path to a database is crucial; choosing the best route is formulated as a scheduling problem. Cloud-database path scheduling is a multifaceted procedure involving congestion control, routing lists, and network-flow distribution, and it suffers delays when searching the cloud database for the required source route. Offering numerous, effectively infinite resources under a growing database workload is an NP-hard optimization problem in which query requests need optimal schedules to respond to the required services. We therefore use a hybrid of Cuckoo Search (CS), a method inspired by the behavior of a social bird, and a Genetic Algorithm (GA) to solve this problem.
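The CS/GA hybrid described in this abstract can be sketched in miniature. Everything below is an illustrative assumption rather than the paper's formulation: the toy route-cost function, the [0, 1] parameter encoding, and all algorithm parameters are invented; only the overall shape (Lévy-style cuckoo moves, genetic crossover and mutation, nest abandonment) follows the abstract.

```python
import random

def route_cost(route):
    # Toy stand-in for a cloud-database route cost (e.g., latency + congestion).
    return sum((x - 0.5) ** 2 for x in route)

def levy_step(scale=0.02):
    # Simplified heavy-tailed step, mimicking cuckoo search's Levy flights.
    u, v = random.gauss(0, 1), random.gauss(0, 1)
    return scale * u / (abs(v) ** 0.5 + 1e-9)

def clip(x):
    return min(1.0, max(0.0, x))

def hybrid_cs_ga(dim=5, nests=20, iters=300, abandon=0.25):
    pop = [[random.random() for _ in range(dim)] for _ in range(nests)]
    for _ in range(iters):
        # CS phase: a Levy-flight move from a random nest replaces a worse nest.
        i, j = random.randrange(nests), random.randrange(nests)
        cand = [clip(x + levy_step()) for x in pop[i]]
        if route_cost(cand) < route_cost(pop[j]):
            pop[j] = cand
        # GA phase: one-point crossover of the two best nests, plus mutation.
        pop.sort(key=route_cost)
        cut = random.randrange(1, dim)
        child = pop[0][:cut] + pop[1][cut:]
        k = random.randrange(dim)
        child[k] = clip(child[k] + random.gauss(0, 0.05))
        # Abandonment: replace the worst fraction of nests with fresh ones.
        n_drop = max(1, int(abandon * nests))
        pop[-n_drop:] = [[random.random() for _ in range(dim)]
                         for _ in range(n_drop)]
        pop[-1] = child  # the GA child takes one of the freed slots
    return min(pop, key=route_cost)

random.seed(7)
best_route = hybrid_cs_ga()
```

The CS phase supplies broad exploration through heavy-tailed steps, while the GA phase exploits the best nests found so far, which is the kind of search/utilization balance the abstract attributes to integrating genetic operators.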
Integrating genetic operators dramatically enhances the balance between exploration and exploitation.

Review | Citation Count: 7
Cloud healthcare services: A comprehensive and systematic literature review (Wiley, 2022)
Rahimi, Morteza; Navimipour, Nima Jafari; Hosseinzadeh, Mehdi; Moattar, Mohammad Hossein; Darwesh, Aso

Over the last decade, the landscape of cloud computing has changed significantly. It is known as a paradigm in which a shared pool of computing resources is accessible to users. The rapid growth of the healthcare environment provides better medical services, reducing costs and increasing competition among healthcare providers. Despite the crucial role of healthcare services in the cloud, no thorough study exists in this domain. This article presents a systematic study of healthcare services in the cloud environment. All relevant databases have been explored in a well-organized manner. By clustering the research goals of the retrieved papers, we derived four main research groups. We further evaluated the papers concerning their background, QoS parameters, application area, and the methods used for applying and formulating the main ideas presented in the works. This survey emphasizes the challenges, needs, and benefits of using cloud computing in healthcare systems and provides a comprehensive and detailed study of cloud healthcare services and the strengths and weaknesses of existing methods. Highlighting cloud health services can be a major focus of research for developing urban healthcare systems.

Article | Citation Count: 1
A cloud service composition method using a fuzzy-based particle swarm optimization algorithm (Springer, 2023)
Nazif, Habibeh; Nassr, Mohammad; Al-Khafaji, Hamza Mohammed Ridha; Navimipour, Nima Jafari; Unal, Mehmet

In today's dynamic business landscape, organizations heavily rely on cloud computing to leverage the power of virtualization and resource sharing.
Service composition plays a vital role in cloud computing, combining multiple cloud services to fulfill complex user requests. Service composition presents several challenges, including service heterogeneity, dynamic service availability, Quality of Service (QoS) constraints, and scalability issues. Traditional approaches often struggle to handle these challenges efficiently, leading to suboptimal resource utilization and poor service performance. This work presents a fuzzy-based strategy for composing cloud services to overcome these obstacles. Because service composition is NP-hard, numerous papers have employed a range of metaheuristic algorithms; here, Particle Swarm Optimization (PSO) is applied to solve the problem. Implementing a fuzzy-based PSO for service composition requires defining the fuzzy membership functions and rules for the specific service domain; once the fuzzy-logic components are established, they can be integrated into the PSO algorithm. The simulation results show the high efficiency of the proposed method in decreasing latency, cost, and response time.

Review | Citation Count: 10
A comprehensive and systematic literature review on the big data management techniques in the internet of things (Springer, 2023)
Naghib, Arezou; Navimipour, Nima Jafari; Hosseinzadeh, Mehdi; Sharifi, Arash

The Internet of Things (IoT) is a communication paradigm and a collection of heterogeneous interconnected devices. It produces large-scale, distributed, and diverse data called big data. Big Data Management (BDM) in the IoT is used for knowledge discovery and intelligent decision-making and is one of the most significant research challenges today. There are several mechanisms and technologies for BDM in the IoT. This paper systematically studies the important mechanisms in this area, covering articles published between 2016 and August 2022.
Initially, 751 articles were identified, but a paper-selection process reduced this to 110 significant studies. BDM mechanisms in the IoT are studied in four categories: BDM processes, BDM architectures/frameworks, quality attributes, and big-data analytics types. The paper also presents a detailed comparison of the mechanisms in each category. Finally, the development challenges and open issues of BDM in the IoT are discussed. Predictive analysis and classification methods are used in many articles, while some quality attributes, such as confidentiality, accessibility, and sustainability, receive less consideration. Notably, none of the articles uses key-value databases for data storage. This study can help researchers develop more effective BDM-in-IoT methods for complex environments.

Article | Citation Count: 0
Comprehensive survey of artificial intelligence techniques and strategies for climate change mitigation (Pergamon-Elsevier Science Ltd, 2024)
Amiri, Zahra; Heidari, Arash; Navimipour, Nima Jafari

With the galloping progress of climate change around the world, Machine Learning (ML) approaches have been studied prevalently in this area. ML is a robust tool for acquiring insights from data. In this paper, we elaborate on climate-change-mitigation issues and the ML approaches leveraged to solve them and to aid the improvement and operation of sustainable energy systems. ML has been employed in many applications and scopes of climate subjects, such as ecosystems, agriculture, buildings and cities, industry, and transportation. A Systematic Literature Review (SLR) is therefore applied to explore and evaluate findings from related research. In this paper, we propose a novel taxonomy of Deep Learning (DL) method applications for climate change mitigation, a comprehensive analysis that has not been conducted before.
We evaluated these methods based on critical parameters such as accuracy, scalability, and interpretability, and quantitatively compared their results. This analysis provides new insights into the effectiveness and reliability of DL methods in addressing climate-change challenges. We classified climate-change ML methods into six key customizable groups: ecosystems, industry, buildings and cities, transportation, agriculture, and hybrid applications. Afterward, state-of-the-art research on ML mechanisms and applications for climate-change-mitigation issues is highlighted. In addition, many problems and issues related to ML implementation for climate change are mapped, which should stimulate more researchers to manage the future disastrous effects of climate change. Based on the findings, Python was the most common simulation environment, used in 38.5% of the papers. Most of the methods were analyzed and evaluated in terms of parameters such as accuracy, latency, adaptability, and scalability. Classification is the most frequent ML task within climate-change mitigation, accounting for 40% of the total, and Convolutional Neural Networks (CNNs) are the most widely utilized approach across a variety of applications.

Article | Citation Count: 0
A cost- and energy-efficient SRAM design based on a new 5 i-p majority gate in QCA nanotechnology (Elsevier, 2024)
Kassa, Sankit; Ahmadpour, Seyed-Sajad; Lamba, Vijay; Misra, Neeraj Kumar; Navimipour, Nima Jafari; Kotecha, Ketan

Quantum-dot Cellular Automata (QCA) is a revolutionary paradigm in the nano-scale VLSI market with the potential to replace the traditional Complementary Metal-Oxide-Semiconductor (CMOS) system.
To demonstrate its usefulness, this article provides a QCA-based innovative structure comprising a 5-input (i-p) majority gate, one of the basic gates in QCA, and a Static Random Access Memory (SRAM) cell with set and reset functionalities. The suggested design, with nominal clock zones, provides a reliable, compact, efficient, and durable configuration that helps achieve optimal size and latency while decreasing power consumption. Based on the suggested 5 i-p majority gate, the realized SRAM architecture reduces energy dissipation by 33.95%, cell count by 31.34%, and area by 33.33% compared with the most recent designs. Latency and cost are decreased by 30% and 53.95%, respectively.

Review | Citation Count: 0
A deep analysis of nature-inspired and meta-heuristic algorithms for designing intrusion detection systems in cloud/edge and IoT: state-of-the-art techniques, challenges, and future directions (Springer, 2024)
Hu, Wengui; Cao, Qingsong; Darbandi, Mehdi; Navimipour, Nima Jafari

The number of cloud-, edge-, and Internet of Things (IoT)-based applications that produce sensitive and personal data has rapidly increased in recent years. The IoT is a new model that integrates physical objects and the Internet and has become one of the principal technological evolutions of computing. Cloud computing is a paradigm for centralized computing that gathers resources in one place and makes them available to consumers via the Internet. Despite the vast array of resources that cloud computing offers, real-time mobile applications might not find it acceptable because it is typically located far from users. In applications where low latency and high dependability are required, edge computing, which disperses resources to the network edge, is becoming more and more popular.
Though it has less processing power than traditional cloud computing, edge computing offers resources in a decentralized way that can react to customers' needs more quickly. Because the data in these applications is so sensitive, there has been a sharp increase in attackers stealing it; thus, a powerful Intrusion Detection System (IDS) that can identify intruders is required. IDSs are essential for the cybersecurity of IoT, cloud, and edge architectures. Investigators have mostly embraced deep-learning algorithms as a means of protecting the IoT environment, but these techniques have issues with computational complexity, long processing times, and poor precision. Feature-selection approaches can be utilized to overcome these problems, and optimization methods, including bio-inspired algorithms, are applied as feature-selection approaches to enhance the classification accuracy of IDSs. It appears that no prior study has looked into these difficulties in depth. This research thoroughly analyzes the current literature on intrusion detection and on using nature-inspired algorithms to safeguard IoT and cloud/edge settings. The article examines pertinent analyses and surveys on these subjects, threats, and outlooks, as well as many algorithms frequently used in developing IDSs for IoT security. The findings demonstrate their efficiency in addressing IoT and cloud/edge ecosystem security issues. Moreover, the methods proposed in the literature can improve IDS security and dependability in terms of precision and execution speed.

Article | Citation Count: 1
The deep learning applications in IoT-based bio- and medical informatics: a systematic literature review (Springer London Ltd, 2024)
Amiri, Zahra; Heidari, Arash; Navimipour, Nima Jafari; Esmaeilpour, Mansour; Yazdani, Yalda

Nowadays, machine learning (ML) has attained a high level of achievement in many contexts.
Considering the significance of ML in medical informatics and bioinformatics owing to its accuracy, many investigators have discussed solutions for addressing medical and bioinformatics challenges using deep learning (DL) techniques. The importance of DL in Internet of Things (IoT)-based bio- and medical informatics lies in its ability to analyze and interpret large amounts of complex and diverse data in real time, providing insights that can improve healthcare outcomes and increase efficiency in the healthcare industry. Applications of DL in IoT-based bio- and medical informatics include diagnosis, treatment recommendation, clinical decision support, image analysis, wearable monitoring, and drug discovery. This review aims to comprehensively evaluate and synthesize the existing literature on applying deep learning at the intersection of the IoT with bio- and medical informatics. We categorized the most cutting-edge DL solutions for medical and bioinformatics issues into five categories based on the DL technique utilized: convolutional neural network, recurrent neural network, generative adversarial network, multilayer perceptron, and hybrid methods. A systematic literature review was applied to study each one in terms of effective properties, like the main idea, benefits, drawbacks, methods, simulation environment, and datasets. Afterward, cutting-edge research on DL approaches and applications for bioinformatics concerns is emphasized. In addition, several challenges that complicate DL implementation for medical and bioinformatics applications are addressed, which should motivate more studies that progressively develop medical and bioinformatics research.
According to the findings, most articles are evaluated using metrics like accuracy, sensitivity, specificity, F-score, latency, adaptability, and scalability.

Article | Citation Count: 27
Deep Q-Learning Technique for Offloading Offline/Online Computation in Blockchain-Enabled Green IoT-Edge Scenarios (MDPI, 2022)
Heidari, Arash; Jamali, Mohammad Ali Jabraeil; Navimipour, Nima Jafari; Akbarpour, Shahin

The number of Internet of Things (IoT)-related innovations has recently increased exponentially, with numerous IoT objects being invented one after another. Computation offloading determines where, and with how many resources, tasks or applications are carried out. In the IoT environment, the strategy is to transfer resource-intensive computational tasks to an external device in the network, such as a cloud, fog, or edge platform. Offloading is one of the key technological enablers of the IoT, as it helps overcome the resource limitations of individual objects. A major shortcoming of previous research is the lack of an integrated offloading framework that can operate in an offline/online environment while preserving security. This paper offers a new deep Q-learning approach to the blockchain-enabled IoT-edge offloading problem using a Markov Decision Process (MDP). There is a substantial gap in secure online/offline offloading systems, and no work has been published in this arena thus far. The proposed system can be used online and offline while maintaining privacy and security; it employs the Post-Decision State (PDS) mechanism in online mode. Additionally, we integrate edge/cloud platforms into IoT blockchain-enabled networks to encourage the computational potential of IoT devices, enabling safe and secure cloud/edge/IoT offloading by employing blockchain.
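The offloading decision process described in this abstract can be illustrated with a deliberately simplified, tabular Q-learning sketch. The paper's actual method is a deep Q-network with a Post-Decision State mechanism in a blockchain-enabled setting; the two task-load states, three offloading targets, and the cost table below are hypothetical stand-ins invented for illustration.

```python
import random

ACTIONS = ["local", "edge", "cloud"]
STATES = ["light", "heavy"]  # task-load levels

COST = {  # toy energy+latency cost of running a task of a given load somewhere
    ("light", "local"): 1.0, ("light", "edge"): 2.0, ("light", "cloud"): 3.0,
    ("heavy", "local"): 6.0, ("heavy", "edge"): 2.5, ("heavy", "cloud"): 2.0,
}

def train(episodes=3000, alpha=0.1, gamma=0.9, eps=0.15):
    q = {(s, a): 0.0 for s in STATES for a in ACTIONS}
    state = random.choice(STATES)
    for _ in range(episodes):
        # Epsilon-greedy: usually pick the action with the lowest Q (cost).
        if random.random() < eps:
            action = random.choice(ACTIONS)
        else:
            action = min(ACTIONS, key=lambda a: q[(state, a)])
        nxt = random.choice(STATES)  # the next task arrives with a random load
        best_next = min(q[(nxt, a)] for a in ACTIONS)
        # Standard Q-update toward immediate cost plus discounted future cost.
        q[(state, action)] += alpha * (
            COST[(state, action)] + gamma * best_next - q[(state, action)])
        state = nxt
    return q

random.seed(0)
q_table = train()
policy = {s: min(ACTIONS, key=lambda a: q_table[(s, a)]) for s in STATES}
```

With this toy cost model, the learned policy keeps light tasks on the device and offloads heavy tasks, which is the qualitative behavior an offloading agent is meant to acquire.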
In this system, the master controller, offloading decision, block size, and processing nodes can be dynamically chosen and changed to reduce device energy consumption and cost. Simulation results in TensorFlow and Cooja demonstrate that the method can dramatically boost system efficiency relative to existing schemes, beating four benchmarks on average in cost by 6.6%, computational overhead by 7.1%, energy use by 7.9%, task failure rate by 6.2%, and latency by 5.5%.

Review | Citation Count: 5
Deepfake detection using deep learning methods: A systematic and comprehensive review (Wiley Periodicals, Inc., 2024)
Dağ, Hasan; Navimipour, Nima Jafari; Unal, Mehmet

Deep Learning (DL) has been effectively utilized in various complicated challenges in healthcare, industry, and academia, including thyroid diagnosis, lung-nodule recognition, computer vision, big-data analytics, and human-level control. Nevertheless, developments in digital technology have also been used to produce software that threatens democracy, national security, and confidentiality. Deepfake is one such DL-powered application to surface recently. Deepfake systems can create fake images, movies, and sounds, primarily by replacing scenes or faces, that humans cannot tell apart from real ones, and various technologies have put the capacity to alter speech, images, or video at our fingertips. Video and image frauds are now so convincing that it is hard to distinguish false from authentic content with the naked eye. This can result in various issues, ranging from deceiving public opinion to using doctored evidence in court. For such reasons, it is critical to have technologies that can assist us in discerning reality. This study gives a complete assessment of the literature on deepfake-detection strategies using DL-based algorithms.
We categorize deepfake-detection methods based on their applications: video detection, image detection, audio detection, and hybrid multimedia detection. The objective of this paper is to give the reader a better knowledge of (1) how deepfakes are generated and identified, (2) the latest developments and breakthroughs in this realm, (3) the weaknesses of existing security methods, and (4) areas requiring more investigation and consideration. The results suggest that Convolutional Neural Networks (CNNs) are the most often employed DL method in the publications. The majority of the articles address video deepfake detection, and most focus on enhancing only one parameter, with accuracy receiving the most attention. This article is categorized under:
- Technologies > Machine Learning
- Algorithmic Development > Multimedia
- Application Areas > Science and Technology

Article | Citation Count: 0
Design and analysis of a fault tolerance nano-scale code converter based on quantum-dots (Elsevier, 2024)
Xie, Changgui; Zhao, Xin; Navimipour, Nima Jafari

Quantum-dot cellular automata (QCA), a nano-scale computing framework, is developing as a potential alternative to current transistor-based technologies. However, as a novel technology, it is susceptible to a variety of fabrication-related errors and process variances. As a result, QCA-based circuits pose reliability problems, since they are prone to faults. To address these dependability challenges, it is becoming increasingly necessary to create fault-tolerant QCA-based circuits. At the same time, code converters are essential in digital systems for rapid signal processing. Using a fault-tolerant XOR gate and multiplexer, this research proposes a single-layer, nano-scale binary-to-gray and gray-to-binary code-converter circuit to increase efficiency and reduce complexity.
The suggested circuits' tolerance of cell omission, misalignment, displacement, and extra cell deposition faults is significantly improved. With respect to the generalized design metrics of QCA circuits, the fault-tolerant designs have been compared with existing structures. The proposed circuits' energy dissipation was calculated with the QCADesigner-E power estimator tool, and their functionality was confirmed with the same program. The results imply the high efficiency and applicability of the proposed designs.

Article (Citation Count: 11)
An efficient and energy-aware design of a novel nano-scale reversible adder using a quantum-based platform (Elsevier, 2022) Ahmadpour, Seyed-Sajad; Navimipour, Nima Jafari; Mosleh, Mohammad; Bahar, Ali Newaz; Das, Jadav Chandra; De, Debashis; Yalcin, Senay

Quantum-dot cellular automata (QCA) is a domain-coupling nano-technology that has drawn significant attention for its low power consumption, area, and design overhead, and it can achieve higher speed than CMOS technology. Recently, interest in designing reversible circuits has been growing because they reduce energy dissipation, making QCA a crucial candidate for reversible circuits in nano-technology. On the other hand, the addition operator is one of the primary operations in digital and analog circuits due to its wide applications in digital signal processing and computer arithmetic. Accordingly, full adders, among the most essential fundamental circuits in digital processing, are used extensively to solve mathematical problems efficiently and quickly. Therefore, this article first suggests a novel reversible block called the RF-adder block and then proposes an effective reversible adder design built from it.
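The internal gate network of the RF-adder block is not reproduced in this listing, so as a generic sketch only, a reversible full adder can be built from Toffoli (CCNOT) and CNOT gates on four lines, where the fourth (ancilla) line starts at 0 and every step is a bijection:

```python
def cnot(c, t):
    """CNOT: flip target t when control c is 1."""
    return c, t ^ c

def toffoli(c1, c2, t):
    """Toffoli (CCNOT): flip target t when both controls are 1."""
    return c1, c2, t ^ (c1 & c2)

def reversible_full_adder(a, b, cin):
    """Generic reversible full adder (not the paper's RF-adder block):
    four lines (a, b, cin, ancilla=0), composed only of bijective gates."""
    d = 0                            # ancilla line
    a, b, d = toffoli(a, b, d)       # d = a AND b
    a, b = cnot(a, b)                # b = a XOR b
    b, cin, d = toffoli(b, cin, d)   # d = (a AND b) XOR ((a XOR b) AND cin) = carry
    b, cin = cnot(b, cin)            # cin = a XOR b XOR cin = sum
    return cin, d                    # (sum, carry_out)
```

Because each gate is invertible, no input information is erased, which is the property that lets reversible designs sidestep the energy cost of bit erasure.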
The QCAPro and QCADesigner 2.0.3 tools were employed to assess the effectiveness of the suggested reversible full adder. Compared with the best previous structure, the proposed circuit reduces energy dissipation by 45.55%, 38.82%, and 34.62% at three different tunneling energy levels, respectively.

Article (Citation Count: 11)
An Energy-Aware IoT Routing Approach Based on a Swarm Optimization Algorithm and a Clustering Technique (Springer, 2022) Sadrishojaei, Mahyar; Navimipour, Nima Jafari; Reshadi, Midia; Hosseinzadeh, Mehdi

The Internet of Things (IoT) comprises many nodes dispersed around a particular target region and has lately been applied in a variety of sectors such as smart cities, farming, climatology, smart metering, and waste treatment. Even though the IoT has tremendous potential, some difficulties must be addressed. When building clustering and routing protocols for huge-scale IoT networks, uniform energy usage and optimization are two significant concerns; clustering and routing are well-known NP-hard optimization problems in the IoT. The ease with which chicken swarm optimization can be implemented has garnered much interest compared to other population-based metaheuristic algorithms for solving IoT optimization problems. By choosing the most suitable cluster heads, the current effort aims to reduce node energy consumption and thereby extend the network's lifetime. This research proposes a new cost function for the homogeneous dispersion of cluster heads, together with a good balance between exploration and exploitation search abilities, to create a node clustering protocol based on chicken swarm search. This protocol is a significant step forward from previous state-of-the-art protocols.
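A cost function rewarding residual energy and homogeneous dispersion of cluster heads can be sketched as follows. The functional form, weights, and helper names here are hypothetical, not the paper's actual cost function; a metaheuristic such as chicken swarm search would minimize a function of this shape.

```python
import math

def cost(cluster_heads, nodes, energy):
    """Hypothetical clustering cost: lower is better. Penalizes heads
    with little residual energy and heads placed close together
    (i.e., favors homogeneous dispersion). Illustrative form only."""
    # Energy term: a head with low residual energy contributes a large penalty.
    e_term = sum(1.0 / (energy[h] + 1e-9) for h in cluster_heads)
    # Dispersion term: nearby head pairs contribute large penalties.
    d_term = 0.0
    for i, h1 in enumerate(cluster_heads):
        for h2 in cluster_heads[i + 1:]:
            d_term += 1.0 / (math.dist(nodes[h1], nodes[h2]) + 1e-9)
    return e_term + d_term
```

Two well-separated, high-energy heads score lower (better) than a pair including a low-energy head placed near another head, which is the behavior the cost function is meant to encode.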
The protocol's overall performance is evaluated by the number of packets received, total power consumption, the number of active nodes, and latency. The proposed strategy has been demonstrated to reduce power consumption by at least 16 percent.

Article (Citation Count: 0)
An Energy-Aware Load Balancing Method for IoT-Based Smart Recycling Machines Using an Artificial Chemical Reaction Optimization Algorithm (MDPI, 2023) Milan, Sara Tabaghchi; Darbandi, Mehdi; Navimipour, Nima Jafari; Yalcin, Senay

Recycling is very important for a sustainable and clean environment, and developed and developing countries alike face waste management and recycling problems. On the other hand, the Internet of Things (IoT) is a well-known, applicable infrastructure for connecting physical devices; it is an important technology, researched and implemented in recent years, that promises to positively influence several industries, including recycling and waste management. The impact of the IoT on recycling and waste management is examined using standard operating practices in recycling. Recycling facilities, for instance, can use the IoT to manage and monitor the recycling situation in various places while allocating logistics for transportation and distribution to minimize recycling costs and lead times. Companies can thus use historical patterns to track usage trends in their service regions, assess their ability to gather resources, and plan their activities accordingly. Additionally, energy is a significant aspect of the IoT: many devices will be linked to the internet, and the devices, sensors, nodes, and objects are all energy-restricted. Because the devices are constrained by nature, a load-balancing protocol is crucial in an IoT ecosystem.
Due to the importance of this issue, this study presents an energy-aware load-balancing method for IoT-based smart recycling machines using an artificial chemical reaction optimization algorithm. The experimental results indicated that the proposed solution achieves excellent performance, reducing the imbalance degree (5.44%), energy consumption (11.38%), and delay time (9.05%).
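The load-balancing objective such a method optimizes can be sketched as a search over task-to-machine assignments that minimizes an imbalance degree. The sketch below is a deliberately simplified, hypothetical stand-in: a single random-move "reaction" with greedy acceptance, not the paper's actual chemical reaction operators (decomposition, synthesis, etc.) or its exact imbalance formula.

```python
import random

def imbalance(assignment, machines):
    """Imbalance degree: spread of per-machine load relative to the mean.
    `assignment` is a list of (machine_index, task_size) pairs."""
    loads = [0.0] * machines
    for m, size in assignment:
        loads[m] += size
    mean = sum(loads) / machines
    return (max(loads) - min(loads)) / mean if mean else 0.0

def acro_balance(tasks, machines, iters=500, seed=1):
    """Simplified chemical-reaction-style search (illustrative only):
    start from a random assignment ('molecule'), apply random single-task
    moves ('reactions'), and keep each move only if it lowers imbalance."""
    rng = random.Random(seed)
    best = [(rng.randrange(machines), size) for size in tasks]
    best_deg = imbalance(best, machines)
    for _ in range(iters):
        cand = list(best)
        i = rng.randrange(len(cand))
        cand[i] = (rng.randrange(machines), cand[i][1])  # reassign one task
        deg = imbalance(cand, machines)
        if deg < best_deg:
            best, best_deg = cand, deg
    return best, best_deg
```

With eight unit-size tasks on two machines, the search quickly reaches the perfectly balanced 4/4 split (imbalance degree 0); an energy-aware variant would fold per-device energy budgets into the objective alongside the imbalance degree.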