Network Coding - We propose and investigate a novel delay-attack-resistant network scheme based on optical code division multiple access (OCDMA). The bit error rate (BER) is analyzed theoretically, and a closed-form expression for the BER is obtained. The system has a corresponding optimal threshold that minimizes the BER in different cases. For a given transmission power, the BER decreases as the code weight increases. Furthermore, the maximum number of users differs with code weight and transmission power. OptiSystem simulation results show that the OCDMA system resists delay attacks, which can effectively improve the physical-layer security of optical networks.
Authored by Mandong Liu, Peng Ouyang, Jianhua Ji, Ming Xu
Network Coding - This paper proposes a hybrid encryption scheme for multi-relay (MR) physical-layer network coding (PNC). Based on a three-relay (3R) bidirectional communication model, we first discuss the throughput of PNC compared with the traditional scheme (TS) and a network coding (NC) system; the analysis of transmission efficiency demonstrates the superior throughput of the PNC system. Then, to further improve the security of the communication system, we give a hybrid encryption scheme combining the Advanced Encryption Standard (AES) and Rivest-Shamir-Adleman (RSA), namely AR hybrid encryption. Finally, we embed the AR hybrid encryption into the multi-relay PNC communication system. At the relay nodes of the ARPNC system, we focus on solving the problem of signal mapping. Meanwhile, to reduce the performance loss caused by the increase in relay nodes, we exploit a Low-Density Parity-Check (LDPC) code to enhance decoding accuracy. The experimental results and security analysis show that the proposed scheme can boost system throughput and transmission dependability and strengthen the security of the communication system.
Authored by Yanru Yang, Meng Tang, Haihua Li, Guofeng Zeng, Jianhua Chen, Yongtao Yu
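To make the AR hybrid idea concrete, here is a minimal sketch of the generic AES+RSA hybrid pattern using the Python cryptography library: the bulk payload is encrypted with a fresh symmetric AES key, and only that key is protected with RSA. This is the textbook construction under assumed parameters, not the paper's exact ARPNC mapping or key sizes.

```python
import os
from cryptography.hazmat.primitives import hashes
from cryptography.hazmat.primitives.asymmetric import rsa, padding
from cryptography.hazmat.primitives.ciphers.aead import AESGCM

# Receiver's RSA key pair (2048-bit, illustrative size).
private_key = rsa.generate_private_key(public_exponent=65537, key_size=2048)
public_key = private_key.public_key()

# Sender: encrypt the payload with a fresh AES key, then wrap that key with RSA.
payload = b"node A -> relay -> node B"            # illustrative message
aes_key = AESGCM.generate_key(bit_length=128)
nonce = os.urandom(12)
ciphertext = AESGCM(aes_key).encrypt(nonce, payload, None)
wrapped_key = public_key.encrypt(
    aes_key,
    padding.OAEP(mgf=padding.MGF1(algorithm=hashes.SHA256()),
                 algorithm=hashes.SHA256(), label=None))

# Receiver: unwrap the AES key with RSA, then decrypt the payload.
recovered_key = private_key.decrypt(
    wrapped_key,
    padding.OAEP(mgf=padding.MGF1(algorithm=hashes.SHA256()),
                 algorithm=hashes.SHA256(), label=None))
print(AESGCM(recovered_key).decrypt(nonce, ciphertext, None))
```

The design reason for the hybrid is the usual one: RSA is slow and limited in plaintext size, so it only carries the short symmetric key, while AES carries the data stream.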
Network Coding - This paper introduces a method to improve the transmission model of BigNum network coding. The main contents include the research status of network coding, the principle of BigNum network coding, the security problems in existing techniques, the new coding matrices proposed for these problems, and the benefits of the new matrices together with a comparison. To improve the security of BigNum network coding, we propose two new coding matrix forms: a random number matrix and a Fibonacci generation matrix. We also give a proof of the invertibility of the Fibonacci generation matrix.
Authored by Zengqiang Tang, Yuyang Zhang, Wenxuan Qiao, Ping Dong
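To illustrate the invertibility property mentioned above, a brief sketch using the standard 2x2 Fibonacci Q-matrix (the paper's generation matrix may be defined differently):

```latex
Q = \begin{pmatrix} 1 & 1 \\ 1 & 0 \end{pmatrix}, \qquad
Q^{n} = \begin{pmatrix} F_{n+1} & F_{n} \\ F_{n} & F_{n-1} \end{pmatrix}, \qquad
\det\!\big(Q^{n}\big) = F_{n+1}F_{n-1} - F_{n}^{2} = (-1)^{n} \neq 0 .
```

Because the determinant is always plus or minus one, every power of Q is invertible and its inverse has integer entries, which is exactly the property a coding matrix needs so that receivers can decode.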
Network Coding - Network coding is finding ever wider application, and many studies aim to leverage it to improve network security. However, a clear security classification and hierarchy has been missing so far. By classifying and articulating existing schemes, this paper proposes a security hierarchy of network coding systems for the community. Four security grades, namely basic security, weak security, perfect security, and strong security, are tiered by security strength, and the principles and implementations of each are expounded. The hierarchy helps delineate, classify, and differentiate secure network coding schemes.
Authored by Na Qin, Yantao Liu
Network Accountability - Cloud computing shares resources among multiple clients, and resource allocation is an important task for the cloud service provider. Cloud computing is an infrastructure that provides on-demand network services, and a defining feature of cloud services is that users' data are hosted remotely. While taking advantage of this emerging technology, users' fear of losing control of their own data is becoming a noteworthy hurdle to the widespread adoption of cloud services. The cloud service provider module processes data owners' requests to store data files and applications and provides cloud users' log details to the data owner for audit purposes. To address this problem, a framework based on information accountability keeps track of, and provides a trail of, the actual handling of users' data in the cloud. In the proposed system, the data can be fully tracked by the owner, and compliance with service agreements is followed up through access, usage control, and management.
Authored by Mostafa Mohammed, Zeyad Salih, Nicolae Tapus, Raed Hasan
Nearest Neighbor Search - Cloud computing is one of the most significant and widely used IT breakthroughs today, and the majority of enterprises now use private or public cloud services for their computing infrastructure. Cyber-attackers regularly target cloud resources by inserting malicious code or obfuscated malware onto servers. These obfuscated malware programs often manage to evade the detection technology in place and are discovered only long after they have done significant harm to the server. Machine Learning (ML) techniques have been shown to be effective at finding malware in a wide range of fields. To address feature selection (FS) challenges, this study uses the wrapper-based Binary Bat Algorithm (BBA), Cuckoo Search Algorithm (CSA), Mayfly Algorithm (MA), and Particle Swarm Optimization (PSO); k-Nearest Neighbor (kNN), Random Forest (RF), and Support Vector Machine (SVM) are then used to classify the benign and malicious records and measure performance in terms of various metrics. CIC-MalMem-2022, the most recent malware memory dataset, is used to evaluate and test the proposed approach, and the proposed system is found to be an acceptable solution for detecting malware.
Authored by Mohd. Ghazi, N. Raghava
Nearest Neighbor Search - With the rise and development of cloud computing, more and more companies try to outsource computing and storage to the cloud in order to save storage and computing cost. Because of the rich information contained in images, the explosion of image data is driving image outsourcing. However, images may contain a lot of sensitive information, and cloud servers are not always trusted; outsourcing images directly may lead to data breaches and raise privacy and security concerns. This has partly led to renewed interest in privacy-preserving encrypted image retrieval. There are still many challenges, however, such as low search accuracy and inefficiency due to the hundreds of high-dimensional features extracted from a single image and the large scale of image collections. To address these challenges, we propose an efficient, scalable, and privacy-preserving image retrieval scheme based on a ball tree. First, a pre-trained Convolutional Neural Network (CNN) model is employed to extract image feature vectors and improve search accuracy. Next, an encrypted ball tree is constructed using a Learning With Errors (LWE)-based secure k-Nearest Neighbor (kNN) algorithm. Finally, we conduct comprehensive experiments on real-world datasets and give a brief security analysis. The results show that our scheme is practical in terms of security, accuracy, and efficiency.
Authored by Xianxian Li, Jie Lei, Zhenkui Shi, Feng Yu
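As a point of reference for the retrieval step, here is a minimal plaintext sketch of ball-tree-based k-nearest-neighbor search over CNN feature vectors using scikit-learn; the encrypted LWE-based construction from the paper is not reproduced, and the feature dimensionality and k are illustrative assumptions.

```python
import numpy as np
from sklearn.neighbors import BallTree

# Assume each image has already been reduced to a CNN feature vector.
features = np.random.rand(10_000, 512)    # 10k images, 512-d features (illustrative)
tree = BallTree(features, leaf_size=40)   # index the image gallery once

query = np.random.rand(1, 512)            # feature vector of the query image
dist, idx = tree.query(query, k=5)        # 5 nearest neighbors by Euclidean distance
print(idx[0], dist[0])                    # indices and distances of the matches
```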
Nearest Neighbor Search - A network is the organization formed by connecting computers, typically by cable, for the purpose of communicating and transmitting data; a computer network is a collection of interconnected computers that allows the sharing of resources including data, programs, and files. When people think of computer networks, they think of the Internet. In this paper, we propose a new technique for the categorization of computer network traffic based on deep sparse autoencoders and k-nearest-neighbor (KNN) optimized with Grid Search. The autoencoders take the input data and extract high-level features from it, which are then passed to the KNN. The KNN is used to classify the features into distinct kinds of attacks (normal and abnormal). In comparison to other investigations, the proposed approach demonstrated an accuracy of 98.23\% in its results.
Authored by Sarmad Al-Jawashee, Mesüt Çevik
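Here is a minimal sketch of the classification stage described above, assuming the autoencoder features have already been extracted; scikit-learn's GridSearchCV stands in for the grid search over KNN hyperparameters, and the data shapes, parameter grid, and labels are illustrative, not the paper's configuration.

```python
import numpy as np
from sklearn.model_selection import GridSearchCV, train_test_split
from sklearn.neighbors import KNeighborsClassifier

# X: high-level features produced by the deep sparse autoencoder, y: traffic labels.
X = np.random.rand(2000, 32)                      # illustrative feature matrix
y = np.random.randint(0, 2, size=2000)            # 0 = normal, 1 = abnormal (illustrative)
X_tr, X_te, y_tr, y_te = train_test_split(X, y, test_size=0.3, random_state=42)

param_grid = {"n_neighbors": [3, 5, 7, 9], "weights": ["uniform", "distance"]}
grid = GridSearchCV(KNeighborsClassifier(), param_grid, cv=5, scoring="accuracy")
grid.fit(X_tr, y_tr)                              # grid search tunes the KNN
print(grid.best_params_, grid.score(X_te, y_te))  # accuracy on held-out traffic
```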
Nearest Neighbor Search - The Web component fingerprint library is the basis for solving the problem of Web component identification; a complete and accurate fingerprint library can effectively improve Web component identification capability. At present, the Web component fingerprint database is still expanded mainly through manual mining based on expert experience, which makes it difficult to extend and update, so there is an urgent need for a method to efficiently extend the fingerprint library. To solve this problem, an intelligent method for mining Web components and fingerprints is proposed. The method borrows the idea of manually mining new components and uses the search result characteristics of Web components in search engines to intelligently mine new Web components. At the same time, the fingerprints of Web components are obtained automatically through data mining on the websites where the new components are deployed. The experimental results show that 22 new components and 102 component fingerprints were found in a short time by the intelligent mining method, which can therefore efficiently mine Web components and fingerprints. Compared with current mainstream manual mining methods, the efficiency of this method is greatly improved, which demonstrates its feasibility.
Authored by Kaiming Yang, Tianyang Zhou, Guoren Zhong, Junhu Zhu, Ziqiao Zhou
Nearest Neighbor Search - The data of large-scale distributed demand-side IoT devices are gradually being migrated to the cloud. This cloud deployment mode makes it convenient for IoT devices to participate in the interaction between supply and demand, but at the same time it exposes various vulnerabilities of IoT devices to the Internet, where they can easily be accessed and manipulated by hackers to launch large-scale DDoS attacks. As an easy-to-understand supervised learning classification algorithm, KNN can obtain fairly accurate classification results without many tuning parameters and has produced many research achievements in the field of DDoS detection. However, in the face of high-dimensional data, the method has a high computational cost and is not practical. Aiming at this disadvantage, this chapter explores the potential of the classical KNN algorithm in terms of data storage structure, K-nearest-neighbor search, and hyperparameter optimization, and proposes an improved KNN algorithm for DDoS attack detection on demand-side IoT devices.
Authored by Kun Shi, Songsong Chen, Dezhi Li, Ke Tian, Meiling Feng
Nearest Neighbor Search - Network security is one of the main challenges faced by network administrators and owners, especially with the increasing numbers and types of attacks. This rapid increase creates a need to develop different protection techniques and methods. Network Intrusion Detection Systems (NIDS) detect and analyze network traffic to identify attacks and notify network administrators. Recently, machine learning (ML) techniques have been extensively applied in developing detection systems. Due to the high complexity of data exchanged over networks, applying ML techniques naively will negatively impact system performance because many features need to be analyzed. To select the most relevant feature subset from the input data, a feature selection technique is used, which enhances the overall performance of the NIDS. In this paper, we propose a wrapper approach to feature selection based on a Chaotic Crow Search Algorithm (CCSA) for anomaly network intrusion detection systems. Experiments were conducted on the LITNET2020 dataset. To the best of our knowledge, our proposed method is the first selection algorithm applied to this dataset based on swarm intelligence optimization to find a special subset of features for binary and multiclass classification that optimizes the performance for all classes at the same time. The model was evaluated using several ML classifiers, namely K-nearest neighbors (KNN), Decision Tree (DT), Random Forest (RF), Support Vector Machine (SVM), Multi-layer Perceptron (MLP), and Long Short-Term Memory (LSTM). The results proved that the proposed algorithm is more efficient in improving the performance of NIDS in terms of accuracy, detection rate, precision, F-score, specificity, and false alarm rate, outperforming state-of-the-art feature selection techniques recently proposed in the literature.
Authored by Hussein Al-Zoubi, Samah Altaamneh
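To make the wrapper idea concrete, here is a minimal sketch of the fitness evaluation that a swarm optimizer such as CCSA repeatedly calls: each candidate solution is a binary feature mask scored by a classifier (KNN here) via cross-validation. The mask encoding, penalty term, and data are illustrative assumptions, not the paper's exact formulation.

```python
import numpy as np
from sklearn.model_selection import cross_val_score
from sklearn.neighbors import KNeighborsClassifier

def fitness(mask, X, y):
    """Score a binary feature mask: higher accuracy on fewer features is better."""
    if mask.sum() == 0:                       # empty subsets are invalid
        return 0.0
    X_sub = X[:, mask.astype(bool)]           # keep only the selected columns
    acc = cross_val_score(KNeighborsClassifier(n_neighbors=5),
                          X_sub, y, cv=5).mean()
    return acc - 0.01 * mask.mean()           # small penalty for large subsets

# Illustrative data; in the paper this would be the LITNET2020 features and labels.
X = np.random.rand(500, 20)
y = np.random.randint(0, 2, size=500)
mask = np.random.randint(0, 2, size=20)       # one candidate "crow" position
print(fitness(mask, X, y))
```

The swarm algorithm's job is then to move candidate masks toward higher fitness values; only the search strategy (chaotic crow search versus bats, cuckoos, or particles) changes.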
Nearest Neighbor Search - Nearest neighbor search is a fundamental building block for a wide range of applications. A privacy-preserving protocol for nearest neighbor search involves a set of clients who send queries to a remote database. Each client retrieves the nearest neighbor(s) to its query in the database without revealing any information about the query. To ensure database privacy, clients must learn as little as possible beyond the query answer, even if they behave maliciously by deviating from the protocol.
Authored by Sacha Servan-Schreiber, Simon Langowski, Srinivas Devadas
Nearest Neighbor Search - Security CCTV cameras are important for public safety. These cameras record continuously 24/7 and produce a large amount of video data. If the videos are not reviewed immediately after an incident, it can be difficult and time-consuming to find a specific person among many hours of recording. In this work we present a system that can search a video for people who fit a textual description. It uses an image-text multimodal deep learning model to calculate the similarity between an image of a person and a text description and to find the top matches. Normally this would require calculating the text-image similarity scores between one text description and every person in the video, which is O(n) in the number of people in the video and therefore impractical for real-time search. We propose a solution by pre-calculating embeddings of person images and applying approximate nearest neighbor vector search. At inference time, only one forward pass through the deep learning model is needed, so the computational cost is the time to embed a text description, O(1), plus the time to perform an approximate nearest neighbor search, O(log(n)). This makes real-time interactive search possible.
Authored by Sumeth Yuenyong
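Here is a minimal sketch of the retrieval idea described above: person-image embeddings are pre-computed offline, and at query time only the text is embedded and matched against the index. An exact brute-force cosine top-k is shown for clarity; a production system would swap in an approximate nearest-neighbor index (for example FAISS or Annoy), and the embedding function here is a placeholder, not the paper's multimodal model.

```python
import numpy as np

def embed_text(description: str) -> np.ndarray:
    """Placeholder for the text tower of an image-text model (CLIP-style)."""
    rng = np.random.default_rng(abs(hash(description)) % (2**32))
    v = rng.standard_normal(512)
    return v / np.linalg.norm(v)

# Offline: embeddings of every detected person, L2-normalized once.
person_embeddings = np.random.standard_normal((50_000, 512))
person_embeddings /= np.linalg.norm(person_embeddings, axis=1, keepdims=True)

# Online: one forward pass for the text description, then a vector search.
query = embed_text("man in a red jacket carrying a backpack")
scores = person_embeddings @ query          # cosine similarity (vectors normalized)
top5 = np.argsort(-scores)[:5]              # indices of the best-matching persons
print(top5, scores[top5])
```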
Natural Language Processing - Dissemination of fake news is a matter of major concern that can result in national and social damage with devastating impacts. Misleading information on the internet is dubious and hard to identify. Machine learning models are becoming an irreplaceable component in the detection of fake news spreading on social media. LSTM is a memory-based machine learning model for the detection of false news; it is a promising approach that eradicates the vanishing-gradient issue of RNNs. The integration of natural language processing with an LSTM model is considered effective for false news identification.
Authored by Abina Azees, Geevarghese Titus
Natural Language Processing - Rule-based Web vulnerability detection is the most common method, usually based on the analysis of the website code and the feedback from probing the target. In the process, a large amount of contaminated data and network pressure is generated, and the false positive rate is high. This study implements a detection platform based on a crawler and NLP. We first use the crawler to obtain the HTTP requests of the target system and classify the dataset according to whether there are parameters and whether the samples interact with a database. We then convert the text into word vectors, reduce their dimensionality and serialize them, and train the dataset with an NLP algorithm, finally obtaining a model that can accurately predict Web vulnerabilities. Experimental results show that this method can detect Web vulnerabilities efficiently, greatly reduce invalid attack test parameters, and reduce network pressure.
Authored by Xin Ge, Min-Nan Yue
Natural Language Processing - Application code analysis and static rules are the most common methods for Web vulnerability detection, but this process generates a large amount of contaminated data and network pressure, and the false positive rate is high. This study implements a detection system based on a crawler and NLP. The crawler visits pages in imitation of a human, and the HTTP requests and responses are collected as the dataset, which is classified according to parameter characteristics and whether the samples interact with a database. The text is then converted into word vectors, reduced in dimension and serialized, and the dataset is trained with an NLP algorithm to obtain a model that can accurately predict Web vulnerabilities. Experimental results show that this method can detect Web vulnerabilities efficiently, greatly reduce invalid attack test parameters, and reduce network pressure.
Authored by Xin Ge, Minnan Yue
Natural Language Processing - Story Ending Generation (SEG) is a challenging task in natural language generation. Recently, methods based on Pre-trained Language Models (PLM) have achieved great success and can produce fluent and coherent story endings. However, the pre-training objective of PLM-based methods is unable to model the consistency between story context and ending. The goal of this paper is to adopt contrastive learning to generate endings that are more consistent with the story context, but there are two main challenges in contrastive learning for SEG. The first is the negative sampling of wrong endings that are inconsistent with the story context. The second is the adaptation of contrastive learning to SEG. To address these two issues, we propose a novel Contrastive Learning framework for Story Ending Generation (CLSEG), which has two steps: multi-aspect sampling and story-specific contrastive learning. For the first issue, we use novel multi-aspect sampling mechanisms to obtain wrong endings that consider the consistency of order, causality, and sentiment. For the second issue, we design a story-specific contrastive training strategy adapted for SEG. Experiments show that CLSEG outperforms baselines and can produce story endings with stronger consistency and rationality.
Authored by Yuqiang Xie, Yue Hu, Luxi Xing, Yunpeng Li, Wei Peng, Ping Guo
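For orientation, a contrastive objective of this kind typically takes an InfoNCE-style form; the exact CLSEG training loss may differ, so the following is only a generic sketch:

```latex
\mathcal{L}_{\mathrm{CL}}
= -\log
\frac{\exp\!\big(\operatorname{sim}(c, y^{+})/\tau\big)}
     {\exp\!\big(\operatorname{sim}(c, y^{+})/\tau\big)
      + \sum_{y^{-} \in \mathcal{N}} \exp\!\big(\operatorname{sim}(c, y^{-})/\tau\big)}
```

Here c is the story-context representation, y+ the gold ending, N the set of sampled wrong endings (for example those violating order, causality, or sentiment), sim a similarity score, and tau a temperature; minimizing the loss pushes the gold ending closer to the context than the sampled negatives.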
Natural Language Processing - The new capital city (IKN) of the Republic of Indonesia was ratified and inaugurated by President Joko Widodo in January 2022. Unfortunately, many Indonesian citizens still do not understand all the information regarding the designation of the new capital city. Even though the Indonesian Government has created an official website about the new capital city (www.ikn.go.id), the information is still not optimal because visitors to the web pages cannot actively interact with the information they need. Therefore, the development of a chatting robot (chatbot) application is deemed necessary as an interactive component for obtaining the information users need about the new capital city. In this study, a chatbot application was developed by applying Natural Language Processing (NLP), using the Term Frequency-Inverse Document Frequency (TF-IDF) method for term weighting and the cosine-similarity algorithm to calculate the similarity of the questions asked by the user. The research successfully designed and developed a chatbot application using the cosine-similarity algorithm. The testing phase of the chatbot model uses several scenarios related to the points of NLP implementation. The test results show that all question scenarios can be responded to well by the chatbot.
Authored by Harry Achsan, Deni Kurniawan, Diki Purnama, Quintin Barcah, Yuri Astoria
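Here is a minimal sketch of the TF-IDF and cosine-similarity matching step described above, using scikit-learn for brevity; the FAQ pairs are illustrative placeholders and the paper's Indonesian-language preprocessing is omitted.

```python
from sklearn.feature_extraction.text import TfidfVectorizer
from sklearn.metrics.pairwise import cosine_similarity

# Illustrative FAQ pairs about the new capital city (placeholders, not real site content).
questions = ["Where is the new capital city located?",
             "When was the new capital city inaugurated?",
             "What is the official website of the new capital city?"]
answers = ["It is located in East Kalimantan.",
           "It was ratified in January 2022.",
           "The official website is www.ikn.go.id."]

vectorizer = TfidfVectorizer()
q_matrix = vectorizer.fit_transform(questions)         # TF-IDF weights for each FAQ question

def respond(user_question: str) -> str:
    user_vec = vectorizer.transform([user_question])   # weight the user's question
    sims = cosine_similarity(user_vec, q_matrix)[0]    # similarity to every FAQ question
    return answers[sims.argmax()]                      # answer of the closest match

print(respond("when did the new capital get inaugurated"))
```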
Natural Language Processing - In today's digital era, online attacks are increasing in number and becoming more severe every day, especially those related to web applications. The data accessible over the web entices attackers to launch new kinds of attacks. Serious research on web security has shown that the most hazardous attack affecting web security is Structured Query Language Injection (SQLI). This attack poses a genuine threat to web application security, and several research works have been directed at defending against it by detecting it when it happens. Traditional methods like input validation and filtering, use of parameterized queries, etc. are not sufficient to counter these attacks, as they rely solely on the implementation of the code and thus on the developer's skill set; this has given rise to machine learning based solutions. In this study, we propose a novel approach that uses Natural Language Processing (NLP) and BERT for feature extraction, is capable of adapting to SQLI variants, and provides an accuracy of 97\% with a false positive rate of 0.8\% and a false negative rate of 5.8\%.
Authored by Sagar Lakhani, Ashok Yadav, Vrijendra Singh
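Here is a minimal sketch of the feature-extraction idea described above: a query string is encoded with a pre-trained BERT model and the pooled representation is handed to a conventional classifier. The checkpoint, classifier, and tiny training set are illustrative assumptions, not the paper's exact pipeline.

```python
import torch
from transformers import AutoTokenizer, AutoModel
from sklearn.linear_model import LogisticRegression

tokenizer = AutoTokenizer.from_pretrained("bert-base-uncased")
bert = AutoModel.from_pretrained("bert-base-uncased")

def embed(texts):
    """Return the [CLS] embedding of each input string as a feature vector."""
    enc = tokenizer(texts, padding=True, truncation=True, return_tensors="pt")
    with torch.no_grad():
        out = bert(**enc).last_hidden_state[:, 0, :]    # [CLS] token representation
    return out.numpy()

samples = ["SELECT name FROM users WHERE id = 4",
           "' OR '1'='1' --"]                            # benign vs. classic SQLI probe
labels = [0, 1]
clf = LogisticRegression().fit(embed(samples), labels)   # toy training set, illustrative
print(clf.predict(embed(["admin' OR 1=1 --"])))
```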
Natural Language Processing - In today's digital age, businesses create tremendous amounts of data as part of their regular operations. On legacy or cloud platforms, this data is stored mainly in structured, semi-structured, and unstructured formats, and most of the data kept in the cloud is amorphous and contains sensitive information. With the evolution of AI, organizations are using deep learning and natural language processing to extract the meaning of these big data through unstructured data analysis and insights (UDAI). This study investigates the influence of unstructured big data analysis and insights on an organization's decision-making system (DMS), financial sustainability, customer lifetime value (CLV), and long-term growth prospects, while encouraging a culture of self-service analytics. The study uses a validated survey instrument to collect responses from Fortune 500 organizations to determine the adoption and influence of UDAI in current data-driven decision making and how it impacts organizational DMS, financial sustainability, and CLV.
Authored by Bibhu Dash, Swati Swayamsiddha, Azad Ali
Natural Language Processing - Natural language processing (NLP) trains computers to read and understand text and spoken words in the same way that people do. Within NLP, Named Entity Recognition (NER) is a crucial field: it extracts information from given texts and is used for machine translation, text-to-speech synthesis, natural language understanding, and more. Its main goal is to categorize words in a text that represent names into specified tags like location, organization, person name, date, time, and measures. In this paper, the proposed method extracts entities from a Hindi Fraud Call annotated corpus (not publicly available) using XLM-Roberta (base-sized model). To build an accurate NER system for the dataset, the authors use the pre-trained XLM-Roberta as a multi-layer bidirectional transformer encoder to learn deep bidirectional Hindi word representations. The XLM-Roberta model is fine-tuned to extract nine entities from sentences, based on sentence context, to achieve better performance. The annotated Hindi corpus uses a tag set of nine Named Entity (NE) classes, defined as part of the NER Shared Task for South and Southeast Asian Languages (SSEAL) at IJCNLP. Nine entities are recognized from sentences. The obtained F1-score (micro) and F1-score (macro) are 0.96 and 0.80, respectively.
Authored by Aditya Choure, Rahul Adhao, Vinod Pachghare
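Here is a minimal sketch of how a fine-tuning setup like the one described above is typically assembled with the Hugging Face transformers library; the nine-label head and the example sentence are illustrative, and the Hindi fraud-call corpus itself is not public.

```python
from transformers import AutoTokenizer, AutoModelForTokenClassification

NUM_LABELS = 9  # nine NE classes from the SSEAL tag set (per the abstract)

tokenizer = AutoTokenizer.from_pretrained("xlm-roberta-base")
model = AutoModelForTokenClassification.from_pretrained(
    "xlm-roberta-base", num_labels=NUM_LABELS)   # fresh classification head to fine-tune

# Subword tokenization: labels must later be aligned to word_ids() before training.
enc = tokenizer("राम ने दिल्ली से फोन किया", return_tensors="pt")
logits = model(**enc).logits                 # shape: (1, seq_len, NUM_LABELS)
pred = logits.argmax(dim=-1)                 # per-token class ids (untrained here)
print(pred)
```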
Natural Language Processing - The Internet of Things (IoT) is the key technology that connects many devices through the internet, enabling the exchange of data and information and allowing devices to receive instructions and act on them effectively. With the advent of IoT, many devices are connected to the internet, assisting individuals in operating devices virtually, sharing data, and programming required actions. This study focuses on understanding the key determinants of creating smart homes by applying natural language processing (NLP) through IoT. The major determinants considered are integrating voice understanding into devices, the ability to control devices remotely, and support in reducing energy bills.
Authored by Shahanawaj Ahamad, Deepalkumar Shah, R. Udhayakumar, T.S. Rajeswari, Pankaj Khatiwada, Joel Alanya-Beltran
Natural Language Processing - This paper presents a system to identify social engineering attacks using only text as input. The system can be used in different environments where the input is text, such as SMS, chats, emails, etc. It uses Natural Language Processing to extract features from the dialog text, such as URL report and count, spell check, blacklist count, and others. The features are used to train machine learning algorithms (Neural Network, Random Forest, and SVM) to classify social engineering attacks. The classification algorithms showed an accuracy of over 80\% in detecting this type of attack.
Authored by Juan Lopez, Jorge Camargo
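Here is a minimal sketch of hand-crafted dialog features of the kind described above feeding a Random Forest classifier; the feature set, blacklist terms, and labels are illustrative assumptions, not the paper's exact features or data.

```python
import re
import numpy as np
from sklearn.ensemble import RandomForestClassifier

BLACKLIST = {"verify", "urgent", "password", "account locked"}    # illustrative terms

def features(text: str) -> list:
    urls = re.findall(r"https?://\S+", text)
    words = text.split()
    return [len(urls),                                             # URL count
            sum(term in text.lower() for term in BLACKLIST),       # blacklist hits
            len(words),                                            # message length
            sum(w.isupper() for w in words)]                       # all-caps "shouting" words

messages = ["URGENT: verify your account at http://bad.example now",
            "Lunch at noon tomorrow?"]
labels = [1, 0]                                                    # 1 = social engineering
X = np.array([features(m) for m in messages])
clf = RandomForestClassifier(n_estimators=50).fit(X, labels)       # toy training set
print(clf.predict([features("Please verify your password here http://phish.example")]))
```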
Named Data Network Security - This paper studies the design of an English APP security verification framework based on fused IP-address and MAC data features. APP here denotes the client application, including third-party applications on PCs and on mobile terminals, that is, smartphones. At present, Praat has become software commonly used by researchers in experimental phonetics, linguistics, language investigation, language processing, and other related fields; against this background, the English APP is selected as the target. In the framework, each node forms a corresponding topology table according to the neighbor list it detects and the topology information obtained from received TC messages. To meet the challenge of high robustness, both IP and MAC data analyses are considered. Through data collection, processing, and further fusion, the comprehensive system is implemented. The proposed model is tested under different testing scenarios.
Authored by Jinxun Yu, Kai Xia
Named Data Network Security - The Internet of Things (IoT) is becoming an important approach to healthcare monitoring, where critical medical data must be retrieved in a secure and private manner. Nevertheless, IoT devices have constrained resources, so efficient, secure, and private data acquisition is very challenging. Current research on applying the Named Data Networking (NDN) architecture to IoT design shows very promising results, which motivates us to combine NDN and IoT, which we call the NDN-IoT architecture, for a healthcare application. Inspired by this idea, we propose a healthcare monitoring framework that integrates NDN concepts into IoT at the network layer in the Contiki NG OS, which we call µNDN as it is a micro, lightweight implementation. We quantitatively explore the NDN-IoT approach to understand its efficiency for medical data retrieval. Reliability and delay performance were evaluated and analyzed for a remote health application. Our results show that the µNDN architecture performs better than the IP architecture when retrieving medical data, so it is worth exploring the µNDN architecture further.
Authored by Alper Demir, Gokce Manap