Wassim Ayadi
Biclustering sustainable local tourism systems by the Tabu search optimization algorithm
Quality & Quantity, 2025
Abstract
Tourism is now fully acknowledged as a leading industry that boosts a country's economic development. This growing recognition has led researchers and policy makers to focus increasingly on optimally detecting, promoting, and supporting territorial areas with a strong tourist vocation, i.e., Local Tourism Systems. In this work, we propose to apply the biclustering data mining technique to detect Local Tourism Systems. By means of a two-dimensional clustering approach, we pursue the objective of obtaining more in-depth and granular information than conventional clustering algorithms provide. To this end, we formulate the objective as an optimization problem and solve it by means of Tabu search. The obtained results are very promising and outperform those provided by classic clustering approaches.
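For readers who want a concrete picture of the optimization loop, the following is a minimal, self-contained sketch of Tabu-search-driven biclustering. The paper's actual objective function and neighborhood moves are not given in this abstract, so the mean-squared-residue score (Cheng & Church) and the single row/column membership flips used below are assumptions, not the authors' exact method.

```python
import numpy as np

def msr(data, rows, cols):
    """Mean squared residue (Cheng & Church coherence); lower = more coherent."""
    sub = data[np.ix_(rows, cols)]
    resid = sub - sub.mean(1, keepdims=True) - sub.mean(0, keepdims=True) + sub.mean()
    return float((resid ** 2).mean())

def tabu_bicluster(data, n_iter=300, tenure=15, n_moves=40, seed=0):
    rng = np.random.default_rng(seed)
    n_rows, n_cols = data.shape
    rows = rng.random(n_rows) < 0.5           # boolean row-membership mask
    cols = rng.random(n_cols) < 0.5           # boolean column-membership mask

    def score():
        ri, ci = np.where(rows)[0], np.where(cols)[0]
        return np.inf if len(ri) < 2 or len(ci) < 2 else msr(data, ri, ci)

    def flip(kind, idx):                      # toggle one row/column in or out
        arr = rows if kind == 'r' else cols
        arr[idx] = not arr[idx]

    best_rows, best_cols, best = rows.copy(), cols.copy(), score()
    tabu = {}                                 # move -> iteration until which it is tabu
    for it in range(n_iter):
        # Sample a neighborhood of single-flip moves and pick the best admissible one.
        moves = [('r', int(i)) for i in rng.integers(0, n_rows, n_moves // 2)]
        moves += [('c', int(j)) for j in rng.integers(0, n_cols, n_moves // 2)]
        chosen, chosen_score = None, np.inf
        for m in moves:
            flip(*m); s = score(); flip(*m)   # evaluate the flip, then undo it
            # Tabu rule with aspiration: a tabu move is allowed only if it beats the best.
            if (tabu.get(m, -1) < it or s < best) and s < chosen_score:
                chosen, chosen_score = m, s
        if chosen is None:
            continue
        flip(*chosen)
        tabu[chosen] = it + tenure
        if chosen_score < best:
            best_rows, best_cols, best = rows.copy(), cols.copy(), chosen_score
    return np.where(best_rows)[0], np.where(best_cols)[0], best

# Toy check: an implanted coherent block should be recovered with a low residue.
X = np.random.default_rng(1).normal(size=(60, 40))
X[10:25, 5:15] += 3.0
r, c, s = tabu_bicluster(X)
print(len(r), len(c), round(s, 3))
```

The tabu list forbids recently applied flips for a fixed tenure, which is what lets the search escape local optima that would trap a greedy hill-climber.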
Wassim Ayadi
ColBic: A New Biclustering-Based Collaborative Filtering
21st International Conference on Artificial Intelligence Applications and Innovations (AIAI 2025): 381-391, 2025
Abstract
Recommendation systems have become essential for filtering the vast amounts of information available on the Internet. Traditional collaborative filtering methods face challenges such as data sparsity and scalability issues. To address these limitations, we propose ColBic, a novel collaborative filtering approach based on biclustering and Iterative Local Search (ILS). Our method improves recommendation accuracy by grouping users and items into dense biclusters and refining them through iterative optimization. Experimental results on the MovieLens-100K and MovieLens-1M datasets demonstrate that ColBic outperforms traditional collaborative filtering methods in terms of accuracy and coverage.
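As a rough illustration of how biclusters can drive rating prediction, here is a small sketch. ColBic's bicluster discovery and ILS refinement steps are not reproduced; the mean-based prediction rule and all names below are assumptions for illustration only.

```python
import numpy as np

def predict_from_biclusters(R, biclusters, user, item):
    """Predict the missing rating R[user, item] (0 = unobserved) from the
    dense biclusters that contain both the user and the item."""
    estimates = []
    for users, items in biclusters:
        if user in users and item in items:
            col = R[users, item]                 # ratings of the item by the user's local group
            observed = col[col > 0]
            if observed.size:
                estimates.append(observed.mean())
    if estimates:                                # average over all covering biclusters
        return float(np.mean(estimates))
    seen = R[user][R[user] > 0]                  # fallback: the user's own mean rating
    return float(seen.mean()) if seen.size else 0.0

# Toy usage with two hand-made biclusters over a 5x4 rating matrix.
R = np.array([[5, 4, 0, 1],
              [4, 5, 0, 1],
              [5, 5, 4, 0],
              [1, 2, 1, 5],
              [2, 1, 0, 4]])
bics = [(np.array([0, 1, 2]), np.array([0, 1, 2])),
        (np.array([3, 4]), np.array([3]))]
print(predict_from_biclusters(R, bics, user=0, item=2))  # ~4.0 from the first bicluster
```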
Sofian Boutaib, Maha Elarbi, Slim Bechikh, Lamjed Ben Said
Cross-Project Code Smell Detection as a Dynamic Optimization Problem: An Evolutionary Memetic Approach
IEEE Congress on Evolutionary Computation (CEC), 2025
Abstract
Code smells signal poor software design that can hinder maintainability and scalability. Identifying code smells is difficult because of the large volume of code, considerable detection expenses, and the substantial effort needed for manual tagging. Although current techniques perform well in within-project situations, they frequently struggle to adapt to cross-project environments with varying data distributions. In this paper, we introduce CLADES (Cross-project Learning and Adaptation for Detection of Code Smells), a hybrid evolutionary approach consisting of three main modules: Initialization, Evolution, and Adaptation. The first module generates an initial population of decision-tree detectors using labeled within-project data and evaluates their quality through fitness functions based on structural code metrics. The evolution module applies genetic operators (selection, crossover, and mutation) to create new offspring solutions. To handle cross-project scenarios, the adaptation module employs a clustering-based instance selection technique that identifies representative instances from new projects; these are added to the dataset and used to repair the decision trees through simulated annealing. The locally refined decision trees are then evolved with a genetic algorithm, enabling continuous adaptation to new project instances. The resulting optimized decision-tree detectors are then employed to predict labels for the new unlabeled project instances. We assess CLADES on five open-source projects and show that it outperforms baseline techniques in terms of weighted F1-score and AUC-PR. These results emphasize its capacity to adjust effectively to different project environments, enabling precise and scalable detection of code smells while minimizing the need for manual review and contributing to more robust and maintainable software systems.
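The adaptation step is the distinctive part of the pipeline. Below is a minimal sketch of how simulated annealing can repair a decision-tree detector against representative instances from a new project; the hand-rolled tree encoding, the threshold-jitter move, and the error-rate energy are illustrative assumptions, not the exact CLADES operators.

```python
import math, random

# A tiny hand-rolled decision tree: each internal node tests one feature
# against a threshold; leaves carry a class label (0 = clean, 1 = smelly).
def predict(node, x):
    while 'label' not in node:
        node = node['left'] if x[node['feat']] <= node['thr'] else node['right']
    return node['label']

def error(tree, X, y):
    return sum(predict(tree, x) != t for x, t in zip(X, y)) / len(y)

def sa_repair(tree, X_new, y_new, n_steps=300, t0=1.0, alpha=0.97, seed=0):
    """Simulated-annealing repair (a stand-in for CLADES's adaptation step):
    jitter split thresholds so the detector better fits representative
    instances drawn from the new project."""
    rng = random.Random(seed)
    nodes, stack = [], [tree]                 # collect internal nodes by reference
    while stack:
        n = stack.pop()
        if 'label' not in n:
            nodes.append(n)
            stack += [n['left'], n['right']]
    cur, t = error(tree, X_new, y_new), t0
    for _ in range(n_steps):
        node = rng.choice(nodes)
        old = node['thr']
        node['thr'] = old + rng.gauss(0, 0.1 * (abs(old) + 1e-6))   # local move
        new = error(tree, X_new, y_new)
        # Metropolis rule: always accept improvements, sometimes accept worse moves.
        if new <= cur or rng.random() < math.exp((cur - new) / max(t, 1e-9)):
            cur = new
        else:
            node['thr'] = old                                        # reject: undo
        t *= alpha                                                   # cool down
    return tree, cur

# Toy usage: one metric (e.g. method length) whose threshold is mis-calibrated
# for the new project; SA shifts it toward the new decision boundary near 30.
tree = {'feat': 0, 'thr': 10.0, 'left': {'label': 0}, 'right': {'label': 1}}
X_new = [[v] for v in range(0, 60, 3)]
y_new = [0 if v[0] <= 30 else 1 for v in X_new]
repaired, err = sa_repair(tree, X_new, y_new)
print(round(repaired['thr'], 1), err)
```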
Slim Bechikh
Adaptive Normal-Boundary Intersection Directions for Evolutionary Many-Objective Optimization with Complex Pareto Fronts
In International Conference on Evolutionary Multi-Criterion Optimization (pp. 132-147). Singapore: Springer Nature Singapore, 2025
Abstract
Decomposition-based Many-Objective Evolutionary Algorithms (MaOEAs) usually adopt a set of pre-defined, well-distributed weight vectors to guide the solutions towards the Pareto optimal Front (PF). However, when solving Many-objective Optimization Problems (MaOPs) with complex PFs, the effectiveness of MaOEAs with a fixed set of weight vectors may deteriorate, leading to an imbalance between convergence and diversity of the solution set. To address this issue, we propose an Adaptive Normal-Boundary Intersection Directions Decomposition-based Evolutionary Algorithm (ANBID-DEA), which adaptively updates the Normal-Boundary Intersection (NBI) directions used in MP-DEA. In our work, we assist the selection mechanism by progressively adjusting the NBI directions according to the distribution of the population, so as to uniformly cover all parts of complex PFs (i.e., those that are disconnected, strongly convex, degenerate, etc.). ANBID-DEA is compared with five state-of-the-art MaOEAs on a variety of unconstrained benchmark problems with up to 15 objectives. Our results indicate that ANBID-DEA achieves competitive performance on most of the considered MaOPs.
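To make the idea of adapting NBI directions concrete, here is one illustrative update step under assumed details: reference directions that currently guide no solution are respawned on top of normalized objective vectors of the population, which pulls the direction set toward disconnected or degenerate front regions. The paper's actual adaptation rule is not specified in this abstract.

```python
import numpy as np

def adapt_directions(W, F, seed=0):
    """One illustrative adaptation step (not the paper's exact update rule).
    W: (n_dirs, m) unit reference directions; F: (pop, m) objective values
    of a minimization problem."""
    rng = np.random.default_rng(seed)
    ideal, nadir = F.min(axis=0), F.max(axis=0)
    Fn = (F - ideal) / np.maximum(nadir - ideal, 1e-12)          # normalize objectives
    Fn /= np.maximum(np.linalg.norm(Fn, axis=1, keepdims=True), 1e-12)
    assoc = np.argmax(Fn @ W.T, axis=1)       # each solution's closest direction (cosine)
    unused = np.setdiff1d(np.arange(len(W)), np.unique(assoc))
    if len(unused):
        W = W.copy()
        W[unused] = Fn[rng.choice(len(Fn), size=len(unused))]    # respawn unused directions
    return W

# Toy usage: a 2-objective population concentrated in two disconnected regions;
# unused directions migrate onto those two regions.
rng = np.random.default_rng(1)
W = rng.random((10, 2))
W /= np.linalg.norm(W, axis=1, keepdims=True)
F = np.vstack([rng.normal([1.0, 5.0], 0.1, (20, 2)),
               rng.normal([5.0, 1.0], 0.1, (20, 2))])
print(adapt_directions(W, F).round(2))
```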
Said Gattoufi, Nabil Ktifi, Mokhtar Laabidi
Data Envelopment Analysis for Mergers and Acquisitions Transactions: Avenues of Research Toward Efficiency Gains
Data-envelopment-analysis-mergers-acquisitions, 2025
Abstract
The aim of this chapter is to explain, in a simple way and without the complications of mathematical modeling, a set of concepts, to emphasize their interconnections and combinations in the large body of knowledge they have created, and to highlight their theoretical and applied benefits. In today's business world, marked by profound changes and technological challenges, global companies are developing strategies to improve their profitability and efficiency while adapting to geopolitical changes. To strengthen their resilience, they are increasingly turning to restructuring, partnership re-engineering, and mergers and acquisitions (M&A) to consolidate their market position and increase their chances of survival. The first part of this chapter analyzes a large set of research papers related to this topic. Among the approaches and methodologies adopted for analyzing this dynamic, we explain the Data Envelopment Analysis (DEA) methodology and its variants, with emphasis on the assessment of the efficiency gains realized through mergers, acquisitions, takeovers, splits, consolidations, and restructuring. The related literature, referenced in SCOPUS, is analyzed and its features are identified, emphasizing the most influential authors and the topics of their research. Finally, the concluding section synthesizes the interconnections between DEA and its variants as a tool on one side and the restructuring and consolidation dynamics of businesses, mainly M&A, on the other. Several topics are suggested to widen this body of knowledge, boost its impact on goods and services industries, and improve understanding of production processes in a variety of sectors.
Hamida Labidi, Abir Chaabani, Nadia Ben Azzouna
Hybrid Genetic Algorithm for Solving an Online Vehicle Routing Problem with Time Windows and Heterogeneous Fleet
Hybrid Intelligent Systems (HIS 2023), 2025
Abstract
The Vehicle Routing Problem (VRP) is a well-known optimization problem that traditionally aims to minimize transportation costs while satisfying customer demands. In practice, most logistics companies operate a heterogeneous fleet with varying capacities and costs, giving rise to a more complex variant known as the Rich VRP (RVRP). In this paper, we present a mathematical formulation of the RVRP that considers both hard time windows and dynamically arriving requests, so as to stay as close as possible to real-life logistics scenarios. To solve this challenging problem, we propose a Hybrid Genetic Algorithm (HGA). The experimental study shows that our proposal outperforms other algorithms on the same benchmark problems. Additionally, we conduct a sensitivity analysis to illustrate how resilient the algorithm is when problem parameters are altered.
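A minimal sketch of the GA ingredients for such a problem is given below: a permutation chromosome decoded by a greedy route split under capacity and hard time-window checks, a cost function that assigns each route the cheapest feasible vehicle type, and order crossover. The data, the constant travel time, and the decoding heuristic are toy assumptions, and the local-search component that makes the algorithm a hybrid is omitted for brevity.

```python
import random

# Customers: id -> (demand, ready, due, service); toy instance.
CUSTOMERS = {1: (4, 0, 40, 2), 2: (6, 5, 50, 2), 3: (5, 0, 30, 2),
             4: (7, 10, 60, 2), 5: (3, 0, 45, 2)}
TRAVEL = 5                      # constant travel time between any two stops (toy metric)
FLEET = [(10, 1.0), (15, 1.6)]  # heterogeneous fleet: (capacity, cost per time unit)
MAX_CAP = max(cap for cap, _ in FLEET)

def split_routes(perm):
    """Greedy decode: start a new route when capacity or a hard time window breaks."""
    routes, route, load, time = [], [], 0, 0
    for c in perm:
        demand, ready, due, service = CUSTOMERS[c]
        arrival = max(time + TRAVEL, ready)
        if route and (load + demand > MAX_CAP or arrival > due):
            routes.append(route)
            route, load, time = [], 0, 0
            arrival = max(TRAVEL, ready)
        route.append(c)
        load += demand
        time = arrival + service
    if route:
        routes.append(route)
    return routes

def route_cost(route):
    load = sum(CUSTOMERS[c][0] for c in route)
    duration = TRAVEL * (len(route) + 1) + sum(CUSTOMERS[c][3] for c in route)
    rate = min(cost for cap, cost in FLEET if cap >= load)  # cheapest feasible vehicle
    return duration * rate

def fitness(perm):
    return sum(route_cost(r) for r in split_routes(perm))

def order_crossover(p1, p2, rng):
    """OX: copy a slice from p1, fill the remaining positions in p2's order."""
    a, b = sorted(rng.sample(range(len(p1)), 2))
    child = [None] * len(p1)
    child[a:b] = p1[a:b]
    rest = [c for c in p2 if c not in child]
    for i in range(len(child)):
        if child[i] is None:
            child[i] = rest.pop(0)
    return child

rng = random.Random(0)
pop = [rng.sample(list(CUSTOMERS), len(CUSTOMERS)) for _ in range(20)]
for _ in range(50):                       # plain elitist generational loop
    pop.sort(key=fitness)
    elite = pop[:10]
    pop = elite + [order_crossover(rng.choice(elite), rng.choice(elite), rng)
                   for _ in range(10)]
best = min(pop, key=fitness)
print(best, round(fitness(best), 1))
```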
Hajer Alaya, Lilia Rejeb, Lamjed Ben Said
Explainable AI in automatic sleep scoring: A review
International Conference on Intelligence in Business and Industry 2025 (IBI'25), April 24-25, 2025
Abstract
The application of Artificial Intelligence (AI) in automatic sleep scoring presents significant opportunities for enhancing sleep analysis and diagnosing sleep disorders. However, a major challenge lies in the lack of transparency in AI-driven decision-making, which can hinder trust and comprehension among sleep researchers and clinicians. Explainable Artificial Intelligence (XAI) has emerged as a key approach to address these concerns by providing insights into AI model predictions and improving interpretability. This review examines the role and effectiveness of explainability and interpretability in automatic sleep scoring, analyzing key challenges, the impact of various methodologies, and commonly used algorithms. Based on a comprehensive analysis of 100 recent studies, we bridge the gap between computer-readable data encodings and human-understandable information, enhancing model explainability and transparency. Ultimately, this review underscores the vital role of explainability in refining sleep evaluation and decision-making, emphasizing the necessity of further research to address existing challenges and maximize its potential.
Ali Abdelghafour Bejaoui, Meriam Jemel, Nadia Ben Azzouna
Explainable AI Planning: Literature Review
2025
Abstract
Automated planning systems have become indispensable tools in a wide range of applications, from robotics and healthcare to logistics and autonomous systems. However, as these systems grow in complexity, their decision-making processes often become opaque. Explainable AI Planning (XAIP) is a pivotal research area focused on enhancing the transparency, interpretability, and trustworthiness of automated planning systems. This paper provides a comprehensive review of XAIP, emphasizing key techniques for plan explanation, such as contrastive explanations, hierarchical decomposition, and argumentative reasoning frameworks. We explore the critical role of argumentation in justifying planning decisions and address the challenges of replanning in dynamic and uncertain environments, particularly in high-stakes domains like healthcare, autonomous systems, and logistics. Additionally, we discuss the ethical and practical implications of deploying XAIP, highlighting the importance of human-AI collaboration, regulatory compliance, and uncertainty handling. By examining these aspects, this paper aims to provide a detailed understanding of how XAIP can improve the transparency, interpretability, and usability of AI planning systems across various domains.
Hana Mechria
Mammogram image denoising based on deep convolutional neural network
Impact Factor 2024: 3.6, 2025
Abstract
Mammogram images are subject to various types of noise, which hampers image analysis and diagnosis. Mammogram image denoising is therefore important to improve image quality and to make segmentation and classification results more accurate. In this work, we propose a Deep Convolutional Neural Network (DCNN) to denoise mammogram images and improve their quality by handling Gaussian, Speckle, Poisson, and Salt-and-Pepper noise. The main objective of this study is to remove different types of noise from mammogram images while maximizing the information content of the enhanced images. We first add noise models to mammogram images and then enhance the images by removing the noise with the DCNN. Furthermore, we compare our results with state-of-the-art denoising methods, namely the Adaptive Median, Wiener, Gaussian, Median, and Mean filters. Three datasets have been used: the Digital Database for Screening Mammography (DDSM), the mini-Mammographic Image Analysis Society database (mini-MIAS), and a local Tunisian dataset. The experimental results show that the DCNN has better denoising performance than the other methods, with an average PSNR range of 46.0-51.83 dB and an average SSIM range of 0.988-0.9983, which suggests its adaptability to different noise models.
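For illustration, a compact DnCNN-style residual denoiser of the kind the abstract describes can be sketched as follows in PyTorch. The paper's actual architecture, depth, and training setup are not specified here, so every hyperparameter below is an assumption, and the random tensors stand in for mammogram patches.

```python
import torch
import torch.nn as nn

class SmallDnCNN(nn.Module):
    """A compact DnCNN-style residual denoiser (an illustrative stand-in for
    the paper's DCNN): the network predicts the noise map and subtracts it."""
    def __init__(self, channels=1, features=32, depth=5):
        super().__init__()
        layers = [nn.Conv2d(channels, features, 3, padding=1), nn.ReLU(inplace=True)]
        for _ in range(depth - 2):
            layers += [nn.Conv2d(features, features, 3, padding=1),
                       nn.BatchNorm2d(features), nn.ReLU(inplace=True)]
        layers.append(nn.Conv2d(features, channels, 3, padding=1))
        self.body = nn.Sequential(*layers)

    def forward(self, x):
        return x - self.body(x)          # residual learning: estimate and remove noise

# Training sketch: corrupt clean patches with Gaussian noise and regress the
# denoised output against the clean target.
model = SmallDnCNN()
opt = torch.optim.Adam(model.parameters(), lr=1e-3)
loss_fn = nn.MSELoss()
clean = torch.rand(8, 1, 64, 64)                  # stand-in for mammogram patches
noisy = clean + 0.1 * torch.randn_like(clean)     # Gaussian noise model
for _ in range(5):                                # a few toy steps
    opt.zero_grad()
    loss = loss_fn(model(noisy), clean)
    loss.backward()
    opt.step()
print(float(loss))
```

Residual learning (predicting the noise rather than the clean image) is the standard DnCNN design choice; whether the paper's DCNN uses it is not stated in the abstract.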
Boutheina Jlifi, Syrine Ferjani
A Genetic Algorithm based Three HyperParameter optimization of Deep Long Short Term Memory (GA3P-DLSTM) for Predicting Electric Vehicles energy consumption
Computers and Electrical Engineering, 123, 110185, 2025
Abstract
To mitigate climate change, countries are turning to greener transportation systems. The use of Electric Vehicles (EVs) is therefore growing substantially, since EVs present multiple advantages such as reducing hazardous emissions. As demand for EVs increases, more charging stations need to be available: by 2030, 15 million EVs are expected to be on the road, and since the number of charging stations is limited, charging needs must be estimated for better management of the charging infrastructure. In this research, we tackle this problem by efficiently predicting the energy consumption of EVs. We propose a Genetic Algorithm (GA) based Three HyperParameter optimization of Deep Long Short Term Memory (GA3P-DLSTM), an optimized LSTM model that incorporates a GA for hyperparameter tuning. After evaluating our methodology and performing a comparative analysis with previous studies from the literature, the obtained results showed the efficiency of our model, with a Mean Squared Error (MSE) of 0.000112 and a determination coefficient (R) of 0.96470. It outperformed other models from the literature for predicting energy use from real-world data collected on the Georgia Tech campus in Atlanta, USA.
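As an illustration of the GA side of such a pipeline, here is a minimal sketch of a genetic algorithm searching over three LSTM hyperparameters. Since the paper's exact encoding and fitness are not given in this listing, the `evaluate` function below is a placeholder with a known optimum, standing in for "train the DLSTM with these settings and return the validation MSE"; the hyperparameter names and bounds are assumptions.

```python
import random

# Search space for three assumed DLSTM hyperparameters.
BOUNDS = {'units': (16, 256), 'lr': (1e-4, 1e-1), 'window': (4, 48)}

def random_individual(rng):
    return {'units': rng.randint(*BOUNDS['units']),
            'lr': rng.uniform(*BOUNDS['lr']),
            'window': rng.randint(*BOUNDS['window'])}

def evaluate(ind):
    # Placeholder fitness: a smooth function with a known optimum, standing in
    # for "train DLSTM(units, lr, window) and return the validation MSE".
    return (((ind['units'] - 128) / 128) ** 2
            + (ind['lr'] - 0.01) ** 2 * 100
            + ((ind['window'] - 24) / 24) ** 2)

def crossover(a, b, rng):
    # Uniform crossover: each gene comes from either parent.
    return {k: a[k] if rng.random() < 0.5 else b[k] for k in a}

def mutate(ind, rng, p=0.3):
    # Per-gene reset mutation within the gene's bounds.
    out = dict(ind)
    if rng.random() < p:
        out['units'] = rng.randint(*BOUNDS['units'])
    if rng.random() < p:
        out['lr'] = rng.uniform(*BOUNDS['lr'])
    if rng.random() < p:
        out['window'] = rng.randint(*BOUNDS['window'])
    return out

rng = random.Random(42)
pop = [random_individual(rng) for _ in range(20)]
for gen in range(30):
    pop.sort(key=evaluate)                         # lower MSE = fitter
    parents = pop[:5]                              # truncation selection
    pop = parents + [mutate(crossover(rng.choice(parents), rng.choice(parents), rng), rng)
                     for _ in range(15)]
best = min(pop, key=evaluate)
print(best, round(evaluate(best), 5))
```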