No description is available for this research area.
Description
Publications
2022
Rihab Said, Maha Elarbi, Slim Bechikh, Carlos Artemio Coello Coello, Lamjed Ben Said
Discretization-based feature selection as a bilevel optimization problem
IEEE Transactions on Evolutionary Computation, 27(4), 893-907, 2022
Abstract
Discretization-based feature selection (DBFS) approaches have shown interesting results with several metaheuristic algorithms, such as particle swarm optimization (PSO), genetic algorithms (GA), and ant colony optimization (ACO). However, these methods share the same shortcoming: they encode the problem solution as a sequence of cut-points, and the decision to select or delete any feature is induced from this cut-point vector. Since the number of generated cut-points varies from one feature to another, the more cut-points a feature has, the higher the probability of selecting it, and vice versa. This leads to the deletion of possibly important features having a single or a low number of cut-points, such as the infection rate, the glycemia level, and the blood pressure. To remove this dependency between a feature's selection (or removal) and the number of its potential cut-points, we propose to model the DBFS task as a bilevel optimization problem and to solve it using an improved version of an existing co-evolutionary algorithm, named I-CEMBA. The latter varies the number of features during the migration process in order to deal with the multimodality aspect. The resulting algorithm, termed bilevel discretization-based feature selection (Bi-DFS), performs selection at the upper level while discretization is done at the lower level. Experimental results on several high-dimensional datasets show that Bi-DFS outperforms relevant state-of-the-art methods in terms of classification accuracy, generalization ability, and feature selection bias.
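The bilevel structure described above can be illustrated with a minimal sketch (a toy stand-in, not the Bi-DFS implementation): the upper level picks a feature subset, and each subset is scored only after a lower-level search chooses cut-points for its selected features. The dataset, the one-cut-per-feature discretization, and the majority-vote rule are all illustrative assumptions.

```python
import random

random.seed(0)

# Hypothetical toy data: feature 0 is informative, features 1-2 are noise.
X, y = [], []
for i in range(60):
    label = i % 2
    X.append([random.gauss(1.0 if label else -1.0, 0.5),   # informative
              random.random(), random.random()])            # noise
    y.append(label)

def lower_level(mask, iters=50):
    """Lower level: for a fixed feature mask, search one cut-point per
    selected feature, maximizing accuracy of a majority-vote rule."""
    feats = [j for j, bit in enumerate(mask) if bit]
    best_acc, best_cuts = -1.0, []
    for _ in range(iters):
        cuts = [random.uniform(-2, 2) for _ in feats]
        preds = []
        for row in X:
            votes = sum(row[j] > c for j, c in zip(feats, cuts))
            preds.append(1 if 2 * votes >= len(feats) else 0)
        acc = sum(p == t for p, t in zip(preds, y)) / len(y)
        if acc > best_acc:
            best_acc, best_cuts = acc, cuts
    return best_acc, best_cuts

def upper_level():
    """Upper level: each candidate subset is evaluated only after its
    lower-level discretization problem is solved -- the bilevel nesting."""
    best = (-1.0, None, None)
    for m in range(1, 8):                       # all non-empty masks over 3 features
        mask = [(m >> j) & 1 for j in range(3)]
        acc, _ = lower_level(mask)
        score = acc - 0.01 * sum(mask)          # mild parsimony pressure
        if score > best[0]:
            best = (score, mask, acc)
    return best

score, mask, acc = upper_level()
```

Note how the feature-selection decision here is independent of how many cut-points a feature could generate, which is the dependency issue the paper sets out to remove.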
Rihab Said, Maha Elarbi, Slim Bechikh, Carlos Artemio Coello Coello
Cost-sensitive classification tree induction as a bi-level optimization problem
In Proceedings of the Genetic and Evolutionary Computation Conference Companion (pp. 284-287), 2022
Abstract
Data imbalance remains a challenging issue in data classification. In the literature, the cost-sensitive approach has been used to deal with this challenge. Despite its interesting results, the manual design of cost matrices is still its main shortcoming: the data engineer faces great difficulty in defining misclassification costs, especially in the absence of domain-specific knowledge. Recent works suggest genetic programming as an effective tool to design classification trees with automatically learned costs. Although promising results were obtained, evaluating a classification tree with a single cost matrix is not a wise choice; a precise and fair evaluation of tree quality requires trying several misclassification cost matrices. Motivated by this observation, we propose in this paper a bi-level modeling of the cost-sensitive classification tree induction problem in which the upper level evolves the classification trees, while the cost matrix of each tree is optimized at the lower level. Our bi-level model is solved using an existing co-evolutionary algorithm, and the resulting method is named Bi-COS. Comparative experimental results on several imbalanced benchmark datasets show the merits of Bi-COS with respect to the state of the art.
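A minimal sketch of this bi-level idea, under illustrative assumptions (a fixed scoring model instead of an evolved classification tree, random search instead of a co-evolutionary algorithm): the upper level proposes a model, and the lower level searches the cost-matrix space, since each cost ratio fixes a different decision threshold. None of the names or data below come from the paper.

```python
import math
import random

random.seed(1)

# Hypothetical imbalanced data: 90 negatives vs 10 positives.
X = [random.gauss(0, 1) for _ in range(90)] + [random.gauss(2, 1) for _ in range(10)]
y = [0] * 90 + [1] * 10

def gmean(preds):
    """Geometric mean of sensitivity and specificity, a common
    imbalance-aware quality measure."""
    tp = sum(1 for p, lab in zip(preds, y) if p and lab == 1)
    tn = sum(1 for p, lab in zip(preds, y) if not p and lab == 0)
    return math.sqrt((tp / 10) * (tn / 90))

def lower_level(w, iters=30):
    """Lower level: for a fixed scoring model p(x) = sigmoid(w * (x - 1)),
    try several cost matrices; costs (c_fp, c_fn) set the decision threshold
    p > c_fp / (c_fp + c_fn), so each matrix is a different operating point."""
    best = 0.0
    for _ in range(iters):
        c_fp, c_fn = random.uniform(1, 10), random.uniform(1, 10)
        thr = c_fp / (c_fp + c_fn)
        preds = [1.0 / (1.0 + math.exp(-w * (x - 1))) > thr for x in X]
        best = max(best, gmean(preds))
    return best

def upper_level():
    """Upper level: propose candidate models; score each one by the best
    cost matrix its lower-level search finds (multi-matrix evaluation)."""
    return max(((lower_level(w), w) for w in [0.5, 1, 2, 4]), key=lambda t: t[0])

best_g, best_w = upper_level()
```

The point the sketch makes is the paper's motivating observation: judging each model under a single, hand-fixed cost matrix would be arbitrary, whereas optimizing the matrix per model makes the comparison fair.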
Rihab Said, Maha Elarbi, Slim Bechikh, Lamjed Ben Said
Solving combinatorial bi-level optimization problems using multiple populations and migration schemes
Operational Research, 22(3), 1697-1735, 2022
Abstract
In many decision-making cases, there is a hierarchical relation between different optimization tasks. For instance, in production scheduling, evaluating the assignment of tasks to a machine requires determining their optimal sequencing on that machine. Such a situation is usually modeled as a Bi-Level Optimization Problem (BLOP), which consists in optimizing an upper-level (leader) task while having a lower-level (follower) optimization task as a constraint. In this way, evaluating any upper-level solution requires finding its corresponding lower-level (near-)optimal solution, which makes BLOP resolution very computationally costly. Evolutionary Algorithms (EAs) have proven their strength in solving BLOPs due to their insensitivity to mathematical features of the objective functions such as non-linearity, non-differentiability, and high dimensionality. Moreover, EAs based on approximation techniques have proven their strength in solving BLOPs. Nevertheless, their application has been restricted to the continuous case, as most approaches approximate the lower-level optimum using classical mathematical programming and machine learning techniques. Motivated by this observation, we tackle in this paper the discrete case by proposing a Co-Evolutionary Migration-Based Algorithm, called CEMBA, that uses two populations in each level and a migration scheme, with the aim of considerably reducing the number of Function Evaluations (FEs) while ensuring good convergence towards the global optimum of the upper level. CEMBA has been validated on a set of bi-level combinatorial production-distribution planning benchmark instances. Statistical analysis of the obtained results shows the effectiveness and efficiency of CEMBA compared to existing state-of-the-art combinatorial bi-level EAs.
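The production-scheduling example from the abstract can be sketched as a tiny BLOP (illustrative data, not one of the paper's benchmark instances): the leader assigns jobs to machines, and the follower sequences each machine's jobs to minimize total weighted completion time, for which sorting by processing-time/weight ratio (Smith's rule) is exact on a single machine.

```python
from itertools import product

# Hypothetical jobs: processing times and weights.
proc = [3, 5, 2, 7]
weight = [2, 1, 4, 1]

def follower(jobs):
    """Lower level: sequence one machine's jobs to minimize total weighted
    completion time; sorting by proc/weight (Smith's rule) is optimal here."""
    t, cost = 0, 0
    for j in sorted(jobs, key=lambda j: proc[j] / weight[j]):
        t += proc[j]
        cost += weight[j] * t
    return cost

def leader():
    """Upper level: enumerate all job-to-machine assignments; each one is
    evaluated only after both lower-level sequencing problems are solved,
    which is exactly the nesting that makes BLOPs computationally costly."""
    best = (float("inf"), None)
    for mask in product([0, 1], repeat=len(proc)):
        cost = (follower([j for j, m in enumerate(mask) if m == 0])
                + follower([j for j, m in enumerate(mask) if m == 1]))
        best = min(best, (cost, mask))
    return best

best_cost, best_mask = leader()
```

With 4 jobs the leader already triggers 2^4 lower-level solves; in realistic instances the lower level is itself a search, which is the FE explosion that CEMBA's co-evolving populations and migration scheme aim to contain.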
2017
Chedi Abdelkarim, Lilia Rejeb, Lamjed Ben Said, Maha Elarbi
Evidential learning classifier system
In Proceedings of the Genetic and Evolutionary Computation Conference Companion (GECCO '17) (pp. 123-124), https://doi.org/10.1145/3067695.3075997, 2017
Abstract
Over the last decades, Learning Classifier Systems (LCSs) have seen many advances highlighting their potential to solve complex problems. Despite the advantages offered by these algorithms, it is important to tackle other aspects, such as uncertainty, to improve their performance. In this paper, we present a new LCS that deals with uncertainty, in particular imprecision, in class selection. Our idea is to integrate belief function theory into the sUpervised Classifier System (UCS) for classification purposes. The new approach proved effective on several classification problems.
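The belief-function machinery this paper builds on can be illustrated with Dempster's rule of combination, which fuses two pieces of uncertain evidence about a class while redistributing the conflicting mass (a standard textbook example over a two-class frame, not the paper's UCS integration):

```python
def dempster(m1, m2):
    """Dempster's rule of combination over the frame {'a', 'b'}, with 'ab'
    standing for the ignorance set {a, b}; conflicting mass is normalized out."""
    meet = {('a', 'a'): 'a', ('a', 'ab'): 'a', ('ab', 'a'): 'a',
            ('b', 'b'): 'b', ('b', 'ab'): 'b', ('ab', 'b'): 'b',
            ('ab', 'ab'): 'ab', ('a', 'b'): None, ('b', 'a'): None}
    fused = {'a': 0.0, 'b': 0.0, 'ab': 0.0}
    conflict = 0.0
    for s1 in m1:
        for s2 in m2:
            target = meet[(s1, s2)]
            if target is None:
                conflict += m1[s1] * m2[s2]      # empty intersection
            else:
                fused[target] += m1[s1] * m2[s2]
    return {k: v / (1.0 - conflict) for k, v in fused.items()}

# Two rules giving uncertain, partially conflicting evidence about the class.
m1 = {'a': 0.6, 'b': 0.1, 'ab': 0.3}
m2 = {'a': 0.5, 'b': 0.2, 'ab': 0.3}
fused = dempster(m1, m2)   # class 'a' gains support: 0.63 / 0.83 ≈ 0.759
```

Unlike a probability, a mass function can commit belief to the whole frame ('ab'), which is how this formalism represents the imprecision in class selection that the paper targets.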
BibTeX
@article{said2022discretization,
title = {Discretization-based feature selection as a bilevel optimization problem},
author = {Said, Rihab and Elarbi, Maha and Bechikh, Slim and Coello, Carlos Artemio Coello and Said, Lamjed Ben},
journal = {IEEE Transactions on Evolutionary Computation},
volume = {27},
number = {4},
pages = {893--907},
year = {2022},
publisher = {IEEE}
}
BibTeX
@inproceedings{said2022cost,
title = {Cost-sensitive classification tree induction as a bi-level optimization problem},
author = {Said, Rihab and Elarbi, Maha and Bechikh, Slim and Coello, Carlos A Coello and Said, Lamjed Ben},
booktitle = {Proceedings of the Genetic and Evolutionary Computation Conference Companion},
pages = {284--287},
year = {2022}
}
BibTeX
@article{said2022solving,
title = {Solving combinatorial bi-level optimization problems using multiple populations and migration schemes},
author = {Said, Rihab and Elarbi, Maha and Bechikh, Slim and Ben Said, Lamjed},
journal = {Operational Research},
volume = {22},
number = {3},
pages = {1697--1735},
year = {2022},
publisher = {Springer}
}
BibTeX
@inproceedings{10.1145/3067695.3075997,
author = {Abdelkarim, Chedi and Rejeb, Lilia and Said, Lamjed Ben and Elarbi, Maha},
title = {Evidential learning classifier system},
year = {2017},
isbn = {9781450349390},
publisher = {Association for Computing Machinery},
address = {New York, NY, USA},
url = {https://doi.org/10.1145/3067695.3075997},
doi = {10.1145/3067695.3075997},
abstract = {During the last decades, Learning Classifier Systems have known many advancements that were highlighting their potential to resolve complex problems. Despite the advantages offered by these algorithms, it is important to tackle other aspects such as the uncertainty to improve their performance. In this paper, we present a new Learning Classifier System (LCS) that deals with uncertainty in the class selection in particular imprecision. Our idea is to integrate the Belief function theory in the sUpervised Classifier System (UCS) for classification purpose. The new approach proved to be efficient to resolve several classification problems.},
booktitle = {Proceedings of the Genetic and Evolutionary Computation Conference Companion},
pages = {123--124},
numpages = {2},
keywords = {uncertainty, machine learning, learning classifier systems, classification, belief function theory},
location = {Berlin, Germany},
series = {GECCO ’17}
}