I have been a professor of mathematics (Ph.D.) for 35 years at Ibn Tofail University, Faculty of Sciences, Department of Mathematics, Kénitra, Morocco. I have original publications in mathematics and I am a member of the American Mathematical Society. I have reviewed 100 items for Mathematical Reviews, and in 2016 I was named one of the scientific leaders in mathematics of the world by the International Biographical Centre, Cambridge, England.
We establish some new common fixed point theorems for single-valued and multivalued mappings operating between complete ordered locally convex spaces under weaker assumptions. As an application, we prove a new minimax theorem on the existence of a solution of a game.
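For orientation only, here is the classical template that minimax theorems of this kind refine; it is not the precise statement proved in the paper. Under suitable hypotheses on the strategy sets $X$, $Y$ and on the payoff $f$, one obtains

\[
\min_{x \in X} \max_{y \in Y} f(x,y) \;=\; \max_{y \in Y} \min_{x \in X} f(x,y),
\]

so that the game admits a saddle point, i.e. a pair $(\bar{x},\bar{y})$ with $f(\bar{x},y) \le f(\bar{x},\bar{y}) \le f(x,\bar{y})$ for all $x \in X$ and $y \in Y$.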
We study stability properties of a class of well-posed optimization problems under suitable variational convergences. Moreover, we investigate stability for uniformly well-posed optimization problems. We give applications to mathematical programming.
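As background (a standard notion, not specific to the class of problems treated in the paper), a minimization problem $\min_{x \in X} f(x)$ is Tykhonov well-posed when it has a unique solution to which every minimizing sequence converges:

\[
\exists!\,\bar{x} \in X:\; f(\bar{x}) = \inf_X f, \qquad f(x_n) \to \inf_X f \;\Longrightarrow\; x_n \to \bar{x}.
\]

Stability under a variational convergence then asks whether well-posedness, and the solutions themselves, survive when $f$ is replaced by a sequence of perturbed objectives converging to it in that sense.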
International Journal of Online and Biomedical Engineering (iJOE)
This manuscript presents a simulation-based comparison of classical statistical methods and machine learning algorithms for time series forecasting, notably the ARIMA model, K-Nearest Neighbors (KNN), Support Vector Regression (SVR), and Long Short-Term Memory (LSTM). The performance of the models was evaluated using several metrics, in particular Mean Squared Error (MSE), Mean Absolute Error (MAE), Median Absolute Error (Median AE), and Root Mean Squared Error (RMSE). The simulation results show that the KNN algorithm achieves better forecasting accuracy than the other models, notably over the medium and long terms: the MAPE of the KNN model was about 4.976843, while the SVR and LSTM architectures had MAPEs of 6.810311 and 13.992133, respectively. In the medium and long term, the machine learning models are powerful on big datasets; paradoxically, the machine learning architectures also outperform ARIMA for shorter-term predictions. Thus, ARIMA is most appropriate in the case of small univariate datasets, where...
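A minimal sketch of this kind of comparison is given below. The data, ARIMA order, lag window and hyperparameters are hypothetical, the LSTM is omitted for brevity, and the code is not the authors' pipeline; it only illustrates how such models can be fitted and scored with the metrics named in the abstract.

```python
import numpy as np
from sklearn.neighbors import KNeighborsRegressor
from sklearn.svm import SVR
from sklearn.metrics import (mean_absolute_error, mean_squared_error,
                             mean_absolute_percentage_error)
from statsmodels.tsa.arima.model import ARIMA

rng = np.random.default_rng(0)
# Hypothetical univariate series: trend + seasonality + noise.
t = np.arange(300, dtype=float)
y = 100 + 0.05 * t + 10 * np.sin(2 * np.pi * t / 12) + rng.normal(0, 1, t.size)
train, test = y[:250], y[250:]

def make_lagged(series, n_lags=12):
    """Build a supervised (X, y) dataset from lagged values."""
    X = np.array([series[i - n_lags:i] for i in range(n_lags, len(series))])
    return X, series[n_lags:]

# ARIMA: fit on the training window, forecast the whole test horizon.
arima_fit = ARIMA(train, order=(2, 1, 2)).fit()
preds = {"ARIMA": np.asarray(arima_fit.forecast(steps=len(test)))}

# KNN and SVR: recursive one-step forecasts from lagged features.
X_tr, y_tr = make_lagged(train)
for name, model in [("KNN", KNeighborsRegressor(n_neighbors=5)),
                    ("SVR", SVR(C=10.0, epsilon=0.1))]:
    model.fit(X_tr, y_tr)
    history, out = list(train[-12:]), []
    for _ in range(len(test)):
        yhat = model.predict(np.array(history[-12:]).reshape(1, -1))[0]
        out.append(yhat)
        history.append(yhat)
    preds[name] = np.array(out)

for name, p in preds.items():
    print(f"{name:5s}  MAE={mean_absolute_error(test, p):.3f}  "
          f"RMSE={np.sqrt(mean_squared_error(test, p)):.3f}  "
          f"MAPE={100 * mean_absolute_percentage_error(test, p):.2f}%")
```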
2020 International Conference on Intelligent Systems and Computer Vision (ISCV), 2020
Economic and social change in the digital era is making the insurance ecosystem more complex: it makes several agents interact for different purposes. A reflection on modelling the insurance ecosystem through multi-agent simulation is therefore interesting, since it allows us, on the one hand, to capture the complexity of today's insured and, on the other hand, to measure the solvency of insurance and reinsurance companies (EAR) in order to ensure the viability of the ecosystem. This research thus aims at modelling the Moroccan insurance ecosystem within the framework of a new risk-based solvency directive, for the case of loan death cover, on the basis of a set of exogenous and endogenous factors that influence the solvency of insurance and reinsurance companies in Morocco. The paper is based on a model, developed with the NetLogo software, consisting of four agents that interact in the case of the “borrower’s death” guarantee, namely: the insured, the EAR, the banks, and the Supervisory Authority of Insurance and Social Welfare (ACAPS). Each agent has a set of characteristics and pursues a defined objective. The modelling carried out allows testing the impact of endogenous and exogenous variables on the solvency of the EAR through simulations of three scenarios (central, rainy and risky).
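The paper's model is built in NetLogo; purely to illustrate the agent-based idea (not a reproduction of the authors' model), here is a minimal Python sketch with hypothetical agents, parameters and solvency rule, where the three scenarios are rendered simply as a mortality shock.

```python
import random

random.seed(1)

class Insured:
    """Borrower covered by a death guarantee on an outstanding loan."""
    def __init__(self):
        self.outstanding_loan = random.uniform(50_000, 500_000)
        self.death_prob = random.uniform(0.001, 0.01)   # annual, hypothetical
        self.alive = True

class Insurer:
    """Insurance/reinsurance company (EAR) collecting premiums and paying claims."""
    def __init__(self, own_funds, premium_rate, scr_ratio):
        self.own_funds = own_funds
        self.premium_rate = premium_rate   # premium as a share of the loan
        self.scr_ratio = scr_ratio         # hypothetical solvency capital rule

    def step(self, portfolio, shock=0.0):
        premiums = sum(p.outstanding_loan * self.premium_rate
                       for p in portfolio if p.alive)
        claims = 0.0
        for p in portfolio:
            if p.alive and random.random() < p.death_prob * (1 + shock):
                claims += p.outstanding_loan
                p.alive = False
        self.own_funds += premiums - claims
        scr = self.scr_ratio * sum(p.outstanding_loan for p in portfolio if p.alive)
        return self.own_funds / scr if scr > 0 else float("inf")

# Three scenarios loosely mirroring "central / rainy / risky".
for label, shock in [("central", 0.0), ("rainy", 0.5), ("risky", 1.5)]:
    portfolio = [Insured() for _ in range(2_000)]
    ear = Insurer(own_funds=5_000_000, premium_rate=0.004, scr_ratio=0.01)
    coverage = [ear.step(portfolio, shock) for _ in range(10)]  # 10 periods
    print(f"{label:8s} final solvency coverage ratio = {coverage[-1]:.2f}")
```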
2020 Fourth International Conference On Intelligent Computing in Data Sciences (ICDS), 2020
The Islamic finance sector has experienced significant development in several countries of the world and has shown resilience in the face of various crises, including the financial crisis of 2008. According to experts, this sector has the potential to grow significantly in the coming years. Takaful insurance plays a key role in this sector due to its dual economic and social vocation: the activity involves several agents who help each other for different purposes while respecting Islamic principles. In this context, this work develops a modelling of the sector through multi-agent simulation, which allows us to deduce, from the analysis of the micro-level behaviour of agents, the viability of the sector at the macro level. This research models the ecosystem of Moroccan Takaful insurance on the basis of a set of simulations aimed at measuring the impact of certain hypotheses on the viability of the sector; it was inspired by work done on the conventional insurance sector [1]. The work is carried out on the basis of a Takaful Insurance and Reinsurance Company (EART) which manages a Takaful fund for the “Takaful Tamil” death guarantee of the Murabaha contract. The model assumes that the EART manages the fund on the basis of a Wakala contract and simulates the results under three scenarios relating to the distribution of the technical and financial surplus (pro rata, selectivity and compensation).
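As a complement, here is a hedged sketch of how a Wakala-managed fund's surplus could be computed and shared; the figures, fee rate and return are hypothetical, and only the pro rata rule is spelled out, the selectivity and compensation rules being left as further distribution functions that would plug in the same way.

```python
from dataclasses import dataclass

@dataclass
class Participant:
    contribution: float   # Takaful contribution paid into the fund
    claims_paid: float    # claims received during the period

def fund_surplus(participants, wakala_fee_rate, investment_return):
    """Technical + financial result of the Takaful fund under a Wakala contract."""
    contributions = sum(p.contribution for p in participants)
    wakala_fee = wakala_fee_rate * contributions          # operator's remuneration
    claims = sum(p.claims_paid for p in participants)
    investment_income = investment_return * (contributions - wakala_fee)
    return contributions - wakala_fee - claims + investment_income

def distribute_prorata(surplus, participants):
    """Share a positive surplus in proportion to contributions (pro rata rule)."""
    total = sum(p.contribution for p in participants)
    return [max(surplus, 0.0) * p.contribution / total for p in participants]

# Hypothetical example: three participants, 10% Wakala fee, 3% investment return.
ps = [Participant(1_000, 0), Participant(2_000, 500), Participant(3_000, 0)]
s = fund_surplus(ps, wakala_fee_rate=0.10, investment_return=0.03)
print(round(s, 2), [round(x, 2) for x in distribute_prorata(s, ps)])
```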
In this paper, we combine random set theory and portfolio theory through the estimation of the lower bound of the Markowitz random set, based on the mean-variance analysis of asset portfolios, which represents the efficient frontier of a portfolio. There are several Markowitz optimization approaches, of which the best known and most used in modern portfolio theory are Markowitz's approach, the Markowitz–Sharpe approach and the Markowitz–Perold approach; generally, these methods are based on minimizing the variance of the return of a portfolio. The method used in this paper is quite different from those approaches, because it is based on the theory of random sets, which gives us the mathematical structure and the graph of the Markowitz set. The graphical representation of the Markowitz set gives us an idea of the investment region. This region, called the investment zone, contains the stocks in which a rational investor can choose to invest. Mathematical and statistical estimation techniques are used to find the explicit form of the Markowitz random set and to study its elements as functions of the signs of the estimated parameters. Finally, we apply the results to the returns of a portfolio composed of 200 assets from Paris Stock Market prices. The results of this simulation give an idea of the stocks to recommend to investors: in order to optimize their choices, these are the stocks located above the hyperbola that represents the Markowitz set.
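The Markowitz set referred to here is, in classical mean-variance terms, the frontier hyperbola in the (standard deviation, expected return) plane. Below is a short numerical sketch of that classical frontier on simulated data; it is not the random-set estimator of the paper, only the textbook formula the paper builds on, with a hypothetical asset universe.

```python
import numpy as np

rng = np.random.default_rng(42)

# Simulated daily returns for a small hypothetical universe of assets.
n_assets, n_obs = 10, 500
returns = rng.normal(0.0005, 0.01, size=(n_obs, n_assets))

mu = returns.mean(axis=0)              # estimated mean returns
sigma = np.cov(returns, rowvar=False)  # estimated covariance matrix
ones = np.ones(n_assets)
inv = np.linalg.inv(sigma)

# Classical frontier constants.
a = ones @ inv @ ones
b = ones @ inv @ mu
c = mu @ inv @ mu
d = a * c - b * b

def frontier_std(target_return):
    """Minimal portfolio standard deviation for a given target mean return."""
    var = (a * target_return**2 - 2 * b * target_return + c) / d
    return np.sqrt(var)

# The frontier is the hyperbola sigma(m); its vertex is the global
# minimum-variance portfolio, at return b / a with variance 1 / a.
for m in np.linspace(mu.min(), mu.max(), 5):
    print(f"target return {m:+.5f}  ->  frontier std {frontier_std(m):.5f}")
print(f"min-variance portfolio: return {b / a:+.5f}, std {np.sqrt(1 / a):.5f}")
```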
Many studies have been released in the field of Computer Vision, especially in Automatic License Plate Recognition. However, most of them focus on one method at a time, such as thresholding algorithms, edge detection or morphological transformations. This paper proposes to automate the license plate recognition process by combining four algorithms from the three methods mentioned above: Adaptive Thresholding, Otsu's Thresholding, Canny Edge Detection and the Morphological Gradient applied to edge detection. The goal is to obtain the best binary image from those methods; the statistical technique used is the median of the pixel intensities over the output images produced by the four methods. Additionally, this paper offers a comparative study of thresholding techniques in order to choose the best method for binarizing an image, which is the first and crucial step of the automatic license plate recognition process.
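A minimal OpenCV sketch of the fusion step as described in the abstract follows; the parameter values, the Otsu thresholding of the morphological gradient, and the final re-binarization of the median are assumptions, since the abstract does not give the paper's exact settings.

```python
import cv2
import numpy as np

def combined_binary(path):
    """Fuse four binarizations of a plate image by a pixel-wise median."""
    gray = cv2.imread(path, cv2.IMREAD_GRAYSCALE)
    if gray is None:
        raise FileNotFoundError(path)

    # 1) Adaptive (local) thresholding.
    adaptive = cv2.adaptiveThreshold(gray, 255, cv2.ADAPTIVE_THRESH_GAUSSIAN_C,
                                     cv2.THRESH_BINARY, 11, 2)
    # 2) Otsu's global thresholding.
    _, otsu = cv2.threshold(gray, 0, 255, cv2.THRESH_BINARY + cv2.THRESH_OTSU)
    # 3) Canny edge detection (already a binary edge map).
    canny = cv2.Canny(gray, 100, 200)
    # 4) Morphological gradient, then thresholded with Otsu to get a binary map.
    kernel = cv2.getStructuringElement(cv2.MORPH_RECT, (3, 3))
    grad = cv2.morphologyEx(gray, cv2.MORPH_GRADIENT, kernel)
    _, grad_bin = cv2.threshold(grad, 0, 255, cv2.THRESH_BINARY + cv2.THRESH_OTSU)

    # Pixel-wise median of the four binary outputs; with an even number of
    # images the median can take intermediate values, so re-binarize at 128.
    med = np.median(np.stack([adaptive, otsu, canny, grad_bin]), axis=0)
    return (med >= 128).astype(np.uint8) * 255

# Usage (hypothetical file names):
# binary = combined_binary("plate.jpg")
# cv2.imwrite("plate_binary.png", binary)
```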
Papers by driss mentagui