Enhancing Personnel Selection through the Integration of the Entropy Synergy Analysis of Multi-Attribute Decision Making Model: A Novel Approach
Abstract
1. Introduction
2. Model Framework Overview: ES-MADM Structure
2.1. The Contribution of Entropy as a MADM Model Tool in the Personnel Selection Problem
2.2. Basic Concepts in Entropy and Information Theory
2.2.1. Entropy of an Information Source
2.2.2. Joint Entropy
2.2.3. Conditional Entropy
2.2.4. Mutual Information
2.3. The ES-MADM Model
2.3.1. Basic Concepts on Decision Making
- Decision-Making (DM) Process
- Decision Maker
- Alternatives (Candidates-Personnel Selection)
- Criteria (Evaluation Attributes)
- Criteria Weights
- Decision Matrix
2.3.2. Computation Steps of the ES-MADM Method
- STEP 1. Define Alternatives-Criteria-Data Matrix
- STEP 2. Compute the Conditional Probabilities
- STEP 3. Compute the Integrated Criteria Weights
- STEP 4. Compute the Partial Specific Conditional Entropy of the Alternatives
- STEP 5. Compute the Entropy of the Alternatives
- STEP 6. Compute the Conditional Entropy of the Alternatives
2.3.3. Integrating the ES-MADM Model into the Personnel Selection Problem
3. Exploring ES-MADM Model Performance
3.1. An Illustrative Case Study for the ES-MADM Model
3.1.1. Case Study: Problem Statement
3.1.2. Case Study: ES-MADM (Initial Results)
3.1.3. Case Study: ES-MADM (Assigning Non-Equal SBJ Weights to the Criteria)
3.1.4. Sensitivity Analysis of the ES-MADM Model
4. Results
4.1. ES-MADM Model: An Overview of the Results
4.2. Case Study (Initial Results)
4.2.1. Results Analysis
4.2.2. Comparison of the Results Produced by ES-MADM with Those from TOPSIS
4.3. Case Study (Assigning Non-Equal Subjective Weights to the Criteria)
- Candidate Importance: The selection probabilities pertaining to the candidates, denoting their relative importance, undergo variations, particularly concerning the least significant candidate. In cases with non-uniform subjective criteria weights, candidate C7 emerges as the least favored option, exhibiting the lowest selection probabilities. Conversely, candidate C2 consistently maintains the highest-ranking score across both scenarios. This can be attributed to candidate C2 consistently displaying superior values across all criteria in comparison to the other candidates.
- Integrated Criteria Significance: In both scenarios, criterion X6, “Recommendation Letters,” consistently emerges as the most pivotal, while X4 is consistently identified as the least influential. This aligns with the observation that, in both scenarios, all candidates exhibit relatively uniform performance on criterion X4 (their use of MS Office platforms), whereas on criterion X6 significant disparities among the candidates become apparent, producing notably divergent outcomes. When candidate values diverge substantially on a particular criterion, that criterion’s entropy is lower; it therefore discriminates more reliably among the candidates and is ascribed higher importance. In short, the degree of variation in candidate values on a given criterion directly determines its reliability and influence in guiding the decision.
- Problem Stability: The stability of the problem diminishes when non-uniform subjective criteria weights are introduced, as exemplified by the elevated value of I(Y│X) in the second scenario, indicating reduced reliability and consistency of the decision-making process. This occurs because non-uniform subjective weights create greater divergence in the importance attributed to the various factors by decision makers; the criteria then exert more variable influences on the final outcome, resulting in a less stable decision-making environment. The increased value of I(Y│X) in the second scenario underscores the greater uncertainty and variability introduced by non-uniform subjective weights, which can hinder consistent and reliable decisions.
- Criteria Importance: Figure 7 and Figure 10 offer a comparative visualization of the subjective, objective, and integrated importance of the criteria. A clear distinction emerges between the scenario with equal subjective criteria weights and the one where these weights differ significantly. With equal subjective weights, the integrated significance of the criteria hinges primarily on the objective weights derived from the data matrix, which become the main contributors to the overall assessment of criteria significance. When the subjective weights vary substantially, a different dynamic emerges: the objectively derived criteria weights, along with their corresponding significance measures, act as correctors, refining and recalibrating the subjective assessments with an objective perspective grounded in the information encapsulated within the data matrix. As depicted in Figure 10, integrating the objective weights and their associated significance measures into the evaluation strikes a balance between subjective judgment and data-driven objectivity, ensuring a more precise, equitable, and comprehensive appraisal of the overall criteria significance and leading to more informed and robust decisions.
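The entropy mechanism behind these observations can be checked numerically. The short Python sketch below (function and variable names are ours, not the paper’s) contrasts the near-uniform MS Office criterion (X4) with the strongly divergent Recommendation Letters criterion (X6) from the case-study data matrix: the divergent criterion yields lower normalized entropy, hence a higher diversification degree and a higher weight.

```python
import math

def normalized_entropy(scores):
    """Shannon entropy of a criterion's score distribution, normalized by log2(N)."""
    total = sum(scores)
    probs = [s / total for s in scores]
    h = -sum(p * math.log2(p) for p in probs if p > 0)
    return h / math.log2(len(scores))

# Case-study rows (candidates C1..C9)
ms_office = [6, 5, 7, 6, 6, 7, 7, 8, 8]    # X4: nearly uniform scores
rec_letters = [1, 1, 8, 8, 1, 8, 1, 1, 5]  # X6: strongly divergent scores

h_uniform = normalized_entropy(ms_office)    # ~0.995: barely discriminates
h_divergent = normalized_entropy(rec_letters)  # ~0.829: discriminates strongly

# The diversification degree 1 - H drives the objective weight, so X6
# receives far more weight than X4, exactly as reported in the tables.
```

The values 0.995 and 0.829 match the normalized-entropy column of the results tables below.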
4.4. Sensitivity Analysis Results
5. Discussion
6. Conclusions
Author Contributions
Funding
Data Availability Statement
Conflicts of Interest
References
- Afshari, R.; Yusuff, R.M.; Hong, T.S.; Ismail, Y.B. A review of the applications of multi criteria decision making for personnel selection problem. Afr. J. Bus. Manag. 2011, 5, 28.
- Chae, J.S.; Kim, C.J. The Impact of Strategic Human Resource Management on Management Performance through Organizational Competence. J. Hum. Resour. Manag. Res. 2019, 23, 143–174.
- Chen, C.T.; Hwang, Y.C.; Hung, W.Z. Applying multiple linguistic PROMETHEE method for personnel evaluation and selection. In Proceedings of the 2009 IEEE International Conference on Industrial Engineering and Engineering Management (IEEM), Hong Kong, China, 8–11 December 2009; pp. 1312–1316.
- Robertson, I.T.; Smith, M. Personnel selection. J. Occup. Organ. Psychol. 2001, 74, 441–472.
- Golec, A.; Kahya, E. A fuzzy model for competency-based employee evaluation and selection. Comput. Ind. Eng. 2007, 52, 143–161.
- Dursun, M.; Karsak, E.E. A fuzzy MCDM approach for personnel selection. Expert Syst. Appl. 2010, 37, 4324–4330.
- Dağdeviren, M. A hybrid multi-criteria decision-making model for personnel selection in manufacturing systems. J. Intell. Manuf. 2008, 21, 451–460.
- Kazimieras Zavadskas, E.; Antucheviciene, J.; Chatterjee, P. Multiple-Criteria Decision-Making (MCDM) Techniques for Business Processes Information Management. Information 2019, 10, 4.
- Liao, S.; Chang, K. Select televised sportscasters for Olympic Games by analytic network process. Manag. Decis. 2009, 47, 14–23.
- Bogdanovic, D.; Miletic, S. Personnel evaluation and selection by multicriteria decision making method. Econ. Comput. Econ. Cybern. Stud. Res. 2015, 48, 179–196.
- Hwang, C.L.; Yoon, K. Multiple Attribute Decision Making: Methods and Applications; Springer: Berlin/Heidelberg, Germany, 1981.
- Kelemenis, A.; Askounis, D. A new TOPSIS-based multi-criteria approach to personnel selection. Expert Syst. Appl. 2010, 37, 4999–5008.
- Singh, A.; Malik, S.K. MCDM and Its Role in Personnel Selection—A Review. Int. J. Eng. Tech. Res. 2014, 2, 1–4.
- Zavadskas, E.K.; Vainiūnas, P.; Turskis, Z.; Tamošaitienė, J. Multiple criteria decision support system for assessment of projects managers in construction. Int. J. Inf. Technol. Decis. Mak. 2012, 11, 501–520.
- Karabasevic, D.; Stanujkic, D.; Urosevic, S. The MCDM Model for Personnel Selection Based on SWARA and ARAS Methods. Management 2015, 20, 43–52.
- Uslu, Y.D.; Yılmaz, E.; Yiğit, P. Developing Qualified Personnel Selection Strategies Using MCDM Approach: A University Hospital Practice. In Strategic Outlook in Business and Finance Innovation: Multidimensional Policies for Emerging Economies; Emerald Publishing: Bingley, UK, 2021; pp. 195–205.
- Popović, M. An MCDM Approach for Personnel Selection Using the CoCoSo Method. J. Process Manag. New Technol. 2021, 9, 78–88.
- Danişan, T.; Özcan, E.; Eren, T. Personnel Selection with Multi-Criteria Decision Making Methods in the Ready-to-Wear Sector. Tech. Gaz. 2022, 29, 1339–1347.
- König, C.J.; Langer, M. Machine Learning in Personnel Selection. In Handbook of Research on Artificial Intelligence in Human Resource Management; Edward Elgar: Cheltenham, UK, 2022; pp. 149–167.
- Kanakaris, N.; Giarelis, N.; Siachos, I.; Karacapilidis, N. Shall I Work with Them? A Knowledge Graph-Based Approach for Predicting Future Research Collaborations. Entropy 2021, 23, 664.
- Kanakaris, N.; Giarelis, N.; Siachos, I.; Karacapilidis, N. Making personnel selection smarter through word embeddings: A graph-based approach. Mach. Learn. Appl. 2022, 7, 100214.
- Campion, M.A.; Campion, E.D. Machine learning applications to personnel selection: Current illustrations, lessons learned, and future research. Pers. Psychol. 2023, 76, 993–1009.
- Goretzko, D.; Israel, L.S.F. Pitfalls of Machine Learning based Personnel Selection—Fairness, Transparency and Data Quality. J. Pers. Psychol. 2021, 21, 37–47.
- Zhang, N.; Wang, M.; Xu, H.; Koenig, N.; Hickman, L.; Kuruzovich, J.; Ng, V.; Arhin, K.; Wilson, D.; Song, Q.C.; et al. Reducing subgroup differences in personnel selection through the application of machine learning. Pers. Psychol. 2023, 76, 1125–1159.
- Shannon, C.E. A Mathematical Theory of Communication. Bell Syst. Tech. J. 1948, 27, 379–423, 623–656.
- Khinchin, A.I. Mathematical Foundations of Information Theory; Dover: New York, NY, USA, 1957.
- Sun, L.; Xu, J.; Cao, X. Decision Table Reduction Method Based on New Conditional Entropy for Rough Set Theory. In Proceedings of the 2009 International Workshop on Intelligent Systems and Applications, Wuhan, China, 23–24 May 2009; pp. 1–4.
- Le, D.H.; Reynolds, A.C. Estimation of Mutual Information and Conditional Entropy for Surveillance Optimization. SPE J. 2014, 19, 648–661.
- Doumpos, M.; Figueira, J.R.; Greco, S.; Zopounidis, C. New Perspectives in Multiple Criteria Decision Making (Innovative Applications and Case Studies); Springer: Berlin/Heidelberg, Germany, 2019.
- Taherdoost, H.; Madanchian, M. Multi-Criteria Decision Making (MCDM) Methods and Concepts. Encyclopedia 2023, 3, 77–87.
- Fox, W.P.; Spence, G.; Kitchen, R.; Powell, S. Using the Entropy Weighting Scheme in Military Decision Making. J. Def. Model. Simul. Appl. Methodol. Technol. 2019, 17, 409–418.
- Yang, J.-B.; Singh, M. An evidential reasoning approach for multiple-attribute decision making with uncertainty. IEEE Trans. Syst. Man Cybern. 1994, 24, 1–18.
- Al-Aomar, R. A combined AHP-entropy method for deriving subjective and objective criteria. Int. J. Ind. Eng. 2010, 17, 12–24.
- Korkmaz, O. Personnel Selection Method Based on TOPSIS Multi-Criteria Decision-Making Method. Int. J. Econ. Adm. Stud. 2019, 23, 1–16.
- Diakoulaki, D.; Mavrotas, G.; Papayannakis, L. Determining objective weights in multiple criteria problems: The CRITIC method. Comput. Oper. Res. 1995, 22, 763–770.
- Xiao, F. Quantum X-entropy in generalized quantum evidence theory. Inf. Sci. 2023, 643, 119177.
- Xiao, F. EFMCDM: Evidential Fuzzy Multicriteria Decision Making Based on Belief Entropy. IEEE Trans. Fuzzy Syst. 2020, 28, 1477–1491.
- Xiao, F. On the Maximum Entropy Negation of a Complex-Valued Distribution. IEEE Trans. Fuzzy Syst. 2020, 29, 3259–3269.
- Saaty, T.L. The Analytic Hierarchy Process, 1st ed.; McGraw-Hill: New York, NY, USA, 1980.
- Saaty, T.L.; Vargas, L.G. Decision Making with the Analytic Network Process (Economic, Political, Social and Technological Applications with Benefits, Opportunities, Costs and Risks), 2nd ed.; International Series in Operations Research & Management Science; Springer: Berlin/Heidelberg, Germany, 2013.
- Roy, B. The outranking approach and the foundations of ELECTRE methods. Theory Decis. 1991, 31, 49–73.
Steps | Analysis
---|---
STEP 1. | Specify the N alternatives (actions, options, issues).
STEP 2. | Specify the M criteria.
STEP 3. | Specify the data that have to be collected.
STEP 4. | Rank the criteria by assigning a weight w_μ to each criterion. The weight, also called the importance factor, is a positive number reflecting the relevance of the criterion for each alternative.
STEP 5. | Rank the alternatives according to the criteria values. Each criterion X_μ evaluates each alternative C_ν with a value a_{μν} (μ = 1, 2, …, M and ν = 1, 2, …, N), i.e., the (real or symbolic) value that criterion X_μ receives with respect to alternative C_ν. The matrix D = [a_{μν}] is known as the “Data Matrix”: its rows are the values of the alternatives for each criterion, while its columns are the values of the criteria for each alternative. The transpose of the Data Matrix D is known as the Decision Matrix. In practice, the Data Matrix is constructed according to specific statistical procedures. After estimating the weights of the criteria, we rank each alternative by a non-negative number R_ν. Based on the ranking values produced by a ranking methodology (α), denoted R_ν^(α), the optimal alternative is selected and implemented.
STEP 6. | Check the sensitivity of the selected alternative with respect to small changes in the criteria weights. Sensitivity analysis is required to validate the model’s robustness: small modifications to the criteria values should not significantly alter the MCDM’s outcomes. This must be verified for every MCDM model; otherwise, “Chaos” and unreliable conclusions may result. Moreover, we specify the domains for which the model is reliable.
Factor | Notation
---|---
Criteria | X_μ, μ = 1, …, M
Criteria Weights | w_μ
Value of Criterion X_μ for Alternative C_ν | a_{μν}
Alternatives | C_ν, ν = 1, …, N
Alternatives Ranking (Method α) | R_ν^(α)
Data Matrix | D = [a_{μν}]
Substep | Action | Remarks
---|---|---
STEP 1 | |
1.A | Define Alternatives and Criteria | Alternatives and criteria defined by the decision maker. (CRITERIA–ALTERNATIVES)
1.B | Data Selection | Data are assembled (the value that each criterion takes for each alternative). (ELEMENTS OF DATA MATRIX)
1.C | Data Matrix; Assign Subjective (SBJ) Criteria Weights | The data matrix is completed with the data and the SBJ criteria weights are assigned. (SUBJECTIVE CRITERIA WEIGHTS)
STEP 2 | |
2.A | Compute the Conditional Probabilities | Conditional probability that Y takes the value y_ν, given that X has the value x_μ.
STEP 3 | |
3.A | Normalize the Data Matrix | Normalization of the data matrix elements; the results coincide with those of Step 2.A.
3.B | Compute the Normalized Entropy of each Criterion | Quantifies the average amount of information or uncertainty associated with the outcome Y for a specific condition x_μ.
3.C | Compute the Diversification Degree of each Criterion | Quantifies the degree of dissimilarity or distinctiveness associated with the specific condition μ.
3.D | Compute the Objective Weight of each Criterion | Computation of the objective criteria weights. (OBJECTIVE CRITERIA WEIGHTS)
3.E | Compute the Integrated Weight of each Criterion | Computation of the integrated criteria weights. (INTEGRATED CRITERIA WEIGHTS)
STEP 4 | |
4.A | Compute the Conditional Entropy of the Alternatives (COAs) and the Normalized Conditional Entropy | Calculates the conditional entropy of Y given that X takes the specific value x_μ: the average uncertainty required to describe the outcome of Y for that given value of X. Also computes the mutual information between Y and X given X = x_μ, quantifying the reduction in uncertainty or shared information, normalized by the logarithm base 2 of the number of possible outcomes N. (OBJECTIVE CRITERIA SIGNIFICANCE)
STEP 5 | |
5.A | Compute the overall Entropy of the COAs and the overall Score for each COA | Calculates the entropy of Y, measuring the average uncertainty required to describe its possible outcomes; combines the conditional probability with the significance of the criterion to evaluate their joint significance; and computes the probability of the outcome Y taking the value y_ν. (ALTERNATIVE PROBABILITY–FINAL SCORE)
STEP 6 | |
6.A | Compute the Conditional Entropy of the COAs and the Normalized Conditional Entropy | Measures the average uncertainty required to describe the outcome of Y, considering the different values of X and their corresponding conditional entropies, and captures the overall weighted conditional entropy for each condition. (INTEGRATED CRITERIA SIGNIFICANCE) Also quantifies the reduction in uncertainty about Y when X is known, relative to the total uncertainty in Y. (TOTAL PROBLEM UNCERTAINTY)
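The computation steps above admit a compact numerical sketch. The following Python code is our reconstruction of Steps 2–6 from the published case-study tables (equal SBJ weights scenario); the variable names and exact formulas are our reading of the tables, not the authors’ code.

```python
import math

# Case-study data matrix: criterion -> scores of candidates C1..C9
data = {
    "X1": [4, 3, 3, 4, 9, 4, 3, 3, 3],     # Logistics Experience
    "X2": [8, 8, 6, 1, 5, 10, 10, 7, 10],  # Education
    "X3": [5, 8, 8, 10, 3, 10, 6, 6, 8],   # Flexible Working Hours and Overtime
    "X4": [6, 5, 7, 6, 6, 7, 7, 8, 8],     # Proficiency in MS Office Programs
    "X5": [7, 1, 1, 1, 5, 8, 1, 5, 6],     # Package Software (Logistics)
    "X6": [1, 1, 8, 8, 1, 8, 1, 1, 5],     # Recommendation Letters
}
N = 9                           # number of candidates
sbj = {c: 1 / 6 for c in data}  # equal subjective criteria weights

# STEP 2: conditional probabilities p(C_v | X_mu) via row normalization
cond = {c: [v / sum(row) for v in row] for c, row in data.items()}

# STEP 3: normalized entropy, diversification degree, integrated weights
H = {c: -sum(p * math.log2(p) for p in ps if p > 0) / math.log2(N)
     for c, ps in cond.items()}
div = {c: 1 - H[c] for c in data}                   # diversification degree
denom = sum(sbj[c] * div[c] for c in data)
w_int = {c: sbj[c] * div[c] / denom for c in data}  # integrated weights

# STEP 5: selection probability (final score) of each candidate
scores = [sum(w_int[c] * cond[c][v] for c in data) for v in range(N)]

# STEP 6: conditional entropy, overall entropy, and problem stability
H_Y = -sum(p * math.log2(p) for p in scores)                      # ~3.042
H_Y_given_X = sum(w_int[c] * H[c] * math.log2(N) for c in data)   # ~2.789
stability = H_Y_given_X / H_Y                                     # ~0.917
```

Under equal SBJ weights this reproduces the first results table: X6 carries the largest integrated weight (≈0.422), candidate C6 attains the highest score (≈0.207), and the stability ratio is ≈0.917.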
Concept | Type | Analysis | Remarks
---|---|---|---
Alternatives | INPUT | Defined by the decision maker(s) (personnel selection department) | N is the number of candidates to be evaluated
Criteria | INPUT | Defined by the decision maker(s); the characteristics serving as criteria for the evaluation of the candidates | M is the number of qualifications/characteristics to be evaluated
Elements of Data Matrix | INPUT | Defined by the decision maker(s); the values of each criterion for each candidate | M × N values covering all candidates for all criteria
Subjective Criteria Weights | INPUT | Defined by the decision maker(s); the subjective weights assigned to the criteria | The weight of a criterion increases in proportion to its corresponding value
Normalized Conditional Entropy (Specific) | OUTPUT | Measures the objective criteria significance (significance of each criterion with respect to the candidate selection process) | Objective dependence between the criteria and the candidate selection; the objective is to minimize this value
Product of Conditional Probability with Criteria Weight | OUTPUT | Measures the integrated criteria significance (significance of each criterion) | Integrated dependence between the criteria and the selection probability (usually combined with the objective significance); the objective is to maximize this value
Selection Probability (Final Score) | OUTPUT | Measures the probability of selecting each candidate | The selection probability increases in proportion to its corresponding value; the objective is to maximize it for each candidate
Normalized Conditional Entropy | OUTPUT | Measures the uncertainty of the decision problem (uncertainty between criteria and candidates) | Dependence between the overall criteria and the overall candidate selection process, corresponding to the stability of the problem; the objective is to minimize this value
Criterion | Description | SBJ Weight | C1 | C2 | C3 | C4 | C5 | C6 | C7 | C8 | C9
---|---|---|---|---|---|---|---|---|---|---|---
X1 | Logistics Experience | 0.16667 | 4 | 3 | 3 | 4 | 9 | 4 | 3 | 3 | 3
X2 | Education | 0.16667 | 8 | 8 | 6 | 1 | 5 | 10 | 10 | 7 | 10
X3 | Flexible Working Hours and Overtime | 0.16667 | 5 | 8 | 8 | 10 | 3 | 10 | 6 | 6 | 8
X4 | Proficiency in MS Office Programs | 0.16667 | 6 | 5 | 7 | 6 | 6 | 7 | 7 | 8 | 8
X5 | Package Software Used in the Field of Logistics | 0.16667 | 7 | 1 | 1 | 1 | 5 | 8 | 1 | 5 | 6
X6 | Recommendation Letters | 0.16667 | 1 | 1 | 8 | 8 | 1 | 8 | 1 | 1 | 5
Left block: conditional probabilities of the candidates per criterion; right block: their weighted contributions (equal SBJ weights).

Criterion | C1 | C2 | C3 | C4 | C5 | C6 | C7 | C8 | C9 | OBJ Criteria Importance | Normalized Entropy | C1 | C2 | C3 | C4 | C5 | C6 | C7 | C8 | C9 | INT Criteria Importance
---|---|---|---|---|---|---|---|---|---|---|---|---|---|---|---|---|---|---|---|---|---
X1 | 0.111 | 0.083 | 0.083 | 0.111 | 0.250 | 0.111 | 0.083 | 0.083 | 0.083 | 0.093 | 0.962 | 0.010 | 0.008 | 0.008 | 0.010 | 0.023 | 0.010 | 0.008 | 0.008 | 0.008 | 0.284 |
X2 | 0.123 | 0.123 | 0.092 | 0.015 | 0.077 | 0.154 | 0.154 | 0.108 | 0.154 | 0.108 | 0.956 | 0.013 | 0.013 | 0.010 | 0.002 | 0.008 | 0.017 | 0.017 | 0.012 | 0.017 | 0.328 |
X3 | 0.078 | 0.125 | 0.125 | 0.156 | 0.047 | 0.156 | 0.094 | 0.094 | 0.125 | 0.057 | 0.977 | 0.004 | 0.007 | 0.007 | 0.009 | 0.003 | 0.009 | 0.005 | 0.005 | 0.007 | 0.177 |
X4 | 0.100 | 0.083 | 0.117 | 0.100 | 0.100 | 0.117 | 0.117 | 0.133 | 0.133 | 0.011 | 0.995 | 0.001 | 0.001 | 0.001 | 0.001 | 0.001 | 0.001 | 0.001 | 0.002 | 0.002 | 0.036 |
X5 | 0.200 | 0.029 | 0.029 | 0.029 | 0.143 | 0.229 | 0.029 | 0.143 | 0.171 | 0.308 | 0.876 | 0.062 | 0.009 | 0.009 | 0.009 | 0.044 | 0.070 | 0.009 | 0.044 | 0.053 | 0.854 |
X6 | 0.029 | 0.029 | 0.235 | 0.235 | 0.029 | 0.235 | 0.029 | 0.029 | 0.147 | 0.422 | 0.829 | 0.012 | 0.012 | 0.099 | 0.099 | 0.012 | 0.099 | 0.012 | 0.012 | 0.062 | 1.110 |
Ranking of the alternatives (candidates) and decision stability:

C1 | C2 | C3 | C4 | C5 | C6 | C7 | C8 | C9 | H(Y│X) | H(Y) | Stability
---|---|---|---|---|---|---|---|---|---|---|---
0.103 | 0.050 | 0.134 | 0.130 | 0.092 | 0.207 | 0.052 | 0.083 | 0.148 | 2.789 | 3.042 | 0.917
Criterion | Description | SBJ Weight | C1 | C2 | C3 | C4 | C5 | C6 | C7 | C8 | C9
---|---|---|---|---|---|---|---|---|---|---|---
X1 | Logistics Experience | 0.2 | 4 | 3 | 3 | 4 | 9 | 4 | 3 | 3 | 3
X2 | Education | 0.1 | 8 | 8 | 6 | 1 | 5 | 10 | 10 | 7 | 10
X3 | Flexible Working Hours and Overtime | 0.3 | 5 | 8 | 8 | 10 | 3 | 10 | 6 | 6 | 8
X4 | Proficiency in MS Office Programs | 0.2 | 6 | 5 | 7 | 6 | 6 | 7 | 7 | 8 | 8
X5 | Package Software Used in the Field of Logistics | 0.1 | 7 | 1 | 1 | 1 | 5 | 8 | 1 | 5 | 6
X6 | Recommendation Letters | 0.1 | 1 | 1 | 8 | 8 | 1 | 8 | 1 | 1 | 5
Left block: conditional probabilities of the candidates per criterion; right block: their weighted contributions (non-equal SBJ weights).

Criterion | C1 | C2 | C3 | C4 | C5 | C6 | C7 | C8 | C9 | OBJ Criteria Importance | Normalized Entropy | C1 | C2 | C3 | C4 | C5 | C6 | C7 | C8 | C9 | INT Criteria Importance
---|---|---|---|---|---|---|---|---|---|---|---|---|---|---|---|---|---|---|---|---|---
X1 | 0.111 | 0.083 | 0.083 | 0.111 | 0.250 | 0.111 | 0.083 | 0.083 | 0.083 | 0.153 | 0.962 | 0.017 | 0.013 | 0.013 | 0.017 | 0.038 | 0.017 | 0.013 | 0.013 | 0.013 | 0.467 |
X2 | 0.123 | 0.123 | 0.092 | 0.015 | 0.077 | 0.154 | 0.154 | 0.108 | 0.154 | 0.089 | 0.956 | 0.011 | 0.011 | 0.008 | 0.001 | 0.007 | 0.014 | 0.014 | 0.010 | 0.014 | 0.269 |
X3 | 0.078 | 0.125 | 0.125 | 0.156 | 0.047 | 0.156 | 0.094 | 0.094 | 0.125 | 0.141 | 0.977 | 0.011 | 0.018 | 0.018 | 0.022 | 0.007 | 0.022 | 0.013 | 0.013 | 0.018 | 0.436 |
X4 | 0.100 | 0.083 | 0.117 | 0.100 | 0.100 | 0.117 | 0.117 | 0.133 | 0.133 | 0.019 | 0.995 | 0.002 | 0.002 | 0.002 | 0.002 | 0.002 | 0.002 | 0.002 | 0.002 | 0.002 | 0.059 |
X5 | 0.200 | 0.029 | 0.029 | 0.029 | 0.143 | 0.229 | 0.029 | 0.143 | 0.171 | 0.252 | 0.876 | 0.050 | 0.007 | 0.007 | 0.007 | 0.036 | 0.058 | 0.007 | 0.036 | 0.043 | 0.700 |
X6 | 0.029 | 0.029 | 0.235 | 0.235 | 0.029 | 0.235 | 0.029 | 0.029 | 0.147 | 0.346 | 0.829 | 0.010 | 0.010 | 0.082 | 0.082 | 0.010 | 0.082 | 0.010 | 0.010 | 0.051 | 0.911 |
Ranking of the alternatives (candidates) and decision stability:

C1 | C2 | C3 | C4 | C5 | C6 | C7 | C8 | C9 | H(Y│X) | H(Y) | Stability
---|---|---|---|---|---|---|---|---|---|---|---
0.101 | 0.060 | 0.129 | 0.131 | 0.100 | 0.194 | 0.059 | 0.084 | 0.141 | 2.842 | 3.076 | 0.924
Left block: conditional probabilities of the candidates per criterion; right block: their weighted contributions (after the sensitivity-analysis revision of the second candidate’s values).

Criterion | C1 | C2 | C3 | C4 | C5 | C6 | C7 | C8 | C9 | OBJ Criteria Importance | Normalized Entropy | C1 | C2 | C3 | C4 | C5 | C6 | C7 | C8 | C9 | INT Criteria Importance
---|---|---|---|---|---|---|---|---|---|---|---|---|---|---|---|---|---|---|---|---|---
X1 | 0.093 | 0.233 | 0.070 | 0.093 | 0.209 | 0.093 | 0.070 | 0.070 | 0.070 | 0.149 | 0.943 | 0.014 | 0.035 | 0.010 | 0.014 | 0.031 | 0.014 | 0.010 | 0.010 | 0.010 | 0.445 |
X2 | 0.119 | 0.149 | 0.090 | 0.015 | 0.075 | 0.149 | 0.149 | 0.104 | 0.149 | 0.118 | 0.955 | 0.014 | 0.018 | 0.011 | 0.002 | 0.009 | 0.018 | 0.018 | 0.012 | 0.018 | 0.358 |
X3 | 0.076 | 0.152 | 0.121 | 0.152 | 0.045 | 0.152 | 0.091 | 0.091 | 0.121 | 0.067 | 0.975 | 0.005 | 0.010 | 0.008 | 0.010 | 0.003 | 0.010 | 0.006 | 0.006 | 0.008 | 0.206 |
X4 | 0.097 | 0.113 | 0.113 | 0.097 | 0.097 | 0.113 | 0.113 | 0.129 | 0.129 | 0.007 | 0.997 | 0.001 | 0.001 | 0.001 | 0.001 | 0.001 | 0.001 | 0.001 | 0.001 | 0.001 | 0.021 |
X5 | 0.159 | 0.227 | 0.023 | 0.023 | 0.114 | 0.182 | 0.023 | 0.114 | 0.136 | 0.279 | 0.893 | 0.044 | 0.063 | 0.006 | 0.006 | 0.032 | 0.051 | 0.006 | 0.032 | 0.038 | 0.790 |
X6 | 0.023 | 0.233 | 0.186 | 0.186 | 0.023 | 0.186 | 0.023 | 0.023 | 0.116 | 0.380 | 0.855 | 0.009 | 0.088 | 0.071 | 0.071 | 0.009 | 0.071 | 0.009 | 0.009 | 0.044 | 1.031 |
Ranking of the alternatives (candidates) and decision stability:

C1 | C2 | C3 | C4 | C5 | C6 | C7 | C8 | C9 | H(Y│X) | H(Y) | Stability
---|---|---|---|---|---|---|---|---|---|---|---
0.087 | 0.215 | 0.107 | 0.103 | 0.084 | 0.164 | 0.050 | 0.070 | 0.119 | 2.851 | 3.046 | 0.936
Revised Values for the 2nd Candidate to Maximize Selection Probability:

Criterion | X1 | X2 | X3 | X4 | X5 | X6
---|---|---|---|---|---|---
Revised Value | 10 | 10 | 10 | 7 | 10 | 10
Most Favorable Candidate | Least Favorable Candidate | Most Important Criterion (INT) | Least Important Criterion (INT) | Problem Stability
---|---|---|---|---
C6 | C2 | X6: 1.110 (Recommendation Letters) | X4: 0.036 (Proficiency in MS Office) | 0.917
Subjective Criteria Weights | Most Favorable Candidate | Least Favorable Candidate | Most Important Criterion (INT) | Least Important Criterion (INT) | Problem Stability
---|---|---|---|---|---
Non-equal | C6 | C7 | X6 (0.911) | X4 (0.059) | 0.924
Equal | C6 | C2 | X6 (1.110) | X4 (0.036) | 0.917
Measure | Before SA | After SA
---|---|---
Candidate C2 Score (Ranking) | 0.050 (9th favorable) | 0.215 (1st favorable)
Most Important Criterion (INT) | X6 (1.110) | X6 (1.031)
Least Important Criterion (INT) | X4 (0.036) | X4 (0.021)
Problem Stability | 0.917 | 0.936
Criterion | Description | Second Candidate (Before SA) | Second Candidate (After SA) | Increase
---|---|---|---|---
X1 | Logistics Experience | 3 | 10 | 233%
X2 | Education | 8 | 10 | 25%
X3 | Flexible Working Hours and Overtime | 8 | 10 | 25%
X4 | Proficiency in MS Office Programs | 5 | 7 | 40%
X5 | Package Software Used in the Field of Logistics | 1 | 10 | 900%
X6 | Recommendation Letters | 1 | 10 | 900%
Average Percentage of 2nd Candidate Increase in Criteria Values | | | | 354%
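The percentage column follows directly from (after − before)/before; a quick check in Python (values copied from the table above):

```python
# Score increases required for the 2nd candidate (before vs. after the
# sensitivity analysis), per criterion
before = {"X1": 3, "X2": 8, "X3": 8, "X4": 5, "X5": 1, "X6": 1}
after = {"X1": 10, "X2": 10, "X3": 10, "X4": 7, "X5": 10, "X6": 10}

pct = {c: 100 * (after[c] - before[c]) / before[c] for c in before}
avg = sum(pct.values()) / len(pct)  # average increase across the six criteria
```

Rounding reproduces the tabulated percentages (233%, 25%, 25%, 40%, 900%, 900%) and the 354% average.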
© 2023 by the authors. Licensee MDPI, Basel, Switzerland. This article is an open access article distributed under the terms and conditions of the Creative Commons Attribution (CC BY) license (https://creativecommons.org/licenses/by/4.0/).
Kiratsoudis, S.; Tsiantos, V. Enhancing Personnel Selection through the Integration of the Entropy Synergy Analysis of Multi-Attribute Decision Making Model: A Novel Approach. Information 2024, 15, 1. https://doi.org/10.3390/info15010001