Weighting Methods for Multi-Criteria Decision Making Technique


J. Appl. Sci. Environ. Manage. Vol. 23 (8) 1449-1457, August 2019. Print ISSN 1119-8362; Electronic ISSN 1119-8362. Full-text available online at ioline.org.br/ja

ODU, G.O.
Department of Mechanical Engineering, Faculty of Engineering, Delta State University, Abraka, Oleh Campus, 331107, Nigeria. Email: odugodwin@gmail.com

ABSTRACT: Determining criteria weights is a problem that arises frequently in many multi-criteria decision-making (MCDM) techniques. Because the weights of criteria can significantly influence the outcome of the decision-making process, it is important to pay particular attention to the objectivity of the criteria weights. This paper provides an overview of different weighting methods applicable to multi-criteria optimization techniques. Many concepts reported in the literature are useful in solving multi-criteria problems. The present work emphasizes the use of these weighting methods in determining the preference of each criterion, in order to establish and satisfy multiple measures of performance across all the selected criteria and to identify the best options possible. The results show that subjective weighting methods are easier and more straightforward in their computations than objective weighting methods, which derive their information from each criterion through a mathematical function that determines the weights without the decision maker's input. This can be seen from the pairwise comparison, which gives the internal storage and random access memory of a smart phone weight values of 0.33 and 0.22 respectively, the highest criteria weights.

DOI: https://dx.doi.org/10.4314/jasem.v23i8.7

Copyright: Copyright 2019 Odu. This is an open access article distributed under the Creative Commons Attribution License (CCL), which permits unrestricted use, distribution, and reproduction in any medium, provided the original work is properly cited.

Dates: Received: 20 May 2019; Revised: 27 July 2019; Accepted: 31 July 2019

KEYWORDS: Multi-criteria, Decision-making, Relative importance, Alternative, Criteria

In most multi-criteria decision making (MCDM) models, assigning weights to criteria is an important step that needs to be re-examined; indeed, determining the weights of criteria is one of the key problems that arise in multi-criteria decision making (Dragan et al., 2018). Various weighting methods have been proposed in the literature and applied to different MCDM problems such as goal programming, the Analytic Hierarchy Process (AHP), the weighted score method, VIKOR, TOPSIS, etc. These weighting methods are classified in different ways: direct criteria weighting methods (scaling, ranking-weight and point allocation procedures) and indirect approaches (weights derived from theories and mathematical models). In practice, it is difficult even for a single decision maker to supply numerical relative weights for different decision criteria, and obtaining criteria weights from several decision makers is naturally more difficult. Quite often, decision makers are much more comfortable simply assigning ordinal ranks to the different criteria under consideration. In such cases, relative criteria weights can be derived from the criteria ranks supplied by the decision makers.
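One common way of converting such ranks into weights is the rank-sum rule, in which each criterion scores n - r + 1 for rank r and the scores are normalized to sum to one. The following minimal sketch (in Python) is an illustration of that idea only; the criteria names and ranks are hypothetical and are not taken from this paper.

# Rank-sum weighting: convert ordinal ranks (1 = most important) to weights.
# Illustrative only; the criteria and ranks below are hypothetical.
def rank_sum_weights(ranks):
    n = len(ranks)
    scores = {c: n - r + 1 for c, r in ranks.items()}   # reverse the ranks
    total = sum(scores.values())
    return {c: s / total for c, s in scores.items()}    # normalize to sum to 1

ranks = {"cost": 3, "battery life": 1, "display": 2}     # 1 = most important
print(rank_sum_weights(ranks))                           # battery life gets 0.5, display 0.33, cost 0.17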
Selecting an appropriate weighting method is itself a difficult task in solving a multi-criteria decision problem. Several researchers have dismissed the difficulty of measuring criteria weights and assume that the importance of the criteria is familiar to all decision makers (Zardari et al., 2015). However, the validity of criteria weights obtained from different weighting methods cannot be ignored if misuse of MCDM models is to be avoided and reliable model results are to be obtained. MCDM methods can help to improve the quality of decisions by making the decision-making process more explicit, rational, and efficient (Arvind and Janpriy, 2018). The authors pointed out that multi-criteria decision making is regarded as a main part of modern decision science and operational research, involving multiple decision criteria and multiple decision alternatives. Several researchers have proposed different methods of determining the criteria weights of a multi-criteria decision-making problem (Ginevicius and Podvezko, 2005; Diakoulaki et al., 1995; Aldian and Taylor, 2005; Dragan et al., 2018). Amongst these is the weighted sum method (WSM), known to be the earliest and probably the most widely used method; it was later modified into the weighted product method (WPM) in order to overcome some of the gaps associated with it (a brief sketch of both scoring rules follows this paragraph).
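As an illustration of these two classical aggregation rules, the short sketch below scores alternatives with the WSM and the WPM. The values and weights are hypothetical and assume benefit criteria already normalized to a common scale; the functions are illustrative, not part of the original paper.

# Weighted sum (WSM) and weighted product (WPM) scores for normalized benefit criteria.
# Values are hypothetical and serve only to illustrate the two aggregation rules.
def wsm_score(values, weights):
    return sum(w * v for w, v in zip(weights, values))

def wpm_score(values, weights):
    score = 1.0
    for w, v in zip(weights, values):
        score *= v ** w              # each criterion value raised to its weight
    return score

weights = [0.4, 0.35, 0.25]          # criteria weights, summing to 1
alternatives = {"A1": [0.8, 0.6, 0.9], "A2": [0.7, 0.9, 0.5]}
for name, vals in alternatives.items():
    print(name, round(wsm_score(vals, weights), 3), round(wpm_score(vals, weights), 3))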

In 1977, Saaty proposed the analytic hierarchy process (AHP), which has since become one of the most popular methods in MCDM. Nowadays, modifications of the AHP, such as the fuzzy AHP method, are considered more prevalent than the original approach. However, some challenges surround the theoretical basis of the method. Nevertheless, it is easy to use and gives results that users expect; despite this ease of use, the procedure for processing the information obtained from the decision maker is difficult to ascertain, which makes the method less suitable for situations with many stakeholders. Moreover, in the AHP the number of pairwise comparisons increases rapidly with the number of criteria, which makes it cumbersome. Other commonly used methods are the ELECTRE, VIKOR and TOPSIS methods. ELECTRE (Elimination and Choice Translating Reality) was first introduced in 1968 by Bernard Roy to deal with outranking relations: it addresses the problem of ranking alternatives from best to worst by means of pairwise comparisons among alternatives, considering each criterion separately. Even with the outranking relationship, the decision maker may still take the risk of regarding one alternative as better than another; that is, the decision maker has a weak or strict preference for one of the alternatives and is sometimes unable to identify the most preferred alternative because of the difficulty of choosing one alternative over the other. However, the method is able to eliminate less favourable alternatives and is convenient for decision problems with few criteria and a large number of alternatives. In addition, the ELECTRE method consists of a pairwise comparison of alternatives based on the degree to which the evaluations of the alternatives and the preference weights confirm or contradict the pairwise dominance relationships between alternatives. It examines both the degree to which the preference weights are in agreement with pairwise dominance relationships and the degree to which the weighted evaluations differ from each other. These stages are based on "concordance and discordance" sets; hence, the method is also called concordance analysis.

The VIKOR method is a multi-criteria decision-making method originally developed by Serafim Opricovic to solve decision problems with conflicting and non-commensurable criteria, assuming that compromise is acceptable for conflict resolution (Arvind and Janpriy, 2018). The method focuses on ranking and selecting from a set of alternatives and determines the compromise solution closest to the ideal solution. Chatterjee et al. (2012) proposed a decision-making methodology for material selection using the compromise ranking method known as Vlse Kriterijumska Optimizacija Kompromisno Resenje (VIKOR), which means multi-criteria optimization and compromise solution.

The TOPSIS method is based on the technique for order preference by similarity to the ideal solution, proposed by Hwang and Yoon in 1980 to aid the material selection process (Xu, 2007). According to this technique, the best alternative is the one that is closest to the positive-ideal solution and farthest from the negative-ideal solution: the most preferred alternative should not only have the shortest distance from the positive-ideal solution, but also the farthest distance from the negative-ideal solution (Vinodh et al., 2014). The Euclidean distance is used to evaluate the relative closeness of the alternatives to the ideal solution, and the order of preference of the alternatives can then be obtained by a series of comparisons of these relative distances.
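To make the TOPSIS idea concrete, the sketch below computes the relative closeness of each alternative to the ideal solution using Euclidean distances. It is an illustration of the general technique only; the decision matrix, weights and criterion directions are hypothetical and are not taken from this paper.

import numpy as np

# Minimal TOPSIS sketch: vector-normalize, weight, find ideal/anti-ideal points,
# then rank alternatives by relative closeness. All inputs here are hypothetical.
X = np.array([[250.0, 16, 12], [220.0, 32, 10], [300.0, 64, 9]])  # alternatives x criteria
w = np.array([0.2, 0.5, 0.3])             # criteria weights (sum to 1)
benefit = np.array([False, True, True])   # False = cost criterion, True = benefit criterion

R = X / np.sqrt((X ** 2).sum(axis=0))     # vector normalization
V = R * w                                 # weighted normalized matrix
ideal = np.where(benefit, V.max(axis=0), V.min(axis=0))   # positive-ideal solution
anti = np.where(benefit, V.min(axis=0), V.max(axis=0))    # negative-ideal solution
d_pos = np.sqrt(((V - ideal) ** 2).sum(axis=1))           # Euclidean distances
d_neg = np.sqrt(((V - anti) ** 2).sum(axis=1))
closeness = d_neg / (d_pos + d_neg)       # closer to 1 = better alternative
print(closeness.round(3))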
The entropy method is used for assessing weights in a given problem because, with this method, the decision matrix for a set of candidate materials contains a certain amount of information. The entropy method works on a predefined decision matrix, and it makes it possible to combine the material designer's priorities with those of the sensitivity analysis. The TOPSIS method first converts the various criteria dimensions into non-dimensional criteria. When the designer finds no reason to give preference to one criterion over another, the principle of insufficient reason (Star and Greenwood, 1977) suggests that each one should be equally preferred. Modifications of TOPSIS have been proposed by Jahanshahloo (2006), Liu and Zeng (2008), Rao and Davim (2008), and Rao and Patel (2011).
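The entropy weighting idea described above can be summarized in a few lines: normalize each column of the decision matrix, compute the entropy of each criterion, and give larger weights to criteria with lower entropy (greater contrast among alternatives). The sketch below illustrates this standard construction; the decision matrix values are hypothetical and the code is not taken from this paper.

import numpy as np

# Entropy weighting sketch: criteria whose values differ more across alternatives
# (lower entropy) receive larger weights. Decision matrix values are hypothetical.
X = np.array([[250.0, 16, 12], [220.0, 32, 10], [300.0, 64, 9]])  # alternatives x criteria
m = X.shape[0]                                    # number of alternatives
P = X / X.sum(axis=0)                             # column-wise proportions
E = -(P * np.log(P)).sum(axis=0) / np.log(m)      # entropy of each criterion (0..1)
d = 1.0 - E                                       # degree of divergence
weights = d / d.sum()                             # objective criteria weights
print(weights.round(3))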

Weights assigned to criteria in multi-criteria evaluation involve both qualitative and quantitative data, so that the weights can be taken into account for better and more accurate decision making. However, assigning weights to criteria from qualitative data can be influenced by the decision maker's preference; because of this setback, Saaty (1977) proposed a numerical scale of 1-9 to transform qualitative data into quantitative data, with '1' denoting equal importance and '9' extreme importance (Abel et al., 2018). Weighting methods can also be grouped into three categories: subjective, objective, and integrated (combined) weighting approaches (Ginevicius and Podvezko, 2005). Subjective weight determination is based on expert opinion; to obtain the subjective judgments, the analyst normally presents the decision makers with a set of questions. However, subjective criteria weight determination is often time consuming, especially when there is no agreement between the decision makers on the problem under consideration. An example of a subjective weighting method is the analytic hierarchy process (AHP). Olson (2008) wrote on subjectivity in multiple criteria decision analysis; he argued that judgment is at the heart of human decision making and, therefore, considered judgment to be subjective. If a decision were to be made objectively, one should simply adopt the "decision support" view that human decision-makers should be entrusted with the final decision, and that every model is imperfect. Models do not include all factors, and even the most careful attempts at objective measurement will inevitably involve some inaccuracy. The author also pointed out that we must credit our own judgment as the paramount arbiter. In the objective weighting methods, criteria weights are derived from the information in each criterion through mathematical models, without any intervention by the decision maker (Aldian and Taylor, 2005). The integrated weighting approach is based on the combination of the subjective and objective weighting methods: it integrates the subjective weights, based on the expert's opinion, knowledge and experience in the relevant field, with the information gathered from the criteria data in mathematical form (the objective weighting method). In the subsequent sections, the mathematical function and specific examples of each of these methods are illustrated and evaluated.

Some of the most common weighting methods that have been used in previous MCDM studies are shown in Table 1.

Table 1: Classification of weighting methods
Subjective weighting methods: Point allocation; Direct rating; Ranking method; Pairwise comparison (AHP); Ratio method; Swing method; Delphi method; Nominal group technique; Simple Multi-attribute Ranking Technique (SMART)
Objective weighting methods: Entropy method; Criteria Importance Through Inter-criteria Correlation (CRITIC); Mean weight; Standard deviation; Statistical variance procedure; Ideal point method
Integrated weighting methods: Multiplication synthesis; Additive synthesis; Optimal weighting based on sum of squares; Optimal weighting based on relational coefficient of graduation

Subjective weighting methods: The most commonly used subjective weighting methods listed in Table 1 are as follows:

(1) The point allocation method: This is one of the simplest methods used to determine criteria weights. According to the priority of the criteria, the decision maker allocates a certain number of points to each criterion; the more points a criterion receives, the greater its relative importance (Golaszewski et al., 2012). In this scenario, the decision maker is asked to allocate 100 points across the criteria under consideration, so that the criterion weights sum to 100. The method is easy to normalize; however, the weights obtained from it are not very precise, and the method becomes more difficult as the number of criteria increases to six or more. For example, consider five key quality characteristics of a smart phone that one should look out for: cost, display resolution, battery life, random access memory, and internal storage (Table 2).

Table 2: Smart phone criteria weights using the point allocation method
S/N  Criteria                     Weights
1    Cost                         10
2    Display Resolution           35
3    Battery Life                 15
4    Random Access Memory (RAM)   25
5    Internal Storage             15
     Total                        100

(2) The direct rating method: In this approach the decision maker first ranks all the criteria according to their importance. The rating does not constrain the decision maker's responses as fixed-point scoring methods do, so it is possible to alter the importance of one criterion without adjusting the weight of another (Arbel, 1989).

(3) The pairwise comparison method: This method is used for analyzing multiple populations in pairs to determine whether they are significantly different from one another.
It can also be described as a method in which the decision maker compares each criterion with the others and determines the level of preference for each pair of criteria. An ordinal scale (1-9) is adopted to help determine the preference value of one criterion against another, and one of the most commonly applied methods based on pairwise comparisons is the analytic hierarchy process (AHP). The number of comparisons is given by

c_p = n(n - 1)/2     (1)

where c_p is the number of comparisons and n is the number of criteria.
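For the five smart-phone criteria considered here, equation (1) gives c_p = 5(5 - 1)/2 = 10 comparisons; the snippet below simply restates this arithmetic (the variable names are illustrative).

# Number of pairwise comparisons required for n criteria, equation (1): c_p = n(n - 1)/2
n = 5                    # the five smart-phone criteria of the example
c_p = n * (n - 1) // 2   # 10 comparisons
print(c_p)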

Determining the criteria weights by the pairwise comparison method involves three main steps, implemented as follows.

The first step is to develop a matrix by comparing the criteria, as shown in Table 4. Intensity values of 1, 3, 5, 7 and 9 are used to fill the matrix, representing equal importance, moderate importance of one over the other, strong importance, very strong importance and extreme importance, respectively. The intermediate values 2, 4, 6 and 8 are used when compromise is needed: equally to moderately preferred (2), moderately to strongly preferred (4), strongly to very strongly preferred (6), and very strongly to extremely strongly preferred (8). The diagonal of the matrix is always 1, and the lower-left values are reciprocals: if criterion i is assigned one of the above numbers when compared with criterion j, then j has the reciprocal value when compared with i. Filling the lower triangle with the reciprocals of the upper triangle gives the complete comparison matrix.

The second step is to calculate the criteria weights, also known as the priority values or the principal eigenvector. This is done by either of the following methods.

Method 1: Sum the values in each column, divide each element by its column total, and divide the sum of the normalized scores in each row by the number of criteria, as shown in the example of Table 4. The column totals are 22, 4.33, 3.44, 5.70 and 5.25, so the priority value for the first row is

(1/22 + (1/3)/4.33 + (1/9)/3.44 + (1/5)/5.70 + (1/4)/5.25) / 5 ≈ 0.05     (2)

Method 2: Multiply together the entries in each row of the matrix and take the nth root of the product; this gives a very good approximation. The nth roots are summed, and that sum is used to normalize the eigenvector elements so that they add up to 1.00. In the example of Table 4 the number of criteria is five, so the fifth root of the first row product is 0.283, which is divided by 5.88 to give 0.05 as the first criterion weight.

The third step is to estimate the consistency of the judgments, expressed as the consistency ratio (CR). If the consistency ratio is less than 0.1, the pairwise comparisons are reasonably consistent; if the CR is greater than 0.1, the pairwise comparisons are inconsistent in judgment. Sensitivity analysis of this kind is useful for assessing the robustness of a decision. The consistency ratio is computed as follows: (a) Multiply each value in the first row of the pairwise comparison matrix by the corresponding criterion weight (eigenvector element) and sum the products; for the first row this gives

1(0.05) + (1/3)(0.20) + (1/9)(0.33) + (1/5)(0.22) + (1/4)(0.20) = 0.246     (3)

(b) Repeat step (a) for the remaining rows; these give 1.100, 1.840, 1.179 and 1.040, so that (0.246, 1.100, 1.840, 1.179, 1.040) is the vector of weighted sums. (c) Divide each element of the vector of weighted sums obtained in steps (a)-(b) by the corresponding priority value: 0.246/0.05 = 4.926 for the first row, and the other values are 5.500, 5.576, 5.359 and 5.200. (d) Compute the average of the values found in step (c) and let λ_max be this average.
The mean of these values is 5.312, which is the estimate of λ_max. If any of the estimates of λ_max turns out to be less than n (5 in this case), there has been an error in the calculation. (e) Compute the consistency index (CI), defined as

CI = (λ_max - n)/(n - 1)     (4)

(f) Obtain the random index (RI) from Table 3, which is derived from Saaty's book and gives, for large samples of randomly generated judgment matrices, the corresponding consistency index of random judgments; the upper row is the order of the random matrix and the lower row is the corresponding random index.

Table 3: Random Index
Order  1     2     3     4     5     6     7     8     9     10    11    12    13    14    15
RI     0.00  0.00  0.58  0.90  1.12  1.24  1.32  1.41  1.45  1.49  1.51  1.54  1.56  1.57  1.59
Source: Saaty (1980)

(g) Finally, compute the consistency ratio as CR = CI/RI. In this example CI = 0.078 and RI = 1.12 (the random index for a matrix of order 5), so

CR = CI/RI = 0.078/1.12 ≈ 0.07     (5)

The matrix is accepted if the consistency ratio is less than 0.1 (10%). Higher values indicate that the comparisons are less consistent, while smaller values mean the comparisons are more consistent; a CR above 0.1 (10%) indicates that the pairwise comparisons should be revisited or revised (Setiawan et al., 2014). The comparison matrix, nth roots and priority values for the smart-phone example are summarized in Table 4 (a short computational sketch of these steps follows the table).

Table 4: Pairwise comparison matrix for the criteria and consistency ratio
Criteria                     RD    BL    RAM   INS   C     nth root of product   Priority
Resolution Display (RD)      1     1/3   1/9   1/5   1/4   0.283                 0.05
Battery Life (BL)            3     1     1     1     1     1.246                 0.20
Random Access Memory (RAM)   9     1     1     3     1     1.933                 0.33
Internal Storage (INS)       5     1     1/3   1     2     1.270                 0.22
Cost (C)                     4     1     1     1/2   1     1.149                 0.20
Totals                       22    4.33  3.44  5.70  5.25  5.880                 1.00
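The following short sketch (illustrative, not part of the original paper) reproduces these steps numerically for the Table 4 matrix using the geometric-mean method; small differences from the rounded values quoted in the text are due to rounding.

import numpy as np

# Priority vector and consistency check for the Table 4 pairwise comparison matrix.
# Row/column order: RD, BL, RAM, INS, C. Illustrative reproduction of the steps above.
A = np.array([
    [1.0, 1/3, 1/9, 1/5, 1/4],
    [3.0, 1.0, 1.0, 1.0, 1.0],
    [9.0, 1.0, 1.0, 3.0, 1.0],
    [5.0, 1.0, 1/3, 1.0, 2.0],
    [4.0, 1.0, 1.0, 1/2, 1.0],
])
n = A.shape[0]

g = A.prod(axis=1) ** (1.0 / n)   # nth root of each row product (Method 2)
w = g / g.sum()                   # priority vector; approx. (0.05, 0.21, 0.33, 0.22, 0.20)

lam = (A @ w / w).mean()          # estimate of lambda_max (about 5.3)
CI = (lam - n) / (n - 1)          # consistency index, equation (4)
RI = 1.12                         # random index for n = 5 (Table 3)
CR = CI / RI                      # consistency ratio, equation (5); accept if < 0.1
print(w.round(3), round(CR, 3))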

(4) Ranking method: This is one of the simplest approaches to assigning criteria weights. The criteria are usually ranked