Quantifying explainable discrimination and removing illegal discrimination in automated decision making

  • Regular Paper
  • Published in: Knowledge and Information Systems

Abstract

Recently, the following discrimination-aware classification problem was introduced: historical data used for supervised learning may contain discrimination, for instance with respect to gender. The question addressed by discrimination-aware techniques is, given a sensitive attribute, how to train classifiers on such historical data that are discrimination-free with respect to that attribute. Existing techniques that deal with this problem aim at removing all discrimination and do not take into account that part of the discrimination may be explainable by other attributes. For example, in a job application setting, the education level of a candidate could be such an explainable attribute. If the data contain many highly educated male candidates and only few highly educated female candidates, a difference in acceptance rates between women and men does not necessarily reflect gender discrimination, as it may be explained by the difference in education levels. Even though selecting on education level would result in more males being accepted, a difference with respect to such a criterion would be considered neither undesirable nor illegal. Current state-of-the-art techniques, however, do not take such gender-neutral explanations into account; as we show in this paper, they tend to overreact and actually start reverse discriminating. Therefore, we introduce and analyze the refined notion of conditional non-discrimination in classifier design. We show that some of the differences in decisions across the sensitive groups can be explainable and are hence tolerable. We then develop a methodology for quantifying the explainable discrimination and algorithmic techniques for removing the illegal discrimination when one or more attributes are considered explanatory. Experimental evaluation on synthetic and real-world classification datasets demonstrates that the new techniques are superior in this setting: they remove almost exclusively the undesirable discrimination, while leaving the explainable differences intact.
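To make the distinction concrete, the explainable part of the gap in acceptance rates can be expressed by conditioning on the explanatory attribute. The following is a sketch of the general idea in the shorthand of the notes below, not a verbatim reproduction of the paper's formulas: with sensitive attribute \(s \in \{m, f\}\), decision \(y \in \{+, -\}\) and explanatory attribute \(e\) taking values \(e_i\), the total difference is \(D_{all} = P(+|s=m) - P(+|s=f)\). Applying one group-neutral acceptance rate \(P^\star(+|e_i)\), for instance the average of the two within-group rates, to both groups gives the explainable part \(D_{expl} = \sum_i \big(P(e_i|s=m) - P(e_i|s=f)\big)\, P^\star(+|e_i)\), and the remainder \(D_{illegal} = D_{all} - D_{expl}\) is the part that the techniques in the paper aim to remove.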

Notes

  1. This model does not express our belief about how admission procedures actually work; we use it for illustration only.

  2. Shorthand notation for probabilities: \(P(+|e_i)\) means \(P(y=+|e=e_i)\); the same shorthand is used in the estimation sketch below.
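
The quantities \(D_{all}\), \(D_{expl}\) and \(D_{illegal}\) sketched after the abstract can be estimated from a labelled dataset in a few lines. The following is a minimal sketch, not the authors' implementation; the column names s, e, y and the values m, f, + are hypothetical and mirror the shorthand of note 2:

    import pandas as pd

    def split_discrimination(df, sensitive="s", explanatory="e", label="y",
                             favoured="m", protected="f", positive="+"):
        """Split the acceptance-rate gap into an explainable and a remaining part."""
        fav = df[df[sensitive] == favoured]
        prot = df[df[sensitive] == protected]

        # D_all: overall difference in acceptance rates between the two groups.
        d_all = (fav[label] == positive).mean() - (prot[label] == positive).mean()

        d_expl = 0.0
        for e_i, stratum in df.groupby(explanatory):
            # Group-neutral acceptance rate P*(+|e_i): average of the available
            # within-group rates, so neither group's historical rate dominates.
            rates = [(part[label] == positive).mean()
                     for part in (stratum[stratum[sensitive] == favoured],
                                  stratum[stratum[sensitive] == protected])
                     if len(part) > 0]
            p_star = sum(rates) / len(rates)
            # Weight P*(+|e_i) by how often each group falls into stratum e_i.
            d_expl += ((fav[explanatory] == e_i).mean()
                       - (prot[explanatory] == e_i).mean()) * p_star

        return d_all, d_expl, d_all - d_expl

On a toy admissions table, a value of d_all - d_expl close to zero means the remaining gap is accounted for by the explanatory attribute, while a clearly positive value indicates a difference that is not explainable and, in the terminology of the paper, should be removed.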

References

  1. Ahearn T (2010) Discrimination lawsuit shows importance of employer policy on the use of criminal records during background checks. via: http://www.esrcheck.com/wordpress/2010/04/12/

  2. Asuncion A, Newman D (2007) UCI machine learning repository. Online http://archive.ics.uci.edu/ml/

  3. Attorney-General’s Department, Australia (1984) Australian Sex Discrimination Act 1984. via: http://www.comlaw.gov.au/Details/C2010C00056

  4. Becker G (1971) The economics of discrimination. University of Chicago Press, Chicago

  5. Bickel P, Hammel E, O’Connell J (1975) Sex bias in graduate admissions: data from Berkeley. Science 187(4175):398–404

  6. Calders T, Kamiran F, Pechenizkiy M (2009) Building classifiers with independency constraints. In: IEEE ICDM workshop on domain driven data mining (DDDM’09), pp 13–18

  7. Calders T, Verwer S (2010) Three naive Bayes approaches for discrimination-free classification. Data Min Knowl Discov 21(2):277–292

  8. Chan PK, Stolfo SJ (1998) Toward scalable learning with non-uniform class and cost distributions: a case study in credit card fraud detection. In: Proceedings of ACM SIGKDD conference on knowledge discovery and data mining (KDD’98), pp 164–168

  9. Chawla N, Hall L, Joshi A (2005) Wrapper-based computation and evaluation of sampling methods for imbalanced datasets. In: Proceedings of the 1st international workshop on Utility-based data mining, pp 24–33

  10. Chawla NV, Bowyer KW, Hall LO, Kegelmeyer WP (2002) Smote: synthetic minority over-sampling technique. J Artif Intell Res 16:321–357

  11. Collard D (1972) The economics of discrimination. Econ J 82(326):788–790

  12. Dedman B (1988) The color of money: Atlanta blacks losing in home loans scramble: banks favor white areas by 5–1 margin. Atlanta J Const

  13. Dewey D (1958) The economics of discrimination. South Econ J 24(4):494–496

  14. Domingos P (1999) MetaCost: a general method for making classifiers cost-sensitive. In: Proceedings of ACM SIGKDD conference on knowledge discovery and data mining (KDD’99), pp 155–164

  15. Dutch Central Bureau for Statistics (2001) Volkstelling

  16. Elkan C (2001) The foundations of cost-sensitive learning. In: Proceedings of the 17th international joint conference on artificial intelligence (IJCAI’01), pp 973–978

  17. Ellis E (2005) EU anti-discrimination law. Oxford University Press, Oxford

  18. European Network Against Racism (1998). via: http://www.enar-eu.org/

  19. European Union Legislation (2012) via: http://europa.eu/legislation_summaries/index_en.htm

  20. Hajian S, Domingo-Ferrer J, Martinez-Balleste A (2011) Discrimination prevention in data mining for intrusion and crime detection. In: IEEE symposium on computational intelligence in cyber security (CICS). IEEE, pp 47–54

  21. Hajian S, Domingo-Ferrer J, Martínez-Ballesté A (2011) Rule protection for indirect discrimination prevention in data mining. Model Dec Artif Intell 6820:211–222

  22. Hart M (2005) Subjective decisionmaking and unconscious discrimination. Alabama Law Rev 56:741

  23. Kamiran F, Calders T (2009) Classifying without discriminating. In: Proceedings of the 2nd international conference on computer, control and, communication (IC4), pp 1–6

  24. Kamiran F, Calders T (2010) Classification with no discrimination by preferential sampling. In: Proceedings of the 19th annual machine learning conference of Belgium and the Netherlands (BENELEARN’10), pp 1–6

  25. Kamiran F, Calders T (2012) Data preprocessing techniques for classification without discrimination. Knowl Inf Syst 33:1–33

  26. Kamiran F, Calders T, Pechenizkiy M (2010) Discrimination aware decision tree learning. In: Proceedings of IEEE international conference on data mining (ICDM), pp 869–874

  27. Kohavi R, John GH (1997) Wrappers for feature subset selection. Artif Intell 97(1–2):273–324

  28. Koknar-Tezel S, Latecki L (2010) Improving SVM classification on imbalanced time series data sets with ghost points. Knowl Inf Syst 24(2):1–23

  29. Krueger A (1963) The economics of discrimination. J Polit Econ 71(5):481–486

  30. Luong B, Ruggieri S, Turini F (2011) k-nn as an implementation of situation testing for discrimination discovery and prevention. Technical Report TR-11-04, Dipartimento di Informatica, Universita di Pisa

  31. Margineantu D, Dietterich T (1999) Learning decision trees for loss minimization in multi-class problems. Technical report, Department of Computer Science, Oregon State University

  32. Pedreschi D, Ruggieri S, Turini F (2008) Discrimination-aware data mining. In: Proceedings of ACM SIGKDD conference on knowledge discovery and data mining (KDD’08)

  33. Pedreschi D, Ruggieri S, Turini F (2009) Measuring discrimination in socially-sensitive decision records. In: Proceedings of the SIAM international conference on data mining (SDM’09), pp 581–592

  34. Reder M (1958) The economics of discrimination. Am Econ Rev 48(3):495–500

  35. Ruggieri S, Pedreschi D, Turini F (2010) DCUBE: discrimination discovery in databases. In: Proceedings of the ACM SIGMOD international conference on management of data (SIGMOD’10). ACM, pp 1127–1130

  36. Ruggieri S, Pedreschi D, Turini F (2010) Integrating induction and deduction for finding evidence of discrimination. Artif Intell Law 18:1–43

  37. Sawhill I (1973) The economics of discrimination against women: some new findings. J Human Res 8(3):383–396

  38. Simpson EH (1951) The interpretation of interaction in contingency tables. J R Stat Soc 13:238–241

  39. US Department of Justice (2011) US federal legislation. via: http://www.justice.gov/crt

  40. Turney P (2000) Cost-sensitive learning bibliography. Institute for Information Technology, National Research Council, Ottawa, Canada

  41. United Kingdom Legislation (2012). via: http://www.legislation.gov.uk/

  42. US Civil Rights Act (2006). via: http://finduslaw.com/

  43. US Department of Justice (1974) US Equal Credit Opportunity Act. via: http://www.fdic.gov/regulations/laws/rules/6500-1200.html

  44. US Equal Employment Opportunity Commission (1963) US Equal Pay Act. via: http://www.eeoc.gov/laws/statutes/epa.cfm

  45. US Fair Housing Act (1968). via: http://www.justice.gov/crt/about/hce/

  46. Wang B, Japkowicz N (2009) Boosting support vector machines for imbalanced data sets. Knowl Inf Syst, pp 1–20

  47. Zliobaite I, Kamiran F, Calders T (2011) Handling conditional discrimination. In: Proceedings of IEEE international conference on data mining (ICDM’11), pp 992–1001

Author information

Corresponding author

Correspondence to Faisal Kamiran.

Additional information

A short version of this paper appeared in ICDM’11 [47].

About this article

Cite this article

Kamiran, F., Žliobaitė, I. & Calders, T. Quantifying explainable discrimination and removing illegal discrimination in automated decision making. Knowl Inf Syst 35, 613–644 (2013). https://doi.org/10.1007/s10115-012-0584-8
