Better Decision Support through Exploratory Discrimination-Aware Data Mining: Foundations and Empirical Evidence

Decision makers in banking, insurance, or employment mitigate many of their risks by telling “good” individuals and “bad” individuals apart. Laws codify societal understandings of which factors are legitimate grounds for differential treatment (and when and in which contexts), and which constitute unfair discrimination, such as gender, ethnicity, or age. Discrimination-aware data mining (DADM) implements the hope that information technology supporting the decision process can also keep it free from unjust grounds. However, constraining data mining to exclude a fixed enumeration of potentially discriminatory features is insufficient. We argue for complementing it with exploratory DADM, in which discriminatory patterns are discovered and flagged rather than suppressed. This article discusses the relative merits of constraint-oriented and exploratory DADM from a conceptual viewpoint. In addition, we consider the case of loan applications to empirically assess the fitness of both discrimination-aware data mining approaches for two of their typical usage scenarios: prevention and detection. Using Mechanical Turk, 215 US-based participants were randomly placed in the role of a bank clerk (discrimination prevention) or a citizen / policy advisor (discrimination detection). They were tasked with recommending or predicting the approval or denial of a loan across three experimental conditions: discrimination-unaware data mining, exploratory DADM (eDADM), and constraint-oriented DADM (cDADM). The discrimination-aware tool support in the eDADM and cDADM treatments led to significantly higher proportions of correct decisions, which were also motivated more accurately. There is significant evidence that the relative advantage of discrimination-aware techniques depends on their intended usage. For users focussed on making and motivating their decisions in non-discriminatory ways, cDADM resulted in more accurate and less discriminatory results than eDADM. For users focussed on monitoring for and detecting discriminatory decisions, and on motivating these conclusions, eDADM yielded more accurate results than cDADM.

Focus: Data Set
Source: Artificial Intelligence and Law
Readability: Expert
Type: PDF Article
Open Source: No
Keywords: Discrimination discovery and prevention, Data mining for decision support, Discrimination-aware data mining, Responsible data mining, Evaluation, User studies, Online experiment, Mechanical Turk
Learn Tags: Bias, Design/Methods, Ethics, Fairness
Summary: This paper describes an exploratory study that compares different discrimination-aware data mining (DADM) methods to determine how information technology can support the decision process while keeping it free from bias.