By
Gregory L. Brown, MSUS
And
Elaine Sutton, Ph.D.
Brown and Associates Consulting Services
• Founded in 2008
• Gregory L. Brown, MSUS, President
• Elaine Sutton, Ph.D., Senior Research Associate
• Expertise:
– Strategic thinking and planning
– Project and program development and management
– Organizational development and leadership
– Community building and organizing
– Meeting facilitation
– Public policy and advocacy
– Governmental relations
– Survey research, data analysis and evaluation
Brown and Associates Consulting Services
• Client List
– The Cleveland Foundation
– Cleveland State University Levin College of Urban Affairs
– Prevention Research Center for Healthy Neighborhoods, CWRU Medical
School
– Burten, Bell, Carr Development, Inc.
– Environmental Health Watch
– Faith Community Credit Union
– Cuyahoga County Board of Health
– African American Collaborative Obesity Network (AACORN), Drexel University
Judge4Yourself.com Overview
Judge4Yourself.com Data Collected and Analyzed
• May 8, 2018 Primary Judicial Ratings – Cuyahoga County Court of Common Pleas –
General Division
• 2017 Judicial Ratings – Municipal Courts for Cleveland, South Euclid, Lyndhurst and Cleveland Heights
• Ratings for Ohio Supreme Court Justice, 2016
• Ratings for Judicial Candidates 2016, Local Elections - Court of Common Pleas – General
Division – Slot 1
• Ratings for Judicial Candidates 2016, Local Elections - Court of Common Pleas – General
Division – Slot 2
• Ratings for Judicial Candidates 2016, Local Elections - Court of Common Pleas – General
Division – Slot 3
• Ratings for Judicial Candidates 2016, Local Elections - Court of Common Pleas –
Domestic Relations – Slot 1
• Ratings for Judicial Candidates 2016, Local Elections - Court of Common Pleas –
Domestic Relations – Slot 2
• Ratings for Judicial Candidates 2016, Local Elections - Court of Common Pleas –
Domestic Relations – Slot 3
Judge4Yourself.com Data Analysis Results
• Of the nine sources of data collected for analysis, only four had data to compare
African and Caucasian American candidate rating scores.
• In all four data sources used for comparison, regardless of the judicial position being sought, African American candidates received a lower overall mean rating score than their Caucasian counterparts.
– Court of Common Pleas Primary Election, 2018 – Caucasian Candidates, 2.6,
versus African American Candidates, 1.63
– Municipal Courts (Cleveland, South Euclid, Lyndhurst, Cleveland Heights), 2017 – Caucasian Candidates, 3.1, versus African American Candidates, 1.7
– Court of Common Pleas General Division-Slot 3, 2016 – Caucasian Candidates,
4.0, versus African American Candidates, 2.25
– Court of Common Pleas Domestic Relations-Slot 3, 2016 – Caucasian
Candidates, 3.75, versus African American Candidates, 3.0
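The four comparisons above can be aggregated with a short script. The numbers below are the mean rating scores reported on this slide; averaging them across sources is illustrative only, since the underlying candidate-level data were not available to us.

```python
# Mean rating scores by group from the four comparable data sources above.
ratings = {
    "Common Pleas Primary, 2018":          {"Caucasian": 2.6,  "African American": 1.63},
    "Municipal Courts, 2017":              {"Caucasian": 3.1,  "African American": 1.7},
    "Common Pleas Gen. Div. Slot 3, 2016": {"Caucasian": 4.0,  "African American": 2.25},
    "Common Pleas Dom. Rel. Slot 3, 2016": {"Caucasian": 3.75, "African American": 3.0},
}

# Average the reported mean scores across the four sources for each group.
for group in ("Caucasian", "African American"):
    scores = [source[group] for source in ratings.values()]
    print(f"{group}: {sum(scores) / len(scores):.2f}")
```

In every one of the four sources the African American mean is lower, which is the pattern the findings below refer to.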
Data Collection and Analysis Findings
• Finding - Based on the limited amount of data used for this analysis, the consistently lower mean rating scores received by African American judicial candidates are a cause for concern.
Recommendation
• There are a number of questions that need to be considered and addressed by impartial
research and investigation to ascertain if the ratings process is fair and just for all
candidates.
• Some of the questions to consider are:
– Are all African American candidates seeking judicial positions in Cuyahoga County
unprepared and unqualified to serve? If so, why are they determined to be unprepared
and unqualified?
– Is the rating system and process being used biased?
– Is the rating system and process objective?
– Are the criteria and process too subjective, allowing each person in the rating process too much latitude in assigning rating scores and deciding who will be recommended and who will not?
– Are there other criteria or factors, beyond those currently in use, that might be better suited to rating candidates and would provide a more equitable basis for decision-making?
Analysis of Candidate Questionnaire
Judicial Candidate Questionnaire
• The Judicial Candidate Questionnaire is used to capture primarily qualitative and some quantitative
data about prospective judicial candidates. The questionnaire seeks general information about the
following:
• Candidates’ educational background;
• Military service;
• Work history and professional admissions;
• Honors and awards;
• Bar association and other memberships;
• Written work products;
• Judicial experience;
• Public office;
• Law practice;
• Litigation experience;
• Other law related activities;
• Potential conflicts of interests;
• Contacts;
• Past conduct;
• Health;
• Goals and qualifications.
Analysis of the Candidate Questionnaire
• Based on our review of the Judicial Candidate Questionnaire and rating process; discussions with current African American Cleveland Municipal Court judges; and discussions with bar member representatives who have participated in the candidate rating process, we gained insights into the use of the questionnaire and the corresponding rating process. These insights include:
• Candidate questionnaire responses are used to determine the questions candidates will be
asked.
• Any bar association member representing a bar membership group comprising
Judge4Yourself.com can ask questions of any candidate.
• The questions asked of each candidate are not uniform.
• Each bar association group comprising Judge4Yourself.com convenes to discuss each
candidate’s questionnaire and interview responses.
• Each bar association, based on its discussions, develops a rating score for each candidate.
• All bar associations reconvene to discuss their scores.
• The scores of each bar association are totaled, and an average rating score is determined.
• Based on the average rating score, candidates are determined to be excellent, good, adequate, or not recommended.
• Only those candidates who are not interviewed receive a rating of “refused to participate.”
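The scoring steps described above can be sketched in a few lines of code. The numeric cutoffs for the excellent/good/adequate/not recommended categories are hypothetical placeholders; the coalition’s actual thresholds are not given in this document.

```python
def overall_rating(bar_scores, interviewed=True):
    """Average the individual bar association scores for one candidate and
    map the average to a rating category. The cutoffs below are illustrative
    assumptions, not Judge4Yourself.com's actual thresholds."""
    if not interviewed:
        # Candidates who are not interviewed receive this rating.
        return "refused to participate"
    average = sum(bar_scores) / len(bar_scores)
    if average >= 3.5:
        return "excellent"
    if average >= 2.5:
        return "good"
    if average >= 1.5:
        return "adequate"
    return "not recommended"

# Example: four bar associations each submit a score for one candidate.
print(overall_rating([4, 3, 4, 3]))  # average 3.5 -> "excellent"
```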
Findings
• Finding #2 - The candidate review and assessment process is much more subjective than objective, which directly influences rating scores and final determinations.
– Each bar association has the latitude to craft questions for candidates based on
their assessment of the candidate’s questionnaire responses to specific questions.
– As a result, the questions asked of candidates in the interview portion of the rating process are not uniform.
– Because the questions are not uniform, those deciding rating scores can be influenced by their own subjectivity and that of other bar members, leading to biased results.
– We were unable to discern if there is a corresponding rating instrument that allows
raters to quantify numerically the data collected in a uniform, non-subjective and
unbiased manner.
– Such an instrument would pool the qualitative data collected and code it for statistical analysis.
– This conversion would make the rating process more objective.
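The kind of rating instrument described in this finding could, in outline, have every rater score the same fixed items on the same numeric scale, so responses can be pooled and analyzed statistically. The items and the 0-4 scale below are hypothetical; they are meant only to illustrate converting qualitative judgments into comparable numbers.

```python
# Hypothetical rubric: every rater scores the same items on the same 0-4 scale.
RUBRIC_ITEMS = [
    "judicial experience",
    "litigation experience",
    "professional conduct",
    "community involvement",
]

def code_responses(rater_scores):
    """Validate one rater's per-item scores and pool them into a single number."""
    for item in RUBRIC_ITEMS:
        score = rater_scores[item]
        if not 0 <= score <= 4:
            raise ValueError(f"{item}: score must be on the 0-4 scale")
    return sum(rater_scores[item] for item in RUBRIC_ITEMS) / len(RUBRIC_ITEMS)

# Because every rater answers the same items on the same scale, the pooled
# numbers are comparable across raters and candidates.
print(code_responses({
    "judicial experience": 3,
    "litigation experience": 4,
    "professional conduct": 4,
    "community involvement": 2,
}))  # (3 + 4 + 4 + 2) / 4 = 3.25
```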
Findings
• Finding #6 - Based on the data analysis conducted of rating scores given to African and Caucasian American judicial candidates, and on our review of the candidate questionnaire and rating process, we see a combination of subjective decision-making and implicit cultural bias in the candidate rating system and process.
– Implicit bias occurs when someone consciously rejects stereotypes, but also
holds negative associations in his/her mind unconsciously.
– Implicit bias does not mean that people are hiding their racial prejudices.
– People literally do not know that they have them.
– Yet, researchers have concluded that the majority of Americans hold some
degree of implicit bias.
– Cultural bias is interpreting and making judgments based on standards inherent to one’s own culture.
– The data collected and analyzed, though it was a limited sample, indicated
that African American candidates consistently receive lower rating scores than
their Caucasian counterparts.
Recommendations
• To make the data collection elements of the rating process more objective, the
creation of an instrument that translates qualitative data into quantitative data is
recommended.
• We note that one possible explanation for the consistently lower rating scores of African American candidates is that they are unqualified and unprepared to hold the positions they seek.
• Because of this possible explanation, we strongly recommend that additional research be conducted by an independent researcher to vet this question fully for the benefit of all involved.
• We suggest this research as an immediate remedy to determine whether the cultural bias is a result of implicit bias among those engaged in the rating process.
Thank You