
Public Opinion Quarterly, Vol. 72, No. 4, Winter 2008, pp. 792–803

WHO PARTICIPATES IN THE PUBLIC SQUARE AND DOES IT MATTER?

ROBERT KIRBY GOIDEL
CRAIG MALCOLM FREEMAN
STEVEN PROCOPIO
CHARLES F. ZEWE

Abstract Survey research has been frequently criticized for reflecting hastily drawn and poorly formed responses as opposed to more deeply held attitudes or opinions. James Fishkin (1991), for example, has argued that public opinion surveys miss the normatively and substantively important deliberative component of public opinion formation. In this paper, we consider two questions relative to deliberative public opinion. First, who shows up for deliberative opinion forums? And second, what difference does their participation make in terms of their general attitudes toward the political process? To answer these questions, we make use of a unique set of data collected as part of a series of monthly television programs, Louisiana Public Square, which aired on Louisiana Public Broadcasting from June 2004 to March 2005. These programs covered a range of issues (e.g., public education, roads and transportation, health care, religion, and public life) and included participants selected using random digit dialing. Each month, participants learned about the issues, discussed the issues with a trained moderator, and directed questions to relevant state policy makers. Data were collected on relevant attitudes both before and after the program, allowing us to (1) compare attitudinal and demographic differences among participants (preshow and postshow) and nonparticipants (preshow only), and (2) analyze attitude change among participants, particularly with respect to levels of trust in government and perceptions of the responsiveness of the political process to public concerns. According to our results, the socioeconomic biases that predict other forms of participation are equally present when considering participation in a deliberative forum. Unlike other forms of participation, however, the deliberative forums considered in the present analysis attracted more ideologically moderate participants who valued the role of discussion in democratic governance.

Robert Kirby Goidel, Craig Malcolm Freeman, Steven Procopio, and Charles F. Zewe are with the Manship School of Mass Communication, Louisiana State University, Baton Rouge, LA 70803. An earlier version of this paper was presented at the 2005 annual meeting of the American Association for Public Opinion Research, May 12–15, Miami Beach, FL. The authors wish to thank Adrienne Moore, the Director of the Reilly Center for Media & Public Affairs, for her generous support of research about mass communication and its many-faceted relationships with social, economic, and political issues. The authors also thank Beth Courtney, Clay Fourrier, Al Godoy, Kevin Gautreaux, and the rest of the Louisiana Public Square team for collaborating on this project. Address correspondence to Robert Kirby Goidel; e-mail: kgoidel@lsu.edu.

doi:10.1093/poq/nfn043
Advance Access publication October 17, 2008
© The Author 2008. Published by Oxford University Press on behalf of the American Association for Public Opinion Research. All rights reserved. For permissions, please e-mail: journals.permissions@oxfordjournals.org
One of the central concerns with public opinion surveys as a methodology has been that they do not allow for considered opinion. In his textbook treatment of the subject, Asher (2004) cautions that much of what passes for opinion is really "nonattitudes," as respondents make up their opinions on the spot. Likewise, Zaller and Feldman's (1992) theory of survey response is predicated on the assumption that most survey respondents do not have well-formed opinions at the level of specificity implied by survey questions. Such critiques, embodied most famously in the work of Converse (1964), have formed the basis of deliberative opinion polling. As Luskin, Fishkin, and Jowell (2002, p. 456) surmise:

Now, several decades' experience the wiser, we know that public opinion polls . . . have not been the great boon to democracy that Gallup envisioned. The problem is not mainly that the technology is sometimes abused (as in "push polls") or shoddily implemented, although of course it sometimes is. Nor is it just that the advent of polling as scorekeeping has been part of the long descent of campaign coverage toward sports coverage. The most fundamental problem is that not many of the respondents answering any given question have very well considered or informed opinions about the issue.

Unlike Converse, however, proponents of deliberative polling see shortcomings in public opinion as more reflective of method than respondent and are philosophically more closely aligned with deliberative democratic theorists (Habermas 1989, 1997; Gutmann and Thompson 1996; Page 1996; Chambers 2003). Seen in this light, deliberation can enlighten participants, narrow information gaps, and bring respondents closer to the democratic ideal. Deliberative opinion polling likewise attempts to compensate for shortcomings in traditional polling by focusing on changes in attitudes at both the individual and aggregate levels through the use of probability sampling and an informed discussion of leading issues (Fishkin 1995; Luskin, Fishkin, and Jowell 2002; Sturgis 2003). According to the theory, increased deliberation should produce insight into what may be the true voice of the people if that voice were fully informed (Fishkin 1995; Sturgis, Roberts, and Allum 2005).
Regardless of theoretical orientation, research on the effects of deliberative polling has reached a decidedly mixed verdict. Fishkin and others have

generally found positive effects for both political learning and attitude change (Merkle 1996; Fishkin and Luskin 1999; Luskin, Fishkin, and Jowell 2002; Brady, Fishkin, and Luskin 2003), while Gastil and Dillard (1999) found improved political sophistication but negligible attitude change. Not all the work, however, has been so positive. Denver, Hands, and Jones (1995) found little attitude change, and recent research by Jackman and Sniderman (2006) and Sturgis, Roberts, and Allum (2005) likewise finds little evidence that deliberation leads to greater ideological consistency.

Data
For this analysis, we rely on data collected as part of the Louisiana Public Square television program broadcast each month on Louisiana Public Broadcasting. We include data from eight episodes from June 2004 through March 2005. Not included are the October data, which were part of the national By the People project sponsored by PBS, and the December episode, for which no postshow data were collected.
Louisiana Public Square is based on the idea of creating an interactive dialog between an audience recruited through random digit dialing and state policy makers. Each month, the Public Policy Research Lab at Louisiana State University recruits roughly 40–60 respondents who agree to participate in the monthly program. Respondents who agree to participate in the program are given a brief (10–12 questions) preshow survey including standard demographic items, items specific to a given month's program, and items on trust in government and the importance of the democratic process. Only those respondents who agree to participate in the program are given the survey, meaning that response and cooperation rates are based not only on a willingness to take the survey but also on actual stated willingness to participate in a deliberative forum. In Appendix A, we provide the introductory script used for recruitment purposes. As reported in table 1, response rates and cooperation rates are very low compared to more typical survey research. Response rates for the Louisiana Public Square surveys range from 4 to 11 percent, while cooperation rates range from 5 to 14 percent. Apart from issues of turnout (discussed below), it is considerably more difficult to get respondents to agree to participate in a deliberative forum than to participate in a survey interview.
Token incentives are given as part of the recruitment effort (e.g., an LPB coffee mug), and actual attendance has ranged from 5 to 19 participants. Overall, 422 respondents have agreed to participate via the telephone interview and 92 have actually participated by attending the program. The overall turnout rate, computed as the percent of attendees to total survey respondents, is 22 percent, varying from a low of 9 percent to a high of 33 percent. While such a turnout rate is lower than we would like, it is not surprising given the limited incentives for participation.

Table 1. Monthly Distribution of Respondents and Attendees, Turnout Rate, Response Rate, and Cooperation Rate

Month            Topic                  Survey respondents,  Forum attendees,  Turnout rate               Response  Cooperation
                                        n (percent)          n (percent)       (attendees/respondents)    rate      rate
June 2004        Economic development   52 (12)              15 (16)           29                         11        14
July 2004        Property taxes         46 (11)              11 (12)           24                          5         8
August 2004      Education              51 (12)              10 (11)           20                          5         6
September 2004   Health care            57 (14)               5 (5)             9                          3         5
November 2004    Roads                  58 (14)              19 (21)           33                          6         7
January 2005     Poverty                63 (15)              12 (13)           19                          4         6
February 2005    Healthy lifestyles     43 (10)               7 (8)            16                          5         7
March 2005       New Orleans Saints     52 (12)              13 (14)           25                          6         8
Total                                  422 (100)             92 (100)          22

NOTE: Percentages of survey respondents and forum attendees are based on the total number of respondents and participants across all months. Turnout rate is the number of attendees divided by the number of respondents. Response rate is computed using AAPOR's response rate calculator (AAPOR formula 3) and cooperation rate is computed using AAPOR formula 4.
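The arithmetic behind these rates is straightforward; the sketch below shows it in Python. The disposition counts passed to the functions are hypothetical placeholders rather than the study's actual call records, and the RR3 and COOP4 expressions follow the standard AAPOR definitions rather than anything computed in the article itself.

```python
# A sketch of the rate calculations described in the note to table 1.
# The disposition counts below are hypothetical placeholders; only the
# formulas follow the AAPOR Standard Definitions (response rate 3 and
# cooperation rate 4).

def aapor_rr3(I, P, R, NC, O, UH, UO, e):
    """Response Rate 3: complete interviews over all eligible cases plus
    an estimated proportion (e) of cases of unknown eligibility."""
    return I / ((I + P) + (R + NC + O) + e * (UH + UO))

def aapor_coop4(I, P, R):
    """Cooperation Rate 4: complete plus partial interviews over all
    contacted eligible cases (interviews, partials, and refusals)."""
    return (I + P) / ((I + P) + R)

def turnout_rate(attendees, respondents):
    """Turnout as defined in the text: forum attendees divided by the
    survey respondents who agreed to participate."""
    return attendees / respondents

# Hypothetical disposition counts for a single month:
print(round(aapor_rr3(I=52, P=3, R=310, NC=95, O=20, UH=400, UO=0, e=0.5), 2))
print(round(aapor_coop4(I=52, P=3, R=310), 2))
print(round(turnout_rate(attendees=15, respondents=52), 2))  # June 2004: 15/52 = 0.29
```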

Moreover, while limited incentives may create problems in terms of turnout, they also mean that our participants are motivated primarily by the opportunity to engage in a public discussion. As a result, while limited participation may be problematic in terms of representing the distribution of public opinion, it works to our advantage in trying to understand who willingly participates in deliberative forums.

Results
In table 1, we present the monthly distribution of respondents and attendees, as well as response and cooperation rates (noted above). As would be expected, turnout differs significantly from month to month. The lowest turnout was during the September 2004 episode, which was filmed in New Orleans and coincided with Hurricane Ivan warnings and resulting evacuations. The second lowest turnout rate was during February 2005, when the show focused on healthy lifestyles. The highest turnouts were during November 2004, when the show focused on roads and transportation, and June 2004, when the show focused on economic development. At first glance, it would appear that topic plays an important role in forum attendance. Within this context, it is important to note that interest levels likely affect the willingness of the participant to take the survey in the first place (Groves, Presser, and Dipko 2004). Turnout rates, response rates, and cooperation rates are highly correlated. The correlation between turnout and response rates was .72 (p = .026), while the correlation between turnout and cooperation rates was .59 (p = .09).
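These month-level correlations can be checked directly against table 1. The sketch below uses the rounded percentages printed in the table, so the resulting coefficients may differ slightly from the values reported in the text, which presumably rest on unrounded rates.

```python
# Month-level rates as printed in table 1 (rounded percentages).
from scipy.stats import pearsonr

turnout     = [29, 24, 20,  9, 33, 19, 16, 25]  # attendees / respondents
response    = [11,  5,  5,  3,  6,  4,  5,  6]  # AAPOR RR3
cooperation = [14,  8,  6,  5,  7,  6,  7,  8]  # AAPOR COOP4

r_resp, p_resp = pearsonr(turnout, response)
r_coop, p_coop = pearsonr(turnout, cooperation)
print(f"turnout vs. response rate:    r = {r_resp:.2f}, p = {p_resp:.3f}")
print(f"turnout vs. cooperation rate: r = {r_coop:.2f}, p = {p_coop:.3f}")
```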

WHO PARTICIPATES?

In table 2, we present a comparison of survey respondents, forum attendees, and census estimates for selected demographics and for a measure of ideological intensity. Ideological intensity is measured based on responses to standard seven-point economic and social ideology scales asking respondents to place themselves from very liberal to very conservative. These scales were first folded such that moderates were coded as 0 and ideologues were coded as 4, and then combined into a single eight-point index. The bivariate correlation between the individual items was .56.
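As an illustration of the index construction, the sketch below folds two hypothetical seven-point items on their midpoint and sums them. The column names are ours, and the treatment of the scale endpoints is an assumption (a simple fold of a seven-point scale runs 0–3 per item, whereas the combined index described here runs 0–8), so this is one plausible reading rather than the exact coding used in the study.

```python
# One plausible reading of the index construction: fold each seven-point
# ideology item on its midpoint so moderates score 0 and the most extreme
# responses score highest, then sum the two folded items.
import pandas as pd

# Hypothetical responses on 1-7 scales (1 = very liberal, 7 = very
# conservative); illustrative values only.
df = pd.DataFrame({"econ_ideology":   [4, 1, 6, 7, 3],
                   "social_ideology": [4, 2, 7, 7, 5]})

def fold(item, midpoint=4):
    # Distance from the scale midpoint: 0 for moderates, 3 for respondents
    # at either endpoint of a 1-7 scale.
    return (item - midpoint).abs()

df["ideological_intensity"] = fold(df["econ_ideology"]) + fold(df["social_ideology"])
print(df)
print("folded-item correlation:",
      round(fold(df["econ_ideology"]).corr(fold(df["social_ideology"])), 2))
```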
Several interesting patterns emerge in the data. First, while women were more likely to respond to the survey, men were slightly more likely to participate in the forum. Second, while our surveys were often subject to typical survey biases, participants were significantly more likely to be white, better educated, higher income, older, and married. Across each of the demographic categories (with the exception of gender), the differences between survey respondents and participants were substantial and indicate that the deliberative forum was less representative than the survey responses.

Table 2. Comparison of Survey Respondents, Forum Attendees, and Census Demographics

                               Survey respondents   Forum attendees   2000 census
Gender
  Male                                 43                  52              48
  Female                               57                  48              52
Race
  White                                53                  76              56
  Black or African American            41                  20              40
  Other                                 6                   4               4
Education
  Less than high school                 5                   4              16
  High school                          24                  11              26
  Some college                         23                  23              27
  College (4-year degree)              29                  29              20
  Graduate                             19                  33              11
Income
  Less than $10,000                    11                   2              13
  $10,000–$49,999                      43                  31              49
  $50,000–$99,999                      31                  41              27
  $100,000 or more                     15                  27              11
Age
  18–24                                19                   4              12
  25–34                                19                  12              18
  35–44                                20                  21              20
  45–54                                18                  28              25
  55–64                                11                  23              15
  65 and over                          14                  12              10
Marital status
  Single                               37                  21              34
  Married                              47                  65              47
  Separated/divorced                   12                  12              12
  Widowed                               4                   3               6
Ideological intensity
  Low ideology (0–2)                   23                  24
  Moderate ideology (3–5)              45                  56
  High ideology (6–8)                  32                  20

NOTE: Cell entries are percentages.

Better incentives and oversampling key demographic groups might alleviate some of these biases, but in the absence of such efforts, our forums tended to overrepresent certain demographic groups. Perhaps surprisingly, we also find that respondents who scored in the moderate range of an index of ideological intensity were more likely to participate. Moderate ideologues made up 45 percent of survey respondents but 57 percent of forum attendees. This difference is made up almost entirely by lower participation rates among the most ideological respondents. Strong ideologues comprised 32 percent of the survey respondents but only 20 percent of forum participants.
To get a better sense of who participates, we conducted a logistic regression of forum participation (coded 1 if the respondent participated, 0 otherwise) on demographics,1 ideological intensity, attentiveness to news about public affairs and politics, and the value individual respondents placed on discussion. Attentiveness to public affairs is based on responses to the following question: "How closely would you say that you follow news on state government and public affairs in Louisiana? Would you say you follow the news very closely, somewhat closely, not very closely, or not at all?" Responses were coded on a four-point scale with 4 indicating respondents who followed the news very closely and 1 indicating respondents who did not follow the news at all. Finally, the value of deliberation was measured based on respondent agreement with two statements on the value of discussion in a democratic society: (1) "Discussions, even those that involve serious disagreements, should be an important part of the political process"; and (2) "Democracy requires hearing from all sides of a political issue, even those with unpopular views." Responses were coded such that higher values indicate respondents who place greater value on discussion, and were combined into a single index (r = .40). The results are presented in table 3.
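For readers who wish to replicate the model in table 3, a sketch of the estimation follows. The data file and column names are hypothetical, the coding mirrors the description above and in footnote 1, and statsmodels reports McFadden's pseudo R-squared rather than the Nagelkerke statistic shown in table 3.

```python
# Illustrative estimation of the participation model reported in table 3.
# The data file and column names are hypothetical; "attended" is 1 if the
# respondent showed up for the forum, 0 otherwise, and the month dummies
# use March 2005 as the omitted baseline.
import pandas as pd
import statsmodels.formula.api as smf

df = pd.read_csv("public_square_respondents.csv")  # hypothetical file

model = smf.logit(
    "attended ~ white + female + education + income + age + I(age**2)"
    " + married + C(month, Treatment(reference='mar05'))"
    " + attentiveness + ideological_intensity + discussion_value",
    data=df,
)
result = model.fit()
print(result.summary())
print("McFadden pseudo R-squared:", round(result.prsquared, 2))
```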
As can be seen in table 3, participation in these deliberative forums is a function of race, education, age, ideological intensity, and the value the respondent placed on discussion. In terms of demographics, we find that white, better educated, and older respondents were most likely to participate. In this respect, participation in the Public Square reflects the standard socioeconomic biases found in other forms of participation. The more interesting findings, however, center on the psychological variables, where we find that less ideological respondents and respondents who place a greater value on discussion were more likely to participate. Taken together, these findings appear to reflect respondents who may not have a firm view on issues and who believe discussion is an important part of developing an informed opinion. That is, this sort of deliberative process appears to attract a different audience from the participants in talk radio or other forms of invited programs that do not include some sort of random selection.2
1. Education is measured based on responses to the highest degree received, including grade school, high school, vocational or technical school, associate's degree, bachelor's degree, master's degree, or Ph.D., JD, or some other advanced degree. Income is measured as household income ranging from less than $10,000, $10,000–$20,000, $20,000–$30,000, $30,000–$40,000, $40,000–$50,000, $50,000–$60,000, $60,000–$70,000, $70,000–$80,000, $80,000–$100,000, and $100,000 and above. Race is coded 1 if the respondent is white or Caucasian, 0 otherwise. Gender is coded 1 if the respondent is female, 0 otherwise. Married is coded 1 if the respondent is married, 0 otherwise. Age is measured in years. Each month's variable is coded 1 for the specific month, 0 otherwise. March serves as the baseline month.


Table 3. Logistic Regression of Forum Attendance on Demographics, Topic, and Psychological Variables

                                          B (SE)
Demographics
  Race                                    1.11 (0.34)
  Gender                                  0.06 (0.30)
  Education                               0.31 (0.10)
  Income                                  0.02 (0.07)
  Age                                     0.13 (0.06)
  Age squared                             0.001 (0.0007)
  Married                                 0.34 (0.32)
Month/topic dummies
  June 2004 (economic development)        0.89 (0.57)
  July 2004 (property taxes)              0.60 (0.59)
  August 2004 (education)                 0.19 (0.60)
  September 2004 (health care)            0.57 (0.67)
  November 2004 (roads)                   0.89 (0.58)
  January 2005 (poverty)                  0.11 (0.57)
  February 2005 (health)                  0.25 (0.67)
Psychological variables
  Political attentiveness                 0.29 (0.23)
  Ideological intensity                   0.24 (0.08)
  Discussion value                        0.34 (0.14)
Constant                                  9.46 (2.08)
Goodness-of-fit statistics
  Nagelkerke R2                           .28

NOTE: Cell entries are logistic regression coefficients, standard errors in parentheses. *p < .10; **p < .05; ***p < .01.

Finally, we consider the difference participation makes in terms of attitudes toward the political process. Respondents were asked two items used in measures of political trust and government responsiveness: (1) whether government is run for a few big interests looking out for themselves or for the benefit of all the people; and (2) how much attention government pays to the people when deciding what to do.3
2. To test for nonlinear effects, we replaced the ideology measure in the logistic regression model with dummy variables indicating less ideological (those who scored between 0 and 2 on the ideological intensity measure) and highly ideological respondents (those who scored 6 or above). Moderate ideologues (those scoring between 3 and 5) served as our base category. The results indicate that the differences between less ideological respondents and moderately ideological respondents were not statistically significant, while differences between moderate ideologues and strong ideologues (those who scored 6 and above) were statistically significant. The coefficients are as follows: high ideology 1.03 (0.35); low ideology 0.07 (0.38).
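The recoding described in footnote 2 amounts to cutting the 0–8 intensity index into three bands and entering the low and high bands as dummies, with moderates omitted; a brief sketch with hypothetical values follows.

```python
# Recode the 0-8 ideological intensity index into low / moderate / high
# bands and create dummies with moderates as the omitted base category,
# mirroring the test described in footnote 2. Values are illustrative.
import pandas as pd

df = pd.DataFrame({"ideological_intensity": [0, 2, 3, 5, 6, 8]})
bands = pd.cut(df["ideological_intensity"], bins=[-1, 2, 5, 8],
               labels=["low", "moderate", "high"])
df["low_ideology"] = (bands == "low").astype(int)
df["high_ideology"] = (bands == "high").astype(int)
print(df)
```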


Table 4. Preshow and Postshow Attitudes of Trust in Government and Government Responsiveness

                       All survey    Survey respondents   Preshow survey        Postshow survey
                       respondents   (nonparticipants)    (participants only)   (participants only)
Trust
  Few big interests         67              68                  63                    62
  Benefit of all            33              32                  37                    38
Responsiveness
  Good deal                 14              13                  17                    25
  Some                      45              44                  48                    57
  Not much                  41              43                  35                    18

NOTE: Trust is measured by responses to a question asking respondents whether government is run by a few big interests looking out for themselves or for the benefit of all. Responsiveness is measured by asking respondents, "How much attention do you think government pays to what people think when it decides what to do?" Cell entries are the percent of respondents giving a particular response. The first column includes all respondents, the second column includes survey respondents who did not participate, the third column includes the preshow responses of program participants, and the last column includes postshow responses of program participants.

In table 4, we show the differences in responses for all survey respondents, nonparticipants, and preshow and postshow program participants. We anticipate that participants will be more trusting and perceive government as more responsive after participating in the deliberative forum. To test for preshow and postshow differences, we utilize the Wilcoxon signed-rank test, which compares the distribution of paired variables in related samples.
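The preshow/postshow comparison can be run with a paired Wilcoxon test, as sketched below. The response vectors are illustrative stand-ins; in the actual analysis each participant contributes a matched preshow and postshow answer to the same item.

```python
# Paired Wilcoxon signed-rank test on preshow vs. postshow responses for
# forum participants. The arrays are illustrative stand-ins for the coded
# responsiveness item (1 = not much, 2 = some, 3 = a good deal).
from scipy.stats import wilcoxon

pre  = [1, 2, 1, 2, 3, 1, 2, 2, 1, 3, 2, 1]
post = [2, 2, 2, 3, 3, 1, 2, 3, 2, 3, 2, 2]

stat, p_value = wilcoxon(pre, post)
print(f"Wilcoxon statistic = {stat:.1f}, p = {p_value:.3f}")
```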
As can be seen in table 4, while there is little change in government trust from participating in the program (Z = 0.54, p = .59), there was a more substantial and statistically significant (Z = 2.26, p = .026) change in perceptions of government responsiveness, including a 16-point decline in the percent of respondents saying government did not pay much attention to people when deciding what to do. Increases in perceptions of government responsiveness likely reflect the program format (policy makers are actively listening and responding to citizen concerns) and may be short-lived. The lack of any significant increase in trust may reflect a longer view: participants witness policy makers being more responsive, but such responsiveness may or may not translate into changes in policy or official behavior. Also worth noting, program participants were slightly more trusting (though not significantly so) and perceived greater government responsiveness than nonparticipants before participating in the program. Participating in the deliberative forum may enhance perceptions of government responsiveness among respondents who may already be slightly more inclined to believe that government listens to their view.
3. Unfortunately, these were the only items included in the data due to space limitations in the
monthly questionnaires.


Discussion
So who participates in the Public Square? According to our results, the socioeconomic biases that predict other forms of participation are equally present when considering participation in a deliberative forum (Verba, Schlozman, and Brady 1995). Though audience members were selected via random digit dialing, the audience that actually attended these deliberative forums was whiter, older, wealthier, and better educated than the general population.4 This supports Ryfe's (2002, p. 365) conclusion that "issues of inclusion and identity lurk even in randomly selected groups." Such self-selection biases can be particularly problematic for the tenets of deliberative democracy, as they may limit the diversity of opinion and ideas expressed in deliberative forums (Ryfe 2005).

Unlike other forms of participation, however, the deliberative forums considered in the present analysis attracted more ideologically moderate participants who valued the role of discussion in democratic governance. That deliberative forums attract ideological moderates is somewhat surprising given previous research that finds opinion intensity associated with political participation. Combined with the finding that respondents who valued discussion were also more likely to attend, this suggests that deliberative forums may attract otherwise civically engaged citizens who are attempting to better understand contemporary issues. The value of such conversations, then, may reside in their ability to be less polarizing and more oriented toward political learning and consensus building.
As a result of their participation, respondents were significantly more positive regarding the responsiveness of government to public input, though notably they were no more trusting. This conflicts with Daves's (1999) finding that trust in government increases after deliberative discussions. By participating in an event in which policy makers are responding to public concerns, participants perceive a higher level of government responsiveness, but their perceptions may not extend to the broader political system or to optimism for long-term policy change. We should also note that it may well be that the gains are only short-lived and quickly fade beyond the immediate aftermath of the event. Still, the findings suggest the possibility that deliberation can connect citizens to the political process in important ways, and may yield different sorts of conversations than when the focus is on partisans and political actors.

4. The audience was also wealthier, though this difference did not hold up in multivariate analyses controlling for other demographic variables.

Appendix A: Questionnaire Script

Hi, I'm calling from the Public Policy Research Lab at Louisiana State University, my name is <name of caller>, and I'm not trying to sell anything. We are working with Louisiana Public Broadcasting to recruit residents from the Baton Rouge area to participate in a televised discussion with state government officials on leading statewide issues. All participation is strictly voluntary.

IF YES, PRESS 1 TO CONTINUE WITH SURVEY

Is it ok if I take a minute to tell you a little about the project? Louisiana Public Square is a program by Louisiana Public Broadcasting. Once a month, a small group of randomly selected citizens will meet to discuss issues with government officials and other experts. You have been randomly selected from Baton Rouge area residents to participate in the program. As a participant, you will have the opportunity to represent the views of Louisiana residents across the state. This program will focus on health and lifestyle choices in Louisiana. Doctors from LSU's Pennington Biomedical Center will be there to answer your questions. You don't need to be an expert to participate. We will provide information on current policies and laws, and we are interested in the views of citizens with various backgrounds, including different levels of knowledge and experience.

PRESS 1 TO CONTINUE

The project would involve your participating in an hour-long discussion on Wednesday, February 16, at the Louisiana Public Broadcasting studio. Does the project sound interesting to you?

1 Yes
2 Maybe
3 No

Could I sign you up for the project?

1 Yes
2 No

References

Asher, Herbert. 2004. Polling and the Public: What Every Citizen Should Know. 6th ed. Washington, DC: CQ Press.
Brady, Henry E., James S. Fishkin, and Robert C. Luskin. 2003. "Informed Public Opinion about Foreign Policy: The Uses of Deliberative Polling." The Brookings Review 21(3):16–19.
Chambers, Simone. 2003. "Deliberative Democratic Theory." Annual Review of Political Science 6:307–26.
Converse, Philip. 1964. "The Nature of Belief Systems in Mass Publics." In Ideology and Discontent, ed. David Apter, pp. 206–61. New York: The Free Press.
Daves, Robert P. 1999. "Deliberative Polling: Fitting the Tool to the Job." In The Poll with a Human Face: The National Issues Convention Experiment in Political Communication, eds. Maxwell McCombs and Amy Reynolds, pp. 169–86. Mahwah, NJ: Lawrence Erlbaum Associates.
Denver, David, Gordon Hands, and Bill Jones. 1995. "Fishkin and the Deliberative Opinion Poll: Lessons from a Study of the Granada 500 Television Program." Political Communication 12:147–56.
Fishkin, James. 1991. Democracy and Deliberation: New Directions for Democratic Reform. New Haven, CT: Yale University Press.
Fishkin, James. 1995. The Voice of the People: Public Opinion and Democracy. New Haven, CT: Yale University Press.
Fishkin, James, and Robert C. Luskin. 1999. "Bringing Deliberation to Democratic Dialogue." In The Poll with a Human Face: The National Issues Convention Experiment in Political Communication, eds. Maxwell McCombs and Amy Reynolds, pp. 3–38. Mahwah, NJ: Lawrence Erlbaum Associates.
Gastil, John, and James P. Dillard. 1999. "Increasing Political Sophistication through Public Deliberation." Political Communication 16:3–23.
Groves, Robert M., Stanley Presser, and Sarah Dipko. 2004. "The Role of Topic Interest in Survey Participation Decisions." Public Opinion Quarterly 68:2–31.
Gutmann, Amy, and Dennis Thompson. 1996. Democracy and Disagreement. Cambridge, MA: Harvard University Press.
Habermas, Jurgen. 1989. The Structural Transformation of the Public Sphere: An Inquiry into a Category of Bourgeois Society. Cambridge, MA: MIT Press.
Habermas, Jurgen. 1997. Between Facts and Norms: Contributions to a Discourse Theory of Law and Democracy. Cambridge, MA: MIT Press.
Jackman, Simon, and Paul Sniderman. 2006. "The Limits of Deliberative Discussion: A Model of Everyday Political Arguments." The Journal of Politics 68:272–83.
Luskin, Robert, James Fishkin, and Roger Jowell. 2002. "Considered Opinions: Deliberative Polling in Britain." British Journal of Political Science 32(3):455–87.
Merkle, Daniel M. 1996. "Review: The National Issues Convention Deliberative Poll." Public Opinion Quarterly 60:588–619.
Page, Benjamin I. 1996. Who Deliberates? Mass Media in Modern Democracy. Chicago, IL: University of Chicago Press.
Ryfe, David M. 2002. "The Practice of Deliberative Democracy: A Study of 16 Deliberative Organizations." Political Communication 19:359–77.
Ryfe, David M. 2005. "Does Deliberative Democracy Work?" Annual Review of Political Science 8:49–71.
Sturgis, Patrick. 2003. "Knowledge and Collective Preferences: A Comparison of Two Approaches to Estimating the Opinions of a Better Informed Public." Sociological Methods and Research 31(4):453–83.
Sturgis, Patrick, Caroline Roberts, and Nick Allum. 2005. "A Different Take on the Deliberative Poll." Public Opinion Quarterly 69(1):30–65.
Verba, Sidney, Kay Lehman Schlozman, and Henry E. Brady. 1995. Voice and Equality: Civic Voluntarism in American Politics. Cambridge, MA: Harvard University Press.
Zaller, John, and Stanley Feldman. 1992. "A Simple Theory of Survey Response: Answering Questions versus Revealing Preferences." American Journal of Political Science 36:579–616.
