[message_box title="" color="blue"] This is Part II of a two-part series by The Governance Lab at the NYU School of Engineering on how to design a public participation initiative for lawmaking. The series is based on "Congress Is Broken; CrowdLaw Could Help Fix It" and "10 Recommendations for Designing Better CrowdLaw Initiatives." [/message_box]
In 2014, Brazil’s parliament set up a “Hacker Lab” following its success at having the public co-create and co-draft new legislation governing the regulation of the internet. Brazil is also the home to “Promise Tracker,” a citizen-monitoring initiative created by civil society organizations in Brazil in partnership with the MIT Media Lab to enable communities to track the outcomes of policy. Using camera-phones and reporting data back through the platform, Promise Tracker was used to monitor the effectiveness of Brazil’s school lunch program in 28 schools with over 26,000 enrolled students, demonstrating the potential for having the public distribute the work of evaluating the impact of legislation on people’s lives.
Such uses of technology to foster more open and participatory lawmaking are known as CrowdLaw. CrowdLaw refers to diverse methods and tools for opening up the legislative process to more, and more diverse, expertise and input from the public.
CrowdLaw is still in its infancy. Practices differ. Some efforts are organized by governments and others by NGOs. Some are institutionalized with a legal mandate and others are informal. Some are designed to solicit public opinion, but others are looking for facts, data and expertise to improve the quality of legislation. They also occur at different points in the legislative process.
Still, CrowdLaw portends a future in which governing is less something perpetrated upon the public than something done in collaboration with the public, taking advantage of our diverse know-how and experience to craft laws and policies rooted in a more empirical understanding of on-the-ground conditions.
Ten critical steps for designing a public participation initiative for lawmaking
In our first post, we focused on four recommendations aimed at improving the usefulness of CrowdLaw for institutions: (1) optimizing for institutional needs, (2) designing for the needed information or expertise, (3) staffing to make participation usable, and (4) designing for use. In this post, we focus on recommendations for individuals to encourage participation: (1) focusing on incentives, (2) offering clear directions, (3) respecting privacy, (4) giving feedback, (5) increasing participation channels, and (6) experimenting more. Together, these add up to ten ways to move from closed to open lawmaking.
1 Focus on incentives
Ask, "Why should a member of the public participate?" and get the answer by talking to and surveying potential users. Crowdsourcing literature indicates that perceived meaningfulness and fairness are critical to the quality of contributions and the viability of crowdsourcing platforms.[1] The onus is on managers not only to design a process that can have meaningful impact on government, but to articulate for the public their potential for impact, while making it easy for them to participate. In other words, make the rationale for participation explicit and "sell" the reasons to participate through both good design and clear explanation.
On Decide Madrid, a platform launched by the Madrid City Council for public participation in decision-making, the section where users can make proposals is much more popular than the discussion section because proposals are binding and have the potential to create change, whereas discussions are simply fora for more discussion. A survey of 482 users who had not registered for Decide Madrid found that 11% said participation was pointless, and 27% said they lacked time to participate — the most common reason cited for non-participation.[2] If an individual cannot quickly engage on a platform, it will be very difficult to overcome that reluctance through other incentives.
2 Explain clearly how to participate
The legislative process is complex, with many more bills proposed than ever become law. Therefore, a successful public engagement must explain the process and what is being asked of the participant, including setting out thresholds for action, such as the number of signatures required or what it takes for a comment to be considered. Crowdsourcing literature indicates that when “average participants” are “asked to perform technical tasks with specific instructions and detailed job classifications, their performance is equal to or better than the performance of experts.”[3]
For instance, the annual "Help Cut Red Tape" reports of British Columbia's GovTogetherBC clearly explain what an engagement is about, how input will matter, and when to participate. The reports detail popular ideas for streamlining government, statistics about the participation process, ideas submitted, and the government action taken on the issue (Figure 1). Such information helps participants understand in real time how their participation is translating into government action. The Lisbon Participatory Budgeting process grew from 2,800 votes in 2008 to 29,000 in 2012 by increasing the visibility and clarity of the process, such as adding a feature allowing citizens to track the state of implementation of successful proposals, setting up mobile participation booths, and even touring the city with a "Participatory Budgeting Bus."[4]
3 Respect privacy and authenticate users when needed
Although it is technically possible to certify residency or identity, decide whether and when such hurdles are necessary. For example, if the goal is to get the best ideas to solve a problem, does it matter where they come from? In order to direct opportunities to participate to people based on their interests, a voluntary request for information might be welcome whereas involuntary data collection on people’s preferences may not. As an example of participant vetting, Reykjavik’s City Council is obliged to consider the 12–15 most popular proposals on the Better Reykjavik/Better Neighborhoods platform each month, so it authenticates participants using an electronic ID or password delivered through the citizen’s online bank to ensure one-citizen-one-vote. As an example of more complex authentication, Decide Madrid has a three-tiered system that determines the actions a member of the public can take consisting of:
- Unregistered users may browse site content.
- Basic verified users — verified through residence data and a mobile phone number — can post in discussions as well as create and support proposals.
- Completely verified users — verified in-person or via mail — can do all of those actions plus vote on proposals.
4 Give feedback
Public officials should respond to contributions and endeavor to communicate regularly about outcomes. Even if the public is invited only to participate in making proposals at the outset, create a mechanism to share final outcomes. For instance, participants of vTaiwan engage in on-going deliberations with each other and with representatives of relevant government ministries. Participants know that if consensus is reached, the Taiwanese government must either adopt the idea or provide a response as to why the idea is not feasible. GovTogetherBC publishes the results of every engagement. On the other hand, the Irekia system in Spain’s Basque region lacks thresholds for when citizens’ proposals receive a government response or are deemed actionable, creating ambiguity around what it takes for government to actually engage with a citizen proposal.
5 Diversify engagement opportunities and diversify who participates
Empirical research suggests that participation opportunities may be failing to attract diverse popular engagement. Ensuring participation by diverse members of the public is hard work, including investment in campaigns to recruit and give voice to the voiceless, especially by bringing news of the opportunity to participate to where people are (e.g. ads on popular rather than government websites). One study of the representativeness of the 186 participants who contributed ideas to improving an off-road traffic law in Finland found that participants were overwhelmingly male (86%), had formal education, and were aged 35–54 (46%). They also had prior civic experience: 72% had previously written on an online forum, 41% had contacted a representative, and 33% had written an op-ed.[5] Thus, institutions must identify who is most affected by or interested in a draft policy or law and actively reach out to those groups, as a critical step in bringing underrepresented and relevant populations into the participation process.
To overcome the digital divide, for example, the creators of Decide Madrid established "Citizen Service Offices." These offices are dispersed throughout the city and give residents the opportunity to voice their opinions in person, if they so choose, in addition to or in place of engaging online. But face-to-face engagement is not the only solution. Rather, it is important to take advantage of existing channels of communication, as was the case in Mexico with the "Cineminutos contra la Corrupción" campaign (cinema minutes against corruption), in which the Secretariat of the Civil Service, together with the Mexican Institute of Cinematography, advertised in movie theaters the opportunity to participate in crafting an anti-corruption law.
6 Test what works and iterate
Because CrowdLaw is a relatively new phenomenon, accelerating its adoption requires more research, which, in turn, requires that practitioners and researchers collaborate to design experiments. Research can involve natural experiments to observe how a platform works, who participates, and how. Additionally, consider running simple controlled trials by dividing participants into two groups and presenting them with alternative experiences, comparable to A/B testing. Try different ways of explaining how to participate, or test the relevance of participation at different points in the legislative process. Also critical are surveys to gather information that can help improve the effectiveness of the platform and process.

Despite CrowdLaw's promise, it is not self-evident that more public participation per se produces wiser or more just laws. There are countless instances to the contrary, including notable recent plebiscites. Rather than improve the informational quality of legislation, opening up decision-making may end up empowering some more than others and enabling undue influence by special interests. More direct participation could also lead to populist rule with negative outcomes for civil liberties. Thus, legislatures are rightly slow to implement public engagement, fearing that participation will be burdensome at worst and useless at best. To counter these risks and realize the benefits of CrowdLaw, there is an urgent need for systematic experimentation and assessment to inform and guide how legislatures engage with the public to collect, analyze, and use information as part of the lawmaking process, in order to bring the legislative process into the 21st century.
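The controlled-trial idea above can be sketched in a few lines of code. The following is a minimal, hypothetical illustration (not from any actual CrowdLaw platform): each visitor is deterministically assigned to one of two versions of an instructions page, and the participation rate of each group is then compared. All names and the sample log are invented for illustration.

```python
import random

def assign_variant(user_id, variants=("control", "treatment"), seed=42):
    """Deterministically assign a user to a variant.

    Seeding a generator with the user id means a returning visitor
    always sees the same version, keeping the two groups separate.
    """
    rng = random.Random(f"{seed}:{user_id}")
    return rng.choice(variants)

def participation_rates(log):
    """Given (variant, participated) pairs, return the rate per variant."""
    totals, hits = {}, {}
    for variant, participated in log:
        totals[variant] = totals.get(variant, 0) + 1
        hits[variant] = hits.get(variant, 0) + (1 if participated else 0)
    return {v: hits[v] / totals[v] for v in totals}

# Hypothetical log: which instructions page each visitor saw, and
# whether they went on to submit a proposal.
log = [
    ("control", False), ("control", True), ("control", False),
    ("treatment", True), ("treatment", True), ("treatment", False),
]
rates = participation_rates(log)
```

In practice the two rates would be compared with a significance test before drawing conclusions, but the core design choice is the stable random assignment: without it, the same person could drift between groups and contaminate the comparison.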
To learn more about CrowdLaw, please visit the GovLab’s CrowdLaw micro-site at Crowd.Law, where you can watch videos by practitioners, browse case studies and sign up for our newsletter.
By Prof. Beth Simone Noveck, Director of The Governance Lab at the NYU Tandon School of Engineering. Parts of this article were originally published in Forbes magazine and by The Governance Lab. They have been republished with permission. Visit Crowd.Law for more about the GovLab's work on this topic.
[1] Helen K. Liu, “Crowdsourcing Government from Multiple Disciplines,” Theory to Practice (2017). See also: Dana Chandler and Adam Kapelner, “Breaking Monotony with Meaning: Motivation in Crowdsourcing Markets,” Journal of Economic Behavior & Organization 90 (2013): 123–33. Nikolaus Franke, Peter Keinz and Katharina Klausberger, “Does This Sound Like a Fair Deal? Antecedents and Consequences of Fairness Expectations in the Individual’s Decision to Participate in Firm Innovation,” Organization Science 24, no. 5 (2013): 1500.
[2] Investigación, marketing y opinión, "Valoración de la acción de gobierno en el Ayuntamiento de Madrid," June 2016, accessed July 24, 2017, https://ahoramadrid.org//wp-content/uploads/2016/06/Info-Ahora-Madrid.pdf
[3] Helen K. Liu, “Crowdsourcing Government from Multiple Disciplines,” Theory to Practice (2017). See also: Tara S. Behrend, David J. Sharek, Adam W. Meade, and Eric N. Wiebe, “The Viability of Crowdsourcing for Survey Research,” Behavior Research Methods 43, no. 3 (2011). Alexis Comber Linda, et al., “Comparing the Quality of Crowdsourced Data Contributed by Expert and Non-Experts,” PLOS ONE 8, no. 7 (2013).
[4] Giovanni Allegretti and Sofia Antunes, "The Lisbon Participatory Budget: Results and Perspectives on an Experience in Slow but Continuous Transformation," Field Actions Science Reports, Special Issue 11 (2014), accessed June 23, 2017.
[5] Tanja Aitamurto, Hélène Landemore and Jorge Saldivar Galli, “Unmasking the crowd: participants’ motivation factors, expectations, and profile in a crowdsourced law reform,” Information, Communication & Society 20, no. 8 (2017): 1239–1255. See also Huang, S.W., Suh, M.M., Hill, B.M., Hsieh, G., “How activists are both born and made: An analysis of users on change.org,” Proceedings of the 33rd Annual ACM Conference on Human Factors in Computing Systems (2015): 211–220.