The California Report Card (CRC) is an online platform developed by the CITRIS Data and Democracy Initiative at UC Berkeley and Lt. Governor Gavin Newsom that explores how smartphones and networks can enhance communication between the public and government leaders. The California Report Card allows visitors to grade issues facing California and to suggest issues for future report cards.
The CRC is a mobile-optimized web application that allows participants to advise the state government on timely policy issues. We are exploring how technology can streamline and structure input from the public to elected officials, to provide them with timely feedback on the changing opinions and priorities of their constituents.
Version 1.0 of the CRC was launched in California on 28 January 2014. Since then, over 11,000 people from all 58 counties have assigned over 30,000 grades to the State of California and suggested issues for the next report card.
Lt. Governor Gavin Newsom: “The California Report Card is a new way for me to keep an ear to the ground. This new app/website makes it easy for Californians to assign grades and suggest pressing issues that merit our attention. In the first few weeks, participants conveyed that they approve of our rollout of Obamacare but are very concerned about the future of California schools and universities. I’m also gaining insights on issues ranging from speed limits to fracking to disaster preparedness.”
“This platform allows us to have our voices heard. The ability to review and grade what others suggest is important. It enables us and elected officials to hear directly how Californians feel.” – Matt Harris, Truck Driver, Ione, CA
“This is the first system that lets us directly express our feelings to government leaders. I also really enjoy reading and grading the suggestions from other participants.” – Patricia Ellis Pasko, Senior Care Giver, Apple Valley, CA
“Everyone knows that report cards can motivate learning by providing quantitative feedback on strengths and weaknesses. Similarly, the California Report Card has potential to motivate Californians and their leaders to learn from each other about timely issues. As researchers, the patterns of participation and how they vary over time and across geography will help us learn how to design future platforms.” – Prof. Ken Goldberg, UC Berkeley.
It takes only two minutes and works on all screens (best on mobile phones held vertically); just click “Participate”.
Anyone can participate by taking a few minutes to assign grades to the State of California on issues such as: Healthcare, Education, Marriage Equality, Immigrant Rights, and Marijuana Decriminalization. Participants are also invited to enter an online “cafe” to propose issues that they’d like to see included in the next report card (version 2.0 will come out later this Spring).
Lt. Gov. Gavin Newsom and UC Berkeley Professor Ken Goldberg reviewed the data and lessons learned from version 1.0 in a public forum at UC Berkeley on 20 March 2014 that included participants who actively contributed to identifying the most important issues for version 2.0. The event can be viewed at http://bit.ly/1kv6523.
We offer community outreach programs/workshops to train local leaders on how to use the CRC and how to reach and engage under-represented groups (low-income, rural, persons with disabilities, etc.). If you are interested in participating in or hosting a workshop, please contact Brandie Nonnecke at firstname.lastname@example.org.
Comments and suggestions will be scanned to omit identifiable information. We respect your privacy and will not share your email address with third parties.
The California Report Card (CRC) aims to increase communication between elected officials and the public. The CRC differs from randomized telephone polls and surveys in that participation is self-selective and accessible so far only to English speakers with access to a smartphone or web browser. We are working on extending the interface to Spanish speakers and persons with disabilities.
The primary advantage of web-based data collection, timely participation by many motivated individuals, is also a disadvantage: there is an inherent self-selection bias, so the sample may not be representative of the full population. We are very interested in comparing CRC data with data collected in standard polls on these issues conducted by other parties during the reporting period.
We conjecture that the “report card” format will be intuitive for participants. The CRC also explores an ongoing process that shares the cumulative data with participants and invites them to help design the next report card, by suggesting timely issues and by considering and grading the suggestions of others in the Cafe phase.
Standard telephone surveys also suffer from coverage bias, as 25% of US households do not have landlines.
There are myriad other challenges in survey design, including bias in how questions are phrased, the order in which they are presented, etc. (see references below). The CRC strives to limit such biases, but avoiding them altogether is impossible.
By clicking “Skip” on any issue, participants can decline to grade issues they do not feel suitably informed about or willing to grade. After entering each grade, participants see the average grade based on input from all previous participants. This is intended to provide rapid feedback to each participant on the data collected thus far. We acknowledge this could have a biasing effect: participants may subsequently adjust grades to be closer to the average, resulting in a form of regression toward the mean. We believe this will be rare, especially for Californians (who rarely follow the norm). In the public dataset we release in March 2014, we will report the number of such adjustments, the before and after values, and comparisons with concurrent polls on similar issues by other groups.
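The running-average feedback and adjustment logging described above can be sketched roughly as follows. This is a minimal illustration, not the CRC's actual implementation: it assumes letter grades have already been mapped to a numeric scale, and the class and method names are hypothetical.

```python
class IssueGrades:
    """Tracks grades for one issue: running average plus a log of revisions,
    so pull-toward-the-average effects can be reported later (sketch only)."""

    def __init__(self):
        self.count = 0
        self.total = 0.0
        self.revisions = []  # (before, after, average shown at revision time)

    def average(self):
        return self.total / self.count if self.count else None

    def add_grade(self, value):
        """Record a new grade and return the updated average,
        which is shown to the participant immediately."""
        self.count += 1
        self.total += value
        return self.average()

    def revise_grade(self, before, after):
        """Replace a participant's earlier grade, logging before/after values
        and the average they saw, so adjustments toward it can be counted."""
        self.revisions.append((before, after, self.average()))
        self.total += after - before
        return self.average()
```

For example, a participant who graded an issue 2.0, saw the average 3.0, and revised upward to 3.0 would appear in the revision log as an adjustment toward the mean.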
In the Cafe discussion, we present only eight mugs at a time to avoid overcrowding. The mugs are randomly drawn (without replacement) from the set of ideas contributed by participants. To ensure that new ideas receive sufficient grades, the sampling algorithm is biased toward mugs with fewer grades: for each mug, we compute the standard error of the grades it has received so far and sample so that the standard errors across mugs tend to equalize. This also makes the system more robust to malicious or frivolous grading and reduces the preferential-attachment effect (i.e., the rich get richer) common in many online rating systems.
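One way to realize this standard-error-balanced sampling is sketched below. This is a hypothetical implementation, not the CRC's published algorithm: the weighting-by-standard-error scheme and the `new_mug_weight` given to mugs with fewer than two grades are assumptions.

```python
import math
import random

def standard_error(grades):
    """Standard error of the mean grade, or None if there are
    too few grades (fewer than two) to estimate it."""
    n = len(grades)
    if n < 2:
        return None
    mean = sum(grades) / n
    var = sum((g - mean) ** 2 for g in grades) / (n - 1)
    return math.sqrt(var / n)

def sample_mugs(mug_grades, k=8, new_mug_weight=10.0):
    """Draw up to k mugs without replacement, weighting each mug by its
    standard error so under-graded mugs surface more often and the
    standard errors across mugs tend to equalize over time."""
    pool = dict(mug_grades)  # mug id -> list of numeric grades
    chosen = []
    while pool and len(chosen) < k:
        ids = list(pool)
        weights = []
        for m in ids:
            se = standard_error(pool[m])
            # New mugs get a fixed high weight; a tiny epsilon keeps
            # fully-agreed-upon mugs (SE = 0) drawable.
            weights.append(new_mug_weight if se is None else se + 1e-6)
        pick = random.choices(ids, weights=weights, k=1)[0]
        chosen.append(pick)
        del pool[pick]
    return chosen
```

Because mugs with high uncertainty (or no grades at all) carry larger weights, they are drawn more often, and the draw is still random rather than deterministic, which limits gaming by any single participant.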
Citizenville – Lt. Gov. Gavin Newsom
Behind the Times: Government Missing Out on Technology Innovation. Lt. Gov. Gavin Newsom. The Huffington Post. Nov. 2013
In his first book, Citizenville: How to Take the Town Square Digital and Reinvent Government, Lt. Gov. Gavin Newsom describes how network technology can be used to empower and engage the public with government. As social network technology and smart phones have changed how we interact with one another, these technologies can also be used to transform our relationship with government.
The CITRIS Data and Democracy Initiative (DDI)
DDI develops tools to support the evolving, dynamic relationships between digital media and democratic practices, for example novel mobile, Internet and social media applications to enhance online deliberation, participatory decision-making, and rapid mobilization. DDI seeks to enhance individual and collective awareness, understanding, and engagement for people of diverse backgrounds on critical social, political, and economic issues.
The Data and Democracy Initiative is collaborating with the UC Santa Cruz Center for Games and Playable Media, and UC Berkeley’s Center for New Media (BCNM), Human Rights Center, Social Apps Lab, the Algorithms, Machines, and People (AMP) Lab, and the MIT-IBM Network Science Research Center, among others, including companies and government and non-profit organizations. “This is exactly what governments need,” noted Alec Ross, Senior Advisor for Innovation to U. S. Secretary of State Hillary Clinton.
Self-selection sampling. Lund Research Ltd. 2012.
How accurate are self-selection web surveys? Jelke Bethlehem. Statistics Netherlands. 2008.
The Value of Online Surveys. Joel Evans & Anil Mathur. Internet Research, 15(2), pp. 195-219. 2005
NSF SBE-CISE Workshop on Cyberinfrastructure and the Social Sciences. Francine Berman, UC San Diego & Henry Brady, UC Berkeley. 2005.
Contributions of Survey Research to Political Science. Henry E. Brady. Political Science and Politics, 33(1), pp. 47-57. March 2000.
Social Media and Political Engagement. Lee Rainie, Aaron Smith, Kay Schlozman, Henry Brady, & Sidney Verba. Pew Research: Internet Project. October 2012.
The Art of Political Science: Spatial Diagrams as Iconic and Revelatory. Henry Brady. Perspectives on Politics, 9(2), pp. 311-331. June 2011.