Civic Hacking Civic Tech Direct Action

Volunteer Coders Force the Dept Of Education to Actually Help Debtors


When it comes to making conned students aware of their right to seek debt discharges, and providing the means for them to apply for that discharge, the Department of Education’s technology is essentially nonexistent.

  • This March saw the launch of a web application that makes it easy for victims of predatory colleges to request student debt forgiveness from the Department of Education. For the first time ever, debtors were able to exercise their right to apply for student debt cancellation on their mobile phones. More than 300 applications flowed in that first week. But this website wasn’t built by the government; it was built by volunteers with the Debt Collective, including myself, in less than a month. Secretary of Education Arne Duncan frequently touts the virtues of tech, but when it comes to modernizing how the agency actually helps current and former college students, the heavy lifting has been forced onto the backs of students, debtors, and volunteers.


    As the total college student debt burden reaches $1.3 trillion, an offshoot of Occupy Wall Street called Strike Debt found that tuition-free education at all public two- and four-year colleges could be achieved with just $15 billion in new spending. Meanwhile, the government is expected to profit from student debt repayments to the tune of $127 billion over the next ten years. To make matters worse, student loans differ from other kinds of household debt in that they cannot be discharged through bankruptcy. They are nearly impossible to get out from under, even in the most dire circumstances. Given these facts, it seems clear that some or all of this debt is morally illegitimate, and that action is needed to give relief to these debtors.

    One of the only existing safety nets for millions of student debtors is the Higher Education Act, which gives the Department of Education authority to cancel debt when a school violates state law. But for many years the Department’s website had little to no information on how to dispute debt, let alone an online application to do so. The process was an uphill, seemingly impossible battle for debtors. That is, it was until this year, when a small group of volunteers at the Debt Collective decided to do something radical: the government’s job.

    For years, hundreds of thousands of students have found themselves scammed by predatory for-profit colleges. Corinthian Colleges, Inc., was one of the largest and most notorious of these chains. It was clear from multiple government actions and investigations, dating as early as 2007, that Corinthian was rife with deceptive and unfair business practices—from false job placement statistics and securities fraud to the unlawful use of military seals in advertisements. Despite this, the Department of Education continued to collect on the debt of scammed students, students Secretary Duncan admitted left Corinthian Colleges “in a worse position than when they started.”

    After months of watching the Department of Education do nothing, while hearing story after story of lives ruined by unpayable debt, our team at the Debt Collective decided to make a move. With help from some amazing lawyers, we began to craft a process to dispute students’ debt. Law student Luke Herrine spent hours coordinating our strategy with a team of legal experts, creating a multi-page application form that affected and eligible debtors would need to fill out by hand. In the form, each debtor needed to cite the specific legalese appropriate for their state. It wasn’t even clear that the Department would recognize the forms as legitimate—no one had ever done this sort of thing at scale before, and the Department had little-to-no information on their website. With an entirely paper-based and manual method, our first, ambitious goal was to submit 50 applications—one for each state.


    As the Facebook groups for former Corinthian students grew larger, it became clear that the goal of 50 Defense to Repayment applications was not enough. We wanted to dispute the debt for as many debtors as possible.


    Karissa McKelvey (left) and Ange Tran (Ann Larson)


    Designer Ange Tran suggested we make a “wizard”—an application that walks debtors through each section of the long legal form. Tran had previously created a similar solution to automate sending letters for pro-solar policy advocacy in New York State. But years later, with a solid tech team and a large group of debtors in solidarity, our new application could take the concept further, with a validated, mobile-first design, data-driven progress dashboards, and secure data storage. It needed to be built fast, too: every day without a submission system meant another day of financial hardship and debt-collector harassment for hundreds of thousands of debtors.

    Our online form had to enable a debtor to submit a variety of information about themselves: sensitive data, such as their social security number, birthdate, and address; state law(s) broken by the school that would fall under the Defense to Repayment provision (that changed based upon the state the student had lived in); and supporting materials, such as their story or evidence of abuses. Making people fill out a PDF directly using Acrobat would lead to a substantially lower conversion rate. Many don’t have access to a desktop or laptop computer, become overwhelmed by long forms, or have limited time with their busy schedules.

    The website had to be easy to read—checkboxes needed to hide away complicated legalese, replaced automatically using a spreadsheet of related paragraphs. Data had to be transferred over a secure, encrypted channel (https) with the completed forms accessible through only one secure administrator account to protect the privacy of debtors. With a conveniently-timed lull in funding for my day job, I began developing the back end system and overseeing the technical architecture, working with Tran, Herrine, and our front end developer and designer Zach Greene.
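    To make the checkbox-to-legalese substitution concrete, here is a minimal sketch of the wizard’s core step, written in Python purely for illustration (the article doesn’t name the actual stack). Every state, citation, and paragraph below is an invented placeholder, not the Debt Collective’s real data.

```python
# Hypothetical sketch: map a (state, checkbox) pair to the legal paragraph
# it stands for, so debtors never have to write the legalese themselves.
# All citations and text are placeholders for illustration only.
STATE_CLAUSES = {
    ("CA", "false_job_stats"): (
        "Under [placeholder California statute], the school engaged in "
        "unfair business practices by publishing false job placement "
        "statistics."
    ),
    ("CA", "military_seals"): (
        "Under [placeholder California statute], the school unlawfully "
        "used military seals in its advertisements."
    ),
    ("NY", "false_job_stats"): (
        "Under [placeholder New York statute], the school engaged in "
        "deceptive acts by publishing false job placement statistics."
    ),
}

def build_claim_text(state, checked_boxes):
    """Assemble the legal-claims section of an application from the
    debtor's state and the abuses they checked off in the wizard."""
    paragraphs = [
        STATE_CLAUSES[(state, box)]
        for box in checked_boxes
        if (state, box) in STATE_CLAUSES
    ]
    return "\n\n".join(paragraphs)

print(build_claim_text("CA", ["false_job_stats", "military_seals"]))
```

    In the real application, a table like this would be loaded from the legal team’s spreadsheet, and the assembled text rendered into the printable PDF the Department requires.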

    We released the prototype to the public on March 25 after four weeks of research, design, and development. Within a week, the Debt Collective received over 300 applications (over 2,000 to date). Herrine printed out hundreds of applications and hand-delivered them to the Department of Education with representatives from the Corinthian strikers. (When we asked the Department if we could send claims by email, they said no; they would have to be printed and submitted on paper.) Although the Debt Collective continued to submit applications on behalf of debtors over the next few weeks, the Department of Education remained silent. We wondered if they were overwhelmed by the volume of forms.


    Finally, after two weeks, the Department of Education announced that they would consider the discharge of some students’ debt. This was a major victory for those affected, but also led to a tremendous revelation about the limits of the Department. They maintain that most debtors must still submit forms individually to prove their injury. A select group of 40,000 Heald students can submit to a “fast track” process, and need only complete an attestation form. But, laughably, the form only works in Adobe Acrobat—not on smartphones or other PDF programs. No wonder only 180 of the 40,000 eligible students have completed the form so far!

    The Department of Education loves certain applications of technology. In April, Secretary Duncan published a piece on the importance of expanding the role of technology in the classroom. And they have an entire Office of Innovation and Improvement that provides grants for education innovation. But when it comes to making conned students aware of their right to even seek debt discharges, and providing the means for them to apply for that discharge, the Department’s technology is essentially nonexistent. If we could fashion a superior solution with a rag-tag, mostly-volunteer crew in two weeks and get the word out to thousands of students, why can’t a government agency with billions in funding do the same?

    Despite the vast amount of money, human resources, and technical knowledge the Department possesses, it still does not have an automated process to discharge the debts of all Corinthian students—or any students—at once. The government’s failure to build technology may be one of intentional neglect: erecting barriers to legal processes to which we are entitled. There are resources the Department could draw on inside and outside of government, such as 18F and Code for America, to build superior technology that better connects the data warehouses tracking past, current, and future students and their families’ finances. But so far this hasn’t happened. Unfortunately, this is not a technical problem but a political one. The onus of bad debt is placed on the individuals who took it on, not on the corporations that profit. And right now the Department’s technological priorities reflect that attitude.

    We’ve proved that something as simple as a PDF generator and mobile-first web front end can radically affect the landscape of education policy, but there’s still a lot of work left to do. We plan on extending the current application to work for all student debtors from multiple colleges, not just the Corinthian chain.

    As a Bay Area “techie,” I’ve heard more times than I can count that we are somehow “changing the world”—and usually I find it offensive. The tech community is rich with stories of startup-driven “disruption” that upends the status quo in a particular sector without attention to immediate needs within the political-economic climate. The fetishization of “disruption that changes the world” overlooks relatively simple efforts, aggrandizing those that affect more “fundamental” functions of the economy or the technology that runs it. But as government technology waxes and wanes, startups fail, and bubbles pop, we must dare to ask: what if the best things we can do in civic tech aren’t the most complex, cutting-edge, or “innovative” technological feats, but rather ones that are attuned to real needs on the ground and made in dialogue with grassroots communities and activists? Technology alone can’t “change the world,” but when technology is employed strategically by social movements, it can certainly help us do so.

    Karissa McKelvey is a programmer and former academic experienced in building interactive data visualization and collaboration tools.

Civic Hacking Civic Tech Democracy

Politics and the Culture of Fear: Is There a Place for Digital Disruption?

Politics and the Culture of Fear: Is There a Place for Digital Disruption?

  • It feels as if we can’t escape the culture of fear and extremism pervading politics. Political discourse is more vitriolic than ever after San Bernardino and Paris, and amid months of partisan name-calling and ugly mud-slinging among candidates in the U.S. presidential race. And clearly, there are no easy solutions to unraveling this vicious cycle.

    During the Christmas holiday, I had an experience that perfectly illustrated this to me. My family and I were at a friend’s house for a holiday event, and I overheard her guests talking as I walked through the kitchen. I heard, “The more he says, the more I like him.” Then, “He says the things we all think but are afraid to say.” I started to get that sick feeling in the pit of my stomach, hoping they weren’t talking about Donald Trump. Then I heard, “The only problem with building a wall between Mexico and the U.S. is that it will have to be so big that it’s impractical and expensive.” I tried to talk myself off the ledge, saying to myself, “Don’t open your mouth, just keep walking, don’t say anything, it won’t help or change anyone’s mind…..” But then as I was about to turn the corner, safely avoiding a conversation that would surely have turned ugly, I heard, “Of course we should ban Muslims from entering the country. Look what they did in Paris.” So, I turned sharply on my heel and unwisely marched over to the little group sitting around the kitchen table.

    “Excuse me,” I said, “but I couldn’t help but overhear your conversation, and I wish that you would consider the fact that excluding or persecuting people solely on the basis of their religion or ethnicity is how (voice rising) the Holocaust started.” And then, when the response to that grenade lob was dropped jaws and the explanation, “It would only be temporary,” I looked at them incredulously, probably with disgust on my face, and said, “That’s what Hitler said and” just in case they didn’t get it the first time, “that’s how the Holocaust started.” Then I abruptly left, muttering, “This was a mistake, I can’t talk about this…..”

    I found this conversation terrifying—not only because the thought of Trump as President is terrifying, nor because I was disappointed in myself for losing my cool and creating an extreme, unbridgeable divide between our viewpoints by invoking the Holocaust. No, this conversation was most terrifying because these people were not bad people. They were the type of people I appreciate: good, kind, hard-working people who love their kids and their family.

    So where does that leave us?

    I don’t have a solution, and indeed, my own extreme reaction during the kitchen table conversation shows that I lack objectivity and am certainly part of the problem. I do, however, as a scientist believe that we can harness what we know about our minds and brains to neutralize this vicious cycle of social and political extremism. Could digital disruption help move us along a path to such change? There might not be an app for that, but below I list three steps I believe could put us on the road towards digital disruption of the political culture of fear.

    1. Frame political extremism as an emotion regulation problem. Before any digital disruption can happen, we have to make sense of the problem and have a concept of what’s going wrong. We have all had one of those kitchen table conversations I described above. In these conversations, our emotions get the better of us: fear, disgust, anger. This is a problem in how we control our emotions and how our emotions control our thoughts, decisions, and actions, something psychologists call emotion regulation. The problem is that our strong emotions rarely convince our debating partners. Instead, they solidify the views everyone already holds, causing us to cling to them even more strongly and rigidly. Common ground is lost, and the divide between perspectives seems increasingly unbridgeable.

    Imagine how a version of that kitchen table conversation happens on the political world stage, sabotaging attempts at diplomacy and mutual understanding. The result is not just upset and angry people. Now the result is that our emotions directly shape political discourse, legal decisions, and policies that can affect generations to come.

    Thus, a first crucial step towards disruption of the political culture of fear is to frame political discourse in terms of emotion regulation, applying what we know about what goes wrong and how to fix it on the individual and group level.

    2. Use technology to promote empathy. Recent research in political psychology suggests that empathy can help heal rancorous political divides. A recently-published study showed that when political advocates fail to understand the values of those they wish to persuade, this “moral empathy gap” causes their arguments to fail. However, when political arguments are reframed in the moral terms of the other side, they are more effective. For example, when asked about their views on universal healthcare, conservatives who heard “purity arguments” (e.g., sick people are disgusting and therefore we need to reduce sickness) were friendlier towards universal healthcare, compared to when they heard “fairness arguments,” which are more consistent with liberal values.

    If we can use technology to bridge the moral empathy gap, we might be able to reduce political polarization and promote better emotion regulation, more compromise, and deepened understanding. Virtual Reality (VR) might be one such technology. I previously wrote about Chris Milk’s thought-provoking TED talk on VR as the “ultimate empathy machine.” By creating a sense of presence and of real interactions with people and worlds, VR forges empathic bridges leading to greater understanding and compassion. In his work with the UN, Chris Milk uses VR to vividly portray the plight of refugees to politicians and policy makers. How does seeing and experiencing the suffering of 5-year-old children in the refugee camps influence policy making? Almost certainly for the better.

    3. Use technology to calm the fearful brain. As political ideologies become increasingly polarized, neuroscience research suggests that the differences between liberal and conservative viewpoints may extend beyond policy preferences to fundamental differences in the “fearful brain.”

    In a paper I wrote in 2014 with Dave Amodio, a professor at NYU, we found that children of liberal compared to conservative parents showed a stronger “N2” brain response to mildly threatening and conflicting information. A greater N2, derived from EEG, suggests more openness to uncertainty, ambiguity, and threat. A culture of fear, in politics or otherwise, is marked by the opposite of this: inflexibility and discomfort in the face of uncertainty and ambiguity, along with resistance to change. These aspects of fear are part of the foundation upon which intolerance is built.

    What if we could create computerized interventions that promote our ability to cope with uncertainty and change, perhaps by strengthening the N2 response? My research on the stress reduction app Personal Zen, as well as other research, shows that this may be possible. More research is needed, but if science-driven digital mental health continues to evolve, reducing the political culture of fear could soon be in the palm of our hand.

    Tracy Dennis-Tiwary is a professor of psychology and neuroscience and writes about mental health and technology. This article was originally posted on her blog, Psyche’s Circuitry.

Civic Tech Commons



A new model—the Civic Trust—may help protect the public’s interest as civic tech evolves.

This is the second installment of three pieces on the business of civic tech and how we should be rethinking that business. You can find Sean McDonald’s first piece here.

Trust in institutions, globally, is at an all-time low. Government, business, journalism, and even nonprofits are all losing the public’s faith. The U.S. government, in particular, has hovered near its lowest approval ratings in history for an uncomfortably long time. And, given recent history, it’s hard to blame us.

One reason is that we’re using the same organizational models we’ve used for decades. The way that we legally structure organizations (Corporations, 501(c)3s, Limited Liability Companies, etc.) defines their incentives, values, decision-making structures, and priorities. Incorporation models are the DNA of organizations—even those with revolutionary approaches to collective action—and that DNA replicates the same structural flaws and risks as traditional organizations. We are trying to build the future with the same organizational structures that gave us the present and—as Einstein said, “We cannot solve our problems with the same thinking we used to create them.” It’s fair to say that decision making structures are, collectively, how we think.

Amidst the global fallout in trust, there are many trying to build technological solutions to our trust deficit—both better verification and “trustless” systems. The most recent (and, arguably, credible) technology solution is the blockchain—the distributed administration and ledger architectures made famous by Bitcoin. The blockchain offers a huge amount of potential—in particular, it decentralizes administration (and administration costs), improves transparency (for those capable of understanding it), and increases the reliability of complex interactions. As a way to contextualize blockchain’s potential impact on the ecosystem and evolution of technology adoption, Nick Grossman’s “Venture capital vs. community capital,” is a great overview.

However, as Rachel O’Dwyer’s “The Revolution Will (not) Be Decentralized: Blockchains” points out, the primary challenges in collective action, while interesting technologically, still come down to mediating relationships, managing governance structures, and being able to set common standards. Blockchains create huge opportunities to transparently design and manage distributed processes, but they don’t enable us to evolve norms, resolve disputes arising from the transactions they administer, or meaningfully solve the definitional issues that lie at the core of our representative governance models (more on those here). To invert the “law cannot solve technology’s problems” trope: we can’t expect technology to solve our organizational problems.

One of our most challenging organizational problems is how to balance the financial needs of information channels and the integrity of the information and relationships the channels represent. Publishing platforms make it even more complicated—they use their content and engagement structures to claim and monetize our behavior and relationships. The majority of privately owned platforms—even well intentioned ones—sell some form of access to users, data, services, and servants—all of which distorts the integrity of the underlying relationships. As respected, anonymous cybersecurity expert @SwiftonSecurity tweeted recently:


And that’s where it gets really concerning—the more civic groups and governments rely on commercial technologies as intermediaries, the more potential there is to strain or distort the already struggling trust relationships between the public and institutions. As Erica R.H. Fuchs noted in a research paper about DARPA, institutions that develop technology require embedded network governance, meaning that the network should have a built-in role in making decisions. According to Fuchs, that means more than just technology—it includes bridging the policy, business, and legal implications of technology. That, as Stanford’s Lucy Bernholz notes, means not only changing what civil society groups do; it means changing how they work.

So. How do we build trust in organizations? More importantly, how do we make organizations more trustworthy? I’m a big believer in progress through open, participatory processes. Still, they’re big questions that will require a lot of experimentation to answer. Finding socially conscious ways and spaces to experiment will be incredibly important, especially in an intellectual property climate that allows companies to own the ways that we engage with each other. A fundamental part of building trust in those organizations will involve ensuring that the underlying ideas that redefine collective engagement don’t disappear or irreparably change when the companies that build them do. And, like most forms of both collective action and technology, we’ll probably end up with more than one answer.

One way to build those socially conscious, safe spaces may be a new approach to an old structure: the Civic Trust. Traditionally, trusts are privately created legal agreements that create systems of management and governance over a particular set of assets, according to a set of values or desired goals. Trusts also create a legally enforceable fiduciary duty to the beneficiary, which can be defined when they’re created. For a broad overview of trusts, this article from Findlaw is helpful.

The use of Trusts to protect a set of common resources or values isn’t new. Natural resources are often donated to governments through Public Trusts, which can set standards around the maintenance and care of that resource. Similarly, many Trusts are created to ensure the integrity and sustainability of institutions—in his recent farewell to readers, the Guardian’s Alan Rusbridger cited how important the Scott Family Trust has been in helping maintain the newspaper’s independence. Similarly, in this article, Keith Porcaro lays out an incredibly smart approach to how Trusts could be used to protect user interests in the use of their data. Even venture capitalists form trusts to manage risk. But, in their present form, Trusts—like almost every other incorporation model—focus on specific inputs (resources, financial investment, votes) and less well-defined outputs (conservation, independence, profit, democracy). 

    That’s where Civic Trusts are different. Civic Trusts focus on ensuring that transparent and meaningful participation processes are built into the way that technology products evolve. As most social platforms prove, well-designed engagement processes are resources—and in publicly supported technology design, how they’re governed matters.

Here’s how it works: A Civic Trust would be created by an organization that wants to be able to make meaningful guarantees to protect its users and customers. The Civic Trust would create an independent organization that owns the code and data resources created by the creator, using limited, revocable licenses to give for-profits, nonprofits, and governments the right to use, adapt, and sell products based on the underlying code. These licenses would give the Civic Trust the ability to audit and ensure that basic standards of participation were met in the way both the technology and organizations evolved (things like rate of versioning, feature development, security defaults, dispute resolution processes, data use, etc.). As opposed to focusing on the value of the outcomes, Civic Trusts would focus on the values embedded in the decision making processes of the user organizations. 
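    One way to picture the auditing mechanism described above: each licensing standard becomes an explicit, checkable rule, and a licensee that fails the checks risks revocation. The sketch below is illustrative Python only; the standards, field names, and thresholds are invented, not drawn from any actual Civic Trust agreement.

```python
from dataclasses import dataclass
from typing import Callable

@dataclass
class AuditStandard:
    """A single participation standard a licensee must meet."""
    name: str
    check: Callable[[dict], bool]  # takes a licensee report, returns pass/fail

# Placeholder standards loosely based on the examples in the text
# (rate of versioning, security defaults, dispute resolution).
STANDARDS = [
    AuditStandard("regular versioning",
                  lambda r: r.get("releases_last_year", 0) >= 2),
    AuditStandard("secure defaults",
                  lambda r: r.get("https_by_default", False)),
    AuditStandard("documented dispute process",
                  lambda r: r.get("dispute_process_documented", False)),
]

def audit(report: dict) -> list:
    """Return the standards a licensee fails; an empty list means the
    revocable license stays in good standing."""
    return [s.name for s in STANDARDS if not s.check(report)]

print(audit({"releases_last_year": 4,
             "https_by_default": True,
             "dispute_process_documented": True}))
```

    Encoding the standards as data rather than prose is what would let a Trust audit many licensees consistently, and publish its criteria for public scrutiny.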

This approach isn’t without precedent. Civic Trusts work the same way that Apple and Google are structured, except that instead of using the structure to avoid taxes, it would use it to build and steward truly open and participatory governance processes (in technology). Civic Trusts could create safe spaces to experiment with governance and decision making processes, hardcoding a public advocate into the organizational DNA of the companies and technologies that connect us. Civic Trusts would help us define embedded network governance in the public interest.

My next post will dive into the nitty gritty of a Civic Trust, including sample contractual provisions and ideas. At its core, though, Civic Trusts are a recognition of the importance of ingraining participation in public interest technologies. Ultimately, we have yet to figure out how to design organizations that can balance the public interest with the expectations and requirements of their funders—as Paul Klein notes, even nonprofits struggle.

While I’m sure that there are many who disagree, that disagreement is exactly why we need Civic Trusts. We don’t have to agree, but we need the space to experiment and disagree in public. After all, as long as we, the public, invest in privately owned platforms—with our time, data, money, or anything else of value—we should have a meaningful voice about how they treat us, grow, and change. Otherwise, who would trust them?

Sean Martin McDonald is the CEO of FrontlineSMS. Frontline helps governments, businesses, technology providers, and nonprofit organizations translate what they do into text messaging microservices, enabling them to reach more people, more efficiently.

#PDF Civic Tech Design



This year, many of the speakers at Personal Democracy Forum challenged us to rethink the cultural design of our systems, not simply the technical.

(Andreas Pizsa, CC BY 2.0)
  • This year, many of the speakers at Personal Democracy Forum challenged us to rethink the cultural design of our systems, not simply the technical. Deanna Zandt asked us to “Imagine All The Feelz” and consider how we might create space for personal truths, even the painful ones, in our social media discourse. And in “Public Engagement is Broken: Are You Part of the Problem?” Catherine Bracy suggested that, instead of building a new social network, we redesign the public meeting from a space for contentious bickering to a space for productive dialogue.

    The truth is, the culture of a system determines its success. We need systems that are comfortable with the notion that they are not perfect. We need systems that acknowledge that we are always learning, that we improve over time. We need systems built on a culture of ongoing improvement, not fixed outcomes. We see this learning culture across many disciplines, from agile development in tech to continuous improvement in education. The core logic at the heart of all of these successful systems is that of the “growth mindset,” a philosophical stance first identified by Stanford psychologist Carol Dweck. 

    A “fixed mindset” assumes our qualities of character and intelligence are set at one inherent level, and there is little to be done to change them. Some of us are smart, some of us are dumb, and we play the hand we’re dealt. A growth mindset assumes that our intelligence and abilities are dynamic, and that we can improve our skill levels through practice. Our capacity is directly related to our effort. Carol Dweck finds that fixed mindset students are mainly motivated “to look smart all the time and never look dumb.” Growth mindset students are more likely to continue working hard despite setbacks, and look at challenges as opportunities for growth.

    What will happen if we adopt the growth mindset as the culture of our civic engagement systems? We can reframe our failures as opportunities to learn. We can contextualize data as a means to an end. We can embed accountability as a stepping stone to progress. We can meet people where they’re currently at and create opportunities for deeper participation over time. Our systems will incentivize participation, because participation will create improvement.

    Tristan Harris broke down what happens when we use a fixed-outcomes approach to designing our systems in his PDF talk, “Constantly Distracted? Design for Time Well Spent.” Using a fixed volume metric like time spent has led to product features like the Facebook Timeline, which encourages passive content consumption. The Facebook Timeline has had a profound impact on how we spend our time on the internet, reducing active participation by omission. What if we measure mindful engagement, as Tristan advocates? What if, as he suggests, we use positive impact on human well-being as a measure of our success instead of time spent? Using our metrics to track what we value gives us a concrete pathway to deliver on growth mindset-based design.
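    The difference between the two metrics is easy to see in code. The sketch below, in Python chosen only for illustration, scores the same session two ways; the event types and weights are invented stand-ins for whatever a real product team would define as mindful engagement.

```python
# Invented weights: how much each second of an activity "counts" toward
# engagement. A real team would derive these from research, not guesses.
EVENT_WEIGHTS = {
    "passive_scroll": 0.0,   # timeline consumption counts for nothing
    "read_article": 1.0,
    "comment": 3.0,
    "rsvp_to_event": 5.0,    # off-screen civic action counts most
}

def time_spent(events):
    """The fixed-volume metric: every second counts the same."""
    return sum(e["seconds"] for e in events)

def mindful_engagement(events):
    """A crude alternative: seconds weighted by how active the behavior is."""
    return sum(EVENT_WEIGHTS.get(e["kind"], 0.0) * e["seconds"]
               for e in events)

session = [
    {"kind": "passive_scroll", "seconds": 600},
    {"kind": "comment", "seconds": 60},
]

# Ten minutes of scrolling dominates time spent, yet contributes nothing
# to the engagement score; one minute of commenting carries it instead.
print(time_spent(session))          # 660
print(mindful_engagement(session))  # 180.0
```

    A product optimizing the first number ships infinite scroll; one optimizing the second ships prompts to comment and organize. Same data, different incentives.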

    Last year I founded a social systems design lab called Thicket. From its inception, we’ve focused on creating a space for people to think and work together to solve our most entrenched systemic problems. Throughout the process of designing our community-powered research and design platform launching this summer, our team has been motivated by this question: How might we instill the growth mindset in our product design? We think we’ve done pretty well, but in the spirit of continuous improvement, we can do even better.

    Coming to PDF this year as a Civic Hall Fellow has been an invigorating reminder that there is a strong, motivated, energized community of thinkers, designers and technologists who believe that our systems can truly be better, and are putting in the effort to make them so. If we can channel that spirit of dynamic improvement into all of our systems, I believe we will have successfully created the conditions for greater civic engagement. 

    Deepthi Welaratna (@deepthiw) is founder of Thicket, a design lab and consultancy creating products and experiences that harness the power of global communities to move us forward, faster. Deepthi has spent the last 14 years influencing complex systems through public policy campaigns, creative leadership programs, and movement building around a range of social and economic issues. Deepthi attended PDF as a Civic Hall fellow this year.

#PDF Algorithms Civic Tech



Weapons of math destruction are characterized by their opacity, their power, their widespread use, their poor definitions of success, and their engendering of pernicious feedback loops.

  • Personal Democracy Forum is next week, and we’re reaching out to some of the speakers for a quick preview of their respective talks and panels. What follows are a few words from Cathy O’Neil, who writes at her blog and is working on a book about the dark side of big data. O’Neil will deliver a talk on “Weapons of Math Destruction.”

    You’ll be speaking at the conference on the subject of weapons of math destruction. Give us a preview: what the heck are weapons of math destruction?!

    They are mathematical algorithms that are being deployed to make important life decisions for certain people at certain moments. They are characterized by their opacity, their power, their widespread use, their poor definitions of success, and their engendering of pernicious feedback loops. I will give a bunch of examples of WMD’s from education (the Value-Added Model for Teachers), the criminal justice system (evidence-based sentencing models), and politics (micro-targeting).

    The theme of the conference this year is the future of civic tech. As briefly as you like: Where do you think civic tech is going, what do we have to look forward to, and what pitfalls should people working in this sector be aware of?

    I’d say that my example with micro-targeting in politics is more or less an intersection of WMD’s with civic tech. I am, in other words, a civic tech skeptic.

    I’m focusing on the pitfalls. Civic tech has a lot of positive vibes, but successful data work, which is usually done in the quest for power, money, or both, should teach us a few lessons. If we want data or technology to work for the public good, we have to make it so in a deliberate and thoughtful fashion. It’s not good enough for us to “open up the data” and wait for the tide that lifts all boats. 

#PDF Civic Tech



  • Personal Democracy Forum is in less than two weeks, and we’re reaching out to some of the speakers for a quick preview of their respective talks and panels. What follows are a few words from Nanjira Sambuli, a research manager at iHub in Nairobi, who will deliver a talk on “During and After Atrocity: How Kenyans Use The Web to Heal and Deal.”

    So, for people who aren’t familiar with your work, how does it relate to civic tech?

    I manage research around governance and technology at iHub. That basically means that I spearhead and/or oversee research projects that assess how technology is being adopted or co-opted into governance in Kenya, and increasingly in East Africa. Its relation to civic tech is through insights gleaned from, for instance, studying if/how ICTs have facilitated two-way interaction between government and citizens.

    You’ll be talking at the conference about how Kenyans have used the web to “heal and deal.” What most surprises you about the use of the web after an atrocity?

    My country has faced a number of security-related tragedies in the past three years, and due to the increasing uptake of social media, Kenyans have had an opportunity to grieve together, share in their anger, and at various turns engage in collective action towards seeking accountability or raising funds for emergency relief. It has been particularly interesting to observe the various civic roles that Kenyans online have engaged in, individually and collectively. It has also been interesting to observe the life cycle. For instance, very pertinent, difficult questions are often asked, in a quest to seek accountability. Folks, for instance, will tweet various authorities and representatives with great vigor “in the heat” of an event, but that vigor seems to dissipate the moment we move on to something else in the news cycle. Observing this over time has led me to wonder if the use of social media in times like these, and how Kenyans typically engage on these platforms, can be considered civic tech, and what that means for developers, legislators, civil society organisers, activists and others keen on engaging them online, or offline towards a civic action. The Kenyan case is not necessarily unique, but a particularly interesting one off which to ask deeper questions on what constitutes civic tech: is it tools, is it the use of tools, is it both?

    The theme of the conference this year is the future of civic tech. As briefly as you like: Where do you think civic tech is going, what do we have to look forward to, and what pitfalls should people working in this sector be aware of?

    I’m intrigued by the idea behind the term and concept of civic technology. As yet, I haven’t come across an agreed upon definition, and based on practice, it seems centered around designing specific tools that can facilitate or enhance civic engagement or civic action. I am curious as to how impact is assessed. I am curious (as a researcher) whether citizens’ needs are incorporated into design and implementation. I am curious as to what has been found to be the motivation and incentives among the various target audiences to use and reuse such tools as designed. One pitfall I think should be considered is that people may not be keen to visit 10 different apps designed for 10 different civic actions…how do we ensure that the design, deployment and continued use of civic technology is considered meaningful and worthwhile in the long-term? 

#PDF Civic Tech GovTech



“I’m hoping to challenge the audience to think critically about our role as advocates for digital democracy. Are we focused on the right problems? Where are our blind spots?”

Personal Democracy Forum is in less than two weeks, and we’re reaching out to some of the speakers for a quick preview of their respective talks and panels. What follows are a few words from Catherine Bracy, Code for America’s Director of Community Organizing, who will deliver a talk entitled “Public Engagement Is Broken. Are You Part of the Problem?”

So, for people who aren’t familiar with your work, how does it relate to civic tech?

Code for America’s mission is to build government that works for the people, by the people in the 21st century. We do that by collaborating with government on improving service delivery—in the health, safety and justice, and economic development areas—through technology. We also focus on improving the public’s relationship with government by creating innovative spaces and channels (sometimes digital) where government and residents can meet.

I understand you’ll be speaking at the conference about how public engagement is broken. Is this public engagement with government or with communities or something else entirely? You will also address how someone can tell if they are part of the problem; are people in the audience going to be squirming when you get there?

I’m speaking specifically about the public’s engagement with government. I’m certainly hoping to challenge the audience to think critically about our role as advocates for digital democracy. Are we focused on the right problems? Where are our blind spots? Why haven’t we been able to significantly move the needle on the public’s sense of trust in government? But, I’m also really hopeful and plan to share some bright spots I’m seeing.

The theme of the conference this year is the future of civic tech. As briefly as you like: Where do you think civic tech is going, what do we have to look forward to, and what pitfalls should people working in this sector be aware of?

I think we’re at a point in the civic (gov) tech movement where we can move from building apps to show what’s possible to really thinking strategically about how we can implement technology to make structural change inside government. We are beginning to measure our success not just by how many users a particular app gets, but by how much impact a tool has on a social outcome, or by the kinds of process and policy changes that happen within institutions as a result of building a tool. In terms of what to watch out for, I think we’re going to need to pay a lot of attention to privacy as we help governments open more data. But generally, there are lots of pitfalls whenever you try to change the status quo. As someone, can’t remember who, said, “the first ones through the wall are always the bloodiest.” But the friction is part of the process. It’s how we know we’re getting stuff done. And we’re extremely excited about what’s next. 

#PDF Civic Tech



Personal Democracy Forum is in less than three weeks, and we’re reaching out to some of the speakers for a quick preview of their respective talks and panels. What follows are a few words from Nancy Lublin, the CEO of Crisis Text Line.

  • Your upcoming talk is titled “Winner Texts All.” In your work at Crisis Text Line, you’ve made intensely personal connections possible over a seemingly impersonal communication method. What has that taught you about capturing the power of the text message? Where is the inspiration for your talk coming from?

    Text feels both more private and anonymous, while also allowing for deeply personal real sharing. It’s a phenomenal medium for counseling. Last week someone posted something on Imgur that said: “I suffer from depression and my anxiety prevents me from calling the suicide hotline. Found out there is a text version 741-741 “Start” and it’s been some of the best advice no therapist in 16 years has given me.” That post was shared over 600,000 times in 24 hours, then went to Tumblr, then the homepage of Reddit.

    The theme of the conference this year is the future of civic tech. As briefly as you like: Where do you think civic tech is going, what do we have to look forward to, and what pitfalls should people working in this sector be aware of?

    I’m excited about how much is happening in this space, but I am going to lay down a controversial plea: we don’t need lots of stuff, we need lots of good stuff. For example, Crisis Text Line copycats are a really dumb idea that will confuse people and fragment the data. So while I believe in an open system, I’m hoping we can all be smart and collaborate to do the best, most important work, efficiently.

Civic Tech Smart Cities



Though much of our experience of civic tech to date has happened in the placeless plains of cyberspace, the next great frontier will be crafting the relationship between citizens and their connected urban habitats.

To kick off our coverage here at Civicist, we asked our contributing editors to share their thoughts on “What is civic tech?” We’ll publish their answers as they trickle in, and look forward to continuing the conversation in the weeks and months to come.

I recently spent my spring break devouring Red Plenty, Francis Spufford’s 2012 novel set in the 1950s Soviet Union. It is a tale of cybernetic utopia—the dawn of a might-have-been perfectly-planned industrial economy controlled by computers—where the physical world and the digital world are linked together in scientific synchrony. As I read, it became clear to me that the same trends will direct the future of the civic tech movement, bringing both a new scientific sensibility and a decidedly physical dimension to its computations.

Science has been a stranger to civic tech, despite its deep roots in human efforts to deal with the challenge of cities. Take the term “civic” itself. In civic tech, we use the contemporary meaning of “the rights and duties of citizens in relation to their community.” But in the late 19th century, the first urban planners used a different meaning as a call to arms. For them, “civics” described the application of new social science methods to the wicked problems of the industrial city. Civics was about better management, not mass indoctrination.

In the middle of the 20th century, the promise of digital computing breathed new life into these ambitions. Almost as soon as people invented the idea of computers, they started fantasizing about using them to predict and control the behavior of society. Early theorists like Vannevar Bush, who oversaw the Manhattan Project, understood that these applications would in fact drive the evolution of the technology. As he wrote in 1945, at the dawn of the information age, “There will always be plenty of things to compute in the detailed affairs of millions of people doing complicated things.”

Fast forward to the early 21st century, and cities and computers are again evolving together. As the pace and scope of global urbanization accelerates, computing is moving off of our desktops and into the buildings, vehicles, and streets around us. Though much of our experience of civic tech to date has happened in the placeless plains of cyberspace, the next great frontier lies in crafting the relationship between citizens and their connected neighborhoods. The social web of our cities is about to get plugged into the internet of urban things.

I’m not alone in my hope that new knowledge gained through science can help us build better cities. An archipelago of laboratories spanning the globe has spun up in the last decade, from Boston to Mumbai, to invent the technologies and do the research to understand how cities work, and how to get people to live safer, more productive, more sustainable lives in them. Their approach? Deeply quantitative and heavily computational investigations into the processes of how cities grow, thrive, stagnate, and decline in response to human actions. Behavioral science has made huge strides in public policy in recent decades. But we haven’t seen anything yet.

Civic tech will indeed be transformed by the lessons of this new urban science. But it will also push back against its technocratic urges. Civic tech’s pioneers have figured out how to thrive on decentralized participation and collaboration, which will be key to understanding and innovating in a messy and fast-changing urban world. The Soviets tried to create a remote control for the entire industrial economy, to control all the factories from the Kremlin. The smart city’s scientists will need to learn when to loosen the reins. The civic tech movement will show them when and how.

Civic Tech organizing



Civic tech requires believing that the technology of today can usher in a better tomorrow.

To kick off our coverage here at Civicist, we asked our contributing editors to share their thoughts on “What is civic tech?” We’ll publish their answers as they trickle in, and look forward to continuing the conversation in the weeks and months to come.

When Aristotle concluded “man is a political animal,” he identified a core tenet of human nature. People want to be seen and heard. People want to feel connected to one another. I became interested in the opportunities for technology to enhance democracy when I witnessed the 2007 Iowa caucus. Much of the civic learning at the time came from face-to-face participation, in-person dialogue, and deliberation with new people. I realized digital tools could potentially reduce the barriers to entry in this arena, but also that in the noisy ruckus of people coming together, some speaking and listening would inevitably be lost. I left Iowa believing in the potential as well as the perils of digital tools for political participation.

I was inspired to study this further after serving as an organizer for the Obama campaign. It was there that I got to experience the rush of trying to mobilize, organize, and galvanize people. The campaign brought out all types of people who were unified in working towards a better future. But what happens the day after the election?

This is where civic tech comes in. Civic tech provides an opportunity to engage citizens in governance beyond simply voting every two to four years. Civic tech promises a more egalitarian public sphere. Civic tech is about deepening democracy. This definition is much more expansive than efficient public service delivery. It also relates to the deeper reason that people agree upon democratic governance in the first place. Of course, the promise of civic tech is tempered by the reality of people, politics, and institutions.

How do we grapple with political incentives and technology in real life? This is again where civic tech comes in. One dimension of civic tech is being realistic about human behavior. People are social beings. People want to do meaningful work and will participate in governance if it is structured well, if it is a social experience, and when they see results. Possible outputs can include new relationships with neighbors and government officials. Outputs are not as simple as the metrics of page views or clicks; engagement should not be perfunctory. Furthermore, follow-up needs to be viewed as a vital component of participation from the outset. The life cycle of civic tech requires iterative two-way communication.

On the 50th anniversary of the “Bloody Sunday” events that took place in Selma, Alabama, President Obama said:

Because the single most powerful word in our democracy is the word “We.” We The People. We Shall Overcome. Yes We Can. It is owned by no one. It belongs to everyone. Oh, what a glorious task we are given, to continually try to improve this great nation of ours.

Of course, this is not confined to America. It applies to all governing institutions that try to represent “we the people.”

Civic tech is not about the tool or technology. It’s about working towards the type of society we want to live in. It has to be aspirational because so too is the democratic ideal. The very idea that a single federal government can govern 300 million people takes faith. So too does civic tech. Civic tech requires believing that the technology of today can usher in a better tomorrow. Though time will tell, the very interest and investment in civic tech presupposes that people have faith.