Partnering with City Connects makes a difference for students, schools, and families. For community agencies, involvement with City Connects means serving students better.


Having observed many positive effects associated with City Connects, we need to assess whether those effects can be attributed to the intervention itself. Evaluation designs that randomly assign participants to treatment and comparison groups are the strongest basis for causal conclusions: random assignment minimizes “selection bias” by ensuring that the groups are equivalent, on average, before treatment begins. Because City Connects has grown organically over time, and because this school-level intervention is tailored to each child individually, it has not been possible to implement a randomized controlled trial. We have therefore had to ask whether the promising effects we see from City Connects are indeed causal.

In the absence of a randomized design, we have built the strongest available evidence for the effectiveness of City Connects by combining high-quality data with statistical methods that minimize the selection effects that arise when random assignment is not possible. Starting with basic inferential methods such as ordinary least-squares regression and moving to more complex modeling techniques, we continue to see consistent results across a variety of domains. Our primary analyses employ four complementary methods: longitudinal growth curve analysis, propensity score weighting, partitioning analysis, and individual-level fixed effects models.
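Propensity score weighting, one of the four methods named above, can be illustrated with a small sketch. The data below are simulated purely for illustration (they are not City Connects data), and the logistic regression is fit by plain gradient descent to keep the example dependency-free; the actual analyses are, of course, far more elaborate.

```python
import numpy as np

rng = np.random.default_rng(0)

# Simulated data (illustrative only): a pre-intervention covariate,
# treatment assignment that depends on it (selection bias), and an outcome.
n = 2000
x = rng.normal(size=n)                          # pre-intervention covariate
p_treat = 1 / (1 + np.exp(-(-0.5 + 1.0 * x)))   # selection depends on x
t = rng.binomial(1, p_treat)
y = 2.0 * x + 1.0 * t + rng.normal(size=n)      # true treatment effect = 1.0

# Step 1: estimate propensity scores P(T=1 | x) with a logistic regression
# fit by simple gradient ascent on the log-likelihood.
X = np.column_stack([np.ones(n), x])
beta = np.zeros(2)
for _ in range(2000):
    p = 1 / (1 + np.exp(-X @ beta))
    beta += 0.1 * X.T @ (t - p) / n
ps = 1 / (1 + np.exp(-X @ beta))

# Step 2: inverse-probability-of-treatment weights (IPTW).
w = np.where(t == 1, 1 / ps, 1 / (1 - ps))

# Step 3: the weighted difference in mean outcomes estimates the
# treatment effect; the unweighted (naive) contrast is biased because
# students with higher x are both more likely to be treated and
# expected to have higher outcomes anyway.
ate = (np.average(y[t == 1], weights=w[t == 1])
       - np.average(y[t == 0], weights=w[t == 0]))
naive = y[t == 1].mean() - y[t == 0].mean()
```

In this simulation the naive contrast overstates the true effect of 1.0 considerably, while the weighted estimate lands close to it; the same logic motivates using propensity weights when treated and comparison schools differ before the intervention.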

Beyond the methodological, we have taken a logic-building approach to understanding the City Connects program. Several of our findings are consistent with what we would expect to see if the effects are causal:

  • We see dosage effects: more time in the program is associated with more improvement in outcomes.
  • We see evidence that we have successfully addressed selection bias: after applying propensity weights to take pre-intervention differences into account, City Connects and comparison students are comparable.
  • Moreover, if there are selection effects, they work against City Connects: before propensity weights are applied, City Connects students start off at a disadvantage in report card scores relative to comparison students, but after the City Connects experience they surpass those in comparison schools. 
  • The intervention achieves what it was designed to do: City Connects is helping children thrive and achieve in school, as measured by report card grades, standardized tests, and classroom and personal behavior.
  • We see results replicate: our evaluations have demonstrated the positive effects of City Connects repeatedly.
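The balance claim in the second bullet can be checked mechanically: after weighting, the standardized mean difference (SMD) of each pre-intervention covariate between groups should shrink toward zero. A minimal sketch with simulated data (again, not City Connects data; the true propensity scores are known here, whereas in practice they are estimated):

```python
import numpy as np

rng = np.random.default_rng(1)

# Simulated pre-intervention covariate and biased treatment assignment.
n = 20000
x = rng.normal(size=n)
ps = 1 / (1 + np.exp(-x))          # true propensity score, known by construction
t = rng.binomial(1, ps)            # treated students differ on x before treatment

def smd(x, t, w):
    """Standardized mean difference of x between groups under weights w."""
    m1 = np.average(x[t == 1], weights=w[t == 1])
    m0 = np.average(x[t == 0], weights=w[t == 0])
    pooled_sd = np.sqrt((x[t == 1].var() + x[t == 0].var()) / 2)
    return (m1 - m0) / pooled_sd

w_unweighted = np.ones(n)
w_iptw = np.where(t == 1, 1 / ps, 1 / (1 - ps))

smd_before = smd(x, t, w_unweighted)   # large: groups differ before weighting
smd_after = smd(x, t, w_iptw)          # near zero: groups are comparable
# A common rule of thumb treats |SMD| < 0.1 as acceptable balance.
```

A diagnostic of this kind, run on pre-intervention measures, is what supports the statement that City Connects and comparison students are comparable once propensity weights are applied.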

Given all of these considerations, we are increasingly convinced that a claim of causality may well be merited.

Learn more about City Connects evaluators or read our publications.