Emily Kramer is a Senior Program Analyst with the New York State Technical and Education Assistance Center for Homeless Students (NYS-TEACHS), which provides information, referrals, and trainings to schools, school districts, social service providers, parents, and others about the educational rights of children and youth experiencing homelessness. NYS-TEACHS is funded by the New York State Education Department (NYSED) and is housed at Advocates for Children of New York, Inc. (AFC). Here, she writes about New York State’s experience implementing a behaviorally-informed email communications project developed by the Office of Evaluation Sciences. You can read about the development of the project in Part I of this blog.
What did your participation in the project look like?
We signed up to work with the Office of Evaluation Sciences (OES) on an outreach campaign to increase identification of students experiencing homelessness. The wonderful OES staff drafted content for eight email messages that we sent out to half of the liaisons in New York State, randomly selected. The OES annotated these outreach messages for us and included notes about key principles of marketing and behavioral economics. For example, we learned that a phrase like, “Over 70% of New York districts have identified students experiencing homelessness,” uses social norms marketing, meaning that the reader may be motivated to identify students experiencing homelessness if they know their peers have done the same.
How did you determine that identification of students experiencing homelessness was the issue you wanted to address with this initiative?
The OES spoke with us and the other states involved (New Mexico and New Jersey) and then pitched ideas for the project. The initiative had a number of goals: increasing use of existing resources, improving awareness of changes under ESSA, motivating liaisons, increasing identification, and raising awareness about some higher education resources. Identification data are readily accessible and thus were used to measure whether the project had an impact in that area.
What were the results and what did you learn?
The OES compared the increase in identification of students experiencing homelessness in the intervention group (those school districts that received the special emails) to the increase in identification in the control group (those districts that did not), and found a small but significant effect! They did note in their research paper that most of the effect came from one of the other states involved in the project. That said, the process taught us a lot about how to improve our technical assistance through small tweaks in language. Some changes partially inspired by the project include:
- increased personalization (e.g., using mail merge features to include a liaison’s name in the emails they receive),
- writing out content in list form,
- using language that is less formal,
- appreciation of brevity, and
- renewed focus on creating simple checklists for liaisons that clearly outline action steps (e.g. our Supporting College Access Checklist and Top 10 Resources for Liaisons).
What are your recommendations for others who are interested in implementing these principles in their outreach?
While we aim to form personal connections with liaisons and service providers across our state, a basic reality is that we need to use mass newsletters to share important resources and announcements. It’s 2019, and we are all inundated with targeted marketing. So, in order to get people to read an email – and take an action – we need to be specific about the action and clear about how to take it! Luckily, there’s no minimum effort required to start making marketing- or behavioral-economics-inspired changes to your outreach strategy, so we recommend diving in and refining your process along the way. If you are unable to measure your effectiveness through an analysis of homeless identification (or similar) data, or you’re looking for faster feedback, consider these options:
- compare email open rates for different messages;
- track website downloads of various forms you’ve highlighted;
- look at training attendance for events you’ve done; or
- ask for feedback in a survey.
All of those feedback mechanisms can help you move toward your goals.
What was the most challenging part of this project, and what was easier than you anticipated?
I think that the most challenging part was simply finding the extra time to format new emails in our bulk email platform, though this wasn’t an unexpected issue and the time was well spent. As for what has been easier than anticipated – we are by no means “experts” on marketing, but we have rather easily (and informally) incorporated some of the lessons learned from this project into other communications. For the most part, we have become more focused on creating content that is “catchy” and easy-to-read.