In today’s highly competitive higher education development marketplace, the ability to understand and effectively use data analytics has become more critical than ever before.
Yet, all too often, advancement and institutional leaders miss the most valuable parts of the story when they sit down to analyze the ocean of data in front of them.
In addition, many of these leaders aren’t tapping into how this data can help them deepen their relationships with donors, funders, alumni, and community partners.
WASHBURN & McGOLDRICK has been using data analytics to help our clients for decades and has conducted over 3,300 face-to-face interviews with top donors. We’ve also:
- Conducted surveys of more than 58,000 alumni on a range of issues and topics.
- Conducted our semiannual Advancement Moving Forward survey of advancement professionals, tracking important issues within advancement, including confidence, hybrid/remote work, Diversity, Equity, Inclusion, Belonging and Justice (DEIBJ), and the challenges of meeting annual goals.
Through this work, we’ve seen firsthand how the colleges, universities, and private secondary schools that truly listen to the data have been able to achieve and frequently exceed their development goals as well as address the challenges of engagement and philanthropy.
All of our survey analytics work follows seven very simple rules:
Be respectful of the alums
By the end of the survey, alumni who participated should have a genuine sense that their time and views were valued. They are giving up valuable time to respond and want to be heard. It is too easy for a survey to become unwieldy, and our experience shows that longer surveys result in lower completion rates. Our goal is to stay under 20 questions, which yields an average completion time of six minutes. Of course, this raises the question of whether it is possible to get enough depth with only 20 questions. That leads to the second rule.
All questions are interesting, but not all are important. Focus on the actionable issues for an institution
One of the most common mistakes is attempting to do too much with a survey. Frequently, this manifests itself in the form of research that tries to solve all challenges at once. A typical symptom of this problem is general alumni surveys that are so excessive in length that it seems like their only goal is to frustrate alumni. Collaboration between the client and the consultant is critical. The institution knows the priorities and we know how to ask questions and analyze the data. A survey designed to inform an incoming president is different from a pre-campaign survey or a post-campaign survey.
Start slowly and then move towards questions requiring more thought
Responding to a survey can be cognitively exhausting. We are asking alums to think, and thinking takes energy. If they tire too soon, they will stop midway or abandon the survey altogether. Start with easy questions (e.g., five-point scale responses) to get them warmed up, and then intersperse easy questions throughout. This focuses their attention before the questions that require more thought. The rule, in short, is to ask them to react before they have to think.
Organize the questions by theme
We typically organize our surveys around key indicator questions for each of the four major themes of affinity, communications, engagement, and philanthropic interest. These key indicator questions closely map to the engagement categories identified by the Council for Advancement and Support of Education (CASE). While CASE focuses on actual alumni behaviors, we focus on attitudes and beliefs about those key themes. Engagement, for example, begins with a self-assessment of engagement with the institution, followed by a skip pattern designed to gain a deeper understanding of why alumni choose to engage or what inhibits them from doing so. Those in-depth questions are customized for each institution based on its priorities and resources.
Target the analysis by segments within an alumni population
Another area where we often see missed opportunities is analysis that fails to capture all the nuances of alumni opinions. Typically, what we encounter here is clients overlooking critical “data stories” about how various segments of their alumni base want to engage with the college or university at distinct stages of their lives. For example, one client assumed that older generations of alumni would not be as interested in career advice and guidance as more recent graduates. In fact, the data revealed that both groups wanted the school to be a career resource, but for different reasons. The more recent alums were interested in resources that would help them with early-stage career decisions, including finding new jobs or pursuing a graduate degree. The older generations of alums wanted resources to help them pursue new career paths, frequently very different from those in which they had spent the bulk of their professional careers. By listening to this “data story,” the client was able to become a “resource partner” for both groups while simultaneously increasing alumni engagement with the university.
Response rates are nice, but the most important metric is margin of error
While response rates are frequently cited as a measure of a survey’s “success,” the margin of error is the more important statistic. It is a measure of the accuracy of the results. If you have seen media coverage of a public opinion survey, you will have seen a footnote about the margin of error, usually cited as plus or minus a percentage. Most national surveys have a +/- 5% margin of error, meaning there is 95% confidence that the results fall within 5 percentage points of what polling the entire U.S. population would show. A few years ago, ignoring the margin of error led to wrong projections about a national election. The data was right; the interpretation was wrong. Our work achieves a higher level of accuracy, averaging a +/- 2% margin of error relative to the results we would see if 100% of the alumni population had responded.
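To make the arithmetic concrete, here is a minimal sketch (not our proprietary model) of the standard margin-of-error formula for a proportion, with a finite population correction applied because an alumni base is a finite, known population. The alumni base size and response count below are hypothetical illustrations only.

```python
import math

def margin_of_error(responses: int, population: int, p: float = 0.5, z: float = 1.96) -> float:
    """Margin of error for a proportion at roughly 95% confidence (z = 1.96).

    p = 0.5 is the most conservative assumption; the finite population
    correction tightens the estimate when a meaningful share of the
    alumni base responds.
    """
    standard_error = math.sqrt(p * (1 - p) / responses)
    fpc = math.sqrt((population - responses) / (population - 1))
    return z * standard_error * fpc

# Hypothetical figures: 2,500 completed surveys from an alumni base of 20,000
moe = margin_of_error(2500, 20_000)
print(f"Margin of error: +/- {moe * 100:.1f} percentage points")  # roughly +/- 1.8
```

Run with these hypothetical numbers, the calculation lands near the +/- 2% range described above; a smaller response pool or a larger alumni base would widen it.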
Listen to all the data
Once all the data are collected, the science of analysis meets the art of interpretation, and stories appear. Alumni data, like life, are multivariate, and the responses are interwoven. Patterns will appear across questions for different segments defined by age, geography, donor status, gender, or any other segment of interest to the institution. Resistance to engagement will reveal patterns similar to resistance to philanthropy. It is also important to look beyond the top of the chart, i.e., the good news. One client we worked with was surprised that nearly 40% of their top donors had little or no opinion of the school’s leadership team. Further analysis revealed that nearly 30% of top donors did not consider the institution a top priority. The real “data story” here was that university leadership needed to reach out to those donors who didn’t express a strong view and help them understand how the institution was moving forward on its mission and vision.
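As a simple illustration of how those cross-question patterns can be surfaced, the sketch below cross-tabulates hypothetical survey responses by graduation decade and donor status. The column names and values are invented for illustration and are not drawn from any client data.

```python
import pandas as pd

# Hypothetical survey extract: one row per respondent
responses = pd.DataFrame({
    "decade": ["2010s", "2010s", "1990s", "1990s", "1970s", "1970s"],
    "donor_status": ["donor", "non-donor", "donor", "non-donor", "donor", "non-donor"],
    "engagement": [4, 2, 5, 3, 4, 2],  # 1-5 self-assessed engagement
    "wants_career_help": [True, True, False, True, False, False],
})

# Average self-assessed engagement by decade and donor status
print(responses.pivot_table(index="decade", columns="donor_status",
                            values="engagement", aggfunc="mean"))

# Share of each decade interested in career resources
print(responses.groupby("decade")["wants_career_help"].mean())
```

The same pattern of pivoting and grouping extends to any segment an institution cares about, which is how the quieter stories, such as top donors with no opinion of leadership, come into view.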
Finally, our data analytics team has the capacity to take a macro view of how donors are shifting their giving patterns and levels of support over time. We can benchmark an institution’s giving patterns (by source and purpose) against a set of similar and aspirational peers. By looking beyond total giving or overall participation rates, we find institutional stories and patterns. Within an institution, Gift Migration analysis is becoming increasingly popular. This type of research shows advancement professionals and their leadership how individual donors are moving between giving levels over time, identifying “money left on the table” as well as individuals whose giving is growing beyond expectations.
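A minimal sketch of the kind of year-over-year comparison behind a gift-migration view appears below. The donor IDs, gift amounts, and thresholds are hypothetical; an actual analysis would use an institution’s own gift bands and several years of history.

```python
import pandas as pd

# Hypothetical gift history: total giving per donor per fiscal year
gifts = pd.DataFrame({
    "donor_id": [1, 1, 2, 2, 3, 3],
    "fiscal_year": [2022, 2023, 2022, 2023, 2022, 2023],
    "total_giving": [1000, 250, 500, 5000, 0, 100],
})

# One row per donor, one column per fiscal year
by_year = gifts.pivot(index="donor_id", columns="fiscal_year", values="total_giving").fillna(0)
by_year["change"] = by_year[2023] - by_year[2022]

# Donors whose giving dropped sharply ("money left on the table")
print(by_year[by_year["change"] < -500])

# Donors whose giving is growing beyond expectations
print(by_year[by_year["change"] > 1000])
```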
If there is one thing that we have learned from the nearly 350 surveys we have created and conducted over the past decade, it’s that the colleges and universities that are able to effectively read their “data story” and unlock its power will continue to thrive regardless of any internal or external disruptions.
The real question is: Are you ready to unlock your data story?
Steve Devlin, Ph.D., is the Senior Data Analytics Consultant for WASHBURN & McGOLDRICK. He can be reached at steven.devlin@wash-mcg.com