
How initial results will be presented

Once the survey period has concluded, the research/data partner commonly provides (depending on your contractual agreement) a technical report that describes the representativeness, reliability and validity of the data and presents the summary data for each question in the survey.

Reporting the distribution of responses to each question is a good starting point. The number of survey participants who fall within each response category of a question (i.e., frequency) and the percentages they represent provide a simple illustration of how the data are distributed. Common descriptive statistics include the mean (i.e., the arithmetic average) and the standard deviation (i.e., the amount of variation around the mean). For questions listing several items or scales (as in Figure 1 below), it is recommended to present the items in the table from highest to lowest or most to least (in other words, in a different order than they may appear in the survey questionnaire). These data are often best presented in tabular form, with each table accompanied by a brief narrative description of the overall findings.
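For teams working directly with the raw data file, the summary steps above (frequencies, percentages, mean, standard deviation, and ranking items by mean) can be sketched in a few lines of Python. The survey items and response values below are invented for illustration only:

```python
from collections import Counter
from statistics import mean, stdev

# Hypothetical responses on a 5-point scale for three survey items;
# item wording and values are illustrative, not from an actual survey.
responses = {
    "I feel safe in my neighbourhood": [5, 4, 4, 3, 5, 2, 4],
    "I get enough sleep": [2, 3, 1, 4, 2, 3, 2],
    "I have friends I can count on": [5, 5, 4, 4, 3, 5, 4],
}

def summarize(values):
    """Frequency, percentage, mean and standard deviation for one item."""
    freq = Counter(values)
    n = len(values)
    pct = {category: 100 * count / n for category, count in freq.items()}
    return {"n": n, "freq": dict(freq), "pct": pct,
            "mean": mean(values), "sd": stdev(values)}

summaries = {item: summarize(vals) for item, vals in responses.items()}

# Present items from highest to lowest mean, as recommended above.
ranked = sorted(summaries.items(), key=lambda kv: kv[1]["mean"], reverse=True)
for item, s in ranked:
    print(f"{item}: M = {s['mean']:.2f}, SD = {s['sd']:.2f}")
```

In practice the research/data partner will produce these summaries with their own statistical tools; the sketch simply makes the arithmetic behind a table like Figure 1 concrete.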

For example, in this sample from the 2021 Waterloo Region technical report, the questions are ranked by mean score from highest to lowest:

Figure 1: Survey response example ranking mean scores from highest to lowest


Figure 2 shows the percentages of children along a scale for self-reported physical health.

Figure 2: Survey response example showing table distribution of mean scores along a scale

Children and youth self-reported physical health (n = 541)


a Based on a 5-point scale where higher scores reflect higher self-reported levels of physical health.

Figure 3 provides a graphical illustration of the same distribution shown in Figure 2.

Figure 3: Survey response example showing graph distribution of mean scores along a scale

Children and youth self-reported physical health (n = 541)


Some questions will have missing data due to non-response. Some youth might have chosen not to answer certain questions for a variety of reasons (e.g., they felt the question was irrelevant to them, did not recall the requested information or felt it was too personal). For a few questions, response categories of “does not apply” or “don’t know” are offered to respondents, and these answers are typically not reported in the final survey findings. Contingency questions direct survey participants to other questions depending on their answers, so total responses to those questions are correspondingly lower. As a guideline, the researchers may ultimately retain only those submissions in which at least 30 per cent of survey questions received usable responses.
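The 30-per-cent retention guideline can be sketched as a simple filter. The submissions and field names below are hypothetical; in this sketch, a skipped question is recorded as `None`, and “don’t know” / “does not apply” answers are treated as not usable:

```python
# Responses that do not count as usable for the retention guideline.
NOT_USABLE = {None, "don't know", "does not apply"}

def usable_share(submission, total_questions):
    """Fraction of all survey questions with a usable response."""
    usable = sum(1 for answer in submission.values() if answer not in NOT_USABLE)
    return usable / total_questions

def retain(submissions, total_questions, threshold=0.30):
    """Keep only submissions meeting the minimum-completion guideline."""
    return [s for s in submissions
            if usable_share(s, total_questions) >= threshold]

# Two hypothetical submissions to a five-question survey.
submissions = [
    {"q1": 4, "q2": "don't know", "q3": None, "q4": None, "q5": None},  # 1/5 usable
    {"q1": 5, "q2": 3, "q3": "does not apply", "q4": 2, "q5": None},    # 3/5 usable
]
kept = retain(submissions, total_questions=5)
```

Here the first submission falls below the 30 per cent threshold and is dropped, while the second is retained. The exact rule (and how “don’t know” answers are counted) is a decision for the researchers.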

The research/data partner should present the data in the report organized into the nine dimensions of well-being, in the order established in the survey report template. The data within each dimension are intended to tell a story, from different perspectives, about how young people are faring in that dimension. More information about the overall meaning of each dimension can be found in the Canadian Index of Child and Youth Well-being and in the Sample Data Brief in the Appendix.

The summary data report provides an overall sense of child and youth status and experiences within your community. At this point, the core project team and researchers can take time together and apart to identify results that stand out as unexpected or particularly deserving of attention. This preliminary scan can identify areas requiring further data analysis and investigation. Before you proceed to producing data briefs or convening sense-making activities to interpret, distill and present the data, you can approach this first look at the data in a number of ways:

  • Go through the data report question by question and highlight all the findings that fall outside certain parameters you have set. Look for positive results and those that are concerning.

For example: You could set the standard that you will look more closely at any finding where more than 20 per cent of respondents fall in a question’s least desirable category. Using those parameters, you would highlight data showing that 23 per cent of those surveyed have trouble getting to sleep at night.
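The screening rule in that example amounts to flagging every question whose least desirable category exceeds a threshold you set. A minimal sketch, with invented question names and percentages:

```python
THRESHOLD = 20.0  # per cent, the parameter you set

# Hypothetical share of respondents in each question's least desirable category.
least_desirable_pct = {
    "Trouble getting to sleep": 23.0,
    "Feels unsafe at school": 12.5,
    "Often skips breakfast": 31.0,
}

# Flag every question above the threshold for a closer look.
flagged = {q: pct for q, pct in least_desirable_pct.items() if pct > THRESHOLD}
```

With these made-up figures, sleep and breakfast would be flagged for further analysis while school safety would not. The same pattern works for flagging notably positive results against a different threshold.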

For example: You could look more closely at any finding where the between-group differences among children are widest (e.g., by ethnicity, age cohort, rural/urban residence, gender or between one group of children and the total average). Disaggregating data to allow for between-group comparisons will depend on factors including sample size (for reliability and privacy protection) and community acceptability. For instance, use of data identifying First Nations, Métis or Inuit children should respect their data sovereignty and the OCAP Principles (refer to toolkit sections, Demographic profile and Respecting diversity).

In Figure 4, we compare levels of self-reported physical health for males and females. In this example, a higher percentage of boys (78.8 per cent) rate their physical health as very good or excellent compared to girls (71.1 per cent). This difference is reflected, too, in the higher average score for boys (M = 3.61) than for girls (M = 3.32). In addition, we can see that the variation in self-reported physical health, as reflected in the standard deviation, is greater among girls (SD = 1.26) than boys (SD = 0.98), which suggests there is less consensus in their self-assessments. Figure 5 visualizes the same results in a bar chart.

Figure 4: Survey response example showing demographic comparison in a survey response (table format)

Comparison of self-reported physical health by sex at birth


a Based on a 5-point scale where higher scores reflect higher perceived levels of physical health.

Figure 5: Survey response example showing demographic comparison in a survey response (graph format)

Comparison of self-reported physical health by sex at birth

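A group comparison like the one in Figures 4 and 5 combines three summaries per group: the mean, the standard deviation, and the share of respondents in the top categories. A minimal sketch with invented 5-point scores (not the Waterloo Region data):

```python
from statistics import mean, stdev

# Hypothetical 5-point self-reported physical health scores by sex at birth.
scores = {
    "male":   [5, 4, 4, 3, 5, 4, 2, 4],
    "female": [3, 5, 2, 4, 3, 5, 2, 3],
}

for group, vals in scores.items():
    top_two = sum(1 for v in vals if v >= 4)  # "very good" or "excellent"
    pct_top = 100 * top_two / len(vals)
    print(f"{group}: M = {mean(vals):.2f}, SD = {stdev(vals):.2f}, "
          f"very good/excellent = {pct_top:.1f}%")
```

Reporting both the mean and the standard deviation matters: as the discussion above notes, two groups can differ not only in their average rating but also in how much consensus there is around it.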


  • Compare the findings (or select findings) in your community to other jurisdictions (making sure to explain that comparisons are not direct), for example:
  1. Canadian youth overall (e.g., from the most recent iteration of the Canadian Index of Child and Youth Well-being)
  2. Another community that has completed the Community Child and Youth Well-being Survey
  3. The adult population of your community (e.g., from a CIW Community Wellbeing Survey)
  4. The adult population nationally, provincially or regionally (e.g., from national survey sources, such as the Canadian Community Health Survey)
  • Compare the data to previous surveys’ data. One of the most valuable ways to analyze the data becomes possible when you repeat the Community Child and Youth Well-being Survey. Noting what has changed since the previous survey is key to understanding the impact that actions have had, as well as persistent or emerging challenges.

Cross-referencing for additional insights

Using multiple indicators of a situation is stronger than relying on a single indicator. One of the best ways to approach persistent problems in new ways is to cross-reference one part of your survey data with others. While some patterns might be identified in sense-making activities and some correlations can be theorized based on a detailed look at the summary results, cross-referencing survey data requires specialized knowledge typically available from a researcher/data analyst. This approach works best when you have something specific you are looking for, such as relationships between specific data variables or specific between-group differences. An enabling technology like data visualization software can then be used to illustrate these relationships in an accessible and compelling way.

For instance, given “life satisfaction” is a proxy for overall well-being, one place to start is to explore which indicators in the survey results are strongly associated with life satisfaction. For example, you might compare groups of children on indicators that are typically highly associated with life satisfaction, such as self-reported mental health, feelings of social isolation, and levels of social support from friends, family, teachers and the community. But be open to variances and surprises!
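One simple way a researcher/data analyst might begin that exploration is to compute the correlation between each candidate indicator and life satisfaction, then rank indicators by strength of association. The sketch below uses a plain Pearson correlation on invented values:

```python
def pearson(xs, ys):
    """Pearson correlation coefficient between two equal-length lists."""
    n = len(xs)
    mx, my = sum(xs) / n, sum(ys) / n
    cov = sum((x - mx) * (y - my) for x, y in zip(xs, ys))
    vx = sum((x - mx) ** 2 for x in xs)
    vy = sum((y - my) ** 2 for y in ys)
    return cov / (vx ** 0.5 * vy ** 0.5)

# Hypothetical per-respondent scores; all values are invented for illustration.
life_satisfaction = [7, 5, 8, 4, 6, 9, 3]
indicators = {
    "self-reported mental health": [4, 3, 5, 2, 3, 5, 1],
    "social isolation":            [2, 4, 1, 5, 3, 1, 5],
}

# Rank indicators by strength of association (absolute correlation).
assoc = {name: pearson(vals, life_satisfaction)
         for name, vals in indicators.items()}
ranked = sorted(assoc.items(), key=lambda kv: abs(kv[1]), reverse=True)
```

In this made-up example, mental health correlates positively with life satisfaction and social isolation negatively, as the text would lead you to expect; with real data, be open to variances and surprises, and remember that correlation alone does not establish cause.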

Perhaps most importantly, you should compare groups and indicators that speak to issues and challenges with which your communities are most concerned. For example, if low levels of self-reported mental health among youth in your community are of concern, you can investigate which indicators are most strongly associated with mental health and which groups are most at risk and take steps to help increase their overall well-being.

The charts below (Figure 6) demonstrate some other options to present data for particular indicators of interest, including comparisons between groups of children based on gender and age.

Figure 6: Examples of data presentation for a survey report
