An overview of results reporting in org chart and project-based reviews
This article covers the following topics:
- Before you start
- Select comparisons
- Filter your data
- Group your data
- Explore your results
Before you start
As soon as reviewers start submitting their responses, Admins can analyze their data in the Results tab of the review. We offer several views to help you discover insights about your people and organization. Check out our webinar on administering reviews and leveraging your results for additional guidance on how to make the most of your reviews.
Please note: review analytics only apply to rating, competency, and goals questions and weighted scores. Multiple choice and multiple-select questions can't be analyzed. Lattice provides sentiment analysis for open-ended questions, which you can read more about here.
Additionally, the Results tab is only available in org chart and project-based review cycles. Reviews created by automated rules do not have a Results tab and can't be analyzed with the visualizations below.
To access review cycle analytics:
Step 1: Navigate to the Admin > Reviews > Auditing page
Step 2: Select the desired review cycle
Step 3: Select the Results tab
Step 4: Select comparisons
Select comparisons
Use the Select Comparison box to choose the questions that you want to analyze. You can select up to ten questions at a time. Note that the 9-Box only supports two comparisons at a time.
Filter your data
Use the filter bar at the top of the Results page to filter your data set by user attributes or review group (self, peer, upward, downward). The attributes include the Lattice defaults of gender, age, department, etc., as well as any custom user attributes you may have created.
Apply as few or as many filters as you need to isolate the demographics you want to analyze. Adding multiple attributes requires reviewees to meet every condition, whereas applying multiple values within a single attribute shows reviewees who match any of those values. For example, filtering for Gender = Female and Department = Engineering + Product will show responses about all women in the engineering and product departments.
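If it helps to think of the combination logic programmatically, here is a minimal sketch (hypothetical data, not Lattice functionality) of how stacked filters combine: attributes combine with AND, and values within a single attribute combine with OR.

```python
# Illustrative sketch only: how stacked filters combine.
reviewees = [
    {"name": "Ada",   "gender": "Female", "department": "Engineering"},
    {"name": "Bo",    "gender": "Male",   "department": "Product"},
    {"name": "Carla", "gender": "Female", "department": "Sales"},
]

# Gender = Female AND Department = Engineering + Product
filters = {"gender": {"Female"}, "department": {"Engineering", "Product"}}

def matches(person, filters):
    # Every filtered attribute must match (AND across attributes),
    # and a match means the person's value is any selected value (OR within an attribute).
    return all(person[attr] in values for attr, values in filters.items())

print([p["name"] for p in reviewees if matches(p, filters)])  # ['Ada']
```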
Group your data
The default grouping for your results is by department. You can adjust how your information is presented by selecting Group by and choosing a different option. You will see a list of all user attributes currently in your Lattice domain, again including any custom attributes you've created, as well as review group.
Please note that the Group by filter groups by reviewee. In an example where we Group by Manager and filter for upward reviews:
Stephen manages Ami, who manages Adnan, who manages no one.
The reviewees are grouped by their manager, and each group shows the responses about those reviewees. In this case, since we grouped by manager, we'd get a grouping for Stephen because his report, Ami, received upward responses in her review (from Adnan). However, there would be no grouping for Ami because her report, Adnan, did not receive any upward reviews (Adnan manages no one, so no one reviews him upward).
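As a minimal sketch (hypothetical data, not Lattice functionality), the grouping logic above could be thought of like this:

```python
# Illustrative sketch only: Group by Manager with an upward-review filter.
# Each response is recorded against the reviewee, so a group only appears
# for managers whose reports received upward responses.
responses = [
    # (reviewee, reviewee's manager, review group)
    ("Ami", "Stephen", "upward"),  # Adnan's upward review of Ami
]

groups = {}
for reviewee, manager, review_group in responses:
    if review_group == "upward":
        groups.setdefault(manager, []).append(reviewee)

print(groups)  # {'Stephen': ['Ami']} -- no group for Ami, since Adnan received no upward reviews
```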
Note that Lattice defaults to displaying the first 8 options for any grouping. To add more results to the visualization, use the + button beneath the chart and check the boxes for the desired additions.
Explore your results
You can examine your results by bar graph, 9-box, heatmap, or distribution. Select the button for the desired visualization as shown below.

Bar graph
The bar graph view allows you to see the results for a group of employees across many questions. This is helpful for comparing more than two questions and gives you a good overview to start your analysis.
The colors in the bars correspond to the icons below the chart. Hover over a bar to see the average rating for that group/individual.
Note: if you are looking at the responses for more than 3 questions, the chart will become a horizontal bar graph rather than a vertical bar graph, as shown below.
While viewing the bar graph, you can show the Actual score or the Normalized score.
Actual Score: the actual average response to each question. For example, if you have a rating question that is out of 3 and another question that is out of 5, you would be able to see an average rating out of 3 or 5 when hovering.
Normalized Score: allows you to view all questions on the same scale, expressed as a percentage out of 100. This is helpful if you want to compare the responses for rating questions with different response scales.
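As a rough illustration (assuming a simple percentage-of-maximum calculation; the exact formula Lattice uses isn't documented here), normalization puts scores from different scales on a common footing:

```python
# Illustrative sketch only: converting an average rating to a 0-100 score
# so questions on different scales can be compared side by side.
def normalized_score(actual_average, scale_max):
    return actual_average / scale_max * 100

print(normalized_score(2.4, 3))  # 80.0 -- a 2.4 average on a 3-point scale
print(normalized_score(4.0, 5))  # 80.0 -- a 4.0 average on a 5-point scale
```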
9-Box
The 9-box view is best for comparing two different questions. For example, if you wanted to compare every department against two questions, navigate to the 9-box view, open the "Group by" dropdown, and select "Department." From here, you'll be able to hover over each dot to see where the department fell on those two questions.
Heatmap
The Heatmap view is best for comparing different groups of responders against each other across more than one question. It provides the most data in a single visualization. You can apply filters to cut your data even further to compare very specific employee groupings.
Select the Export button to download your heatmap. The export preserves the formatting of the heatmap, including the colors of the cells.
Distribution
The Distribution view is best for understanding how many employees received each possible rating for the comparisons you've selected. This can be a good visualization to evaluate the effectiveness of the questions you're asking in your reviews. If you're looking for a bell curve, use the distribution view.
Just like with the other visualizations, you can filter your results by user attributes (custom and default) and review group. Apply multiple filters for fine-grained insights about the performance of specific groups of employees.
Actual Count vs. Normalized Count
Use the Show bar to choose between the following:
- Actual Count: number of times a rating was chosen
- Normalized Count: percentage of times a rating was chosen
The normalized count is useful when you're analyzing rating questions with different response scales.
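For a concrete illustration (hypothetical data, not Lattice functionality), here is how the two counts relate for a single 5-point rating question:

```python
# Illustrative sketch only: actual count vs. normalized count of ratings.
from collections import Counter

ratings = [5, 4, 4, 3, 5, 4, 2, 4]          # eight responses to one question

actual = Counter(ratings)                    # times each rating was chosen
normalized = {r: c / len(ratings) * 100 for r, c in actual.items()}

print(actual)      # Counter({4: 4, 5: 2, 3: 1, 2: 1})
print(normalized)  # {5: 25.0, 4: 50.0, 3: 12.5, 2: 12.5}
```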
Note that you'll be able to graph up to 4 comparisons at a time. You can't change the Group by for distribution because the groups are automatically set to the response scales.