Guide to 360-degree feedback

It’s no surprise that 360-degree feedback is popular today. Work keeps getting more complex, and success in work projects depends directly on how well employees interact. Every day we collaborate with colleagues and clients, discuss tasks with our managers, and help our subordinates. The people around us at work know our skills, qualities, and capabilities because they see how we act in specific situations. What if we asked them for feedback on the skills we are good at and those we could improve? One way to get such feedback from your colleagues is 360-degree feedback.

360-degree feedback is a tool that helps employees get feedback from their colleagues (managers, peers, subordinates, clients, etc.) on their skills and performance, compare it with their own view of those skills, and define the steps to develop the skills that will help them succeed at work.

The history and definition of 360-degree feedback

In 1930, the military psychologist Johann Baptist Rieffert developed a methodology for selecting officer candidates. The essence of the method was that, in addition to a team of experts consisting of officers, psychiatrists, and psychologists, the candidates' comrades were interviewed and assessed the candidates' personal qualities. It turned out that the comrades' opinion could predict "probation at the front" much better than most other methods, such as intelligence and personality tests or psychological interviews. In other words, the assessment by comrades had the highest prognostic validity with regard to future performance.

The Rieffert method is a prototype of the modern 360-degree feedback method. 360-degree feedback became widespread in the 1990s, when HR managers grasped the concept and the prospects the method offered for the professional development of employees. With the ability to conduct evaluations online and the emergence of Internet-based survey tools, 360-degree feedback became less time-consuming and has steadily grown in popularity.

So, what is the definition of 360-degree feedback? 360-degree feedback is a process through which feedback is gathered from an employee and the people they work with - manager(s), peers, subordinates, and sometimes clients. It is so named because it solicits feedback on an employee's behavior from multiple points of view, in contrast to other methods of gathering feedback (for example, one-way feedback from a manager to subordinates or from subordinates to a manager). It is also very important that 360-degree feedback can be measured - that’s why 360-degree feedback survey forms usually ask reviewers for a numerical assessment of the reviewee (the person who receives feedback).

Why use 360-degree feedback

In the mid-1990s, when 360-degree feedback was gaining popularity, experts were divided into two camps: one group argued that the 360-degree method should be used exclusively for personal development, while the other insisted that 360-degree feedback could also be used to make personnel decisions that directly affect participants' careers (for example, appraisal, compensation, staffing).

Considering that the number of companies using 360-degree feedback to measure efficiency and effectiveness continues to grow steadily, the question is no longer whether to use the 360-degree method only for individual development. Rather, the question is how to organize a 360-degree review in the most efficient way to meet both personal development and decision-making goals.

Let’s take a closer look at the application areas of 360-degree feedback:

1. 360-degree feedback for individual development only

Individual development is likely the first thing that comes to mind at the mention of 360-degree feedback. Feedback from the work environment allows reviewees to identify their strengths and weaknesses, increase their level of self-awareness, recognize ways to develop their soft skills and improve workplace relationships.

For development purposes, 360-degree reviews can be conducted both for a single participant and for groups of employees. Reviewees receive review results, as the 360-degree feedback is organized specifically for them.

2. 360-degree feedback for assessment only

Here, 360-degree feedback is used as a supplement to the manager's assessment. Thanks to feedback from their subordinates’ peers, managers see a more complete picture of their subordinates and form a more balanced opinion of their work.

As we have already noted, the use of the 360-degree method in performance evaluation was criticized by many experts for a long time; however, the percentage of companies that apply 360-degree feedback to assessment is growing every year.

360-degree feedback can be used to evaluate the effectiveness of personnel (performance reviews), to identify employees with high potential (HiPo programs), and to recognize outstanding and underperforming employees.

3. 360-degree feedback for individual development and assessment

Increasingly, the 360-degree method is being used by companies to both evaluate and develop employees. In this case, the results are used to make career decisions and to help employees develop professionally.

4. Organizational development

The list of organizational development tasks for which 360-degree reviews can be conducted is quite extensive. Below are just a few of them:

  • Change of management in the company
  • New competency model
  • Creating a new company culture
  • Growth of the company (for example, a transition from a startup to an established business)
  • Forming new teams within a company.

In the case of organizational development, 360-degree feedback is often tied to the company's values.

5. Evaluation of the effectiveness of training programs

360-degree feedback can be used to measure the effectiveness of training programs, since reviews can be organized for specific groups and the dynamics of changes can be tracked over time. For example, you can conduct a 360-degree review before and after a training program and compare the results, or compare the results of a group that completed a certain training program with a group that did not participate in it.

Advantages and disadvantages of 360-degree feedback

In this part, we will examine the main advantages and disadvantages of 360-degree feedback. Note that many disadvantages can be avoided with the proper preparation and organization of reviews.

Advantages of 360-degree feedback:

  • Employees receive multisource feedback from colleagues instead of (or as a supplement to) feedback only from the supervisor
  • Identification of strengths and areas of development of employees
  • Employees’ understanding of how they are seen from the outside, which is important for effective team interaction
  • Fostering a feedback culture in the company that helps employees to develop their skills
  • Employees choose skills for development based not only on their own opinion but also on the opinion of colleagues, including the manager
  • For companies: 360-degree feedback helps to identify which competencies are at a high level and which need to be developed at the team/department/company level. A whole-picture analysis of 360-degree feedback results enables managers to take a more deliberate approach to creating training programs
  • Soft skills development - thanks to regular 360-degree reviews, participants can track how feedback on their skills has changed and use it as a benchmark
  • Continuous employee development. 360-degree feedback motivates employees to work on their skills and become better
  • Feedback from colleagues encourages employees and increases their confidence.

Disadvantages of 360-degree feedback:

  • 360-degree feedback may lead to organizational problems if it is tied to personnel decisions (for example, compensation), and organized in a hurry without proper preparation of a review - goals are not defined, participants are not notified in advance, etc.
  • 360-degree feedback may be ineffective if the results are not shared with reviewees, or if, after a review, participants do not observe any changes in the company
  • Focus on participants’ development areas instead of strengths. This is one of the most frequently criticized disadvantages of 360-degree feedback. To avoid this drawback (or soften its effect), notify all participants about the review and its goals in advance and tell them about the importance of highlighting colleagues’ strengths
  • Reviewers may give contradictory feedback or, if a review includes rating questions, set ratings without thinking. This shortcoming can be mitigated by notifying reviewers about the upcoming feedback campaign and its goals, as well as by training them on how to give feedback
  • Organization and administration of 360-degree feedback may take quite a lot of time and effort on the part of review organizers, even if they use special software. We recommend that you first conduct a review of a small group of people by using a special third-party service or a universal solution like Google Forms to determine how easy and convenient it will be to scale such a solution to a large group
  • Feedback from colleagues in the review results is usually depersonalized to preserve the confidentiality of reviewers’ responses, but this may make it difficult to obtain additional information from reviewers after the review is over

Setting up a 360-degree review

Preparation for 360-degree feedback may be divided into three parts:

  • Defining a purpose of a 360-degree review
  • Informing participants about a review before its launch
  • Technical setup of a 360-degree review

1. Defining a purpose of a 360-degree review

One of the most common reasons why 360-degree reviews go awry is that the review objectives are not fully defined, or not defined at all. In the section "Why use 360-degree feedback", we described in detail which goals 360-degree feedback helps to meet. As a 360-degree feedback organizer, you should determine whether the feedback results will be used for individual development, for making career decisions, or for both.

Without the support and active involvement of management, it is hard to conduct an effective 360-degree review, so first of all you should discuss the plans and goals of the review with the heads of the organization or department to make sure that the management team shares these goals.

In the case of a 360-degree review for employee development, reviewees (as well as their supervisors) should understand that the review is conducted not for the company but for them. The review itself will not change the employees - it is not a "magic pill" that develops skills on its own - so reviewees should draw conclusions from the feedback results and plan actions to develop their skills.

2. Informing participants about a review before its launch

After you have decided on your goals, notify the participants of the planned review.

What is important to do before launching the 360-degree review:

  • Explain to the participants the goals and benefits of 360-degree feedback (both for the organization as a whole and for each manager and employee)
  • Explain the value of 360-degree feedback results that reviewees will receive
  • Make sure that a review process is clear to the participants who receive feedback, as well as to their supervisors and reviewers
  • Define and announce to the participants the next planned steps after the review
  • Train managers how to read 360-degree feedback results of their subordinates and make a further plan of actions for their development
  • Train reviewers to give constructive feedback

We would like to draw special attention to the last point. The value of 360-degree feedback results is often criticized because, in contrast to a performance assessment where only a manager gives feedback, in 360-degree feedback the reviewers are colleagues with different levels of expertise. This disadvantage can be offset by additional training for reviewers on how to give feedback to colleagues properly. We understand that not all companies can run full-fledged training, so the most important tips for reviewers can be included in the review instructions sent by email and on the review’s welcome page.

For example, here is the kind of guidance you can send to reviewers:

  • Extreme ratings should not be given often. The maximum rating indicates that an employee exceeds expectations and shows an outstanding skill level with nowhere left to grow - this tip reduces the number of maximum scores that reviewers give when they are not sure how to answer
  • One person rarely has all of their skills at the maximum level, so reviewers shouldn’t feel uncomfortable giving less than the maximum score
  • If reviewers are not sure which score to give because they did not interact enough with their colleague, it is better to skip the question than to give a score at random
  • All reviewers’ responses are confidential (except for manager responses).

3. Technical setup of a 360-degree review

Let's take a closer look at what you need to do to set up a new 360-degree review.

3.1. Creating a 360-degree feedback questionnaire - selecting competencies (skills) / questions (behavioral indicators) for a review

To set up a new review, you need to select competencies (or a competency model) that are represented by behavioral indicators for evaluation. Those competencies should be relevant to the participants and to organizational or personal goals.

There are many definitions of competency, but since the context of 360-degree feedback is what matters to us, we will settle on the following definition: a competency is a set of skills, abilities, personal characteristics, and behaviors that help achieve the desired results. A behavioral indicator is an observable element of human behavior that indicates the level of proficiency in a particular competency. In a 360-degree questionnaire, behavioral indicators become the rating questions that reviewers answer.

For example, the Leadership competency can be expressed by the following behavioral indicators:

  • Takes responsibility for the team’s successes and failures
  • Motivates co-workers to achieve results through personal example
  • Motivates subordinates to reach set goals by applying an individual approach with each one
  • Develops employee skills and competencies through various formats (workshops, courses, books, new projects)
  • Delegates ambitious tasks to subordinates to help them grow professionally
  • Provides regular employee feedback, noting their accomplishments and correcting errors
  • Takes an objective and proactive position during conflicts between team members
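
As an illustration, a competency with its behavioral indicators can be represented as a small data structure from which a questionnaire is generated. This is a minimal sketch with hypothetical names (Competency, RATING_SCALE) and an assumed 5-point agreement scale; it is not tied to any particular survey tool.

```python
from dataclasses import dataclass, field

# A hypothetical 5-point agreement scale; pick one scale and reuse it across reviews.
RATING_SCALE = {
    1: "Strongly disagree",
    2: "Disagree",
    3: "Neither agree nor disagree",
    4: "Agree",
    5: "Strongly agree",
}

@dataclass
class Competency:
    name: str
    indicators: list[str] = field(default_factory=list)  # each indicator becomes one rating question

leadership = Competency(
    name="Leadership",
    indicators=[
        "Takes responsibility for the team's successes and failures",
        "Motivates co-workers to achieve results through personal example",
        "Provides regular employee feedback, noting accomplishments and correcting errors",
    ],
)

# A questionnaire is simply the list of competencies chosen for the review.
questionnaire = [leadership]
for competency in questionnaire:
    for indicator in competency.indicators:
        print(f"[{competency.name}] {indicator} (rate 1-5 or skip)")
```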

There are different methods for creating a competency model, and each has its own pros and cons. You can use ready-made competency models found on the Internet, work with consultants, or develop a competency model on your own. Whichever method you use, choose competencies and indicators that are important and applicable to the review participants' work. The phrasing of the indicators and their meaning should be clear to reviewers.

Plan ahead: to be able to compare results, the competency model in future reviews should be the same as in the review you are planning.

We are often asked how many competencies should be selected for a 360-degree feedback review and how many indicators should be included in a questionnaire. The answer depends on how often you plan to conduct reviews and on the number of reviewees and reviewers per reviewee. If you plan to organize reviews 3-4 times a year or more often, we recommend questionnaires with no more than 30-40 questions (4-7 competencies). If you plan to launch a review once a year, you can include 40-60 questions, or even more if you want a more complete picture. The more questions in your questionnaire, the longer it will take a reviewer to fill it out, which may lead to fewer textual comments. On the other hand, if your questionnaire consists of 20 or fewer items, you may have difficulty analyzing the results, as there is a risk of insufficient data.

Some tips for developing your own competency model for a 360-degree feedback review:

  • Avoid technical questions (for example, knowledge of a programming language). Reviewers should give feedback primarily on the observable behavior of reviewees. You can use technical questions if you know that reviewers are able to evaluate their colleagues’ professional skills and level of expertise, and that their feedback will be helpful for reviewees
  • Focus on the habits and skills, not the personality of reviewees
  • Stick to one scale: this will allow you to compare the results of reviews in the future, and it will be easier for reviewers to answer. Usually, a frequency or agreement/disagreement scale with 4, 5, 7, or 10 points is used.

In addition to rating questions (indicators) on competencies, consider adding open-ended questions that require a textual comment from reviewers. Although open-ended questions take more time to answer than rating questions, textual comments may contain insights that make your review much more valuable for participants. For the same reason, it is recommended to give reviewers the option to leave a comment on each indicator to explain the rating they gave. Sometimes such comments are required for low or high ratings, which certainly provides additional food for thought for reviewees, but it may lead to reviewers avoiding low or high ratings so they don’t have to write additional text.

Examples of open-ended questions:

  • Which 1-3 things should your colleague continue doing to be effective as a leader?
  • Which 1-3 things should your colleague do less of to be effective as a leader?
  • Write any additional comments which might help your colleague to improve.

3.2. Making a list of reviewees and reviewers

360-degree feedback is impossible without reviewees (those who receive feedback) and reviewers (those who give feedback by filling out questionnaires). While it is easy to create a list of reviewees, since you usually know who is going to get feedback in your planned review, selecting reviewers for each reviewee can be more difficult, so we will go into more detail on making the list of reviewers.

Selecting reviewers for reviewees is an important part of setting up a 360-degree review. The review of each reviewee should combine the points of view of the employee themselves, their supervisor, colleagues, subordinates, and even customers if appropriate. The combined opinion gives a more complete picture of an employee's skills and creates a balance between different points of view. Reviewers in 360-degree feedback are divided into groups so that the average scores of the groups can be compared with each other and with the overall average score.

What groups of reviewers can be distinguished:

  • Self (self-evaluation). We recommend that you always include self-assessment to find similarities and differences
  • Manager. Usually, this is only one person, though an option with multiple managers for one reviewee is also possible - for example, if your company has a matrix management structure or teams are formed from scratch for every new project
  • Peers. Depending on the situation, peers can be one general group, or be divided into several: peers from the same team, peers outside the team, etc.
  • Subordinates (or direct reports)
  • Clients
  • Partners

There may be other groups of reviewers for 360-degree feedback, but we recommend using groups that are clear to the reviewers so that there are no difficulties with reading review results and understanding which of the reviewers belongs to which group.

There are different ways of choosing reviewers, each with its own pros and cons:

A. A review administrator assigns reviewers to each reviewee based on their knowledge of working relationships, as well as suggestions from reviewees and their supervisors

  • + A review administrator controls the load on reviewers
  • + An administrator can be sure that reviewers are assigned to reviewees in a timely and deliberate manner
  • + There is no need to train participants on how to select colleagues from whom they would like to receive feedback
  • - It may take quite a lot of work for a review administrator if there are many reviewees
  • - Reviewees may not feel involved in 360-degree feedback, as they are left out of the process of choosing reviewers

B. A reviewee chooses their own reviewers, and after that their choice can be edited/approved by their manager or review administrators

  • + A reviewee feels like a full-fledged participant in the 360-degree review: they choose the colleagues from whom they would like to get feedback, so the feedback is perceived as more valuable
  • + The process of selecting reviewers is almost completely distributed among reviewees, and less time and effort is required from review administrators to set up the review
  • - Additional training for reviewees on how to choose their reviewers effectively is recommended, which requires resources. If you don’t want to organize full training, you can create a short guide for reviewees. An easy-to-use 360-degree feedback software interface also reduces the likelihood of errors in the selection of reviewers
  • - Some reviewees do not want to choose their reviewers or don’t have time for it, so their reviewers still have to be selected by their managers or review administrators. Set aside time in advance for an approval process, during which you can decide what to do with reviewees who have no reviewers

C. A list of reviewers is formed automatically based on an analysis of the frequency of employee interaction - for example, an analysis of emails or the organization's structure

  • + No time required to choose reviewers for review administrators and reviewees (except for approval of lists)
  • - Unpredictable and uncontrolled selection of reviewers: a 360-degree feedback service does not always know about all of a reviewee’s work contacts. There is also less involvement from reviewees, as they don’t participate in the reviewer selection process
  • - Access to internal company data is required, which is not always possible to provide due to security regulations

Regardless of what reviewer selection option is used, review administrators should be able to edit the lists of reviewees and reviewers both before and after the start of the review.

What you should pay attention to when choosing reviewers:

  • Include self-assessment and manager evaluation. Sometimes review administrators forget to do this, and the results then lack an important comparison between the points of view of the reviewee, the manager, and other colleagues
  • If reviewees choose reviewers for a review, set a limit on the minimum and maximum number of reviewers
  • Reviewers should be able to give feedback to reviewees. If a reviewer rarely interacts with a reviewee or does not interact at all, their feedback may not be of any value. Give reviewers the opportunity to skip certain questions or the whole review form, so they are not forced to give feedback when they have not interacted with the reviewee enough
  • If there are too many reviewers for one reviewee, divide them into groups. For example, you can separate peers into peers inside/outside the team or internal clients
  • Reviewer groups (peers, direct reports, clients, etc.) should have at least 3 people; the exception is self- and manager assessment. 5-8 reviewers is a typical number for reviewees without subordinates; managers may have more. In our practice, we have seen 90 reviewers for a single reviewee, but this is certainly an exception. If a reviewee has only one subordinate, assign that subordinate to the peers group if you want to maintain the confidentiality of reviewers’ feedback.
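
Before launch, it can help to check the reviewer lists programmatically so that no group is too small to stay anonymous. A minimal sketch, assuming a simple mapping of reviewer groups to names; the group names and the minimum of 3 follow the recommendations above.

```python
MIN_GROUP_SIZE = 3              # recommended minimum per group, except "self" and "manager"
EXEMPT_GROUPS = {"self", "manager"}

def validate_reviewers(reviewers_by_group: dict[str, list[str]]) -> list[str]:
    """Return warnings for reviewer groups that are too small to keep feedback confidential."""
    warnings = []
    for group, reviewers in reviewers_by_group.items():
        if group in EXEMPT_GROUPS:
            continue
        if len(reviewers) < MIN_GROUP_SIZE:
            warnings.append(
                f"Group '{group}' has only {len(reviewers)} reviewer(s); "
                f"merge it into 'peers' or add reviewers to preserve anonymity."
            )
    return warnings

# Example: a reviewee with a single subordinate triggers a warning.
print(validate_reviewers({
    "self": ["A. Reviewee"],
    "manager": ["B. Manager"],
    "peers": ["C", "D", "E"],
    "subordinates": ["F"],
}))
```
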
3.3. Basic settings of a 360-degree review

In addition to selecting the competencies/indicators for your review and specifying reviewees/reviewers, you need to pay attention to the following when you are setting up a review:

  • Review dates. In addition to the start and end dates of the review, you may need a deadline for reviewees to select reviewers and a deadline for managers to approve the reviewer lists, if you plan for reviewees to pick reviewers themselves. For small groups of reviewees (up to 30), we recommend limiting a 360-degree feedback review to one or two weeks. Reviews with a large number of reviewees may take 3-4 weeks, sometimes longer. However, we do not recommend stretching the review too much, as this may make feedback less relevant (outdated or influenced by other reviewers)
  • Invitation/reminder texts. Add your company name or the reviewee's name (if there is only one) to invitation emails (or other messages) so that reviewers do not ignore your invitation. A short instruction or a description of the review's goals in communications to reviewers can also be very helpful
  • Results publication settings. Decide in advance who gets access to the 360-degree feedback results. If you are conducting a review for the first time, it is reasonable to give access to the results only to administrators and to the reviewees' managers once the review is completed; managers then share the results with reviewees before or during a dedicated meeting to discuss the review. This avoids situations where reviewees receive the results of their first 360-degree review on their own, do not discuss them with their manager or HR for a long time, and risk drawing the wrong conclusions. If reviewees have already participated in 360-degree reviews, you can publish the results to them immediately after the review is over. When choosing a third-party service for a 360-degree feedback review, check whether it is possible to publish results for all reviewees at once or separately for each of them after the review ends.
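
The settings above can be collected in a simple configuration object so that nothing is forgotten before launch. This is a sketch with assumed names and illustrative dates (ReviewConfig and its fields are hypothetical); an actual third-party service will have its own settings screen.

```python
from dataclasses import dataclass
from datetime import date
from typing import Optional

@dataclass
class ReviewConfig:
    start_date: date
    end_date: date
    reviewer_selection_deadline: Optional[date]  # needed only if reviewees pick their own reviewers
    approval_deadline: Optional[date]            # deadline for managers to approve reviewer lists
    invitation_text: str                         # mention the company/reviewee and the goals of the review
    publish_to_managers_first: bool              # True is a reasonable default for a first-time review

config = ReviewConfig(
    start_date=date(2024, 3, 4),
    end_date=date(2024, 3, 15),                  # one to two weeks for a small group of reviewees
    reviewer_selection_deadline=date(2024, 2, 28),
    approval_deadline=date(2024, 3, 1),
    invitation_text="360-degree review at Acme Corp: please give feedback to your colleagues.",
    publish_to_managers_first=True,
)
```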

Administration and participation in a 360-degree feedback review

So, the 360-degree review has been launched. Reviewers have received invitations and started filling out questionnaires. Let's look separately at what happens on the review administrators' side and on the reviewers' side at this stage.

Administration of a 360-degree feedback review

The main goal for administrators during the review is to increase the number of completed review forms, which is why the ability to track the progress of your review is so important. As an administrator, you should track completed self-evaluation and manager assessment forms, as well as the list of reviewers who have not yet started filling out their questionnaires. Reminders should be sent frequently enough - if you use a specialized third-party service for 360-degree feedback, it may be possible to send reminders only to reviewers with incomplete review forms, or to send reminders individually. The frequency of reminders depends on the duration of the review: for reviews of three weeks or more, it is enough to send reminders once a week at the beginning and shorten the interval closer to the end; for short reviews (a week or less), we recommend sending reminders every two days or even more often.
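
Tracking completion and choosing a reminder cadence can be sketched roughly as follows; the data shape and function names are assumptions for illustration, not the API of any particular tool.

```python
from datetime import timedelta

def reviewers_to_remind(forms_completed: dict[str, bool]) -> list[str]:
    """Given reviewer -> 'all assigned forms completed' flags, return who still needs a reminder."""
    return [reviewer for reviewer, done in forms_completed.items() if not done]

def reminder_interval(review_length_days: int) -> timedelta:
    """Longer reviews can start with weekly reminders; short reviews need them every couple of days."""
    return timedelta(days=7) if review_length_days >= 21 else timedelta(days=2)

progress = {"alice": True, "bob": False, "carol": False}
print(reviewers_to_remind(progress))             # ['bob', 'carol']
print(reminder_interval(review_length_days=10))  # 2 days, 0:00:00
```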

Participation in a 360-degree feedback review

We have already written about the importance of training reviewers on how to give feedback to their colleagues. Even a short instruction will be useful and will make the results more helpful for feedback receivers.

If you use a third-party service, make sure its user interface does not cause difficulties for reviewers when they fill out review forms: reviewers should be able to return to their questionnaires at any time without losing answers when the page is reloaded, the review forms should be mobile-friendly, and reviewers should be able to see the list of all colleagues they need to give feedback to on one page.

A reviewer should receive only one invitation to the review, even if they need to give feedback to several colleagues. If the reviewer receives one invitation for each colleague, they may get confused and miss someone’s review form.

In situations where a reviewer asked to provide feedback hasn’t interacted closely with that colleague, provide an option to skip individual questions or the entire review form.

Make sure that reviewers can easily contact technical support or administrators for any questions - the more support channels you have for reviewers, the better.

360-degree feedback results

Congratulations, your 360-degree review is over! It might seem that the project is complete, but in fact the work is just beginning, since the results are what the review was conducted for. What should you do with the results? Send them to reviewees immediately or with a delay? Should managers receive their subordinates' results before those results are published to the subordinates? When analyzing the results, what should you pay attention to first? Below, we will look at these and other questions that arise for administrators of 360-degree feedback campaigns.

Who should have access to 360-degree feedback results

As we have already said, before conducting a 360-degree feedback review, you need to determine your goals. If your only goal is employee development, then the review results should be available only to reviewees - whether to share those results with anyone else is up to them. If, in addition to individual development, you also plan to make career decisions or organize training programs in the company, then the results should be available to both HR employees and managers. Some companies do not provide results to reviewees at all, sending them only to managers - we consider this bad practice, since the effectiveness of 360-degree feedback largely depends on the involvement of reviewees and reviewers in the process, and if they do not see the final result and/or any positive changes, they begin to treat such reviews as an unnecessary formality.

If the majority of reviewees have not previously participated in 360-degree feedback reviews, it is better to get them familiar with their reports in 1-on-1 meetings to avoid mistakes in the interpretation of the results.

If you are conducting a 360-degree review for yourself, consider showing the results to your supervisor or an experienced colleague/coach to hear their feedback on your report and what the next steps could be.

The 360-degree feedback report structure

How to read 360-degree review results and what to pay special attention to?

360-degree review reports from third-party services have a similar structure and elements. The information below is based on a typical report provided by Wystra, but even if you use another service or run reviews yourself with survey forms built in Google Forms or something similar, this information will still apply.

360-degree feedback can answer the following questions:

  • Why should I increase my effectiveness in a particular area or improve my skills?
  • What exactly do I need to improve?
  • How can I get better?

Your 360-degree report should shed light on these questions, so the structure and elements of the report are designed to give you the insights you need.

In the report, you will see many elements that compare a reviewee’s self-assessment with the assessment of others. This comparison allows reviewees to improve their understanding of how they are seen by their colleagues, and how the opinion of colleagues differs from the reviewees’ opinion of themselves. Through that comparison, reviewees can identify strengths that they themselves rated low or, vice versa, discover areas that they overestimated.

Usually, the information in the report is arranged according to the principle from general to specific.

The first half of the report covers competencies, a comparison between the self-assessment and the assessment of others, and the indicators with the highest and lowest ratings. The second half displays detailed information on each competency and indicator, with average ratings per group, comments from colleagues, and answers to open-ended questions.

Let's look at the main elements of the 360-degree review report.

Competency rating list

The list of competencies is usually presented as a table or list with an overall average rating across all reviewer groups (excluding self-assessment). Often, the list also includes the average self-assessment rating, so that you can immediately see similarities and differences.
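
To illustrate how the numbers in such a list are typically derived, here is a minimal sketch that averages ratings per reviewer group and overall (excluding self-assessment); the data shape is an assumption, not the format of any specific report.

```python
from statistics import mean

# ratings[competency][group] -> indicator ratings given by reviewers in that group
ratings = {
    "Leadership": {
        "self":         [5, 4, 5],
        "manager":      [4, 3, 4],
        "peers":        [4, 4, 3, 5, 4],
        "subordinates": [5, 4, 4],
    },
}

for competency, by_group in ratings.items():
    others = [r for group, scores in by_group.items() if group != "self" for r in scores]
    overall = mean(others)               # the overall average excludes self-assessment
    self_avg = mean(by_group["self"])
    print(f"{competency}: overall {overall:.2f} vs self {self_avg:.2f}")
    for group, scores in by_group.items():
        print(f"  {group}: {mean(scores):.2f}")
```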

Comparison of competency ratings by roles (reviewer groups)

This comparison is usually shown as a radar chart, where you can see how the average competency ratings from peers correlate with the ratings from the manager, subordinates (if any), and the self-assessment. For example, subordinates may overestimate skills while the manager underestimates them; on the radar chart, such a discrepancy is quite obvious.

Comparison of self-assessment with reviewers’ assessment

Often in reports you will see a “Johari Window” section. In addition to the radar chart, this section helps you better understand how reviewees’ self-evaluation of their skills relates to how they are perceived by others. The window consists of four zones (quadrants) - Open (obvious strengths), Blind, Façade (hidden strengths), and Unknown (development needs) - into which competencies fall.

The four quadrants of the Johari Window in the context of 360-degree feedback

Known Strength (Open)

Both the self-rating and the combined rating from others are high. This area represents the actions, behaviors, and information that are known both to the individual and to the people around them.

Hidden Strength (Façade)

The self-rating is high, but the rating from others is low. This area represents the actions, behaviors, and information that are known to you but not to anyone else.

Unknown

Both the self-rating and the rating from others are low. This area represents the actions, behaviors, and information that are unknown both to you and to others.

Blind Spot

The self-rating is low, but the rating from others is high. Actions and behaviors in the Blind Spot are known to others, but the individual is not aware of them. The information in this area can be positive or negative and may include hidden strengths or areas for improvement.
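
Using the quadrant definitions above, placing a competency in the Johari Window comes down to comparing the self-assessment average with the average from others against a threshold. A sketch assuming a 5-point scale with the midpoint as the threshold:

```python
def johari_quadrant(self_avg: float, others_avg: float, threshold: float = 3.5) -> str:
    """Classify a competency by comparing self and others' average ratings to a midpoint threshold."""
    if self_avg >= threshold and others_avg >= threshold:
        return "Known Strength (Open)"
    if self_avg >= threshold and others_avg < threshold:
        return "Hidden Strength (Facade)"
    if self_avg < threshold and others_avg >= threshold:
        return "Blind Spot"
    return "Unknown"

print(johari_quadrant(self_avg=4.3, others_avg=4.1))  # Known Strength (Open)
print(johari_quadrant(self_avg=2.8, others_avg=4.2))  # Blind Spot
```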

Behavioral indicators with highest and lowest ratings

This section lists the indicators that received the highest and lowest ratings from reviewers (except for self-evaluation). In addition to the indicators, the relevant competencies are also indicated. This section can be very useful in identifying strengths and areas for development.

Detailed analysis of each competency

In this section, reviewees see the average ratings by group (reviewer role) for each competency and its indicators. In addition to the breakdown by group, the average rating across all groups is also shown. In the detailed information, pay attention to the spread of scores - it shows whether reviewers’ scores are consistent. If the discrepancy in scores is large, analyze why this could happen: it is possible that colleagues see the participant completely differently depending on the project or tasks they work on together.
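
One simple way to spot such discrepancies is to look at the spread of reviewers’ scores for each indicator, for example the standard deviation. A sketch with an assumed flagging threshold and illustrative data:

```python
from statistics import mean, pstdev

def flag_inconsistent(scores: list[int], max_spread: float = 1.0) -> bool:
    """Flag an indicator whose reviewer scores disagree more than the chosen threshold."""
    return pstdev(scores) > max_spread

indicator_scores = {
    "Delegates ambitious tasks to subordinates": [5, 5, 4, 4],  # consistent
    "Provides regular employee feedback":        [5, 2, 4, 1],  # reviewers disagree strongly
}
for indicator, scores in indicator_scores.items():
    status = "check why opinions differ" if flag_inconsistent(scores) else "consistent"
    print(f"{indicator}: mean {mean(scores):.1f}, spread {pstdev(scores):.2f} -> {status}")
```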

Recommendations on how to analyze feedback from colleagues

  • Feedback should be taken in a positive way. Colleagues who give you feedback are, with rare exceptions, trying to help you become better
  • Self-evaluation ratings may differ from the opinions of others, and this is normal. Each colleague who gives feedback has their own opinion and experience, and their feedback is influenced by your working relationship and the impression you make on them. Each reviewer’s opinion is unique and valuable to the reviewee
  • Do not rush to develop the competencies that received low scores immediately after the review ends. Consider discussing the results with your manager or a coach/mentor first. It may turn out to be more effective to further develop the competencies that received high scores, since the others are already at a sufficient level
  • Don't let your emotions overwhelm you while analyzing the review results. Feedback from colleagues can be pleasant, but it can also cause discomfort - try to focus on the overall picture of the report, and remember that feedback helps you develop as a professional and become better
  • Discuss the results with your reviewers. 360-degree feedback is usually confidential, and you don't know which colleague wrote a particular comment or gave a specific rating, but that shouldn’t stop you from discussing the results with your colleagues. You don’t need to share the full results - just mention the points that are still unclear to you. Your colleagues will appreciate your openness and interest. Conversely, if you do not talk to your colleagues about your results, they may give you less valuable feedback in the future, since they don’t see any return on it. Feedback from colleagues is a kind of investment: in the long run, colleagues want to see that their investment has paid off and their feedback has helped you

360-degree feedback - next steps

As we wrote above, 360-degree feedback is not a "magic pill". All employees - survey administrators, managers, reviewees, and reviewers - should remember that 360-degree feedback will not change anything by itself, and it will take time and effort to develop competencies based on feedback results.

Once reviewees have received their reports, they need to determine which competencies or behaviors they want to improve over the next few months. The planned actions for developing competencies can be recorded in the form of an individual development plan (IDP). An IDP is a document that helps an employee develop the necessary competencies. It includes a list of development goals and the activities that allow the plan owner to achieve them. An IDP is created for several months and, as a rule, includes no more than three goals - with more goals, it becomes difficult to focus on developing specific competencies because there are too many tasks. For each goal, a list of activities (tasks) is specified. It is recommended to formulate goals and tasks in a format that allows plan owners to clearly determine what needs to be done and in what time frame - for example, the SMART format. SMART stands for Specific, Measurable, Achievable, Relevant, Time-bound. A SMART task for the Leadership examples above might be, for instance, "hold a 30-minute one-on-one with each direct report every two weeks for the next three months."

The next 360-degree review should preferably be scheduled after the individual development plan is completed. If you repeat the review with the same competencies in the questionnaire, its results can be compared with the results of the previous review, which gives additional food for thought. Sometimes competency scores in a repeated review are lower than in the previous one - most often this happens because reviewers have started rating more thoughtfully and no longer inflate their scores.

How regularly to conduct 360-degree feedback reviews is up to you, depending on your goals. We recommend conducting a review at least once a year. Since, on average, at least 6 reviewers give feedback to each reviewee, conducting such a review more often than once a quarter may be too time-consuming.

By participating in 360-degree reviews, employees learn to give their colleagues feedback that motivates them to improve and strengthens working relationships. Perhaps not all feedback in the first review will be informative, but the effectiveness of 360-degree reviews increases over time.

Conducting a 360-degree feedback review using a third-party service will allow you to avoid most of the administrative work and focus on the main thing - the results. If you have any questions about this guide or would like to see our platform in action, write to us at [email protected].

About Wystra

You can organize a 360-degree review for employees in your company, or for yourself, using a general-purpose survey service (for example, Google Forms). In this case, you will need to send questionnaire links to reviewers, ask reviewers to specify whom they are giving feedback to and in what role, and process the results yourself - this is time-consuming and challenging, so we recommend using a specialized third-party service for 360-degree feedback.

Wystra allows you to easily and quickly organize both regular and one-time 360-degree feedback reviews for yourself, your team, or a company. Using our service, you get the following:

  • Free demo version - you can organize 360-degree reviews for free for 1-5 reviewees, depending on the pricing plan
  • A database of competencies and behavioral indicators that you can use in your reviews. You can add your own competencies and indicators
  • Easy administration of 360-degree reviews: add or edit reviewees and reviewers before and after the launch of the review, send reminders to everyone with incomplete review forms or to individual reviewers, and track the number of completed review forms for each reviewee and reviewer
  • Selection of reviewees and reviewers either by review administrators (using Wystra’s interface or an Excel file) or by reviewees themselves (each reviewee chooses the colleagues from whom they would like to receive feedback, and their manager or a review administrator approves the selection)
  • Automatic processing of review results. For reviews of up to 50 reviewees, results become available within a few minutes after the review is completed
  • Flexible configuration of access to review results: you can make the results available to all reviewees and their supervisors immediately after the review is over, or publish the results individually
  • Individual development plans for 360-degree review participants
  • Comparison of results with previous reviews to track changes over time
  • Technical and consulting support from Wystra specialists. We will help you with setup or completely take over the administration of your surveys, and adjust the scale, the appearance of the review forms, and the competency model in accordance with your goals. If you are planning a review for the first time and decide to set it up yourself, we will double-check that everything is configured correctly.

Ready to start? Leave a demo request or start a free trial, and we will contact you to clarify your goals and set up an account for you.

Try Wystra free