
Know what your team really thinks - use pulse surveys for insight and action [leadership best practices]

The people who make up an organization are its most important asset. Without the talent in the company, nothing happens (well, at least until AI replaces us all). When a company is small and intimate, you have enough interactions to stay in tune with the team. As the team grows, it becomes much more difficult to have a true sense of how the team is doing. Remote and hybrid work make it harder still, as you don’t get to read non-verbal cues or interact with everyone in the same way.


A board showing a scale to rate "how was your day"

So how do you know how the organization really feels? Over the years I’ve found and refined a specific technique I refer to as a pulse survey, and it has worked really well for me. I’ve used it in my own organizations, brought it to others and evangelized it to many people I’ve worked with. Today I am sharing this leadership best practice with you in the hope that you also find it as useful as I did.


Things you should know before you proceed

  • Some feedback will be difficult – Be grateful for getting difficult feedback and being able to act on it, never be resentful about it.

  • You will be sharing the feedback with everyone – You will be letting the good, the bad and the ugly air out in the open. This is a good thing!

  • Getting the feedback is only half the work – You must demonstrably take steps to act on whatever comes up and show the team their voice is heard.


So how does it work?

The method involves a few key elements. You can always experiment and adapt this to suit your needs and style, but some elements are critical in my view. You should also commit to Trust, Transparency, and Consistency for this to work well.


The key elements of the pulse survey:

  • An anonymous survey

  • Simple structure with a fixed happiness question and a dynamic part

  • Sharing the results with everyone

  • Acting on the results

  • A fixed cadence



A short anonymous survey

The feedback is collected through a short anonymous survey sent to everyone in the organization. You can use any platform to create it - there’s no need to invest in complex solutions. Google Forms, Microsoft Forms, SurveyMonkey, etc. will do.


The survey should be easy and quick to fill in, typically taking no more than 5 minutes. If there’s more feedback or insight you want to collect, you can either set up a dedicated survey or wait for the next survey instead of overloading the current one.


Part of what makes this work is that people feel safe sharing how they truly feel. Don't require an email or login to submit a response, and avoid questions that could raise concerns about anonymity. In larger organizations you can consider asking for a department affiliation, but only if the departments are big enough to preserve anonymity and seeing them separately is important.


Survey Structure

The Happiness Index – the fixed part

The first question in the survey will be the same forever. I call it the happiness index, but you can name it whatever you like.


“How happy are you in the organization?”

An example of the rating scale (1 to 10) for the happiness index question, taken from a real survey

I typically use a 1-10 scale as shown above. A 1-10 scale captures more nuanced responses and makes it possible to track trends over time (more on that later).


The numeric response is followed by an optional free-text field for sharing the reasons behind the score. Remind people that whatever they write will be visible to everyone.


Add-on questions

Beyond the happiness index question, I would add between one and four add-on questions to each survey. Use these additional questions to get the team’s input and feedback on specific topics, or to have fun and get to know each other as a team.


Each add-on question is typically a structured question with preset options (such as a numeric scale, multiple selection or prioritization) followed by a free-text question to add some color to the response. I always kept free-text responses optional while keeping structured ones mandatory. Keep the 5-minute rule in mind and avoid lengthy surveys.


Example add-on questions

  • Rate the organization on productivity on a scale + free text suggestions for improvement.

  • How confident are you in meeting the upcoming milestone(s) + free text on primary reasons.

  • How well do you understand the company’s vision + free text on what can make it clearer or more compelling.

  • Asking about concerns or excitement related to some recent change or event at the company.

  • Asking about preferences or opinions about a key decision that is coming up (crowdsourcing opinions).

  • Sharing something fun or personal (creates a cool view of the team’s make-up and serves as a great conversation starter). Things like favorite cuisine, sport, hobby, TV show, etc.


Sharing the results with everyone

Very shortly after the survey concludes (more on that later), share the aggregated results with everyone. I usually compiled the results into a document or presentation, but any format will do.


A graph showing example results for the happiness index question

I would typically extract the aggregated results for each question from the survey tool itself or take the CSV export and generate simple visualizations of the results.
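
If you go the CSV route, a few lines of code are enough. Here is a minimal sketch in Python, assuming the export has a column named "happiness" holding the 1-10 score (the file name and column name are assumptions - adjust them to whatever your survey tool actually produces):

    # Minimal sketch: turn a survey CSV export into a happiness-index chart.
    # "pulse_survey_export.csv" and the "happiness" column are assumptions;
    # adjust them to match your survey tool's export.
    import pandas as pd
    import matplotlib.pyplot as plt

    responses = pd.read_csv("pulse_survey_export.csv")

    # Count how many people picked each score, keeping empty buckets visible.
    distribution = (
        responses["happiness"]
        .value_counts()
        .reindex(range(1, 11), fill_value=0)
        .sort_index()
    )

    distribution.plot(kind="bar", rot=0)
    plt.xlabel("How happy are you in the organization? (1-10)")
    plt.ylabel("Number of responses")
    plt.title(f"Happiness index (n={len(responses)}, mean={responses['happiness'].mean():.1f})")
    plt.tight_layout()
    plt.savefig("happiness_index.png")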


For free-text responses, I share them verbatim and never leave anything out - this goes back to trust and transparency. If the free text accompanies a structured response on a scale, I would sometimes group the comments by low vs. high scores so they are easier to interpret.
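
A minimal sketch of that grouping, reusing the assumed "happiness" column from the sketch above plus an assumed "comment" column, with an assumed cut-off of 6 for a "low" score:

    # Minimal sketch: group free-text comments by low vs. high scores.
    # The "happiness" and "comment" columns and the cut-off of 6 are assumptions.
    import pandas as pd

    responses = pd.read_csv("pulse_survey_export.csv")
    with_comments = responses.dropna(subset=["comment"])

    low = with_comments[with_comments["happiness"] <= 6]
    high = with_comments[with_comments["happiness"] > 6]

    print(f"Comments from lower scores (1-6), {len(low)} responses:")
    for text in low["comment"]:
        print(f"  - {text}")

    print(f"\nComments from higher scores (7-10), {len(high)} responses:")
    for text in high["comment"]:
        print(f"  - {text}")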


I preferred sharing the results in an open communication platform (a Slack or Teams channel, some other form of community communication, etc.). It speaks to the value of transparency and encourages a dialogue directly on those platforms.


When I shared the results, I would add my immediate reflections. This is optional, but I believe it is important. Always acknowledge the feedback as truthful and thank the team for providing it.


My recommendation is not to share individual responses as it creates some risk of singling out an individual, a risk I never wanted to take.


Acting on the results

The reason you are collecting this feedback is to inform decisions and act on it. If the team feels this is a one-way street, engagement will drop off and the input can turn cynical.


Bringing the results into the upcoming team meetings is a great way to have a conversation in a safe space. After all, it is based on everyone's feedback. If you decide on specific actions, share them with your team and follow through. This is what perpetuates the magic.


Sometimes you need to take more specific action. For example, if you asked about confidence in meeting the upcoming project milestone and the score was low: review the project plans, assess the project risks, activate mitigations, and so on. The team has just raised a collective red flag right in front of you; it would be foolish to ignore it.


A note on the happiness index: some changes and fluctuations are expected, and the events or situations the organization is going through will also affect the scores. Look at the trends and the distribution more than at the absolute score.
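
If you keep each round's export, a short script makes the trend easy to see. A minimal sketch, assuming one CSV per survey round named by date (e.g. a hypothetical pulse_2024-01.csv) with the same assumed "happiness" column as before:

    # Minimal sketch: track the happiness-index trend and distribution across rounds.
    # The "pulse_*.csv" naming pattern and the "happiness" column are assumptions.
    import glob
    import pandas as pd
    import matplotlib.pyplot as plt

    rounds = {}
    for path in sorted(glob.glob("pulse_*.csv")):
        # Derive a round label from the file name (requires Python 3.9+).
        label = path.removeprefix("pulse_").removesuffix(".csv")
        rounds[label] = pd.read_csv(path)["happiness"]

    # Summarize each round: the average score and the share of low (<= 6) scores.
    trend = pd.DataFrame({
        "mean": {label: s.mean() for label, s in rounds.items()},
        "share of 6 or less": {label: (s <= 6).mean() for label, s in rounds.items()},
    })
    print(trend)

    trend["mean"].plot(marker="o", ylim=(1, 10))
    plt.ylabel("Average happiness index")
    plt.title("Happiness index trend across surveys")
    plt.tight_layout()
    plt.savefig("happiness_trend.png")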


A fixed cadence

Embarking on this journey should be a long-term commitment. Decide on a cadence and do your best to stick to it. I recommend one to three months between surveys, not longer. Pick a cadence that fits your organization and that you can commit to and stick with.


To get maximum engagement (a response rate consistently above 85%, which is very good), I found the following formula worked well:

  1. Announce the survey at the beginning of the work week by email + community (slack, teams, yammer, etc).

  2. On Wednesday, send a reminder email stating that most of the team has already responded and encouraging those who have not yet done so to join in.

  3. Final reminder on the last day of the work week indicating the survey will soon close.

  4. Also have your team leaders encourage folks to respond so their voice is heard.

  5. Share the results at the beginning of the following week.


Some interesting experiences

I’ve had many invaluable insights based on the results themselves which I believe I wouldn’t have obtained otherwise. It wasn’t always pretty, but I was always grateful for knowing the truth and having the opportunity to act on it. In this case, ignorance is not bliss.


I also observed the following additional outcomes from this practice:

  • Raising awareness across the entire organization – Since results are shared with everyone, this created a heightened collective awareness. Some highlighted topics became top of mind and got the team thinking of solutions.

  • A calibration of views – Some individuals who held an extreme opinion, or believed that everyone thought a certain way, got to calibrate their view when they saw the responses of the collective team.

  • Creating dialogue within teams – The survey results became a common conversation topic among the teams. There was a genuine level of excitement and anticipation of the results whenever a survey was kicked off. You could sense this energy in the teams.

  • More open dialogue and feedback – On more than one occasion, people came to talk to me after a survey to share an idea I would not have heard otherwise. Sometimes it was something they had been thinking about for a while but had never acted on.


But my organization already has engagement surveys

People in larger organizations may already have some form of employee engagement survey in place. These tend to run 20+ questions and serve slightly different purposes. I was in the same boat when I introduced the pulse survey in my department.


My recommendation is to share what you intend to do with your HR/People Ops partner and involve them in the process and results. Explain how this is complementary and how you intend to use it. Since the surveys are short, the overhead on the team is marginal. I always got positive responses when I did so.


What was this inspired by?

Back in 2014 I read the book “The Decoded Company: Know Your Talent Better Than You Know Your Customers” by Leerom Segal, Aaron Goldstein, Jay Goldman and Rahaf Harfoush. I believe this book is a nice little gem in the management and leadership category that is not as well known as other books. It includes many real-world examples of data-driven practices at various companies and has some great insights.


The book contains a few references to approaches similar to the one outlined above, but with some key differences. The first prominent one is a tool initially developed and used internally at 37signals by Jason Fried, which they later commercialized and spun out as knowyourcompany.com (the company has since pivoted towards leadership training). The second is a happiness-at-work pulse survey (the book’s original link now redirects to fridaypulse.com).


These served as an inspiration for me to experiment with my own take on this, which I’ve refined over the years through practice.
