
Why Team Intelligence Is Emerging Now

Four forces are converging in 2026 to put team intelligence on the map for managers and HR leaders. Here is what is driving the shift and what to watch for in your own organization.

By Asa Goldstein, QuestWorks

TL;DR

  • Distributed and async work stripped away the casual signals leaders used to read in the room. Surveys and dashboards have not filled the gap well.
  • AI is absorbing routine individual work, so what is left for humans is increasingly collaborative judgment. The unit of analysis is shifting from person to team.
  • People analytics matured but stayed individual-keyed. The team layer was never named or built, so practitioners knew something was missing without having a word for it.
  • The science (Woolley 2010, Riedl 2021, Project Aristotle, Levy 2025) is stable enough that team-level behavior is now a credible thing to measure and act on.

If you work in HR, run a team, or sit anywhere near people strategy, you have probably heard the phrase "team intelligence" enter conversation in the last twelve months. It shows up in Bersin reports, in MIT Sloan articles, in vendor decks, and increasingly in everyday hallway talk among managers trying to explain what is broken on their teams that engagement scores are not catching.

If you want a definition first, that is covered in What Is Team Intelligence?. This piece is about why now. Why this concept is landing on managers' and HR leaders' desks in 2026 instead of 2016 or 2036. Four forces are converging at the same time, and each one alone would be interesting. Together, they are the reason team intelligence has a name worth knowing.

Force 1: Distributed work made teams illegible

For most of corporate history, leaders read teams the same way you read a room at a dinner party. You walk past someone's desk, you notice who is sitting with whom at lunch, you watch who shifts in their chair when the new initiative is announced. None of it was written down, but it was a continuous behavioral signal. Most of what good managers knew about their teams came from these incidental reads.

Distributed and hybrid work erased that channel. Stanford's WFH Research project finds that paid full days worked from home stabilized at roughly 28% in the U.S. through 2024 and 2025, up from about 7% pre-pandemic (WFH Research, Stanford). For managers, the question is no longer whether remote is good or bad. The casual data stream they used to rely on is gone for a meaningful share of their workforce, and the formal replacements are weak.

Gallup's State of the Global Workplace 2024 reports that only 23% of employees worldwide are engaged, 62% are not engaged, and 15% are actively disengaged. Manager engagement specifically dropped from 30% to 27% year over year, and the global productivity loss tied to disengagement is estimated at $8.9 trillion. Surveys give you the temperature of a workforce at a moment in time, but they do not surface the granular team-level patterns that used to be visible by walking around: the senior engineer on Team B who has stopped pushing back in design reviews, or the new hire on Team C who is being talked over without realizing it. Those signals used to surface for free in a shared office, and they no longer do.

That is the visibility loss, and it is the first force pushing team intelligence onto leaders' agenda. They need a way to see how teams behave when they cannot watch them work in person.

Force 2: AI is shifting the unit of analysis from individual to team

The second force is harder to name but possibly the largest. As AI absorbs more of the routine, individual-scoped work (writing first drafts, summarizing meetings, generating reports, triaging tickets), what is left for humans is increasingly collaborative judgment. Deciding what to build. Resolving disagreement about scope. Reading a customer situation no one on the team has seen before. Choosing which AI output to trust.

When the work was largely individual, individual productivity metrics were a reasonable proxy for team output. When the work becomes a sequence of joint decisions under uncertainty, those metrics fall apart. You can have five "10x" individuals who, as a team, ship the wrong thing six months in a row.

McKinsey's research on team effectiveness, drawing on data from more than 5,000 companies, finds that team-health drivers explain 69 to 76% of the difference between low- and high-performing teams, and that 75% of cross-functional teams underperform on basic measures like budget, schedule, and customer focus (McKinsey). The math is starting to favor leaders who can read team-level signal over leaders who can only read individual-level signal.

A recent MIT Sloan piece by Kleinbaum and Wheatley makes a related point: socially central leaders, the ones who occupy central positions in their team's communication network, drive measurably deeper alignment than leaders with the same titles who sit at the edge. The skill that matters there is relational and structural, and it rarely shows up on an individual performance review.

If AI eats the individual work, the team becomes the unit of analysis by default. Team intelligence is the language showing up to describe that shift.

Force 3: People analytics matured but stopped at the individual

The third force is internal to the HR-tech stack. Over the last fifteen years, people analytics graduated from a side function into a recognized field. Workforce analytics platforms like Visier proliferated. Talent intelligence platforms like Eightfold raised hundreds of millions of dollars. Engagement platforms like Culture Amp and Lattice became standard issue.

All of that infrastructure is real, useful, and individual-keyed. It tracks people one at a time and aggregates upward. The team layer (the level at which people actually work, decide, and ship) was largely skipped over.

Josh Bersin's 2024 research on the state of people analytics is sobering on this point. Only 10% of organizations consistently achieve the highest level of impact from people analytics, and just 9% have integrated their people, operational, work, and sales data well enough to draw real cross-functional conclusions (Josh Bersin Company, 2024). The field matured around individual measurement and never extended down to the team layer where the work actually happens.

And the lagging-indicator problem is getting worse. WorkBuzz's 2023 research found that roughly 32% of companies still rely on annual engagement surveys, and 17% are not formally listening to employees in any structured way at all (WorkBuzz, 2023). Annual surveys give you data weeks after the moment that produced it, at the wrong level of analysis. Practitioners have been complaining about this for a long time without a clean way to say what was missing.

"Team intelligence" gives them the word. Once you have a word, you can search for it, defend a budget for it, and design tools that produce it.

Force 4: The science is no longer fringe

The fourth force is the slowest moving but the most decisive. Researchers have been building the empirical case for collective intelligence at the team level for more than fifteen years, mostly outside the spotlight until recently.

Anita Woolley's 2010 paper in Science introduced the c-factor, a measurable property of groups analogous to individual IQ that predicts how well a group will perform across a variety of tasks. The original finding was that c-factor correlates more with social perceptiveness, equality of conversational turn-taking, and the proportion of women on the team than with average individual intelligence (Woolley et al., Science, 2010).
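To make the idea of a group-level factor concrete, here is a toy sketch of how a c-factor style score can be extracted: each group completes several tasks, the task scores are standardized, and the first principal component of those scores becomes the group's collective score. This is an illustrative simplification with synthetic data, not Woolley et al.'s actual battery or procedure.

```python
# Toy illustration of extracting a shared "c-factor" from group task scores.
# NOT Woolley et al.'s actual method; synthetic data for illustration only.
import numpy as np

rng = np.random.default_rng(0)

# Synthetic scores: 40 groups x 5 tasks, built from a shared latent
# "collective intelligence" component plus task-specific noise.
latent = rng.normal(size=(40, 1))
scores = latent @ np.ones((1, 5)) + rng.normal(scale=0.8, size=(40, 5))

# Standardize each task column, then take the top principal component.
z = (scores - scores.mean(axis=0)) / scores.std(axis=0)
cov = np.cov(z, rowvar=False)
eigvals, eigvecs = np.linalg.eigh(cov)   # eigenvalues in ascending order
first = eigvecs[:, -1]                   # loadings of the top component
c_scores = z @ first                     # one "c" score per group
explained = eigvals[-1] / eigvals.sum()  # variance share of that component
```

The point of the sketch is the shape of the argument: if one component explains a large share of the variance across very different tasks, the group has something IQ-like at the team level, which is what the 2010 paper reported.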

For a decade, that finding was treated as a curiosity. Then in 2021, Riedl and colleagues published a meta-analysis in PNAS covering 5,279 individuals across 22 studies and confirmed that collective intelligence is real, measurable, and modestly predicts group performance beyond what individual ability and group composition alone can explain (Riedl et al., PNAS, 2021). The replication is honest about its limits. Some scholars (Bates and Gupta, Rowe and others) have argued the effect size is smaller than originally reported and that general individual intelligence does much of the explanatory work. The debate continues, but team-level behavior now has the empirical footing it lacked a decade ago.

Around the same window, Google's Project Aristotle findings spread through corporate discourse. Psychological safety as the keystone team dynamic became a business-school staple. Charles Duhigg's 2016 New York Times piece on the project has been cited so many times in HR strategy decks that it is effectively a primary source.

Earlier still, Sandy Pentland's 2012 Harvard Business Review work on the "new science of building great teams" used wearable sensors to show that communication patterns, more than the content of conversations, predict team performance. Energy and engagement outside formal meetings explained roughly one-third of the productivity variance he measured (Pentland, HBR, 2012).

And in 2025, Jon Levy's book Team Intelligence brought the term itself into mainstream business discourse, alongside a wave of practitioner-oriented writing on the same theme (Levy, 2025). The science is stable enough that practitioners can act on it without feeling like they are betting on a fad.

Put plainly: team intelligence has moved past the one-paper-finding stage into an accumulated body of work with replications, meta-analyses, popular books, and corporate research programs all pointing at the same thing. That is the fourth force.

What this means for managers right now

The four forces above are abstract. Here is what they translate into for someone running a team this quarter.

Treat survey response data as a lagging indicator. Annual or even quarterly surveys are useful for confirming what you already suspect, but they arrive too late to give you the weekly behavioral signal you would need to intervene in time. If your engagement score dropped six points last cycle, the team-level dynamic that caused it was visible months earlier in how people interrupted each other in stand-up.

Pay attention to which work AI is absorbing. What remains for your team is collaborative judgment, which is exactly the work that depends on team intelligence. Teams that adapt fastest to AI tend to be the ones that can have direct, fast, low-defensiveness conversations about what to do with it, while teams where every AI rollout becomes a turf war tend to stall.

Watch for the team-level signals surveys cannot catch. Who speaks first in a conflict and who waits. How quickly the team recovers from a missed deadline. Whether disagreement gets surfaced in the meeting or three days later in DMs. Whether new hires are folded into the team's working rhythm in week one or week eight. These are the signals team intelligence is meant to make legible. (For a more structured version of this, see Team Intelligence Metrics.)

Map where team intelligence sits relative to what you already buy. It is adjacent to engagement, organizational network analysis, and people analytics. The team intelligence map walks through how they relate and where each one is the right tool.

Know what bad team intelligence costs. The McKinsey and Gallup numbers above suggest that the cost of operating without team-level signal is large and largely invisible until something snaps. The cost of bad team intelligence piece breaks the math down.

Where QuestWorks fits

QuestWorks is one example of what a team-intelligence platform can look like in practice. It runs on its own cinematic, voice-controlled web platform and integrates with Slack for install, invites, onboarding, and private coaching. Teams play through scenarios together, the system surfaces aggregate team trends and strengths-based highlights to leaders, and individual coaching with HeroGPT stays private. Participation is voluntary, HeroTypes are public, and nothing connects to performance reviews. For the broader picture of the space and how to evaluate any tool in it, the canonical reference is What Is Team Intelligence?.

Frequently Asked Questions

Why is team intelligence emerging now?

Four forces are converging at once. Distributed work removed the casual signals leaders used to read in person. AI is shifting the unit of analysis from the individual to the team because the work itself has become collaborative judgment. People analytics matured but stopped at the individual layer. And the underlying science (Woolley's c-factor, Riedl's PNAS meta-analysis, Project Aristotle) is now stable enough that practitioners can act on it. Together, those four forces have given the concept a name and a moment.

Is team intelligence the same as people analytics?

No. People analytics is keyed to the individual. It tracks engagement, performance, and retention person by person, then aggregates upward. Team intelligence is keyed to the team itself: how a group makes decisions, surfaces disagreement, recovers from mistakes, and coordinates under pressure. The data is behavioral, the unit is the team, and the cadence is continuous instead of annual.

Do engagement surveys already cover this?

Surveys are increasingly a lagging indicator. Roughly 32% of companies still rely on annual surveys, and 17% are not formally listening to employees in any structured way at all. Even the best-run survey program tells you what people feel weeks after they felt it. Team intelligence is meant to complement that listening function with continuous behavioral signal, so leaders have something between annual cycles.

What can managers do about this today?

Start treating team behavior as measurable signal instead of vibes. Watch for the patterns that surveys cannot catch: who speaks up first in conflict, who defers, how quickly the team recovers from a missed deadline. Pay attention to which routine work AI is absorbing, because what is left will be collaborative judgment. And read the underlying research so the language has weight when you bring it to leadership.

Where should I start reading?

The canonical definition lives at What Is Team Intelligence, which covers the underlying science, the difference from adjacent concepts, and how the data is generated. From there, the team intelligence map shows where the concept sits relative to people analytics, organizational network analysis, and engagement platforms. Team intelligence metrics covers what to actually measure, and the cost of bad team intelligence covers the operational and financial impact when leaders are working without that signal.

Ready to Level Up Your Team?

14-day free trial. Install in under a minute.

Team Intelligence, Powered by Play. Try QuestWorks free.