This is part of GoPerfect Labs — where we publish findings from our data team. We analyze candidate outreach sequences and response patterns across our platform, then share what we find so recruiting teams can stop guessing and start winning.
We looked at 2.3 million outreach messages sent through GoPerfect to passive candidates across LinkedIn, email, and SMS — spanning roles in tech, GTM, finance, and operations. We tracked open rates, reply rates, and — most importantly — positive response rates (replies that actually moved toward a conversation).
The results were uncomfortable for a lot of conventional recruiting wisdom.
The template problem is worse than you think
Every recruiter knows templates are a shortcut. What most don't know is just how aggressively candidates penalize them.
When we compared messages that referenced something specific about the candidate's background — a recent role change, a particular skill set, a company they'd worked at — versus messages that used standard "Hi [First Name], I came across your profile" openers, the gap was stark.
The top-performing message category — which GoPerfect writes autonomously for each candidate — produced a positive response rate of 18.4%. Standard recruiter templates? 5.6%. That's not a small gap. That's the difference between filling a role this month or watching your shortlist ghost you.
What "personalization" actually means to a passive candidate
Recruiters tend to conflate personalization with effort. "I put their name in. I mentioned their company. That counts." Candidates don't see it that way.
We ran a deeper analysis on the messages that generated replies — specifically looking at which signals drove the highest engagement. The results pointed to something more specific than name-dropping: candidates respond to evidence that you understand their trajectory, not just their résumé.
This distinction matters enormously in practice. Most sourcing tools (and most recruiters) pull a candidate's current title and plug it into a template. That reads as lazy. When a message instead signals: "I understand you've been moving toward senior IC roles and this opportunity fits that arc" — candidates pay attention.
The 5 signals that drive replies
Across the highest-performing messages in our dataset, five contextual signals showed up most consistently. The presence of even two of these signals in a single message nearly doubled the positive response rate.
The channel timing data you're probably ignoring
Channel selection gets talked about a lot. Timing gets ignored. Our data shows timing is nearly as important as the message itself.
Tuesday and Wednesday InMails significantly outperform the rest of the week. Friday and weekend messages underperform by 35–60%. The most wasted recruiter effort in our dataset: Friday afternoon InMails. They have the lowest positive response rate of any send window and the second-highest send volume. Recruiters are reaching out to their best candidates at the worst possible time.
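A send-window audit like this is straightforward to run on your own logs. A minimal sketch, assuming each log row carries a send weekday and a boolean for whether the reply was positive (the row shape here is an illustrative assumption, not GoPerfect's schema):

```python
from collections import defaultdict

def response_rate_by_day(messages):
    """messages: iterable of (weekday, got_positive_reply) pairs,
    where weekday is 0=Monday .. 6=Sunday.
    Returns {weekday: positive response rate} for days with sends."""
    sent = defaultdict(int)
    positive = defaultdict(int)
    for weekday, got_positive_reply in messages:
        sent[weekday] += 1
        if got_positive_reply:
            positive[weekday] += 1
    return {day: positive[day] / sent[day] for day in sent}
```

Plotting the result against send volume per day is what surfaces the Friday-afternoon mismatch: a high-volume window with a bottom-of-the-table response rate.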
The follow-up sequence almost no one is getting right
Most recruiting teams either over-follow-up or give up too early. Our data shows a very specific sequence that outperforms everything else.
The conventional recruiting playbook: send InMail, wait a week, send one follow-up, move on. This leaves enormous pipeline on the table. 31% of all positive responses in our dataset came on the second or third message — not the first. But the timing and channel of the follow-up matter more than most recruiters realize.
The highest-performing follow-up sequence we observed: LinkedIn InMail on Day 1 → email follow-up on Day 5 (different channel, same context) → brief LinkedIn connection request on Day 9 with no message (just a name they now recognize). This three-touch pattern, with the channel switch built in, drove 3x more replies than single-channel sequences.
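The three-touch cadence above can be sketched as a simple schedule. The day offsets and channels come from the sequence described; the data structures and function names are illustrative, not part of any real system:

```python
from dataclasses import dataclass
from datetime import date, timedelta

@dataclass(frozen=True)
class Touch:
    day_offset: int  # days after the first send (Day 1 = offset 0)
    channel: str     # delivery channel for this touch
    note: str        # what this touch should contain

# The three-touch, channel-switching sequence from the data above:
# InMail on Day 1, email on Day 5, silent connection request on Day 9.
SEQUENCE = [
    Touch(0, "linkedin_inmail", "personalized opener"),
    Touch(4, "email", "follow-up: same context, different channel"),
    Touch(8, "linkedin_connect", "connection request, no message"),
]

def schedule(start: date) -> list[tuple[date, str]]:
    """Return (send date, channel) pairs for one candidate's sequence."""
    return [(start + timedelta(days=t.day_offset), t.channel) for t in SEQUENCE]
```

Feeding in the first send date returns the three dated touches; a real system would also stop the sequence the moment a candidate replies on any channel.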
"Candidates don't ghost you because they're not interested. They ghost you because you didn't make it easy enough to say yes."
What AI-written messages do differently
We often get asked: can candidates tell an AI wrote the message? The answer is: not when it's done right. And more importantly, they don't care. What they care about is whether the message is relevant to them.
The difference isn't polish or word count. It's the signal-to-noise ratio. The AI-written message communicates three things clearly: I know what you've done. I understand why it's relevant. Here's what I'm asking for. Templates fail on all three.
GoPerfect's autonomous outreach writes a unique message for every candidate — pulling from their actual background, the job description, and the recruiter's context notes. No templates. No merge fields. Messages that read like a senior recruiter who did their homework wrote them at 7am with a coffee in hand.
The bottom line for recruiting teams
The data points to a clear conclusion: the recruiter bottleneck isn't sourcing. It's outreach quality at scale. A recruiter with a sharp source list but average outreach messages will always underperform a recruiter with a decent source list and excellent, personalized messages.
You can't write 200 custom messages a week. But AI can. And the data shows the output — when the AI is trained on the right signals — gets replies that templates never will.
The recruiters who win in 2026 aren't the ones who source the most. They're the ones who make every message feel like it was written specifically for the person reading it. Because it was.



