What's your system for sourcing and vetting cross-border influencer and brand partnerships at scale?

I’m running into a real scaling problem. Right now, we find partners through direct outreach (emails, DMs), some word-of-mouth, and the occasional platform. But it’s chaotic. We end up spending way too much time on vetting people who ghost, or worse—we commit to partnerships that look good on paper but fall apart in execution.

The current process is basically: spreadsheet of names → outreach → “let’s see if they respond.” Then somehow we’re in the middle of a campaign with a partner who doesn’t return emails or whose deliverables are half-done.

I’ve been thinking about this differently lately. What if we built a more systematic way to source partnerships? Not just influencers, but brand partners too—because a lot of our best campaigns come from co-branded pieces, not solo creator work.

Here’s what I’m trying to figure out: how do you actually vet partners before you commit? Like, beyond “does their audience match” and “do they seem professional in an email?” What red flags actually matter? What criteria actually predict success?

I’m also realizing we need a pipeline approach, not one-off sourcing. Like, a constant flow of vetted potential partners so we’re not scrambling when a campaign comes together.

A few questions I’m wrestling with:

  1. How much due diligence is too much? At some point, the vetting takes longer than running the campaign.
  2. How do you actually build a vetting checklist that works across markets? What’s non-negotiable for a US partner might be different for a Russian partner.
  3. What’s the actual decision point for “green light, let’s work together”? Is it analytics? Communication? Past track record?

Would love to hear how folks are handling this. Are you using any tools or platforms for partner matching, or is it still mostly manual?

Oh, this is my jam! I’ve built a pretty rigorous system because I got burned early—thought someone was perfect, and it was a disaster.

Here’s my vetting flow:

Tier 1: Basic Fit (10 minutes)

  • Audience size/type matches our need
  • Posting frequency is consistent
  • They’re actually responsive (test with a simple question)
  • No major red flags in recent posts (scams, drama, toxic comments)

Tier 2: Real Engagement (20 minutes)

  • Dive into comments—are their followers actually engaging, or is it mostly bots?
  • Look at past brand partnerships (if public)—did they seem thoughtfully done, or just “posts with a logo”?
  • Check their media kit or rate card—realistic or inflated?
  • Do they have a clear point of view or are they just chasing trends?

Tier 3: Alignment Call (30 minutes)

  • How do they think about partnerships? (Are they strategic or just “what’s the payment?”)
  • What’s their creative process? Can they take direction or do they need total freedom?
  • What do they actually care about? (This tells you if they’ll stay committed)
  • Past experience with cross-market or bilingual work?

Red flags that kill it for me:

  • Slow response time (if they’re too slow before we work together, it won’t get better)
  • Inflated metrics (fake followers, engagement pods—research this)
  • No clear opinion on what they do well
  • Bad past brand experiences (if you can find evidence of drama or non-delivery)
  • Unwilling to discuss terms/contracts in advance

What’s really helped: I ask every partner “What’s a campaign you did that you’re actually proud of?” If they light up and tell a real story, I feel way more confident. If they just shrug or say “most of them,” that’s concerning.

I keep a running spreadsheet with notes from each tier. Green, yellow, red. If someone is yellow, I don’t immediately disqualify—I just watch them for 1-2 cycles and see if my concerns feel confirmed or resolved.

For cross-market specifically: US partners tend to be pretty buttoned-up about contracts and timelines. Russian partners sometimes need more relationship-building upfront but can be incredibly loyal. Don’t treat them the same way.

Do you have any systematic filtering before you even get to the vetting stage?

Oh, one more thing—I’ve started asking partners for 1-2 references from past brand collaborations. Usually 5 minutes of conversation with someone who’s actually worked with them is worth more than all the metrics in the world. Most people will give you the reference.

We’ve built a pretty structured approach because, like you, early on we were just saying yes to people and regretting it.

Here’s our system:

Partner Sourcing Pipeline:

  1. Inbound (high quality—they’ve heard of us)
  2. Referrals (even better—trusted partner brings someone)
  3. Platform matching (we use a few tools now—more on that below)
  4. Outbound targeting (we create an ICP, find people directly)

The Vetting Checklist (we weight these differently by market):

Universal (non-negotiable):

  • Response time test (we send a simple question; how fast do they reply?)
  • Contract-ready (do they have standard terms, or are they afraid of contracts?)
  • Revenue-focused (can they articulate what success looks like beyond “we got lots of engagement”?)

Market-specific:

  • US: Usually care more about data/analytics, professional presentation, track record of case studies
  • RU: Often care more about relationship, genuine interest in the brand, cultural fit

The Decision Gate:
We move forward if:

  • They pass the response time test (72 hours max)
  • They’re willing to formalize (contract, SLA)
  • They can articulate at least one past campaign where they moved real results

We walk away if:

  • They’re evasive about past work
  • Their rates are wildly inflated (I check against market rates)
  • They seem transactional (“what’s the budget?” as first question)
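The gate above is really just a rule check, which makes it easy to encode so nobody greenlights on gut feel. Here’s a minimal sketch; the field names, the 2.0× rate ceiling, and the `Prospect` structure are my own assumptions, not from the post:

```python
from dataclasses import dataclass

@dataclass
class Prospect:
    # Hypothetical fields standing in for notes gathered during vetting.
    response_hours: float          # time to answer the test question
    contract_ready: bool           # willing to formalize (contract, SLA)
    has_results_story: bool        # can articulate a past campaign that moved real results
    evasive_about_past: bool
    rate_vs_market: float          # quoted rate divided by typical market rate
    first_question_was_budget: bool

def decision_gate(p: Prospect) -> str:
    """Apply the walk-away rules first, then the green-light rules."""
    if p.evasive_about_past or p.rate_vs_market > 2.0 or p.first_question_was_budget:
        return "walk away"
    if p.response_hours <= 72 and p.contract_ready and p.has_results_story:
        return "move forward"
    return "needs review"
```

Anything that fails a walk-away rule is out immediately; anything that passes all three green-light rules moves forward; everything else goes back for another look.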

Tools that helped:
We now use a hybrid approach:

  • HubSpot to track all conversations and interactions (makes it hard for someone to slip through)
  • A simple Airtable base with our vetting checklist (forces discipline)
  • Platform connections (we use a few creator networks, though honestly manual sourcing still wins)

The real shift: we treat this like recruiting, not like shopping. We have interview guides, we document everything, we get multiple people to weigh in before we greenlight.

Vetting time: We actually don’t try to be faster. We’d rather spend 2 hours vetting properly than 10 hours managing a bad partnership.

For scale, we’ve also started building “partner tiers.” Top-tier partners (proven, responsive, professional) get faster approvals and better terms. Tier 2 partners we trial with something small. This lets us move fast on known-good partners while we’re careful with new ones.

Do you have a formal criteria document, or is it still pretty ad-hoc?

Also—and this was huge for us—we stopped trying to vet everything ourselves. We bring the client into some vetting conversations, especially around creative fit and brand voice. Their perspective catches things we’d miss, and it also increases their confidence in the partnership (because they helped pick them).

From a data perspective, here’s what actually predicts partnership success:

Strong predictors:

  1. Consistency (posting schedule, quality of work, audience growth trajectory—does it look organic?)
  2. Engagement rate (relative to audience size—the metric that matters more than follower count)
  3. Communication speed during vetting (if they’re slow to respond now, you’re predicting their behavior)
  4. Willingness to track/measure (do they log promo codes? Click links? Use UTM? If they resist, they’ll resist during campaigns)

Weak predictors that everyone assumes matter:

  • Follower count (huge audiences often underperform relative to their size)
  • Professionalism of their media kit (fancy deck ≠ good partnership)
  • Length of time they’ve been active (some great creators are newer)

Process I’d recommend:

Build a partner scorecard with 8-10 criteria. Weight them (maybe audience relevance = 30%, engagement rate = 25%, communication speed = 20%, etc.). Run that on every prospect. Anything scoring above a threshold gets greenlit; below a threshold gets declined; middle zone you trial.
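The scorecard is a weighted sum with two thresholds, so it’s a few lines of code. A minimal sketch, assuming 0–1 ratings per criterion; the criterion names, weights, and threshold values are placeholders to be tuned against your own history:

```python
# Hypothetical criteria and weights (the post suggests 8-10; trimmed here for brevity).
WEIGHTS = {
    "audience_relevance": 0.30,
    "engagement_rate": 0.25,
    "communication_speed": 0.20,
    "content_quality": 0.15,
    "track_record": 0.10,
}

GREENLIGHT, DECLINE = 0.75, 0.50  # assumed thresholds on a 0-1 scale

def score_partner(ratings: dict) -> tuple:
    """ratings: each criterion rated 0.0-1.0 by the reviewer; missing = 0."""
    total = sum(WEIGHTS[k] * ratings.get(k, 0.0) for k in WEIGHTS)
    if total >= GREENLIGHT:
        verdict = "greenlight"
    elif total < DECLINE:
        verdict = "decline"
    else:
        verdict = "trial"
    return total, verdict
```

A prospect rated 0.8 on audience relevance, 0.6 on engagement, and 0.5 on everything else lands at 0.615—squarely in the trial zone, which is exactly where “run something small first” belongs.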

This removes gut feel entirely. I’ve found it cuts vetting time by 30% and improves success rate by about 25%.

Red flags in the data:

  • Sudden follower growth spike (could be bots)
  • Engagement rate collapse (content stopped resonating)
  • Ghost followers (use a service like HypeAuditor to spot-check)
  • High engagement but low conversion (they’re good at engagement but don’t move business)
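Two of these flags—the growth spike and the engagement collapse—are easy to spot-check programmatically if you’re already pulling time-series data. A hedged sketch; the 3× spike ratio and 50% drop threshold are my own assumptions, not from the post:

```python
def follower_spike(daily_counts: list, ratio: float = 3.0) -> bool:
    """Flag if the latest day's growth exceeds ratio x the trailing average growth."""
    deltas = [b - a for a, b in zip(daily_counts, daily_counts[1:])]
    if len(deltas) < 2:
        return False  # not enough history to establish a baseline
    trailing = sum(deltas[:-1]) / len(deltas[:-1])
    return trailing > 0 and deltas[-1] > ratio * trailing

def engagement_collapse(rates: list, drop: float = 0.5) -> bool:
    """Flag if the most recent engagement rate fell more than `drop` below the prior average."""
    if len(rates) < 2:
        return False
    prior_avg = sum(rates[:-1]) / len(rates[:-1])
    return rates[-1] < prior_avg * (1 - drop)
```

These are screening heuristics, not verdicts—a flagged account still warrants a manual look (or a HypeAuditor-style audit) before you act on it.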

For cross-market vetting specifically: you need separate scorecards. A 100K follower US influencer and a 100K Russian creator operate in totally different ecosystems. Can’t use the same benchmarks.

What metrics are you currently pulling from partners? That could shape the scorecard.

I’ve been sourcing partnerships across two countries, and here’s what I’ve learned:

The painful way: Assume all partnerships are created equal. They’re not. US dealmaking works differently than Russian dealmaking.

What actually works:

Sourcing:

  • Direct outreach still wins (email, but personalized—not template)
  • Referrals from trusted partners (if Partner A recommends Partner B, they’re vouching)
  • Platform matching helps, but you still need human validation

Vetting:

  • I talk to someone on my team who’s worked with them or knows them
  • I ask the candidate point-blank: “Give me three references from people who didn’t hire you again after working with you.” Catches the real issues.
  • I look at their financial stability (can they stay committed for 3-6 months, or are they desperate for money?)

Cross-market considerations:

  • US partners expect more formal structure upfront
  • Russian partners often need trust-building first
  • US partners will ghost if you’re slow to respond; Russian partners will push harder
  • Time zone differences mean communication models are different

Decision framework:

  • Does this person/team move revenue or vanity metrics? (Huge one)
  • Can they commit for 3+ months, or just one project?
  • Are they growing their own business (good sign) or stuck (risky)?
  • Do they seem like they’re building a brand or just trying to monetize followers?

One thing we implemented: we now do a 30-day trial with new partners. Small project, clear success metrics. If it goes well, we expand. If not, we learned fast.

That’s been the best move for reducing bad partnerships at scale.

Also—tools matter, but less than process. I’ve seen teams use fancy platforms but still have messy workflows. I’ve also seen teams use spreadsheets with incredible discipline. The tool is just a container for process rigor.