I’ve started digging into the bilingual hub’s case study library to find potential agency partners for cross-border projects, and I’m noticing something: I have no idea how to actually evaluate what I’m reading.
Like, everyone’s posting case studies about successful campaigns. They all show impressive results—nice ROI numbers, happy clients, clean deliverables. But how do you know if those numbers are real? If they’re cherry-picked? If the case study is describing what actually happened or what they want you to think happened?
I’m trying to get better at spotting the real learnings vs. the marketing fluff. So far, my instinct is to look at:
- Whether they explain what didn’t work (most don’t)
- Whether the numbers make sense for the industry/market
- How they describe the actual challenges, not just the wins
- Whether the client feedback feels genuine or scripted
But honestly, I’m probably missing things. And I feel like if I can get really good at reading case studies on the hub, I can dramatically reduce the time it takes to find partners I actually want to work with.
So here’s what I’m curious about: How do you evaluate case studies when you’re considering someone as a partner? What’s the stuff that matters, and what’s safe to ignore? Do you have a checklist or process, or is it mostly gut feel?
Finally, someone asking the right questions. Case studies are where a lot of people get fooled because they look professional, so people assume they’re accurate.
Here’s my evaluation process:
First filter: Numbers smell test
- ROI claims need to be specific. “500% ROI” is suspicious. “Increased ROAS from 2.3x to 4.1x, resulting in $216k additional revenue on $120k spend” is credible.
- If they claim 10x returns, ask: On what? Spend? Revenue? And for how long? Early-stage companies sometimes achieve high multiples that aren’t repeatable.
- Compare to industry benchmarks. For DTC, a 3-4x ROAS is good. 8x is exceptional. If everyone’s case studies show 8x, they’re cherry-picked.
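The arithmetic behind this smell test is easy to automate whenever a case study publishes both ROAS and spend. A minimal Python sketch; the function names and the 15% tolerance are my own illustrative choices, not from any real tool:

```python
# Sanity-check a claimed revenue lift against the case study's own
# ROAS and spend figures. Names and tolerance are illustrative.

def implied_incremental_revenue(roas_before: float, roas_after: float,
                                spend: float) -> float:
    """Revenue lift implied by a claimed ROAS improvement on a given spend."""
    return (roas_after - roas_before) * spend

def smells_off(claimed_lift: float, roas_before: float, roas_after: float,
               spend: float, tolerance: float = 0.15) -> bool:
    """True when the claimed lift deviates from the implied lift by more
    than the tolerance, i.e. the study's own numbers don't add up."""
    implied = implied_incremental_revenue(roas_before, roas_after, spend)
    return abs(claimed_lift - implied) > tolerance * implied

# A 2.3x -> 4.1x ROAS improvement on $120k spend implies ~$216k of lift:
print(round(implied_incremental_revenue(2.3, 4.1, 120_000)))  # 216000

# So a $300k lift claimed on those same figures wouldn't add up:
print(smells_off(300_000, 2.3, 4.1, 120_000))  # True
print(smells_off(216_000, 2.3, 4.1, 120_000))  # False
```

This obviously can’t catch fabricated inputs; it only flags studies whose headline claim contradicts their own published figures.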
Second filter: Problem definition
- Good case studies start with a real problem. “Client had low engagement” is weak. “Client’s engagement rate was 0.8%, 40% below industry average, and they couldn’t figure out why” is strong.
- If they don’t clearly state the problem, they probably don’t understand it.
Third filter: Methodology
- How did they measure success? If it’s vague, red flag.
- Did they A/B test? Did they have a control group? Or are they just showing “we did this and this happened”? (Correlation ≠ causation)
- What variables did they control for? Time of year? Market conditions? Traffic source?
Fourth filter: Honest failure
- Look for what didn’t work. A sign of a real case study is someone saying, “We tried X and it didn’t work, so we pivoted to Y.” Studies with zero stumbles are marketing, not case studies.
- Ask yourself: Did they learn anything? Or did they just execute a plan that worked perfectly on the first try? (Spoiler: real campaigns always have at least one thing that surprises you)
Fifth filter: Timeline honesty
- How long did this take? If they don’t mention timeline, they’re hiding something. Maybe it took 9 months to get to these results, which is important context.
- A short timeline (3-6 months) to big results is more impressive than the same results over 18 months.
Red flags in case studies:
- No specific numbers (“significantly increased” instead of actual percentages)
- Client testimonial that sounds written by the agency, not the client
- No mention of budget spent
- Success metrics that don’t align with business goals (“got 10k impressions” when client’s goal was conversions)
- No explanation of WHY something worked, just that it did
- Weird gaps in the narrative
Credibility boosters:
- They list the actual client name (sometimes they can’t, but when they do, it means they’re confident)
- They explain what they tried that didn’t work
- They directly quote the client (specific language, not generic praise)
- They show before/after metrics clearly
- They mention the resource investment (how many people, how much time)
I actually created a simple scoring system I use. For each case study, I rate:
- Metric specificity (1-5)
- Problem clarity (1-5)
- Methodology rigor (1-5)
- Honesty index (1-5) [measures how much they acknowledge challenges]
- Relevance to my needs (1-5)
Something scoring 20+/25 is worth considering. Below 15, I skip them.
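If you want to make the rubric concrete, it’s a few lines of Python. A sketch using the five criteria from the list above; the function name, the “borderline” label for the 15-19 band, and the example ratings are my own additions:

```python
# Minimal sketch of the five-criterion scoring rubric described above.
# Criterion names mirror the post; the "borderline" label for the
# 15-19 middle band is my own addition.

CRITERIA = ("metric_specificity", "problem_clarity", "methodology_rigor",
            "honesty_index", "relevance")

def score_case_study(ratings: dict) -> tuple:
    """Sum five 1-5 ratings and bucket the total using the thresholds
    from the post: 20+ worth considering, below 15 skip."""
    for name in CRITERIA:
        if not 1 <= ratings[name] <= 5:
            raise ValueError(f"{name} must be rated 1-5")
    total = sum(ratings[name] for name in CRITERIA)
    if total >= 20:
        verdict = "worth considering"
    elif total < 15:
        verdict = "skip"
    else:
        verdict = "borderline"
    return total, verdict

example = {"metric_specificity": 4, "problem_clarity": 5,
           "methodology_rigor": 3, "honesty_index": 5, "relevance": 4}
print(score_case_study(example))  # (21, 'worth considering')
```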
For your cross-border partnerships specifically, look for case studies where they’ve worked with partners or agencies from different markets. Shows they understand complexity and collaboration.
Honestly, if a partner can’t write a good case study, I question whether they can execute well on complex projects. Good results require clarity, and good case studies require the same thing.
Anna’s framework is solid. Let me add a strategic angle.
When evaluating a potential partner through their case studies, I’m asking: “Could this person handle a project like mine?”
So look for:
- Scale alignment: Did they work with budgets similar to yours? If they only show $50k campaigns and you work with $500k budgets, they might not have team capacity or experience managing that complexity.
- Problem type: Did they solve the same type of problem you care about? If they’re known for demand generation and you need brand building, even great case studies aren’t relevant to you.
- Market experience: For cross-border work, look specifically for case studies where they’ve navigated international complexity. Different currency? Language considerations? Regulatory differences? If you see none of that, they might not be ready.
- Client retention: If you can find examples of the same client in multiple case studies, that’s a strong signal. Means the client trusted them with repeat work.
- How they talk about collaboration: If all case studies are “we did everything,” they might not be good partners. Partners should show examples of working with other agencies or creators, not just showcasing their own work.
- Team evolution: Look at case studies over time. Do they show growth in sophistication? Or does everything look the same year after year?
I also look at what’s not in their case studies. If they never mention budgets, I ask why. If they never show client names, I wonder about those relationships. If they only highlight wins and never touch challenges, I question their learning curve.
For a potential partner, I’m looking for humility and clarity more than impressive numbers. Someone who can admit what didn’t work and explain why shows they actually understand their craft.
I’m more practical about this. I skim the case studies, but what really tells me about a potential partner is:
- How recent is the work? If their newest case study is from 18 months ago, they might not be actively working. I want to see recent examples.
- Do they show the actual deliverables or just talk about them? Real case studies sometimes link to the actual campaign, landing page, or content. That transparency matters.
- Can I verify any of it? If they mention a well-known brand, I can sanity-check the timeline and market conditions. If everything is anonymized, I can’t.
- What’s the tone? If it sounds like a press release, probably marketing fluff. If it sounds like someone documenting what they learned, probably real.
But honestly? I don’t make decisions just on case studies. I read them as a starting filter, then I have a conversation. I ask them directly: “Walk me through one of these projects. What surprised you? What would you do differently?”
Someone who actually did the work can elaborate on it easily. Someone who didn’t understand the work will fumble.
I’d also recommend: Look at 5-10 of their case studies, not just one. Patterns emerge. If 9 out of 10 show the same success and 1 shows a failure, they’re cherry-picking. If they show varied results, that’s more credible.
I look at case studies differently because I care more about partnership fit than raw performance.
When I’m evaluating someone as a potential partner, I read case studies looking for:
- How do they credit others? Do they mention collaborators? Subcontractors? Other agencies? If they always go solo, they might not be great at partnerships.
- Who are their typical clients? Look at case study themes. Do they work with values-aligned brands? Innovative companies? Established ones? I want to partner with someone who cares about the same types of clients I do.
- How much context do they give? Case studies that explain the market, the challenges, the competitive landscape show someone thinking strategically. Ones that just show “we ran this campaign and got results” suggest someone just executing.
- Do they show evolution? Look at their oldest and newest case studies. Is there a narrative of them getting better at what they do? Growing their capabilities? Or is everything basically the same?
- How generous are they? Case studies that share actual learnings (not just wins) suggest someone who’s generous with knowledge. That matters in a partner.
I view case studies as a window into someone’s thinking, not just their execution. A partner with average execution but great thinking is usually better than brilliant execution paired with no insight into why it worked.
Also—and this is silly but true—if their case studies are beautifully designed and well-written, that tells me something about their care for detail and communication. Things I’d want in a partner.
When I was looking for US partners, I specifically looked for case studies that showed they’d navigated complexity similar to what I’d face entering the US market.
What I looked for:
- Did they successfully launch a product/brand into a new market?
- Did they explain the market differences they encountered?
- Did they show adaptation (not just “we did what we always do”)?
- Could I actually learn something from their approach?
I also looked at the language they used. If they were talking about US market dynamics with nuance (not stereotypes), that was a good sign. If they treated market expansion like it was simple, I knew they didn’t get the complexity.
For cross-border specifically: Look for case studies with international clients, multiple currencies, or language considerations. If you see none of that, ask why. It’s a gap that matters for your needs.
I also looked at references. If a case study mentioned a notable brand or person, I’d sometimes try to verify. Not always possible, but when I could, it told me something about the agency’s credibility.
One more thing: I looked at case study frequency. If they’re publishing 2-3 high-quality case studies per year, they’re probably active and learning. If they’ve got 50 case studies from 5 years ago and nothing recent, that’s a red flag.
I look at case studies from a creator’s perspective, which might be different.
I’m asking: Did they work with real creators and was that experience good? Here’s what I look for:
- Do they credit creators by name? Or do they generalize? (“We collaborated with 50+ creators…” vs. “We worked with [specific creators]…”)
- How much did creators matter to the results? If the case study barely mentions creator involvement, that agency probably doesn’t actually know how to work with them.
- Were creators fairly compensated? Sometimes you can infer this from case studies. If they mention creator rates or compensation, that’s usually a good sign.
- Did the creators have creative input? Best case studies show collaboration with creators, not just creators executing a script.
- Did creators continue working with them? Repeat collaborations suggest good experiences.
I’m skeptical of case studies where agencies talk about “managing” creators instead of “collaborating with” creators. One mindset treats them as hired hands, the other as partners.
Since you mentioned you want partners for influencer/UGC campaigns, definitely screen case studies for how the agency actually works with creators. That partnership dynamic is way more important than raw performance metrics.