We’ve done quite a few UGC campaigns now, and I’ve noticed we’re always flying a bit blind in the post-campaign phase. We get back stacks of video content from creators, we pick the best stuff, we push it to ads, but then… what? We know what performed in the funnel, but we never actually sit down as a team and extract lessons for the next round. It’s like we’re not learning systematically.
The gap I keep seeing is that UGC is different from influencer partnerships. With influencers, you at least have a person to debrief. With UGC, you’re working with multiple creators at once, and by the time you get results, they’ve moved on to other projects. The knowledge just evaporates.
So we’ve started building out what I’d call a UGC campaign playbook—basically a living document where we define the creative brief, document what creators actually delivered, track which videos performed (and why), and then synthesize takeaways for the next brief. Each campaign iteration has documented tasks (brief creators, review submissions, A/B test variations, measure results), specific actions (what we changed based on learnings), and measurable outcomes.
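For anyone who wants to keep this in something more structured than a doc, the playbook entry above maps cleanly onto a simple record. Here's a minimal sketch in Python—all field and class names (`CampaignEntry`, `CreatorDeliverable`, `ctr`, etc.) are illustrative assumptions, not a standard schema:

```python
from dataclasses import dataclass, field
from typing import List, Optional

@dataclass
class CreatorDeliverable:
    creator: str                 # creator name or handle
    format: str                  # e.g. "vertical video, fast cuts"
    shipped_to_ads: bool         # did this asset make it into paid media?
    ctr: Optional[float] = None  # click-through rate, filled in once results land

@dataclass
class CampaignEntry:
    campaign: str
    brief_summary: str           # what we asked creators for
    deliverables: List[CreatorDeliverable] = field(default_factory=list)
    takeaways: List[str] = field(default_factory=list)  # feeds the next brief

    def top_performers(self, n: int = 3) -> List[CreatorDeliverable]:
        """Return the n measured deliverables with the highest CTR."""
        measured = [d for d in self.deliverables if d.ctr is not None]
        return sorted(measured, key=lambda d: d.ctr, reverse=True)[:n]

# Example entry for a hypothetical campaign
entry = CampaignEntry(
    campaign="spring-ugc",
    brief_summary="15s product demos, casual tone",
)
entry.deliverables.append(
    CreatorDeliverable("creator_a", "vertical video, fast cuts", True, ctr=0.031)
)
entry.deliverables.append(
    CreatorDeliverable("creator_b", "static product shot", True, ctr=0.012)
)
entry.takeaways.append("Fast-cut vertical video beat static shots again")

best = entry.top_performers(1)[0]
```

Even if you never run it as code, writing the fields down like this forces the team to agree on what gets captured every campaign, which is most of the onboarding battle.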
It’s helped us see patterns. Like, we noticed vertical video with fast cuts consistently outperforms static product shots. Another thing: creators in the 25-35 age range seem to resonate better with our demographic than younger creators, even though I would’ve guessed the opposite. These aren’t rocket science insights, but we only know them because we forced ourselves to document the logic.
But the process is still a bit manual and fragmented. I’m trying to figure out how to structure this more systematically so that new team members can onboard into the playbook quickly, and we’re not reinventing the wheel every campaign.
How are you all handling post-UGC reviews? Do you have a structured process, or is it more ad-hoc? What are the key sections you include when you analyze a campaign—what should I be documenting?