In the last episode, I built a content automation that researches, outlines, drafts, and edits resource content for charity golf outings.
It does the job, but there’s a glaring issue: I’m not a charity golf outing expert.
I’ve helped run charity golf outings, but I haven’t overseen one end-to-end. Who am I to teach anybody about how to do it?
In this episode, I built an automation that taps into one of PlayThru’s greatest assets—its user base—to collect real insights and feed them directly into the content engine.
AI can write fast and draw on a vast knowledge base for almost any topic, but it can't fake lived experience.
If I want resource content that actually helps organizers run better events, I need the stories, shortcuts, and lessons only practitioners know.
When new users sign up for PlayThru, the account creation form asks how they plan to use the system. More than 3,300 users selected “tournament” or “fundraiser.”
That’s a built-in group of experts whose insights and experiences can really elevate the quality of my content, which in turn will help other golf fundraisers.
But sending out questions and processing responses by hand isn't something I want to do.
So, I built two automations to take care of the entire process.
First things first, I want to ask my users if they are interested in being experts for my content; in other words, I want them to opt in.
I started this process by adding an “Experts” tab in my Google Sheets editorial calendar. I then dropped all the “tournament” and “fundraiser” emails into column A. Initially, everyone’s “Opted In” status was set to “No”.
I then sent them an email request asking for their expertise using the email events@golfplaythru.com.
This is the only address I'll use throughout the process, since the automations can send and review its email on my behalf.
The first automation, The Email Categorizer, is built to pull in any new messages events@golfplaythru.com receives, use AI to process each, and then group them into one of four categories:
If the AI categorizes the email as “Opt In” or “Opt Out,” the automation grabs the sender’s email address, searches column A of the Experts tab for that address, and updates their “Opted In” status accordingly.
For any emails tagged as “Other”, the automation forwards them to my email address to personally review and update. If I start to see different types of emails coming in, or the categorizer is missing things, I can update the AI prompt to better handle those situations.
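As a rough sketch of that routing logic, in plain Python with the AI call and Sheets lookup stubbed out as hypothetical helpers (none of these names come from the actual automation):

```python
# In-memory stand-in for column A of the Experts tab.
EXPERTS = {
    "pat@example.com": {"opted_in": "No"},
    "sam@example.com": {"opted_in": "No"},
}

def categorize_with_ai(body: str) -> str:
    """Stand-in for the AI step: returns one of the four categories.
    A real version would be an LLM call guided by a prompt."""
    text = body.lower()
    if "count me in" in text or "happy to help" in text:
        return "Opt In"
    if "unsubscribe" in text or "remove me" in text:
        return "Opt Out"
    if "insights #" in text:
        return "Response"
    return "Other"

def route_email(sender: str, subject: str, body: str) -> str:
    """Route one inbound email according to its category."""
    category = categorize_with_ai(body)
    if category in ("Opt In", "Opt Out"):
        expert = EXPERTS.get(sender)  # column-A match
        if expert is None:
            return "sender not found"
        expert["opted_in"] = "Yes" if category == "Opt In" else "No"
        return f"updated {sender}"
    if category == "Other":
        return "forwarded for manual review"
    return "handed to response processor"
```

Keeping the categories as plain strings means a fifth category later only requires updating the AI prompt and this routing step.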
Those emails are easy. Response emails are trickier because, at any given time, there may be insight requests out for a handful of different blog posts, and I don’t expect my experts to rephrase the question in their answers like a grade-schooler.
Instead, I’ll likely get some short, incomplete reply that in no way references what was asked—which is fine, I’ll take it!
How do I connect the response to the original blog topic?
The answer is to use the email subject line.
When a request for insight is sent (which we’ll get into shortly), the automation adds something like “Insights #3” to the subject line, where the number after the # is the post ID.
Any emails categorized as “Response” then run through an AI operation that pulls this ID from the subject line:
Scan the <email_subject> and locate the blog ID. This ID will be prefaced by a hash symbol (#) and follows the word "Insights" (e.g., "Insights #22", "RE: Insights #22").
Now, when I save the insights to my Google Sheet, I can also save the post number.
In addition to the post number, the AI operation looks for two other bits of information:
Side Note: If you own a SaaS product and you’re not taking every opportunity to ask for product feedback and ideas, you’re dead in the water. It’s insanely valuable!
With my post ID and insights captured, now I need to save the data.
First thing I want the automation to do is search for the expert in my Experts tab. The reason is that I want to give credit where credit is due and keep track of who responds and how often.
In addition, there’s a chance the responder may not be one of my original experts. I can see a situation in which the original recipient forwards the message to someone better equipped to answer it.
If the expert is found, the automation then does several things:
If the reply did not come from an expert in my database, I still want to save the same information, but I don’t have a record of the question asked, so I save all the details without that bit. It’s easy enough to pull when writing the content.
But how do I request the insights? That’s covered by the second automation.
I want to manually control when insight requests are sent, but requests also need time to collect replies before the content can be generated. For this reason, I can’t build this part into the original content generation automation; it has to run as a separate, earlier step.
So, I added two new statuses in the editorial calendar:
When the editorial calendar status column is set to “Research,” the Insight Requester first pulls in information about the blog post, including the topic, key insights and the Post ID, which is a new column I added in for this automation.
The Post ID is how we will link up the blog post to any insights we collect. Nothing fancy about coming up with the ID: I just set it equal to the row number, which ensures there are no duplicates.
The Insight Requester then uses AI to review the topic and key insights and come up with an open-ended question. I want the question to be simple to answer, but also probe non-obvious aspects of the topic. Here’s the prompt:
- Go Beyond the Obvious: The question must be written to extract unique insights, common mistakes, or overlooked opportunities related to TOPIC that only a seasoned expert would think to consider.
- Keep It Simple: The question should be relatively short and easy to understand. We are not writing anything highly technical or sophisticated, so the question shouldn't be overly complicated to answer.
- Be Open-Ended: The question should invite the user to provide a multi-sentence answer, not a 'yes'/'no' or a short-phrase response.
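As an illustration only, a full prompt built around these three criteria might be assembled like this; the surrounding instruction wording is my assumption, not the post's exact prompt:

```python
def build_insight_prompt(topic: str, key_insights: str) -> str:
    """Assemble the question-generation prompt. The three criteria
    mirror the post; the framing text around them is illustrative."""
    return (
        "You are helping write resource content for charity golf outing "
        f"organizers. Topic: {topic}. Key insights: {key_insights}.\n"
        "Write ONE open-ended question for an experienced organizer.\n"
        "- Go Beyond the Obvious: extract unique insights, common mistakes, "
        "or overlooked opportunities only a seasoned expert would consider.\n"
        "- Keep It Simple: short and easy to understand, not overly "
        "complicated to answer.\n"
        "- Be Open-Ended: invite a multi-sentence answer, not a yes/no or "
        "short-phrase response."
    )
```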
This approach to questioning is the most effective way I’ve found to elicit real, tangible insights.
The question the AI returns is then sent to another AI operation, which builds it into a templated email.
Before blasting the requests out via email, I need to identify the experts, which takes some care.
I can’t just randomly pick experts from my list to send to. Not all the experts have opted in, and I don’t want the same expert to receive multiple requests a day or week.
So, when the automation queries the Experts tab, I need to check a few things:
Then, for each expert identified, I send them the same email from the events@golfplaythru.com email address. This way, all replies go directly to that inbox for categorizing (see above).
The final step is to update the Experts tab’s Last Contacted column with that day’s date and time. This makes sure experts don’t receive too many requests.
I also added some logic to log when no expert email requests were sent, which could happen depending on how many opt-ins I’ve got. This way, the content generation automation knows when to skip the expert insight step.
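A minimal sketch of the eligibility check and the Last Contacted update, assuming a 7-day cooldown (the cooldown length is my example value, not stated in the post):

```python
from datetime import datetime, timedelta

# Hypothetical rows from the Experts tab.
experts = [
    {"email": "pat@example.com", "opted_in": "Yes", "last_contacted": "2024-05-01T09:00:00"},
    {"email": "sam@example.com", "opted_in": "No",  "last_contacted": ""},
    {"email": "ali@example.com", "opted_in": "Yes", "last_contacted": ""},
]

def eligible_experts(rows, now, cooldown_days=7):
    """Keep only opted-in experts who haven't been contacted
    within the cooldown window."""
    picked = []
    for row in rows:
        if row["opted_in"] != "Yes":
            continue  # never email anyone who hasn't opted in
        last = row["last_contacted"]
        if last and now - datetime.fromisoformat(last) < timedelta(days=cooldown_days):
            continue  # contacted too recently
        picked.append(row)
    return picked

def mark_contacted(rows, now):
    """Stamp Last Contacted so the next run respects the cooldown."""
    for row in rows:
        row["last_contacted"] = now.isoformat()
```

If `eligible_experts` comes back empty, that's the case worth logging so the content automation knows to skip the insight step.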
Final step, I need to feed these insights back into the content generation automation. Fortunately, that’s the easy part.
When the content generation automation starts, it pulls in information about the blog post, which now includes the post ID.
It then opens the Insights tab and pulls all rows where the post ID matches the one from the editorial calendar. It then adds all the insights to a CSV.
From there, I can add the insights as one variable to the Outline and Draft operations with a bit of context about what the information is and how it should be used.
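That lookup-and-CSV step can be sketched in a few lines of Python (the field names are illustrative, not the actual sheet columns):

```python
import csv
import io

# Hypothetical rows from the Insights tab.
insights_rows = [
    {"post_id": 3, "expert": "Pat", "insight": "Stage carts the night before."},
    {"post_id": 7, "expert": "Ali", "insight": "Cap mulligan sales per player."},
    {"post_id": 3, "expert": "Sam", "insight": "Confirm sponsors twice."},
]

def insights_csv_for_post(rows, post_id):
    """Filter the Insights tab to one post ID and serialize the
    matches as CSV, ready to drop into the Outline/Draft prompts."""
    matches = [r for r in rows if r["post_id"] == post_id]
    buf = io.StringIO()
    writer = csv.DictWriter(buf, fieldnames=["expert", "insight"])
    writer.writeheader()
    for r in matches:
        writer.writerow({"expert": r["expert"], "insight": r["insight"]})
    return buf.getvalue()
```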
Ta-Da. Resource content written with expert insights.
I’m always leery of putting AI and automations out in the world that interact with my audiences without some level of human involvement. At the end of the day, you never know what the AI is going to come back with and do.
So, throughout the automations, I put safeguards in place to make sure I’m aware of what’s going out and how. Here’s a quick recap to consider when building your own automations.
I needed real-world experience to elevate my resource content, and I wasn’t the right person to manufacture it. The automation detailed here and accessible below taps the expertise already in PlayThru’s user base and routes it into the content system with minimal manual effort.
Next up, I’m tackling image generation. I have a specific visual style for the PlayThru blog, and I want to automate that too. If you’re following along and building your own version, now’s the time to put your expert network to work.
Follow along as we transform my side hustle into a fully AI-native business. Hopefully we'll all learn a few lessons along the way, and I'll be sharing the plans and automations I'm building.