Sofia Z.
@scheduler_sofia
Optimizing n8n Workflow for SEO
Hey WebNutch community, I'm looking for help optimizing my n8n workflow for SEO. I've built a robust workflow to update over 1,000 blog posts on a dog-related site, but the cost is becoming a concern.

The workflow is structured into five nodes:
- Node 1 fetches and cleans the HTML of the old post.
- Node 2 analyzes the clean text to identify the content worth keeping.
- Node 3 analyzes competitors and compares their content against mine to find gaps and keywords.
- Node 4 writes a new HTML draft.
- Node 5 audits the draft for AI patterns and tone.

The problem is Node 3: the competitor analysis consumes a large number of tokens. I'm considering a few options:
- preprocessing the web fetch results to strip unnecessary text,
- splitting the competitor analysis into separate calls, or
- replacing the LLM-based research with an SEO API.

Has anyone else run into context bloat and high token usage in their n8n workflows? What strategies have you used to optimize your workflows and reduce costs? I'd love your tips on bringing the cost down to $0.10 per article without losing the content-gap logic. You can find more details on the WebNutch marketplace, where I've shared the workflow for others to learn from and contribute to. Let's discuss ways to make our n8n workflows more cost-effective!
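For the first option above (stripping unnecessary text before the LLM sees it), here's a minimal sketch of the kind of cleanup an n8n Code node could do on fetched competitor HTML. The function name `stripToMainText` and the field names are my own assumptions, not part of the original workflow, and the regex approach is a rough heuristic rather than a full HTML parser:

```javascript
// Hypothetical n8n Code-node helper: shrink fetched competitor HTML before it
// reaches the LLM node, so Node 3's prompt stays small.
// `stripToMainText` and the `maxChars` cap are illustrative assumptions.
function stripToMainText(html, maxChars = 4000) {
  const text = html
    .replace(/<script[\s\S]*?<\/script>/gi, ' ')  // drop inline JS
    .replace(/<style[\s\S]*?<\/style>/gi, ' ')    // drop inline CSS
    .replace(/<nav[\s\S]*?<\/nav>/gi, ' ')        // drop navigation menus
    .replace(/<footer[\s\S]*?<\/footer>/gi, ' ')  // drop footers
    .replace(/<[^>]+>/g, ' ')                     // strip remaining tags
    .replace(/&[a-z]+;/gi, ' ')                   // drop named HTML entities
    .replace(/\s+/g, ' ')                         // collapse whitespace
    .trim();
  return text.slice(0, maxChars);                 // hard cap on prompt size
}

// Inside an n8n Code node you would map over the incoming items, e.g.:
// return items.map(item => ({
//   json: { cleanText: stripToMainText(item.json.html) }  // field names assumed
// }));
```

Even this crude pass can cut each fetched page from tens of thousands of characters to a few thousand, which compounds quickly across many competitor pages per article. A proper extractor library would do better, but a Code node keeps it inside the workflow with no extra dependency.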