Why would I build my own SERP tracker?
I built a small SERP tracker because I kept running into the same annoying problem. I did not need another SEO platform. I needed to know whether a defined set of keywords moved up or down, on desktop and mobile, over time. That is it. Nothing mystical.
As an SEO specialist, I like these problems because they sit right between SEO, data and a bit of development. I am not a developer by trade, and I do not pretend to be one. But I do think modern SEO work is getting harder to separate from data pipelines, APIs, cloud tools and reporting. If I can understand how the data is pulled, stored and reported, I become better at my actual job.
That was the main reason behind this project. I wanted a simple SERP observer that I could control. Not a giant SEO suite. Not a startup idea. Just a working tool that checks rankings, stores the data and gives me something I can build reports from.
And honestly, this is where many SEO tools start to feel strange for smaller companies. They sell a package. The package has many useful things in it, sure. Keyword research, audits, backlink data, competitor tools, reporting, content features, maybe some AI visibility features now as well. Fine. But a small or mid-sized business often does not need all of that every month. It needs a few jobs done properly.
Rank tracking is one of those jobs.
What is the problem with buying a full SEO platform just for rank tracking?
The problem is not that the big SEO platforms are bad. They are not. I use them, I like parts of them, and there are cases where they are clearly the right choice.
The problem is the bundle.
A lot of SEO software is priced like a toolbox, but sometimes you only need a screwdriver. The cheapest plan may already include more features than a small business will ever use, and the price is not really based on your small use case. It is based on the whole product category.
That makes sense from the vendor side. It is easier to sell a platform than to sell one tiny workflow. But from the client side, it can be a bit much. Especially when the real need is something like: “Please check these 300 keywords once a week and show me if we are going in the right direction.”
For that, I do not always want another login, another dashboard and another monthly subscription. I want the data in BigQuery. I want to connect it to Looker Studio. I want to compare it with Search Console, maybe GA4, maybe leads or revenue later. I want a weekly summary that says something useful, not a PDF export with twenty charts nobody asked for.
That is the difference.
Is a custom SERP tracker always cheaper?
No. And I would not sell it like that.
A custom tool is not free just because I wrote the code with AI help. It still needs an API provider, a cloud service, BigQuery storage, scheduled jobs and some maintenance. If someone builds it badly, runs too many checks, stores everything without thinking, or queries BigQuery like a drunk raccoon, the cost can go up.
So this is not a “DIY is always cheaper” argument.
It is more like this: if the need is narrow enough, the custom route can be better value because you are not paying for a whole SEO suite just to use the rank tracking part.
This is especially true if the company already has some kind of reporting stack. If BigQuery and Looker Studio are already in the picture, then SERP data is just another source. It does not need to live in a separate SEO platform forever.
How does the price comparison look in practice?
This is not a perfect comparison, because the products are not the same. Ahrefs and Semrush are full SEO platforms. DataForSEO is an API. My SERP observer is a small backend built on top of an API and Google Cloud. Still, the comparison explains why I started thinking about this in the first place.
| Option | What you are really paying for | Rank tracking angle | My honest take |
|---|---|---|---|
| Ahrefs | A full SEO platform with research, audit, backlink and tracking tools | The Lite plan currently includes 750 tracked keywords with weekly updates | Great tool when you need the whole package, but too much if you only want weekly position data |
| Semrush | A full SEO and marketing platform with tracking, research and reporting | The entry SEO plan currently lists 500 tracked keywords with daily tracking | Strong if you use the broader workflow, but again, it is a platform, not just a rank tracker |
| DataForSEO | SERP data through an API | Pricing is request-based, so the cost depends on how often and how deeply you check | Better fit if you want to build your own workflow and own the data structure |
| Custom SERP observer | Your own small tracking system with cloud storage and reporting | Keyword count, device split, location and schedule are under your control | Not free, not for everyone, but very practical when the use case is focused |
Ahrefs’ public pricing currently shows the Lite plan at 750 tracked keywords with weekly rank tracking updates, while Semrush’s SEO pricing and limits currently show 500 tracked keywords on the entry SEO plan with daily tracking. DataForSEO’s SERP API uses pay-as-you-go request pricing, which is a very different model from buying a full SEO platform.
For me, this is the important part. If I only care about a defined keyword set, I do not necessarily want to buy the whole supermarket. I may just want the raw ingredient.
How much would 750 weekly keyword checks cost with DataForSEO?
If I price out the same 750-keyword weekly tracking setup with DataForSEO, the API cost is almost funny compared to a full SEO-tool subscription. DataForSEO’s Standard Queue price is currently $0.0006 for a SERP with 10 results, or $0.00465 for a SERP with 100 results. For rank tracking, I would go with the top 100, because checking only the first 10 results is often too shallow. So the math is simple: 750 keywords × $0.00465 = $3.49 per weekly run. In a normal month, that is roughly $15.10 if we count 4.33 weeks.
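The arithmetic above fits in a few lines. As a quick sanity check, here it is as a script; the per-request price is the Standard Queue rate quoted above and may change, so treat it as an input, not a constant:

```python
# Rough cost estimate for weekly SERP checks via a pay-per-request API.
# PRICE_PER_SERP_TOP100 is the Standard Queue rate quoted above
# ($0.00465 for a SERP with 100 results); it can change over time.

KEYWORDS = 750
PRICE_PER_SERP_TOP100 = 0.00465  # USD per request, top 100 results
WEEKS_PER_MONTH = 4.33           # average number of weeks in a month

weekly_cost = KEYWORDS * PRICE_PER_SERP_TOP100
monthly_cost = weekly_cost * WEEKS_PER_MONTH

print(f"Weekly run: ${weekly_cost:.2f}")  # ~ $3.49
print(f"Monthly:    ${monthly_cost:.2f}")  # ~ $15.10
```

Changing the keyword count, check depth or schedule only changes these three inputs, which is exactly the kind of control a bundled plan does not give you.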
There is still some Google Cloud cost on top of that, because the checks have to run somewhere and the results have to be stored somewhere. In my setup, that means Cloud Run, Cloud Scheduler and BigQuery. But at this size, this is not the scary part. Cloud Run has a free tier for small request-based workloads, Cloud Scheduler includes 3 free jobs and then costs $0.10 per job per month, and BigQuery includes 10 GiB of free storage plus 1 TiB of free query processing per month. For a small weekly SERP tracker, the Google Cloud part should stay under $1/month, unless you start querying or storing data in a very careless way.
Why not just use Google Search Console?
Because Search Console answers a different question. I love Search Console. It is still one of the first places I look. But average position is not the same as a clean SERP check. It mixes real-world impressions, devices, locations, query variants and whatever Google decided to show that day. That is useful, but it can be noisy.

A SERP tracker is more boring, and that is the point. It asks: for this keyword, in this location, on this device, where did the target domain show up?

That kind of boring data is useful when you want to monitor specific pages or keyword groups. A category page. A service page. A few commercial queries. A set of local terms. You do not need thousands of keywords to make this valuable. Sometimes 100 well-chosen keywords tell you more than 5,000 random ones.
Search Console shows what happened in the market. SERP tracking shows what happened in a controlled check. I would rather use both than pretend one replaces the other.
Can this really be built with vibe coding?
Yes, if you do not try to build the next Ahrefs in a weekend.
That is where people usually go wrong. The scope gets stupid very quickly. First it is a SERP checker, then it needs user accounts, then a dashboard, then alerts, then competitor tracking, then billing, then some AI feature because apparently everything needs one now. At that point, congratulations, you are no longer solving your SEO problem. You are accidentally building SaaS.
I did not want that.
The first useful version only needed to post SERP tasks, fetch completed results, parse the target domain’s position, store it in BigQuery and make it reportable. That is a small enough problem. With AI-assisted coding, a version like this can be built in two or three days. Probably faster if the scope stays clean and you already know what you want.
The hard part is not writing every line of code by hand. The hard part is knowing what the tool should and should not do.
That is also why I like this kind of work. It forces me to become sharper as an SEO specialist. I need to define the measurement properly. I need to think about devices, locations, keyword sets, schedules, data structure and reporting. These are not “developer problems”. These are SEO problems that happen to need some code.
What does AI search change here?
AI search makes this even messier.
A normal rank tracker thinks in keywords. One query, one SERP, one position. But AI search does not behave that neatly. A single user question can turn into several hidden questions. When I looked at the query fan-out around SERP checkers, the topic quickly moved into free tools, paid tools, local tracking, mobile versus desktop results, APIs for developers, SERP features, accuracy problems and the future of rank tracking in AI-generated results.
That is exactly how people research things now. They do not stay inside one keyword. AI systems do not either.
So I do not think a SERP tracker is “the answer” to AI visibility. That would be too neat. It is one layer. You still need Search Console, analytics, maybe CRM data, maybe manual checks of AI answers, maybe sales feedback. But controlled SERP data is still useful, because it gives you a stable measurement point in a search environment that is becoming less stable.
And if the data is already in BigQuery, it becomes easier to combine it with other sources. That is where the custom setup becomes more interesting than a closed dashboard.
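As a sketch of what "combine it with other sources" means in practice: once rank history and, say, Search Console clicks are both tables, joining them is one operation. In BigQuery this would be a plain SQL JOIN; plain Python is used here only to show the shape of it, and every field name is illustrative, not a fixed schema:

```python
# Hedged sketch: joining weekly rank history with Search Console
# clicks once both data sets sit side by side. In BigQuery this is a
# SQL JOIN on (keyword, week); all field names are illustrative.

ranks = [
    {"keyword": "serp tracker", "week": "2024-05-06", "position": 4},
    {"keyword": "rank api", "week": "2024-05-06", "position": 11},
]
gsc_clicks = {
    ("serp tracker", "2024-05-06"): 120,
    ("rank api", "2024-05-06"): 8,
}

# Left join: keep every rank row, attach clicks where they exist.
combined = [
    {**row, "clicks": gsc_clicks.get((row["keyword"], row["week"]), 0)}
    for row in ranks
]
print(combined)
```

That combined view, position next to clicks next to whatever business metric comes later, is the thing a closed SEO dashboard cannot easily give you.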
What about MCP and direct AI access to SEO data?
This is another reason I like the direction. DataForSEO also has an official MCP server, which connects AI tools to its APIs. That means some SEO data can already be pulled directly into AI workflows, without building a full interface around it.
For quick research, that is useful. But for monitoring, I still want a database. A chat window is not a reporting system. It can help me explore the data, but I do not want weekly rank history to live inside a conversation. I want it stored properly, so I can chart it, compare it, filter it and come back to it later.
AI is good at helping with the workflow. BigQuery is better at remembering what happened.
Who should not build this?
A company should not build this if nobody will maintain it. That is the boring answer, but it is true. If the team has no technical confidence, no reporting setup and no interest in owning the data, then a commercial SEO platform is probably the safer choice. Pay the subscription, use the UI and move on.

But if the company only needs a focused rank tracking workflow, and it already cares about data ownership or custom reporting, then building a small tool is not some crazy developer fantasy. It is a normal business decision.

For a lot of smaller companies, the question is not whether Ahrefs or Semrush is “worth it” in general. They are often worth it for the right user. The real question is whether this specific company needs the full package every month.
Sometimes yes. Sometimes no.
Where is my version?
I published my version here: SERP Observer on GitHub.
It is not a commercial product. It is not polished like a SaaS tool. It is a working example of how I approached a very specific SEO data problem: track selected keywords, store the results, connect them to reporting and leave room for weekly AI-assisted summaries later.
For me, that is the fun part. Not because I want to become a software company. I do not. I am an SEO specialist. But every time I build something like this, I understand the measurement side of SEO a bit better. And that makes me better at the work I actually sell.