This is the final part of the series on SEO with R and AWS. In the first part, we prepared the AWS instance with RStudio; in the second part, we carried out a small SEO analysis; and in the third part, we created a visibility index and “actionable reporting”. This part deals with the fact that running the individual scripts through RStudio every day is too tedious even for the most hardened data scientist. SEO monitoring should therefore run behind an appealing interface.
GUI development with Shiny
With Shiny, the developers of RStudio have provided a framework that enables interaction with data with just a few lines of code. Fortunately, Shiny is included in the AMI that we installed in Part 1 of this series.
The code is divided into two parts, the server code and the UI code; both parts can go into a single file, app.R, which is placed in a special folder in RStudio and can then be run as an app. The server part uses the individual functions that we built in the first parts of this series (that’s why you won’t find all the code here).
```r
server <- function(input, output, session) {
  # Get data from the database and create the dropdown items
  sites <- unique(results$website)
  output$moreControls <- renderUI({
    selectInput("moreControls", "Select Website", choices = sites)
  })
  output$sviPlot <- renderPlot({
    # Visibility index code
  })
  output$actions <- renderTable({
    # Actionable data code
  })
}
```
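For completeness, here is a sketch of what the elided visibility-index code inside `renderPlot` could look like. It assumes the `results` data frame from the earlier parts has `website`, `date`, and `svi` columns — those column names are my assumptions, not the original code:

```r
library(ggplot2)

# Sketch only: this would sit inside the server function above.
# Column names (website, date, svi) are assumptions.
output$sviPlot <- renderPlot({
  req(input$moreControls)  # wait until a website has been selected
  site_data <- results[results$website == input$moreControls, ]
  ggplot(site_data, aes(x = date, y = svi)) +
    geom_line() +
    labs(title = paste("Visibility index for", input$moreControls),
         x = NULL, y = "Visibility index")
})
```

`req()` simply stops the plot from rendering until the dropdown has a value, which avoids an error flash on startup.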
In the UI part, the interface is simply assembled:
```r
ui <- fluidPage(
  title = "Tom's SEO Tool",
  uiOutput("moreControls"),
  hr(),
  plotOutput("sviPlot"),
  tableOutput("actions")
)
```
And finally, the app is started:
```r
# Run the application
shinyApp(ui = ui, server = server)
```
That’s it. Plain and simple:

Not necessarily pretty enough for a customer, but perfectly sufficient for building a reporting system for yourself. In the upper left corner, I select the website I want the reporting for, and after the scripts have run, the visibility index plot and the table with the largest ranking losses appear. Creating this dashboard took just 20 minutes (OK, to be fair, I had built other dashboards with Shiny before, so I had some experience).
Next steps
In my current setup, I just used the code I had used so far, but that also means that it takes a while until something can be seen. It would be better if the data were automatically evaluated every day and the dashboard then simply retrieved the aggregated data. It’s on my to-do list.
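One way the daily pre-aggregation could work is a small R script run by cron (e.g. `0 5 * * * Rscript aggregate.R`), which writes an aggregated table that the dashboard then reads. This is a hypothetical sketch: the table and column names (`rankings`, `svi_points`, `svi_daily`) are my assumptions, and for illustration it uses an in-memory SQLite database with sample data where the real setup would point at the dashboard’s database file.

```r
library(DBI)
library(RSQLite)

# Hypothetical daily aggregation script; in the real setup the connection
# would point at the dashboard's database file instead of ":memory:".
con <- dbConnect(RSQLite::SQLite(), ":memory:")

# Sample data standing in for the stored Webmaster Console rankings.
rankings <- data.frame(
  website    = c("a.com", "a.com", "b.com"),
  date       = c("2019-01-01", "2019-01-01", "2019-01-01"),
  svi_points = c(1.5, 2.5, 3.0)
)
dbWriteTable(con, "rankings", rankings)

# Pre-compute the visibility index per website and day, so the dashboard
# only has to read the small aggregated table instead of crunching raw data.
agg <- dbGetQuery(con, "
  SELECT website, date, SUM(svi_points) AS svi
  FROM rankings
  GROUP BY website, date
")
dbWriteTable(con, "svi_daily", agg, overwrite = TRUE)
dbDisconnect(con)
```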
Then it would be nice if you could click on an entry in the table and then analyze it further. It’s also on my to-do list.
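Clickable table rows are not something the plain `tableOutput` supports, but the DT package does; a hedged sketch of how the drill-down could be wired up (the `actions_data` data frame is a hypothetical stand-in for the ranking-losses table):

```r
library(shiny)
library(DT)

# Sketch only: these snippets would replace tableOutput("actions") in the UI
# and the renderTable call in the server function.
# In the UI:  DT::dataTableOutput("actions")

# In the server: render the table with single-row selection enabled.
output$actions <- DT::renderDataTable({
  DT::datatable(actions_data, selection = "single")  # actions_data is assumed
})

# React to a click on a row; DT exposes the selection as
# input$<outputId>_rows_selected.
observeEvent(input$actions_rows_selected, {
  row <- actions_data[input$actions_rows_selected, ]
  # ...load this keyword's ranking history and render a detail view...
})
```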
A connection to Google Analytics would be great.
Not every keyword is interesting and should be included in the report.
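Filtering out uninteresting keywords could be as simple as applying a blacklist before the data reaches the report. A minimal sketch, assuming the report data frame has a `keyword` column (the column name and the example blacklist entries are my assumptions):

```r
# Hypothetical keyword filter: drop branded or otherwise uninteresting
# queries before they reach the report.
blacklist <- c("brand name", "login")

filter_keywords <- function(df, blacklist) {
  # Case-insensitive match against the blacklist
  df[!tolower(df$keyword) %in% blacklist, , drop = FALSE]
}

# Example usage:
report <- data.frame(keyword = c("Login", "seo monitoring", "brand name"),
                     clicks  = c(100, 40, 300))
filter_keywords(report, blacklist)  # keeps only "seo monitoring"
```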
All good points that I can tackle in time, and above all good material for further posts. And I’m always open to more ideas!
Result
With a little brainpower, little code, and the help of Amazon Web Services and R, we built automated and free SEO reporting. Of course, Sistrix & Co offer even more functions. This approach is primarily about making better use of the data from the Webmaster Console, which is available free of charge. Because let’s be honest: We usually have enough data. We just don’t make much of it most of the time.
This approach has another advantage: The data from the Search Console is gone after 90 days. You can’t look any further back. Here, however, we are building up an archive and can also look at longer-term developments.