Transcript
[00:00:19] Nathan Wrigley: Welcome to the Jukebox Podcast from WP Tavern. My name is Nathan Wrigley.
Jukebox is a podcast which is dedicated to all things WordPress. The people, the events, the plugins, the blocks, the themes, and in this case how AI is taking on the burden of troubleshooting website issues, and making suggestions for improvements.
If you’d like to subscribe to the podcast, you can do that by searching for WP Tavern in your podcast player of choice, or by going to wptavern.com/feed/podcast, and you can copy that URL into most podcast players.
If you have a topic that you’d like us to feature on the podcast, I’m keen to hear from you and hopefully get you, or your idea, featured on the show. Head to wptavern.com/contact/jukebox and use the form there.
So on the podcast today we have Arnas Donauskas. Arnas is a product manager at Hostinger, with over five years of experience in the web hosting industry. His journey began during college while working on his bachelor’s degree, when he needed to create a website and discovered WordPress as a beginner.
His first foray into website building sparked his interest in the industry, eventually leading him to a career where he now develops products that help others launch their own online presence. Recently he’s been working with a team tasked with delivering tools and improvements to WordPress users to ease their journey on starting and maintaining websites.
In this episode, Arnas shares insights from his presentation at WordCamp US in Portland, Oregon, where he discussed the future of fixing and optimizing websites with AI. For many WordPress users, managing site performance and troubleshooting errors can be time consuming and complex. Arnas and his team have been developing AI based solutions that not only help onboard new clients by automating website creation, but also proactively monitor and remediate website issues as they happen.
We get into the details of how Hostinger’s AI tools identify, and automatically fix, critical website errors such as HTTP response issues, and how they’re pushing site optimizations through automated performance enhancements.
Arnas explains the engineering challenges involved, the current state of success with automated fixes, and how user feedback is shaping the roadmap for new features like SEO analysis and accessibility improvements. He provides a behind the scenes look at how Hostinger tests and iterates on AI models, what kind of data is fed to those systems, and how the team balances automation with user control.
If you are curious about how artificial intelligence is transforming WordPress hosting and site management, and what this means for the future of the web, this episode is for you.
If you’re interested in finding out more, you can find all of the links in the show notes by heading to wptavern.com/podcast where you’ll find all the other episodes as well.
And so without further delay, I bring you Arnas Donauskas.
I am joined on the podcast by Arnas Donauskas. Hello.
[00:03:41] Arnas Donauskas: Hello. Thanks for having me today.
[00:03:43] Nathan Wrigley: You are so welcome. We’re here at WordCamp US in Portland, Oregon. It is day two of the, kind of the conference, but it’s the first day of presentations and things like that. You are one of the presenters, and during the presentation you are going to be talking about fixing and optimising websites with AI.
I wonder if we begin the podcast with an introduction to you. So I’d love to find out more about what you do, what your role is at Hostinger, and how you’ve got yourself in the whole AI space.
[00:04:12] Arnas Donauskas: Yeah, would be glad to give a short overview. As Nathan introduced me, I’m Arnas Donauskas and I’m a product manager at Hostinger. And I’ve been in the whole web hosting industry, creating websites, for more than five years. Well, I think my first interaction with WordPress was actually in college when I was working on my bachelor’s degree. I needed a website at that point in time and I thought, okay, what should I do? What should I use? And I was very green back in the day. Everyone has to start somewhere.
And WordPress came in as one of the first results that I searched on Google. I gave it a go. At first there were some challenges, interesting cases, what should I do with it? But then the website got up and running. I finished my bachelor’s degree, so that was nice.
And at Hostinger I have a team, a squad, where we build various tools for clients who are using WordPress to make their journey smoother, to make their website management easier, to make the whole interaction with their online presence easier. So they would have tools that could assist them, you know, on a day to day basis, to get things done and to get their first website started and running as fast as possible.
[00:05:24] Nathan Wrigley: So it seems like the hosting space, this is a really perfect fit for AI, because you presumably are onboarding clients and they have no website. I mean, in many cases maybe they have and they’re migrating something from one place to another. But I imagine a lot of your clients are brand new, they’re starting a new project, a new business, or whatever it may be, and they want to get a leg up in building something quickly.
And five years ago, no chance. You had to hire somebody, everything had to be done by a human being. And nowadays we’re seeing the rise of AI in these kind of onboarding processes where you go through some kind of wizard, and at the end it will spit out some approximation of a website which is suitable for your niche or what have you. And then you go in and you tinker and you make sure it’s exactly what you want.
Is that the kind of tooling that you are doing, or are you doing something slightly different to that over at Hostinger?
[00:06:12] Arnas Donauskas: Yes, we do have tools that are able to, and capable of, creating a website with an AI prompt. You would tell it what you would like your website to be, and we have a WordPress AI website builder that will build you a blog, or an e-commerce site, based on a given prompt. So this is already a real head start on all of these things.
But also, looking from another perspective, it’s totally understandable to see people who don’t want to build the website with AI, but would like to get guidance on how things get done. From one perspective, you can get guidance on how to build the website itself. From another, do I need to make any DNS zone changes on my website? And at this stage, AI can help all the way through. You just simply ask what you would like to do, what are the settings you want to tweak? And AI can give you really, really detailed steps, you know, on how to change those things.
One of the really nice examples I have, at Hostinger we have Kodee, it’s a chat interface assistant that helps clients with various questions, and it does have information about the client itself and, you know, what actions it can take. And a trend I started to notice is that clients know it’s an AI, and they start asking specific questions. Like, hey, here’s a bulk of text, can you edit that for me? Or can you give me more detailed steps on how to do this and this? And the AI just gives those steps and clients are just like, thumbs up, thanks. Have a nice day. And they just go on their way.
So I see this trend, and it’s really nice that the users like utilising these tools, because at the end of the day, it helps save time, maybe additional money and, you know, it’s a win for the user.
[00:07:44] Nathan Wrigley: So I’ll just read the first sentence of the blurb. So the title of your presentation here is Fixing and Optimising Websites with AI. And then the first sentence goes like this, and it encapsulates exactly what you’ve just said. This talk explores how AI can be used to automatically fix detected website errors and boost overall site performance.
So we’ve got this whole side of AI, which is the onboarding, we’ll help you build the site. But then it sounds like you’ve also now got tooling to, okay, you’ve got a website, let’s fix it up. Let’s make the improvements and adjustments along the way.
So, okay, then if we are allowing AI to crawl our website in some way, how does that actually work? What is going on? What is your platform doing to find the errors? I realise that’s a very broad question, but I’m going to leave it like that.
[00:08:31] Arnas Donauskas: Yeah. So actually, why the idea to create such a tool came to light, it was actually one of the feedback points we gathered from one of the WordCamps. Maybe it was Europe. But then to add to it, we saw the problem where, let’s say a client’s website starts receiving an error, or it starts to load slowly, and they are not sure where to start troubleshooting it. And we thought, why not make this process automatic and remove this hassle step for the client?
So how does this tool for troubleshooting errors work? At all times we are tracking all of our clients’ HTTP statuses. So basically, if there is no error, it’s 200, in most of the cases. There can be a permanent redirect HTTP status. But at all times we are tracking whether it changed to an error code or not. If it did, then we are promptly informing the client, hey, we found an error on your website. It could be a 403 forbidden access, or a 500, or a critical error. And we start informing the client, hey, an error was found, you can use our AI troubleshooter to automatically fix it.
So when the client lands on the interface itself, we have already gathered all of the logs, and we have removed all of the information that should not be passed to the AI, that it does not need to troubleshoot the error itself. And then the AI has a list of actions it can take, whether it’s troubleshooting or optimising.
And then based on the logs and our custom AI prompt, it determines, this is the most likely action that will fix the site’s error. Or when it comes to optimising, here are the settings you need to tweak to make that website go faster. And then at the end of the day, the client gets the error fixed or the website optimised.
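The flow described, sanitise the logs, then have the model choose one action from a fixed list, might look like this in miniature. All action names are hypothetical, and a simple keyword heuristic stands in for the actual AI model call.

```python
# Illustrative sketch of the troubleshooter flow: sanitise the logs, then pick
# one action from a fixed allow-list. A keyword heuristic stands in for the
# real AI model call; all action names here are hypothetical.
import re

ALLOWED_ACTIONS = ["fix_file_permissions", "disable_faulty_plugin",
                   "increase_php_memory", "restore_htaccess"]

def sanitise(log_line: str) -> str:
    """Remove data that should never reach the model, e.g. client IPs."""
    return re.sub(r"\b\d{1,3}(\.\d{1,3}){3}\b", "[ip]", log_line)

def choose_action(logs: list[str]) -> str:
    """Map log contents to the most likely fix (model stand-in)."""
    text = " ".join(sanitise(line) for line in logs)
    if "Permission denied" in text:
        return "fix_file_permissions"
    if "Allowed memory size" in text:
        return "increase_php_memory"
    if "plugin" in text.lower():
        return "disable_faulty_plugin"
    return "restore_htaccess"
```

The key design point from the interview is the allow-list: the model never invents an action, it only selects from fixes the engineering team has already implemented.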
[00:10:21] Nathan Wrigley: So that’s really interesting. So here we’re talking about some sort of critical error. So your tooling is going out, in the same way that an uptime monitor would’ve done in the past. But the difference here though is that the uptime monitor traditionally just tells you the problem. You might get an email or a phone call or something, but then you’re kind of on your own. You do the troubleshooting.
So the difference here is the AI then, it determines there’s a problem and then it offers suggestions. So you log into your control panel and it’s saying, okay, this is the most likely cause, here’s some things that you can do to remediate that problem.
[00:10:53] Arnas Donauskas: Yes, but those suggestions are also being applied automatically. Because it’s totally normal, you could just go to ChatGPT and say, here, I have this error, what could I do to fix it? The AI troubleshooter, however, does not just suggest, it gives the action that can be applied on the spot. So if you had a 403 error at one o’clock, by five past one you could have it resolved without your actual manual input. So the tool itself automatically applies those fixes and does that for you.
[00:11:26] Nathan Wrigley: Okay, that’s really curious because traditionally, I mean, obviously, I guess website hosting companies have had tooling around uptime monitoring and things like that in the past, but because your identifying piece, and the remediation piece, can access server logs and all of the infrastructure that you’ve got, it can identify the problem, figure out if that’s true, and then just crack on and do it.
So you can implement it, well, without implementing it. You just wake up at eight in the morning, maybe get an email to say, well, at one o’clock this thing happened and then we did this so you were able to have another seven hours sleep, that’s fine. Yeah, that’s really interesting.
[00:12:02] Arnas Donauskas: So one thing that we are also constantly working on is what automatic fixes we are capable of doing. Because those are the ones where, you know, our developers work to make it so the AI would have more, let’s call it, options to pick from, based on the data it has about what went wrong. And this is, you know, where we have a mini roadmap of what we want to implement further, so we could increase the success rate of websites fixed automatically.
Because this is something we also track. And I will mention this in my speech. So at this point in time, we have a 70% success rate on fixing the website. And how it’s being calculated is, when the first fix was applied, was it an actual success, did that error get resolved? So at this point it’s 70%, and roughly, in absolute numbers, we are fixing 16,000 websites per month.
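The metric as described, counting only incidents resolved by the very first applied fix, can be sketched like this (an illustrative interpretation, not Hostinger’s actual reporting code):

```python
# Illustrative sketch of the success metric described: an incident counts
# toward the headline rate only when the *first* applied fix resolved it,
# even if a later retry eventually succeeded.

def first_fix_success_rate(outcomes: list[list[bool]]) -> float:
    """Each inner list holds the success flags of successive fix attempts
    for one incident; the headline rate looks at attempt #1 only."""
    if not outcomes:
        return 0.0
    first_try_wins = sum(1 for attempts in outcomes if attempts and attempts[0])
    return first_try_wins / len(outcomes)
```

So ten incidents of which seven were fixed on the first attempt yield the 70% figure quoted, regardless of how the remaining three played out on retries.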
What’s the nicest part for me is that for 16,000 websites, some time was saved. Clients did not have to dig through a lot of information, and they got the problem resolved on the spot. Because imagine you have, like, a working business running, or you expect clients to come in, and you get an error. What is the first thing you do? Like, it takes time. So this tool, you know, can prevent these problems and shorten the fixing journey.
[00:13:20] Nathan Wrigley: Yeah, not just time, but it presumably stops you losing revenue, and the sort of slightly unquantifiable emotional distress that comes with having a website which isn’t working. And obviously if that’s your industry and your business is, I don’t know, e-commerce or something, it’s very important that it stays up.
So 70% sounds good, but obviously it means that 30%, there’s not the ideal outcome. What do you do in those scenarios where the thing was not solved? Do you log that and presumably your team then look at that and figure out over time, okay, how can we get that 30 to 20 to 10 and so on? And do you kind of roll back the remediation so that the thing which didn’t work, we unpick that and we just go back to where you were when the error occurred?
[00:14:00] Arnas Donauskas: Yeah, so a very good follow up question on this. So with the 30%, we still run additional fixes. So when we do the first fix, we track what changed, and then further success can happen, the fact that it was fixed either on the third try or, like, the fourth try, so that 30% lowers.
But there are cases where none of the fixes helped, and that’s totally normal. Bigger websites can have more complex problems, things like that happen. So then we proactively forward the user to our success specialist team, who will assist on the spot. And they have all of the logs, what happened, what was tried, and what fixes were applied on the website.
In any case, there are backups that can be reverted, without any of the fixes applied. So those 10 to 15% where nothing helped get, at the end of the day, direct help from our success team, so we can still solve, or help solve, the problem for the user.
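The escalation path described, try the ranked fixes in order, and if none works, restore a backup and hand the attempt log to a human specialist, could be sketched like this. All names are illustrative.

```python
# Illustrative sketch of the escalation path: apply ranked fixes one by one,
# and if none resolves the error, restore a backup and escalate to a human
# specialist along with the full log of what was attempted.

def run_troubleshooter(fixes, site_ok, restore_backup):
    """fixes: list of (name, apply_fn) in ranked order.
    site_ok(): health check run after each fix.
    Returns (resolved, attempt_log)."""
    log = []
    for name, apply_fn in fixes:
        apply_fn()
        log.append(name)
        if site_ok():
            return True, log
    restore_backup()
    log.append("backup_restored_escalate_to_specialist")
    return False, log
```

The attempt log matters as much as the outcome: it is what the success team receives so they can see exactly which fixes were already tried on the site.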
[00:15:03] Nathan Wrigley: It’s kind of incredible that if we were to just rewind the clock five years, the stuff that you’ve just mentioned was nonsense. It could not happen. And yet we’ve got to the point now where you’re saying there’s a 70% success rate. I’m quite surprised it’s as high as that, so that’s amazing. And presumably, the ambition is to drive that up to 80 and 90 and what have you.
But just the mere fact that it’s possible is pretty remarkable, that there’s a technology which, it’s kind of got your back, it’s this agent running in the background whose job it is to figure this stuff out and you don’t have to think about it.
And I guess for your industry, you know, hosting in general, I presume a lot of the other companies are doing these kind of things. Over time, this will become the norm. It will just become a laundry list, one of the ticket items on the sales page. It’s, you know, we’ve got your AI agent monitoring the uptime, remediation is guaranteed. Maybe you’ll even get to like 99% of fixes or something like that in time. And it just kind of pushes what we’re going to expect from hosting companies like you. That’s fascinating. Really interesting.
[00:16:04] Arnas Donauskas: Yeah. And as you’ve mentioned, five years ago there could have been a lot of, let’s say, problems or issues making this happen, because at that point in time you needed to chew through a lot of information and, you know, do the thinking on that received information. But now, when AI has quite a powerful approach to this and is able to handle such a high amount of information, that’s where the heavy lifting is taken on, and the end user is now getting the fixes done.
As for this kind of fixing becoming the norm, I really would like to see that happen, because it just helps out. You can spend your time on expanding, moving your business further, thinking of new ideas for what you could do, instead of maintaining the website. You know, there’s a saying, it’s more fun to buy new parts for your car than to replace the old ones or do the maintenance. So I think it’s the same thing with the website itself.
[00:16:57] Nathan Wrigley: Yeah, it really does feel like this is going to be the future. And obviously you’ve now got these technologies which can make, well, it’s approximating intelligent decisions. Whereas before it was just sort of, I guess you were going through a binary, is it this? Yes. Okay. Move to the next step. Is it this? No. Okay. Go back to this step. Whereas now there can be this whole load of things that you can throw at it.
And that brings me to the next question really. So you’ve just talked about all the critical things, so the website collapsing. So we do something to remediate that. What about the more, I don’t know, let’s say soft things.
So for example, maybe it’s SEO. You know, we have gone around your website, we’ve scraped it a little bit like maybe Google Bot might do, and we’ve identified SEO problems. Or it could be accessibility problems, or it could be, goodness, I don’t know, you’ve just used inappropriate language here, we’ve got a better idea for a UVP at the top of the webpage. Does it stray into that as well? Is it more than just critical failure problems?
[00:17:50] Arnas Donauskas: At this point in time, it’s more the critical problems, when the website is just full on down. But, like, how I like to view the tools that we are building is, whenever you build one tool, you receive clients’ feedback, you receive WordPress community feedback, and you can build more tools on top as a continuation of the first one. This is what I really like about all of this feedback culture.
This is the upcoming thing, and I think it’s only a matter of time until our troubleshooter and optimiser appear in the WordPress admin panel, where they will be able to tell you, I see an image has disappeared on your website, just upload it to me, I will fix it for you. Or, I see some SEO problems. Or, like you’ve mentioned, accessibility problems, or that some grammar mistakes were found in some of your posts. Something like that. So this is only a matter of time.
And why such an approach was taken at this point in time is, we want to give users a tool that they can trust and be comfortable using when it comes to the most critical matters, the website related errors. So they know, okay, I can trust and use this tool, and fix my problem right away. And once that’s bedded in, then we can move on to more extensive features for the troubleshooter and optimiser as well.
[00:19:09] Nathan Wrigley: Would it be fair to say that you are developing solutions like that, though? Is that the kind of thing which is on your product roadmap, to get those kinds of tools, the SEO, the image fixing, the alt text identifier, the, I don’t know, the accessibility identifier, those kinds of things? They’re in the background? You are building those? They’re roadmap items, are they?
[00:19:26] Arnas Donauskas: I would say they are currently planned. Right now, what we have in the more recent backlog is how to reach my personal goal on this, which is a 90% fix rate. We already have some plans for how it’ll be done. So, a short sneak peek on it. We basically want to build, like, a wayback machine into our AI troubleshooter, so it would know at any given time what happened to each of the files the customer has on their website. And it would be able to tell you, okay, I see that on this specific date, this single file was changed, and that’s what led to a 500 error. I have a safe backup copy of it, I will restore it for you. The user confirms. We do the restoration.
Or the AI will be able to determine, I have a fully working backup of your site, and these are the orders that could potentially be missed, if it is an e-commerce case. And if you want to, we can go ahead and restore the website to a fully working version and get your site back up and running again.
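The planned "wayback machine" for files could be built on per-snapshot content hashes, so diffing two snapshots pinpoints which file changed before the error. A minimal sketch, with hypothetical file names and no claim about Hostinger’s real design:

```python
# Illustrative sketch of a file-level "wayback machine": hash every file per
# snapshot, then diff two snapshots to find which file changed (and could
# therefore have caused the error). File names here are hypothetical.
import hashlib

def snapshot(files: dict[str, str]) -> dict[str, str]:
    """Map each file path to a hash of its contents."""
    return {path: hashlib.sha256(body.encode()).hexdigest()
            for path, body in files.items()}

def changed_files(before: dict[str, str], after: dict[str, str]) -> list[str]:
    """Paths whose hash differs (or which are new) in the later snapshot."""
    return sorted(p for p in after if before.get(p) != after[p])
```

Once the changed file is identified, restoring just that file from the last known-good snapshot is a much smaller intervention than rolling back the whole site.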
[00:20:26] Nathan Wrigley: We do live in interesting times, that’s for sure. You mentioned in the blurb for the talk, and I read the bit at the beginning, but I’ll just read it again. So your talk explores how AI can be used to automatically fix detected website errors, I think we’ve covered that, and boost overall site performance.
Now that’s a different piece, isn’t it? So if we’re now looking at site performance, presumably we’re talking about from slow to fast. Something wrong to fix. So basically, I’m asking the exact same series of questions, but from a performance point of view, not the site has collapsed and there’s an error. What are the things that you’re looking for there?
[00:20:59] Arnas Donauskas: To be fair with you, everything. So we look at everything when it comes to website performance. We do a benchmark result, so we have our starting ground when it comes to optimising the website, and we are using Google PageSpeed scores. I think it’s one of the most popular tools to benchmark a website, to see what is loading slowly on it, what the potential problems with it could be. And then for each website individually, automatic fixes, to images, JS and CSS minification, are being applied, and the client then sees the improvement, whether it’s 10% or 20%.
So right now, from the data itself, I believe it’s been running for two to three months, and we’ve been gathering data on how the websites are being optimised. On average, the mobile page speed score is being increased by 20%, and the desktop by 10%.
But there’s a catch to it. These optimisation steps are safe. It means nothing bad will happen to the website after the optimisation steps, and the next step would be introducing risky steps that can affect how the website looks.
What I have in mind by that is, lazy loading can sometimes mess up one of the images, it appears slowly or after a while. So these things could happen, but this will be, like, a separate step informing the client, hey, we did the safe part, but we could push this further with some risks. No worries, you will be able to revert everything on the spot if something bad happens. So this will be the next step of it, and I’m really intrigued to see how fast the websites can be.
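The safe-versus-risky split described could be modelled as a simple gate: safe steps run automatically, while steps that can change how the site looks need explicit opt-in. The step names below are illustrative, not Hostinger’s actual categories.

```python
# Illustrative sketch of the safe-vs-risky optimisation split: safe steps
# (minification, image compression) apply automatically, while steps that can
# affect how the site looks (e.g. lazy loading) require explicit opt-in.

SAFE = {"minify_js", "minify_css", "compress_images"}
RISKY = {"lazy_load_images", "defer_render_blocking"}

def plan_optimisation(requested: set[str], allow_risky: bool) -> set[str]:
    """Return only the requested steps that may actually be applied."""
    allowed = SAFE | (RISKY if allow_risky else set())
    return requested & allowed
```

Everything stays revertible either way; the gate only controls whether the riskier steps are attempted at all without the client’s say-so.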
[00:22:35] Nathan Wrigley: Can you modify your hosting environment to be specific to my website, if you know what I mean? So if my website, for example, is, I don’t know, a brochure website, I’ve got five pages, you could cache that entirely. Really easy to do. But, okay, this website over here, a different one that I’ve got, it’s a WooCommerce website, there’s a whole different load of caching that might go on, there’s a whole different load of optimisations that go on there. Do you take that burden on, or is it more of a, okay, we’ve got this thing, you tick a box and now we’re going to do the performance thing? Will it figure all that stuff out, or are there tick boxes where I can go, do this but don’t do this, do this but don’t do this? How does it work?
[00:23:13] Arnas Donauskas: So each optimisation step to increase the performance is applied to each website individually. It checks what is loading slowly. Right now, there is no possibility to customise the optimisation steps, but we are planning to integrate logic into the AI, or, like, pass it information for each website type. What caching should be applied on specific pages, is the image a landing one, or is it, like, a product image? So, to give more extensive knowledge to the AI, so it would be able to better determine how to approach different website types. But for now, the settings we check are still unique to each website, just not with such extensive customisation.
[00:23:56] Nathan Wrigley: You’ve laid before us a really interesting engineering challenge. These problems exist in terms of performance, we’ve got to put a bunch of engineers on it, and they’re going to figure out this AI way of solving that. But how do you communicate the work that the AI has done to the people that want to know it’s been done?
Because in a way, I kind of want to know that’s happened to my website, but at the same time, I kind of don’t. I don’t want to be getting six emails a day saying, okay, we updated this image, oh, and then another email, we did x, and so on. But you’ve got to let me know that that’s happened. In some way, you have to communicate the value to me that, look at all this fabulous stuff we’re doing. But I kind of want to know, but I kind of don’t want to know. So it’s a difficult tightrope to tread. I’m just wondering how you manage that.
[00:24:36] Arnas Donauskas: Yeah, yeah. So at the end of each optimisation, the client gets an impact result, did it improve, and by how much? And they get a full log of what was done on the website. And we are also trying to display that log in as simple terms as possible, because some of those settings could sound like, you know, very big words, when actually very simple things were done on the website. So we’re communicating that part to the users at the end of each optimisation as well.
[00:25:05] Nathan Wrigley: Okay, so you’re kind of making it easier to understand, basically. You’re hoping to use normal language to explain something fairly technical. Yeah, okay. And summarising it, not sending an email for every single thing. And presumably, over time, the emails become fewer anyway because, let’s say I migrate a website to your platform, the AI gets involved, and I’m imagining there’s more at the beginning, it’s front loaded. Oh, look, there’s this and this and this and this. And then slowly, over time, oh, there’s less. We did it. It’s done. But, oh, new plugin, new thing. I’m guessing that you communicate less over time.
[00:25:37] Arnas Donauskas: With such optimisation things, yes, but I would say it’s less via email, more via the interface. And I would say that, at this point, it’s enough for a user to grasp the idea of what was done.
Why do I say this? Because the amount of time clients spend in the interface reviewing the optimisations, and how many of them interact with it, is quite high. I believe with optimisations it’s 70% of the users who actually started the optimisation that completed, you know, all of the interaction with the interface. And they’re spending approximately, like, 10 to 15 minutes with it.
So I would say these are pretty good numbers. But you made a very good point about the clients who are more advanced. And perhaps it would be a good improvement to give them an option to download the full extensive logs of what was done, to see what actually happened in depth, not just rephrased wording for some technical parts.
[00:26:36] Nathan Wrigley: Yeah, I think it’s a really difficult tightrope to tread because every time that your AI does something and it had a beneficial impact on my website, that’s good for me, but it’s also good for you because it builds that relationship, doesn’t it? You know, oh, look what the platform’s done. It’s brilliant. I didn’t have to lift a finger. Just came as part of the package. Fabulous. I’m happy with that.
But you just don’t want to overdo that communication because at some point it’s like, oh, you lose sight of it. And then the critical one will arrive where the website’s collapsed and, yeah, it’s another one, it just goes in the bin. So I guess there’s a tightrope to tread, which is kind of interesting.
How do you actually find these errors then? Do you have something akin to Google Bot, which is going and looking at the front end of the website as a human being would see it, if you like, and sort of scraping around inside the DOM, looking at screenshots and, you know, okay, yeah, we see that image isn’t, I don’t know, so just an open-ended question really.
[00:27:28] Arnas Donauskas: Since each of the websites that we are troubleshooting is hosted with us, we are able to, you know, detect it. Because the primary source that we are using to determine that something bad happened is the HTTP response.
[00:27:41] Nathan Wrigley: Right. That’s straightforward. Yeah.
[00:27:42] Arnas Donauskas: Yeah. So whenever that changes, we are able to know, because each of the websites is hosted with us, on our infrastructure. So this is the quickest and most straightforward approach we can use to determine that something bad happened. So this is the one we are running with. And with quite good accuracy, unless there’s, like, a CDN in front. That can sometimes be a problem, because the true error will not always come out. But yeah, this is the method we are using.
[00:28:09] Nathan Wrigley: But on the performance side, presumably that’s slightly different because, you know, you mentioned lazy loading images or something, you’ve got to have some metrics and telemetry to say, we’ve got lazy loading images, okay, how do we deal with that?
[00:28:20] Arnas Donauskas: So with the performance part, clients are able to, you know, initiate the optimisations at any given time. We will do the performance test to see if it actually needs an optimisation, because sometimes clients have very perfectly optimised websites, and they’re working at full speed. But we are also occasionally running page speed performance tests, on a weekly basis, I believe. And if we detect, okay, this website could be improved, then clients are informed that, hey, there are some automatic optimisation steps you can take, and you can go ahead and start the optimisation process.
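The weekly check described boils down to a small decision: run a page speed test, and only notify the client when there is meaningful room for improvement. A minimal sketch; the score threshold of 90 is an assumption for illustration, not a documented Hostinger value.

```python
# Illustrative sketch of the weekly performance check: run a page-speed test
# and offer the automatic optimisation only when a score lags behind a target
# threshold. The threshold of 90 is an assumed example value.

def should_offer_optimisation(mobile_score: int, desktop_score: int,
                              threshold: int = 90) -> bool:
    """Notify the client only when either score falls below the threshold."""
    return mobile_score < threshold or desktop_score < threshold
```

This is why already well-optimised sites, as mentioned above, simply never trigger a notification.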
[00:28:53] Nathan Wrigley: Okay, got it. Thank you. Curious thing that you are in this game of tennis, I presume, with the AI models. I’m presuming, I could be wrong, but I’m presuming that you are using AIs that we are familiar with. So I’m just going to drop a few names that I know. Things like Gemini, Claude, ChatGPT and things like that. I’m presuming there’s some connection that you’ve got with those. Maybe you have your own, I don’t know.
Given that they seem to change at a breathtaking pace, and in some cases the changes that they ship kind of seem to degrade their capacity to do things. We’ve had a recent GPT-5 update in ChatGPT, which I think many people felt, perhaps in certain scenarios, was a backward step. How do you keep up with this?
[00:29:33] Arnas Donauskas: Testing, straightforward testing. But very good point on all the different models and providers. We simply do tests with each of the models. We scout around, we see, oh, this looks very promising, we test how it performs, and there are several points. How fast it can grasp the information and return back to us, so how long the request took. Some of the models took, like, 10 seconds, some of them took 5. So we want the client to get the result as fast as possible.
And then there’s the second part, the accuracy of the returned information. Because one of the learned lessons I will be sharing in my speech is that we noticed, when newer models came in, how their accuracy was way better and the time to handle information was much shorter. So since we have developers who are working on the AI models themselves, we just always test to see if there’s something better that we could ship to our users, so they would have a better outcome on their end as well.
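The two-axis model comparison described, latency and accuracy over a fixed test set, could be reduced to a selection rule like the one below. The candidate names, scores, and latency budget are all made-up examples.

```python
# Illustrative sketch of the model-selection testing described: measure each
# candidate's latency and accuracy on a fixed test set, drop models over the
# latency budget, then pick the most accurate remaining one. All candidate
# names and numbers are hypothetical.

def pick_model(results: dict[str, dict[str, float]],
               max_latency_s: float = 10.0) -> str:
    """results: model -> {'latency_s': ..., 'accuracy': ...}."""
    eligible = {m: r for m, r in results.items()
                if r["latency_s"] <= max_latency_s}
    if not eligible:
        raise ValueError("no model met the latency budget")
    return max(eligible, key=lambda m: eligible[m]["accuracy"])
```

Filtering on latency first mirrors the point made above: a slightly more accurate model is no use if the client waits too long for the fix.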
[00:30:35] Nathan Wrigley: Yeah, it’s fairly straightforward, isn’t it? It’s testing, testing, more testing, and go with the thing which provides the best tested answer. Curiously though, you must have applied a ton of engineering time to this endeavor. So there’s a load of people on the ground, and that must cost Hostinger quite a bit of money. And then presumably there’s quite a lot of money being sent to these AI providers. But I’m guessing it’s hard to justify a price increase to your end users.
So it must be a fairly difficult business decision. How much of this can you do? Because you could do AI forever, you know, and just keep going and going in endless cycles. So I’m guessing, from a business point of view, there’s again another tightrope to tread. How much can you do? Or is this stuff a premium thing that you offer? Do you have to pay an additional fee to get access to it?
[00:31:21] Arnas Donauskas: No. No additional fee. The AI troubleshooter and optimiser are included with all of the hosting plans we offer to our client base. And the price did not change because this tool was introduced.
You’re right, it took some time to deliver the final versions of the products, approximately seven to eight months. But it was all worth it, I think, because clients can now do things automatically and don’t have to spend the time themselves.
And from a company point of view, we just want to deliver the best user experience they could have, so that, you know, they can trust us even when the website is down with an error, knowing we can solve it as quickly as possible, or assist the user in optimising the site.
[00:32:08] Nathan Wrigley: It’s the market at work, isn’t it? Essentially. You’re trying to make your offering different and unique and offer something which adds value, and so you take the hit, I guess.
Do you want to get to the point where everything is completely automated? I mean, is that a desirable outcome? Would it be something that you’d like to see, where the human is completely out of the loop? Or do you always want to have an option for a confirm button, or a rollback? Well, we always want the rollback.
But it always feels like the light at the end of the tunnel here is that the human doesn’t need to be involved at all. It would be desirable if I could get up and be a hundred percent confident that my website, for all of the things that you did overnight, is better. And I don’t have to involve myself in that at all. But equally, there’s a bit of me which always wants the confirm button. I want to be able to see, well, not that one. Yes, that one. We’ll do that.
[00:32:56] Arnas Donauskas: I think confirmable actions will be there all the time, or most of the time. Because at the end of the day, this is the user’s website that the changes are being applied to, and the user is in control. Would you like to make those changes, would you not? One of the thoughts we discussed with colleagues was, what if we had a 100% fix rate? Should we give users an option to just run everything, I trust this completely? It could be an option. But still, at the end of the day, this is the user’s website. It’s their business, it’s their blog, and we want to give the best suggestions, but the user is the one who’s saying, yes, I would like to do that, or, no, I don’t want to see this.
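The pattern being discussed, apply a change only after explicit user confirmation, and keep a snapshot so it can be rolled back, is a common one. Here is a minimal, hypothetical sketch of that pattern; it is an illustration of the idea, not Hostinger’s code.

```python
# Hypothetical confirm-then-apply pattern with rollback. Every proposed
# fix needs explicit confirmation, and the prior state is snapshotted
# so any applied change can be undone. Illustrative only.

class ConfirmableFix:
    def __init__(self, state: dict):
        self.state = state          # e.g. site configuration key/values
        self._snapshot = None       # state captured before the last change

    def apply(self, change: dict, confirmed: bool) -> bool:
        """Apply a change only with user confirmation; snapshot first."""
        if not confirmed:
            return False            # user kept control: nothing happens
        self._snapshot = dict(self.state)
        self.state.update(change)
        return True

    def rollback(self) -> None:
        """Restore the state captured before the last applied change."""
        if self._snapshot is not None:
            self.state = dict(self._snapshot)
```

The key property matches the conversation: declining the confirmation is always safe (no mutation), and accepting it is reversible.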
[00:33:39] Nathan Wrigley: I guess you’re trying to get to the point where the confirmed decision is just really obvious. You want to go in and be entirely confident that, yep, I’m going to confirm it because I have this trust, but equally, there’s an option to not confirm it. That seems to be where the whole AI thing is going. The humans are always in the loop somewhere and it’s always that final confirmation step. And I think if we lose sight of that, we’re probably in a bit of trouble.
One of the questions I have as well is about WordPress, obviously, we’re at WordCamp US, this great big open source thing. And it brings to mind the question about these models, and the fact that they are entirely proprietary, you know, ChatGPT, Gemini, Claude, and all of these things. They’re getting a lot of our data, we’re allowing them into the backend of our websites, but I don’t know if there are any open source models which you are using. Are you shipping data to them? How does it align with the whole open source ethos that WordPress is so keen to promote?
[00:34:31] Arnas Donauskas: Oh, very good question, I can say. And it’s true that different models look like different silos. Different companies have different approaches to what they do. But I really liked one of the comments I read, I believe on Reddit, about all of this AI stuff. And it applies to such websites as well. So, for example, you’re a user who likes to explore things, and you want to try to fix websites with AI, and do that automatically. A free tier of ChatGPT, or any other AI model, will be more than enough, as long as you have your prompt.
It will take some experimentation, that’s for sure, but everything could actually be run for free on this part. So this leans more into the open source area. But of course, when there are paid models and so on, one day this could get tricky.
Perhaps we will have a fully open source model that anyone could use without any additional charges. Time will tell on this. But for now, a lot of companies and people are creating tools that allow free trials, or are free for some time. So I think this is still an open question as well, yeah.
[00:35:40] Nathan Wrigley: Yeah, it really does seem like an exciting time, in tech in general, but also in WordPress. And it’s really interesting to see the way that WordPress and hosting companies are interfacing through AI. It does seem like there’s a lot of interesting stuff happening on your side.
Yeah, it’s been fascinating talking to you today, trying to explore this a little bit more. Where can we find you, Arnas? If we want to reach out and discover more about you or Hostinger, where’s the best place to go?
[00:36:05] Arnas Donauskas: So if you want to reach out directly to me, I’m always happy to do that via LinkedIn. I have my full profile set up, so you can reach me through there. If you’re a Hostinger client and you have some feedback, just drop it to our support chat. I’m the one who always reads them, and I might even get directly in touch with you through one of those forms, because I always keep an eye on our clients’ feedback, and I try to contact them as often as possible to follow up on some of the feedback they share.
[00:36:32] Nathan Wrigley: Well, Arnas, thank you so much for chatting to me today and prizing open this subject. I feel that this conversation is going to get more and more in depth, and more complicated as the years go by. But in 2025, good to know where we’re at. Thank you.
[00:36:43] Arnas Donauskas: Yeah. Thank you for inviting me. It was an honor.
Arnas explains the engineering challenges involved, the current rate of success with automated fixes, and how user feedback is shaping the roadmap for new features like SEO analysis and accessibility improvements.
He provides a behind-the-scenes look at how Hostinger tests and iterates on AI models, what kind of data is fed to these systems, and how the team balances automation with user control.
If you’re curious about how artificial intelligence is transforming WordPress hosting and site management, and what this means for the future of the web, this episode is for you.
Useful links
Fixing and Optimizing websites with AI – Arnas’ presentation at WordCamp US 2025