In this episode of the HIPAA Insider Show, host Adam and HIPAA Vault’s CTO, Gil Vidals, dive into a growing yet often overlooked threat to healthcare websites: bot traffic. Learn how automated bots—whether from major platforms like Google and Facebook or other sources—can wreak havoc on website performance and security.

Key topics include:

  • How bot traffic impacts healthcare website performance
  • The HIPAA security implications of excessive bot activity
  • Real-world examples of bot-related downtime and potential risks
  • Proactive IT and marketing strategies to mitigate bot-related issues
  • Emerging risks from AI-powered content scraping and automated traffic
  • Practical solutions, including CDNs, load balancers, and caching tools
  • The importance of monitoring and log analysis to stay ahead of potential disruptions

Whether you’re a healthcare marketer, IT professional, or security enthusiast, this episode equips you with the knowledge to protect your site from these invisible yet impactful threats.

Transcript


Adam Zeineddine
Hello and welcome, or welcome back, to the HIPAA Insider Show. I’m your host, Adam. And today we’re tackling a critical security issue that’s putting healthcare websites at risk: the HIPAA and security impact of bots on healthcare websites. Joining me today, as always, is Gil Vidals, CTO and founder of HIPAA Vault. Gil, thanks for being here.


Gil Vidals
Yeah, thanks to you, Adam. And I’m glad to be back on the show.


Adam Zeineddine
Yeah, Gil, you know, lately we’re hearing about healthcare websites experiencing mysterious crashes and performance issues. Now, this is something that’s always on your radar as the CTO of a successful managed security service provider. And HIPAA Vault has traced many of these incidents to mysterious traffic. Can you explain what’s happening there?


Gil Vidals
Sure. So websites are constantly being hit by automated traffic from the major tech companies and other companies too, but the major ones have the biggest influence. So, for example, Google has a bot crawler to index pages. Facebook has a preview bot. And these can sometimes be too aggressive. So we’ve been noticing issues with those bots crawling and sometimes creating havoc on the websites.


Adam Zeineddine
So they can potentially slow down the websites. But how does bot traffic translate into HIPAA security risks?


Gil Vidals
Well, it’s more of an indirect problem, Adam. The bot comes around and starts to consume the web pages much like a human, but the bot can do it much faster. So it hits a page and digests it, and then it can do another page and another page. And sometimes it may not do it in a serial fashion, but instead do it in parallel. So it might say, let me grab these 20 pages all at one time. And boy, a lot of servers, a lot of healthcare sites, are not set up to handle those kinds of requests. And so what happens is the site becomes inaccessible. It’s kind of like a denial-of-service attack.
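To make Gil’s point concrete, here is a minimal Python sketch of why parallel crawling is so much harsher on a server than serial crawling. The page paths are hypothetical and the request latency is simulated with a sleep rather than real HTTP, but the timing contrast is the point: the same 20 requests land nearly simultaneously instead of spread out.

```python
import time
from concurrent.futures import ThreadPoolExecutor

def fetch_page(path):
    # Stand-in for an HTTP request; a real crawler would issue a GET here.
    time.sleep(0.05)  # simulate the server's work per page
    return path

pages = [f"/page/{i}" for i in range(20)]  # hypothetical URLs

# Serial crawl: requests arrive at the server one at a time.
start = time.time()
for p in pages:
    fetch_page(p)
serial = time.time() - start

# Parallel crawl: all 20 requests hit the server at once.
start = time.time()
with ThreadPoolExecutor(max_workers=20) as pool:
    list(pool.map(fetch_page, pages))
parallel = time.time() - start

print(f"serial ~{serial:.2f}s, parallel ~{parallel:.2f}s")
```

A server sized for the serial arrival pattern has to absorb the entire parallel burst in a fraction of the time, which is exactly the quasi-denial-of-service effect Gil describes.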


Gil Vidals
And so that means that the healthcare app service, whatever that service might be, is not available to the patient or to the doctor or the consumer.


Adam Zeineddine
Are you seeing any real-world scenarios? And if so, where are they coming from? And you mentioned denial of service there. Is it malicious denial of service, or not?


Gil Vidals
It’s not. I would say that the end result is detrimental, because the bots are so heavy and they have so much activity that they have a negative impact. But I wouldn’t say it’s malicious, because malicious implies malevolent intent. You know, like someone’s trying to do something bad. In this case, Facebook, Google, and the others are not intending to go out and do something bad, but the result is the same.


Adam Zeineddine
Interesting. Yeah. You mentioned Facebook. Speaking of large social media platforms, Gil, did you know that 95% of the viewers watching us on YouTube aren’t subscribed yet?


Gil Vidals
No, I didn’t know that.


Adam Zeineddine
Yeah. So if you’re enjoying our content on YouTube and you’re watching us, hit that subscribe button and give this video a thumbs up. Get a bit of marketing there. But getting back to the topic. Is bot traffic a marketing issue or an IT issue?


Gil Vidals
I think it becomes a blend of both. So let’s say, for example, you have an extensive site and you have a sitemap for Google and you say, hey, come crawl my site. And let’s say Google were to crawl the site aggressively; then it could become an IT issue. But you need your site indexed by Google so it can find all that rich content that you’ve paid to write or you’ve written yourself. So from a marketing point of view, it could be a missed opportunity if the site’s too slow for Google to crawl it. The other thing is Google measures page speed and performance. So if your site’s performing poorly and Google’s crawling detects that it’s a poorly performing site, now you’ve got an IT issue and you’ve got a marketing issue all at the same time.


Adam Zeineddine
Yeah. And so what signs should healthcare marketers look for, versus the IT professionals managing the site?


Gil Vidals
I think if you’re seeing some sporadic downtime, like if you have alerting for a down site that you get on your phone and you notice that the site’s going down randomly, sporadically, but consistently. By that, I mean it doesn’t just happen once or twice a year; it’s happening several times a week and many times a month. Then you want to take a look at that and say, why is the site not responding consistently? If it’s that inconsistent, you would need to do something about it. But if it only happens two or three times a year, I wouldn’t worry about it, because you never know; some technical glitch can happen very rarely.


Adam Zeineddine
Yeah, we typically see on the marketing side, it’s more of a reactive approach where they’re getting feedback from departments or their customers that the website’s slow or it’s not accessible. From the IT standpoint, what solutions do you recommend to be maybe a bit more proactive?


Gil Vidals
Yeah, I think one of the better ones is to use a CDN, because your pages are cached. And so when somebody goes to pull up the page, they’re really pulling it up from the cache location, not necessarily from your web server. And that can help quite a bit with scaling. And I would say robots.txt, that’s a very light solution, because the robots.txt file is a file that tells these bots whether they’re allowed to scan your site, if they have permission. But many bots, especially malicious ones, ignore that robots.txt file. So I wouldn’t count on that alone, but it is good practice. And you want to ensure you’re scalable, ensure your application is scalable.
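As a rough illustration of the kind of robots.txt file Gil is describing, here is a minimal example with hypothetical paths. Well-behaved crawlers honor these directives, but as he notes, malicious bots simply ignore the file, and Googlebot in particular does not honor Crawl-delay.

```text
# robots.txt (advisory only: compliant crawlers honor it, malicious bots ignore it)
User-agent: *
Crawl-delay: 10      # ask bots to wait 10 seconds between requests (Googlebot ignores this)
Disallow: /portal/   # hypothetical patient-facing path kept out of crawls

# Opt out of an AI content scraper (OpenAI's crawler)
User-agent: GPTBot
Disallow: /
```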


Gil Vidals
If you just have a single web server, you may need two servers with a load balancer or something more sophisticated than that.
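Beyond adding servers behind a load balancer, one common way to blunt an overly aggressive client is per-client rate limiting. This is not a technique Gil names explicitly, but it addresses the same burst problem. A minimal token-bucket sketch in Python, with illustrative rate and burst parameters:

```python
import time

class TokenBucket:
    """Allow roughly `rate` requests per second per client, with a burst allowance."""
    def __init__(self, rate, burst):
        self.rate, self.burst = rate, burst
        self.tokens = float(burst)
        self.last = time.monotonic()

    def allow(self):
        # Refill tokens in proportion to elapsed time, capped at the burst size.
        now = time.monotonic()
        self.tokens = min(self.burst, self.tokens + (now - self.last) * self.rate)
        self.last = now
        if self.tokens >= 1:
            self.tokens -= 1
            return True
        return False

# One bucket per client IP in practice; here, a bot blasting 30 requests
# at once only gets the burst allowance through before being throttled.
bucket = TokenBucket(rate=5, burst=10)
allowed = sum(bucket.allow() for _ in range(30))
print(allowed)
```

Reverse proxies and CDNs offer this kind of throttling as configuration, so a sketch like this is mainly useful for understanding what those knobs do.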


Adam Zeineddine
Yeah. And what emerging risks should organizations watch out for? I mean, in the last episode we talked about the potential positive impacts of AI. But are there any emerging risks in this scenario?


Gil Vidals
Well, there has been some AI-powered content scraping going on. That’s the new development since AI came around. Also, just DDoS in general: whether, as we said earlier, it’s mal-intended or even has a positive intention behind it, you want to be careful with all this automated traffic hitting your site. Some of it may even be of your own doing. You may have signed up for a service of some nature on the web without realizing they were going to be scanning your site so frequently, so aggressively. So you have to think to yourself, did I just subscribe to some monitoring site, and instead of pinging my site, you know, a few times a day or once an hour, they’re pinging me every five seconds? And so you just created your own problem.


Gil Vidals
So you need to be conscientious about all the automation that happens on your website.


Adam Zeineddine
Most definitely. Is there any final advice you’d like to give to the listeners or viewers?


Gil Vidals
Well, you need somebody that knows how to watch the logs and set up monitoring, and someone that can react if you have an issue. You need to have somebody who knows enough about the web services to log in when a monitoring alert flags a problem, take a look, and identify the nature of the traffic. If it’s just a lot of users, that’s one thing; but if they see a lot of bot traffic intermixed in there, they may need to take some action. You could also use some caching tools besides the CDN that we mentioned earlier. You could use WP Rocket, which is a caching plugin for WordPress sites, and that could be quite helpful in having your site respond much faster than if that caching is disabled.
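Gil’s point about spotting bot traffic in the logs can be sketched in a few lines of Python. This assumes a common combined-log-style format where the user agent is the last quoted field; the sample lines, IPs, and marker strings below are illustrative, not taken from a real site.

```python
import re
from collections import Counter

# Fabricated access-log lines for illustration.
LOG_LINES = [
    '203.0.113.5 - - [10/Jan/2025:10:00:01] "GET / HTTP/1.1" 200 "Mozilla/5.0 (Windows NT 10.0)"',
    '66.249.66.1 - - [10/Jan/2025:10:00:01] "GET /a HTTP/1.1" 200 "Googlebot/2.1 (+http://www.google.com/bot.html)"',
    '66.249.66.1 - - [10/Jan/2025:10:00:02] "GET /b HTTP/1.1" 200 "Googlebot/2.1 (+http://www.google.com/bot.html)"',
    '69.171.251.1 - - [10/Jan/2025:10:00:02] "GET /c HTTP/1.1" 200 "facebookexternalhit/1.1"',
]

# Substrings that commonly appear in self-identifying crawler user agents.
BOT_MARKERS = ("bot", "crawl", "spider", "facebookexternalhit")

def user_agent(line):
    """Return the last quoted field (the user agent), or None if the line doesn't parse."""
    fields = re.findall(r'"([^"]*)"', line)
    return fields[-1] if fields else None

counts = Counter()
for line in LOG_LINES:
    ua = user_agent(line)
    kind = "bot" if ua and any(m in ua.lower() for m in BOT_MARKERS) else "human"
    counts[kind] += 1

print(counts)
```

A ratio heavily skewed toward bots, or one crawler IP dominating the request volume, is the kind of signal that would justify the throttling and caching measures discussed above. Note this only catches bots that identify themselves; malicious ones often spoof browser user agents.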


Adam Zeineddine
That’s all great advice. Well, that’s it for this episode, folks. Thanks for watching. Stay compliant and stay secure.