Don't waste budget on junk traffic. Block hostile regions and reclaim your server resources for the users who actually matter.

Bernard Sfez - 2026-01-27 21:19

Why pay for technical resources, and for technicians to manage them, to absorb consumption by bots or by users who will never use your services because they are on the other side of the world? For any organization or business, leaving servers open to the four winds is an economic and security oversight that needlessly overloads your infrastructure. GeoIP fencing, while not a requirement, is an attractive way to turn your firewall into an intelligent digital "customs" gate. By filtering incoming traffic by country, you eliminate a massive portion of network pollution and intrusion attempts from hostile regions, ensuring that every euro invested goes toward an optimal, fluid, and secure experience for your legitimate audience.
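To make the filtering idea concrete, here is a minimal Python sketch of country-based blocking. The CIDR ranges and country codes below are purely illustrative placeholders (they use reserved documentation networks); in practice the mapping would come from a GeoIP database such as MaxMind's GeoLite2, refreshed regularly.

```python
import ipaddress

# Hypothetical example data: in reality these blocks come from a GeoIP
# database (e.g. GeoLite2), not a hand-written table.
COUNTRY_BLOCKS = {
    "XX": [ipaddress.ip_network("203.0.113.0/24")],   # stands in for a blocked region
    "YY": [ipaddress.ip_network("198.51.100.0/24")],  # stands in for an allowed region
}
BLOCKED_COUNTRIES = {"XX"}

def country_of(ip):
    """Return the (illustrative) country code owning this IP, or None."""
    addr = ipaddress.ip_address(ip)
    for country, networks in COUNTRY_BLOCKS.items():
        if any(addr in net for net in networks):
            return country
    return None

def is_allowed(ip):
    """Drop traffic whose source country is on the block list."""
    return country_of(ip) not in BLOCKED_COUNTRIES

print(is_allowed("203.0.113.42"))  # → False (blocked region)
print(is_allowed("198.51.100.7"))  # → True  (allowed region)
```

In production this logic typically lives at the firewall or reverse-proxy layer (for example iptables with ipset, or a GeoIP module in the web server) rather than in application code, so junk traffic is dropped before it consumes any application resources.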

Fixing Autocomplete in Tiki Wiki After Upgrading to Version 29

Bernard Sfez - 2026-01-27 12:24

After upgrading from Tiki Wiki 27 to Tiki 29, many users have discovered that their autocomplete functionality (used in CustomSearch, for example) no longer works. The input field that previously showed suggestions as you typed now shows nothing at all. This can be particularly frustrating because there are no obvious error messages in the browser console, and the rest of your CustomSearch appears to function normally. The search still works and the results display correctly, but the helpful autocomplete suggestions have simply vanished. Find the solution in this article.

The Snapshot Trap: Why Your Backup Might Betray You

Bernard Sfez - 2026-01-01 17:04

If you have some experience, you know that making backups isn’t the same as being secure. You have probably already heard this reassuring line: “No worries, we run an automatic backup of all files every day.” With a bit of bad luck, you may have already experienced the limits of that approach when it’s applied indiscriminately, without oversight or control.

At OpenSource Solutions, our years of experience have taught us a hard reality: having a backup doesn’t mean being able to restore. Day-to-day server and data operations require heightened vigilance, particularly when it comes to cyber resilience. One of the most insidious pitfalls is blind trust in snapshots or global (bulk) backups. In this article, we will provide a quick diagnosis of these practices and share our “recipe” to avoid losing anything—but above all, to ensure optimal operability in record time in the event of a disaster.
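The gap between "having a backup" and "being able to restore" can be illustrated with a small sketch. This is not the procedure described in the article, just a minimal Python illustration of the principle: restore the archive into a scratch directory and compare checksums against the live files, so the backup is proven usable rather than merely present.

```python
import hashlib
import tarfile
import tempfile
from pathlib import Path

def sha256(path):
    return hashlib.sha256(path.read_bytes()).hexdigest()

def verify_backup(archive, source_dir):
    """Restore the archive into a temporary directory and compare every
    file's checksum with the original. A backup only counts if this passes."""
    source_dir = Path(source_dir)
    with tempfile.TemporaryDirectory() as scratch:
        with tarfile.open(archive) as tar:
            tar.extractall(scratch)  # archive was created with relative paths
        for original in source_dir.rglob("*"):
            if original.is_file():
                restored = Path(scratch) / original.relative_to(source_dir.parent)
                if not restored.is_file() or sha256(restored) != sha256(original):
                    return False  # missing or corrupted file: the backup lied
    return True
```

A real restore drill also covers databases, permissions, and configuration, and above all measures how long the full recovery takes, which is exactly the "operability in record time" the article is about.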

How to Keep Your Site from Sinking Under AI Bots

Bernard Sfez - 2025-11-27 15:46

Tiki Wiki CMS is used on several sites that are currently under heavy fire from aggressive, resource-hungry AI crawlers. These bots can overwhelm servers to the point where applications become painfully slow or even completely unavailable, as we discussed in a previous article.

The answer cannot be limited to what happens “at the server’s doorstep” (WAF, rate limiting, IP blocking, etc.). It is just as essential to strengthen Tiki Wiki itself by acting on:

  • The content the site actually exposes to visitors and bots
  • The way queries are built, filtered, and chained
  • The detection and control of abnormal memory or CPU consumption
  • The data that is queried versus what is actually returned
  • The use of Tiki’s own optimisation and performance tools


The goal of this article is to present concrete measures within Tiki Wiki CMS to reduce the attack surface, control resource consumption, and ensure your site stays available even under pressure from AI bots or malicious crawlers, while also improving its performance over the long term.
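One of the points above, detecting abnormal consumption, can be sketched in a few lines. This is an assumption-laden illustration, not Tiki's own tooling: it presumes a combined-format access log where each line starts with the client IP, and an arbitrary request threshold.

```python
import re
from collections import Counter

# Assumed: combined-format access log, client IP first on each line.
LOG_LINE = re.compile(r"^(\S+) ")

def flag_heavy_hitters(log_lines, threshold=100):
    """Count requests per client IP and return those above the threshold,
    a crude but effective first signal of an aggressive crawler."""
    hits = Counter()
    for line in log_lines:
        m = LOG_LINE.match(line)
        if m:
            hits[m.group(1)] += 1
    return {ip: n for ip, n in hits.items() if n > threshold}

sample = ['203.0.113.9 - - [01/Nov/2025] "GET /wiki HTTP/1.1" 200'] * 150 \
       + ['198.51.100.4 - - [01/Nov/2025] "GET / HTTP/1.1" 200'] * 3
print(flag_heavy_hitters(sample))  # → {'203.0.113.9': 150}
```

In practice you would window the counts over time (requests per minute rather than per file) and feed the flagged IPs into rate limiting or blocking rules rather than acting on raw totals.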

AI: crawlers hungry for data that overload your servers and leave your sites slow or inaccessible

Bernard Sfez - 2025-11-01 19:42

In the breakneck race to hoover up data, AI models are unleashing armies of bots that crawl the web nonstop—often ignoring rules like robots.txt. The result? Siphoned bandwidth, strained servers, and pages that slow to a crawl or go offline—an abnormal, excessive load that hurts publishers, businesses, and users alike. As The Register notes, more and more voices are calling out bots that send no traffic back yet rack up infrastructure costs and leave teams overwhelmed.

At OpenSource Solutions, we take a clear, measurable approach. We continuously analyze traffic logs to spot overconsumption, correlate signals across multiple servers to tell trustworthy crawlers (search engines) from aggressive AI scrapers, and roll out proactive defenses—quotas, rate limiting, IP allowlists, bot traps, even GeoIP filtering—to protect performance without blocking the indexing that’s essential for SEO. Let’s break it down.
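Of the defenses listed, rate limiting is the easiest to illustrate. Here is a minimal token-bucket sketch in Python; the capacity and refill rate are arbitrary example values, and a real deployment would enforce this in the web server or WAF, keyed per client IP.

```python
import time

class TokenBucket:
    """Allow bursts up to `capacity` requests, refilled at `rate` tokens
    per second. Values are illustrative; tune per endpoint and client class."""

    def __init__(self, capacity, rate):
        self.capacity = capacity
        self.rate = rate
        self.tokens = capacity
        self.last = time.monotonic()

    def allow(self):
        now = time.monotonic()
        # Refill proportionally to elapsed time, capped at capacity.
        self.tokens = min(self.capacity, self.tokens + (now - self.last) * self.rate)
        self.last = now
        if self.tokens >= 1:
            self.tokens -= 1
            return True
        return False

bucket = TokenBucket(capacity=5, rate=1.0)  # 5-request burst, 1 req/s sustained
results = [bucket.allow() for _ in range(8)]
print(results)  # the first 5 pass, the rest are rejected until tokens refill
```

The appeal of the token bucket is that legitimate users browsing in short bursts are unaffected, while a scraper hammering the site continuously exhausts its tokens almost immediately.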