Drupal Planet

Pixelite: Bots, scrapers, and proxies: defending Drupal sites in an automated internet

Over half of all web traffic in 2024 was automated. That is the headline number from the Imperva 2025 Bad Bot Report, and it is the first time bots have outnumbered humans in more than a decade. Drupal sites sit squarely in that traffic mix, and the old defensive playbook — block an IP, ban a user agent, drop a robots.txt entry, lean on Fail2ban — does not hold up anymore.

This is the companion post to my DrupalSouth Wellington 2026 talk, Bots, scrapers, and proxies: defending Drupal sites in an automated internet. The talk walked through the defences I actually use at amazee.io and recommend on client sites. The post covers the same ground, with a bit more room to show config and link out to the projects.

What actually changed

The technical context underneath bot defence has shifted in three ways that matter:

  • Residential proxy networks. Scrapers no longer come from a handful of cloud subnets you can block. They route through real consumer IP addresses, often unwittingly donated by free-VPN users or piggy-backed off shady SDKs in mobile apps.
  • Headless browsers everywhere. Playwright and Puppeteer have made it trivial to render JavaScript-heavy pages at scale. A page that needed a real human five years ago can be scraped today by anyone with a laptop.
  • AI-driven scraping. Volume is up sharply because every new LLM needs training data, and there is now a steady drip of new crawlers showing up. Meta's externalagent is one recent example. There will be more.

Mimicry is now the baseline, not the edge case. A modern scraper will rotate IPs, randomise user agents, replay realistic TLS fingerprints, and pace itself slowly enough to look like a real user. You cannot rely on any signal that lives in a single HTTP header.

The scale of it

If this still sounds like a niche problem, the numbers say otherwise.

  • 51% - share of web traffic that was automated in 2024, per the Imperva 2025 Bad Bot Report.
  • +96% - year-on-year growth in traffic from some popular bot services across Pantheon's hosting fleet, per their July 2025 data.
  • 1B+ - unique monthly visitors Pantheon sees across its platform, which is the size of dataset those numbers are coming from.

On the amazee.io platform globally, 13% of incoming requests can be flagged as non-human based on the user agent alone. That is the lazy bots. The actual share of automated traffic is higher once you account for the ones that try to blend in. In absolute terms it adds up to hundreds of millions of requests every month.

The goal is not to block all bots

Before going through the defences, one thing I am careful to say up front, both on stage and here: the goal is not to block all bots. That is unwinnable, and the closer you get to it the more real users you break.

Search crawlers, RSS readers, uptime monitors, link-preview generators in Slack and iMessage, accessibility tooling - all bots, all wanted. The goal is to reduce abuse where it hurts most, on the endpoints that cost you real money or real performance, while leaving everything else alone.

Drupal-native defences

The defences closest to your application are the smartest. They can see the path, the user, the form, the cache state. They are also the most expensive per blocked request, because every block at this layer has already cost you a full PHP bootstrap.

Perimeter

The Perimeter module drops requests matching known-bad patterns: /wp-admin, /.env, xmlrpc.php, all the WordPress scanner noise that hits every Drupal site daily. It is the cheapest win on the list. It will not stop a serious scraper, but it will keep your logs clean and your error rate honest.

CrowdSec and AbuseIPDB

CrowdSec is a local agent plus a community blocklist. Every site running CrowdSec contributes detected attacks back to a shared signal, and pulls down the latest list of bad actors. It is the closest thing the open-source world has to a distributed reputation system.

AbuseIPDB is a reputation lookup service. You query an IP, you get a confidence score. It is most useful on the forms and login flows where you can afford the latency of an external API call. Both are available as Drupal modules.
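The decision logic on your side is simple: take the confidence score from the response and compare it to a threshold. A minimal sketch of that logic in shell, using a hand-written stand-in for a real API reply (the JSON shape mirrors AbuseIPDB's /api/v2/check response; the IP and the threshold of 75 are arbitrary examples, not recommendations):

```shell
# Stand-in for a real /api/v2/check response; data.abuseConfidenceScore
# is AbuseIPDB's 0-100 confidence that the IP is abusive.
response='{"data":{"ipAddress":"203.0.113.7","abuseConfidenceScore":91}}'

# Extract the score and make a block/allow decision against an
# example threshold of 75.
score=$(printf '%s' "$response" | python3 -c 'import sys, json; print(json.load(sys.stdin)["data"]["abuseConfidenceScore"])')
if [ "$score" -ge 75 ]; then
  verdict="block"
else
  verdict="allow"
fi
echo "$verdict (score $score)"
```

In a real deployment the lookup is an HTTPS call with your API key, which is exactly why it belongs on low-traffic, high-value flows like login and registration rather than on every page view.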

Facet Bot Blocker

If you run Search API with facets, this is the cheapest big win available to you. Faceted search URLs are catnip for scrapers: every combination of filters is a new URL, every URL is uncached, and every uncached request hits the database. A bot that crawls a faceted listing can take a site down without trying.

The Facet Bot Blocker module acts as a rate limit on requests that include at least one facet in the URL. Configure it to use Redis or memcache for the counter so you are not making the problem worse by hitting the database to record the request. On one of our hosting customers, this one module cut Search API load by more than half.
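Under the hood this kind of protection is just a counter per client per time window. A minimal fixed-window sketch of that counting logic, file-based purely for illustration (the module itself would use Redis or memcache as noted above; the IP and limit here are made up):

```shell
# Fixed-window rate-limit sketch: bucket requests into one-minute
# windows keyed by client IP; over the limit, answer 429.
tmpdir=$(mktemp -d)
window=$(( $(date +%s) / 60 ))          # one-minute bucket
key="$tmpdir/facet-203.0.113.7-$window" # example client IP
limit=10

# Simulate three faceted-search requests from the same client.
for i in 1 2 3; do
  count=$(( $(cat "$key" 2>/dev/null || echo 0) + 1 ))
  echo "$count" > "$key"
done

if [ "$count" -gt "$limit" ]; then verdict=429; else verdict=200; fi
echo "request $count -> $verdict"
```

The point of backing the counter with Redis or memcache is visible in the sketch: the read-increment-write on every request has to be cheaper than the query you are protecting.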

Form-side defences

Logins, registrations, password resets and contact forms all need their own treatment, separate from page-level defence:

  • Honeypot - invisible field plus a time-based check. Cheap, fast, surprisingly effective against the dumb half of form spam.
  • Antibot - requires JavaScript to submit, blocks the bots that do not run JS.
  • CAPTCHA, reCAPTCHA, or Cloudflare Turnstile - full challenge. Use the lightest option that works, and ideally only after Honeypot and Antibot have already rejected the easy cases.
  • Hidden CAPTCHA - bridges the gap when you want a CAPTCHA-style check without the accessibility cost of a visible challenge.

The trade-off the project pages don't mention

Every block at the Drupal layer has already cost you a PHP bootstrap. That is fine when the absolute volume is small. It is not fine when you are eating hundreds of millions of bot requests and bootstrapping PHP for each one. This is why you cannot stop at the application layer.

Web server and infrastructure

One layer out, the web server can drop requests before PHP ever runs. The trade-off flips: you save the bootstrap cost, but you lose access to application context.

Rate limiting and geo blocking

nginx ships with limit_req_zone for request-rate limiting; on the Apache side, mod_ratelimit throttles bandwidth and mod_evasive covers request rates. These tools are blunt but effective on volume. A starting point for nginx looks roughly like this:

    limit_req_zone $binary_remote_addr zone=search:10m rate=10r/m;

    location /search {
        limit_req zone=search burst=5 nodelay;
        proxy_pass http://drupal;
    }

Ten search requests per minute, per IP, with a burst of five. Tune to taste. The $binary_remote_addr key is cheap on memory; a 10MB zone holds around 160,000 IPs.

Geo blocking is the other infrastructure-level lever. It is pragmatic and occasionally controversial. If your audience is the New Zealand public sector, blocking inbound from regions you do not serve is a defensible call. If your audience is global, it is not. Know your traffic before reaching for it.

ModSecurity and the OWASP CRS

ModSecurity with the OWASP Core Rule Set is a proper WAF you can self-host. Once tuned, it is real protection. The tuning is the catch — out of the box it will flag Drupal admin actions, file uploads, anything that looks like SQL in a body or a query string. Expect to spend real time pruning rules and adding exceptions for legitimate site behaviour before you stop generating false positives.
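To make the tuning concrete, here is the shape of a typical Drupal exclusion, hedged accordingly: CRS rule 942100 is the libinjection SQL-injection check, which commonly fires on rich body text; the field name body[0][value] is Drupal's default body widget, and the rule id 1000001 is a placeholder you would pick from your own local id range.

```apacheconf
# Exclude the Drupal body field from one noisy SQLi rule rather than
# disabling the rule globally.
SecRuleUpdateTargetById 942100 "!ARGS:body[0][value]"

# Or scope the exclusion to admin paths only, evaluated before the
# CRS rules load. The id is a local placeholder.
SecRule REQUEST_URI "@beginsWith /admin" \
    "id:1000001,phase:1,pass,nolog,ctl:ruleRemoveById=942100"
```

Narrow exclusions like these are the whole game: every rule you remove globally is protection you gave up everywhere to fix a false positive in one place.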

Cache discipline

A request that hits the cache costs you nothing. Whatever else you do, get your cache headers right. Vary on the bits that need to vary, cache aggressively on the bits that do not, and lean on the page cache or the reverse proxy in front of Drupal. The cheapest bot is the one that asks for a page you have already served.

(Shameless plug.) You can also use my site, Caching Score, to review your current caching setup and see whether there is anything better you could be doing.

What this layer is bad at

Rate limits, ModSecurity rules and geo blocks are great at volume and bad at quality. They cannot tell a scraper trickling one request per minute apart from a real user. For that you need either the edge or the application.

Edge and paid bot management

The edge is where the big vendors live, and it is where you push the cheapest blocks. A scraper rejected by Cloudflare at the network edge never gets to your origin at all.

Cloudflare

The free tier already includes Bot Fight Mode, basic challenges, and Turnstile. For most small-to-medium Drupal sites, this is a good baseline at zero extra cost. The paid Bot Management product adds custom rule logic, JA3 and JA4 TLS fingerprinting, and machine-learning-based bot scoring you can wire into firewall rules. The jump from free to paid is significant in price; the jump in capability is also significant.
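On the paid tier, the bot score is just another field you can use in a custom WAF rule expression. A sketch of what that looks like, with the caveat that the threshold of 30 is an arbitrary example and you should calibrate against your own traffic:

```
# Challenge likely-automated traffic (low bot score) while never
# challenging crawlers Cloudflare has verified. Field names are
# Cloudflare's Bot Management fields; the threshold is an example.
(cf.bot_management.score lt 30 and not cf.bot_management.verified_bot)
```

Pairing a score threshold with the verified-bot exemption is the pattern that keeps Googlebot out of your challenge pages.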

Fastly, Akamai, and the rest

Fastly offers the Next-Gen WAF (originally Signal Sciences) with a Bot Management add-on. Akamai sits at the enterprise tier with the most sophisticated fingerprinting available, and a price tag to match. Beyond those, there is AWS WAF with Bot Control, DataDome, HUMAN, and Imperva — all credible, all paid, all priced for sites where bot abuse is costing real money.

The trade-offs nobody puts on the sales deck

Bot Management at the edge solves real problems. It also comes with real costs that the vendor demos skip past:

  • Cost. Bot Management is almost always an add-on to the core WAF subscription, and the pricing escalates fast with traffic.
  • Vendor lock-in. Your rules, your dashboards, your observability all live in the vendor's UI. Migrating off is painful.
  • Accessibility and SEO. Aggressive challenges break real users, and search bots that fail a challenge will hurt your rankings. Test both before turning anything up.
  • Rules live outside your application codebase. They drift, they are not versioned alongside the code that depends on them, and a rule change can break a feature without any commit to point to.
  • False positives are invisible to you. By default, blocked requests do not reach your logs. You will not know which real users were turned away unless you specifically ask for that signal.

Anubis

The newest piece in the picture, and the one that has me genuinely interested.

What Anubis is

Anubis is an open-source reverse proxy (MIT licensed) that sits in front of your site and issues a proof-of-work challenge to clients before letting them through. It was built specifically for the AI scraper era — for the case where the scraper is mimicking a real browser well enough that classifying it on signal alone has stopped working.

Why proof-of-work, not CAPTCHA

The interesting move with Anubis is who pays the cost. A real user pays a few hundred milliseconds of CPU once when they first arrive, and never sees it again for the lifetime of the cookie. A scraper hitting you a million times pays the cost a million times.

That asymmetry is the whole point. CAPTCHAs put the cost on humans (the people who lose patience trying to identify traffic lights). Anubis puts it on whoever is doing the hammering. That is closer to the right shape of the trade.
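The mechanism itself is simple enough to sketch. This is not Anubis's actual algorithm, just the shape of the idea, with the difficulty set trivially low (a two-hex-character prefix) so it finishes instantly; real deployments use a much higher difficulty and bind the challenge to the client:

```shell
# Toy proof-of-work: find a nonce such that sha256("challenge:nonce")
# starts with "00". The challenge string is made up for illustration.
challenge="example-challenge"
nonce=0
while :; do
  hash=$(printf '%s:%s' "$challenge" "$nonce" | sha256sum | cut -d' ' -f1)
  case "$hash" in
    00*) break ;;
  esac
  nonce=$((nonce + 1))
done
echo "solved at nonce=$nonce hash=$hash"
```

Verification on the server is a single hash, which is the other half of the asymmetry: cheap to check, expensive to forge at scale.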

Where to put it

You do not want Anubis in front of your whole site. You want it in front of the endpoints that are expensive and uncacheable. From the talk, my shortlist:

  • Search endpoints
  • Facet and filter URLs
  • Pagination tails - ?page=2348 is not a real user
  • Login, register, password reset
  • Spicy forms (contact, anything that triggers an email)
  • Authenticated user flows
  • Anything expensive and uncacheable

Static pages stay fast. The cache stays warm. The PoW cost only applies on the routes where it earns its keep.
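In nginx terms, selective placement is just routing: send the expensive locations through Anubis (which proxies on to Drupal after the challenge passes) and everything else straight to the Drupal upstream. A sketch, with the caveat that the ports and paths are placeholders for your own deployment:

```nginx
upstream anubis { server 127.0.0.1:8923; }  # Anubis, fronting Drupal
upstream drupal { server 127.0.0.1:8080; }  # Drupal itself

server {
    listen 80;

    # Expensive, uncacheable routes go through the PoW challenge.
    location ~ ^/(search|user/(login|register|password)) {
        proxy_pass http://anubis;
    }

    # Everything else stays on the fast path.
    location / {
        proxy_pass http://drupal;
    }
}
```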

But what about Googlebot?

This is the first question every site owner asks, and the answer is good. Anubis ships with allowlists for known good crawlers, matching IP ranges against the published lists from Google, Bing, and the rest. The allowlist is maintained upstream, which means you need to keep Anubis deployed on a reasonable cadence to pull in the latest changes. New legitimate crawlers do show up.

Demo site

You can see Anubis in action on a demo Drupal 11 site I put together: the login form has Anubis in front of it; the homepage does not.

Putting the layers together

None of these defences is a silver bullet on its own. Each layer is cheap at one thing and bad at another, and the trick is matching the layer to the threat.

Layered defence diagram showing requests flowing from clients through Edge/CDN, Anubis, web server, and Drupal, with cost-to-block increasing as you move closer to the application.

Block the cheap traffic at the edge. Block the lazy bots with rate limits and ModSecurity at the web server. Put Anubis in front of the endpoints that are expensive and uncacheable. Let Drupal-native modules handle the application-aware decisions where you actually need to see the user, the form, or the facet state.

Five things to take away
  1. No single layer is enough. Stack them. The edge handles raw volume, the web server handles patterns, and the application is the only thing that can see real user and form context.
  2. Match the protection to the threat. A login form needs different defence to a faceted search results page.
  3. Measure before you defend. Look at your actual traffic. Find your most-hit uncacheable endpoints. Defend those first.
  4. Watch accessibility and SEO. Every challenge you add is a tax on a real user or a real crawler. The cost of false positives is invisible unless you go looking.
  5. Plan for adversarial improvement. Whatever you deploy today, the scrapers get a turn next. Pick defences you can iterate on.

One last thing

You do not need to win the bot war. You just need to make your site a worse target than the next one.

The slides from the talk are on the DrupalSouth schedule page. The recording will be posted here once the DrupalSouth team have edited and uploaded it — check back in a few weeks.

MidCamp - Midwest Drupal Camp: MidCamp 2027 Dates Are Official: Save the Date for April 27-29, 2027

Mark your calendars: MidCamp is returning April 27-29, 2027.

We are excited to officially announce the dates for the next MidCamp, the Midwest's community-driven event for designers, developers, strategists, content creators, marketers, project managers, and open source enthusiasts.

After another incredible year of learning, collaboration, and community, we are already looking ahead to what comes next. And yes, as announced during closing remarks, MidCamp will be returning to DePaul next year just in time for Norah Schrum's birthday, which feels like the perfect excuse to gather this community again. MidCamp 2027 will once again bring together people from across Chicago, the Midwest, and beyond for several days of connection, practical learning, hallway conversations, contribution, and the kind of idea-sharing that keeps open source communities thriving.

Whether you are a longtime MidCamp regular or considering your first trip, MidCamp is built to be welcoming, approachable, and full of opportunities to learn from one another.

What to expect as planning gets underway:

  • Engaging sessions from community speakers
  • Hands-on training and learning opportunities
  • Contribution and collaboration time
  • Social events to reconnect with friends and meet new faces
  • Community-focused experiences that reflect the spirit of MidCamp

Our organizing team is just getting started, and there will be many ways to get involved in the months ahead, from volunteering and sponsoring to submitting sessions and helping shape the event.

As was said during closing remarks: bringing value to others is the best gift, and this community proves that year after year.

For now, the most important thing to do is simple: save the date, bring your friends, and plan to be part of it.

Missing MidCamp already? You can relive this year's sessions by watching the recordings on our MidCamp 2026 YouTube playlist while we get planning underway for next year.

More details will be shared on the MidCamp 2027 event page as planning progresses.

We cannot wait to do it all again with this amazing community.

The Drop Times: Apex AI 2.0 Expands Drupal AI Integration With Multi-Provider Orchestration

Apex AI 2.0 introduces a provider orchestration layer for Drupal’s AI ecosystem, adding routing, observability, audit logging, and governance controls to deployments built on the Drupal AI module. The release supports nine AI providers and more than 24 models, and introduces middleware pipelines, GDPR-related tooling, MCP integration, and operational monitoring features for organisations managing AI-assisted workflows in Drupal.

Dries Buytaert: Acquia builds Drupal funding into its partner program

Today Acquia announced something I'm really proud of. We're calling it the Acquia Fair Trade Initiative.

When an Acquia partner closes a deal, 2% of that deal flows directly to the Drupal Association, credited in the partner's name, to fund Drupal's infrastructure and long-term growth.

Imagine an Acquia partner closes a $100,000 Drupal deal. $2,000 goes to the Drupal Association, attributed to that partner. The 2% comes from Acquia, not from partner margins, so the partner keeps their full revenue and incentives.

The donation is publicly attributed in the Acquia Partner Portal and counts toward the partner's standing in the Drupal Association's Certified Partner Program. It is recognized as financial support for the Drupal Association, separate from non-financial contributions like code, case studies, or community participation.

Most of all, I like that this program is structural. It is not a one-time gift or sponsorship campaign. It is built into the economics of Acquia's partner program, so Drupal's funding grows automatically as Acquia and its partners grow.

Too often, funding for Open Source projects depends on periodic fundraising or individual goodwill. That can work, but it rarely scales in a predictable way.

Open Source sustainability works best when incentives align. With the Fair Trade Initiative, the Drupal Association receives more predictable funding, partners receive recognition through the Drupal Association's Certified Partner Program, and Acquia invests in the long-term health of the Drupal ecosystem its business depends on. And yes, this also creates more incentive for partners to work with Acquia on Drupal projects. Drupal wins, Acquia's partners win, and Acquia wins too. That is what incentive alignment looks like.

I set a reminder for myself to report back in a year, maybe sooner. I'm curious to see what this model can become.

The Drop Times: Drupal Community Invited to Participate in The DropTimes Townhall Discussions

The DropTimes invites wider participation from across the Drupal ecosystem through an upcoming Townhall focused on project updates, community initiatives and ecosystem discussions. The session will create space for contributors, agencies, developers and community members to present ongoing work, exchange feedback and discuss outreach and collaboration efforts across Drupal.

The Drop Times: Drupal AI Summit NYC Opens Today With Focus on Enterprise AI and Open Source Governance

Drupal AI Summit NYC will take place on 14 May 2026 at Convene on Madison Avenue in New York City, bringing together Drupal contributors, enterprise platform teams, digital strategists and AI practitioners for a day centred on the operational use of artificial intelligence inside Drupal ecosystems. Sessions throughout the event will examine governance, migrations, workflow integration, structured content systems and digital sovereignty, with speakers focusing on implementation challenges already emerging in production environments rather than speculative AI adoption.

The Drop Times: Drupal Community Mourns the Loss of Alanna Burke

The Drupal and open-source communities are mourning the passing of Alanna Burke, a longtime advocate, writer, and community leader known for her work in documentation, diversity initiatives, and developer advocacy. Burke contributed to organisations and community efforts including amazee.io, Drupal Diversity and Inclusion, DrupalCon North America, and Meta’s open-source AI documentation initiatives.

The Drop Times: Nick Opris Develops Daily Digest for Drupal AI Initiative Activity

Nick Opris has developed a daily digest system that tracks issue activity, comments, and merge requests across selected Drupal AI Initiative projects. The system is designed to reduce the overhead of monitoring fragmented issue queues and review workflows by generating separate summaries for developers and non-technical stakeholders from the same contribution data. The project reflects growing experimentation within the Drupal ecosystem around AI-assisted coordination and reporting tools.

Jacob Rockowitz: Drupal (AI) Playground: AI ate my work, and I need to be okay with that.

AI ate my work

I've been experimenting with using AI to build Drupal modules for the past few months. Two weeks ago, I released a module called the AI Schema.org JSON-LD module and wrote a blog post about it. The module essentially replaces the primary outcome of my Schema.org Blueprints module, which is to enhance SEO by providing high-quality Schema.org JSON-LD markup. The AI Schema.org JSON-LD module generates Schema.org JSON-LD by having contrib modules work together to call an AI provider with a simple prompt.

This simple module, which I built in four days, supersedes my work on the Schema.org Blueprints module, which I've been working on for four years. I could resent the fact that this new AI-powered module, created using AI, was replacing me and my work, but instead, it's just changing how I view the work I'm doing.

With AI, it's easier for me to explore new ideas and take on more ambitious tasks, while knowing that the code and modules I'm creating remain flexible and extendable by humans and machines. There's a fine line between feeling like AI is eating our work, replacing it, consuming it, or improving it. We should talk about it.

What does AI mean for me?

The most immediate thing I have to think about is how I took something I had previously built, saw how AI could replace it, and had to be open to recognizing the opportunity that AI could do things differently, better, and faster. Everyone needs to lean into that reality with AI: things can get done faster and with more possibilities.

It took me a while to realize that things had changed. I built a few very simple modules to understand how AI coding agents plan, document, build, test, and maintain code. After a few weeks, I began to see the...Read More

LakeDrops Drupal Consulting, Development and Hosting: Ten Months That Changed Everything: An ECA Journey

Jürgen Haas, Tue 12 May 2026 - 15:00

This post tells the story of the ten months that took ECA from Dries Buytaert's private "1% of what it could be" feedback in June/July 2025 to a keynote at Drupal DevDays Athens in April 2026, by way of DriesNote moments in Vienna and Chicago. It opens a 9-post series exploring how UX research with Emma Horrell, Mark Dodgson and Lauri Timmanee, close collaboration with Shibin Das, and a focused build sprint produced in-context customization, a new React-based Workflow Modeler, integrated testing and replay, AI-powered documentation, and a vision for Drupal as an orchestration hub.

Talking Drupal: Talking Drupal #552 - MOSA

Today we are talking about The Midwest Open Source Alliance, What they do, and How they support Drupal with guests April Sides & Tearyne Almendariz. We'll also cover Canvas Field Component as our module of the week.

For show notes visit: https://www.talkingDrupal.com/552

Topics
  • Congratulations to April, winner of the 2026 Aaron Winborn Award!
  • What is MOSA, and what gap in the Drupal ecosystem was it created to fill?
  • How did MOSA get started, and who were the key people behind its formation?
  • MOSA acts as a fiscal sponsor—what does that actually mean in practice for Drupal events and initiatives?
  • What are some of the projects or camps MOSA currently supports?
  • How does MOSA help sustain and grow regional Drupal communities over time?
  • What does membership in MOSA look like, and who should consider getting involved?
  • How does MOSA balance local community focus with broader, national or global Drupal efforts?
  • What are the biggest challenges MOSA faces as a nonprofit supporting open source communities?
  • How has MOSA evolved in recent years, and what's different today compared to when it launched?
  • Looking ahead, what's the long-term vision for MOSA and its role in the Drupal ecosystem?
Resources

Guests

Tearyne Almendariz - nlbcworks.com - NineLivesBlackCat
April Sides - weekbeforenext

Hosts

Nic Laflin - nLighteneddevelopment.com - nicxvan
John Picozzi - epam.com - johnpicozzi

MOTW Correspondent

Martin Anderson-Clutz - mandclu.com mandclu

  • Brief description:
    • Have you ever wanted to place Drupal-rendered fields into your Drupal Canvas templates? There's a module for that.
  • Module name/project name:
    • Canvas Field Component
  • Brief history
    • How old: created in Apr 2026 by me! With some help from a couple of AI models
    • Versions available: 1.0.0, which works with Drupal 11.2 or newer
  • Maintainership
    • Actively maintained
    • Security coverage
    • Test coverage
    • Documentation - a README, but is designed to be narrow in scope
    • Number of open issues: technically 5 open issues, but all marked as fixed
  • Usage stats:
    • 41 sites
  • Module features and usage
    • By design, when using Drupal Canvas to create templates for content types, the idea is to map field values to properties in the template's components
    • That is a new system, however, so site builders may find there are gaps in terms of available mappings for field types they need to use, or may want to draw on mature formatting options such as the responsive image definitions that come with Drupal CMS
    • With the Canvas Field Component module installed, you'll find a new "Field display" option available in your Canvas component library. When you drag that into a Canvas template layout, you can choose which field from the content type you want to display, and the formatter to use
    • That, in turn, will expose all settings for the chosen formatter, as well as any third-party settings available, for example if using Date Augmenters with Smart Date fields
    • Those settings will be reflected in real-time inside the Canvas UI preview, and then on rendered content once the template changes are published
    • This module started as a simple idea, based on my own experience using other UI-based Drupal solutions for laying out content type templates, like Layout Builder or Acquia Site Studio. Over the years, I've come to appreciate the flexibility of being able to place Drupal-rendered fields into templates, so you can mix-and-match existing, robust formatting options with flexible ways of pulling field values into layouts that also include more bespoke elements. Or, just use this as a way to add more layout flexibility to Drupal's default, linear display controls. That's what I do on my own blog, where I use Layout Builder but don't have a single custom layout on the site. It's only used for enhancing the layout of structured content.
    • Full disclosure: I also used the idea for Canvas Field Component as the impetus to venture into vibe coding, inspired by the conversations happening in the AI Learners Club, which listeners will hear more about in an upcoming episode.

UI Suite Initiative website: UI Suite Monthly #35 — Translations Land, Core Proposals Heat Up, and AI Enters the Arena

Overall Summary

Our 35th UI Suite Monthly was one of the most packed sessions yet — a full hour of demos, strategy updates, and an urgent call to action for the community. We covered major progress on the Display Builder (now mid-beta with half its scope completed), a breakthrough demo of symmetric and asymmetric translation support, a roadmap for cleaning up and refocusing UI Patterns this summer, the exciting new ability to use SDC components as form elements, and two critical core proposals — the Design Token API and the Style API — that need community support before the May 15th freeze. We also gave a first look at our AI strategy for display building, with a live demo coming next month. In short: our ecosystem is maturing fast, and the next week is decisive.

The Drop Times: The Rising Cost of AI Automation

The AI industry spent years presenting automation as a cheaper alternative to human labour. In 2026, organisations are discovering that the economics are more complicated. According to Boston Consulting Group, enterprises are expected to increase AI spending significantly this year, even as pressure grows to demonstrate measurable returns. At the same time, infrastructure costs tied to inference workloads, data centres, and continuously running AI systems continue to rise across the industry.

That shift helps explain why Drupal’s AI direction has increasingly focused on operational flexibility rather than “AI-first” positioning. The Drupal AI Initiative’s provider-agnostic architecture allows organisations to move between commercial and open-source models without rebuilding workflows, while Drupal’s structured content model reduces unnecessary token usage by providing cleaner contextual data to language models. Recent work around AI observability, governance, and usage tracking reflects a broader industry movement toward cost predictability, monitoring, and infrastructure control as AI systems transition from experimentation into production environments.

The conversation around AI adoption is therefore beginning to move away from novelty and toward sustainability. Questions around inference costs, infrastructure ownership, governance, auditability, and long-term operational flexibility are increasingly shaping enterprise decision-making. Across the broader ecosystem, the organisations likely to benefit most from AI adoption may not be those deploying the largest models, but those building systems capable of managing automation reliably, transparently, and economically over time.

Editorial note: Editor’s Pick | Vol. 4 | Issue 18 referenced reporting and analysis from a blog post by Michael Anello on beginner Drupal training programmes without sufficient attribution. The newsletter has since been updated with proper credit and source links. The Drop Times regrets the oversight and thanks Michael Anello for bringing the matter to our attention.

Now, let’s move on to the story highlights from the past week.


Additional developments from across the Drupal ecosystem were published during the week. Readers can follow The Drop Times on LinkedIn, Twitter, Bluesky, and Facebook for ongoing updates. The publication is also active on Drupal Slack in the #thedroptimes channel.

Kazima Abbas
Sub-editor
The Drop Times

#! code: Drupal 11: Node Display Mode Preview Form

This is part five of a series of articles looking at HTMX in Drupal. If you are interested in reading more then there will be a list of related articles at the end of this article.

When I was thinking about ideas for demonstrating HTMX in Drupal, I implemented things like infinite scroll, a tabbed interface, and a cascading select form. I was basically recreating some things that I had done with HTMX outside Drupal, inside a Drupal module.

I then had an idea to create something that I might actually find useful in my day to day work as a Drupal developer. This was some way of displaying nodes in different view modes.

In this article we will look at creating a simple form that allows users to enter a node ID and a view mode and see the node rendered in that view mode.

All of the code contained in this article can be found in the Drupal HTMX examples project on GitHub, but here we will go through what the code does and what actions it performs to generate content.   

Just like the other articles on HTMX, I'm going to start with the basics and define the route.

The Route

The route we need here just needs to point the path /htmx-examples/display-mode-preview at our form class.

    drupal_htmx_examples_display_mode_preview_form:
      path: "/htmx-examples/display-mode-preview"
      defaults:
        _form: '\Drupal\drupal_htmx_examples\Form\DisplayModePreviewForm'
        _title: "HTMX Display Mode Preview Form"
      requirements:
        _permission: "access content"

There isn't anything unusual about this route, it's just a regular form route.

Let's create the form for this route.

The Form

The form class has a couple of injected dependencies, which are as follows:

Read more

Dominique De Cooman: From Athens to Rotterdam: Why Drupal AI Needs an "Athena" Release

Some places do not merely offer a view. They give you direction. Athens did that to me. During Drupal Dev Days, I found myself looking at the Acropolis from a distance. The Parthenon was there, standing above the city, glowing with a presence that is difficult to describe if you have not seen it in person.

Saturday, May 9, 2026 - 16:16
