Drupal Planet

The Drop is Always Moving: Drupal 11.2.0 improves backend and frontend performance and scalability, completes the introduction of OOP support for hooks, adds JSON Schema support, includes AVIF image format capability, supports SDC variants, and more. https://www.drupal.org/blog/drupal-11-2-0

Drupal blog: Drupal 11.2.0 is now available

New in Drupal 11.2

The second feature release of Drupal 11 improves backend and frontend performance and scalability, completes the introduction of OOP support for hooks, adds JSON Schema support, includes AVIF image format capability, supports SDC variants, and more.

Extension and site installation is three to four times as fast as in Drupal 11.1.0

Thanks to various optimizations to container rebuilding and the installer, installing Drupal itself or extensions is now three to four times as fast. There are similar improvements when using the user interface, but the difference is most apparent when using Drush. In the video accompanying the original post, Drupal 11.2 installs 60 modules in 5.7 seconds, while Drupal 11.1 takes four times as long to do the same.

.module files are not needed anymore!

Starting with Drupal 11.2, the last APIs that needed .module files can be implemented as object-oriented hooks too! Developers can use #[RemoveHook] attributes to remove hooks, #[ReorderHook] to change hook ordering, and #[Preprocess] attributes to declare object-oriented preprocess hooks. There is no longer any need for a .module file if all of the hooks are on classes in the Hook namespace.
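
As a minimal sketch (module and class names are invented for illustration), a classic hook_form_alter() implementation can now live on a class in the module's Hook namespace:

[code php]
<?php

namespace Drupal\my_module\Hook;

use Drupal\Core\Form\FormStateInterface;
use Drupal\Core\Hook\Attribute\Hook;

/**
 * Hook implementations for the hypothetical my_module module.
 *
 * With all hooks on classes like this one, no my_module.module file
 * is needed.
 */
class MyModuleHooks {

  /**
   * Implements hook_form_alter() as an object-oriented hook.
   */
  #[Hook('form_alter')]
  public function formAlter(array &$form, FormStateInterface $form_state, string $form_id): void {
    // Illustrative change: tag every form with a module-specific class.
    $form['#attributes']['class'][] = 'my-module-form';
  }

}
[/code]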

Built-in JSON Schema generation for content entities

When working with Drupal entities over an API, it is important for developer experience to have a schema for the data structure of a particular entity. This allows clients to know, for instance, what acceptable values may be sent or received for the value and format properties of a formatted text field.

Drupal core can now generate JSON Schemas for content entity types. The typed data, serialization and field APIs have been enhanced to allow field-level schemas to be generated based on their storage configuration.

All field types shipped by core now provide JSON Schemas out of the box through their default normalizers. In addition, all the core typed data plugins provide JSON Schemas as well. This means that all core fields can generate JSON Schemas for their properties out of the box. Additionally, most field types provided by contributed projects or custom modules will generate JSON Schemas automatically so long as they do not provide a custom normalizer or depend on non-core typed data plugins.
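
As a rough illustration (hand-written here, not the exact output of core's normalizers), a generated schema for a formatted text field's properties could look something like this:

[code json]
{
  "$comment": "Illustrative sketch only; real output depends on the field's storage configuration.",
  "type": "object",
  "properties": {
    "value": {
      "type": "string",
      "title": "Text"
    },
    "format": {
      "type": "string",
      "title": "Text format"
    }
  },
  "required": ["value"]
}
[/code]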

Native variant support added to Single-Directory Components

In design systems, a variant allows grouping multiple component properties into a predefined set. The variant can then be used as a shortcut to render a component. Front-end developers could previously define a variant as a prop, but this approach did not support custom titles or descriptions to convey the variant’s purpose.
Now, you can use variants as a property at the root of your component declaration:

[code yaml]
name: Card
variants:
  primary:
    title: Primary
    description: ...
  secondary:
    title: Secondary
    description: ...
props: {}
slots: {}
[/code]

AVIF support added with fallback to WebP

Drupal 11.2 now supports AVIF in our image toolkit. AVIF offers better compression and image quality than WebP, especially for high-resolution images and HDR content. However, not all servers support conversion to AVIF. For that reason, a fallback mechanism was added to convert to WebP when AVIF support is not available.

CSS page weight improvements

Drupal core has long supported component-based CSS organization and conditional loading that depends on page elements. Using this system, the default CSS added to every page by Drupal core has been reduced from around 7 KB to 1 KB. This reduces bandwidth use and improves page rendering times for all but the most highly customized sites running on 11.2.

Navigation improvements

The modern Navigation module now automatically enables the built-in top bar functionality as well. An "overview" link is now shown when a menu item is a container for child items, making it easier to find the right page. Numerous other blockers have also been resolved, and this experimental module is close to becoming stable in a future minor release.

Recipe dependencies are now unpacked

Drupal recipes are special Composer packages designed to bootstrap Drupal projects with necessary dependencies. When a recipe is required, a new Composer plugin "unpacks" it by moving the recipe's dependencies directly into your project's root composer.json and removing the recipe as a project dependency. This makes it possible to update those dependencies later, and means the recipe is no longer an active dependency of the site.
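
For illustration, the flow looks roughly like this (the recipe and module names are invented; exact behavior depends on the unpack plugin):

[code bash]
# Hypothetical example: require a recipe package.
composer require example/blog-recipe

# The plugin "unpacks" it: the recipe's own dependencies (say,
# drupal/pathauto and drupal/metatag) now appear directly in the
# root composer.json, and example/blog-recipe itself is removed
# from the "require" section.
[/code]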

Changes to Update Status module to better support modern workflows

Update Status now checks the status of uninstalled extensions, making your site even more secure.
Updating themes and modules in the Update Status module with authorize.php was not Composer-aware. This could cause various serious problems for sites and site deployment workflows. Therefore, this legacy feature has now been removed. Projects should generally be updated on the command line with Composer. The experimental Update Manager (Automatic Updates) will also be used for this in the future.
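
For example, a single contributed module can be updated with Composer like this (the module name is just an example):

[code bash]
# Update one module plus whichever of its dependencies need to move.
composer update drupal/token --with-dependencies
[/code]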

Cache efficiency improvements

Significant improvements have been made to Drupal's render cache performance due to optimizations in placeholder processing and cache tag invalidation checks. This results in smaller cache entries with fewer cache dependencies in the dynamic page cache, leading to higher cache hit rates and reduced cache storage requirements. The reduction in cache tag lookups reduces round trips to persistent cache storage backends on every HTML response. This applies whether the cache tag backend is using database, memcache, or redis, and leads to slightly faster page rendering performance on both dynamic page cache hits and misses. There is also a significant reduction in queries per second (QPS) for high-traffic sites, which should allow caching servers to handle more traffic with lower hardware requirements.

PHPUnit 11 support added

PHPUnit 11 can now be used for testing. While the default version remains PHPUnit 10, it's possible to update to PHPUnit 11 with the command composer update phpunit/phpunit --with-dependencies. Drupal core testing on PHP 8.4 requires PHPUnit 11 as a minimum.

Core maintainer team updates

Since Drupal 11.1, Emma Horrell and Cristina Chumillas were announced as UX Managers.

Griffyn Heels joined as a provisional Core Leadership Team Facilitator. Juraj Nemec and Drew Webber were added as general core committers, and Pierre Dureau was added as a provisional Frontend Framework Manager. Check out their announcement.

Six people stepped up to become subsystem maintainers! Nic Laflin became a maintainer of the Extension API, Lee Rowlands became a co-maintainer of the Form and Render APIs, Adam Bramley became a maintainer of the Node module, and Jean Valverde became a co-maintainer of Single-Directory Components. Mark Conroy became the maintainer of the Stable 9 theme, and Brad Jones became a co-maintainer of Serialization. Many of the improvements above are thanks to leadership from these new maintainers!

Three subsystem maintainers stepped back. We thank Claudiu Cristea, Christian Fritsch, and Daniel Wehner for their immense contributions.

Finally, there have also been changes in the mentoring coordinator team: James Shields joined, while Mauricio Dinarte, AmyJune Hineline and Tara King stepped back from the role. Many Drupal contributors are thankful to have been mentored by them!

Drupal 10.5 is also available

The next maintenance minor release of Drupal 10 has also been released. Drupal 10 will be supported until December 9, 2026, after the release of Drupal 12. Long-term support for Drupal 10 is managed with a new maintenance minor release every 6 months that receives twelve months of support. This allows the maintenance minor to adapt to evolving dependencies. It also gives more flexibility for sites to move to Drupal 11 when they are ready.

This release schedule allows sites to move from one LTS version to the next if that is the best strategy for their needs. For more information on maintenance minors, read the previous post on the new major release schedule.

Want to get involved?

If you are looking to make the leap from Drupal user to Drupal contributor, or you want to share resources with your team as part of their professional development, there are many opportunities to deepen your Drupal skill set and give back to the community. Check out the Drupal contributor guide. Join us at DrupalCon Vienna in October 2025 or DrupalCon Nara in November 2025 to attend sessions, network, and enjoy mentorship for your first contributions.

eiriksm.dev: Drupal deployment confidence part 2: Composer validate

This is part 2 of a series on having CI pipelines for your Drupal site, and building Drupal Deploy Confidence. In this part of our series, we’re looking at another low-hanging fruit: composer validate. It's quick to set up and can save your team from subtle but frustrating dependency issues.

What does composer validate actually do?

According to the official Composer documentation, this command:

...checks if your composer.json is valid. If a composer.lock exists, it will also check if it is up to date with the composer.json.

That might sound trivial, but in practice it solves a common and annoying problem: the lock file drifting out of sync with composer.json.

You’ve probably seen this warning:

Warning: The lock file is not up to date with the latest changes in composer.json. You may be getting outdated dependencies. Run update to update them.

In a CI context, composer validate will fail with a non-zero exit code if this mismatch is found. That’s a good thing - it prevents developers from merging code that might lead to confusion, instability, or surprise dependency changes.

Why you should care

Here’s why we make composer validate a standard in CI pipelines:

1. It prevents outdated or unexpected dependencies
When the lock file is behind composer.json, your project might not actually install the packages you think it will. That’s a recipe for bugs that are hard to trace, especially in production.

Failing early in CI ensures that your dependencies are exactly what you committed to - nothing more, nothing less.

2. It avoids misleading advice
The warning suggests you can simply run composer update to fix the mismatch. But unless you're deliberately updating all your dependencies, this is often bad advice.

Running composer update without constraints may introduce unintended package upgrades, which might break other parts of the system, introduce regressions, or complicate debugging.

By enforcing a clean lock file, you avoid needing to deal with this situation altogether.

Optional strictness

By default, composer validate also checks things like whether your composer.json declares a name or description. This is particularly useful for packages intended for public reuse. For regular web projects, we usually skip these extra rules - they don’t add much value in a project context. In practice, that means the validate step of our CI pipeline runs this command:

composer validate --no-check-version --no-check-publish
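
In CI this becomes a short job. Here is a minimal sketch for GitHub Actions (the repositories linked at the end of this post contain the real jobs for each provider):

[code yaml]
# Minimal sketch of a composer validate job for GitHub Actions.
name: ci
on: [push]
jobs:
  composer-validate:
    runs-on: ubuntu-latest  # Composer is preinstalled on this image.
    steps:
      - uses: actions/checkout@v4
      - run: composer validate --no-check-version --no-check-publish
[/code]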

To make sure this is in fact literally the same on all platforms, we add a wrapper script to composer.json, so we can simply run 

composer ci:validate
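
A minimal sketch of the wrapper in composer.json (using Composer's built-in @composer alias to call itself):

[code json]
{
    "scripts": {
        "ci:validate": "@composer validate --no-check-version --no-check-publish"
    }
}
[/code]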

Conclusion

If you have followed the parts so far, you should now have two CI jobs in your pipeline. See the links below for both jobs in all three major version control providers:

Direct link to pipeline for Bitbucket: https://bitbucket.org/eirikmorland/drupal-confidence/src/part-2/bitbucket-pipelines.yml
Direct link to pipeline for GitLab: https://gitlab.com/eiriksm/drupal-confidence/-/blob/part-2/.gitlab-ci.yml?ref_type=heads
Direct link to pipeline for GitHub: https://github.com/eiriksm/drupal-confidence/blob/part-2/.github/workflows/test.yml

Drupalize.Me: SEO for Drupal Users: What You Need to Know

When I was writing documentation for Drupal CMS’s SEO Tools recommended add-on (aka “recipe”), I realized that not all Drupal site users may be up-to-date on the essentials of SEO and how Drupal can help you make your site discoverable by your target audiences.

While Drupal has long been a solid foundation for building search-friendly websites, that doesn’t mean every Drupal user knows where to start with SEO.

In this post, I’ll define essential SEO concepts and highlight best practices direct from the documentation I wrote about promoting your site with SEO for the Drupal CMS User Guide.

Whether you're configuring a custom Drupal 11 site or using Drupal CMS, these tips apply to you. All of the tools mentioned below may be installed individually on any Drupal 11 site, or via the SEO Tools recommended add-on in Drupal CMS.

Amber Matz Wed, 06/18/2025 - 14:41

Dries Buytaert: If a note can be public, it should be

A few years ago, I quietly adopted a small principle that has changed how I think about publishing on my website. It's a principle I've been practicing for a while now, though I don't think I've ever written about it publicly.

The principle is: If a note can be public, it should be.

It sounds simple, but this idea has quietly shaped how I treat my personal website.

I was inspired by three overlapping ideas: digital gardens, personal memexes, and "Today I Learned" entries.

Writers like Tom Critchlow, Maggie Appleton, and Andy Matuschak maintain what they call digital gardens. They showed me that a personal website does not have to be a collection of polished blog posts. It can be a living space where ideas can grow and evolve. Think of it more as an ever-evolving notebook than a finished publication, constantly edited and updated over time.

I also learned from Simon Willison, who publishes small, focused Today I Learned (TIL) entries. They are quick, practical notes that capture a moment of learning. They don't aim to be comprehensive; they simply aim to be useful.

And then there is Cory Doctorow. In 2021, he explained his writing and publishing workflow, which he describes as a kind of personal memex. A memex is a way to record your knowledge and ideas over time. While his memex is not public, I found his approach inspiring.

I try to take a lot of notes. For the past four years, my tool of choice has been Obsidian. It is where I jot things down, think things through, and keep track of what I am learning.

In Obsidian, I maintain a Zettelkasten system. It is a method for connecting ideas and building a network of linked thoughts. It is not just about storing information but about helping ideas grow over time.

At some point, I realized that many of my notes don't contain anything private. If they're useful to me, there is a good chance they might be useful to someone else too. That is when I adopted the principle: If a note can be public, it should be.

So a few years ago, I began publishing these kinds of notes on my site. You might have seen examples like Principles for life, PHPUnit tests for Drupal, Brewing coffee with a moka pot when camping, or Setting up password-free SSH logins.

These pages on my website are not blog posts. They are living notes. I update them as I learn more or come back to the topic. To make that clear, each note begins with a short disclaimer that says what it is. Think of it as a digital notebook entry rather than a polished essay.

Now, I do my best to follow my principle, but I fall short more than I care to admit. I have plenty of notes in Obsidian that could have made it to my website but never did.

Often, it's simply inertia. Moving a note from Obsidian to my Drupal site involves a few steps. While not difficult, these steps consume time I don't always have. I tell myself I'll do it later, and then 'later' often never arrives.

Other times, I hold back because I feel insecure. I am often most excited to write when I am learning something new, but that is also when I know the least. What if I misunderstood something? The voice of doubt can be loud enough to keep a note trapped in Obsidian, never making it to my website.

But I keep pushing myself to share in public. I have been learning in the open and sharing in the open for 25 years, and some of the best things in my life have come from that. So I try to remember: if notes can be public, they should be.

Dries Buytaert: Automating alt-text generation with AI

Billions of images on the web lack proper alt-text, making them inaccessible to millions of users who rely on screen readers.

My own website is no exception, so a few weeks ago, I set out to add missing alt-text to about 9,000 images on this website.

What seemed like a simple fix became a multi-step challenge. I needed to evaluate different AI models and decide between local or cloud processing.

To make the web better, a lot of websites need to add alt-text to their images. So I decided to document my progress here on my blog so others can learn from it – or offer suggestions. This third post dives into the technical details of how I built an automated pipeline to generate alt-text at scale.

High-level architecture overview

My automation process follows three steps for each image:

  1. Check if alt-text exists for a given image
  2. Generate new alt-text using AI when missing
  3. Update the database record for the image with the new alt-text

The rest of this post goes into more detail on each of these steps. If you're interested in the implementation, you can find most of the source code on GitHub.

Retrieving image metadata

To systematically process 9,000 images, I needed a structured way to identify which ones were missing alt-text.

Since my site runs on Drupal, I built two REST API endpoints to interact with the image metadata:

  • GET /album/{album-name}/{image-name}/get – Retrieves metadata for an image, including title, alt-text, and caption.
  • PATCH /album/{album-name}/{image-name}/patch – Updates specific fields, such as adding or modifying alt-text.

I've built similar APIs before, including one for my basement's temperature and humidity monitor. That post provides a more detailed breakdown of how I build endpoints like this.

This API uses separate URL paths (/get and /patch) for different operations, rather than using a single resource URL. I'd prefer to follow RESTful principles, but this approach avoids caching problems, including content negotiation issues in CDNs.

Anyway, with the new endpoints in place, fetching metadata for an image is simple:

[code bash]
curl -H "Authorization: test-token" \
  "https://dri.es/album/isle-of-skye-2024/journey-to-skye/get"
[/code]

Every request requires an authorization token. And no, test-token isn't the real one. Without it, anyone could edit my images. While crowdsourced alt-text might be an interesting experiment, it's not one I'm looking to run today.

This request returns a JSON object with image metadata:

[code json]
{
  "title": "Journey to Skye",
  "alt": "",
  "caption": "Each year, Klaas and I pick a new destination for our outdoor adventure. In 2024, we set off for the Isle of Skye in Scotland. This stop was near Glencoe, about halfway between Glasgow and Skye."
}
[/code]

Because the alt field is empty, the next step is to generate a description using AI.

Generating and refining alt-text with AI

In my first post on AI-generated alt-text, I wrote a Python script to compare 10 different local Large Language Models (LLMs). The script uses PyTorch, a widely used machine learning framework for AI research and deep learning. This implementation was a great learning experience.

The original script takes an image as input and generates alt-text using multiple LLMs:

[code bash]
./caption.py journey-to-skye.jpg
{
  "image": "journey-to-skye.jpg",
  "captions": {
    "vit-gpt2": "A man standing on top of a lush green field next to a body of water with a bird perched on top of it.",
    "git": "A man stands in a field next to a body of water with mountains in the background and a mountain in the background.",
    "blip": "This is an image of a person standing in the middle of a field next to a body of water with a mountain in the background.",
    "blip2-opt": "A man standing in the middle of a field with mountains in the background.",
    "blip2-flan": "A man is standing in the middle of a field with a river and mountains behind him on a cloudy day.",
    "minicpm-v": "A person standing alone amidst nature, with mountains and cloudy skies as backdrop.",
    "llava-13b": "A person standing alone in a misty, overgrown field with heather and trees, possibly during autumn or early spring due to the presence of red berries on the trees and the foggy atmosphere.",
    "llava-34b": "A person standing alone on a grassy hillside with a body of water and mountains in the background, under a cloudy sky.",
    "llama32-vision-11b": "A person standing in a field with mountains and water in the background, surrounded by overgrown grass and trees."
  }
}
[/code]

My original plan was to run everything locally for full control, no subscription costs, and optimal privacy. But after testing 10 local LLMs, I changed my mind.

I knew cloud-based models would be better, but wanted to see if local models were good enough for alt-texts. Turns out, they're not quite there. You can read the full comparison, but I gave the best local models a B, while cloud models earned an A.

While local processing aligned with my principles, it compromised the primary goal: creating the best possible descriptions for screen reader users. So I abandoned my local-only approach and decided to use cloud-based LLMs.

To automate alt-text generation for 9,000 images, I needed programmatic access to cloud models rather than relying on their browser-based interfaces — though browser-based AI can be tons of fun.

Instead of expanding my script with cloud LLM support, I switched to Simon Willison's llm tool: https://llm.datasette.io/. llm is a command-line tool and Python library that supports both local and cloud-based models. It takes care of installation, dependencies, API key management, and uploading images. Basically, all the things I didn't want to spend time maintaining myself.

Despite enjoying my PyTorch explorations with vision language models and multimodal encoders, I needed to focus on results. My weekly progress goal meant prioritizing working alt-text over building homegrown inference pipelines.

I also considered you, my readers. If this project inspires you to make your own website more accessible, you're better off with a script built on a well-maintained tool like llm rather than trying to adapt my custom implementation.

Scrapping my PyTorch implementation stung at first, but building on a more mature and active open-source project was far better for me and for you. So I rewrote my script, now in the v2 branch, with the original PyTorch version preserved in v1.

The new version of my script keeps the same simple interface but now supports cloud models like ChatGPT and Claude:

[code bash]
./caption.py journey-to-skye.jpg \
  --model chatgpt-4o-latest claude-3-sonnet \
  --context "Location: Glencoe, Scotland"
{
  "image": "journey-to-skye.jpg",
  "captions": {
    "chatgpt-4o-latest": "A person in a red jacket stands near a small body of water, looking at distant mountains in Glencoe, Scotland.",
    "claude-3-sonnet": "A person stands by a small lake surrounded by grassy hills and mountains under a cloudy sky in the Scottish Highlands."
  }
}
[/code]

The --context parameter improves alt-text quality by adding details the LLM can't determine from the image alone. This might include GPS coordinates, album titles, or even a blog post about the trip.

In this example, I added "Location: Glencoe, Scotland". Notice how ChatGPT-4o mentions Glencoe directly while Claude-3 Sonnet references the Scottish Highlands. This contextual information makes descriptions more accurate and valuable for users. For maximum accuracy, use all available information!

Updating image metadata

With alt-text generated, the final step is updating each image. The PATCH endpoint accepts only the fields that need changing, preserving other metadata:

[code bash]
curl -X PATCH \
  -H "Authorization: test-token" \
  "https://dri.es/album/isle-of-skye-2024/journey-to-skye/patch" \
  -d '{
    "alt": "A person stands by a small lake surrounded by grassy hills and mountains under a cloudy sky in the Scottish Highlands."
  }'
[/code]

That's it. This completes the automation loop for one image. It checks if alt-text is needed, creates a description using a cloud-based LLM, and updates the image if necessary. Now, I just need to do this about 9,000 times.
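
Putting the three steps together, a rough sketch of the per-image loop might look like this (the endpoints and placeholder token are the ones shown above; the model, prompt, and helper function are illustrative):

[code python]
import json
import subprocess
import requests

BASE = "https://dri.es/album"
HEADERS = {"Authorization": "test-token"}  # placeholder, as above

def process_image(album: str, image: str, path: str) -> None:
    # Step 1: fetch metadata and skip images that already have alt-text.
    meta = requests.get(f"{BASE}/{album}/{image}/get", headers=HEADERS).json()
    if meta.get("alt"):
        return

    # Step 2: generate a description with a cloud model via the llm CLI.
    result = subprocess.run(
        ["llm", "-m", "gpt-4o", "-a", path,
         "Write concise alt-text for this image."],
        capture_output=True, text=True, check=True,
    )
    alt = result.stdout.strip()

    # Step 3: write the new alt-text back, leaving other fields untouched.
    requests.patch(
        f"{BASE}/{album}/{image}/patch",
        headers=HEADERS, data=json.dumps({"alt": alt}),
    )

process_image("isle-of-skye-2024", "journey-to-skye", "journey-to-skye.jpg")
[/code]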

Tracking AI-generated alt-text

Before running the script on all 9,000 images, I added a label to the database that marks each alt-text as either human-written or AI-generated. This makes it easy to:

  • Re-run AI-generated descriptions without overwriting human-written ones
  • Upgrade AI-generated alt-text as better models become available

With this approach I can update the AI-generated alt-text when ChatGPT 5 is released. And eventually, it might allow me to return to my original principles: to use a high-quality local LLM trained on public domain data. In the meantime, it helps me make the web more accessible today while building toward a better long-term solution tomorrow.
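
The post doesn't say exactly how that label is stored; purely as a hypothetical illustration, the PATCH payload could carry it alongside the text:

[code bash]
# Hypothetical: "alt_source" is an invented field name for illustration.
curl -X PATCH \
  -H "Authorization: test-token" \
  "https://dri.es/album/isle-of-skye-2024/journey-to-skye/patch" \
  -d '{"alt": "A person stands by a small lake…", "alt_source": "claude-3-sonnet"}'
[/code]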

Next steps

Now that the process is automated for a single image, the last step is to run the script on all 9,000. And honestly, it makes me nervous. The perfectionist in me wants to review every single AI-generated alt-text, but that is just not feasible. So, I have to trust AI. I'll probably write one more post to share the results and what I learned from this final step.

Stay tuned.

The Drop Times: A Look Under the Hood of Lupus Decoupled Drupal

Forget the usual trade-offs of headless architecture. Lupus Decoupled keeps Drupal’s powerful backend features intact while giving developers full control over the frontend. In this exclusive interview, Wolfgang Ziegler of Drunomics breaks down how the system works, why it matters, and what’s coming next for the project that’s redefining decoupled Drupal.

Metadrop: Metadrop April 2025: new releases for Drupal ecosystem, privacy and content editorial experience

In March, Metadrop continued its contributions to the Drupal ecosystem with a particular focus on privacy and content editorial experience. The team released new modules, updated existing ones, added integrations, and assisted clients with some internal issues not directly related to Drupal, while still finding time to do research on AI.

New modules and releases

Iframe Consent

We developed a new module to manage iframe consent, ensuring GDPR-compliant handling of embedded iframes by loading third-party content only after obtaining user consent. This enhances privacy and complements existing modules like EXIF Removal and Youtube Cookies.

Watchdog Statistics 1.0.6

The release of version 1.0.6 added date filters, enabling users to generate reports from previous months and display log statistics for the last month — an…

Talking Drupal: Talking Drupal #507 - International Drupal Federation

In this episode of Talking Drupal, we delve into the International Drupal Federation Initiative with our guest Tim Doyle, CEO of the Drupal Association. We explore the goals, structure, and potential impact of this initiative on the global Drupal community. Additionally, we cover the Modeler API as our module of the week, discussing its functionalities and future potential. Joining the discussion are hosts John Picozzi, Norah Medlin, Nic Laflin, and Martin Anderson-Clutz, who bring their insights and perspectives to the table.

For show notes visit: https://www.talkingDrupal.com/507

Topics
  • Meet the Guest: Tim Doyle
  • Module of the Week: Modeler API
  • Deep Dive into Modeler API
  • Introducing the International Drupal Federation Initiative
  • Governance and Global Impact
  • Challenges and Future Prospects
  • Annual Meeting and Governance Structure
  • Challenges in Crafting Agreements
  • Local Associations and Their Needs
  • Engagement and Communication Strategies
  • Regional Organizations and Governance
  • US-Based Not-for-Profit Focus
  • International Federation and Local Support
  • Potential Risks and Governance Models
  • Implementation Timeline and Costs
  • Legal and Organizational Considerations
  • Community Involvement and Feedback
  • Conclusion and Contact Information
Resources

International Drupal Federation Initiative Recent DA Video Feature on The Drop Times ASBL

Guests

Tim Doyle - Drupal.org Tim D.

Hosts

Nic Laflin - nLighteneddevelopment.com nicxvan
John Picozzi - epam.com johnpicozzi
Norah Medlin - tekNorah

Module of the Week

with Martin Anderson-Clutz - mandclu.com mandclu

Modeler API

The Modeler API provides an API for modules like ECA - Events, Conditions, Actions, Migrate Visualize, AI Agents, and maybe others. The purpose is to allow those modules to utilize modelers like BPMN.iO (and maybe others in the future) to build diagrams constructed of their components (e.g. plugins) and write them back into module-specific config entities.

The Drop Times: Honoring the Balance

Dear Readers,

Something subtle is shifting in the Drupal space. Over the past year, there has been a clear move to consolidate around Drupal CMS as the central message of the project. The intention is understandable. Making Drupal more accessible through low-code and visual tooling lowers the barrier for new users and small teams. But this unified direction, while strategic, risks unintentionally simplifying the perception of what Drupal is and what it is still capable of.

That concern comes into focus when we look at how DrupalCon Atlanta was structured. The sessions and keynotes gave the impression that Drupal CMS is not just a major initiative, but the primary path forward. Yet Drupal has always been more than a product. It has been a framework that adapts to a wide range of use cases, especially in enterprise environments. There was noticeably less visibility for advanced architectures, decoupled implementations, or the tools that support complex digital ecosystems.

This is where the reflections of community members like Jesus Manuel Olivas add useful contrast. His take on DrupalCon highlighted the gap between the official storyline and what many agencies are actively building. For organizations that rely on multi-site strategies, custom front-end frameworks, and API-first infrastructure, the current messaging does not quite reflect their day-to-day reality. These are not theoretical edge cases. They are living, large-scale implementations shaping digital strategy across industries.

Drupal's strength has always come from its flexibility. As the project evolves, it’s important to keep that core identity intact. There is room for Drupal CMS to grow without overshadowing the more complex and less visible work happening across the ecosystem. Honoring that balance is not just a matter of inclusion. It is a matter of relevance.

We acknowledge that there are more stories to share. However, due to selection constraints, we must pause further exploration for now.

To get timely updates, follow us on LinkedIn, Twitter and Facebook. You can also join us on Drupal Slack at #thedroptimes.

Thank you, 
Sincerely 
Alka Elizabeth 
Sub-editor, The DropTimes.

Salsa Digital: Inside the Drupal AI Strategy

Strategy, Not Hype: The Real Plan for AI in Drupal

Let’s be clear, Drupal’s AI Initiative isn’t a rushed bolt-on. It’s a full architectural rethink, designed to embed AI into the platform with a level of governance, flexibility, and transparency that most digital experience platforms can’t touch. The goal isn’t just to keep up with the market, but to set the benchmark for open-source AI in government and enterprise.

Architectural Principles: Human Control, Open Choice

Human-in-the-Loop, Always: AI agents generate content, layouts, and optimisation suggestions, but every change is logged, auditable, and subject to human review or rollback. Public sector and regulated industries demand this; Drupal is making it non-negotiable.

Gbyte blog: Paginate a grouped Drupal view

Solving Pagination for Grouped Content

In Drupal Views, "grouping" refers to the feature found in the Format section's Style options. When configuring a view's format (Table, HTML List, Grid, etc.), you can select one or more fields to group results by. This creates visual sections with headers for each unique field value, organizing content into logical categories.

The Problem

Standard Views pagination in Drupal operates on individual rows, without awareness of the grouping structure. This results in a broken, inconsistent pager with groups split across multiple pages.

The Solution

Views Grouping Field Pager treats each group as a pagination unit rather than each row. This ensures:

  • Groups remain intact across pages
  • The pager navigation works as expected, displaying the correct number of items
Use Cases
  • Project libraries grouped by category
  • News articles organized by date
  • Product catalogs grouped by type
  • Event listings arranged by date
  • Research collections organized by topic
Technical Implementation

The module:

Drupalize.Me: What’s New in the Drupal CMS User Guide: June 2025 Update

Since the launch of the Drupal CMS earlier this year, we’ve been hard at work documenting everything you need to build and maintain a site using this new, streamlined Drupal experience. Our goal is to make the Drupal CMS User Guide a go-to reference for site builders of all experience levels — especially those coming to Drupal for the first time.

In this post, we'll share an update on the current state of the guide, highlight two new sections we’re especially excited about, and show you how your team or company can help move the project forward.

Amber Matz Fri, 06/13/2025 - 16:19

Palantir: Palantir is Now a Top Tier Drupal Certified Partner

demet Fri, 06/13/2025 - 10:53

How partnerships benefit open source projects and their customers

Palantir has been contributing to open source projects for over 25 years, and we’re one of the world's leading contributors to the Drupal project and community. Over the years we’ve channeled our expertise into creating key modules now incorporated into Drupal core and helping organizations across healthcare, government, education, and more with Drupal solutions tailored to their needs.

We’re proud to announce that Palantir was recently accredited as a Top Tier Drupal Certified Partner. There are currently only three other companies in the entire world that have achieved this status based on their contributions to the project and community.

What is a Drupal Certified Partner?

The Certified Partnership program arose within the Drupal community to recognize the efforts of agencies that significantly contribute to the platform. Drupal Certified Partners are service providers that demonstrate three key characteristics:

  • A high level of expertise in Drupal
  • Commitment to contributing to Drupal core
  • Building the Drupal community

The program is overseen by the Drupal Association, which requires firms to demonstrate how they meet these criteria. Organizations looking for Drupal partners can trust Drupal Certified Partnership status to signal that the company leverages best-in-class Drupal expertise.

Organizations that work with Drupal Certified Partners can trust they’re collaborating with experts who not only build with Drupal— they help build Drupal itself. 

What are some of Palantir’s Drupal contributions?

At Palantir, we have a decades-long track record of building, maintaining, extending and promoting Drupal. As a Top Tier Partner, our clients trust us to maximize the platform’s capabilities to deliver best-in-class web applications—learn more about our Drupal consulting services today.

EditTogether

One of our most recent Drupal innovations is EditTogether: a free, multi-user editing solution that allows you to write and edit exactly where you publish. Rather than relying on external tools for collaborative editing, EditTogether allows you to edit, comment, and version control any field on your Drupal site. This eliminates the need for copy-pasting between your editor and publisher, avoids vendor lock-in, and allows you to take advantage of Drupal’s commitment to data sovereignty in your editing workflows.

EditTogether was conceived as part of our client work for a division of a state agency that provides Medicaid services to its residents. Their content workflows were high-friction and inefficient, relying on transferring content between several proprietary tools for writing, editing, and publishing. After auditing the available solutions, we realized there wasn’t a highly customizable, extensible editor in Drupal that could meet our client’s rigorous security requirements—so our team of developers and architects built a rich text editor using ProseMirror tools and connected it to Drupal via the Drupal Text Editor API.

After successfully delivering the EditTogether solution to the client, we’re currently validating EditTogether against other use cases and preparing for wider release back to the community.

Drupal.org modernization

When the Drupal Association needed to modernize their own site, they trusted us to deliver. In 2024, the Promote Drupal team wanted to migrate Drupal.org from Drupal 7 to Drupal 10 to show off Drupal’s most recent capabilities. Drupal.org is the online home of the Drupal community—so they needed to balance modernizing the site for new adopters and evaluators with making sure the 12,000 pages of documentation and 50,000 modules on the site remained continuously available for the entire community.

The Drupal Association wanted to work with a firm that not only knew Drupal inside out, but could oversee an incremental development plan to deliver the new content as fast as possible while still maintaining excellent service for the existing community. They chose Palantir.net to tackle the job: we delivered their needed site modernizations, dramatically improved test coverage, and built a new components-based design system to make adding new content a breeze.

We’re continuing to work with the Drupal Association to enhance their development team with added technical expertise and strategic consulting for the next phases of the modernization plan.

Why are partnerships vital to open source projects?

Open source technologies give developers and communities greater levels of transparency, flexibility, and financial accessibility in their digital projects—but maintaining open source projects can be challenging. Funding is frequently tight, and governance organizations are often overwhelmed with support requests. Open source solutions rely on organizations contributing their time, expertise, and money back into the technology to keep the lights on. However, many companies view open source contributions as simply a marketing opportunity or a “nice to have”—and the first thing to get cut when to-do lists start to get unwieldy.

The Drupal project wanted to incentivize more people to become active open source contributors— and this is where Certified Partnerships come in.

Partnerships reward consistent contribution

In exchange for meeting certain requirements (for example, a specified number of code commits, or donating a certain amount of money to the open source foundation), organizations can become certified partners of their communities—and enjoy a range of associated benefits.

Certified partnerships are also advantageous to organizations looking for agencies with proven expertise in a specific open source technology. When you work with a certified partner, you’re working with a company who actively helps to build and maintain the software you need. Partners have an intimate knowledge of the software they contribute to, and how to make the most of its capabilities. This knowledge and expertise also enable them to build custom solutions that work well with the core technology.

What makes the Drupal Certified Partner Program unique?

The Drupal Certified Partnerships program recognizes companies and organizations that have “gone above and beyond to sustain and grow the Drupal program”.

Drupal Certified Partnerships don’t just recognize code contributions—they also reward active community participation and encouragement of Drupal adoption. Organizations qualify for Certified Partnership through a number of criteria:

  • Contributions and commit credits, weighted towards the aspects of Drupal that are most frequently used and need the most work
  • Proven case studies and success stories of delivering Drupal solutions to clients
  • Annual financial sponsorship of the Drupal Association
  • Event organization, participation, and sponsorship
  • Staff acting in contributor roles, such as board members, mentors, or content moderators
  • Drupal Association members on staff

The partnership program bolsters the thriving community around Drupal. When organizations are looking to partner with service providers with deep technical expertise, they have a robust marketplace of partners to choose from across regional markets and industries. 

Drupal Certified Partners offer solutions you can trust

Open source software (OSS) offers rigorously maintained, transparent solutions, without the hefty price tags of proprietary software— offering organizations of all sizes many incentives to default to open. According to the MIT Technology Review, “The free and open-source software movement is not only alive and well; it has become a keystone of the tech industry.” Indeed, the US Digital Services Playbook encourages federal agencies to evaluate open source solutions at every level of their stack.

When organizations decide to partner with external agencies for their digital transformation and modernization projects, there’s good news and bad news. The good news: numerous agencies build with open source solutions. The bad news: numerous agencies build with open source solutions.

Working with Drupal offers you transparency, flexibility, built-in accessibility, and data sovereignty. Working with a Drupal Certified Partner means you’re working with the people who build Drupal—and who are uniquely positioned to make the most of its powerful capabilities. As Drupal Association CEO Tim Doyle puts it, “Drupal Certified Partners are esteemed agencies that exhibit a profound level of expertise in Drupal, representing a select group of contributors crucial to the vitality and future prosperity of the Drupal ecosystem.”

We’ve been trusted by clients across numerous industries—and by the Drupal Association themselves—to build and deliver highly customized, performant Drupal solutions prioritizing user-centered design, accessibility, and turnkey security.

Learn more about our modular approach to custom Drupal development, or get in touch to discover how we can tailor our expertise to your organization’s needs.

The Drop Times: "We Really Want More People Contributing to Drupal"

Georgia’s Chief Digital and AI Officer, Nikhil Deshpande, shares insights with The DropTimes on how open-source technology and Drupal have transformed state services. In this interview, conducted by sub-editor Alka Elizabeth, Nikhil discusses accessibility, AI integration, community contribution, and the future of digital government platforms.

Drupal blog: Accelerating AI innovation in Drupal

This blog has been re-posted and edited with permission from Dries Buytaert's blog.

Drupal launches its AI Initiative with more than $100,000 in funding and a dedicated team to build AI tools for website creation and content management.
 

Imagine a marketer opening Drupal with a clear goal in mind: launch a campaign for an upcoming event.

They start by uploading a brand kit to Drupal CMS: logos, fonts, and color palette. They define the campaign's audience as mid-sized business owners interested in digital transformation. Then they create a creative guide that outlines the event's goals, key messages, and tone.

With this in place, AI agents within Drupal step in to assist. Drawing from existing content and media, the agents help generate landing pages, each optimized for a specific audience segment. They suggest headlines, refine copy based on the creative guide, create components based on the brand kit, insert a sign-up form, and assemble everything into cohesive, production-ready pages.

Using Drupal's built-in support for the Model Context Protocol (MCP), the AI agents connect to analytics tools and monitor performance. If a page is not converting well, the system makes overnight updates. It might adjust layout, improve clarity, or refine the calls to action.

Every change is tracked. The marketer can review, approve, revert, or adjust anything. They stay in control, even as the system takes on more of the routine work.

Why it matters

AI is changing how websites are built and managed faster than most people expected. The digital experience space is shifting from manual workflows to outcome-driven orchestration. Instead of building everything from scratch, users will set goals, and AI will help deliver results.

This future is not about replacing people. It is about empowering them. It is about freeing up time for creative and strategic work while AI handles the rest. AI will take care of routine tasks, suggest improvements, and respond to real-time feedback. People will remain in control, but supported by powerful new tools that make their work easier and faster.

The path forward won't be perfect. Change is never easy, and there are still many lessons to learn, but standing still isn't an option. If we want AI to head in the right direction, we have to help steer it. We are excited to move fast, but just as committed to doing it thoughtfully and with purpose.

The question is not whether AI will change how we build websites, but how we as a community will shape that change.

A coordinated push forward

Drupal already has a head start in AI. At DrupalCon Barcelona 2024, I showed how Drupal's AI tools help a site creator market wine tours. Since then, we have seen a growing ecosystem of AI modules, active integrations, and a vibrant community pushing boundaries. Today, about 1,000 people are sharing ideas and collaborating in the #ai channel on Drupal Slack.

At DrupalCon Atlanta in March 2025, I shared our latest AI progress. We also brought together key contributors working on AI in Drupal. Our goal was simple: get organized and accelerate progress. After the event, the group committed to align on a shared vision and move forward together.

Since then, this team has been meeting regularly, almost every day. I've been working with the team to help guide the direction. With a lot of hard work behind us, I'm excited to introduce the Drupal AI Initiative.

The Drupal AI Initiative builds on the momentum in our community by bringing structure and shared direction to the work already in progress. By aligning around a common strategy, we can accelerate innovation.

What we're launching today

The Drupal AI Initiative is closely aligned with the broader Drupal CMS strategy, particularly in its focus on making site building both faster and easier. At the same time, this work is not limited to Drupal CMS. It is also intended to benefit people building custom solutions on Drupal Core, as well as those working with alternative distributions of Drupal.

To support this initiative, we are announcing:

  • A clear strategy to guide Drupal's AI vision and priorities (PDF mirror).
  • A Drupal AI leadership team to drive product direction, fundraising, and collaboration across work tracks.
  • A funded delivery team focused on execution, with the equivalent of several full-time roles already committed, including technical leads, UX and project managers, and release coordination.
  • Active work tracks covering areas like AI Core, AI Products, AI Marketing, and AI UX.
  • USD $100,000 in operational funding, contributed by the initiative's founding companies.

For more details, read the full announcement on the Drupal AI Initiative page on Drupal.org.

Founding members and early support

Some of the founding members of the Drupal AI initiative during our launch call on Google Hangouts.

Over the past few months, we've invested hundreds of hours shaping our AI strategy, defining structure, and taking first steps.

I want to thank the founding members of the Drupal AI Initiative. These individuals and organizations played a key role in getting things off the ground. The list is ordered alphabetically by last name to recognize all contributors equally:

These individuals, along with the companies supporting them, have already contributed significant time, energy, and funding. I am grateful for their early commitment.

I also want to thank the staff at the Drupal Association and the Drupal CMS leadership team for their support and collaboration.

What comes next

I'm glad the Drupal AI Initiative is now underway. The Drupal AI strategy is published, the structure is in place, and multiple work tracks are open and moving forward. We'll share more details and updates in the coming weeks.

With every large initiative, we are evolving how we organize, align, and collaborate. The Drupal AI Initiative builds on that progress. As part of that, we are also exploring more ways to recognize and reward meaningful contributions.

We are creating ways for more of you to get involved with Drupal AI. Whether you are a developer, designer, strategist, or sponsor, there is a place for you in this work. If you're part of an agency, we encourage you to step forward and become a Maker. The more agencies that contribute, the more momentum we build.

Update: In addition to the initiative's founding members, Amazee.io already stepped forward with another commitment of USD $20,000 and one full-time contributor. Thank you! This brings the total operating budget to USD $120,000. Please consider joining as well.

AI is changing how websites and digital experiences are built. This is our moment to be part of the change and help define what comes next.

Join us in the #ai-initiative channel on Drupal Slack to get started.
