
Open Source Recalibration in 2026


I didn’t attend FOSDEM this year, but I was following it closely. Something struck me: there was a dedicated devroom called CRA in Practice. Practice. Not "awareness," not "an introduction to," but practice. The open source community has moved from (as the CFP email put it) "oh no, regulations" to "okay, here's how we actually do this." That shift - from anxiety to action - captures something essential about where we are right now.

Last year, I wrote about the split between "professional" and "hobbyist" open source, and how regulatory pressure would accelerate it. That prediction held, but what I didn't anticipate was how quickly the landscape would crystallize. Instead of debating whether the open source ecosystem will evolve, we are watching it happen in real time - with all the contradictions and compromises that entails. What's different from a year ago is that the energy has shifted: from hand-wringing to building.

From Niche to Necessary: Open Source, Digital Public Goods, and Infrastructure

In June of last year I had the privilege of participating in UN Open Source Week 2025. 800+ participants from 74+ countries gathered to discuss how to make open source the default. My takeaway from that event: "Open source at the UN is moving from niche to necessary."

The most energizing conversations weren't about code or transparency. They were about value and trust - specifically, who we're actually trusting when the servers go dark.

Digital Public Infrastructure is having a moment. Brazil's Pix instant payment system now processes over 8 billion transactions monthly. India Stack serves more than a billion people. Tanzania launched Jamii Stack, integrating identity, payments, and data exchange, and they're planning to open-source it so other countries can build on it. Ethiopia's Fayda ID scores among the highest globally on DPI attribute alignment, despite being a low-income country. For too long, DPI conversations defaulted to a binary: either follow India's approach or adopt the GovStack framework. Tanzania, Ethiopia, and others are showing there's more than one way to build this.

The 50-in-5 campaign now includes 30+ countries committed to building DPI. In September, Trinidad and Tobago formally joined the campaign - a country I've been in conversations with about establishing their Open Source Program Office. That's the pattern I'm seeing: countries increasingly recognize that open standards, transparent governance, and local capacity are essential to avoiding lock-in and staying future-ready.

Europe is building real open source capacity too. The Eclipse Foundation and Linux Foundation Europe are investing in infrastructure. Decidim, the participatory democracy platform from Barcelona, is thriving as a Digital Public Good. MultiPoD is connecting European civic tech projects. The French and German governments are building their own open workspaces (La Suite, ZenDiS). And FOSDEM had a devroom called 'Building Europe's Public Digital Infrastructure' - the fact that this has split off from general policy talk is a tell: European digital sovereignty has moved from slide decks to getting its hands dirty.

I’m genuinely worried about the 'digital infrastructure divide.' We're building a world where the wealthy get robust systems and everyone else gets stuck with fragmented pilots that never scale, or becomes dependent on proprietary solutions from vendors who see them as markets rather than partners. The procurement challenge is real: governments want open source but there's no vendor to negotiate with, no support contract to sign, no one to blame when things go wrong. That's where Digital Public Goods come in - as the building blocks that countries can adapt and deploy, with communities and stewards to support them.

The CRA: T-7 months

When I wrote about the Cyber Resilience Act last year, I predicted fear and uncertainty. The fear has given way to a kind of grim determination.

The "Open Source Software Steward" is now an official category under EU law - a recognition that foundations and organizations stewarding open source projects have different responsibilities than commercial manufacturers. That's a genuine win. The Linux Foundation, OpenSSF, and others have created compliance resources. The FOSDEM devroom I mentioned is full of practical tooling discussions: SBOMs, VEX, security-insights.yaml, the OpenSSF Baseline. The community isn't debating whether to comply - it's sharing how.

The next milestone is September 11, 2026 - just seven months away - when mandatory vulnerability reporting kicks in. If you're a manufacturer placing products with digital elements on the EU market, you need to report actively exploited vulnerabilities within 24 hours. This applies to legacy products too. If you shipped something in 2015 that's still in use and a vulnerability emerges, you need to detect it, track it, and report it.

The implication is straightforward: you can't report what you don't know. If you don't have SBOMs for your products, you're flying blind. The practical deadline for SBOM readiness isn't December 2027 when full compliance kicks in - it's now.

For UNICEF Venture Fund companies, this is actually an opportunity to get ahead. Start documenting your dependencies today. Understand which components you're using and where they come from. If you're building on open source (and you are), map your dependency chain now. The CRA explicitly protects non-commercial open source developers, but the companies building products on top of that open source are fully in scope. The good news? The tooling exists, the guidance is clear, and building these practices early makes your product stronger - not just compliant.
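The advice above - "map your dependency chain now" - can start very simply. Here is a minimal sketch of the idea: turning a Python requirements list into a CycloneDX-style component inventory. This is an illustration only, with a deliberately simplified schema; real projects should use dedicated SBOM tooling (e.g. cyclonedx-py or syft), which handle transitive dependencies, hashes, and the full spec.

```python
# Minimal sketch: make a dependency list explicit and machine-readable
# as a CycloneDX-style SBOM. Simplified for illustration - use real
# SBOM tooling (cyclonedx-py, syft) in practice.
import json


def requirements_to_sbom(requirements: list[str]) -> dict:
    components = []
    for line in requirements:
        line = line.strip()
        if not line or line.startswith("#"):
            continue  # skip blanks and comments
        # Handle only the simple "name==version" pin in this sketch.
        name, _, version = line.partition("==")
        components.append({
            "type": "library",
            "name": name,
            "version": version or "unspecified",
            # purl = package URL, the standard component identifier
            "purl": f"pkg:pypi/{name}@{version}" if version else f"pkg:pypi/{name}",
        })
    return {
        "bomFormat": "CycloneDX",
        "specVersion": "1.5",
        "components": components,
    }


if __name__ == "__main__":
    reqs = ["requests==2.31.0", "# dev tools below", "numpy==1.26.4", "flask"]
    print(json.dumps(requirements_to_sbom(reqs), indent=2))
```

Even this toy version surfaces the key question the CRA forces: for every component, do you know its name, version, and origin? Anything "unspecified" is exactly where you're flying blind.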

There's a healthy tension here: companies want the benefits of open source - the speed, the community, the innovation - and they're now being asked to take real responsibility for it. That's probably the right outcome, but it requires building new muscles. The days of "move fast and break things" are over for software that affects people's lives. That's probably good, even if it makes our jobs harder. Dependencies can't be a black box.

Open Source AI: Growing Pains

When DeepSeek R1 dropped in January 2025, it sent shockwaves through the AI industry. Here was a model that matched proprietary performance at a fraction of the cost, released under the MIT license - one of the most permissive licenses we have. The gap between open and closed AI models, which had been 17.5 percentage points on MMLU benchmarks, shrank to just 0.3 points in a single year. Chinese labs began overtaking Meta's Llama in downloads. Mistral even adopted DeepSeek's V3 architecture for their own models. Open source AI is thriving technically - that much is clear.

We’re currently having a massive, pedantic, and entirely necessary fight over what 'Open Source AI' actually means. The OSI put out a definition (OSAID) in 2024, and almost immediately the community started tearing it apart. Bruce Perens - who wrote the original Open Source Definition - agrees the definition is problematic, primarily because it doesn't require open training data. The core criticism: OSAID asks only for "data information" describing what data was used, which critics compare to calling software "open source" while keeping the source code secret.

Amanda Brock of OpenUK put it well: "This simply wouldn't be happening if we hadn't opened an unnecessary can of worms with the OSAID."

From the Digital Public Goods perspective, this matters deeply. The DPG Standard, which guides what we look for in UNICEF Venture Fund investments, requires open licensing and transparency. DPGA goes further than OSAID by requiring data openness too - and that distinction matters for the communities we serve. When the "source" of an AI system includes training data that reflects (or excludes) entire populations, transparency isn't a nice-to-have. It's how you ensure the technology works for everyone, not just those whose data was convenient to collect. The DPG Standard offers a more rigorous framework for evaluating AI openness - one that asks not just "can I use this model?" but "can I understand, reproduce, and trust it?"

Meanwhile, the models that actually pass OSAID validation - Pythia, OLMo, T5 - aren't the ones dominating headlines or download charts. Meta keeps pushing its own interpretation with restrictive licenses. Companies are racing to claim the "open" label for its legitimacy and regulatory advantages. But that makes frameworks like the DPG Standard even more important as a way to hold the line on genuine openness.

Sustainability: From Crisis to New Models

New models for funding open source are gaining traction. The Open Source Pledge asks companies to commit $2,000 per developer they employ to fund open source. Index-based programmatic funding is directing resources to the long tail of critical-but-invisible projects. There's talk of an open source endowment - permanent capital that could provide sustained support. And the DPG Marketplace that we are working on provides a functional path for funding and implementation: it bridges the gap between a prototype and a production-ready DPG, links government officials directly with software solutions, and minimizes procurement delays through integrated financial and logistical frameworks.

A joint statement from major foundations in late 2025 put it clearly: "Billion-dollar ecosystems cannot stand on foundations built of goodwill and unpaid weekends." They're right - and the response is starting to match the rhetoric.

But let's be honest about the gaps that remain. 60% of maintainers are unpaid, up from 46% in 2021. Almost half of npm packages with 1+ million monthly downloads have a single maintainer. FFmpeg, which powers basically every video you watch on the internet, remains underfunded. The Kubernetes External Secrets Operator went on pause. The sole maintainer of libxml2 resigned. The trend on maintainer compensation is still pointing the wrong way.

I've been in this community since my Fedora days in 2016, long enough to know that this isn't a new problem. But the new models are more serious than anything we've had before. In my work at UNICEF, when we invest in companies building open source solutions, we're asking: what's the business model? How will this project survive when the grant ends? We're building toward an answer - supporting companies that can build sustainable businesses around open source, finding the path from "valuable to the world" to "financially viable." We see it working in our portfolio, often through hybrid models that balance openness with business viability. It's not fully solved, but the path is clearer than it was even two years ago.

What I do know is this: if your company's products depend on open source maintainers' labor and you're not contributing back, that's a business risk, not just an ethical issue. You're running a business that works only as long as someone can be pressured or guilt-tripped into working for free. The new funding models offer real ways to participate. The question is no longer "who will pay for open source?" but "which model works for your context?"

Where the Energy Is

Beyond the big structural shifts, two things from FOSDEM gave me particular energy.

FOSDEM had a new devroom this year on "Local-First, sync engines, CRDTs." The local-first movement promises "you own your data, in spite of the cloud." It's a paradigm shift that's been brewing for years but finally seems to be gaining serious traction. Developers are valuing practical autonomy - local control, self-hosting, auditability - over strict licensing debates. In a world of increasing centralization, there's something deeply compelling about technology that puts users back in control.
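For readers new to the CRDT part of that devroom title: the core trick is a data structure whose replicas can be edited independently offline and merged in any order, always converging to the same state. A minimal sketch (not from any particular library) is the grow-only counter, one of the simplest CRDTs:

```python
# A minimal CRDT sketch: a grow-only counter (G-Counter).
# Each replica increments only its own slot; merging takes the
# per-replica maximum, so merges are commutative, associative,
# and idempotent - replicas converge regardless of sync order.
class GCounter:
    def __init__(self, replica_id: str):
        self.replica_id = replica_id
        self.counts: dict[str, int] = {}

    def increment(self, amount: int = 1) -> None:
        # A replica only ever raises its own entry.
        self.counts[self.replica_id] = self.counts.get(self.replica_id, 0) + amount

    def value(self) -> int:
        return sum(self.counts.values())

    def merge(self, other: "GCounter") -> None:
        # Element-wise max: safe to apply repeatedly, in any order.
        for rid, n in other.counts.items():
            self.counts[rid] = max(self.counts.get(rid, 0), n)


if __name__ == "__main__":
    alice, bob = GCounter("alice"), GCounter("bob")
    alice.increment(3)  # edits made offline
    bob.increment(2)    # edits made offline
    alice.merge(bob)
    bob.merge(alice)
    assert alice.value() == bob.value() == 5  # both replicas converge
```

That merge-anywhere property is what lets local-first apps treat the cloud as an optional sync peer rather than the source of truth - which is exactly the "you own your data, in spite of the cloud" promise.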

And in humanitarian open source, my corner of this world, I'm seeing more serious conversation about what it means to build sustainable systems. At UN Open Source Week, I led a breakout on Humanitarian FOSS. We talked about funding gaps, governance challenges, the brain drain from humanitarian tech to better-paying corporate jobs. But we also talked about solutions: foundations investing in for-profit companies building sustainable businesses, partnerships with local governments, grassroots civil society initiatives. The people doing this work aren't waiting for someone else to figure it out.

The Question That Won't Let Me Go

When a UNICEF Venture Fund startup asks me about strategy for 2026, here's what I tell them - and what I'd share with anyone building on open source right now:

The question that won't let me go isn't whether open source survives - it will. It's whether we're building toward genuine openness or just a new form of the dynamics we were supposed to challenge.

But here's what gives me hope when I'm being honest with myself: the conversations I'm having now are different from five years ago. Governments aren't asking "why open source?" but "how do we build the capacity to maintain it?" Startups in our portfolio aren't treating open licensing as a checkbox but as a competitive advantage. The people at FOSDEM weren't debating philosophy - they were sharing SBOMs.

The reckoning is real. So is the recalibration. And from where I sit, watching corporations, governments, academia, IGOs - all players figure out how to make open infrastructure actually work for the places that need it most, I'd rather be here than anywhere else.