Plixroit
How I Built GPU-Accelerated Frosted Glass and Acrylic Blur for .NET MAUI (And Why Every Other Library Gets It Wrong)

Blur on Android in MAUI is kind of a mess. I needed a frosted-glass effect over a scrolling message list and everything I tried either stuttered or just looked bad when the content behind it moved.

So I dug into how all the existing libraries actually work under the hood. Turns out they're all doing the same thing:

  1. Grab a Bitmap of what's behind the view, on the CPU
  2. Run a blur pass in software
  3. Push it back up to the GPU
  4. Do it again next frame

Every. Single. Frame. No wonder it drops frames.

Why not just stay on the GPU the whole time?

Android 12 (API 31) added RenderEffect, which, together with RenderNode (public API since Android 10), lets you hook into the GPU render pipeline directly and apply blur without ever touching the CPU. No bitmap snapshots, no software blur, no memory uploads between frames. The GPU just handles it natively.
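For a sense of how small the platform call actually is, here's a sketch of the Android 12+ path from C#, using the .NET for Android bindings. This is not VitrumMAUI's actual source, just an illustration of the GPU-only mechanism it builds on:

```csharp
// Sketch of the API 31+ GPU blur path via .NET for Android bindings.
// Not VitrumMAUI's real implementation, only the underlying platform call.
using Android.Graphics;
using Android.Views;

static void ApplyGpuBlur(View view, float radius)
{
    if (!OperatingSystem.IsAndroidVersionAtLeast(31))
        return; // RenderEffect requires Android 12 (API 31)

    // One GPU-side effect object; no bitmaps, no software blur pass.
    RenderEffect blur = RenderEffect.CreateBlurEffect(
        radius, radius, Shader.TileMode.Clamp!);

    // The effect attaches to the view's RenderNode and is evaluated by
    // the hardware renderer every frame, with no CPU round-trip.
    view.SetRenderEffect(blur);
}
```

The whole trick is that the effect is described once and then lives entirely inside the render pipeline, instead of being recomputed from a bitmap snapshot per frame.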

That's the whole idea behind VitrumMAUI. Keep it on the GPU and get out of the way.

On a mid-range device blurring over a fast scroll, the difference is obvious. No jank.

Two views, that's it

BlurHostView wraps whatever content sits in the background. BlurConsumerView goes on top and shows the blurred result with an optional tint.

<vitrum:BlurHostView>
    <CollectionView ... />
</vitrum:BlurHostView>

<vitrum:BlurConsumerView
    BlurRadius="20"
    TintColor="#33FFFFFF" />

No bitmaps to manage. No invalidation loops. No hacks.
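To make the wiring concrete, here's one way a full page could look, with the consumer as a sibling of the host inside a Grid. The `vitrum` xmlns value and the bindings are assumptions for illustration; check the package README for the exact namespace:

```xml
<!-- Sketch of a typical layout. The vitrum xmlns URI below is a guess.
     Note the consumer sits as a SIBLING of the host, not inside it. -->
<Grid xmlns:vitrum="clr-namespace:VitrumMAUI;assembly=VitrumMAUI">

    <!-- Background content that will be blurred -->
    <vitrum:BlurHostView>
        <CollectionView ItemsSource="{Binding Messages}" />
    </vitrum:BlurHostView>

    <!-- Frosted overlay anchored to the bottom, like a chat sheet -->
    <vitrum:BlurConsumerView
        VerticalOptions="End"
        HeightRequest="120"
        BlurRadius="20"
        TintColor="#33FFFFFF" />
</Grid>
```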

Install

dotnet add package VitrumMAUI --version 1.0.8

Or in your csproj:

<PackageReference Include="VitrumMAUI" Version="1.0.8" />

What's supported

| Android version | What you get |
| --- | --- |
| 12+ (API 31+) | Full GPU blur via RenderEffect |
| 9 to 11 (API 28-30) | Tint-only fallback |
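If you want to branch on this yourself, say, to add a heavier tint on older devices, a plain runtime check is enough. `CanBlur` here is a hypothetical helper for illustration, not a VitrumMAUI API:

```csharp
// Sketch: deciding between full blur and the tint-only fallback.
// OperatingSystem.IsAndroidVersionAtLeast is standard .NET;
// CanBlur is a hypothetical helper, not part of VitrumMAUI.
static bool CanBlur()
{
#if ANDROID
    // RenderEffect shipped in Android 12 (API 31).
    return OperatingSystem.IsAndroidVersionAtLeast(31);
#else
    return false;
#endif
}
```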

iOS and macOS are not included. Apple's own blur APIs are solid and already built into the platform, so there's no point reinventing that.

Why I actually built this

I was working on a chat UI in my own app. The bottom sheet sits over the message list and I wanted it blurred. Everything I tried was either too slow or required me to wire up manual invalidation, which felt wrong. I wanted something that just works without me babysitting it.

So I built it. It's one day old, rough around the edges, but the core works well.

GitHub: VitrumMAUI
NuGet: VitrumMAUI 1.0.8

MIT licensed. If you run into issues or have ideas, open an issue.

Top comments (2)

PEACEBINFLOW

The pattern of "CPU round-trip per frame" is one of those things that's so common in mobile UI that developers stop questioning it. Every blur library does it. Every screenshot-based overlay does it. It works fine on a static background in a demo, and then you put it over a scrolling list and suddenly you're dropping frames and nobody can quite articulate why—because the architecture looks reasonable on paper.

What I find interesting is that Android 12 added RenderEffect three years ago and the ecosystem mostly shrugged. The API is there. It's documented. But the inertia of the CPU-based approach is so strong that even new libraries default to the bitmap snapshot pattern. It's not a knowledge gap exactly—it's more that the older approach is what every tutorial shows, so it's what gets copied. The GPU path requires thinking about the render pipeline differently, and that mental model shift is the real adoption bottleneck, not the API availability.

The honest scope limitation to Android 12+ only is refreshing. No attempt to paper over older devices with a software fallback that performs terribly anyway. A tint-only fallback for Android 9-11 is honest about what those devices can actually handle. Better to degrade gracefully to something that runs at 60fps than to ship a blur that runs at 12fps.

Makes me wonder how many other Android APIs are in this same situation—genuinely useful, shipped years ago, but buried under layers of outdated Stack Overflow answers and tutorial code that predate them. RenderEffect feels like one of those. What was the hardest part of wiring into the render pipeline—was it the API itself or just the lack of examples to work from?

Plixroit

Honestly, the API itself was the easy part. RenderNode + RenderEffect.createBlurEffect is a few lines once you find them. The hard part was everything around it.

There are very few C# examples for hooking custom Android views into a ViewHandler at the render pipeline level. Most MAUI tutorials stop at "wrap an existing control", and going deeper to override DispatchDraw and manage your own RenderNode capture is uncharted territory. I ended up reading Telegram for Android's source to understand how they do their frosted bars.

The other painful discovery was a SIGSEGV in libhwui.so whenever a BlurConsumerView ended up inside the BlurHostView's subtree, even nested deep inside a child. The previous frame's cached RenderNode holds a stale reference to the blur node, so the next capture embeds it back into itself: infinite recursion in prepareTreeImpl, and the process dies with no managed exception at all. It took a while to figure out that the rule has to be "consumers must be siblings, never descendants" and to enforce it in the README.
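For anyone hitting the same crash: the rule can be enforced with a cheap parent walk at attach time. This is a sketch, not the library's actual guard:

```csharp
// Sketch of enforcing "consumers must be siblings, never descendants":
// walk up from the consumer and fail fast if a BlurHostView is an
// ancestor. Element.Parent is standard MAUI; the exception text is mine.
static void ThrowIfNestedInHost(Microsoft.Maui.Controls.Element consumer)
{
    for (var p = consumer.Parent; p is not null; p = p.Parent)
    {
        if (p is BlurHostView)
            throw new InvalidOperationException(
                "BlurConsumerView must be a sibling of BlurHostView, " +
                "never a descendant; nesting recurses the RenderNode.");
    }
}
```

A managed exception at attach time is a much friendlier failure mode than a native SIGSEGV two frames later.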

You're right that the bigger problem is the mental model shift, not the API. Hoping a working open source reference makes the next person's path shorter than mine was.