DEV Community

Riccardo Gregori


Vibe coded Power Pages SPA: unfiltered

Today I tried something that feels deeply unnatural to me.

Not just uncomfortable—almost wrong.

Partly to challenge my own mental limits and break through some of my built-in biases. Partly to answer a question I’d been avoiding for a while: how far can I really go if I stop controlling everything and just… trust the technology?

I don't like blog posts describing vibe coding as "type something, do something else, and everything magically works". So I wanted to try it myself, reporting everything I faced along the way. The good, and the bad.

So I set myself a rule: no shortcuts, no safety nets.

I tried to build a Power Pages SPA portal from scratch—without using PACX, and without any of the templates or tools I’ve spent years building to make my life easier.

Just an empty folder in VS Code.

And two simple plugins provided by Microsoft.

Enabled plugins


💡 Let's start with the idea

I’ve never been particularly fond of our corporate welfare portal, so I decided to ask my trusted friend, M365 Copilot:

Help me draft the requirements for a corporate welfare website.
Be synthetic, it's for a demo. Don't design approval workflow or too fancy features. Draft everything in markdown.

And the result is:

# 📌 Requirements – Corporate Welfare Portal (Demo)

## 1. Objective

The portal aims to provide a simple solution that allows employees to:

- View their available welfare budget  
- Access a catalog of available services  
- Request benefits (vouchers or reimbursements)  
- Monitor the status of their requests  

## 2. Actors

- **Employee**
- **HR / Administration**

## 3. Functional Requirements

### 3.1 User Access and Profile

- Access via corporate login (SSO)  
- User profile view including:
  - name and role  
  - available welfare budget  

### 3.2 Budget Management

- Display of:
  - available budget  
  - used budget  
- HR capability to assign and update budget  

### 3.3 Welfare Catalog

- Display available services:
  - organized by categories (e.g., health, education, leisure)  
- Search and filtering capabilities  
- Service detail view including:
  - description  
  - eligible amount  
  - usage conditions  

### 3.4 Benefit Request

#### Voucher

- Service selection  
- Voucher generation  
- Automatic budget deduction  

#### Reimbursement

- Submission of reimbursement request  
- Upload of supporting documents (e.g., receipts)  
- Request status tracking  

### 3.5 Operations History

- List of user requests including:
  - date  
  - type  
  - amount  
  - status  
- Detailed view of individual requests  

### 3.6 HR Back Office

- User management:
  - budget assignment  
- Catalog management:
  - create and update services  
- View user requests  

## 4. Non-Functional Requirements

### 4.1 Security

- Authenticated access (SSO)  
- Personal data protection (GDPR compliance)  

### 4.2 Usability

- Simple and intuitive user interface  
- Accessible from desktop and mobile devices  

### 4.3 Performance

- Fast response time for standard operations (within a few seconds)  

### 4.4 Traceability

- Logging of user operations  
- History of submitted requests  

## 5. Open Points (for Demo / Future Evolution)

- Integration with HR systems  
- Automatic handling of fiscal constraints  
- Catalog extension (new services)  
- Personalization based on employee segments  

## ✅ Demo Implementation Suggestion

- **Frontend:** Power Pages  
- **Backend:** Dataverse  
  - Tables: Users, Budgets, Services, Requests  
- **Automation (optional):** Power Automate for notifications  


More than enough for a simple demo.
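Before handing everything to Copilot, it helps to picture the data model these requirements imply. Here is a minimal sketch in TypeScript (my own names, not what Copilot later generated; the budget rule encodes the "automatic budget deduction" of §3.4):

```typescript
// Hypothetical shapes for the tables the spec suggests (Budgets,
// Services, Requests). All names are illustrative assumptions.
interface WelfareBudget {
  employeeId: string;
  totalAmount: number; // assigned by HR (§3.2)
  usedAmount: number;  // grows with every voucher/reimbursement
}

interface WelfareService {
  id: string;
  name: string;
  category: "health" | "education" | "leisure"; // §3.3 examples
  eligibleAmount: number;
}

interface WelfareRequest {
  id: string;
  serviceId: string;
  type: "voucher" | "reimbursement";
  amount: number;
  status: "submitted" | "approved" | "rejected";
}

// The "automatic budget deduction" rule of §3.4 as pure functions
function remainingBudget(b: WelfareBudget): number {
  return b.totalAmount - b.usedAmount;
}

function canRequest(b: WelfareBudget, amount: number): boolean {
  return amount > 0 && amount <= remainingBudget(b);
}
```

Four tables, one invariant. Keeping this picture in mind makes it much easier to judge what Copilot builds next.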

To start, I've:

  • created a local folder called neronotte01-site, which will be the root of our workspace
  • opened the folder in VS Code
  • copied the text above into a requirements.md file under the docs folder of my repo.

Now my repo looks like this:

- 📂 neronotte01-site
  - 📂 docs
    - 📄 requirements.md

Now let's start by asking Copilot:

Starting from the attached requirement document, design
and implement a power page portal with a modern and
effective UI. Put the source code in the /portal subfolder

Start from a requirement document

Note: I prefer using Claude Sonnet 4.6 for this kind of task, so I started with that model.

Copilot starts by building a plan:

Copilot builds a plan

And then goes coding, following the plan. In this first draft it designs the site UI and wires it up with mock data, which is fine as a start.

After a couple of minutes the build is completed:

Site finalized

And this is the working result:

Site 1

Site 2

Pretty good result so far, considering we started from a simple spec document.

Now let's try to build the backend and manage the authenticated flow.


💻 Let's build the backend

Let's move forward by asking Copilot to create the tables:

I asked to design and build the backend

After a bit of reasoning, Copilot starts asking a couple of questions:

  • What the publisher prefix will be: nn
  • What to call the solution that will contain the tables: portal

Copilot asks for publisher

Copilot asks for solution name

Now we fought a bit around Python structure. I'm not (and I won't be) a Python developer. I have pip installed on my machine, but just to run the occasional script... so Copilot tried to initialize a virtual environment to launch the commands in.

In this step there was a lot of trial and error.

Copilot tries and fails...

But let's trust the process... for now.
After a few more tries, we got stuck on an authentication issue towards Dataverse.

Auth failed 1

Auth failed 2

Auth failed 3

Auth failed 4

After the analysis, Copilot came up with another solution:

The device code flow is blocked by Conditional Access policies. 
Since `az` CLI is already authenticated, let me use it 
to get a token directly for the Dataverse API instead.

And... it worked!

`az` CLI can get a Dataverse token. 
Now I'll write a comprehensive Python script that uses it 
to create the publisher, solution, and all 3 tables via 
the Dataverse Web API.
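The workaround can be sketched as follows (TypeScript rather than the Python Copilot actually wrote; it simply shells out to the already-authenticated `az` CLI):

```typescript
import { execFileSync } from "node:child_process";

// Shape of the JSON printed by `az account get-access-token`
interface AzToken {
  accessToken: string;
  expiresOn: string;
}

// Extract the two fields we care about from the CLI output
function parseAzToken(json: string): AzToken {
  const t = JSON.parse(json);
  return { accessToken: t.accessToken, expiresOn: t.expiresOn };
}

// Ask the already-authenticated az CLI for a Dataverse-scoped token,
// sidestepping the Conditional Access block on the device code flow
function getDataverseToken(orgUrl: string): AzToken {
  const out = execFileSync("az", [
    "account", "get-access-token",
    "--resource", orgUrl,
    "--output", "json",
  ]).toString();
  return parseAzToken(out);
}
```

The token then goes into the `Authorization: Bearer ...` header of every Web API call.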

Copilot did 3 things:

  1. added a .env file at the root of my workspace
# .env
DATAVERSE_URL=https://neronotte01.crm4.dynamics.com/
TENANT_ID=31c42b90-d584-4ff8-b5b9-1c65de456516
MCP_CLIENT_ID=***REDACTED***
SOLUTION_NAME=Portal
PUBLISHER_PREFIX=nn
PAC_AUTH_PROFILE=neronotte01
CLIENT_ID=***REDACTED***
  2. created a folder called scripts in my repo
  3. added 2 files to that folder: auth.py and create_tables.py. The second contains the actual data model setup:

The generated script
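Under the hood, create_tables.py is a series of Dataverse Web API calls: one POST to `EntityDefinitions` per table. A hedged sketch of the kind of payload it builds, in TypeScript for consistency with the rest of this post (the real script is Python, and the field details here are illustrative):

```typescript
// Builds an EntityMetadata payload for
// POST {org}/api/data/v9.2/EntityDefinitions (a sketch, not the
// actual generated code)
function tableDefinition(prefix: string, name: string, displayName: string) {
  const label = (text: string) => ({
    "@odata.type": "Microsoft.Dynamics.CRM.Label",
    LocalizedLabels: [{ Label: text, LanguageCode: 1033 }],
  });
  return {
    "@odata.type": "Microsoft.Dynamics.CRM.EntityMetadata",
    SchemaName: `${prefix}_${name}`,
    DisplayName: label(displayName),
    DisplayCollectionName: label(`${displayName}s`),
    OwnershipType: "UserOwned",
    HasNotes: false,
    HasActivities: false,
    Attributes: [
      {
        // Every custom table needs a primary name column
        "@odata.type": "Microsoft.Dynamics.CRM.StringAttributeMetadata",
        SchemaName: `${prefix}_name`,
        IsPrimaryName: true,
        MaxLength: 100,
        DisplayName: label("Name"),
      },
    ],
  };
}
```

One payload per table (plus a publisher and a solution record first), and the backend exists.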

Then it asked to run the script:

Run powershell script?

Unfortunately the run failed ❌. Authentication issue, again.

Long story short, I had to provide a client secret to make it work. It updated the .env file, adding CLIENT_SECRET= at the bottom:

# .env
DATAVERSE_URL=https://neronotte01.crm4.dynamics.com/
TENANT_ID=31c42b90-d584-4ff8-b5b9-1c65de456516
MCP_CLIENT_ID=***REDACTED***
SOLUTION_NAME=Portal
PUBLISHER_PREFIX=nn
PAC_AUTH_PROFILE=neronotte01
CLIENT_ID=***REDACTED***
CLIENT_SECRET=***REDACTED***

It also updated the script with a few changes to leverage that client secret.

A bit more trial and error, and we got the result:

Tables created

🤔 My personal opinion on this step

With PACX it would have been WAAAAAAAY more straightforward. I do the same step almost daily (asking GH Copilot: "start from a requirement document and draft the datamodel, then build the PACX script to create it"), and it works like a charm. PACX wins drastically on this one.


🔗 Let's wire things up

Third prompt to Copilot:

Now replace mock data with actual calls to 
the Dataverse tables you just created

It started by drafting a 7 step plan:

Wiring plan

  1. Install @azure/msal-browser
  2. Create MSAL config + auth context
  3. Create Dataverse service layer
  4. Wrap main.tsx with auth
  5. Update Navbar to use real user
  6. Update all 5 pages to use Dataverse
  7. Build and verify
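The service layer in step 3 boils down to fetch calls carrying a bearer token and OData query strings. A minimal sketch, with entity-set names that are my guesses rather than Copilot's actual output:

```typescript
// Build an OData query URL for the Dataverse Web API
// (filter values are passed through as-is in this sketch)
function buildQueryUrl(
  orgUrl: string,
  entitySet: string,
  opts: { select?: string[]; filter?: string } = {}
): string {
  const parts: string[] = [];
  if (opts.select && opts.select.length) parts.push(`$select=${opts.select.join(",")}`);
  if (opts.filter) parts.push(`$filter=${opts.filter}`);
  const qs = parts.join("&");
  return `${orgUrl}/api/data/v9.2/${entitySet}${qs ? "?" + qs : ""}`;
}

// Generic fetch wrapper: MSAL access token in, typed rows out
async function getRows<T>(url: string, accessToken: string): Promise<T[]> {
  const res = await fetch(url, {
    headers: {
      Authorization: `Bearer ${accessToken}`,
      "OData-MaxVersion": "4.0",
      "OData-Version": "4.0",
      Accept: "application/json",
    },
  });
  if (!res.ok) throw new Error(`Dataverse call failed: ${res.status}`);
  return (await res.json()).value as T[];
}
```

Every page then becomes a call like `getRows(buildQueryUrl(org, "nn_services", { select: ["nn_name"] }), token)`.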

It runs for 5-6 minutes, and then...

Build completed

It asks me to configure the app registration properly on Azure. Once done, I launch the portal and it looks like this:

My Welfare Login

But... unfortunately the login is not working. If I click the "Sign in with Microsoft" button, it opens a popup where I can pick the user (riccardo@neronotte1.onmicrosoft.com). Then it moves to the next screen, reopening the login page inside the popup. And there I'm stuck.

I report the same to Copilot and we try, collaboratively, to fix it.

After a while, it came up with a solution:

The problem is that after authentication, Microsoft redirects 
the popup back to window.location.origin — which loads the 
full React app inside the popup, showing the login screen again.

The fix is a dedicated blank redirect page for the popup, 
which is the MSAL-recommended pattern for SPA popup flows.

...

Now you need to add http://localhost:5173/blank.html as a 
redirect URI in the Azure App Registration. Go to:

Azure Portal → App registrations → 6beaf75f-4793-4b3f-b364-14b9647556f2 
→ Authentication → Single-page application 
→ Add URI → http://localhost:5173/blank.html
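The pattern can be sketched like this (plain objects instead of real @azure/msal-browser imports, to keep the snippet self-contained; in the app these are passed to `PublicClientApplication` and `loginPopup`, and all IDs are placeholders):

```typescript
// msal-browser configuration sketch; in the app this object is passed
// to `new PublicClientApplication(...)`
function msalConfig(clientId: string, tenantId: string, origin: string) {
  return {
    auth: {
      clientId,
      authority: `https://login.microsoftonline.com/${tenantId}`,
      redirectUri: origin, // the main SPA
    },
  };
}

// The fix: give `loginPopup` its own redirect URI, pointing at a page
// that does NOT boot the React app, so the popup can close itself
function popupRequest(origin: string, orgUrl: string) {
  return {
    scopes: [`${orgUrl}/.default`],
    redirectUri: `${origin}/blank.html`,
  };
}
```

blank.html itself can be an (almost) empty file served from /public; both redirect URIs must be registered on the App Registration.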

OK. I do it, try again and...

Still stuck

I'm still stuck. Let's go to copilot again:

Now it goes to blank.html, but remains opened (see attached screenshot).

But this time I decided to change model, from Claude Sonnet 4.6 to GPT 5.4 (sometimes it is more effective).

It started working again for a while and...

Fixed

Boom, solution found. It was a problem with how the App Registration was configured. Now the portal opens properly:

Portal opened

But, of course, without data.

🤔 My personal opinion on this step

On Power Pages SPAs you have to rebuild the auth flow from scratch every time. To avoid useless token burning, do it once until it works properly, then extract the AuthContext.tsx that works and reuse it across all your sites.

I did exactly that, creating my own Power Pages template that I reuse every time.


✍🏻 Create sample data

I don't want to do it manually, so let's leverage Dataverse MCP server for this purpose.

First of all we need to enable it.

  1. Access https://admin.powerplatform.microsoft.com/
  2. Go to Manage > Environments
  3. Select the environment that contains the portal
  4. Click on Settings > Product > Features
  5. Scroll until you find Dataverse Model Context Protocol

Dataverse Model Context Protocol

  6. in the Step 2: Add MCP clients to allow list section, select Advanced Settings
  7. in the Allowed MCP Clients list, ensure Microsoft GitHub Copilot App is enabled:

Microsoft GitHub Copilot App

Now let's setup the MCP Server in VSCode

  1. In VS Code, press CTRL+SHIFT+P to open the command palette
  2. Select MCP: Add Server...
  3. Select HTTP as the server type
  4. Add your MCP URL: https://{environment}.crm4.dynamics.com/api/mcp
  5. Select Workspace as the place to save the MCP config (it will create a local .vscode/mcp.json file with the MCP configuration)
  6. Enter Dataverse as the MCP name

The mcp.json file will be something like:

{
  "servers": {
    "dataverse": {
      "url": "https://{environment}.crm4.dynamics.com/api/mcp",
      "type": "http"
    }
  },
  "inputs": []
}

Now we're ready. Let's ask Copilot:

Now create sample data into the dataverse tables 
you've created, to show something into the app.

It iterates for a couple of minutes and then...

Data created

I refreshed the browser, and the data appeared.

Now we see the data in the app

And if I try to create a new request... it works! The request is created and saved in Dataverse!

New request saved

Record in Dataverse
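For reference, creating such a request row is a single POST to the Web API. A hedged sketch of the payload (column and entity-set names are assumptions derived from the nn prefix, not the actual generated schema):

```typescript
// Payload sketch for POST {org}/api/data/v9.2/nn_requests
function newRequestPayload(
  serviceId: string,
  amount: number,
  type: "voucher" | "reimbursement"
) {
  return {
    nn_amount: amount,
    nn_type: type,
    nn_status: "submitted",
    // Lookups are set via @odata.bind, not by writing the GUID directly
    "nn_Service@odata.bind": `/nn_services(${serviceId})`,
  };
}
```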


📦 One more to go: deploy

So far we've worked locally. Let's deploy the site to the dev environment. Let's type in the Copilot chat:

Now publish the site to dataverse

And it started thinking...

Copilot Publishing the site

It tried to upload the site, but got stuck because JS attachments are blocked, as described here.

Blocked attachments

Fortunately it stopped here, because in the /dist subfolder I can clearly see the issue I described in my previous article, so I decided to manually update the vite.config.ts before moving forward.

I updated my vite.config.ts as follows:

import { defineConfig } from "vite";
import react from "@vitejs/plugin-react";

export default defineConfig({
  plugins: [react()],
  build: {
    outDir: "dist",
    rollupOptions: {
      output: {
        entryFileNames: `assets/[name].js`,
        chunkFileNames: `assets/[name].js`,
        assetFileNames: `assets/[name].[ext]`,
        manualChunks: {
          router: ["react-router-dom"],
          msal: ["@azure/msal-browser"],
        },
      },
    },
  },
});

Note: it's using @azure/msal-browser, even though, as stated here, you should use ADAL.js... but it's working for now, so let's leave it as is.

I typed Yes, remove js from blockedattachments and retry in the GitHub Copilot chat, and it ran the following:

$blocked = 'ade;adp;app;asa;ashx;asmx;asp;bas;bat;cdx;cer;chm;class;cmd;com;config;cpl;crt;csh;dll;exe;fxp;hlp;hta;htr;htw;ida;idc;idq;inf;ins;isp;its;jar;jse;ksh;lnk;mad;maf;mag;mam;maq;mar;mas;mat;mau;mav;maw;mda;mdb;mde;mdt;mdw;mdz;msc;msh;msh1;msh1xml;msh2;msh2xml;mshxml;msi;msp;mst;ops;pcd;pif;prf;prg;printer;pst;reg;rem;scf;scr;sct;shb;shs;shtm;shtml;soap;stm;tmp;url;vb;vbe;vbs;vsmacros;vss;vst;vsw;ws;wsc;wsf;wsh'; 

pac env update-settings --name blockedattachments --value "$blocked"

Of course it worked. Once done, it ran pac pages upload-code-site again to upload the site.

Next step is (as per GH Copilot suggestion):

  • Activate/provision the site if it is not already live
  • Test the deployed site end-to-end in the browser
  • Clear/restart the site cache so the latest changes appear immediately

In fact, if I go to make.powerpages.microsoft.com, I can clearly see the site uploaded and disabled, ready to be reactivated.

MyWelfare Portal

Note: since this was the first site created in this environment, once I got into https://make.powerpages.microsoft.com/ I had to create a new, empty website to move forward; otherwise the portal wouldn't take me to the page where I could reactivate the old one. I think it's a sort of bug... hope Microsoft fixes it soon.

The reactivation took a while (almost 30 minutes), but... once done...

Wrong portal?!


Let me ask Copilot:

The published portal is showing something completely different, 
see attached screenshot.

At first, Copilot's hypothesis was a mismatch between the site I had created manually and the one it generated... but of course that wasn't the problem. First miss.

Then it tried other things... let me skip the details. Inconclusive.

Then I spotted something in the admin center:

Standard data model

It had created the site with the Standard data model, while Power Pages SPA requires the Enhanced data model.

I told it this explicitly:

The problem is the site you deployed is with "Standard" data model, 
while power page SPA works with the Enhanced data model. 
I think you have to re-create the site completely.

It iterated for several minutes, but no meaningful solution came out. I even tried switching back to Sonnet 4.6... useless.

After more than an hour of trial and error, I fell back to the old way:

  1. Created a new empty website. It gets created with the Enhanced data model by default
  2. Changed the value of the sitename node of my local powerpages.config.json to the name of the newly created site
  3. Opened the local /portal/.powerpages-site/website.yml and changed the id node to the GUID of the newly created website
  4. Manually pushed the code to the new site via pac pages upload-code-site -rp .
  5. Purged the cache via the admin center.
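Steps 2 and 3 above are the only surgical edits; they can be sketched as two small pure functions (file and property names as in the steps, everything else illustrative):

```typescript
// Step 2: point powerpages.config.json at the new site name
function patchPortalConfig(config: { sitename: string }, newName: string) {
  return { ...config, sitename: newName };
}

// Step 3: swap the `id` node in .powerpages-site/website.yml with the
// GUID of the newly created website (a line-level replace is enough)
function patchWebsiteYml(yml: string, newId: string): string {
  return yml.replace(/^id: .*$/m, `id: ${newId}`);
}
```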

Et voilà... site up and running!

Site Up and running

...not so fast... at login:

Other app reg to fix

We need to fix this other app registration:

  1. Go to https://portal.azure.com/
  2. Select App Registrations > All applications
  3. Select the App Registration mentioned in the screenshot by ID
  4. Select Authentication
  5. Click Edit button near the Redirect URI configuration for Single-page application
  6. Add the URL of your website and click Configure (don't forget to also add /blank.html)

Redirect URIs

Now let's try again.

  1. In the admin center, refresh the portal cache
  2. Press F5 in the browser and re-execute the login process

And that's it, we're in!

Final


☺️ Conclusions

In the end, this experiment turned out to be far more than just a technical exercise.

Power Pages SPAs proved to be incredibly powerful, much more than they might initially appear. When you strip everything down to the bare essentials, the core platform already provides everything you need to build a fully functional solution.

And that’s where the real surprise lies: the tools provided by Microsoft are a true game changer. They cover the entire lifecycle, from development to deployment, in a way that feels cohesive and increasingly mature. You can genuinely go from zero to a working application without relying on any external scaffolding.

That said, it’s not a smooth ride.

There are moments where things don’t click. Where the tooling feels a bit rough around the edges. Where you need to push, tweak, retry—and sometimes just hammer it until it works. And that’s exactly where experience makes the difference. Knowing how the platform behaves, understanding its quirks, and having the intuition to work around them becomes your real advantage.

And this is also where external tools (like PACX) still shine.

Even if the out-of-the-box Microsoft tooling gets you surprisingly far, there are still phases that feel "tricky" or not fully streamlined. External tools don't replace the platform: they augment it. They smooth out the friction, reduce the cognitive load, and accelerate the parts that would otherwise slow you down. And, most importantly, the knowledge of whoever sits between the chair and the keyboard still makes the difference.

But the biggest takeaway is probably this:

This way of building changes how you think.

It shifts the balance between control and trust. Between handcrafted tooling and platform-native capabilities. Between “how I’ve always done it” and “what’s now possible.”

And honestly—I didn’t expect to enjoy it this much.

Because once you let go of the need to control every detail, and start embracing what the platform already offers, you realize something subtle but powerful:

You’re not just building differently.
You’re working differently. Focusing on what really matters, and leaving the details behind.
