Created by the author using AI and Figma.

Supercharge Developer Workflows with GitHub Copilot Workspace Extensions

Boost Developer Productivity 10x


Motivation

It is 2025, and software developers face an incredibly fast-paced high-tech landscape — full of innovation and automation around every corner. Even with AI taking on many roles (and hopefully not our jobs just yet), we are constantly challenged by the rising complexity of distributed systems and cloud-native architectures, growing cognitive load, context switching and endless interruptions.

I can proudly say that it is in our nature, as fighters of the coding realm and knights of logic, to constantly search for ways to streamline our workflows while maintaining high-quality code.

Because we all aspire to one ideal state of being:

Drink coffee and look at our Creation do its thing.

Simply, because we can.

But instead of doing that, we face the same old problems:

  1. Managing Complicated Codebases:
    While trying to ensure scalability and maintainability — until the first new hire starts messing it all up.
  2. Adhering to Coding Standards:
    Aligning with organizational and industry best practices — while fighting about semicolons and curly brackets!
  3. Integrating New Technologies:
    Incorporating tools and frameworks — all while leading philosophical discussions about the potential of world disruption or domination and our responsibility there...
Software Development Complexity Today

So what are we going to do about this?

The Role of Platform Engineering

Platform Engineering Overview

Yes. I got a new fancy term for ya: “Platform Engineering”.

Platform engineering focuses on creating robust internal developer platforms that standardize workflows, tools and best practices across an organization.

What?

Platform engineering in a beanshell:

“Hey devs, you don’t all need to figure out how to make coffee from scratch. We’ve got a few caffeine nerds who tested all beans, optimized the water temperature and set up the perfect espresso machine. You just push a button and get great coffee every time. Drink & enjoy. You are welcome.”

We enhance Consistency.
Instead of everyone doing whatever they want and feeding the black hole of madness, the PE team defines structure and order.
PS: If you have OCD — think about joining the PE team.

We improve Developer Experience.
Instead of every developer and team fighting over how to get things done, with everyone either reinventing the wheel or feeling like they are discovering fire, the PE team offers a set of predefined tools and workflows that everyone shares and uses.

We accelerate Delivery.
Instead of different teams figuring out how to deliver their product or feature, the PE team creates processes that work and lead to faster deployment cycles.
No more ‘it works on my machine’ and ‘I built this cool custom pipeline’.

And yes, true democracy in tech leads to chaos — sometimes you just need a group of wise engineers to make the smart decisions — so everyone else can focus on building cool stuff.

Developer Workflows?

How are Developer Workflows helping me?

Last one, I promise: “Developer Workflow”.

Developer workflows are structured processes that guide the way code is written, reviewed, tested, and deployed.

Think of it like your favorite coffee shop — they don’t grind beans, heat water, or guess the recipe every time.
It’s your favorite — cause the coffee tastes great, every. Single. TIME.
There’s a clear process, so every cup comes out consistent, fast, and without surprises. We want the same for our code.

Developer Workflow challenges

Congrats! Imagine you are on the PE team that is responsible for Developer Workflows.
Here are some of the challenges you will face:

Lack of Standardization

Inconsistent practices across teams.

Everyone in the office making coffee their own weird way — some use a French press, some use filter coffee and one guy just boils his beans in a freakin’ pot. Chaos. No consistency. No efficiency. Just bad coffee for all. Unacceptable!

Difficulty maintaining Best Practices

Challenges in enforcing coding standards.

A coffee shop where baristas ignore the recipe!!!
— one makes espresso too weak, another burns the milk, and someone adds salt instead of sugar. YIKES.
Without rules, every cup is a random experiment.

Complex onboarding Processes

New developers struggle to understand existing workflows.

Your favorite coffee shop just hired a new barista but gave him no training — he stares at the coffee machine, presses random buttons, spills the milk everywhere and finally tells you: “Can you come back tomorrow?”. Disappointment…

Inefficient Code Reviews

Time-consuming processes hinder productivity.

Imagine it’s Monday and you really need that coffee. Now you see a barely moving line with a single overworked barista — orders pile up, you are waiting longer than ever and by the time you get your coffee…it’s cold.
The week is over before it began.

Security and compliance concerns

Ensuring code adheres to necessary regulations.

Your favorite coffee shop is not doing any health inspections — you see them using expired milk, skipping handwashing…not your favorite anymore, is it?

So how can you deal with those challenges?

Artificial Intelligence to the rescue. There, I said it.

Don’t get me wrong, yes, AI can be really overhyped nowadays — but LLMs are absolutely amazing at solving many of these challenges for us.
I myself am using it daily. For everything. I am a fanboy like that.

The reasoning capabilities are getting better and better, and considering that you can provide extensive knowledge bases and use tooling like online search — AI is becoming my biggest ally (and probably enemy…but I will let future David deal with that).

The Case for GitHub Copilot

GitHub Copilot is your AI pair programmer right inside VS Code.
What I truly love about it is that Copilot has direct access to my local workspace and all extensions inside VS Code.
Furthermore, it has access to the editor and can provide code suggestions, as well as reason about the beautiful code I am working with.
(Of course my code is always state of the art…Not. Ever. But it works.)

The perfect AI companion.
However, it does lack charm compared to Clippy.

And though I use Copilot quite regularly…it doesn’t really feel like it’s
MY personal AI Copilot. Oftentimes the role is limited to being my code assistant, the interactions are super transactional and the knowledge is entirely generalized.

Copilot today is like a barista that never remembers you, even after months of ordering the same damn coffee.
I don’t want “Hi there, what can I get you?” every single time.
I want “David, my man! Black coffee, no sugar, right?” or even better: “Black coffee coming right up — oh, and I saw you were working on that API yesterday, need help with that?”…too much. It’s a barista, after all.

Copilot should learn, anticipate, and feel personal — not just be a polite-but-forgetful order taker.

The Need for AI Customization

While GitHub Copilot offers really powerful out-of-the-box functionality,
I want it to know my organization’s platform engineering rules, enforce best practices without me reminding it, and just handle things for me.

Why should I keep saying, “No, don’t use that framework” or “Yes, we always structure our pipelines this way”?
Just learn it, apply it, and execute tasks
because that’s what AI should be doing.

What Copilot should do:

  • Enforce coding standards → no more “fixing it later”
  • Validate API contracts → so we don’t ship broken stuff
  • Integrate security best practices → because security isn’t optional
  • Align with business needs → so it writes stuff that actually makes sense
  • Execute predefined workflows → because I don’t want to manually click buttons and research scripts all day

What it shouldn’t do:

  • Forget everything I just explained
  • Ask me the same questions over and over
  • Suggest things that violate my team’s best practices
  • Give me code and commands to execute without doing so itself

And okay, no automated coffee and food ordering (for now at least)
but hey, one step at a time!

Copilot Should Become YOUR Copilot

Extending Copilot, or: Making it do what I want.

There are two main ways to customize GitHub Copilot:
No-Code and Yes-Code (which is obviously cooler).

1. The No-Code Approach

This method is quick and easy — you configure Copilot using VS Code settings. In your workspace, go to .vscode/settings.json and try this:

// .vscode/settings.json
{
  "github.copilot.chat.codeGeneration.instructions": [
    "Always use the following coding standards: ...",
    "Follow our API design guidelines: ...",
    "Enforce security best practices: ...",
    "Use TypeScript instead of JavaScript and the following libraries: ..."
  ]
}

You can even reference an instructions file like so:


We reference a file, in this case stored at .copilot/.copilot-instructions.txt, containing our custom Copilot instructions.
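In settings.json terms, that file reference looks roughly like this (a sketch assuming the object form with a file key described in the VS Code documentation; the path is simply where I chose to store the file):

// .vscode/settings.json
{
  "github.copilot.chat.codeGeneration.instructions": [
    { "file": ".copilot/.copilot-instructions.txt" }
  ]
}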
Here is one such example:

Source: https://gist.github.com/dminkovski/a44262f318538acb72fe0e37f3cb367d

# GitHub Copilot Instructions for REST API Development

## General Guidelines:
- Follow modern **TypeScript** best practices.
- Prioritize **modular and reusable code** for maintainability.
- Always use **ESLint** and **Prettier** for code formatting and linting.
- Ensure compatibility with **Node.js LTS versions**.
...

If you want Copilot to always follow your instructions without manually configuring settings, just store them in this specific file inside your repo: .github/copilot-instructions.md.

How to check if it’s working:
When Copilot Chat is open, you should see the attached context file in the chat window, confirming that it’s using your instructions.


Pros of No-Code Approach:

  • Super Simple Setup
  • No extra Tools or Extensions needed
  • Works right out-of-the-box

Cons of No-Code Approach:

  • Super Limited Control
  • Can’t automate Complex Workflows

This simple tweak already makes Copilot smarter and more aware of your best practices.
Platform Engineering teams and developers can start using this today to inject best-practice prompting, making Copilot’s suggestions more relevant, structured, and opinionated.

But now… let’s get to the fun part. The best part. And the part you should strongly consider using. Because it’s the right path.
You still have a choice.

Extending Copilot Using Code. For Devs Only.

Node + VS Code + Yeoman

VS Code is already highly customizable with extensions like linters, formatters, Docker tools, GitLens, Azure and more.
Thanks to the VS Code v1.88 update, we can now install custom extensions without publishing them to the marketplace! How awesome is that?

How to Install a Custom Extension Locally

Option 1: Drop it in .vscode/extensions

  1. Develop your custom extension (we’ll get to that later).
  2. Copy the extension folder into .vscode/extensions in your workspace.
  3. VS Code will detect it and suggest installation automatically!

Option 2: Manually Install from a Local Folder

  1. Open Command Palette (CTRL+SHIFT+P)
  2. Search for “Developer: Install Extension from Location…”
  3. Select the folder containing your extension.
  4. Easy as that! It’s installed and ready to use.
Install Extension Locally

Why This Is a Game-Changer

There is absolutely no need to publish internal extensions to the VS Code marketplace.
Your custom extensions can enforce Copilot best practices, validate code, and integrate with internal tools.
Every single developer in your org can install the extension without hassle.

Now that we know how to install a local extension, let’s create one that customizes Copilot’s behavior.

Building our custom GitHub Copilot Extension

Let’s create the .vscode/extensions folder in our workspace and run the following command inside it:

npx --package yo --package generator-code -- yo code

This will generate and scaffold our starter extension boilerplate code.
Once you run it, you will be asked a few questions.

Yeoman Generator

Keep in mind that this scaffolder will generate all the code in the src folder and not inside the .vscode folder.

Manifest and Configuration

Anything we want to configure about our extension is defined in the extension manifest file — that is, the package.json.

Scaffolded Extension Manifest

The most important configuration for us to work with GitHub Copilot can be found under contributes. This is how we extend various functionalities within Visual Studio Code.

In VS Code extensions, the commands contribution allows users to trigger actions via the Command Palette (Ctrl+Shift+P).

const disposable = vscode.commands.registerCommand('my-custom-copilot.helloWorld', () => {
  vscode.window.showInformationMessage('Hello World from my-custom-copilot!');
});
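For that command to show up in the Command Palette, it also needs a matching entry in the manifest. A minimal sketch (the id mirrors the example above; the title is what the generator typically scaffolds):

// package.json (excerpt)
"contributes": {
  "commands": [
    {
      "command": "my-custom-copilot.helloWorld",
      "title": "Hello World"
    }
  ]
}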

Besides allowing users to trigger actions via the Command Palette, commands in VS Code also serve as an API for other extensions. This means other extensions can call your commands asynchronously using:

await vscode.commands.executeCommand('my-custom-copilot.helloWorld');

Understanding the activate Function

The activate function is the brain of your extension. It:

  • Runs when VS Code loads your extension.
  • Registers commands, listeners, and integrations.
  • Allows you to use the VS Code API (vscode.window) etc.

export function activate(context: vscode.ExtensionContext) {
  // Your custom extension code
}
Flow Diagram of our VS Code Extension

Integrating with Copilot

We have two components of the VS Code API that we can use for our custom Copilot functionality.

  1. Chat Participant — Use the Chat Interface & Copilot Experience.
  2. Copilot’s LLM — Use the underlying Models for AI reasoning.

With VS Code’s Chat Participant API, we can:

  • Inject custom logic into Copilot Chat.
  • Modify user input and Copilot’s responses before they appear.
  • Add tools and commands to Copilot Chat.
  • Use the same UI/UX, but with our own AI logic.

And with Copilot’s LLM API (a minimal sketch follows this list), we can:

  • Bypass the Chat UI and send direct prompts to Copilot’s AI.
  • Process responses programmatically before showing them to the user.
  • Integrate Copilot’s reasoning into our own extensions, workflows, or automation.
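Here is a rough sketch of that second option: a minimal example using the vscode.lm Language Model API (assumed to be available in recent VS Code versions; the model family name is only an example and may differ in your setup):

import * as vscode from 'vscode';

// Minimal sketch: send a prompt straight to a Copilot model, bypassing the Chat UI.
async function askCopilot(prompt: string, token: vscode.CancellationToken): Promise<string> {
  // Pick any Copilot-provided model; the family name here is only an example.
  const [model] = await vscode.lm.selectChatModels({ vendor: 'copilot', family: 'gpt-4o' });
  if (!model) {
    throw new Error('No Copilot language model available.');
  }

  const messages = [vscode.LanguageModelChatMessage.User(prompt)];
  const response = await model.sendRequest(messages, {}, token);

  // The response streams in fragments; collect them into one string.
  let result = '';
  for await (const fragment of response.text) {
    result += fragment;
  }
  return result;
}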

Chat Participant Creation

To modify Copilot Chat behavior, we register a custom chat participant and define our own logic. Add a Chat Participant to package.json.

// ...
"contributes": {
  "chatParticipants": [
    {
      "id": "rest",
      "fullName": "'REST-API' Copilot",
      "name": "rest",
      "description": "REST API Copilot for NodeJS Web Apps",
      "isSticky": true,
      "commands": [
        {
          "name": "list",
          "description": "List all available tools"
        }
      ]
    }
  ],
// ...

Register the Participant in extension.ts.
Modify src/extension.ts:

import * as vscode from 'vscode';
import { handler } from './chat-handler';

export function activate(context: vscode.ExtensionContext) {
  // Create chat participant
  const participant = vscode.chat.createChatParticipant('rest', handler);
  context.subscriptions.push(participant);
  // Do other stuff
}

Now, let’s define how our custom Copilot interacts with user input.
Create src/chat-handler.ts with this content: https://gist.github.com/dminkovski/4901d37c20acf6dc03d0a32f427e8a06

You can reuse this handler for any participant that you want to use the tool-calling logic.
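If you just want the general shape before opening the gist, here is a stripped-down sketch (the tool-calling loop from the gist is left out, and SYSTEM_PROMPT is a constant we define in a moment):

// src/chat-handler.ts (simplified sketch; the gist adds the tool-calling logic)
import * as vscode from 'vscode';
import { SYSTEM_PROMPT } from './prompt';

export const handler: vscode.ChatRequestHandler = async (
  request: vscode.ChatRequest,
  _context: vscode.ChatContext,
  stream: vscode.ChatResponseStream,
  token: vscode.CancellationToken
) => {
  // Prepend our system prompt to whatever the user typed in the chat.
  const messages = [
    vscode.LanguageModelChatMessage.User(SYSTEM_PROMPT),
    vscode.LanguageModelChatMessage.User(request.prompt)
  ];

  // request.model is the model the user picked in the Chat UI.
  const response = await request.model.sendRequest(messages, {}, token);

  // Stream the answer back into the chat window as markdown.
  for await (const fragment of response.text) {
    stream.markdown(fragment);
  }
};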

The handler function will take care of calling the registered LLM tools, which we will add soon. Let’s also create the prompt and helpers for that.
The system prompt is the most important part of any call to an LLM.
An AI agent is nothing more than a system prompt and a model.
Therefore, make sure to spend enough time refining the prompt.
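Purely as an illustration (your real prompt should encode your own organization’s rules), such a prompt could live in a small src/prompt.ts:

// src/prompt.ts (illustrative example only; tailor it to your own standards)
export const SYSTEM_PROMPT = `
You are the REST API Copilot for our NodeJS web applications.
- Follow modern TypeScript best practices and our internal API design guidelines.
- Prefer modular, reusable code and suggest tests alongside new endpoints.
- When a task can be completed by one of your registered tools (for example running a script),
  call the tool instead of telling the user to run commands manually.
`;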

You can find the entire code that is being referenced here:
https://github.com/microsoft/vscode-extension-samples/tree/main/chat-sample

I dare you to steal this code for the prompt.
Last but not least…let’s add some tools!

Register Tooling

The tools need to be registered in the package.json as well.
Under contributes we will add the languageModelTools key.
A tool is made up of the following attributes:

  • name: The unique name of the tool. This is used to reference the tool in the extension implementation code.
  • tags: An array of tags that describe the tool. This is used to filter the list of tools that are relevant for a specific request.
  • toolReferenceName: If enabled, the name for users to reference the tool in a chat prompt via #.
  • displayName: The user-friendly name of the tool, used for displaying in the UI.
  • modelDescription: Description of the tool, which can be used by the language model to select it.
  • icon: The icon to display for the tool in the UI.
  • inputSchema: The JSON schema that describes the input parameters for the tool. This is used by the language model to provide parameter values for the tool invocation.
// ...
"languageModelTools": [
  {
    "name": "runScript",
    "toolReferenceName": "runScript",
    "tags": ["terminal", "execute", "script", "rest"],
    "displayName": "Run Script in Terminal",
    "modelDescription": "Run the script in the terminal and return the output. Make sure to provide the correct command and path based on workspace context.",
    "inputSchema": {
      "type": "object",
      "properties": {
        "command": {
          "type": "string",
          "description": "The command to run"
        }
      },
      "required": ["command"]
    }
  }
],
// ...

And finally, let’s register and implement the custom tools.

We’ll create a tool called runScript that will:

  • Execute scripts and commands in a hidden process.
  • Open a terminal instance if needed (e.g., for starting a web server).
  • Be selectable by Copilot’s LLM when appropriate.

Now, we’ll create a function to handle script execution.
Create a new file src/tools.ts containing this: https://gist.github.com/dminkovski/16395a0bcf712eec597f288edb3e594f

The only thing that’s left is to call the registerChatTools function in our activate function — and we are good to go!
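To give you an idea of what lives in that gist, here is a reduced sketch of the tool implementation, assuming the vscode.lm.registerTool API (available in recent VS Code versions) and Node’s child_process for the hidden execution:

// src/tools.ts (reduced sketch of the runScript tool)
import * as vscode from 'vscode';
import { exec } from 'child_process';
import { promisify } from 'util';

const execAsync = promisify(exec);

interface IRunScriptInput {
  command: string;
}

class RunScriptTool implements vscode.LanguageModelTool<IRunScriptInput> {
  async invoke(
    options: vscode.LanguageModelToolInvocationOptions<IRunScriptInput>,
    _token: vscode.CancellationToken
  ): Promise<vscode.LanguageModelToolResult> {
    const cwd = vscode.workspace.workspaceFolders?.[0]?.uri.fsPath;
    // Run the command in a hidden process and hand the output back to the model.
    const { stdout, stderr } = await execAsync(options.input.command, { cwd });
    return new vscode.LanguageModelToolResult([
      new vscode.LanguageModelTextPart(stdout || stderr)
    ]);
  }
}

export function registerChatTools(context: vscode.ExtensionContext) {
  // The name must match the entry under languageModelTools in package.json.
  context.subscriptions.push(vscode.lm.registerTool('runScript', new RunScriptTool()));
}

The real implementation also decides when to open a visible terminal instead, for long-running processes like a web server.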

Demo Time!

Run npm run compile or npm run watch and either move the extension to the .vscode/extensions folder or install it the way I mentioned before.

Here is a sample repository for you to explore:
dminkovski/tcfy25-copilot-extension-demo: Tech Connect FY 25 — Copilot Extension Demos

Test Via Copilot Chat

  1. Open GitHub Copilot Chat (Ctrl+Shift+I).
  2. Activate your Copilot by typing @YOUR_COPILOT_NAME.
  3. Prompt: “Install the npm dependencies” or “Start the server.”

Expected Result:

  • Copilot should recognize the runScript tool and execute the command.
  • It should open a terminal for long-running scripts related to server stuff.
Copilot Demo Chat

What’s Next? Enhancing Our Custom Copilot

Now our Copilot is doing real automation! But you can go further:

  • Secure script execution by whitelisting commands (a tiny sketch follows below).
  • Integrate with CI/CD pipelines to trigger builds/deployments.
  • Expand tooling to include API testing, database migrations, and more.
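For the first point, a command allow-list can be as simple as this (a hypothetical helper to call inside the tool before anything executes):

// Hypothetical allow-list check, to run before handing a command to runScript.
const ALLOWED_COMMANDS = ['npm install', 'npm test', 'npm run build', 'npm start'];

export function isAllowedCommand(command: string): boolean {
  return ALLOWED_COMMANDS.some((allowed) => command.trim().startsWith(allowed));
}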

Congrats, your custom GitHub Copilot is now an intelligent, interactive assistant!

Challenges I Faced & Future Outlook

Challenges and Potential

Yes, even though I was able to get it to work and I am pretty happy with the results, I did face some challenges, and I want you to know about them.

Prompt Scaling & Multi-Agent Coordination

Current Limitation:
Chat Participants cannot communicate with each other directly.

What it should be like:

Baristas in a busy coffee shop yelling orders at each other:
“Hey Pete, pass me the milk!” “Sure Dan, here you go!”

Right now, each Chat Participant is isolated, like a barista working alone instead of a collaborative team. But it takes a team to create a great coffee-shop experience.

So why is this a problem?

  • We can’t create collaborative AI agents (e.g., one for code, one for security, one for documentation, one for project management).
  • Each agent operates in isolation, limiting complex problem-solving.

Future Possibilities:
Imagine an “AI Dev Team”:

  • Project Manager → Sets goals, breaks tasks into tickets and helps with time estimations.
  • Architect → Reviews designs, suggests best patterns and keeps technical documentation up to date.
  • Dev → Writes, runs and tests the code.
  • Security → Scans for vulnerabilities and enforces security best practices.
  • Reviewer → Ensures best practices before merging PRs and oversees Git messages etc.

Once Chat Participants can talk to each other, we could build a fully AI-driven software engineering workflow inside VS Code.

On the other hand, that would definitely mean I should consider that Barista role at my coffee shop…I guess I better start learning to make coffee.

Custom UI/UX for AI Extensions

Current Limitation:
Very hard to customize Copilot’s UI.

Why It’s a Problem:

  • VS Code supports custom UI, but setting it up is complex. This is done by injecting the generated JS using a custom Webview (VS Code API).
  • When using React, hot reloading issues make UI development slow. The workaround is dirty…you’d watch the build folder and have it recreate the Webview. Not great if you like custom UI/UX.

Debugging AI Prompt Hallucinations

Current Limitation:
No proper prompt debugging tools.

Why It’s a Problem:

  • No way to trace which part of a prompt caused which hallucination.
  • Prompt enhancement is only trial & error.
  • No clear visibility into why Copilot picks one tool over another.

What it should be like:

My barista watching my reaction to make the coffee recipe even better.
“Too bitter? More sugar next time!” “Perfect? Great, I’ll remember that!”

Right now, debugging prompts is like making coffee blindfolded — you don’t know which ingredient made it taste bad or good...

Future Possibilities:
LLM Debugging Tools for VS Code.

  • Live Prompt Debugger → See which change affected the AI response.
  • Token Visualization → Highlight which words influenced the AI’s response.
  • “Why did Copilot generate this?” → Explanation for behavior.
  • “Try Again” with a different prompt → Easily tweak and rerun prompts with small changes to see what works better for you.

Debugging prompts should be as easy as debugging code!

Summary

In this article, I tried to take you on a hands-on journey to extend GitHub Copilot beyond just being a code assistant — turning it into a true AI-powered teammate inside VS Code. Like transforming a basic coffee machine into a high-end one, we customized Copilot to be able to suggest platform engineering best practices, automate workflows, and potentially integrate directly into development pipelines.

We explored how to customize Copilot, register Chat Participants to enforce engineering rules, and create LLM tools that actually do useful work — like running scripts so you don’t have to type the same four terminal commands every day.

But here is where it gets really exciting:
The future of Copilot is going to be about agents working together.
Soon they will be able to collaborate:

  • “Hey @security, does this API follow our security best practices?”
  • “Not yet! @architect, do you have an idea?”
  • “Got it! @developer, refactor this endpoint using OAuth2!”

The dream of AI-driven software engineering is real — and once we master multi-agent collaboration, debugging, and actual intelligent automation, we won’t just be coding with AI…
we’ll be managing entire AI teams.

I don’t think we will lose our jobs; rather, we will be elevated to being AI Architects and Coordinators. Like with any technological breakthrough, some roles vanish and others appear in their place. And some of us will become subject matter experts who will work on AI Agents and their ability to solve increasingly complex challenges in a specific domain.

The tools are already here, the potential I see is huge, and we’re just getting started. So let’s keep building and experimenting.

PS: If you succeed in building this AI engineering dream team…Please destroy it. Immediately!

Before they start a union, start demanding sprint retrospectives, proper backlog management and useful requirements, or — nightmare — refuse to push untested code on a Friday. Come on!
Because the last thing I need is an AI agent telling me:
“Sorry, I can’t fix that. It’s outside of my working hours. Goodbye.”
