Many musicians rely on basic tools in everyday life, including a tuner, a metronome, and a keyboard. I support the kids in a marching band, and I see that they rely on numerous poorly developed apps that differ between iOS and Android devices. Many of them also include ads, and some even play video ads in the middle of a tuning session. I wanted to see if I could develop some web-based apps to solve the problem.
At the same time, I wanted to see whether the new Copilot agent on GitHub could help. When you initiate a new project, the agent now offers to help with not only setting up the repository but also starting to code. As it turned out, the agent solved the entire problem on the first attempt!
The task
I wanted a lightweight set of browser-based music utilities that require no installation and no backend. The goal was simple: open a URL and start using it. I asked the agent to create a metronome, a tuner, a piano, and a drum kit, and it immediately set to work implementing a solution.

I didn’t specify any particular web technologies, but I am pleased with what it selected:
- HTML for structure
- CSS for layout and styling
- Vanilla JavaScript for interaction
- Web Audio API for sound
These constraints kept complexity low and made deployment trivial. They also ensured the tools run on most modern browsers without a build pipeline.
Building the apps
Five minutes after I made the request, the agent pushed a solution to my repository. It included a complete web page with a front-end interface and separate “apps”.
Metronome
The metronome uses a scheduler loop based on the Web Audio clock, so clicks stay stable. UI updates (beat flash and beat dots) are timed to match the scheduled beats. It supports BPM adjustment (40–240), tap tempo, and time-signature selection (2/4, 3/4, 4/4, 6/8). There is also a tempo name guide under the controls (Largo through Prestissimo), with the active range highlighted.
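The core idea of such a scheduler is a short JavaScript timer that repeatedly schedules clicks slightly ahead on the audio clock, so timing is driven by `audioCtx.currentTime` rather than `setInterval` alone. A minimal sketch (the names and parameter values are illustrative, not the agent’s actual code):

```javascript
const LOOKAHEAD_MS = 25;     // how often the JS timer wakes up
const SCHEDULE_AHEAD = 0.1;  // how far ahead (seconds) to schedule clicks

// Beat length in seconds for a given tempo.
function secondsPerBeat(bpm) {
  return 60 / bpm;
}

// Schedule every beat that falls inside the lookahead window, then re-arm the timer.
function scheduler(audioCtx, state) {
  while (state.nextBeatTime < audioCtx.currentTime + SCHEDULE_AHEAD) {
    playClick(audioCtx, state.nextBeatTime, state.beat === 0);
    state.beat = (state.beat + 1) % state.beatsPerBar;
    state.nextBeatTime += secondsPerBeat(state.bpm);
  }
  setTimeout(() => scheduler(audioCtx, state), LOOKAHEAD_MS);
}

// A short, decaying tone as the click; the downbeat gets a higher pitch.
function playClick(audioCtx, time, accent) {
  const osc = audioCtx.createOscillator();
  const gain = audioCtx.createGain();
  osc.frequency.value = accent ? 1000 : 800;
  gain.gain.setValueAtTime(0.5, time);
  gain.gain.exponentialRampToValueAtTime(0.001, time + 0.05);
  osc.connect(gain).connect(audioCtx.destination);
  osc.start(time);
  osc.stop(time + 0.05);
}
```

Because the clicks are scheduled on the audio clock, a late timer callback does not make the beat drift; it only means the next batch of clicks is scheduled a little later, still at the correct times.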
Piano
The piano supports two octaves plus a high C key, mouse and keyboard interaction, octave shifting, waveform selection, and volume controls.
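Each key boils down to an oscillator tuned to the note’s equal-tempered frequency. A minimal sketch of how a key press could be handled, assuming MIDI note numbers internally (illustrative names, not the agent’s actual code):

```javascript
// MIDI note number to equal-tempered frequency (A4 = MIDI 69 = 440 Hz).
function noteToFreq(midiNote) {
  return 440 * Math.pow(2, (midiNote - 69) / 12);
}

// Play one note with the user-selected waveform and volume (illustrative).
function playNote(audioCtx, midiNote, waveform = 'sine', volume = 0.5) {
  const osc = audioCtx.createOscillator();
  const gain = audioCtx.createGain();
  osc.type = waveform;  // 'sine', 'square', 'sawtooth', or 'triangle'
  osc.frequency.value = noteToFreq(midiNote);
  gain.gain.setValueAtTime(volume, audioCtx.currentTime);
  gain.gain.exponentialRampToValueAtTime(0.001, audioCtx.currentTime + 1);
  osc.connect(gain).connect(audioCtx.destination);
  osc.start();
  osc.stop(audioCtx.currentTime + 1);
}
```

Octave shifting then becomes a simple offset of ±12 on the MIDI note number before the conversion.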
Drum kit
The drum kit synthesises various percussion sounds, triggered by clickable pads and keyboard shortcuts, and has an adjustable master volume. Noise buffers and filtered oscillators give each instrument a distinct timbre.
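As an example of the noise-buffer approach, a snare-like hit can be made by filling a buffer with white noise, band-passing it, and applying a fast decay. A sketch under that assumption (the filter frequency and envelope values are illustrative):

```javascript
// Fill a Float32Array with white noise in [-1, 1).
function fillWhiteNoise(channelData) {
  for (let i = 0; i < channelData.length; i++) {
    channelData[i] = Math.random() * 2 - 1;
  }
  return channelData;
}

// Snare-like hit: noise burst -> bandpass filter -> decaying gain envelope.
function playSnare(audioCtx, time) {
  const buffer = audioCtx.createBuffer(1, audioCtx.sampleRate * 0.2, audioCtx.sampleRate);
  fillWhiteNoise(buffer.getChannelData(0));

  const source = audioCtx.createBufferSource();
  source.buffer = buffer;

  const filter = audioCtx.createBiquadFilter();
  filter.type = 'bandpass';
  filter.frequency.value = 1800;

  const gain = audioCtx.createGain();
  gain.gain.setValueAtTime(0.8, time);
  gain.gain.exponentialRampToValueAtTime(0.001, time + 0.2);

  source.connect(filter).connect(gain).connect(audioCtx.destination);
  source.start(time);
}
```

A kick drum would instead use a sine oscillator with a rapid downward pitch sweep, which is why the filtered-oscillator and noise-buffer techniques complement each other.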
Tuner
While the tools above are straightforward to implement, I was curious how the agent would approach a tuner. I was surprised to see that it produced working microphone-based analysis on the first attempt. The tuner uses getUserMedia for microphone input and autocorrelation for pitch detection. It displays the note name, the frequency in Hz, the deviation in cents from the nearest semitone, and a needle meter with in-tune feedback. A silence gate avoids unstable readings when no clear pitch is present. Quite clever!
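The analysis part can be sketched as a pure function over a frame of samples: gate on signal energy, then find the lag with the strongest autocorrelation and convert it to a frequency. This is a simplified version of the technique, not the agent’s actual code; the RMS threshold and lag bounds are assumptions:

```javascript
// Estimate the fundamental frequency of a sample frame via autocorrelation.
// Returns null when the frame is too quiet (silence gate).
function detectPitch(samples, sampleRate) {
  let rms = 0;
  for (const s of samples) rms += s * s;
  rms = Math.sqrt(rms / samples.length);
  if (rms < 0.01) return null;  // silence gate (assumed threshold)

  // Search lags corresponding to roughly 50-1000 Hz.
  const minLag = Math.floor(sampleRate / 1000);
  const maxLag = Math.floor(sampleRate / 50);
  let bestLag = -1, bestCorr = 0;
  for (let lag = minLag; lag <= maxLag; lag++) {
    let corr = 0;
    for (let i = 0; i + lag < samples.length; i++) {
      corr += samples[i] * samples[i + lag];
    }
    if (corr > bestCorr) { bestCorr = corr; bestLag = lag; }
  }
  return bestLag > 0 ? sampleRate / bestLag : null;
}

// Deviation in cents of a measured frequency from a reference frequency.
function centsOff(freq, refFreq) {
  return 1200 * Math.log2(freq / refFreq);
}
```

In the browser, the frame would come from an `AnalyserNode.getFloatTimeDomainData()` call on the getUserMedia stream; the needle meter is then driven directly by the cents value.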
Guitar tuner
Since the generic tuner worked so well, I asked the agent to implement a dedicated guitar tuner as well. It builds on the regular tuner but targets the standard six strings (E2, A2, D3, G3, B3, E4), highlights the nearest string in real time, and displays cents sharp/flat along with clear in-tune feedback.
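The string-matching step is simple once a pitch estimate exists: compare the detected frequency against each target and pick the one with the smallest deviation in cents. A sketch of that idea (illustrative, not the agent’s code):

```javascript
// Standard-tuning targets for a 6-string guitar.
const GUITAR_STRINGS = [
  { name: 'E2', freq: 82.41 },
  { name: 'A2', freq: 110.00 },
  { name: 'D3', freq: 146.83 },
  { name: 'G3', freq: 196.00 },
  { name: 'B3', freq: 246.94 },
  { name: 'E4', freq: 329.63 },
];

// Find the closest string and report how far sharp (+) or flat (-) we are, in cents.
function nearestString(freq) {
  let best = null;
  for (const s of GUITAR_STRINGS) {
    const cents = 1200 * Math.log2(freq / s.freq);
    if (!best || Math.abs(cents) < Math.abs(best.cents)) {
      best = { name: s.name, cents };
    }
  }
  return best;
}
```

The UI then highlights `best.name` and colours the readout green when `|cents|` falls under some in-tune threshold.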
Refactoring
I was very surprised that the agent delivered a complete solution on the first attempt. I only had to fix a few things.
First, it created all the files in the root folder. I prefer to have one folder per “app”, so they are easy to link to. I asked the agent to reorganise the file structure, and it did so without any problems.
Second, it used English terminology in the UI, but I wanted it in Norwegian to be more kid-friendly. A quick message and it translated all the UI elements for me.

That worked well overall, though I wanted to modify a couple of the terms it chose. Those were, again, easy to fix:

I also wanted to change some of the icons it had used, which led me to explore the many standard emoji characters available on Emojipedia.
Deployment to GitHub Pages
Finally, I wanted to deploy the tools to a web page and, for simplicity, asked the agent to deploy to GitHub Pages.

This was the first time things did not work entirely as expected, so I created a new issue and realised I could assign it to Copilot:

As always, it politely agreed to take on the task:

And only a couple of minutes later, I got a new pull request:

Unfortunately, the agent available on GitHub does not allow for discussion, so I decided to revert to my regular workflow in Visual Studio Code:

It resolved the issue and helped me set up the repository correctly for publishing to Pages.
The Agent’s Reflection on the Process
The last part involved writing this blog post. I began by asking Copilot (in VS Code) to summarise the whole process and reflect on it. This resulted in a first draft of this post:

It is interesting how it included a certain level of technical reasoning: “Web Audio scheduling should rely on audio time, not UI timers”. It also had some UI reflections: “Focused tuner variants (general + instrument-specific) improve usability.”
I didn’t ask it explicitly about future work, but it suggested these follow-ups on its own:
- Recording and loop playback features
- Preset saving (BPM, waveform, volume)
- Accessibility upgrades and extended keyboard mappings
- Optional PWA support for offline use
All of these are excellent suggestions, although not necessary at this stage.
My reflections on the process
I have worked with various AI agents for some time now, but this was my first time trying an agent directly from GitHub. I am quite amazed by how it turned a one-sentence request for a diverse set of musical tools into a complete, workable, web-based “app” suite in just five minutes.
I have spent a couple more hours making modifications (agent-based and manual) and writing this blog post. The whole process was blazingly fast. I could have written the apps myself, but with lower technical quality and spending days, not minutes, on the process.
One reason this worked so well is that I gave it a relatively simple task within a well-defined space. There are lots of tuners and metronomes out there, so there was plenty of code to learn from. The agent would have struggled more with something more complex, but I am eager to test its capabilities in the near future.
The apps were mainly made by a GitHub agent in this repo, with some modifications from a Copilot agent running in “Auto” mode in VS Code. The Copilot agent wrote the initial blog post, but I largely rewrote it. Grammarly helped with the final grammar check.






