Ask the Copilot agent to write a full end-to-end test and it is likely to produce nonsense. For example, let's test the "zero teams / zero players" message in my soccer web app.
I can start a new test in the existing "Teams" spec file:
```js
describe('Teams', () => {
  // ...
})
```
If I ask Copilot to write this test, what are the chances of Copilot writing a good end-to-end test?
The agent's solution is not too bad:
```js
/// <reference types="cypress" />
// ...
```
Let's immediately run this test to see how well it works.
The test is failing:

- the check for the zero teams message is wrong: Copilot does not "know" what the "zero teams" selector should be
- the test is incomplete: it does not check that the "zero teams" component goes away when we add a team
Copilot has no idea about your project. It is like a developer who only sees the spec file and "guesses" the test steps and selectors based on their previous work experience, without knowing how your project actually works. Let's improve it.
Add comments
Notice how Copilot added comments to the generated test code? Describing what the test is trying to do is a very good idea, and we should guide Copilot with comments of our own.
```js
describe('Teams', () => {
  // ...
})
```
In my opinion, using code comments is preferable to putting more context into the Agent prompt, since prompts are transient while comments stay with the code.
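As a rough illustration, a test skeleton with guiding comments might look like the sketch below. The step comments and test title are my assumptions, not the real spec, and tiny `describe` / `it` stand-ins are defined so the sketch runs outside a test runner (in a real spec they come from Cypress):

```javascript
// Stand-ins for Mocha's describe/it so this sketch runs anywhere.
// They record each call and immediately execute the callback.
const steps = []
const describe = (name, fn) => {
  steps.push(`describe: ${name}`)
  fn()
}
const it = (name, fn) => {
  steps.push(`it: ${name}`)
  if (fn) fn()
}

describe('Teams', () => {
  it('shows zero teams and zero players messages', () => {
    // visit the home page
    // confirm the "zero teams" message is visible
    // add a new team
    // confirm the "zero teams" message goes away
  })
})

console.log(steps)
```

The empty test body with step-by-step comments is the "prompt" that stays in the code: Copilot fills in the Cypress commands under each comment.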
Let's use the same prompt and check the generated code.
Much, much better. Does it work?
Let's give Copilot shortcuts.
Use a page object
Instead of hoping that Copilot can discover in our test how to add a team and a player, why don't we give it a shortcut: by using a page object we can simplify Copilot's task. I will create a new static object that simply implements a few common actions on the page, like adding a team.
```ts
import 'cypress-plugin-steps'
// ...
```
Tip: I am using cypress-plugin-steps to create a better visual test log.
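As a rough sketch of what such a page object might contain (the selectors and method names below are my assumptions, not the real `gametime.ts`), here is a static object implementing one common action. A tiny `cy` stand-in records calls so the sketch runs outside the Cypress runner:

```javascript
// Stand-in for the Cypress "cy" object: records every call so this
// sketch can run and be inspected outside the Cypress runner.
const calls = []
const chain = {
  type(text) {
    calls.push(['type', text])
    return chain
  },
  click() {
    calls.push(['click'])
    return chain
  },
}
const cy = {
  get(selector) {
    calls.push(['get', selector])
    return chain
  },
}

// Hypothetical page object: a static object with common page actions.
// In the real project these methods would use the app's actual selectors.
const GameTime = {
  addTeam(name) {
    cy.get('[data-cy=new-team-name]').type(name)
    cy.get('[data-cy=add-team]').click()
  },
}

GameTime.addTeam('Arsenal')
console.log(calls.length) // 4 recorded cy calls
```

Because the page object centralizes selectors and actions, Copilot only needs to call `GameTime.addTeam('...')` instead of guessing selectors.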
Ok, so how can Copilot take advantage of the "gametime" page object? Let's include it in the prompt.
Bingo. The generated test looks reasonable and is passing.
Use Copilot instructions file
Typing the same "Use the page object from gametime.ts file to do common actions." in each prompt quickly becomes tiresome. We can use a shared Copilot instructions Markdown file instead. Here is my instructions file with general guidance for code generation. Notice how it uses project-specific references.
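My actual instructions file is not reproduced here, but a hypothetical `.github/copilot-instructions.md` along these lines gives the general idea (the file path follows GitHub's repository custom instructions convention; the exact wording is my assumption):

```markdown
# Copilot instructions

- When writing Cypress end-to-end tests, use the page object from
  gametime.ts file to do common actions like adding a team.
- Prefer stable `data-cy` selectors over CSS class selectors.
- Add comments describing what each test step is doing.
```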
Let's see if the Copilot Agent can write a useful test from the minimal prompt "Implement this test". Wow, the Agent actually does a two-step. First, it suggests using only the page object.
The generated test is bad, but the Copilot Agent is not done yet. It now checks whether the syntax is correct and modifies the code to mix page object method calls with custom UI assertions!
The finished test is ... good and passing.
Here is the finished spec file for reference
```js
/// <reference types="cypress" />
// ...
```
Pretty sweet. I would accept this test if someone opened a pull request with this code. It uses page object methods for most general actions on the page, plus correct, stable selectors for checking the "zero" components. No complaints.
🎓 Want to learn how to use Cypress and Copilot or Cursor to quickly write useful end-to-end tests? Check out my online courses on these topics at https://cypress.tips/courses.
Tips
Add a TypeScript check
Often Copilot "invents" non-existent page object methods. To prevent such simple hallucinations, I include the following in my prompt or instructions file:
```
When using the gametime.ts page object, check if the method names and their parameters are correct.
```
I also use `// @ts-check` in my spec files to type-check even JavaScript specs.
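Here is a runnable sketch of why this helps (the page object and the misspelled method name are hypothetical). With `// @ts-check` on top, the editor flags the bad method name statically; without it, the mistake only surfaces as a runtime TypeError:

```javascript
// @ts-check
// Hypothetical page object with a single known method.
const GameTime = {
  addTeam(name) {
    return `added ${name}`
  },
}

// A hallucinated method name. With @ts-check, the editor / tsc reports
// "Property 'addTeem' does not exist" before the spec ever runs.
// At runtime, calling it throws a TypeError:
let caught = null
try {
  // @ts-expect-error -- deliberately wrong method name
  GameTime.addTeem('Arsenal')
} catch (e) {
  caught = e
}
console.log(caught instanceof TypeError) // true
```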