This guide will help you set up and run k6 for load testing on both Windows and macOS. k6 is an open-source tool for testing the performance of your applications under load.
- Node.js: Version 22
- Yarn: Package manager for installing dependencies
- Windows: PowerShell or Command Prompt
- macOS: Terminal
- VS Code or any other IDE to explore the project structure
- Visit the k6 releases page and download the latest Windows version.
- Extract the zip file and add the k6 executable to your PATH.
- Run `k6 version` in PowerShell/Command Prompt to verify that k6 is installed.
- Install Homebrew (if not already installed) by running `/bin/bash -c "$(curl -fsSL https://raw.githubusercontent.com/Homebrew/install/HEAD/install.sh)"`
- Run the following command in Terminal: `brew install k6`
- Type `k6 version` in Terminal. You should see the installed version of k6.
- In Terminal/PowerShell, navigate to the `k6-load-tests` directory.
- Run the following command to install all the required dependencies: `yarn install`
- To run the k6 load tests and see the report in Terminal/PowerShell, run `yarn test`
- To generate a detailed k6 dashboard HTML report, run `yarn test:report`
- This is a simple load test that checks how the https://test.k6.io/ website responds under heavy load.
- The test is configured with three stages, each defining a different load profile for virtual users (VUs). A sketch of the corresponding k6 script follows this list:
- Stage 1:
- Duration: 15 seconds
- Target: 200 VUs
- Description: Gradually ramps up the number of VUs from 0 to 200 over the first 15 seconds.
- Stage 2:
- Duration: 15 seconds
- Target: 1000 VUs
- Description: Further increases the load to 1000 VUs over the next 15 seconds, simulating a peak load.
- Stage 3:
- Duration: 30 seconds
- Target: 0 VUs
- Description: Gradually ramps down the number of VUs back to 0 over the final 30 seconds, allowing for a cool-down period.
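Based on the three stages described above, the test script's load profile would look roughly like the following sketch. The file name, check, and sleep duration are assumptions; the actual script in `k6-load-tests` may differ.

```js
import http from 'k6/http';
import { check, sleep } from 'k6';

// Load profile matching the three stages described above (a sketch of the
// likely configuration; the real script in k6-load-tests may differ).
export const options = {
  stages: [
    { duration: '15s', target: 200 },  // Stage 1: ramp up from 0 to 200 VUs
    { duration: '15s', target: 1000 }, // Stage 2: push to 1000 VUs (peak load)
    { duration: '30s', target: 0 },    // Stage 3: ramp back down to 0 VUs (cool-down)
  ],
};

export default function () {
  // Each VU repeatedly requests the target site and verifies the response.
  const res = http.get('https://test.k6.io/');
  check(res, { 'status is 200': (r) => r.status === 200 });
  sleep(1); // brief pause between iterations per VU
}
```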
- The load test did have an impact on the web application's response time. Here are the observations:
- The fastest response time was 96 ms
- At peak load, the response time went up to 1000 ms (1 s)
- The average response duration was 140 ms
- p90 was around 202 ms
- p95 was around 242 ms
- p99 was around 360 ms
- From these observations we can conclude that as the load increased, the response times increased as well.
- In modern web applications, acceptable response times depend on industry standards and the application type.
- From my experience, for APIs returning responses (these expectations could also be encoded as k6 thresholds; see the sketch below):
- under 200 ms is considered ideal
- 200 ms to 1 s is good
- anything over 1 s needs improvement
- Slower response times can frustrate users and eventually drive them away from the platform.
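If we wanted k6 to enforce response-time expectations like these automatically, they could be expressed as thresholds in the script's options. This is a hypothetical sketch with made-up limits, not something the current project necessarily configures:

```js
import http from 'k6/http';

export const options = {
  // Hypothetical thresholds derived from the response-time expectations above;
  // k6 marks the run as failed when any of these are crossed.
  thresholds: {
    http_req_duration: [
      'p(90)<250',   // 90% of requests under 250 ms
      'p(95)<300',   // 95% under 300 ms
      'p(99)<1000',  // 99% under 1 s
    ],
  },
};

export default function () {
  http.get('https://test.k6.io/');
}
```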
- The k6 report that was generated is also committed (`k6-report.html`) so you can see how the response times were affected as the number of VUs increased and decreased over time. A sketch of one way such an HTML report can be produced follows below.
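For reference, one common way to write an HTML summary like `k6-report.html` at the end of a run is k6's `handleSummary` hook combined with the community k6-reporter module. Whether this project uses that approach or k6's built-in web-dashboard export is an assumption here:

```js
// Sketch only: uses the community k6-reporter module to render an HTML summary.
// The project may instead rely on k6's built-in web-dashboard HTML export.
import { htmlReport } from 'https://raw.githubusercontent.com/benc-uk/k6-reporter/main/dist/bundle.js';

export function handleSummary(data) {
  // k6 calls handleSummary at the end of the run; returning a map of
  // file names to contents writes the report to disk.
  return {
    'k6-report.html': htmlReport(data),
  };
}
```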
- The k6 tests are integrated with GitHub Actions. For now, the tests run only on pushes to the `main` branch, but this strategy can always be changed as per our needs. You can see the tests running on GitHub Actions.