WordPress. Love it or hate it, it runs close to 40% of the internet. So when tasked with increasing its performance, we need to establish a baseline and then measure our incremental changes. Cue Locust!
-
Install python3-venv and python dev packages
apt-get install python3-venv python3-dev
-
Create locust virtual environment and activate it
python3 -m venv ~/.python_envs/locust
source ~/.python_envs/locust/bin/activate
-
Install locust
pip install locust
-
Create locustfile.py, changing https://wordpress-site.com to the site you’re testing
from locust import HttpUser, TaskSet, task, between

class WordPressTasks(TaskSet):
    @task
    def index(self):
        self.client.get("/")

    @task
    def login_page(self):
        self.client.get("/wp-login.php")

class WordPressUser(HttpUser):
    tasks = [WordPressTasks]
    host = "https://wordpress-site.com"
    wait_time = between(5, 10)
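If you want realistic traffic shaping, task weights are a small tweak. The variant below (the ratio and the flattened class layout are my assumptions, not part of the original plan) requests the front page three times as often as the login page:

```python
from locust import HttpUser, task, between

class WordPressUser(HttpUser):
    host = "https://wordpress-site.com"
    wait_time = between(5, 10)

    @task(3)  # assumed weight: hit the front page 3x as often
    def index(self):
        self.client.get("/")

    @task
    def login_page(self):
        self.client.get("/wp-login.php")
```

This style, with tasks defined directly on the HttpUser subclass, is equivalent for a simple test like this one.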
-
Run your test.
-u specifies the number of users
-r defines the spawn rate in users per second
-t defines how long the test runs (here, 2 minutes)
locust --headless -u 50 -r 10 -t2m --only-summary
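As a back-of-envelope check before running, you can estimate the request rate this command will generate. Each user runs one task, then sleeps for between(5, 10) seconds, so on average a user fires a request every ~7.5 seconds. This is a rough approximation that ignores response time:

```python
# Rough throughput estimate for -u 50 with wait_time = between(5, 10).
users = 50
avg_wait = (5 + 10) / 2        # between(5, 10) averages 7.5s per task
est_rps = users / avg_wait     # each user fires ~1 request per wait cycle
print(round(est_rps, 1))       # ~6.7 requests/second
```

If your observed req/s is far below this estimate, response times, not Locust, are the bottleneck.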
-
Distribute your test. Running Locust as above limits you to a single CPU on your testing machine, and you'll likely see the following message when exceeding ~75 users:
[2021-01-16 08:16:57,407] fontani/WARNING/locust.runners: CPU usage was too high at some point during the test! See https://docs.locust.io/en/stable/running-locust-distributed.html for how to distribute the load over multiple CPU cores or machines
Locust implements a client/server model to distribute workloads across multiple CPUs or hosts. We modify our locust command from above to run as the master: we declare the number of expected workers and keep the -u, -r, and -t arguments from the run step. Increase the values as you see fit; they will be evenly distributed among the workers.
locust -f locustfile.py --master --headless --expect-workers=8 -u 500 -r 30 -t3m --only-summary
Locust will start in server mode and wait for the workers to join. Once all 8 are running, the test will start.
[2021-01-16 08:42:24,640] fontani/INFO/root: Waiting for workers to be ready, 0 of 6 connected
[2021-01-16 08:42:25,641] fontani/INFO/root: Waiting for workers to be ready, 0 of 6 connected
[2021-01-16 08:42:26,455] fontani/INFO/locust.runners: Client 'fontani_df70fddb37af2cb0aaab8a7164b20e99' reported as ready. Currently 1 clients ready to swarm.
...
[2021-01-16 09:11:18,477] fontani/INFO/locust.runners: Client 'fontani_d48661727c2a4d1b4afe54b41190e58b' reported as ready. Currently 8 clients ready to swarm.
[2021-01-16 09:11:19,134] fontani/INFO/locust.runners: Sending spawn jobs of 62 users and 3.75 spawn rate to 8 ready clients
[2021-01-16 09:11:19,137] fontani/INFO/locust.main: Run time limit set to 180 seconds
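The per-worker numbers in that log follow directly from dividing the totals by the worker count. A minimal sketch (simplified; real Locust also hands the integer-division remainder to some of the workers):

```python
# How the master splits -u and -r across workers (simplified sketch).
def split_load(users, spawn_rate, workers):
    per_worker_users = users // workers      # integer division
    per_worker_rate = spawn_rate / workers
    return per_worker_users, per_worker_rate

print(split_load(500, 30, 8))  # -> (62, 3.75), matching the log above
```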
-
Fire up screen on the host and run the following command in 8 separate screen windows.
locust --headless -f locustfile.py --worker --master-host=127.0.0.1
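If juggling 8 screen windows feels tedious, the workers can also be launched from a single shell as background jobs. This is a sketch, assuming locust is on your PATH, locustfile.py is in the current directory, and the master is already running:

```shell
# Launch 8 locust workers in the background instead of 8 screens.
for i in $(seq 8); do
  locust --headless -f locustfile.py --worker --master-host=127.0.0.1 &
done
wait  # block until the workers exit when the test run ends
```

The workers take their user counts and spawn rates from the master, so no -u, -r, or -t arguments are needed here.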
-
Once the test completes, the master process will spit out a report.
 Name                      # reqs      # fails  |     Avg     Min     Max  Median  |   req/s failures/s
--------------------------------------------------------------------------------------------------------------------------------------------
 GET /                       3092     0(0.00%)  |     727     411    2940     610  |   17.17    0.00
 GET /wp-login.php           3234     0(0.00%)  |    1145     507    3578    1000  |   17.96    0.00
--------------------------------------------------------------------------------------------------------------------------------------------
 Aggregated                  6326     0(0.00%)  |     941     411    3578     830  |   35.12    0.00

Response time percentiles (approximated)
 Type     Name                        50%    66%    75%    80%    90%    95%    98%    99%  99.9% 99.99%   100% # reqs
--------|---------------------------|------|------|------|------|------|------|------|------|------|------|------|------|
 GET      /                          610    710    810    870   1100   1400   1900   2200   2700   2900   2900   3092
 GET      /wp-login.php             1000   1200   1300   1400   1800   2000   2400   2600   3200   3600   3600   3234
--------|---------------------------|------|------|------|------|------|------|------|------|------|------|------|------|
 None     Aggregated                 830   1000   1100   1200   1500   1900   2200   2400   3200   3600   3600   6326
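As a quick sanity check on the aggregated req/s column: it is roughly the total request count divided by the run duration. The small gap from the reported 35.12 comes from Locust using the exact elapsed wall-clock time rather than the nominal -t3m:

```python
# Sanity-check the aggregated req/s against the 3-minute run time.
total_reqs = 6326
duration = 180  # -t3m in seconds
print(round(total_reqs / duration, 2))  # -> 35.14, close to the reported 35.12
```

Re-running this check after each tuning change is a cheap way to confirm the report numbers are internally consistent.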