
--html with --process 4 then get ValueError: StatsEntry.use_response_times_cache must be set to True #2908

Open
kavlevru opened this issue Sep 19, 2024 · 4 comments

@kavlevru

Description

Hi, when I start Locust with --html and --processes, I get a ValueError: StatsEntry.use_response_times_cache must be set to True if we should be able to calculate the current response time percentile.

Command line

locust --processes 4 -f tests/get__api_vhs_public_v1_selection_screen.py --html report.html --autostart --autoquit 5

Locustfile contents

Locustfile:
from locust import HttpUser, task, constant_throughput, tag

from common.core.app_config import AppConfig
from common.core.parsers import get_user_agent
from common.shapes import LinearAcceptableAnswerShape

config = AppConfig().get_config()


class GetApiVhsPublicV1SelectionScreen(HttpUser):
    host = config['HOSTS']['vhs']
    wait_time = constant_throughput(0.1)
    LinearAcceptableAnswerShape.max_users = 500

    @tag('selection_screen')
    @task(1)
    def get_api_vhs_public_v1_selection_screen(self):
        self.client.headers.update({'User-Agent': get_user_agent()})
        self.client.get(url='/api/vhs/public/v1/selection-screen/')

Shape:
from locust import LoadTestShape


class LinearAcceptableAnswerShape(LoadTestShape):
    """
    Эта форма нагрузки увеличивает количество пользователей линейно от 1 до 1000 (100 RPS).
    Когда время 95-го персентиля достигает значения acceptable_response_time (в мс),
    количество пользователей снижается на 2%, и тест продолжается в течение времени, равного test_run_after_decline.
    Если за skipping_iterations, установленном в def tick, итераций (tick) время ответов 95-го персентиля
    не стало меньше значения acceptable_response_time, то шаг повторяется.
    Затем тест завершает работу.

    """

    initial_users = 1
    max_users = 1000
    users_up_time = 100
    current_users = initial_users

    acceptable_response_time = 1500.0
    skipping_iterations = 0

    decline = False
    decline_time = None
    test_run_after_decline = 20

    def tick(self):
        run_time = self.get_run_time()

        if self.decline and (run_time - self.decline_time) >= self.test_run_after_decline:
            return None

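        # NOTE: this relies on the per-second response-times cache
        # (StatsEntry.use_response_times_cache). The ValueError in this report
        # comes from the same check, hit by the worker-side HTML report code.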
        percentile_95 = self.runner.stats.total.get_current_response_time_percentile(0.95) \
                        or self.acceptable_response_time

        if percentile_95 > self.acceptable_response_time and self.skipping_iterations < 1:
            self.decline = True
            self.decline_time = run_time
            self.skipping_iterations = 10
            self.current_users = max(1, int(self.current_users * 0.98))
            return self.current_users, self.current_users

        self.skipping_iterations -= 1

        if self.decline:
            return self.current_users, 1

        if run_time < self.users_up_time:
            self.current_users = int(
                ((self.max_users - self.initial_users) / self.users_up_time) * run_time) + self.initial_users
            spawn_rate = self.current_users
        else:
            self.current_users = self.max_users
            spawn_rate = 1

        return self.current_users, spawn_rate

Python version

3.10.9

Locust version

2.31.6

Operating system

MacOS Sonoma 14.6.1 M1

@kavlevru kavlevru added the bug label Sep 19, 2024
@kavlevru
Author

Test finished.
[2024-09-19 19:14:17,261] RITM-K25KWC/INFO/locust.main: --run-time limit reached, stopping test
[2024-09-19 19:14:22,265] RITM-K25KWC/INFO/locust.main: --autoquit time reached, shutting down
[2024-09-19 19:14:22,266] RITM-K25KWC/INFO/locust.runners: Got quit message from master, shutting down...
[2024-09-19 19:14:22,266] RITM-K25KWC/INFO/locust.runners: Got quit message from master, shutting down...
[2024-09-19 19:14:22,266] RITM-K25KWC/INFO/locust.runners: Got quit message from master, shutting down...
[2024-09-19 19:14:22,266] RITM-K25KWC/INFO/locust.runners: Got quit message from master, shutting down...
Each of the 4 worker processes prints the same traceback (interleaved in the console output, shown once here):

Traceback (most recent call last):
  File "/Users/r.kavlev/PycharmProjects/locust/venv/bin/locust", line 8, in <module>
    sys.exit(main())
  File "/Users/r.kavlev/PycharmProjects/locust/venv/lib/python3.10/site-packages/locust/main.py", line 697, in main
    save_html_report()
  File "/Users/r.kavlev/PycharmProjects/locust/venv/lib/python3.10/site-packages/locust/main.py", line 681, in save_html_report
    html_report = get_html_report(environment, show_download_link=False)
  File "/Users/r.kavlev/PycharmProjects/locust/venv/lib/python3.10/site-packages/locust/html.py", line 55, in get_html_report
    update_stats_history(environment.runner)
  File "/Users/r.kavlev/PycharmProjects/locust/venv/lib/python3.10/site-packages/locust/stats.py", line 908, in update_stats_history
    current_response_time_percentiles = {
  File "/Users/r.kavlev/PycharmProjects/locust/venv/lib/python3.10/site-packages/locust/stats.py", line 911, in <dictcomp>
    stats.total.get_current_response_time_percentile(percentile) or 0,
  File "/Users/r.kavlev/PycharmProjects/locust/venv/lib/python3.10/site-packages/locust/stats.py", line 597, in get_current_response_time_percentile
    raise ValueError(
ValueError: StatsEntry.use_response_times_cache must be set to True if we should be able to calculate the _current_ response time percentile
[2024-09-19 19:14:22,831] RITM-K25KWC/INFO/locust.main: writing html report to file: report.html
[2024-09-19 19:14:23,338] RITM-K25KWC/INFO/locust.main: Shutting down (exit code 0)

@kavlevru kavlevru reopened this Sep 19, 2024
@cyberw
Collaborator

cyberw commented Sep 19, 2024

Hi! Is this only an issue when also using shapes or can you reproduce it with any locustfile?

@kavlevru
Author

@cyberw, I can reproduce it without the shape:

locust -f tests/get__api_vhs_public_v1_selection_screen.py -u 100 -r 10 --run-time 1m --html report.html --process 4
[2024-09-20 00:11:34,548] RITM-K25KWC/INFO/locust.main: Starting web interface at http://0.0.0.0:8089
[2024-09-20 00:11:34,564] RITM-K25KWC/INFO/locust.main: Starting Locust 2.31.6
[2024-09-20 00:11:34,564] RITM-K25KWC/INFO/root: Waiting for workers to be ready, 0 of 4 connected
[2024-09-20 00:11:34,565] RITM-K25KWC/INFO/locust.runners: Worker RITM-K25KWC.local_b7bba10bcba44c8b88d687205307f6d9 (index 0) reported as ready. 1 workers connected.
[2024-09-20 00:11:34,565] RITM-K25KWC/INFO/locust.runners: Worker RITM-K25KWC.local_ed52ac652b2141f6980efb839a892987 (index 1) reported as ready. 2 workers connected.
[2024-09-20 00:11:34,566] RITM-K25KWC/INFO/locust.runners: Worker RITM-K25KWC.local_8271cbfc53e34f1bb4427b73739c9969 (index 2) reported as ready. 3 workers connected.
[2024-09-20 00:11:34,566] RITM-K25KWC/INFO/locust.runners: Worker RITM-K25KWC.local_237ce8d3838847f49b5b96881c75c3cd (index 3) reported as ready. 4 workers connected.
[2024-09-20 00:11:34,566] RITM-K25KWC/INFO/locust.main: Starting Locust 2.31.6
[2024-09-20 00:11:34,566] RITM-K25KWC/INFO/locust.main: Starting Locust 2.31.6
[2024-09-20 00:11:34,566] RITM-K25KWC/INFO/locust.main: Starting Locust 2.31.6
[2024-09-20 00:11:34,567] RITM-K25KWC/INFO/locust.main: Starting Locust 2.31.6
[2024-09-20 00:11:35,570] RITM-K25KWC/INFO/locust.main: Run time limit set to 60 seconds
[2024-09-20 00:11:35,570] RITM-K25KWC/INFO/locust.runners: Sending spawn jobs of 100 users at 10.00 spawn rate to 4 ready workers
[2024-09-20 00:11:44,613] RITM-K25KWC/INFO/locust.runners: All users spawned: {"GetApiVhsPublicV1SelectionScreen": 100} (100 total users)
[2024-09-20 00:12:35,573] RITM-K25KWC/INFO/locust.main: --run-time limit reached, stopping test
[2024-09-20 00:12:35,614] RITM-K25KWC/INFO/locust.runners: Worker RITM-K25KWC.local_ed52ac652b2141f6980efb839a892987 (index 1) reported that it has stopped, removing from running workers
[2024-09-20 00:12:35,615] RITM-K25KWC/INFO/locust.runners: Worker RITM-K25KWC.local_ed52ac652b2141f6980efb839a892987 (index 1) reported as ready. 4 workers connected.
[2024-09-20 00:12:35,615] RITM-K25KWC/INFO/locust.runners: Worker RITM-K25KWC.local_b7bba10bcba44c8b88d687205307f6d9 (index 0) reported that it has stopped, removing from running workers
[2024-09-20 00:12:35,615] RITM-K25KWC/INFO/locust.runners: Worker RITM-K25KWC.local_8271cbfc53e34f1bb4427b73739c9969 (index 2) reported that it has stopped, removing from running workers
[2024-09-20 00:12:35,615] RITM-K25KWC/INFO/locust.runners: Worker RITM-K25KWC.local_b7bba10bcba44c8b88d687205307f6d9 (index 0) reported as ready. 3 workers connected.
[2024-09-20 00:12:35,615] RITM-K25KWC/INFO/locust.runners: Worker RITM-K25KWC.local_237ce8d3838847f49b5b96881c75c3cd (index 3) reported that it has stopped, removing from running workers
[2024-09-20 00:12:35,615] RITM-K25KWC/INFO/locust.runners: Worker RITM-K25KWC.local_8271cbfc53e34f1bb4427b73739c9969 (index 2) reported as ready. 3 workers connected.
[2024-09-20 00:12:35,615] RITM-K25KWC/INFO/locust.runners: Worker RITM-K25KWC.local_237ce8d3838847f49b5b96881c75c3cd (index 3) reported as ready. 4 workers connected.
[2024-09-20 00:12:41,580] RITM-K25KWC/INFO/locust.main: --autoquit time reached, shutting down
[2024-09-20 00:12:41,584] RITM-K25KWC/INFO/locust.runners: Got quit message from master, shutting down...
[2024-09-20 00:12:41,584] RITM-K25KWC/INFO/locust.runners: Got quit message from master, shutting down...
[2024-09-20 00:12:41,584] RITM-K25KWC/INFO/locust.runners: Got quit message from master, shutting down...
[2024-09-20 00:12:41,584] RITM-K25KWC/INFO/locust.runners: Got quit message from master, shutting down...
Each of the 4 worker processes again prints the same traceback (interleaved in the console output, shown once here):

Traceback (most recent call last):
  File "/Users/r.kavlev/PycharmProjects/locust/venv/bin/locust", line 8, in <module>
    sys.exit(main())
  File "/Users/r.kavlev/PycharmProjects/locust/venv/lib/python3.10/site-packages/locust/main.py", line 697, in main
    save_html_report()
  File "/Users/r.kavlev/PycharmProjects/locust/venv/lib/python3.10/site-packages/locust/main.py", line 681, in save_html_report
    html_report = get_html_report(environment, show_download_link=False)
  File "/Users/r.kavlev/PycharmProjects/locust/venv/lib/python3.10/site-packages/locust/html.py", line 55, in get_html_report
    update_stats_history(environment.runner)
  File "/Users/r.kavlev/PycharmProjects/locust/venv/lib/python3.10/site-packages/locust/stats.py", line 908, in update_stats_history
    current_response_time_percentiles = {
  File "/Users/r.kavlev/PycharmProjects/locust/venv/lib/python3.10/site-packages/locust/stats.py", line 911, in <dictcomp>
    stats.total.get_current_response_time_percentile(percentile) or 0,
  File "/Users/r.kavlev/PycharmProjects/locust/venv/lib/python3.10/site-packages/locust/stats.py", line 597, in get_current_response_time_percentile
    raise ValueError(
ValueError: StatsEntry.use_response_times_cache must be set to True if we should be able to calculate the _current_ response time percentile
[2024-09-20 00:12:42,156] RITM-K25KWC/INFO/locust.main: writing html report to file: report.html
[2024-09-20 00:12:42,663] RITM-K25KWC/INFO/locust.main: Shutting down (exit code 0)
Type     Name                                                                          # reqs      # fails |    Avg     Min     Max    Med |   req/s  failures/s
--------|----------------------------------------------------------------------------|-------|-------------|-------|-------|-------|-------|--------|-----------
GET      /api/vhs/public/v1/selection-screen/                                             600     0(0.00%) |    408     196    1571    400 |   10.09        0.00
--------|----------------------------------------------------------------------------|-------|-------------|-------|-------|-------|-------|--------|-----------
         Aggregated                                                                       600     0(0.00%) |    408     196    1571    400 |   10.09        0.00

Response time percentiles (approximated)
Type     Name                                                                                  50%    66%    75%    80%    90%    95%    98%    99%  99.9% 99.99%   100% # reqs
--------|--------------------------------------------------------------------------------|--------|------|------|------|------|------|------|------|------|------|------|------
GET      /api/vhs/public/v1/selection-screen/                                                  400    430    450    470    530    570    660    820   1600   1600   1600    600
--------|--------------------------------------------------------------------------------|--------|------|------|------|------|------|------|------|------|------|------|------
         Aggregated                                                                            400    430    450    470    530    570    660    820   1600   1600   1600    600

@kavlevru
Author

kavlevru commented Sep 19, 2024

@cyberw, if I set use_response_times_cache to True in env.py -> create_worker_runner, the error is gone:
https://github.com/locustio/locust/blob/master/locust/env.py#L157
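Roughly, the change looks like this (a sketch paraphrased from Environment.create_worker_runner in locust/env.py around the linked line; the docstring, comments and the internal _create_runner call are reproduced from memory and may differ slightly in the actual source — the only thing actually changed is the use_response_times_cache flag):

def create_worker_runner(self, master_host, master_port):
    """Create a WorkerRunner instance for this Environment."""
    # Upstream constructs the worker's stats with the response-times cache
    # disabled, since the cache is normally not needed on worker nodes.
    # Flipping the flag to True lets the worker-side HTML report call
    # get_current_response_time_percentile() without raising the ValueError.
    self.stats = RequestStats(use_response_times_cache=True)  # upstream: False
    return self._create_runner(
        WorkerRunner,
        master_host=master_host,
        master_port=master_port,
    )

Editing site-packages like this is of course only a way to confirm where the problem comes from, not a proper fix.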

locust -f tests/get__api_vhs_public_v1_selection_screen.py -u 100 -r 10 --run-time 1m --html report.html --process 4
[2024-09-20 00:16:43,790] RITM-K25KWC/INFO/locust.main: Starting web interface at http://0.0.0.0:8089
[2024-09-20 00:16:43,810] RITM-K25KWC/INFO/locust.main: Starting Locust 2.31.6
[2024-09-20 00:16:43,810] RITM-K25KWC/INFO/root: Waiting for workers to be ready, 0 of 4 connected
[2024-09-20 00:16:43,947] RITM-K25KWC/INFO/locust.runners: Worker RITM-K25KWC.local_73e616ec958741beada9fab26739580c (index 0) reported as ready. 1 workers connected.
[2024-09-20 00:16:43,947] RITM-K25KWC/INFO/locust.runners: Worker RITM-K25KWC.local_4685be6bb12e4decb8312d758726db04 (index 1) reported as ready. 2 workers connected.
[2024-09-20 00:16:43,947] RITM-K25KWC/INFO/locust.runners: Worker RITM-K25KWC.local_e279b0eb6f4b49fb920c6456d06265ab (index 2) reported as ready. 3 workers connected.
[2024-09-20 00:16:43,947] RITM-K25KWC/INFO/locust.runners: Worker RITM-K25KWC.local_48df64c9b5aa45628af2246f9398b3a1 (index 3) reported as ready. 4 workers connected.
[2024-09-20 00:16:43,947] RITM-K25KWC/INFO/locust.main: Starting Locust 2.31.6
[2024-09-20 00:16:43,948] RITM-K25KWC/INFO/locust.main: Starting Locust 2.31.6
[2024-09-20 00:16:43,948] RITM-K25KWC/INFO/locust.main: Starting Locust 2.31.6
[2024-09-20 00:16:43,948] RITM-K25KWC/INFO/locust.main: Starting Locust 2.31.6
[2024-09-20 00:16:44,814] RITM-K25KWC/INFO/locust.main: Run time limit set to 60 seconds
[2024-09-20 00:16:44,814] RITM-K25KWC/INFO/locust.runners: Sending spawn jobs of 100 users at 10.00 spawn rate to 4 ready workers
[2024-09-20 00:16:53,883] RITM-K25KWC/INFO/locust.runners: All users spawned: {"GetApiVhsPublicV1SelectionScreen": 100} (100 total users)
[2024-09-20 00:17:44,819] RITM-K25KWC/INFO/locust.main: --run-time limit reached, stopping test
[2024-09-20 00:17:44,859] RITM-K25KWC/INFO/locust.runners: Worker RITM-K25KWC.local_4685be6bb12e4decb8312d758726db04 (index 1) reported that it has stopped, removing from running workers
[2024-09-20 00:17:44,859] RITM-K25KWC/INFO/locust.runners: Worker RITM-K25KWC.local_4685be6bb12e4decb8312d758726db04 (index 1) reported as ready. 4 workers connected.
[2024-09-20 00:17:44,859] RITM-K25KWC/INFO/locust.runners: Worker RITM-K25KWC.local_48df64c9b5aa45628af2246f9398b3a1 (index 3) reported that it has stopped, removing from running workers
[2024-09-20 00:17:44,859] RITM-K25KWC/INFO/locust.runners: Worker RITM-K25KWC.local_48df64c9b5aa45628af2246f9398b3a1 (index 3) reported as ready. 4 workers connected.
[2024-09-20 00:17:44,860] RITM-K25KWC/INFO/locust.runners: Worker RITM-K25KWC.local_e279b0eb6f4b49fb920c6456d06265ab (index 2) reported that it has stopped, removing from running workers
[2024-09-20 00:17:44,860] RITM-K25KWC/INFO/locust.runners: Worker RITM-K25KWC.local_e279b0eb6f4b49fb920c6456d06265ab (index 2) reported as ready. 4 workers connected.
[2024-09-20 00:17:44,862] RITM-K25KWC/INFO/locust.runners: Worker RITM-K25KWC.local_73e616ec958741beada9fab26739580c (index 0) reported that it has stopped, removing from running workers
[2024-09-20 00:17:44,862] RITM-K25KWC/INFO/locust.runners: Worker RITM-K25KWC.local_73e616ec958741beada9fab26739580c (index 0) reported as ready. 4 workers connected.
[2024-09-20 00:17:50,832] RITM-K25KWC/INFO/locust.main: --autoquit time reached, shutting down
[2024-09-20 00:17:50,835] RITM-K25KWC/INFO/locust.runners: Got quit message from master, shutting down...
[2024-09-20 00:17:50,835] RITM-K25KWC/INFO/locust.runners: Got quit message from master, shutting down...
[2024-09-20 00:17:50,837] RITM-K25KWC/INFO/locust.runners: Got quit message from master, shutting down...
[2024-09-20 00:17:50,837] RITM-K25KWC/INFO/locust.runners: Got quit message from master, shutting down...
[2024-09-20 00:17:50,913] RITM-K25KWC/INFO/locust.main: writing html report to file: report.html
[2024-09-20 00:17:50,914] RITM-K25KWC/INFO/locust.main: writing html report to file: report.html
[2024-09-20 00:17:50,914] RITM-K25KWC/INFO/locust.main: writing html report to file: report.html
[2024-09-20 00:17:50,914] RITM-K25KWC/INFO/locust.main: writing html report to file: report.html
[2024-09-20 00:17:50,919] RITM-K25KWC/INFO/locust.runners: Worker 'RITM-K25KWC.local_73e616ec958741beada9fab26739580c' (index 0) quit. 3 workers ready.
[2024-09-20 00:17:50,919] RITM-K25KWC/INFO/locust.runners: Worker 'RITM-K25KWC.local_e279b0eb6f4b49fb920c6456d06265ab' (index 2) quit. 2 workers ready.
[2024-09-20 00:17:50,920] RITM-K25KWC/INFO/locust.runners: Worker 'RITM-K25KWC.local_48df64c9b5aa45628af2246f9398b3a1' (index 3) quit. 1 workers ready.
[2024-09-20 00:17:50,919] RITM-K25KWC/INFO/locust.main: Shutting down (exit code 0)
[2024-09-20 00:17:50,921] RITM-K25KWC/INFO/locust.runners: Worker 'RITM-K25KWC.local_4685be6bb12e4decb8312d758726db04' (index 1) quit. 0 workers ready.
[2024-09-20 00:17:50,921] RITM-K25KWC/INFO/locust.runners: The last worker quit, stopping test.
[2024-09-20 00:17:50,919] RITM-K25KWC/INFO/locust.main: Shutting down (exit code 0)
[2024-09-20 00:17:50,920] RITM-K25KWC/INFO/locust.main: Shutting down (exit code 0)
[2024-09-20 00:17:50,920] RITM-K25KWC/INFO/locust.main: Shutting down (exit code 0)
[2024-09-20 00:17:51,417] RITM-K25KWC/INFO/locust.main: writing html report to file: report.html
[2024-09-20 00:17:51,925] RITM-K25KWC/INFO/locust.main: Shutting down (exit code 0)
Type     Name                                                                          # reqs      # fails |    Avg     Min     Max    Med |   req/s  failures/s
--------|----------------------------------------------------------------------------|-------|-------------|-------|-------|-------|-------|--------|-----------
GET      /api/vhs/public/v1/selection-screen/                                             600     0(0.00%) |    371     163     882    360 |   10.09        0.00
--------|----------------------------------------------------------------------------|-------|-------------|-------|-------|-------|-------|--------|-----------
         Aggregated                                                                       600     0(0.00%) |    371     163     882    360 |   10.09        0.00

Response time percentiles (approximated)
Type     Name                                                                                  50%    66%    75%    80%    90%    95%    98%    99%  99.9% 99.99%   100% # reqs
--------|--------------------------------------------------------------------------------|--------|------|------|------|------|------|------|------|------|------|------|------
GET      /api/vhs/public/v1/selection-screen/                                                  360    380    410    440    480    540    720    750    880    880    880    600
--------|--------------------------------------------------------------------------------|--------|------|------|------|------|------|------|------|------|------|------|------
         Aggregated                                                                            360    380    410    440    480    540    720    750    880    880    880    600
