
logfile is erroneously written when there are many workers. #2927

Open
2 tasks done
clint-qrypt opened this issue Oct 7, 2024 · 1 comment
Prerequisites

Description

When using processes = -1 in the locust.conf file with 12 or more workers running, the logfile is corrupted by simultaneous writes: data from one worker is inserted in the middle of another worker's writes, and many newlines are missing.
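This is not a Locust-specific fix, but the interleaving described above is the classic symptom of many processes appending to one file concurrently. A common workaround from the Python standard library is the QueueHandler/QueueListener pattern, where workers push log records onto a shared queue and a single listener performs all file writes, so lines cannot be split mid-write. A minimal sketch (filename and worker count are placeholders, not from the issue):

```python
import logging
import logging.handlers
import multiprocessing

def worker(queue, worker_id):
    # Each worker sends records to the shared queue instead of
    # writing to the logfile directly.
    logger = logging.getLogger(f"worker-{worker_id}")
    logger.setLevel(logging.INFO)
    logger.addHandler(logging.handlers.QueueHandler(queue))
    logger.info("message from worker %d", worker_id)

def main(logfile="locust.log"):
    queue = multiprocessing.Queue()
    # A single listener drains the queue and performs every file
    # write, so records from different processes never interleave.
    file_handler = logging.FileHandler(logfile)
    file_handler.setFormatter(logging.Formatter("%(name)s %(message)s"))
    listener = logging.handlers.QueueListener(queue, file_handler)
    listener.start()

    procs = [multiprocessing.Process(target=worker, args=(queue, i))
             for i in range(4)]
    for p in procs:
        p.start()
    for p in procs:
        p.join()

    listener.stop()
    file_handler.close()

if __name__ == "__main__":
    main()
```

Whether Locust can route its per-worker logging through such a listener is a separate question; the sketch only illustrates why a single writer avoids the corruption.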

Command line

locust

Locustfile contents

self.client.post(
    "/api/v1/entropy",
    name="entropy" + "/" + TEST_CASE.name,
    headers=self.headers,
    json=TEST_CASE.value,
    timeout=90,
)

#conf file
host = http://localhost    # uncomment this for local dev env testing
users = 300
spawn-rate = 25
processes = -1
csv = results/300VUs-32B-Soak-Test
only-summary = true
run-time = 36h
headless = true
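For context on why "12 or more workers" appear at all: processes = -1 asks Locust to auto-detect the worker count, which resolves to one worker per logical CPU core. A sketch of that resolution rule (an assumption about Locust's exact behaviour, not its source):

```python
import os

def worker_count(processes: int) -> int:
    # Resolve a "-1 means one worker per logical core" setting,
    # mirroring the behaviour described for processes = -1.
    if processes == -1:
        return os.cpu_count() or 1
    return processes
```

So on a machine with 12+ cores, this config spawns 12+ processes all appending to the same logfile.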

Python version

3.10

Locust version

2.31.8

Operating system

Ubuntu 22.04

@clint-qrypt clint-qrypt added the bug label Oct 7, 2024
@clint-qrypt (Author)

[Screenshot 2024-10-07 at 8 09 37 AM]
