Iguana provides an environment which:
- is highly configurable
- provides a realistic scenario benchmark
- works on every dataset
- works on SPARQL HTTP endpoints
- works on HTTP GET & POST endpoints
- works on CLI applications
- and is easily extendable
Per run metrics:
- Query Mixes Per Hour (QMPH)
- Number of Queries Per Hour (NoQPH)
- Number of Queries (NoQ)
- Average Queries Per Second (AvgQPS)
- Penalized Average Queries Per Second (PAvgQPS)
Per query metrics:
- Queries Per Second (QPS)
- Penalized Queries Per Second (PQPS)
- Number of successful and failed queries
- Result size
- Sum of execution times
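To illustrate the relationship between the plain and penalized metrics, here is a small sketch, not Iguana's actual implementation: it assumes that QPS divides the number of successful executions by their total runtime, and that the penalized variant additionally charges each failed execution a fixed penalty duration (e.g. the query timeout).

```python
# Illustrative sketch of the per-query metrics (not Iguana's own code).
# Assumption: failed executions are penalized with a fixed duration
# (e.g. the configured timeout) in the penalized variant.

def qps(successful_times):
    """Queries Per Second: successful executions / their total runtime (s)."""
    total = sum(successful_times)
    return len(successful_times) / total if total > 0 else 0.0

def pqps(successful_times, failed_count, penalty):
    """Penalized QPS: failed executions also count, each adding a
    fixed penalty duration to the total runtime."""
    total = sum(successful_times) + failed_count * penalty
    runs = len(successful_times) + failed_count
    return runs / total if total > 0 else 0.0

# 8 successful runs of 0.5 s each, 2 failures penalized with 10 s each:
print(qps([0.5] * 8))          # 8 runs / 4.0 s  -> 2.0
print(pqps([0.5] * 8, 2, 10))  # 10 runs / 24.0 s
```

The penalized metrics exist because a store that fails fast on hard queries would otherwise look faster than one that answers them; the penalty term prevents that.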
To run Iguana, you need Java 17 or later installed on your system.
Download the newest release of Iguana from the GitHub releases page, or run in a Unix shell:
wget https://github.com/dice-group/IGUANA/releases/download/v4.0.0/iguana-4.0.0.zip
unzip iguana-4.0.0.zip
The zip file contains the following files:
iguana-X.Y.Z.jar
start-iguana.sh
example-suite.yml
You can use the provided example configuration and modify it to your needs. For further information please visit our configuration and Stresstest wiki pages.
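As a rough orientation, a suite configuration is a YAML file that names the endpoints to benchmark and the tasks to run against them. The snippet below is only a hypothetical sketch; the key names and values shown (connection, worker, and query settings) are illustrative assumptions, and the authoritative schema is the one described in the configuration wiki and in the shipped example-suite.yml.

```yaml
# Hypothetical, simplified suite sketch -- consult the configuration wiki
# and example-suite.yml for the actual schema and key names.
connections:
  - name: "my-triplestore"            # label used in the results
    endpoint: "http://localhost:3030/ds/sparql"  # SPARQL HTTP endpoint
tasks:
  - type: stresstest                  # benchmark task to execute
    workers:
      - number: 2                     # parallel simulated users
        queries:
          path: "./queries.txt"       # one query per line
```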
For a detailed, step-by-step walkthrough of a benchmarking example, please visit our tutorial.
Start Iguana with a benchmark suite (e.g. the example-suite.yml) either by using the start script:
./start-iguana.sh example-suite.yml
or by directly executing the jar-file:
java -jar iguana-X.Y.Z.jar example-suite.yml
@InProceedings{10.1007/978-3-319-68204-4_5,
author="Conrads, Lixi
and Lehmann, Jens
and Saleem, Muhammad
and Morsey, Mohamed
and Ngonga Ngomo, Axel-Cyrille",
editor="d'Amato, Claudia
and Fernandez, Miriam
and Tamma, Valentina
and Lecue, Freddy
and Cudr{\'e}-Mauroux, Philippe
and Sequeda, Juan
and Lange, Christoph
and Heflin, Jeff",
title="Iguana: A Generic Framework for Benchmarking the Read-Write Performance of Triple Stores",
booktitle="The Semantic Web -- ISWC 2017",
year="2017",
publisher="Springer International Publishing",
address="Cham",
pages="48--65",
abstract="The performance of triples stores is crucial for applications driven by RDF. Several benchmarks have been proposed that assess the performance of triple stores. However, no integrated benchmark-independent execution framework for these benchmarks has yet been provided. We propose a novel SPARQL benchmark execution framework called Iguana. Our framework complements benchmarks by providing an execution environment which can measure the performance of triple stores during data loading, data updates as well as under different loads and parallel requests. Moreover, it allows a uniform comparison of results on different benchmarks. We execute the FEASIBLE and DBPSB benchmarks using the Iguana framework and measure the performance of popular triple stores under updates and parallel user requests. We compare our results (See https://doi.org/10.6084/m9.figshare.c.3767501.v1) with state-of-the-art benchmarking results and show that our benchmark execution framework can unveil new insights pertaining to the performance of triple stores.",
isbn="978-3-319-68204-4"
}