
Commit

Enhancement: more logging in E2E tests + turn back on commit-workflow E2E test (#1523)
tzaffi authored May 26, 2023
1 parent be76440 commit 25ab223
Showing 13 changed files with 372 additions and 181 deletions.
7 changes: 5 additions & 2 deletions .circleci/config.yml
@@ -46,16 +46,17 @@ jobs:
go_version:
type: string
environment:
CI_E2E_FILENAME: "fa6ad40d/rel-nightly"
CI_E2E_FILENAME: "f99e7b0c/rel-nightly"
steps:
- go/install:
version: << parameters.go_version >>
- install_dependencies
- install_linter
- run_tests
# Change this to run_e2e_tests once we have stable algod binaries containing delta APIs
# - run_e2e_tests_nightly
- run_e2e_tests_nightly
- codecov/upload

test_nightly:
machine:
image: << pipeline.parameters.ubuntu_image >>
@@ -133,6 +134,7 @@ commands:
command: go install github.com/golangci/golangci-lint/cmd/[email protected]

run_e2e_tests:
# no "-nightly" in final 2 steps
steps:
- run:
name: Install go-algorand stable binaries
@@ -144,6 +146,7 @@ commands:
- run: make e2e-filter-test

run_e2e_tests_nightly:
# "-nightly" suffix in final 2 steps
steps:
- run:
name: Install go-algorand nightly binaries
87 changes: 59 additions & 28 deletions api/README.md
@@ -9,55 +9,68 @@ The API is defined using [OpenAPI v2](https://swagger.io/specification/v2/) in *
The Makefile will install our fork of **oapi-codegen**; use `make oapi-codegen` to install it directly.

1. Document your changes by editing **indexer.oas2.yml**
2. Regenerate the endpoints by running `make generate`. The sources at **generated/** will be updated.
2. Regenerate the endpoints by running **generate.sh**. The sources at **generated/** will be updated.
3. Update the implementation in **handlers.go**. It is sometimes useful to consult **generated/routes.go** to make sure the handler properly implements **ServerInterface**.

## What codegen tool is used?

We found that [oapi-codegen](https://github.com/deepmap/oapi-codegen) produced the cleanest code, and had an easy to work with codebase. There is an algorand fork of this project which contains a couple modifications that were needed to properly support our needs.
We found that [oapi-codegen](https://github.com/deepmap/oapi-codegen) produced the cleanest code and had an easy-to-work-with codebase.
There is an algorand fork of this project which contains a couple of modifications that were needed to properly support our needs.

Specifically, `uint64` types aren't strictly supported by OpenAPI. So we added a type-mapping feature to oapi-codegen.

## Why do we have indexer.oas2.yml and indexer.oas3.yml?

We chose to maintain V2 and V3 versions of the spec because OpenAPI v3 doesn't seem to be widely supported. Some tools worked better with V3 and others with V2, so having both available has been useful. To reduce developer burdon, the v2 specfile is automatically converted v3 using [converter.swagger.io](http://converter.swagger.io/).
We chose to maintain V2 and V3 versions of the spec because OpenAPI v3 doesn't seem to be widely supported. Some tools worked better with V3 and others with V2, so having both available has been useful.
To reduce developer burden, the v2 specfile is automatically converted to v3 using [converter.swagger.io](http://converter.swagger.io/).

# Fixtures Test
## What is a **Fixtures Test**?
## Fixtures Test

Currently (September 2022) [fixtures_test.go](./fixtures_test.go) is a library that allows testing Indexer's router to verify that endpoints accept parameters and respond as expected, and guard against future regressions. [app_boxes_fixtures_test.go](./app_boxes_fixtures_test.go) is an example _fixtures test_ and is the _creator_ of the fixture [boxes.json](./test_resources/boxes.json).
### What is a **Fixtures Test**?

Currently (September 2022) [fixtures_test.go](./fixtures_test.go) is a library that allows testing Indexer's router, verifying that endpoints accept parameters and respond as expected,
and guarding against future regressions. [app_boxes_fixtures_test.go](./app_boxes_fixtures_test.go) is an example _fixtures test_ and is the _creator_ of the fixture [boxes.json](./test_resources/boxes.json).

A fixtures test

1. is defined by a go-slice called a _Seed Fixture_ e.g. [var boxSeedFixture](https://github.com/algorand/indexer/blob/b5025ad640fabac0d778b4cac60d558a698ed560/api/app_boxes_fixtures_test.go#L302-L692) which contains request information for making HTTP requests against an Indexer server
1. is defined by a go-slice called a _Seed Fixture_ e.g. [var boxSeedFixture](https://github.com/algorand/indexer/blob/b5025ad640fabac0d778b4cac60d558a698ed560/api/app_boxes_fixtures_test.go#L302-L692)
which contains request information for making HTTP requests against an Indexer server
2. iterates through the slice, making each of the defined requests and generating a _Live Fixture_
3. reads a _Saved Fixture_ from a json file e.g. [boxes.json](./test_resources/boxes.json)
4. persists the _Live Fixture_ to a json file not in source control
5. asserts that the _Saved Fixture_ is equal to the _Live Fixture_

In reality, because we always want to save the _Live Fixture_ before making assertions that could fail the test and pre-empt saving, steps (3) and (4) happen in the opposite order.

## What's the purpose of a Fixtures Test?
### What's the purpose of a Fixtures Test?

A fixtures test should allow one to quickly stand up an end-to-end test to validate that Indexer endpoints are working as expected. After Indexer's state is programmatically set up,
it's easy to add new requests and verify that the responses look exactly as expected. Once you're satisfied that the responses are correct, it's easy to _freeze_ the test and guard against future regressions.

A fixtures test should allow one to quickly stand up an end-to-end test to validate that Indexer endpoints are working as expected. After Indexer's state is programmatically set up, it's easy to add new requests and verify that the responses look exactly as expected. Once you're satisfied that the responses are correct, it's easy to _freeze_ the test and guard against future regressions.
## What does a **Fixtures Test Function** Look Like?
### What does a **Fixtures Test Function** Look Like?

[func TestBoxes](https://github.com/algorand/indexer/blob/b5025ad640fabac0d778b4cac60d558a698ed560/api/app_boxes_fixtures_test.go#L694_L704) shows the basic structure of a fixtures test.

1. `setupIdbAndReturnShutdownFunc()` is called to set up the Indexer database
* this isn't expected to require modification

* this isn't expected to require modification

2. `setupLiveBoxes()` is used to prepare the local ledger and process blocks in order to bring Indexer into a particular state
* this will always depend on what the test is trying to achieve
* in this case, an app was used to create and modify a set of boxes which are then queried against
* it is conceivable that instead of bringing Indexer into a particular state, the responses from the DB or even the handler may be mocked, so we could have had `setupLiveBoxesMocker()` instead of `setupLiveBoxes()`
* this will always depend on what the test is trying to achieve
* in this case, an app was used to create and modify a set of boxes which are then queried against
* it is conceivable that instead of bringing Indexer into a particular state, the responses from the DB or even the handler may be mocked, so we could have had `setupLiveBoxesMocker()` instead of `setupLiveBoxes()`

3. `setupLiveServerAndReturnShutdownFunc()` is used to bring up an instance of a real Indexer.
* this shouldn't need to be modified; however, if running in parallel and making assertions that conflict with other tests, you may need to localize the variable `fixtestListenAddr` and run on a separate port
* if running a mock server instead, a different setup function would be needed
4. `validateLiveVsSaved()` runs steps (1) through (5) defined in the previous section
* this is designed to be generic and ought not require much modification going forward

* this shouldn't need to be modified; however, if running in parallel and making assertions that conflict with other tests,
you may need to localize the variable `fixtestListenAddr` and run on a separate port
* if running a mock server instead, a different setup function would be needed

4. `validateLiveVsSaved()` runs steps (1) through (5) defined in the previous section

* this is designed to be generic and ought not require much modification going forward

## Which Endpoints are Currently _Testable_ in a Fixtures Test?
### Which Endpoints are Currently _Testable_ in a Fixtures Test?

Endpoints defined in [proverRoutes](https://github.com/algorand/indexer/blob/b955a31b10d8dce7177383895ed8e57206d69f67/api/fixtures_test.go#L232-L263) are testable.

@@ -69,11 +82,11 @@ Currently (September 2022) these are:
* `/v2/applications/:application-id/box`
* `/v2/applications/:application-id/boxes`

## How to Introduce a New Fixtures Test for an _Already Testable_ Endpoint?
### How to Introduce a New Fixtures Test for an _Already Testable_ Endpoint?

To set up a new test for endpoints defined above one needs to:

### 1. Define a new _Seed Fixture_
#### 1. Define a new _Seed Fixture_

For example, consider

@@ -95,30 +108,48 @@ var boxSeedFixture = fixture{
```
A seed fixture is a `struct` with fields
* `File` (_required_) - the name in [test_resources](./test_resources/) where the fixture is read from (and written to with an `_` prefix)
* `Owner` (_recommended_) - a name to define which test "owns" the seed
* `Frozen` (_required_) - set _true_ when you need to run assertions of the _Live Fixture_ vs. the _Saved Fixture_. For tests to pass, it needs to be set _true_.
* `Cases` - the slice of `testCase`s. Each of these has the fields:
* `Name` (_required_) - an identifier for the test case
* `Name` (_required_) - an identifier for the test case
* `Request` (_required_) - a `requestInfo` struct specifying:
* `Path` (_required_) - the path to be queried
* `Params` (_required but may be empty_) - the slice of parameters (strings `name` and `value`) to be appended to the path
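
Under these assumptions, a new seed fixture would look roughly like the sketch below. The `logs` endpoint, the field layouts, and the type definitions are all hypothetical simplifications; the real `fixture`, `testCase`, and `requestInfo` types are defined in fixtures_test.go:

```go
package main

import "fmt"

// Simplified stand-ins for the types in fixtures_test.go.
type param struct{ Name, Value string }
type requestInfo struct {
	Path   string
	Params []param
}
type testCase struct {
	Name    string
	Request requestInfo
}
type fixture struct {
	File   string
	Owner  string
	Frozen bool
	Cases  []testCase
}

// Hypothetical seed fixture exercising an imaginary logs endpoint.
var logsSeedFixture = fixture{
	File:   "logs.json",
	Owner:  "TestAppLogs",
	Frozen: true, // must be true for assertions to run and the test to pass
	Cases: []testCase{
		{
			Name: "app logs with a limit",
			Request: requestInfo{
				Path:   "/v2/applications/1234/logs",
				Params: []param{{Name: "limit", Value: "10"}},
			},
		},
		{
			Name:    "app logs, no params",
			Request: requestInfo{Path: "/v2/applications/1234/logs", Params: []param{}},
		},
	},
}

func main() {
	fmt.Println(len(logsSeedFixture.Cases)) // 2
}
```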
### 2. Define a new _Indexer State_ Setup Function
#### 2. Define a new _Indexer State_ Setup Function
There are many examples of setting up state that can be emulated. For example:
* [setupLiveBoxes()](https://github.com/algorand/indexer/blob/b5025ad640fabac0d778b4cac60d558a698ed560/api/app_boxes_fixtures_test.go#L43) for application boxes
* [TestApplicationHandlers()](https://github.com/algorand/indexer/blob/3a9095c2b5ee25093708f980445611a03f2cf4e2/api/handlers_e2e_test.go#L93) for applications
* [TestBlockWithTransactions()](https://github.com/algorand/indexer/blob/800cb135a0c6da0109e7282acf85cbe1961930c6/idb/postgres/postgres_integration_test.go#L339) setup state consisting of a set of basic transactions
* [TestBlockWithTransactions()](https://github.com/algorand/indexer/blob/800cb135a0c6da0109e7282acf85cbe1961930c6/idb/postgres/postgres_integration_test.go#L339),
which sets up state consisting of a set of basic transactions
## How to Make a _New Endpoint_ Testable by Fixtures Tests?
### How to Make a _New Endpoint_ Testable by Fixtures Tests?
There are 2 steps:
1. Implement a new function _witness generator_ aka [prover function](https://github.com/algorand/indexer/blob/b955a31b10d8dce7177383895ed8e57206d69f67/api/fixtures_test.go#L103) of type `func(responseInfo) (interface{}, *string)` as examplified in [this section](https://github.com/algorand/indexer/blob/b955a31b10d8dce7177383895ed8e57206d69f67/api/fixtures_test.go#L107-L200). Such a function is supposed to parse an Indexer response's body into a generated model. Currently, all provers are boilerplate, and with generics, it's expected that this step will no longer be necessary (this [POC](https://github.com/tzaffi/indexer/blob/generic-boxes/api/fixtures_test.go#L119-L155) shows how it would be done with generics).
2. Define a new route in the [proverRoutes struct](https://github.com/algorand/indexer/blob/b955a31b10d8dce7177383895ed8e57206d69f67/api/fixtures_test.go#L232_L263). This is a tree structure which is traversed by splitting a path using `/` and eventually reaching a leaf which consists of a `prover` as defined in #1.
1. Implement a new function _witness generator_ aka [prover function](https://github.com/algorand/indexer/blob/b955a31b10d8dce7177383895ed8e57206d69f67/api/fixtures_test.go#L103) of
type `func(responseInfo) (interface{}, *string)` as exemplified in [this section](https://github.com/algorand/indexer/blob/b955a31b10d8dce7177383895ed8e57206d69f67/api/fixtures_test.go#L107-L200).
Such a function is supposed to parse an Indexer response's body into a generated model. Currently, all provers are boilerplate, and with generics, it's expected that this step will no longer be necessary
(this [POC](https://github.com/tzaffi/indexer/blob/generic-boxes/api/fixtures_test.go#L119-L155) shows how it would be done with generics).
2. Define a new route in the [proverRoutes struct](https://github.com/algorand/indexer/blob/b955a31b10d8dce7177383895ed8e57206d69f67/api/fixtures_test.go#L232_L263).
This is a tree structure which is traversed by splitting a path using `/` and eventually reaching a leaf which consists of a `prover` as defined in #1.
For example, to enable the endpoint `GET /v2/applications/{application-id}/logs` for fixtures test, one need only define a `logsProof` witness generator and have it mapped in `proverRoutes` under:
```
```go
proverRoutes.parts["v2"].parts["applications"].parts[":application-id"].parts["logs"] = logsProof
```
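
A minimal sketch of such a witness generator, assuming a simplified `responseInfo` (the real struct and the generated models differ; parsing into a generic map here stands in for parsing into a generated model):

```go
package main

import (
	"encoding/json"
	"fmt"
)

// Simplified stand-in for the responseInfo type in fixtures_test.go.
type responseInfo struct {
	Status int
	Body   string
}

// logsProof is a hypothetical prover matching the signature
// func(responseInfo) (interface{}, *string): it parses the response body
// and returns an error string on failure.
func logsProof(resp responseInfo) (interface{}, *string) {
	var model map[string]interface{}
	if err := json.Unmarshal([]byte(resp.Body), &model); err != nil {
		errStr := err.Error()
		return nil, &errStr
	}
	return model, nil
}

func main() {
	witness, errStr := logsProof(responseInfo{Status: 200, Body: `{"log-data":[]}`})
	fmt.Println(witness != nil, errStr == nil) // true true
}
```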
### How to Fix a Fixtures Test?
Suppose a breaking change upstream has caused a fixtures test to fail. The following approach should work most of the time:
1. Run the broken test, generating a temporary fixture file in the `fixturesDirectory` (currently [test_resources](./test_resources/)) with the same name as the json fixture except beginning with `_`
(e.g. `_boxes.json` vs. `boxes.json`).
2. Observe the diff between the temporary fixture and the saved fixture. If the diff is acceptable, then simply copy the temporary fixture over the saved fixture.
3. If the diff isn't acceptable, then make any necessary changes to the setup and seed and repeat steps 1 and 2.
2 changes: 1 addition & 1 deletion cmd/validator/core/command.go
@@ -113,7 +113,7 @@ func run(config Params, errorLogFile, addr string, box string, threads int, proc
}
}

// start kicks off a bunch of go routines to compare addresses, it also creates a work channel to feed the workers and
// start kicks off a bunch of go routines to compare addresses, it also creates a work channel to feed the workers and
// fills the work channel by reading from os.Stdin. Results are returned to the results channel.
func start(processorID ProcessorID, threads int, config Params, results chan<- Result) {
work := make(chan string, 100*threads)
2 changes: 1 addition & 1 deletion docs/PostgresqlIndexes.md
@@ -1,4 +1,4 @@
# PostgeSQL Custom Indexes
# PostgreSQL Custom Indexes

Indexer ships with a minimal index set in order to process blocks, and return transactions for accounts. More advanced queries and filters require customization of the indexes.

37 changes: 20 additions & 17 deletions e2e_tests/README.md
@@ -2,11 +2,12 @@

An integration testing tool. It leverages an artifact generated by the go-algorand integration tests to ensure all features are exercised.

# Usage
## Usage

A Python virtual environment is highly encouraged.

## Setup environment with venv
### Setup environment with venv

```sh
cd e2e_tests

@@ -22,8 +23,7 @@ python3 -m pip install .

## Running locally


### e2elive
### `e2elive`

This test runs the `algorand-indexer` binary. Depending on what you're doing, you may want to use a different value for `--s3-source-net`. After the test finishes, a few simple queries are executed.

@@ -33,16 +33,18 @@ This command requires a postgres database, you can run one in docker:
docker run --rm -it --name some-postgres -p 5555:5432 -e POSTGRES_PASSWORD=pgpass -e POSTGRES_USER=pguser -e POSTGRES_DB=mydb postgres
```

Running e2elive
#### Running `e2elive`

**Note**: you will need to restart this container between tests, otherwise the database will already be initialized. Being out of sync with the data directory may cause unexpected behavior and failing tests.

```sh
e2elive --connection-string "host=127.0.0.1 port=5555 user=pguser password=pgpass dbname=mydb" --s3-source-net "fafa8862/rel-nightly" --indexer-bin ../cmd/algorand-indexer/algorand-indexer --indexer-port 9890
```

### e2econduit
### `e2econduit`

This runs a series of pipeline scenarios with Conduit. Each module manages its own resources and may have dependencies. For example, the PostgreSQL exporter starts a docker container to initialize a database server.

```sh
e2econduit --s3-source-net rel-nightly --conduit-bin ../cmd/conduit/conduit
```
Expand All @@ -64,39 +66,40 @@ make e2e

Alternatively, modify `e2e_tests/docker/indexer/Dockerfile` by swapping out `$CI_E2E_FILENAME` with the test artifact you want to use. See `.circleci/config.yml` for the current values used by CI.

### Creating new Conduit e2e tests
## Creating new Conduit e2e tests

### Creating a new plugin fixture

## Creating a new plugin fixture
All plugins for e2e tests are organized in `e2e_tests/src/e2e_conduit/fixtures/` under the proper directory structure. For example, processor plugin fixtures are under `e2e_tests/src/e2e_conduit/fixtures/processors/`.

To create a new plugin, create a new class under the proper directory which subclasses `PluginFixture`.
__Note that some fixtures have further subclasses, such as ImporterPlugin, which support additional required behavior for those plugin types.__
To create a new plugin, create a new class under the proper directory which subclasses `PluginFixture`.
**Note that some fixtures have further subclasses, such as ImporterPlugin, which support additional required behavior for those plugin types.**

### Main Methods of PluginFixtures
#### Main Methods of PluginFixtures

* `name`
* `name`
Returns the name of the plugin; it must be equivalent to the name which would be used in the conduit config.

* `setup(self, accumulated_config)`
* `setup(self, accumulated_config)`
Setup is run before any of the config data is resolved. It accepts an `accumulated_config` which is a map containing all of the previously output config values.
If your plugin needs to be fed some data, such as the algod directory which was created by the importer, this data can be retrieved from the accumulated_config.

The setup method is responsible for any preparation your plugin needs to do before the pipeline starts.

* `resolve_config_input`
* `resolve_config_input`
This method sets values on `self.config_input`, a map which contains all of the Conduit config required to run your plugin. Any values set on this map will be serialized into
the data section of your plugin's config when running it in Conduit.

* `resolve_config_output`
* `resolve_config_output`
Here, similarly to `config_input`, we set values on the `config_output` map. This map is what will be passed between plugins during initialization via the `accumulated_config`. If your plugin is creating any resources or initializing any values which other plugins need to know about, they should be set in the `config_output`.

### Creating a new scenario using your plugin

## Creating a new scenario using your plugin
A `Scenario` is an abstraction for a given instantiation of a Conduit pipeline. In order to run a Conduit e2e test, create a scenario in `e2e_tests/src/e2e_conduit/scenarios` and ensure that it is run in `e2e_tests/src/e2e_conduit/e2econduit.py`.

For example,

```
```python
Scenario(
"app_filter_indexer_scenario",
importer=importers.FollowerAlgodImporter(sourcenet),
2 changes: 1 addition & 1 deletion e2e_tests/docker/indexer-filtered/e2e-read/Dockerfile
@@ -21,4 +21,4 @@ RUN pip3 install ./
ENV INDEXER_DATA="${HOME}/indexer/"
WORKDIR /opt/go/indexer
# Run test script
ENTRYPOINT ["/bin/bash", "-c", "sleep 5 && e2elive $EXTRA --connection-string \"$CONNECTION_STRING\" --indexer-bin /opt/go/indexer/cmd/algorand-indexer/algorand-indexer --indexer-port 9890 --read-only True --verbose"]
ENTRYPOINT ["/bin/bash", "-c", "env && sleep 5 && e2elive $EXTRA --connection-string \"$CONNECTION_STRING\" --indexer-bin /opt/go/indexer/cmd/algorand-indexer/algorand-indexer --indexer-port 9890 --read-only True --verbose"]

0 comments on commit 25ab223
