Core delivery C# ASP.NET backend template.
- Install MongoDB on your local machine
- Start MongoDB:

```bash
sudo mongod --dbpath ~/mongodb-cdp
```
To inspect the database and collections locally:

```bash
mongosh
```
Run the tests with:

```bash
dotnet test
```

Tests run a full WebApplication backed by an ephemeral MongoDB instance. They do not use mocking of any sort and read and write from this in-memory database.
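For illustration, here is a minimal sketch of what such a test can look like, assuming an xUnit test that boots the application through `WebApplicationFactory<Program>`; the class name, endpoint and fixture wiring are assumptions rather than the repository's actual test helpers.

```csharp
// Hypothetical integration test sketch. WebApplicationFactory<Program> and the
// /api/import-notifications endpoint are illustrative assumptions; the real test
// fixtures in Btms.Backend.IntegrationTests wire up the ephemeral MongoDB themselves.
using System.Net;
using System.Net.Http;
using System.Threading.Tasks;
using Microsoft.AspNetCore.Mvc.Testing;
using Xunit;

public class SmokeTests : IClassFixture<WebApplicationFactory<Program>>
{
    private readonly HttpClient _client;

    public SmokeTests(WebApplicationFactory<Program> factory)
    {
        // Boots the full WebApplication; reads and writes hit the ephemeral
        // MongoDB instance rather than mocks.
        _client = factory.CreateClient();
    }

    [Fact]
    public async Task Application_Responds_To_A_Request()
    {
        var response = await _client.GetAsync("/api/import-notifications");
        Assert.Equal(HttpStatusCode.OK, response.StatusCode);
    }
}
```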
Run the BTMS Backend application:

```bash
dotnet run --project Btms.Backend --launch-profile Btms.Backend
```
An example SonarCloud configuration is available in the GitHub Actions workflows.
We have added an example Dependabot configuration file to the repository. You can enable it by renaming `.github/example.dependabot.yml` to `.github/dependabot.yml`.
Test data for our tests can be obtained in a few ways.

Canned test data is data we created by hand, based on examples of messages we had obtained. It can be found in `Btms.Backend.IntegrationTests/Fixtures/SmokeTest`, where you will find folders for the messages you want to simulate:
- ALVS - Custom Record notifications from CDS
- DECISIONS - Decisions made by BTMS
- GVMSAPIRESPONSE - ??
- IPAFFS - CHED notifications from IPAFFS:
  - CHEDA
  - CHEDD
  - CHEDP
  - CHEDPP
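For illustration, a hedged sketch of how one of these canned fixtures might be read in a test, assuming the JSON files are copied to the test output directory; the file name `cheda-sample.json` and the `ImportNotification` target type are hypothetical, not names from the repository.

```csharp
// Hypothetical sketch: loading a canned CHEDA fixture from the SmokeTest folder.
// The file name and the ImportNotification type are illustrative assumptions.
using System.IO;
using System.Text.Json;

var json = File.ReadAllText(
    Path.Combine("Fixtures", "SmokeTest", "IPAFFS", "CHEDA", "cheda-sample.json"));

var notification = JsonSerializer.Deserialize<ImportNotification>(
    json,
    new JsonSerializerOptions { PropertyNameCaseInsensitive = true });
```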
The Test Data Generator can be found in the tools project (`tools/TestDataGenerator`). The test data is generated based on specifications provided in a scenario, e.g. `ChedASimpleMatchScenarioGenerator.cs`. A scenario should contain at least a `GetNotificationBuilder` or a `GetClearanceRequestBuilder`.
Example usage of `GetNotificationBuilder`:

```csharp
var notification = GetNotificationBuilder("cheda-one-commodity")
    .WithCreationDate(entryDate)
    .WithRandomArrivalDateTime(config.ArrivalDateRange)
    .WithReferenceNumber(ImportNotificationTypeEnum.Cveda, scenario, entryDate, item)
    .ValidateAndBuild();
```
Example usage of `GetClearanceRequestBuilder`:

```csharp
var clearanceRequest = GetClearanceRequestBuilder("cr-one-item")
    .WithCreationDate(entryDate)
    .WithArrivalDateTimeOffset(notification.PartOne!.ArrivalDate, notification.PartOne!.ArrivalTime)
    .WithReferenceNumber(notification.ReferenceNumber!)
    .ValidateAndBuild();
```
Note:
- Both the Notification Builder and the Clearance Request Builder take a sample file which they use as a basis to create the test data. The sample files are located in `Scenarios/Samples`.
After creating your scenario, you will need to add it to `ConfigureTestGenerationServices` in `BuilderExtensions.cs`.
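As a rough sketch, the registration might look something like the following, assuming scenario generators are added to the service collection inside `ConfigureTestGenerationServices`; the exact helper and lifetime used in `BuilderExtensions.cs` may differ.

```csharp
// Hypothetical registration sketch; the real ConfigureTestGenerationServices in
// BuilderExtensions.cs may use a dedicated helper rather than a plain AddTransient.
using Microsoft.Extensions.DependencyInjection;

public static class BuilderExtensions
{
    public static IServiceCollection ConfigureTestGenerationServices(this IServiceCollection services)
    {
        // ... existing scenario generator registrations ...

        // Register the new scenario generator so datasets in Program.cs can resolve it.
        services.AddTransient<ChedASimpleMatchScenarioGenerator>();

        return services;
    }
}
```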
Next, you will need to create a dataset, which is specified in `Program.cs`.
Example dataset:

```csharp
var datasets = new[]
{
    new
    {
        Dataset = "All-CHED-No-Match",
        RootPath = "GENERATED-ALL-CHED-NO-MATCH",
        Scenarios = new[] { app.CreateScenarioConfig<AllChedsNoMatchScenarioGenerator>(1, 1) }
    },
    // ...
};
```
- `Dataset` - Name of the dataset.
- `RootPath` - Folder where the data will be created. The folder will be in `TestDataGenerator/.test-data-generator`.
- `Scenarios` - List of scenarios to create test data for. `CreateScenarioConfig` generates scenarios based on the scenario types from the `Scenarios` folder.
Finally, to trigger the data creation you will need to add some configuration to `Properties/launchSettings.json`:
```json
{
  "profiles": {
    "Generate All CHED no match": {
      "commandName": "Project",
      "commandLineArgs": "All-CHED-No-Match",
      "environmentVariables": {
        "DMP_ENVIRONMENT": "dev",
        "DMP_SERVICE_BUS_NAME": "DEVTREINFSB1001",
        "DMP_BLOB_STORAGE_NAME": "devdmpinfdl1001",
        "DMP_SLOT": "1003",
        "AZURE_TENANT_ID": "c9d74090-b4e6-4b04-981d-e6757a160812"
      }
    }
  }
}
```
- Give the profile a name. This can be free text.
- `commandLineArgs` - The name of the new dataset.
- The rest of the configuration can be copied from the other profiles.
You may want to add a new sample if you are creating data for a new scenario. To do this:
- Place the new sample file with the relevant JSON in the `Samples` folder.
- Change the properties of the file in Rider:
  - Build action: `Content`
  - Copy to output directory: `Copy if newer`
Redacted production data has been stored in Blob Storage. We can use BTMS Backend to import this data.
- Update `local.env`:
  - For the one-day dataset:
    ```
    BusinessOptions:DmpBlobRootFolder=PRODREDACTED-20241204
    ```
  - For the one-month dataset:
    ```
    BusinessOptions:DmpBlobRootFolder=PRODREDACTED-202411
    ```
- When importing the data, set the following:
  ```
  BlobServiceOptions:CacheWriteEnabled=true
  # BlobServiceOptions:CacheReadEnabled=true
  ```
- After importing, update the config to the following so the data doesn't get imported again when you call the `initialise` API (http://0.0.0.0:5002/mgmt/initialise?syncPeriod=All):
  ```
  # BlobServiceOptions:CacheWriteEnabled=true
  BlobServiceOptions:CacheReadEnabled=true
  ```
- Once the config has been updated, start BTMS Backend and call the `initialise` API (http://0.0.0.0:5002/mgmt/initialise?syncPeriod=All). Note that a large amount of data will be loaded, particularly for the full-month dataset. It is also advisable to run the Backend from a standalone terminal rather than from Rider, as Rider struggles running this task.