Merge pull request #893 from HDRUK/release-preprod/v3.3.0
Release preprod/v3.3.0
reubensamuel authored Jan 11, 2023
2 parents 5dec5c4 + 12bb029 commit 3fa1021
Showing 37 changed files with 997 additions and 44 deletions.
6 changes: 6 additions & 0 deletions .github/dependabot.yml
Original file line number Diff line number Diff line change
@@ -0,0 +1,6 @@
version: 2
updates:
- package-ecosystem: 'npm'
directory: '/'
schedule:
interval: 'monthly'
74 changes: 74 additions & 0 deletions .github/workflows/codeql.yml
@@ -0,0 +1,74 @@
# For most projects, this workflow file will not need changing; you simply need
# to commit it to your repository.
#
# You may wish to alter this file to override the set of languages analyzed,
# or to provide custom queries or build logic.
#
# ******** NOTE ********
# We have attempted to detect the languages in your repository. Please check
# the `language` matrix defined below to confirm you have the correct set of
# supported CodeQL languages.
#
name: "CodeQL"

on:
push:
branches: [ "dev", UAT, UATBeta, master, release ]
pull_request:
# The branches below must be a subset of the branches above
branches: [ "dev" ]
schedule:
- cron: '26 5 * * 1'

jobs:
analyze:
name: Analyze
runs-on: ubuntu-latest
permissions:
actions: read
contents: read
security-events: write

strategy:
fail-fast: false
matrix:
language: [ 'javascript' ]
# CodeQL supports [ 'cpp', 'csharp', 'go', 'java', 'javascript', 'python', 'ruby' ]
# Learn more about CodeQL language support at https://aka.ms/codeql-docs/language-support

steps:
- name: Checkout repository
uses: actions/checkout@v3

# Initializes the CodeQL tools for scanning.
- name: Initialize CodeQL
uses: github/codeql-action/init@v2
with:
languages: ${{ matrix.language }}
# If you wish to specify custom queries, you can do so here or in a config file.
# By default, queries listed here will override any specified in a config file.
# Prefix the list here with "+" to use these queries and those in the config file.

# For details on CodeQL's query packs, refer to: https://docs.github.com/en/code-security/code-scanning/automatically-scanning-your-code-for-vulnerabilities-and-errors/configuring-code-scanning#using-queries-in-ql-packs
# queries: security-extended,security-and-quality


# Autobuild attempts to build any compiled languages (C/C++, C#, Go, or Java).
# If this step fails, then you should remove it and run the build manually (see below)
- name: Autobuild
uses: github/codeql-action/autobuild@v2

# ℹ️ Command-line programs to run using the OS shell.
# 📚 See https://docs.github.com/en/actions/using-workflows/workflow-syntax-for-github-actions#jobsjob_idstepsrun

# If the Autobuild fails above, remove it and uncomment the following three lines,
# then modify them (or add more) to build your code; please refer to the EXAMPLE below for guidance.

# - run: |
# echo "Run, Build Application using script"
# ./location_of_script_within_repo/buildscript.sh

- name: Perform CodeQL Analysis
uses: github/codeql-action/analyze@v2
with:
category: "/language:${{matrix.language}}"
1 change: 1 addition & 0 deletions .gitignore
@@ -17,3 +17,4 @@ npm-debug.log*
package-lock.json
.env
globalConfig.json
google_analytics.json
@@ -1,4 +1,5 @@
//import { UserModel } from '../src/resources/user/user.model';
// Something

/**
* Make any changes you need to make to the database here
File renamed without changes.
@@ -1,4 +1,5 @@
import { DataRequestModel } from '../src/resources/datarequest/datarequest.model';
// Something else

async function up() {
// 1. Add default application type to all applications
40 changes: 40 additions & 0 deletions .old.migrations/README.md
@@ -0,0 +1,40 @@
# HDR UK GATEWAY - Data Migrations

The primary data source used by the Gateway Project is the noSQL solution provided by MongoDb. Data migration strategy is a fundamental part of software development and release cycles for a data intensive web application. The project team have chosen the NPM package Migrate-Mongoose - https://www.npmjs.com/package/migrate-mongoose to assist in the management of data migration scripts. This package allows developers to write versioned, reversible data migration scripts using the Mongoose library.

For more information on what migration scripts are and their purpose, please see sample background reading here - https://www.red-gate.com/simple-talk/sql/database-administration/using-migration-scripts-in-database-deployments/

### Using migration scripts

To create a data migration script, follow these steps:

#### Step 1

Ensure your terminal's working directory is the Gateway API and that node packages have been installed using 'npm i'.

#### Step 2

Run the command below, replacing 'my_new_migration_script' with the name of the script you want to create. The name does not need to be unique, as it will be prefixed automatically with a timestamp, but it should be easily recognisable and relate strongly to the database change that will take place if the script is executed.

./node_modules/.bin/migrate create my_new_migration_script

#### Step 3

Your new migration scripts should now be available in './migrations/', which you can now modify. You can import the required Mongoose models as normal to interact with the MongoDb database. The migration scripts that run locally will use the connection string taken from your .env file against the variable 'MIGRATE_dbConnectionUri'.

Complete the scripts required for the UP process, and if possible, the DOWN process. For awareness, the UP scripts run automatically as part of our CI/CD pipeline, and the DOWN scripts exist to reverse database changes if necessary; this is a manual process.

#### Step 4

With the scripts written, the functions can be tested by running the following command, replacing 'my_new_migration_script' with the name of the script you want to execute (without the timestamp), for example:
node -r esm migrations/migrate.js up add_globals

node -r esm migrations/migrate.js up my_new_migration_script

When this process is completed, the connected database will have a new document representing your migration scripts inside the 'migrations' collection, which tracks the state of the migration. If you need to run your scripts multiple times for test purposes, you can change the state of the migration to 'Down'.

During this process, please ensure you are using a personal database.

#### Step 5

Commit the code to the relevant git branch and raise a pull request. The migration script will run automatically as the code moves through each environment.
2 changes: 1 addition & 1 deletion migrations/migrate.js → .old.migrations/migrate.js
@@ -1,7 +1,7 @@
import cli from 'migrate-mongoose/src/cli'; //lgtm [js/unused-local-variable]
import mongoose from 'mongoose';

mongoose.connect(process.env.MIGRATE_dbConnectionUri, {
mongoose.connect(`${process.env.MIGRATE_dbConnectionUri}/${process.env.database}/?retryWrites=true&w=majority`, {
useNewUrlParser: true,
useFindAndModify: false,
useUnifiedTopology: true,
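The change to migrate.js above appends the database name and common connection options to the base URI via a template literal. A minimal sketch of the composed string, using hypothetical env values standing in for `process.env`:

```javascript
// Hypothetical values for illustration only; the real values come from .env.
const env = {
  MIGRATE_dbConnectionUri: 'mongodb+srv://user:pass@cluster0.example.net',
  database: 'gateway',
};

// Mirrors the template literal introduced in migrate.js.
const uri = `${env.MIGRATE_dbConnectionUri}/${env.database}/?retryWrites=true&w=majority`;

console.log(uri);
```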
2 changes: 1 addition & 1 deletion Dockerfile
@@ -1,4 +1,4 @@
FROM node:12
FROM node:14

# Create app directory
WORKDIR /usr/src/app
1 change: 1 addition & 0 deletions README.md
@@ -151,4 +151,5 @@ terraform plan -var-file=vars.tfvars -out=tf_apply
terraform apply tf_apply && rm tf_apply
```


[Link to terraform file](deployment/GCP/api.tf)
3 changes: 3 additions & 0 deletions google_analytics.json
@@ -0,0 +1,3 @@
{

}
46 changes: 46 additions & 0 deletions migrate-mongo-config.js
@@ -0,0 +1,46 @@
// In this file you can configure migrate-mongo

// Have to call this as this is pre-app start, thus env hasn't
// been populated yet
require('dotenv').config();

const config = {
mongodb: {
// TODO Change (or review) the url to your MongoDB:
url: 'mongodb+srv://' +
process.env.user +
':' +
process.env.password +
'@' +
process.env.cluster +
'?ssl=true&retryWrites=true&w=majority',

// TODO Change this to your database name:
databaseName: process.env.database,

options: {
useNewUrlParser: true, // removes a deprecation warning when connecting
      useUnifiedTopology: true, // removes a deprecation warning when connecting
// connectTimeoutMS: 3600000, // increase connection timeout to 1 hour
// socketTimeoutMS: 3600000, // increase socket timeout to 1 hour
}
},

  // The migrations dir; can be a relative or absolute path. Only edit this when really necessary.
migrationsDir: "migrations",

// The mongodb collection where the applied changes are stored. Only edit this when really necessary.
changelogCollectionName: "changelog",

// The file extension to create migrations and search for in migration dir
migrationFileExtension: ".js",

// Enable the algorithm to create a checksum of the file contents and use that in the comparison to determine
// if the file should be run. Requires that scripts are coded to be run multiple times.
useFileHash: false,

// Don't change this, unless you know what you're doing
moduleSystem: 'commonjs',
};

module.exports = config;
30 changes: 30 additions & 0 deletions migrations/20221122095337-add_published_flag_to_data_requests.js
@@ -0,0 +1,30 @@
module.exports = {
async up(db, client) {
// TODO write your migration here.
// See https://github.com/seppevs/migrate-mongo/#creating-a-new-migration-script

/**
* Update DAR to include an overriding published field to determine the published
* state of a DAR edit form publication by a custodian
*/
// await db.collection('data_requests').updateMany({
// $set: { "published_form": false },
// });

await db.collection('data_requests').updateMany({},
{
$set: { "publishedForm": false }
}
);
},

async down(db, client) {
// TODO write the statements to rollback your migration (if possible)

await db.collection('data_requests').updateMany({},
{
$unset: { "publishedForm": false }
}
);
}
};
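The up/down pair above can be exercised without a live database. A minimal in-memory sketch of the `$set`/`$unset` behaviour (`applyUp` and `applyDown` are illustrative helpers, not part of migrate-mongo):

```javascript
// In-memory stand-ins for data_requests documents; illustration only.
const docs = [{ _id: 1 }, { _id: 2, publishedForm: true }];

// up: $set { publishedForm: false } on every matched document
const applyUp = ds => ds.map(d => ({ ...d, publishedForm: false }));

// down: $unset publishedForm on every matched document
const applyDown = ds => ds.map(({ publishedForm, ...rest }) => rest);

const migrated = applyUp(docs);
const rolledBack = applyDown(migrated);
```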
52 changes: 40 additions & 12 deletions migrations/README.md
@@ -1,40 +1,68 @@
# HDR UK GATEWAY - Data Migrations

The primary data source used by the Gateway Project is the noSQL solution provided by MongoDb. Data migration strategy is a fundamental part of software development and release cycles for a data intensive web application. The project team have chosen the NPM package Migrate-Mongoose - https://www.npmjs.com/package/migrate-mongoose to assist in the management of data migration scripts. This package allows developers to write versioned, reversible data migration scripts using the Mongoose library.
The primary data source used by the Gateway Project is the noSQL solution provided by MongoDb.
Data migration strategy is a fundamental part of software development and release cycles for a
data intensive web application. The project team have chosen the NPM package Migrate-Mongo - https://www.npmjs.com/package/migrate-mongo
to assist in the management of data migration scripts. This package allows developers to write versioned,
reversible data migration scripts using the Mongoose library.

For more information on what migration scripts are and their purpose, please see sample background reading here - https://www.red-gate.com/simple-talk/sql/database-administration/using-migration-scripts-in-database-deployments/
For more information on what migration scripts are and their purpose, please see sample
background reading here - https://www.red-gate.com/simple-talk/sql/database-administration/using-migration-scripts-in-database-deployments/

### Using migration scripts

To create a data migration script, follow these steps:

#### Step 1

Ensure your terminal's working directory is the Gateway API and that node packages have been installed using 'npm i'.
Ensure your terminal's working directory is the Gateway API and that node packages have
been installed using 'npm i'.

#### Step 2

Run the command below, replacing 'my_new_migration_script' with the name of the script you want to create. The name does not need to be unique, as it will be prefixed automatically with a timestamp, but it should be easily recognisable and relate strongly to the database change that will take place if the script is executed.
Run the command below, replacing 'my_new_migration_script' with the name of the script
you want to create. The name does not need to be unique, as it will be prefixed automatically
with a timestamp, but it should be easily recognisable and relate strongly to the database
change that will take place if the script is executed.

./node_modules/.bin/migrate create my_new_migration_script
./node_modules/.bin/migrate-mongo create my_new_migration_script

#### Step 3

Your new migration scripts should now be available in './migrations/', which you can now modify. You can import the required Mongoose models as normal to interact with the MongoDb database. The migration scripts that run locally will use the connection string taken from your .env file against the variable 'MIGRATE_dbConnectionUri'.
Your new migration scripts should now be available in './migrations/', which you can now modify.
You can interact directly with the database. The migration scripts that run locally will use the
connection string config taken from your .env file against the variables: database, user, password and cluster.

Complete the scripts required for the UP process, and if possible, the DOWN process. For awareness, the UP scripts run automatically as part of our CI/CD pipeline, and the DOWN scripts exist to reverse database changes if necessary, this is a manual process.
Complete the scripts required for the UP process, and if possible, the DOWN process. For awareness, the UP
scripts run automatically as part of our CI/CD pipeline, and the DOWN scripts exist to reverse
database changes if necessary; this is a manual process.

#### Step 4

With the scripts written, the functions can be tested by running the following command, replacing 'my_new_migration_script' with the name of the script you want to execute without the time stamp so for example
node -r esm migrations/migrate.js up add_globals
With the scripts written, the functions can be tested by running the following command,
replacing 'my_new_migration_script' with the name of the script you want to execute without
the timestamp, for example:

node -r esm migrations/migrate.js up my_new_migration_script
./node_modules/.bin/migrate-mongo up (to run all migration updates)
./node_modules/.bin/migrate-mongo down (to rollback migration updates)
./node_modules/.bin/migrate-mongo up my_new_migration_script (to run a single migration update)
./node_modules/.bin/migrate-mongo down my_new_migration_script (to rollback a single migration update)
./node_modules/.bin/migrate-mongo status (to list any pending migrations yet to be run)

When this process is completed, the connected database will have a new document representing your migration scripts inside the 'migrations' collection, which tracks the state of the migration. If you need to run your scripts multiple times for test purposes, you can change the state of the migration to 'Down'.
When this process is completed, the connected database will have a new document representing your
migration scripts inside the 'migrations' collection, which tracks the state of the migration.
If you need to run your scripts multiple times for test purposes, you can change the state of
the migration to 'Down'.

During this process, please ensure you are using a personal database.

#### Step 5

Commit the code to the relevant git branch and raise a pull request. The migration script will run automatically as the code moves through each environment.
Commit the code to the relevant git branch and raise a pull request. The migration script
will run automatically as the code moves through each environment.

#### Note

To avoid running migrations manually, you can use `npm run start-with-migrate` to launch the API
locally with any pending migrations applied at startup. Ensure the targeted database is correct to
avoid unwanted migrations elsewhere.
9 changes: 5 additions & 4 deletions package.json
@@ -9,7 +9,7 @@
"private": true,
"dependencies": {
"@google-cloud/bigquery": "^5.9.3",
"@google-cloud/monitoring": "^2.1.0",
"@google-cloud/monitoring": "^3.0.3",
"@google-cloud/pubsub": "^2.19.4",
"@google-cloud/storage": "^5.3.0",
"@hubspot/api-client": "^4.1.0",
@@ -47,7 +47,7 @@
"keygrip": "^1.1.0",
"lodash": "^4.17.19",
"mailchimp-api-v3": "^1.15.0",
"migrate-mongoose": "^4.0.0",
"migrate-mongo": "^9.0.0",
"moment": "^2.29.3",
"mongoose": "^5.12.7",
"morgan": "^1.10.0",
@@ -67,7 +67,7 @@
"randomstring": "^1.1.5",
"redis": "4.0.0",
"simple-gcp-logging": "git+https://github.com/HDRUK/simple-gcp-logging.git#main",
"sinon": "^9.2.4",
"sinon": "^15.0.0",
"snyk": "^1.334.0",
"swagger-ui-express": "^4.1.4",
"test": "^0.6.0",
@@ -84,9 +84,10 @@
"mongodb-memory-server": "6.9.2",
"nodemon": "^2.0.3",
"plop": "^2.7.4",
"supertest": "^4.0.2"
"supertest": "^6.3.3"
},
"scripts": {
"start-with-migrate": "./node_modules/.bin/migrate up && node index.js",
"start": "node index.js",
"server": "nodemon --ignore 'src/**/*.json' index.js",
"debug": "nodemon --inspect=0.0.0.0:3001 index.js",
7 changes: 6 additions & 1 deletion src/config/server.js
@@ -166,8 +166,13 @@ app.use('/api/v1/topics', require('../resources/topic/topic.route'));
app.use('/api/v1/publishers', require('../resources/publisher/publisher.route'));
app.use('/api/v1/teams', require('../resources/team/team.route'));
app.use('/api/v1/workflows', require('../resources/workflow/workflow.route'));

app.use('/api/v1/messages', require('../resources/message/message.route'));
app.use('/api/v1/reviews', require('../resources/tool/review.route'));
app.use('/api/v3/messages', require('../resources/message/v3/message.route'));

app.use('/api/v1/reviews', require('../resources/review/v1/review.route'));
app.use('/api/v3/reviews', require('../resources/review/v3/review.route'));

app.use('/api/v1/relatedobject/', require('../resources/relatedobjects/relatedobjects.route'));

app.use('/api/v1/accounts', require('../resources/account/account.route'));
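The server.js change above mounts v1 and v3 routers side by side under versioned path prefixes. A minimal dispatcher sketching that layout (first matching mount wins, as with Express middleware ordering; `resolve` and the handler labels are illustrative, not the real Express app):

```javascript
// Mounted path prefixes mapped to the module that handles them;
// labels mirror the app.use calls above, for illustration only.
const mounts = new Map([
  ['/api/v1/messages', 'message/message.route'],
  ['/api/v3/messages', 'message/v3/message.route'],
  ['/api/v1/reviews', 'review/v1/review.route'],
  ['/api/v3/reviews', 'review/v3/review.route'],
]);

// Return the handler for the first mounted prefix matching the path.
function resolve(path) {
  for (const [prefix, handler] of mounts) {
    if (path.startsWith(prefix)) return handler;
  }
  return null;
}
```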
1 change: 0 additions & 1 deletion src/resources/collections/collections.controller.js
@@ -76,7 +76,6 @@ export default class CollectionsController extends Controller {

async getCollectionRelatedResources(req, res) {
let collectionID = parseInt(req.params.collectionID);

try {
const data = await this.collectionsService.getCollectionObjects(collectionID);
return res.json({ success: true, data: data });