SFTP Transfer to S3

This repository contains a Python 3 script that connects to the shipup SFTP server and copies the latest CSV reports to an AWS S3 bucket. Once the transfer to S3 is complete, the script archives the files on the SFTP server and removes the local copies of the reports.
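
At a high level, the script does something like the following minimal sketch (illustrative only; the real function names, archive path, and error handling live in shipup_s3_transfer.py):

import os

import boto3
import pysftp

# Connect to the SFTP server using the credentials exported as
# environment variables (see Prerequisites below).
with pysftp.Connection(host=os.environ["HOST"],
                       username=os.environ["USER"],
                       password=os.environ["PASS"]) as sftp:
    for name in sftp.listdir():
        if name.endswith(".csv"):
            # Download the report, then upload it to the S3 bucket.
            sftp.get(name)
            boto3.client("s3").upload_file(name, os.environ["BUCKET_NAME"], name)
            # Archive the report on the SFTP server ("archive/" is an
            # assumed path) and remove the local copy.
            sftp.rename(name, "archive/" + name)
            os.remove(name)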

Prerequisites

What to install to use the module locally

Note:

  • It is assumed that you are running macOS and using Homebrew to install packages.
  • If you are using zsh, echo the path and copy the virtualenv configuration into ~/.zshrc instead of ~/.bashrc.

Install brew dependencies:

  • $ brew list shows what you already have installed.

$ brew update && brew upgrade
$ brew doctor
Your system is ready to brew. # expected output of brew doctor
$ brew install python
$ brew install awscli

Install pip:

$ curl https://bootstrap.pypa.io/get-pip.py -o get-pip.py
$ sudo python3 get-pip.py
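
You can then verify the installation:

$ pip --version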

Set Python 3 as the default version of Python:

$ echo 'export PATH="/usr/local/opt/python/libexec/bin:/usr/local/sbin:$PATH"' >> ~/.bashrc
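
Open a new shell (or run $ source ~/.bashrc) and confirm the default version; the exact version number will vary:

$ python --version
Python 3.x.y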

Install virtualenvwrapper:

$ pip install virtualenv
$ pip install virtualenvwrapper

Add virtualenvwrapper to shell startup file (~/.bashrc):

export WORKON_HOME=$HOME/.virtualenvs
export PROJECT_HOME=$HOME/Devel
source /usr/local/bin/virtualenvwrapper.sh
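
Reload the startup file and, optionally, create a virtualenv for this project (the environment name below is just an example):

$ source ~/.bashrc
$ mkvirtualenv shipup-s3-transfer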

Install pip requirements:

$ pip install -r requirements.txt

Export required environment variables:

# Change the environment variables to match the SFTP server and AWS S3 bucket of choice.
$ export USER=<SFTP-USERNAME>
$ export PASS=<SFTP-PASSWORD>
$ export HOST=<SFTP-HOST>
$ export BUCKET_NAME=<S3-BUCKET-NAME>

Have programmatic access to Eve's AWS account (currently Production)

Export the AWS_ACCESS_KEY_ID and AWS_SECRET_ACCESS_KEY for the AWS account you want to upload the CSV reports to.
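
For example (placeholder values):

$ export AWS_ACCESS_KEY_ID=<aws-access-key>
$ export AWS_SECRET_ACCESS_KEY=<aws-secret-key>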

OR

Ensure your credentials are in your ~/.aws/credentials file.

If they're not, you can add them by doing:

$ aws configure
AWS Access Key ID []: <enter-aws-access-key>
AWS Secret Access Key []: <enter-aws-secret-key>
Default region name []: <enter-region-id> # https://docs.aws.amazon.com/AWSEC2/latest/UserGuide/using-regions-availability-zones.html#concepts-available-regions
Default output format []: <leave-blank>

You can then check your CLI is using the correct credentials by doing:

$ aws sts get-caller-identity
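
If the correct credentials are picked up, it returns the account identity, e.g. (illustrative values):

{
    "UserId": "AIDAEXAMPLEUSERID",
    "Account": "123456789012",
    "Arn": "arn:aws:iam::123456789012:user/<your-user>"
}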

Run the checks

There are no unit tests yet (see Potential Improvements); linting with pylint is currently the only check:

$ pylint shipup_s3_transfer.py
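
Pylint prints a report followed by a score line similar to (illustrative):

Your code has been rated at 10.00/10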

Run the script

$ python3 shipup_s3_transfer.py

Potential Improvements

  • Add unit tests
  • Improve the quality of the Python code, e.g. add classes and split the logic into separate modules
  • Turn the script into an Ansible playbook
  • Create a custom Docker base image with the Alpine packages and pip packages already installed

Built With

  • Python 3 - The programming language used
  • Pysftp - Library used to connect to the SFTP server and download reports
  • Boto3 - Library used to connect to AWS and upload reports to S3