Abstract

This repo provides code that deploys AWS infrastructure using Terraform to perform daily backups of GitHub repositories to an S3 bucket.

Installation

  1. Install Terraform (https://www.terraform.io/downloads.html).

  2. Configure AWS access (https://docs.aws.amazon.com/cli/latest/userguide/cli-chap-configure.html); a quick verification sketch follows this list.
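
A quick way to verify the local setup, assuming the AWS CLI is also installed (these commands only check the workstation configuration; they change nothing in the account):

    # Check that Terraform is on PATH
    terraform -version

    # Set up credentials interactively (access key, secret key, region, output format)
    aws configure

    # Confirm the credentials resolve to the expected account
    aws sts get-caller-identity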

First time setup

  1. Cd to the remote_state_tf directory and run terraform init followed by terraform apply to create the S3 bucket that stores the Terraform state (a command sketch of these steps follows the list).

  2. Cd to the tf directory and run terraform init followed by terraform apply, which creates the AWS Step Function, IAM roles, ECS cluster, etc.

  3. Cd to images/build_image_github_repo_backup and execute the build_docker_image.sh script, which builds the Docker image and pushes it to AWS ECR. Repeat the step for images/build_image_atlassian_cloud_backup.

  4. Subscribe the necessary email addresses to the SNS topic arn:aws:sns:<region>:<account id>:repo_backup.

  5. Update the /sfn/atlassian-token, /sfn/atlassian-user, and /sfn/github-token keys in the SSM Parameter Store with valid values (https://console.aws.amazon.com/systems-manager/parameters/).
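
A condensed command sketch of the steps above, run from the repository root. Directory, script, and parameter names are taken from this README; step 4 has no CLI equivalent shown here, and step 5 uses aws ssm put-parameter as an alternative to the console (the token value is a placeholder):

    # 1. Create the S3 bucket that stores the Terraform state
    (cd remote_state_tf && terraform init && terraform apply)

    # 2. Create the Step Function, IAM roles, ECS cluster, etc.
    (cd tf && terraform init && terraform apply)

    # 3. Build the Docker images and push them to ECR
    (cd images/build_image_github_repo_backup && ./build_docker_image.sh)
    (cd images/build_image_atlassian_cloud_backup && ./build_docker_image.sh)

    # 5. Example: set one of the SSM parameters (repeat for the other keys)
    aws ssm put-parameter --name /sfn/github-token --type SecureString \
        --value "<github token>" --overwrite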

Updates to the backup code

All backup logic is stored in the build_image directory, mainly in build_image/entrypoint_repo_backup.sh. Once the code is updated, execute the build_docker_image.sh script, which builds the Docker image and pushes it to AWS ECR.
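
A minimal sketch of that workflow; the editor is a placeholder, and the script is run from the image directory as in the first-time setup:

    # Edit the backup logic
    $EDITOR build_image/entrypoint_repo_backup.sh

    # Rebuild the Docker image and push it to ECR
    (cd images/build_image_github_repo_backup && ./build_docker_image.sh)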

Updates to the Terraform code

The IAM permissions, S3 bucket name, GitHub token, GitHub organization name, blacklisted repo list, and backup schedule are passed as environment variables to the ECS task and are managed by Terraform (tf/terraform.tfvars). Once the Terraform code in the tf directory is updated, execute terraform apply to apply the changes.

As an example, if you need to change the S3 bucket name, perform the following steps (a command sketch follows the list):

  1. Cd into the tf directory and run terraform state rm aws_s3_bucket.repo_backup and terraform state rm aws_s3_bucket_public_access_block.repo_backup.

  2. Update the S3 bucket name (the s3_bucket_backup_name variable) in tf/terraform.tfvars.

  3. Run terraform apply.

  4. Copy the objects from the old bucket to the new one (https://aws.amazon.com/premiumsupport/knowledge-center/move-objects-s3-bucket/).

  5. Empty the old bucket with aws s3 rm s3://bucket-name --recursive (see https://docs.aws.amazon.com/AmazonS3/latest/dev/delete-or-empty-bucket.html).

  6. Delete the old bucket via the AWS S3 console.
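
A command sketch of the steps above. The bucket names are placeholders, aws s3 sync is one way to copy the objects (per the linked knowledge-center article), and step 6 is shown with the CLI as an alternative to the console:

    cd tf

    # 1. Drop the old bucket resources from the Terraform state
    terraform state rm aws_s3_bucket.repo_backup
    terraform state rm aws_s3_bucket_public_access_block.repo_backup

    # 2. Edit s3_bucket_backup_name in terraform.tfvars, then
    # 3. create the new bucket
    terraform apply

    # 4. Copy the objects from the old bucket to the new one
    aws s3 sync s3://old-bucket-name s3://new-bucket-name

    # 5. Empty the old bucket
    aws s3 rm s3://old-bucket-name --recursive

    # 6. Delete the old (now empty) bucket
    aws s3 rb s3://old-bucket-name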