chore: added feedback corrections
JekzVadaria committed Aug 22, 2024
1 parent e72caf5 commit 48f3965
Showing 52 changed files with 333 additions and 74 deletions.
24 changes: 9 additions & 15 deletions serverless-rest-api/python-rest-sam/tests/integration/conftest.py
@@ -6,6 +6,7 @@
import pytest
import time

APPLICATION_STACK_NAME = os.getenv('TEST_APPLICATION_STACK_NAME', None)
COGNITO_STACK_NAME = os.getenv('TEST_COGNITO_STACK_NAME', None)
globalConfig = {}

@@ -19,13 +20,6 @@ def get_stack_outputs(stack_name):
result[output["OutputKey"]] = output["OutputValue"]
return result

def get_tf_outputs():
result = {}
for key, value in os.environ.items():
if key.startswith("TF_OUTPUT_"):
output_key = key[10:] # Remove the "TF_OUTPUT_" prefix
result[output_key] = value
return result

def create_cognito_accounts():
result = {}
@@ -120,7 +114,7 @@ def create_api_key():
api_key_id = response["id"]
api_key_value = response["value"]
response = apigw_client.create_usage_plan_key(
usagePlanId=globalConfig['api_enterprise_usage_plan'],
usagePlanId=globalConfig['ApiEnterpriseUsagePlan'],
keyId=api_key_id,
keyType='API_KEY'
)
@@ -134,30 +128,30 @@ def clear_dynamo_tables():
# clear all data from the tables that will be used for testing
dbd_client = boto3.client('dynamodb')
db_response = dbd_client.scan(
TableName=globalConfig['locations_table'],
TableName=globalConfig['LocationsTable'],
AttributesToGet=['locationid']
)
for item in db_response["Items"]:
dbd_client.delete_item(
TableName=globalConfig['locations_table'],
TableName=globalConfig['LocationsTable'],
Key={'locationid': {'S': item['locationid']["S"]}}
)
db_response = dbd_client.scan(
TableName=globalConfig['resources_table'],
TableName=globalConfig['ResourcesTable'],
AttributesToGet=['resourceid']
)
for item in db_response["Items"]:
dbd_client.delete_item(
TableName=globalConfig['resources_table'],
TableName=globalConfig['ResourcesTable'],
Key={'resourceid': {'S': item['resourceid']["S"]}}
)
db_response = dbd_client.scan(
TableName=globalConfig['bookings_table'],
TableName=globalConfig['BookingsTable'],
AttributesToGet=['bookingid']
)
for item in db_response["Items"]:
dbd_client.delete_item(
TableName=globalConfig['bookings_table'],
TableName=globalConfig['BookingsTable'],
Key={'bookingid': {'S': item['bookingid']["S"]}}
)
return
@@ -167,8 +161,8 @@ def clear_dynamo_tables():
def global_config(request):
global globalConfig
# load outputs of the stacks to test
globalConfig.update(get_stack_outputs(APPLICATION_STACK_NAME))
globalConfig.update(get_stack_outputs(COGNITO_STACK_NAME))
globalConfig.update(get_tf_outputs())
globalConfig.update(create_cognito_accounts())
globalConfig.update(create_api_key())
clear_dynamo_tables()
60 changes: 60 additions & 0 deletions serverless-rest-api/python-rest-terraform/README-terraform.md
@@ -0,0 +1,60 @@
<!-- BEGIN_TF_DOCS -->
## Requirements

No requirements.

## Providers

| Name | Version |
|------|---------|
| <a name="provider_aws"></a> [aws](#provider\_aws) | n/a |

## Modules

| Name | Source | Version |
|------|--------|---------|
| <a name="module_codebuild_terraform"></a> [codebuild\_terraform](#module\_codebuild\_terraform) | ./modules/codebuild | n/a |
| <a name="module_codepipeline_iam_role"></a> [codepipeline\_iam\_role](#module\_codepipeline\_iam\_role) | ./modules/iam-role | n/a |
| <a name="module_codepipeline_kms"></a> [codepipeline\_kms](#module\_codepipeline\_kms) | ./modules/kms | n/a |
| <a name="module_codepipeline_terraform"></a> [codepipeline\_terraform](#module\_codepipeline\_terraform) | ./modules/codepipeline | n/a |
| <a name="module_s3_artifacts_bucket"></a> [s3\_artifacts\_bucket](#module\_s3\_artifacts\_bucket) | ./modules/s3 | n/a |

## Resources

| Name | Type |
|------|------|
| [aws_caller_identity.current](https://registry.terraform.io/providers/hashicorp/aws/latest/docs/data-sources/caller_identity) | data source |
| [aws_codestarconnections_connection.github](https://registry.terraform.io/providers/hashicorp/aws/latest/docs/data-sources/codestarconnections_connection) | data source |

## Inputs

| Name | Description | Type | Default | Required |
|------|-------------|------|---------|:--------:|
| <a name="input_build_project_source"></a> [build\_project\_source](#input\_build\_project\_source) | Source type for the CodeBuild project | `string` | `"CODEPIPELINE"` | no |
| <a name="input_build_projects"></a> [build\_projects](#input\_build\_projects) | List of CodeBuild projects to create | `list(string)` | n/a | yes |
| <a name="input_builder_compute_type"></a> [builder\_compute\_type](#input\_builder\_compute\_type) | Compute type for the CodeBuild project | `string` | `"BUILD_GENERAL1_SMALL"` | no |
| <a name="input_builder_image"></a> [builder\_image](#input\_builder\_image) | Docker Image to be used by codebuild | `string` | `"aws/codebuild/amazonlinux2-x86_64-standard:5.0"` | no |
| <a name="input_builder_image_pull_credentials_type"></a> [builder\_image\_pull\_credentials\_type](#input\_builder\_image\_pull\_credentials\_type) | Image pull credentials type used by codebuild project | `string` | `"CODEBUILD"` | no |
| <a name="input_builder_type"></a> [builder\_type](#input\_builder\_type) | Type of codebuild run environment | `string` | `"LINUX_CONTAINER"` | no |
| <a name="input_codepipeline_iam_role_name"></a> [codepipeline\_iam\_role\_name](#input\_codepipeline\_iam\_role\_name) | Name of the IAM role to be used by the Codepipeline | `string` | `"codepipeline-role"` | no |
| <a name="input_create_new_role"></a> [create\_new\_role](#input\_create\_new\_role) | Whether to create a new IAM Role. Values are true or false. Defaulted to true always. | `bool` | `true` | no |
| <a name="input_github_connection_name"></a> [github\_connection\_name](#input\_github\_connection\_name) | Name of the GitHub connection to be used by the Codepipeline | `string` | `"github-connection-serverless"` | no |
| <a name="input_project_name"></a> [project\_name](#input\_project\_name) | Unique name for this project | `string` | `"serverless-tf-cicd"` | no |
| <a name="input_region"></a> [region](#input\_region) | AWS region to deploy serverless application in | `string` | `"eu-west-2"` | no |
| <a name="input_source_repo_branch"></a> [source\_repo\_branch](#input\_source\_repo\_branch) | Default branch in the Source repo for which CodePipeline needs to be configured | `string` | `"main"` | no |
| <a name="input_source_repo_name"></a> [source\_repo\_name](#input\_source\_repo\_name) | Source repo name of the GitHub repository | `string` | n/a | yes |

## Outputs

| Name | Description |
|------|-------------|
| <a name="output_codebuild_arn"></a> [codebuild\_arn](#output\_codebuild\_arn) | The ARN of the Codebuild Project |
| <a name="output_codebuild_name"></a> [codebuild\_name](#output\_codebuild\_name) | The Name of the Codebuild Project |
| <a name="output_codepipeline_arn"></a> [codepipeline\_arn](#output\_codepipeline\_arn) | The ARN of the CodePipeline |
| <a name="output_codepipeline_name"></a> [codepipeline\_name](#output\_codepipeline\_name) | The Name of the CodePipeline |
| <a name="output_dynamodb_table_name"></a> [dynamodb\_table\_name](#output\_dynamodb\_table\_name) | The Name of the DynamoDB Table |
| <a name="output_iam_arn"></a> [iam\_arn](#output\_iam\_arn) | The ARN of the IAM Role used by the CodePipeline |
| <a name="output_kms_arn"></a> [kms\_arn](#output\_kms\_arn) | The ARN of the KMS key used in the codepipeline |
| <a name="output_s3_arn"></a> [s3\_arn](#output\_s3\_arn) | The ARN of the S3 Bucket |
| <a name="output_s3_bucket_name"></a> [s3\_bucket\_name](#output\_s3\_bucket\_name) | The Name of the S3 Bucket |
<!-- END_TF_DOCS -->
48 changes: 38 additions & 10 deletions serverless-rest-api/python-rest-terraform/README.md
@@ -54,6 +54,7 @@ aws cognito-idp initiate-auth --auth-flow USER_PASSWORD_AUTH --client-id <cognit

## Manually deploy the sample application
***Note:** Before deploying the application manually for the first time, you will need to deploy the shared Cognito stack; see the previous section for details.*
***Note:** This method of deployment is preferable when you want to make changes to this sample application and develop your own application from this template.*

This project is set up like a standard Python project.
You may need to manually create a virtualenv:
@@ -79,14 +80,13 @@ pip install -r ./src/lambda_layer/requirements.txt
pip install -r ./tests/requirements.txt
```

To deploy your application for the first time, few terraform variables needed to be configured once. Open the `applications/variables.tf` file and update the default value of following variables as per the guidence.
If you want to change the default variables, modify the `default` value of the following variables in `applications/variables.tf`.

* **serverless_application_name**: The name of the terraform application resources to deploy. This should be unique to your account and region, and a good starting point would be something matching your project name.
* **region**: The AWS region you want to deploy your app to.
* **cognito_stack_name**: The shared Cognito stack name
* **cognito_stack_name**: The shared Cognito stack name. ***If you changed the Cognito stack name, make sure to update it here.***


Run the following in your shell to deploy the infrastructure manually:
Comment out the `terraform` block (lines 14 to 16) in `applications/provider.tf` to configure the local Terraform backend. Run the following in your shell to deploy the infrastructure manually:

```bash
terraform init
@@ -99,6 +99,25 @@ The first command will download the required providers and terraform modules. Th

The API Gateway endpoint and API ID will be displayed in the outputs when the deployment is complete.
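Once `terraform apply` finishes, you can also read the outputs programmatically with `terraform output -json`. A minimal sketch of flattening that structure (the `api_endpoint` name and URL below are illustrative placeholders, not the stack's actual output keys):

```python
import json

def read_tf_outputs(raw_json):
    # Flatten the `terraform output -json` structure into a name -> value dict.
    return {name: meta["value"] for name, meta in json.loads(raw_json).items()}

# In practice: raw = subprocess.check_output(["terraform", "output", "-json"], text=True)
sample = '{"api_endpoint": {"sensitive": false, "type": "string", "value": "https://abc123.execute-api.eu-west-2.amazonaws.com"}}'
print(read_tf_outputs(sample)["api_endpoint"])
```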

## Use the AWS SAM CLI to build and test locally

Comment out lines 15 to 17 of `applications/provider.tf` to configure the local Terraform backend. Build your application by using the `sam build` command.

```bash
sam build
```

The AWS SAM CLI installs dependencies that are defined in `requirements.txt`, creates a deployment package, and saves it in the `.aws-sam/build` folder. The AWS SAM CLI also automatically uses the Lambda layers configured in your Terraform definition.

Test a single function by invoking it directly with a test event. An event is a JSON document that represents the input that the function receives from the event source. Test events are included in the `events` folder in this project.

Run functions locally and invoke them with the `sam local invoke` command.

```bash
sam local invoke 'module.lambda_functions["locations"].aws_lambda_function.this[0]' --event events/event-get-location-by-id.json
sam local invoke 'module.lambda_functions["locations"].aws_lambda_function.this[0]' --event events/event-get-all-locations.json
```
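If the `events` folder does not cover a case you want to test, you can sketch an event yourself. The shape below is a hypothetical, trimmed-down API Gateway proxy event; the real files in `events/` may carry additional fields:

```python
import json

# Hypothetical minimal API Gateway proxy event (the field set is an assumption).
event = {
    "httpMethod": "GET",
    "resource": "/locations/{locationid}",
    "path": "/locations/123",
    "pathParameters": {"locationid": "123"},
    "body": None,
}
print(json.dumps(event, indent=2))
```

Save the document to a file and pass it via `--event` as shown above.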

### Usage plans

This application utilizes API Gateway [Usage Plans](https://docs.aws.amazon.com/apigateway/latest/developerguide/api-gateway-api-usage-plans.html). Usage plan references (Basic and Enterprise) will be displayed in the outputs when the deployment is complete.
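Clients attach the key from a usage plan in the `x-api-key` request header. A standard-library sketch of building such a request (the endpoint URL and key value are placeholders):

```python
from urllib.request import Request

endpoint = "https://abc123.execute-api.eu-west-2.amazonaws.com/Prod/locations"  # placeholder URL
req = Request(endpoint, headers={"x-api-key": "<your-api-key-value>"})
# urllib.request.urlopen(req) would send the call; throttling then follows the key's usage plan.
print(req.get_header("X-api-key"))
```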
@@ -149,7 +168,7 @@ git pull ../serverless-samples serverless-rest-api
cd python-rest-terraform
```

To deploy the cicd pipeline infrastructure, Update the variables in `terraform.tfvars` in newly created folder based on your requirement. Make sure to change the `source_repo_name` as per your GitHub Repository owner.
To deploy the CI/CD pipeline infrastructure, update the variables in `terraform.tfvars` in the newly created `serverless-rest-api-cicd` repository as required. Make sure to set `source_repo_name` to match your GitHub repository owner.

Update the remote backend configuration as required; by default, a local backend is used.

@@ -167,11 +186,11 @@ git remote add origin <URL to GitHub repository>
git push origin main
```

![CodePipeline](./assets/CodePipeline.png)
![CodePipeline](./assets/TFCodePipeline.png)

Note that the same Amazon Cognito stack is used in both the testing and production deployment stages, so the same user credentials can be used for testing and API access.
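The `initiate-auth` CLI call shown earlier can equally be made from Python. A boto3 sketch, assuming the user pool app client allows the `USER_PASSWORD_AUTH` flow (client id and credentials are placeholders):

```python
def password_auth_request(client_id, username, password):
    # Keyword arguments for the cognito-idp InitiateAuth API (USER_PASSWORD_AUTH flow).
    return {
        "AuthFlow": "USER_PASSWORD_AUTH",
        "ClientId": client_id,
        "AuthParameters": {"USERNAME": username, "PASSWORD": password},
    }

# idp = boto3.client("cognito-idp")
# tokens = idp.initiate_auth(**password_auth_request("<client-id>", "<user>", "<password>"))
print(sorted(password_auth_request("c", "u", "p")))
```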

## Cleanup #TODO
## Cleanup

To delete the sample application that you created manually, use the below commands:

@@ -180,12 +199,21 @@ cd serverless-samples/serverless-rest-api/python-rest-terraform/application
terraform destroy
```

If you created CI/CD pipeline you will need to delete it as well, including all testing and deployment stacks created by the pipeline. Please note that actual stack names may differ in your case, depending on the pipeline stack name you used.
If you created the CI/CD pipeline, you will need to delete it as well, including all testing and deployment infrastructure created by the pipeline. You can destroy the infrastructure created by the pipeline by manually approving the Destroy stage, as shown in the image below.

![CodePipeline-Destroy](./assets/TFCodePipelineDestroy.png)

Once the serverless application infrastructure is removed, you can remove the CI/CD pipeline resources. Run the commands below:

CI/CD pipeline stack deletion may fail if build artifact S3 bucket is not empty. In such case get bucket name from the error message, open AWS Management Console, navigate to S3 bucket with build artifacts and empty it.
```bash
cd serverless-rest-api-cicd/python-rest-terraform/
terraform destroy --auto-approve
```

CI/CD stack deletion may fail if the build artifact S3 bucket is not empty. In that case, get the bucket name from the error message, open the AWS Management Console, navigate to the S3 bucket with the build artifacts, and empty it.

Re-run the command below to remove the S3 bucket after emptying it.

```bash
terraform destroy
terraform destroy --auto-approve
```
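Instead of emptying the bucket in the console, you can script it. A sketch that accepts any S3 client object (pass `boto3.client("s3")` in practice; a versioned bucket would additionally need its object versions removed):

```python
def empty_bucket(s3, bucket):
    # Delete all current objects in batches of up to 1000 (the S3 API limit per call).
    paginator = s3.get_paginator("list_objects_v2")
    for page in paginator.paginate(Bucket=bucket):
        objects = [{"Key": obj["Key"]} for obj in page.get("Contents", [])]
        if objects:
            s3.delete_objects(Bucket=bucket, Delete={"Objects": objects})

# empty_bucket(boto3.client("s3"), "<artifact-bucket-from-error-message>")
```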
