feat: Include Environment Variables in Task Definitions with JSON Fragments #73

Open · wants to merge 9 commits into master

Conversation

@rojaswestall commented Nov 6, 2020

Issue #20

Description of changes:

Merges a JSON fragment of a task definition into the task definition using lodash's mergeWith. containerDefinitions, and name within each container definition, are required in the fragment. (A rough sketch of the merge logic follows the change list below.)

  • Merge JSON fragments specified with merge input with the task definition
  • Added lodash mergeWith dependency
  • Updated tests
  • Updated README
  • Updated actions.yml inputs (made image optional and added merge)
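
As an illustration only (not the PR's exact implementation; this sketch assumes the standalone lodash.mergewith package and that every container in the fragment names a container that already exists in the task definition):

const mergeWith = require('lodash.mergewith');

// Deep-merge a fragment into the task definition, matching entries in
// containerDefinitions by their required `name` field.
function mergeTaskDefinition(taskDef, fragment) {
  return mergeWith({}, taskDef, fragment, (objValue, srcValue, key) => {
    if (key === 'containerDefinitions' && Array.isArray(objValue) && Array.isArray(srcValue)) {
      return objValue.map((containerDef) => {
        const override = srcValue.find((c) => c.name === containerDef.name);
        return override ? mergeWith({}, containerDef, override) : containerDef;
      });
    }
    // Returning undefined falls back to lodash's default deep merge, which merges
    // other arrays index-by-index; the real action may treat those differently.
    return undefined;
  });
}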

By submitting this pull request, I confirm that you can use, modify, copy, and redistribute this contribution, under the terms of your choice.

@rojaswestall changed the title from "Include Environment Variables in Task Definitions with JSON Fragments" to "feat: Include Environment Variables in Task Definitions with JSON Fragments" on Nov 6, 2020
@allisaurus self-assigned this on Dec 8, 2020
@allisaurus left a comment
Thanks for this contribution, @rojaswestall! In addition to the feedback below, can you also provide some evidence of testing this change? (e.g., a successful action run; you can run your own code by running npm run package, pushing the generated /dist to your feature branch, and specifying yourfork@branch in your workflow file)

containerDef.image = imageURI;

// Check for imageURI
if(imageURI) {


since neither input is strictly required now, can we add some debug statements to indicate which course(s) of action the workflow is taking, and throw an error if neither is provided? That way, if I wanted to see what was happening, I could see (a rough sketch follows these examples):

(image specified) -------> "inserting container image [IMAGE]"
(merge file specified) ---> "merging task definition fragment [FILE_NAME]"
(nothing specified) -----> ERROR: "No image or merge file specified. Please specify one or both to use this action."
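
A rough sketch of that flow (assuming the new input is named merge as in the PR description; not the actual implementation):

const core = require('@actions/core');

// Log which path the action takes, and fail fast when neither input is provided.
const imageURI = core.getInput('image', { required: false });
const mergeFile = core.getInput('merge', { required: false });

if (!imageURI && !mergeFile) {
  throw new Error('No image or merge file specified. Please specify one or both to use this action.');
}
if (imageURI) {
  core.debug(`inserting container image ${imageURI}`);
  // ... insert the image into the matching container definition ...
}
if (mergeFile) {
  core.debug(`merging task definition fragment ${mergeFile}`);
  // ... merge the fragment into the task definition ...
}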


// Merge the merge file
if (!Array.isArray(mergeContents.containerDefinitions)) {
  throw new Error('Invalid merge fragment definition: containerDefinitions section is not present or is not an array');


I would prefer we not require the presence of containerDefinitions, since there are top-level fields which can plausibly differ between environments (task-level memory/CPU, tags, task role, etc.). Is it possible to apply merge functionality across the entire task def vs. just this field?
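
For illustration, such an environment override might contain only top-level fields (a hypothetical fragment; the role ARN is a placeholder):

{
    "cpu": "512",
    "memory": "1024",
    "taskRoleArn": "arn:aws:iam::123456789012:role/prod-task-role"
}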

@brunocascio

Hey folks,

Do you plan to merge this soon? :D

@allisaurus

@rojaswestall do you have any comment/response to the PR feedback? I would like to work towards getting this approved if you're still interested

@rajeevnaikte

We need this, merge please. Thanks.

@allisaurus assigned piradeepk and unassigned allisaurus on Jan 22, 2021
@rntdrts commented Feb 4, 2021

I don't think you need this to solve the problem. If you want different environment variables per deployment environment, you can simply load a different task-definition.json. Here's what I did:

task-definition: deploy/task-definition.${{ env.ENVIRONMENT }}.json

For those who use SSM, the following is not a problem; it actually works well with the trick above ☝️. However, if you only want to use GitHub secrets and add them as environment variables, I think it would be great if we had a way to inject those values the same way we do with the <IMAGE> tag: set a specific key in the task definition and supply its value through the action inputs. For example:

{
  "name": "adapter-service",
  "image": "<IMAGE>",
  "environment": [
    {
      "name": "NODE_ENV",
      "value": "<ENVIRONMENT>"
    }
  ]
}

There are probably different ways to do this, but I think we could simply search for values wrapped in <> and match them against environment variable names (a sketch of one possible implementation follows the workflow snippet below):

      - name: Fill in the new image ID in the Amazon ECS task definition
        id: task-def
        uses: aws-actions/amazon-ecs-render-task-definition@v1
        with:
          task-definition: deploy/task-definition.${{ env.ENVIRONMENT }}.json
          container-name: container-name
          image: container-image
        env:
          ENVIRONMENT: ${{ env.ENVIRONMENT }} 
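
A minimal sketch of that placeholder matching, assuming the <NAME> placeholders map to environment variables of the same name (hypothetical; the action does not currently do this):

// Recursively replace <NAME> placeholders in string values with process.env.NAME;
// unknown placeholders (e.g. <IMAGE>) are left for the existing image input to handle.
function substitutePlaceholders(value) {
  if (typeof value === 'string') {
    return value.replace(/<([A-Z0-9_]+)>/g, (match, name) =>
      name in process.env ? process.env[name] : match
    );
  }
  if (Array.isArray(value)) {
    return value.map(substitutePlaceholders);
  }
  if (value !== null && typeof value === 'object') {
    return Object.fromEntries(
      Object.entries(value).map(([key, nested]) => [key, substitutePlaceholders(nested)])
    );
  }
  return value;
}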

What do you think?

@brunocascio commented Feb 4, 2021

@rntdrts Having 1 file per environment is not scalable IMHO.

I implemented a similar solution by seeding variables into a template.

task-definition.my-service.template.json

{
    "family": "my-service-__ENVIRONMENT__",
    "networkMode": "awsvpc",
    "executionRoleArn": "arn:aws:iam::764292098539:role/ecsTaskExecutionRole",
    "containerDefinitions": [
        {
            "name": "my-service",
            "image": "",
            "portMappings": [
                {
                    "hostPort": 5000,
                    "protocol": "tcp",
                    "containerPort": 5000
                }
            ],
            "essential": true,
            "environment": [
                {
                    "name": "NODE_ENV",
                    "value": "production"
                },
                ...
            ],
            "secrets": [
                {
                    "valueFrom": "__ENVIRONMENT__.MY_SERVICE_API_KEY",
                    "name": "API_KEY"
                },
                ...
            ]
        }
    ],
    "requiresCompatibilities": [
        "FARGATE"
    ],
    "cpu": "256",
    "memory": "512"
}

workflow yml:

- name: Sed "my-service" task definition with staging
  run: sed -i "s/__ENVIRONMENT__/staging/g" .github/workflows/task-definition.my-service.template.json

- name: Render "my-service" task definition
  id: rendered-my-service-task-definition
  uses: aws-actions/amazon-ecs-render-task-definition@v1
  with:
    task-definition: .github/workflows/task-definition.my-service.template.json
    container-name: my-service
    image: ${{ env.DTR_REGISTRY }}/my-service:${{ env.GIT_COMMIT }}

- name: Deploy "my-service" to staging
  uses: aws-actions/amazon-ecs-deploy-task-definition@v1
  with:
    task-definition: ${{ steps.rendered-my-service-task-definition.outputs.task-definition }}
    service: my-service-staging
    cluster: my-cluster

I populate SSM beforehand with staging.MY_SERVICE_API_KEY. If it's missing, AWS keeps failing until it exists.

The goal here is to use the same task definition across all the environments (could be DEV, QA, PPD, INT, PROD, etc) and write the code once.

@rntdrts commented Feb 4, 2021

The goal here is to use the same task definition across all the environments (could be DEV, QA, PPD, INT, PROD, etc) and write the code once.

Maybe I didn't understand, but I don't think that what was done here solves your problem (at least from looking at the code). What is done here is a way to merge JSON files with each other. You will end up having to create similar files for each deployment environment (staging.json, prod.json), plus the generic task definition.

@brunocascio commented Feb 4, 2021

The goal here is to use the same task definition across all the environments (could be DEV, QA, PPD, INT, PROD, etc) and write the code once.

Maybe I didn't understand, but I don't think that what was done here solves your problem (at least from looking at the code). What is done here is a way to merge JSON files with each other. You will end up having to create similar files for each deployment environment (staging.json, prod.json), plus the generic task definition.

Yes and no, because your staging.json and prod.json would only contain the dynamic parts, like the following:

task-definition.json

{
    "networkMode": "awsvpc",
    "executionRoleArn": "arn:aws:iam::764292098539:role/ecsTaskExecutionRole",
    "containerDefinitions": [
        {
            "name": "my-service",
            "image": "",
            "portMappings": [
                {
                    "hostPort": 5000,
                    "protocol": "tcp",
                    "containerPort": 5000
                }
            ],
            "essential": true,
            "environment": [
                {
                    "name": "NODE_ENV",
                    "value": "production"
                },
                ...
            ]
        }
    ],
    "requiresCompatibilities": [
        "FARGATE"
    ],
    "cpu": "256",
    "memory": "512"
}

staging.json

{
    "family": "my-service-staging",
    "containerDefinitions": [
        {
            "name": "my-service",
            "secrets": [
                {
                    "valueFrom": "staging.MY_SERVICE_API_KEY",
                    "name": "API_KEY"
                },
                ...
            ]
        }
    ]
}

prod.json

{
    "family": "my-service-prod",
    "containerDefinitions": [
        {
            "name": "my-service",
            "secrets": [
                {
                    "valueFrom": "production.MY_SERVICE_API_KEY",
                    "name": "API_KEY"
                },
                ...
            ]
        }
    ]
}

It might be better to have a template file and substitute values with sed, but if, for example, you want to use a different logging driver in production, you have to duplicate the code; that's where merging JSON files comes into play.
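
For example (hypothetical fragment; the log group and region are placeholders), a production-only override could carry just the logging configuration while staging keeps the base file's default:

{
    "containerDefinitions": [
        {
            "name": "my-service",
            "logConfiguration": {
                "logDriver": "awslogs",
                "options": {
                    "awslogs-group": "/ecs/my-service-prod",
                    "awslogs-region": "us-east-1",
                    "awslogs-stream-prefix": "ecs"
                }
            }
        }
    ]
}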

@dspv commented Jun 10, 2021

Hi. Any ideas when it can be merged? =)

@allisaurus removed their assignment on Mar 7, 2022
@alencarsouza

Hey folks. Any idea when it is going to be merged? It has been a while now.

@brunoenribeiro

Really looking forward to this!

@power-cut

Waiting for this PR to get merged.

@amazreech (Contributor) commented May 10, 2024

Hi @rojaswestall, thank you so much for your contribution, and apologies for the delay. We will be working on reviewing pull requests on the repositories. In the meantime, please ensure the steps below, if not already done, are taken care of in your PR:

  1. Verify that the PR follows semantic pull request conventions.

  2. Ensure any comments from earlier reviews of the PR are addressed.

  3. Run the npm run package command to update the dist/ folder with the latest dependencies.

  4. Resolve any merge conflicts on the PR.
