
How to Automate Repo Upload to S3


  1. Pretty straightforward: first, create a new folder at the root of the repo named .github/workflows.

  2. Then copy and paste the following .yml file into that folder. (I titled it s3_upload.yml, but you can name it whatever you want.)

name: Upload to S3

on:
  push:
    branches:
      - master # Change to main if that is the root branch
      # - [branch_name] # for branches you guys pull from
      - 'releases/**'

jobs:
  upload:
    runs-on: ubuntu-latest
    # See
    permissions:
      # so we can do aws auth
      id-token: write
      contents: read
    steps:
      - name: Git clone the repository
        uses: actions/checkout@v4
      - name: configure aws credentials
        uses: aws-actions/configure-aws-credentials@v1
        with:
          role-to-assume: ${{ secrets.MARKETPLACE_S3_UPLOAD_ROLE_ARN }}
          role-session-name: gh-actions-${{ github.action }}
          aws-region: "us-east-1"
      - name: Copy files to the production website with the AWS CLI
        run: |
          aws s3 sync . s3://forklift-python-dependencies/${{ github.repository }}/${{ github.ref_name }}

Note: You’ll want to change the push branch from master to main if that is the name of the root branch. You may also want to list any other branches or tags you pull from.

Also: when copying this action, you can add an install or build step before the upload if necessary. For example, a library might run python sdist (or similar) to generate the files to upload.
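As a sketch of what that could look like, the steps below insert a build stage before the sync and upload only the generated dist/ folder. The setup-python step and the python -m build invocation are illustrative assumptions — swap in whatever tooling your project actually uses to produce its artifacts:

```yaml
      # Hypothetical build steps — adjust to your project's tooling
      - name: Set up Python
        uses: actions/setup-python@v5
        with:
          python-version: "3.12"
      - name: Build the distribution files
        run: |
          pip install build
          python -m build --sdist   # writes the sdist into ./dist
      - name: Copy files to the production website with the AWS CLI
        run: |
          aws s3 sync ./dist s3://forklift-python-dependencies/${{ github.repository }}/${{ github.ref_name }}
```

Note that the sync source changes from . to ./dist so only the built files land in the bucket, not the whole checkout.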

To connect the S3 bucket to the marketplace, replace the GitHub URL with the S3 URL — the path ends in ${{ github.repository }}/${{ github.ref_name }}.

Note: ${{ github.repository }} will be AgencyPMG/{repo-name}, and ${{ github.ref_name }} will be the branch or tag name.
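To make the resulting destination concrete, here is how the sync step's S3 URL resolves. Inside a workflow run, GITHUB_REPOSITORY and GITHUB_REF_NAME are set automatically by GitHub Actions; the values below are hypothetical, for illustration only:

```shell
# Hypothetical values — set automatically inside a real workflow run
GITHUB_REPOSITORY="AgencyPMG/example-repo"
GITHUB_REF_NAME="main"

# The sync step uploads to this S3 destination:
S3_URL="s3://forklift-python-dependencies/${GITHUB_REPOSITORY}/${GITHUB_REF_NAME}"
echo "$S3_URL"
# prints s3://forklift-python-dependencies/AgencyPMG/example-repo/main
```

Pushing a tag instead of a branch simply changes the last path segment, so each branch and tag gets its own prefix in the bucket.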
