A demonstration on how to deploy a static website to AWS S3.
For this, we are going to use two buckets: one will hold the static web pages, while the other will be used for artifacts. These “artifacts” are the intermediate files CodePipeline produces each time the pipeline runs. You’ll reference this artifact bucket later when creating your pipeline.
This second bucket can be reused for current and future pipelines.

For the first bucket, I will create it with the name “wwwpages”, using the AWS defaults. This bucket does not require any special permissions for CodePipeline to deploy to it.
I will then create a second bucket for artifacts; for this example I will call it “rubani”. This is the bucket CodePipeline will reference.
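If you would rather script this step, here is a minimal boto3 sketch of creating both buckets. The bucket names and the eu-west-1 region are just the values used in this example; S3 bucket names are globally unique, so yours will differ.

```python
import boto3

s3 = boto3.client("s3", region_name="eu-west-1")

# Bucket that will hold the static website files (example name; bucket
# names are globally unique, so substitute your own).
s3.create_bucket(
    Bucket="wwwpages",
    CreateBucketConfiguration={"LocationConstraint": "eu-west-1"},
)

# Bucket that CodePipeline will use for pipeline artifacts.
s3.create_bucket(
    Bucket="rubani",
    CreateBucketConfiguration={"LocationConstraint": "eu-west-1"},
)
```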
Creating The Pipeline.
Log into your AWS account and then navigate to the AWS CodePipeline service.
Select the orange Create pipeline button in the upper right. This will open the Choose pipeline settings dialog screen seen below.

Under Pipeline name we’ll name this pipeline rafiki-Website. The name can’t contain spaces or special characters.
AWS will require a service role that has appropriate permissions to interact with other AWS services like S3. Although you’re able to create unique roles for each pipeline you create, for your first pipeline I recommend that you choose New service role and select the check box to allow AWS CodePipeline to create it for you. If you’re creating and deploying the same sorts of sites to AWS S3, it’s likely you will be able to re-use this same service role for future pipelines.
You can also accept the default role name suggested by AWS if you want. In this example I’m changing the role name to AWSCodePipelineServiceRole-eu-west-1-website. The next time I create a pipeline, I will select the existing service role option and choose this one from the list.
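For reference, the service role the console creates is simply an IAM role that CodePipeline is allowed to assume. Below is a rough boto3 sketch of an equivalent role; note that the console also attaches a permissions policy (S3 access and so on), which is omitted here.

```python
import json
import boto3

iam = boto3.client("iam")

# Trust policy that lets CodePipeline assume the role.
trust_policy = {
    "Version": "2012-10-17",
    "Statement": [{
        "Effect": "Allow",
        "Principal": {"Service": "codepipeline.amazonaws.com"},
        "Action": "sts:AssumeRole",
    }],
}

iam.create_role(
    RoleName="AWSCodePipelineServiceRole-eu-west-1-website",
    AssumeRolePolicyDocument=json.dumps(trust_policy),
)
# A permissions policy (S3, CodeStar Connections, etc.) still has to be
# attached before a pipeline can actually use this role.
```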
Expand the Advanced Settings section!
Under Artifact store choose the Custom location radio button and then select the AWS S3 bucket that you created in the earlier step for all CodePipeline artifact storage. In this example that is the “rubani” bucket.
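In a pipeline definition this choice becomes a small artifact store fragment; a sketch using the example bucket:

```python
# Custom artifact store pointing at the "rubani" bucket created earlier.
artifact_store = {"type": "S3", "location": "rubani"}
```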

Clicking Next opens the next setup screen, the Add source stage dialog. This is where you’ll connect to your GitHub repository.
In the Source provider dropdown select GitHub (Version 2).
This reveals the rest of the screen with the parameters needed to connect your GitHub account.
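Under the hood, GitHub (Version 2) goes through an AWS CodeStar connection. In a pipeline definition, the resulting source action looks roughly like the sketch below; the connection ARN, repository, and branch are placeholders for your own values.

```python
# Rough shape of the source stage when using GitHub (Version 2).
source_stage = {
    "name": "Source",
    "actions": [{
        "name": "Source",
        "actionTypeId": {
            "category": "Source",
            "owner": "AWS",
            "provider": "CodeStarSourceConnection",
            "version": "1",
        },
        "configuration": {
            # Placeholders: use your own connection ARN, repo, and branch.
            "ConnectionArn": "arn:aws:codestar-connections:eu-west-1:111111111111:connection/example-id",
            "FullRepositoryId": "your-github-user/your-repo",
            "BranchName": "main",
        },
        "outputArtifacts": [{"name": "SourceOutput"}],
    }],
}
```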


At this point, click Next; this takes us to the Add build stage step.
Because this example is a simple web page that requires no compilation or special deployment considerations, we can simply choose the Skip build stage button. You’ll then be asked to confirm your selection; on the dialog prompt select the orange Skip button.
This brings us to the final Add deploy stage configuration screen where we’ll choose where CodePipeline should deploy your code.
In the Deploy provider selector choose Amazon S3. This will populate the screen with the settings required to connect CodePipeline to the AWS S3 bucket we created for our site in the first section of this article.
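In a pipeline definition this maps to an S3 deploy action; here is a sketch, assuming the “wwwpages” bucket from the first section and that the zipped source artifact should be extracted into it.

```python
# Rough shape of the deploy stage: an S3 action that unpacks the source
# artifact into the website bucket created earlier.
deploy_stage = {
    "name": "Deploy",
    "actions": [{
        "name": "Deploy",
        "actionTypeId": {
            "category": "Deploy",
            "owner": "AWS",
            "provider": "S3",
            "version": "1",
        },
        "configuration": {
            "BucketName": "wwwpages",
            "Extract": "true",  # unzip the artifact so the site files land in the bucket
        },
        "inputArtifacts": [{"name": "SourceOutput"}],
    }],
}
```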
Click the orange Next button. This will take you to the final step where you can scroll down and review all of your configuration settings for this pipeline.
Once you’ve reviewed the configuration, click the orange Create pipeline button at the bottom of the screen.
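For completeness, the fragments sketched above would come together in a boto3 create_pipeline call roughly like this; the role ARN account number is a placeholder.

```python
import boto3

codepipeline = boto3.client("codepipeline", region_name="eu-west-1")

# Assemble the artifact store and the two stages sketched earlier.
codepipeline.create_pipeline(
    pipeline={
        "name": "rafiki-Website",
        "roleArn": "arn:aws:iam::111111111111:role/AWSCodePipelineServiceRole-eu-west-1-website",
        "artifactStore": artifact_store,
        "stages": [source_stage, deploy_stage],
    }
)
```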
Clicking Create pipeline saves all of your settings and creates your AWS CodePipeline. It then triggers your first deployment, pulling the latest code from the GitHub branch you specified and deploying it to your AWS S3 bucket.
You’ll be able to watch as the CodePipeline job works through each step, marking each one with a green success indicator as it completes. Once the Deploy stage completes, your code will be in your AWS S3 bucket!
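If you’d rather check on a run from a script than the console, here’s a quick boto3 sketch that prints each stage’s latest status for the example pipeline.

```python
import boto3

codepipeline = boto3.client("codepipeline", region_name="eu-west-1")

# Print the latest status of each stage of the example pipeline.
state = codepipeline.get_pipeline_state(name="rafiki-Website")
for stage in state["stageStates"]:
    status = stage.get("latestExecution", {}).get("status", "Not run yet")
    print(f"{stage['stageName']}: {status}")
```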

If we head back over to the S3 bucket we created earlier, we’ll see that all of the folders and files from the website project are now there. If you have AWS CloudFront in front of this bucket, it will serve the updated content automatically as its cache refreshes, and depending on your cache settings you should see the website updates in your browser within minutes, if not almost immediately.
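If CloudFront keeps serving a stale cached copy, you can force it to pull the fresh files with an invalidation; a minimal sketch, assuming you know your distribution ID:

```python
import time
import boto3

cloudfront = boto3.client("cloudfront")

# Invalidate everything so the next request fetches fresh objects from S3.
cloudfront.create_invalidation(
    DistributionId="YOUR_DISTRIBUTION_ID",  # placeholder for your distribution
    InvalidationBatch={
        "Paths": {"Quantity": 1, "Items": ["/*"]},
        "CallerReference": str(time.time()),  # any unique string per request
    },
)
```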
We have now created a simple continuous integration and deployment pipeline to S3 using AWS CodePipeline and GitHub.