Migrating my personal website to AWS

Very soon I’ll be moving to America, so I don’t want another Australian host, because of the latency. I thought about just moving to GitHub Pages (because it’s free), but I like the freedom of having a host with a database and a backend server in case I want to play with something, and I like having everything in one place.

I’ve spent the last 18 months at Assignar working exclusively with AWS, so I’ve learned a lot about many of the core services: how to configure them, how they work, and how dirt cheap they are. So AWS felt like a natural choice. I don’t get a whole lot of traffic on my personal stuff, so for the most part I’ll be able to stay under the free tiers.

Current architecture:

Goal architecture:

So in terms of migrations I had a few things to do. I wanted to challenge myself and try to do this without downtime. Apart from the fact that the PHP and MySQL servers run on the same box, there aren’t any real dependencies between the pieces, so the order in which the migration happened didn’t matter too much.

I’ve had some experience playing with DNS before on my host as well as on CloudFlare, and it’s made even easier by the fact that most DNS hosts let you export your config. With the configuration moved, I switched GoDaddy from pointing at CloudFlare to pointing at Route 53. A few hours later, the DNS caches refreshed and… certificate errors.
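If you’d rather script the cutover than click through the console, Route 53 takes record changes as a JSON change batch. A minimal sketch (the domain, record value, and TTL here are placeholders, not my real config):

```json
{
  "Comment": "Point the apex record at the web host",
  "Changes": [
    {
      "Action": "UPSERT",
      "ResourceRecordSet": {
        "Name": "example.com.",
        "Type": "A",
        "TTL": 300,
        "ResourceRecords": [{ "Value": "203.0.113.10" }]
      }
    }
  ]
}
```

You’d apply it with `aws route53 change-resource-record-sets --hosted-zone-id <zone-id> --change-batch file://change.json`.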

The routing worked perfectly! But I forgot about my certificates. To secure the connection between CloudFlare and my host, I had the self-signed CloudFlare origin certificate installed on my host. With CloudFlare out of the picture, browsers now saw that self-signed certificate directly, and because it’s self-signed, they treat it as untrusted.

Something I learned at Assignar is that the best way to serve static assets is from S3. It’s cheap, fast, and reliable. If you’re not doing server-side rendering, it’s the best way to get things out there. I created a new bucket in S3 and dumped the old files into it. One important note: don’t make this bucket public, because we only want the files accessible via the proper URL.
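The usual way to keep the bucket private while still letting CloudFront read from it is a bucket policy scoped to a CloudFront origin access identity. A rough sketch (the bucket name and OAI ID are placeholders):

```json
{
  "Version": "2012-10-17",
  "Statement": [
    {
      "Sid": "AllowCloudFrontRead",
      "Effect": "Allow",
      "Principal": {
        "AWS": "arn:aws:iam::cloudfront:user/CloudFront Origin Access Identity EXAMPLE_OAI_ID"
      },
      "Action": "s3:GetObject",
      "Resource": "arn:aws:s3:::example-site-bucket/*"
    }
  ]
}
```

With this in place, requests straight to the bucket are denied, but requests via the distribution succeed.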

CloudFront is a great service which (amongst other things) lets you globally distribute files stored in S3. It was pretty straightforward to fill out the form to create a new distribution, then add a new origin pointing at the S3 bucket. A quick whirl with the distribution URL and it was all working fine. The content is officially migrated.

As an aside, I rarely changed my homepage in the past, so I used to manually copy files across whenever I did make a change. Now that I’m on AWS, though, I can utilise CodeBuild. Creating a new project is easy enough: choose GitHub as the “Source provider”, choose an AWS-managed Ubuntu/Node.js image, and leave everything else alone. If you want truly automated builds, tick the Webhook box and enter a branch name, and CodeBuild will rerun whenever you push to that branch.
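CodeBuild reads its build steps from a buildspec.yml in the repo root. Here’s a sketch of what one might look like for a static site; the npm scripts, output directory, and bucket name are assumptions about the project, not something CodeBuild mandates:

```yaml
version: 0.2

phases:
  install:
    runtime-versions:
      nodejs: 18
  build:
    commands:
      - npm ci           # install dependencies from the lockfile
      - npm run build    # assumed build script; emits the site into dist/
  post_build:
    commands:
      # Push the built site into the S3 bucket (placeholder name)
      - aws s3 sync dist/ s3://example-site-bucket --delete
```

The build role needs `s3:PutObject` (and friends) on the bucket for that last step to work.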

The next step in the migration is switching Route 53 to point at this new distribution.

First I needed a certificate. I chose to get one issued by Amazon, but I could just as well have used the Let’s Encrypt certificate I generated earlier. An important note here: if you want CloudFront to be able to see it, you can ONLY store certificates in the us-east-1 (N. Virginia) region. I don’t know exactly why, but if you upload your certificates into any other region, CloudFront will not see them. The process for getting an AWS certificate is easy: follow the wizard in ACM to request a certificate, then choose “DNS Validation”. It’ll automatically add the required DNS records to Route 53, and your certificate will be issued once they’re confirmed. After that, you can select the certificate on the distribution in CloudFront.
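For what it’s worth, the validation record ACM asks for is just a CNAME pointing at an ACM-owned name, something like this (the names and values here are invented for illustration):

```
_3f1a2b4c5d.example.com.  CNAME  _9e8d7c6b5a.acm-validations.aws.
```

ACM polls for that record, and as long as it stays in the zone, the certificate will also renew automatically.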

Transferring the domain itself from GoDaddy to Route 53 is the final bit of cleanup, and isn’t really necessary; I just liked the idea of having everything billed from one place.

This process was actually surprisingly painless — I thought GoDaddy would have implemented some dark patterns or something to make it hard for you to transfer, but nope. There’s a button to unlock the domain, and one to request an authorization code. Once the unlock is done, it’s easy to follow the wizard inside Route 53 and fill out the information. It took about 2 days for the transfer to go through.

And with the last email confirmation — everything is migrated!
