With so many services available, AWS has made cloud development much easier. However, navigating between so many services can be a little confusing. Using the one-click navigation bar, you can get to the services you use most often quickly and easily.

Setting up the toolbar

Setting up the toolbar is super easy, but it’s not immediately apparent that the toolbar even exists, never mind how to set it up.

Firstly, click the push pin icon at the top of the AWS console:

select the pin

Then drag any commonly used AWS services to the top of the screen:

drag the services

Once all the services you want are at the top of the screen, click the push pin again to finish.

done!

Hopefully this little tip will help people save some time when navigating around the AWS management console!

If anyone knows any similar tips I would love to hear them.

Retros are awesome. The chance for a team to come together and review how things are going is a huge opportunity to improve.

They should be open, honest, and engaging. It’s very easy to forget that they should be engaging. That’s why I like to try different formats when I get the opportunity to run a retro.

The elevator pitch

A common retro format is “stop, start, continue”, or a variation on that theme. It’s a common format because it is effective. I think it encourages people to be honest while reminding them positivity is also important.

However, I often find that in a retro where people are asked to give as many ideas as they want, they feel obligated to come up with lots of ideas, even ones they don’t particularly want to discuss themselves. A lot of time is then used up as people deliver all these ideas back to the group, and more again as the ideas are grouped together.

In the “elevator pitch” retro, each participant gets just one idea to pitch to the group to discuss. This can help to make people edit their own ideas and focus on what is truly important to them.

As people are only delivering one potential discussion topic to the group, hopefully they feel like they have more time to fully explain their point, and the group will better understand their idea. Furthermore, the time saved on grouping many ideas can be spent on further discussion.

My experience

I’ve found the “elevator pitch” retro to be a useful format. Often when people are asked to quickly deliver all their ideas on post-it notes to the team, they either go into great detail, using a lot of time, or they gloss over important ideas.

Giving each person an allotted time to talk about the one point they most want to discuss helps to focus the conversation. The extra time to explain their point also helps the group decide whether they would like to discuss it further.

If anyone has any other retro ideas that have worked well for them, I would love to hear them!

I’ve just seen a really useful option for curl commands: using the output option to save the response to a file.

Using curl

Curl isn’t a tool I use every day, but whenever I do, I’m always impressed by how useful it is.

By using curl on the command line or in scripts, you can automate grabbing data from URLs. This can be super useful for speeding up manual processes. By using curl’s options to check things like headers, you can quickly get through a lot of URLs that would take a long time to work through manually in a browser.
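As a rough sketch of what I mean (urls.txt is a hypothetical file with one URL per line; the `-w '%{http_code}'` write-out variable prints the HTTP status without downloading the body):

```shell
# Print the HTTP status code for each URL in a file, one per line,
# without saving the response bodies anywhere.
check_urls() {
    while read -r url; do
        printf '%s %s\n' "$(curl -s -o /dev/null -w '%{http_code}' "$url")" "$url"
    done < "$1"
}
```

Running `check_urls urls.txt` then gives you a quick status report for the whole list.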

Output option

Using the output option, the data returned from the curl command will be saved to a file rather than stdout.

For example:

curl http://myurl.com -o myfile.txt

or

curl http://myurl.com --output myfile.txt
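If you want to try the option without hitting a real site, curl also understands file:// URLs, so a small self-contained sketch (the file names here are just placeholders) looks like this:

```shell
# Create a local source file, then use curl --output to copy it,
# exactly as it would save the body of an HTTP response.
echo "hello from curl" > /tmp/curl_demo_src.txt
curl -s "file:///tmp/curl_demo_src.txt" --output /tmp/curl_demo_out.txt
cat /tmp/curl_demo_out.txt
```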

You can check out the rest of the options in the curl documentation.

Future processes

I spotted this option while downloading the install script for the Python package installer (pip), but I think it will also be helpful for automating processes where I have to fetch information from URLs, save the outputs, and then use those files as input to another process.

If anyone else knows of any useful curl options I would love to hear them!

I’ve been thinking recently about why I think TDD is so effective. I think I write better code when I develop using TDD, but I’m also happier when I do. Writing cleaner code probably has something to do with that, but I think it also has something to do with the fabled “zone”.

Tests != TDD

Writing tests does not mean TDD. TDD is when your design is guided by the tests. By following the red, green, refactor steps the design will emerge through writing the minimal amount of code to make your tests pass.

If you write all of your code first then go back to add tests, you are not using TDD.

Benefits of TDD

In my opinion, TDD is a massively helpful tool when trying to write clean and maintainable code. I’ll probably look at the benefits of TDD and the approaches to realise those benefits in other posts.

However, recently I’ve started to notice that following TDD brings other benefits that aren’t immediately apparent from looking at my code. I’ve been spending more time in the elusive “zone”.

What is the zone

People often refer to the “zone” as the productive state of mind where you write your best code, everything comes easily to you, and amazing applications shoot out of your fingertips. I might have exaggerated that last bit.

While I do think some people put too much faith in the zone, I do believe it can take time to get into a productive state of mind.

How TDD can help with the zone

There seems to be a running theme among a lot of developers that interruptions are terrible.

I agree that it can take me time to get back up to speed after an interruption. But I don’t think my productivity is more important than anyone else’s. If somebody needs to interrupt me, I want them to feel free to do so.

This is where I find following a TDD approach can help with productivity. One of the best pieces of development advice I was ever given was “go home on a failing test”. That way, when you come back the next day, it is much easier to pick up where you left off.

Using the XP principle of “turn up the good”, I try to leave a failing test whenever I have to step away from my work. It is much easier to focus on that failing test than to try to remember exactly what I was doing before.

If anyone else has any good tips on trying to stay productive I’d love to hear them.

In an app I was working on, I wanted to decouple the UI from the other architectural resources by using a Node back-end with an Express API. Using the JSON Web Token provided by Cognito allowed me to authenticate AWS Cognito users in the back-end.

AWS does have great documentation, but I couldn’t find many code examples of how to decode the JWT in Node. So hopefully this post might be able to help somebody in a similar position.

Getting the token

The first step is to get the JWT for the Cognito user from the client. The JWT is accessible from the user session when you are using amazon-cognito-identity-js.

cognitoUser.getSession((err, session) => {
    if (err) throw err

    const token = session.getIdToken().getJwtToken()
})

Sending the token with fetch

Once you have your token, you can send that token to your back-end server. In my case I’m using an Express server, so I’ve set up some routes for my API.

Using the Fetch API I posted the token to the server in the headers.

fetch("/api/myEndPoint", {
    method: "POST",
    headers: {
        "Content-Type": "application/json",
        "token": token
    },
    body: JSON.stringify({
        something: "some data"
    })
})

Get the header

To get the token out of the header in Express, you can get it from the request.

router.post('/api/myEndPoint', function(req, res, next) {
    var token = req.get("token")

    // { decode and use the token here }

    res.sendStatus(200)
})

Decode and check the JWT

Once you have the token on the server, you can use jsonwebtoken to decode and verify the JWT.

const jwt = require('jsonwebtoken');

const decodedJwt = jwt.decode(token, { complete: true });
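As a rough illustration of what the decode step returns (the token and claim values below are made up, and there is no signature check involved), a JWT's header and payload are just base64url-encoded JSON:

```javascript
// Build a toy, unsigned token to show the shape jwt.decode works with.
// The aud/iss values here are placeholders, not real Cognito claims.
const payload = { aud: "my-app-client-id", iss: "https://example.com" };
const b64 = (obj) => Buffer.from(JSON.stringify(obj)).toString("base64url");
const token = [b64({ alg: "RS256", kid: "demo" }), b64(payload), "sig"].join(".");

// Decoding the payload is just base64url-decoding the middle segment.
const decodedPayload = JSON.parse(
    Buffer.from(token.split(".")[1], "base64url").toString()
);
console.log(decodedPayload.aud); // my-app-client-id
```

This is exactly why decoding alone is not enough: anyone can mint a token with any claims, so the signature verification further down is the step that actually proves the token came from Cognito.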

The audience on the token should match the app client ID for the Cognito user pool.

if (decodedJwt.payload.aud !== "{app client ID}") {
    throw new Error('Invalid audience: ' + decodedJwt.payload.aud);
}

The issuer on the token should match the Cognito issuer URL, which is made up of the region and the user pool ID as in the URL below.

if (decodedJwt.payload.iss !== "https://cognito-idp.{region}.amazonaws.com/{user pool ID}") {
    throw new Error('Invalid issuer: ' + decodedJwt.payload.iss);
}

Get the JWK

The JSON Web Keys are available at the URL below:

https://cognito-idp.{region}.amazonaws.com/{user pool ID}/.well-known/jwks.json

I used request-promise to call AWS for the JWKs. However, AWS returns all the JWKs for the user pool, so you then have to extract the correct JWK by comparing the key ID (kid) of the decoded JWT.

const rp = require('request-promise');

const options = {
    method: 'GET',
    uri: 'https://cognito-idp.{region}.amazonaws.com/{user pool ID}/.well-known/jwks.json',
    json: true
}

rp(options)
    .then(jwk => {
        var key = jwk.keys.find(key => {
            return key.kid === decodedJwt.header.kid
        })

        // get the PEM from the JWK
        // verify the JWT
    })

Get the PEM

Using jwk-to-pem you can then convert the JWK to PEM format to be used to verify the JWT.

const jwkToPem = require('jwk-to-pem');

var pem = jwkToPem(key);

Verify the JWT

Then by using jsonwebtoken again, you can verify the original token against the PEM key.

jwt.verify(token, pem, function (err, decoded) {
    if (err) {
        throw err
    }
    
    // verified token!
});

Conclusion

Hopefully this example will save some time for people who are looking to verify AWS Cognito JWTs in their server-side code.