Best practice for securing AWS is to set up individual users to access the AWS Console and to assign roles to those users to grant permissions. Once this is in place, setting up an AWS access key on a user's account isn't enough to be able to use the AWS command line or API tools; an additional step of switching into the role is required.
Most of the "getting started" guides don't show this, and older versions of Ansible (< 2) don't support it very well, due to a lack of support in the underlying Boto library.
If you use `yum` to install Ansible, then 1.9.6 is the default package as of CentOS 7.2, so I'd recommend using pip to install Ansible instead. This brings in a later version:
sudo yum install -y python-pip
sudo pip install ansible
sudo pip install boto
sudo pip install awscli
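To confirm which Ansible version pip pulled in (the exact number will depend on when you run the install), you can check with:

ansible --version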
The script above also installs the AWS command line tools. The first thing to execute is `aws configure`, which will prompt you to set the `AWS_ACCESS_KEY` and `AWS_SECRET_KEY` environment variables.
Access keys are set up in the Security Credentials tab of your user at [0]
Once you've got the awscli tools installed and your access keys to hand, running `aws configure` to set up the required environment variables should look something like this:
$ aws configure
$ export AWS_ACCESS_KEY='#####'
$ export AWS_SECRET_KEY='#####'
Default region: eu-west-1
Default format: json
The next step is to use `aws ec2 describe-instances` to take a look at what's already running. If your organisation is using roles for access management, then until you switch roles you'll get authorisation errors like this:
A client error (UnauthorizedOperation) occurred when calling the DescribeInstances operation: You are not authorized to perform this operation.
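As an aside (not from the original post), newer awscli releases include a command that shows which account and ARN the CLI is currently acting as, which helps when debugging these errors:

aws sts get-caller-identity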
Similarly, the `ec2.py` dynamic inventory script for Ansible will return:
ERROR: "Error connecting to AWS backend. You are not authorized to perform this operation.", while: getting EC2 instances
Or, once some of the configuration is complete but not all of the environment variables are set:
ERROR: 'Authentication error retrieving ec2 inventory. - No AWS_ACCESS_KEY_ID or AWS_SECRET_ACCESS_KEY environment vars found - Boto configs found at "~/.aws/credentials", but the credentials contained may not be correct', while: getting EC2 instances
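For reference, once the credentials are in place you can exercise the dynamic inventory script on its own before wiring it into a playbook run. A quick sanity check (assuming `ec2.py` and its `ec2.ini` are in the current directory and `ec2.py` is executable) looks something like this:

./ec2.py --list
ansible -i ec2.py all -m ping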
Even when I switched roles using the instructions at [1], I found that the Boto library expected different environment variable names to the aws cli tools, so I automated the steps.
You'll need to customise the `--role-arn` parameter for your own use. It can be found in the IAM Roles list by switching into the role within the AWS console and viewing it at [2]
To use the scripts, execute `. ./switch_role.sh`. The script runs the `aws sts assume-role` command to create a session and set the Boto environment variables. It looks like the naming convention changed and Boto is still transitioning, so the script sets both the old and new variable names.
In the end, I switched to Terraform, which is really excellent. Anyway...
echo "AWS_ACCESS_KEY " $AWS_ACCESS_KEY
echo "AWS_ACCESS_KEY_ID " $AWS_ACCESS_KEY_ID
echo "AWS_SECRET_KEY " $AWS_SECRET_KEY
echo "AWS_SECRET_ACCESS_KEY " $AWS_SECRET_ACCESS_KEY
echo "AWS_SESSION_TOKEN " $AWS_SESSION_TOKEN
echo "AWS_SECURITY_TOKEN " $AWS_SECURITY_TOKEN
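The echo statements above simply print the variables so you can confirm that sourcing the script worked; the assume-role step itself isn't reproduced in this capture. A minimal sketch of what `switch_role.sh` could look like follows. The role ARN and session name are placeholders, and the JSON parsing uses python purely to avoid extra dependencies; the original script may well have done this differently:

# switch_role.sh: a minimal sketch, not the original script.
# Source it (". ./switch_role.sh") so the exports affect the current shell.

# Placeholders: replace with your own role ARN and a session name of your choosing.
ROLE_ARN="arn:aws:iam::123456789012:role/YourRole"
SESSION_NAME="ansible"

# Ask STS for temporary credentials for the role.
CREDS=$(aws sts assume-role --role-arn "$ROLE_ARN" --role-session-name "$SESSION_NAME" --output json)

# Pull the individual values out of the JSON response.
export AWS_ACCESS_KEY_ID=$(echo "$CREDS" | python -c 'import json,sys; print(json.load(sys.stdin)["Credentials"]["AccessKeyId"])')
export AWS_SECRET_ACCESS_KEY=$(echo "$CREDS" | python -c 'import json,sys; print(json.load(sys.stdin)["Credentials"]["SecretAccessKey"])')
export AWS_SESSION_TOKEN=$(echo "$CREDS" | python -c 'import json,sys; print(json.load(sys.stdin)["Credentials"]["SessionToken"])')

# Older Boto releases expect the previous naming convention, so set both.
export AWS_ACCESS_KEY=$AWS_ACCESS_KEY_ID
export AWS_SECRET_KEY=$AWS_SECRET_ACCESS_KEY
export AWS_SECURITY_TOKEN=$AWS_SESSION_TOKEN

After sourcing it, the echo statements above should show both naming conventions populated, and both the awscli tools and Ansible's ec2 modules will pick up the temporary credentials from the environment.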