In the Amazon S3 CLI, there are only a handful of commands:
cp ls mb mv rb rm sync website
We can use the cp command to retrieve an S3 object:
$ aws s3 cp s3://mybucket/myfile.json myfile.json
But if only the metadata of the object, such as the ETag or Content-Type, is needed, the S3 CLI does not have any command for that.
Now enter the S3API CLI. Its commands can retrieve not only S3 objects but also the associated metadata. For example, retrieving an S3 object similar to aws s3 cp (a sketch, reusing the bucket and key from above):
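$ aws s3api get-object --bucket mybucket --key myfile.json myfile.json
And to fetch only the metadata, such as the ETag and Content-Type, without downloading the body:
$ aws s3api head-object --bucket mybucket --key myfile.json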
If you work from place to place, such as from one coffee shop to another, and you need access to your Amazon EC2 instances, you probably do not want to allow traffic from all IP addresses. You can use EC2 Security Groups to allow only the IP addresses from those locations. But once you move on to a different location, you want to delete the IP address of the previous one. Doing this manually, over and over again, quickly becomes cumbersome. Here is a command line method that quickly removes all other locations and allows traffic only from your current location.
The steps are:
Revoke all existing sources to a particular port
Grant access to the port only from the current IP address
Assume the following:
Profile: default
Security group: mygroup
Protocol: tcp
Port: 22
First, revoke access to the port from all IP addresses:
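A sketch of the full chain under those assumptions (the JMESPath query may need adjusting to match your group’s rule layout):
$ aws ec2 describe-security-groups \
    --group-names mygroup \
    --query 'SecurityGroups[0].IpPermissions[?ToPort==`22`].IpRanges[].CidrIp' \
  | jq -r '.[]' \
  | xargs -I {} aws ec2 revoke-security-group-ingress \
      --group-name mygroup --protocol tcp --port 22 --cidr {}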
The aws ec2 describe-security-groups command before the first pipe returns JSON formatted data, filtered via a JMESPath query, which the AWS CLI supports. For example:
[
"XXX.XXX.XXX.XXX/32",
"XXX.XXX.XXX.XXX/32"
]
The jq command simply converts the JSON array into line-by-line strings, which xargs takes in, looping through and revoking one IP address at a time.
After this step, all originally allowed IP addresses are revoked. The next step is to grant access to the port from a single IP address:
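For example, using checkip.amazonaws.com (an Amazon-run service; any similar service for discovering your current public IP works) to build the CIDR:
$ aws ec2 authorize-security-group-ingress \
    --group-name mygroup --protocol tcp --port 22 \
    --cidr $(curl -s https://checkip.amazonaws.com)/32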
Amazon SQS, or Simple Queue Service, is a fast, reliable, scalable, fully managed message queuing service. The AWS CLI, or Command Line Interface, is also available for use with the service.
If you have a lot of messages in a queue, this command will show the approximate number:
$ aws sqs get-queue-attributes \
--queue-url $url \
--attribute-names ApproximateNumberOfMessages
Where $url is the URL of the Amazon SQS queue.
There is no command to delete all messages yet, but you can chain a few commands together to make it work:
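A sketch of one such chain: receive a batch of up to 10 messages, extract each receipt handle with jq, and delete them one at a time with xargs. Repeat until the queue is empty:
$ aws sqs receive-message --queue-url $url --max-number-of-messages 10 \
  | jq -r '.Messages[].ReceiptHandle' \
  | xargs -I {} aws sqs delete-message --queue-url $url --receipt-handle {}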
What is the best command line tool to process JSON?
Hmm… Okay, let’s try different command line JSON processing tools with the following use case to decide which one is the best to use.
Here is the use case: JSON | filter | shell. A program outputs JSON data, pipes it into a JSON command line processing tool to filter the data, and then sends it to a shell command to do more work.
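The original data.json is not shown here; assume it holds a single-line JSON array along these lines, with extra fields to be filtered out:
[{"name":"Foo Foo","id":1},{"name":"Bar Bar","id":2}]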
The command line JSON processor should filter each element of the array and convert it into its own line:
{"name":"Foo Foo"}
{"name":"Bar Bar"}
The result will be piped, line by line, as the input to a shell script, echo.bash:
#!/usr/bin/env bash
while read line; do
  echo "ECHO: '"$line"'"
done
The final output should be:
ECHO: '{"name":"Foo Foo"}'
ECHO: '{"name":"Bar Bar"}'
Custom Solution
Before looking for existing tools, let’s see how difficult it is to write a custom solution.
// Filter and convert array element into its own line.
var rl = require('readline').createInterface({
input : process.stdin,
output: process.stdout,
});
rl.on('line', function(line){
JSON.parse(line).forEach(function(item){
console.log('{"name":"' + item.name + '"}');
});
}).on('close', function(){
// Shh...
});
Perform a test run:
$ cat data.json | node filter.js | bash echo.bash
ECHO: '{"name":"Foo Foo"}'
ECHO: '{"name":"Bar Bar"}'
Well, it works. In essence, we are writing a simple JSON parser. Unless you need to keep the footprint small, you don’t want to write yet another JSON parser. Why reinvent the wheel? Let’s look at the existing solutions.
Node Modules
Let’s start with the tools from NPM registry:
$ npm search json command
Here are a few candidates whose descriptions appear to match:
jku - Jku is a command-line tool to filter and/or modify a JSON stream. It is heavily inspired by jq. (2 stars and not active, last update 8 months ago)
json or json-command - JSON command line processing toolkit. (122 stars and 14 forks, last update 9 months ago)
jutil - Command-line utilities for manipulating JSON. (88 stars and 2 forks, last update more than 2 years ago)
Not a lot of choices, and these modules are not actively maintained. This might be because there is already a really good solution: jq, which has 2493 stars and 145 forks, and was last updated 6 days ago.
jq
jq is like sed for JSON data - you can use it to slice and filter and map and transform structured data with the same ease that sed, awk, grep and friends let you play with text. - jq
Instead of an NPM install, do:
$ sudo apt-get -y install jq
We don’t need color or prettified output, just line by line. So, here is the command chain:
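A sketch, using the assumed data.json from above: .[] iterates the array, {name} keeps only the name field, and -c emits compact one-line output:
$ cat data.json | jq -c '.[] | {name}' | bash echo.bash
ECHO: '{"name":"Foo Foo"}'
ECHO: '{"name":"Bar Bar"}'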
The AWS Command Line Interface (CLI) is a unified tool to manage your AWS services. With just one tool to download and configure, you can control multiple AWS services from the command line and automate them through scripts. [2]
The point here is unified: one tool to run all Amazon AWS services.
Install
The installation procedure applies to Ubuntu Linux with Zsh and Bash.
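The exact commands are not shown in the original; a typical route at the time was installing via pip, then verifying with the built-in help:
$ sudo pip install awscli
$ aws help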
You should see a list of all available AWS commands.
Usage
Before using aws-cli, you need to tell it about your AWS credentials. There are three ways to specify AWS credentials:
Environment variables
Config file
IAM Role
Using the config file is preferred; it is in a simple INI file format and is stored in ~/.aws/config. A soft link can be used to link to it, or you can just tell awscli where to find it:
$ export AWS_CONFIG_FILE=/path/to/config_file
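A minimal config file sketch (placeholder values):
[default]
aws_access_key_id = YOUR_ACCESS_KEY_ID
aws_secret_access_key = YOUR_SECRET_ACCESS_KEY
region = us-east-1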
It is even better to use IAM Roles when running on AWS services such as EC2:
The final option for credentials is highly recommended if you are using aws-cli on an EC2 instance. IAM Roles are a great way to have credentials installed automatically on your instance. If you are using IAM Roles, aws-cli will find them and use them automatically. [4]
The default output is in JSON format. Other formats are tab-delimited text and an ASCII-formatted table. For example, using a --query filter and table output:
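A sketch (the choice of columns is illustrative):
$ aws ec2 describe-instances \
    --query 'Reservations[].Instances[].[InstanceId,State.Name,PublicIpAddress]' \
    --output table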
This will print a nice looking table of all EC2 instances.
The command line options also accept JSON format. But when passing in large blocks of data, referencing a JSON file is much easier. Both local files and remote URLs can be used.
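For example, a sketch applying a bucket policy from a local file (put-bucket-policy and policy.json are illustrative; an http:// or https:// URL can be used in place of the file:// prefix):
$ aws s3api put-bucket-policy --bucket mybucket --policy file://policy.json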