Getting AWS, GCP and Azure IP ranges

TL;DR

Here's a gist with the resulting lists:

https://gist.github.com/khanhicetea/c6cc74b99ab336d58c2da7929c2de709

Introduction

These are the three biggest cloud providers (from Amazon, Google, and Microsoft). Here is how to get their IP ranges (IPv4).

Amazon Web Services

AWS publishes its IP ranges at this link: https://ip-ranges.amazonaws.com/ip-ranges.json

# download the ip-ranges.json file
curl -o aws.json https://ip-ranges.amazonaws.com/ip-ranges.json
# extract the IPv4 ranges with grep (use [0-9], since -E doesn't support the Perl \d class)
grep -oE '([0-9]+\.){3}[0-9]+/[0-9]+' aws.json > aws.txt

The results are in the aws.txt file.
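As a quick sanity check, you can try the pattern against a single entry in the ip-ranges.json format (the record below is an illustrative sample, not live data):

```shell
# each entry of ip-ranges.json carries an "ip_prefix" field, which is what the grep pulls out
sample='{"ip_prefix": "52.95.110.0/24", "region": "us-east-1", "service": "S3"}'
echo "$sample" | grep -oE '([0-9]+\.){3}[0-9]+/[0-9]+'
# → 52.95.110.0/24
```

The file also tags each prefix with its service and region, so you can filter before extracting if you only care about one service.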

Google Cloud Platform

This script is adapted from a LinuxFreelancer article:

# array to hold the list of IP blocks
ALL_IPS=()
NAME_SERVER='8.8.8.8'
txt_records=$(dig @${NAME_SERVER} _cloud-netblocks.googleusercontent.com txt +short)
txt_rr_only=$(echo "$txt_records" | grep -oP 'include:\S+' | sed 's/include://g')
[[ -z ${txt_rr_only} ]] && { echo 'No TXT DNS record found.'; exit 1; }

## unpack the TXT records to get the IPv4 ranges
for rr in ${txt_rr_only}; do
    new_ips=$(dig @${NAME_SERVER} $rr txt +short | grep -oP '(\d+\.){3}\d+/\d+')
    for item in ${new_ips}; do
        # append each block as its own array element
        ALL_IPS+=("${item}")
    done
done

# sort the IPs, then write the result file
echo "${ALL_IPS[@]}" | tr ' ' '\n' | sort -n -t . -k 1,1 -k 2,2 -k 3,3 -k 4,4 > gcp.txt
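Each netblock name resolves to an SPF-style TXT record, and the inner grep keeps only the IPv4 CIDR blocks while dropping the ip6 entries. A hypothetical record illustrates the extraction:

```shell
# a hypothetical _cloud-netblocks TXT record; the grep keeps only the IPv4 CIDRs
txt='v=spf1 ip4:8.34.208.0/20 ip4:8.35.192.0/21 ip6:2600:1900::/35 ?all'
echo "$txt" | grep -oP '(\d+\.){3}\d+/\d+'
# → 8.34.208.0/20
# → 8.35.192.0/21
```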

Microsoft Azure

This script downloads the file updated on Nov 06, 2017; the latest file is available here: https://www.microsoft.com/en-us/download/details.aspx?id=41653

# download the PublicIPs XML file
curl -o azure.xml https://download.microsoft.com/download/0/1/8/018E208D-54F8-44CD-AA26-CD7BC9524A8C/PublicIPs_20171106.xml
# extract the IPv4 ranges with grep (use [0-9], since -E doesn't support the Perl \d class)
grep -oE '([0-9]+\.){3}[0-9]+/[0-9]+' azure.xml > azure.txt
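The XML lists each range as a Subnet attribute on an IpRange element, so the same pattern applies; a single sample line (the subnet value is illustrative) shows the extraction:

```shell
# one line in the PublicIPs XML format; the grep pulls the CIDR out of the attribute
xmlline='<IpRange Subnet="104.40.0.0/13" />'
echo "$xmlline" | grep -oE '([0-9]+\.){3}[0-9]+/[0-9]+'
# → 104.40.0.0/13
```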

Play it all!

#!/bin/bash

# AWS
curl -o aws.json https://ip-ranges.amazonaws.com/ip-ranges.json
grep -oP '\d+\.\d+\.\d+\.\d+/\d+' aws.json > aws.txt
rm -f aws.json

# GCP
ALL_IPS=()
NAME_SERVER='8.8.8.8'
txt_records=$(dig @${NAME_SERVER} _cloud-netblocks.googleusercontent.com txt +short)
txt_rr_only=$(echo "$txt_records" | grep -oP 'include:\S+' | sed 's/include://g')
[[ -z ${txt_rr_only} ]] && { echo 'No TXT DNS record found.'; exit 1; }
for rr in ${txt_rr_only}; do
    new_ips=$(dig @${NAME_SERVER} $rr txt +short | grep -oP '(\d+\.){3}\d+/\d+')
    for item in ${new_ips}; do
        # append each block as its own array element
        ALL_IPS+=("${item}")
    done
done

# sort the IPs, then write the result file
echo "${ALL_IPS[@]}" | tr ' ' '\n' | sort -n -t . -k 1,1 -k 2,2 -k 3,3 -k 4,4 > gcp.txt

# Azure
curl -o azure.xml https://download.microsoft.com/download/0/1/8/018E208D-54F8-44CD-AA26-CD7BC9524A8C/PublicIPs_20171106.xml
grep -oP '\d+\.\d+\.\d+\.\d+/\d+' azure.xml > azure.txt
rm -f azure.xml

macOS issue

If you're running this script on macOS, replace grep -oP with grep -oE, and change \d to [0-9] (BSD grep doesn't support Perl-compatible patterns).
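The substitution below should behave the same on both GNU and BSD grep, shown here against a throwaway input line:

```shell
# portable extraction: POSIX character classes instead of the Perl \d shorthand
echo 'some text 10.0.0.0/8 more text' | grep -oE '([0-9]{1,3}\.){3}[0-9]{1,3}/[0-9]+'
# → 10.0.0.0/8
```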

Use Cases?

This list is useful for quite a few use cases; I will post about them later ;)

Building Automated CI server with Drone and Docker

Introduction

Docker is a great tool for managing Linux containers. It brings DevOps to the next level, from the development environment all the way to production. And of course, before deploying anything to production, software should be tested carefully and automatically.

That's why Drone, a new lightweight CI server built on top of Go and Docker, can help us solve the testing problem in a simple and fast way.

Setup

This guide assumes you already have Docker and the Docker Compose tool. And of course, root permission ;)

Step 1 : Clone my example docker-compose setup from here: https://github.com/khanhicetea/drone-ci

$ git clone https://github.com/khanhicetea/drone-ci
$ cd drone-ci
$ cp .env.example .env

Step 2 : Update your settings in the .env file

Step 3 : Run drone via docker-compose

$ source .env
$ sudo docker-compose up -d

Step 4 : Go to your Drone URL (remember to use the https URL), then authorize with the GitHub provider.

Usage

In the example repo, I created a sample .drone.sample.yml file so you can follow its structure to create your own file.

I will explain some basics here:

clone:
  git:
    image: plugins/git
    depth: 5

pipeline:
  phpunit:
    image: php:7
    commands:
      - /bin/sh conflict_detector.sh
      - /bin/sh phplinter.sh app lib test
      - composer install -q --prefer-dist
      - test/db/import testdatabase root passwd testdb test.sql
      - php -d memory_limit=256M vendor/bin/phpunit --no-coverage --colors=never

  notify:
    image: plugins/slack
    webhook: [your_slack_webhook_url]
    channel: deployment
    username: DroneCI
    when:
      status: success

  notify-bug:
    image: plugins/slack
    webhook: [your_slack_webhook_url]
    channel: bugs
    username: DroneCI
    when:
      status: failure
      branch: production

services:
  testdatabase:
    image: mysql:5.7
    detach: true
    environment:
      - MYSQL_DATABASE=testdb
      - MYSQL_ROOT_PASSWORD=passwd

This file consists of 3 sections :

  • clone : clones the source code and prepares for the pipeline steps. This section runs first.
  • services : declares the Docker services (databases, API servers) that your code connects to. This section runs at the same time as the pipeline (after clone).
  • pipeline : the testing pipe, where you put your testing logic.

In this pipeline, I made an example PHP test run through these steps :

  1. Check for merge conflicts in the code (grep for conflict markers such as the <<<<<<< HEAD string)
  2. Run a PHP linter on the application code
  3. Run Composer to install all dependencies
  4. Import the testing database into the MySQL service (using the testdatabase hostname to connect to the service)
  5. Run the test suite via the phpunit tool

Then, it notifies the test result to a Slack channel! ;)
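The first step runs conflict_detector.sh, which lives in the repo; I haven't reproduced it here, but a minimal version could look like this sketch (the filenames and messages are hypothetical):

```shell
# hypothetical sketch of conflict_detector.sh: fail the build when any source
# file still contains Git merge-conflict markers
workdir=$(mktemp -d)                      # stand-in for the checked-out source tree
printf '<?php echo "hello";\n' > "$workdir/index.php"

if grep -rq '<<<<<<< HEAD' "$workdir"; then
    result='conflict markers found'       # in the real script: exit 1 to fail the build
else
    result='no conflicts'
fi
echo "$result"
```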

A picture is worth a thousand words

Drone CI screenshot

Let's automate all the things!