I need to find an IP whose geographic location is already known from another source. Normally it's the other way around, but this situation is different; I will explain in a later post. (Updated at 2012-03-04T01:00:14Z: I used this to find an IP from the known geographical location of a malicious ad clicker.)

First, download the access log from GAE. It's simple; this is the command I ran:


python google_appengine/appcfg.py --num_days=3 request_logs project/ access.txt

--num_days=3 requests three days of logs; the default is the logs of the current calendar day. The rest of the request_logs options are listed on that documentation page.

Next, find the geographic location of each IP address. You need to install the MaxMind GeoIP Python binding (backed by the C library) on your system. Here is the core snippet I use to generate the results:


import GeoIP

# Open the IPv4 and IPv6 city databases (standard, on-disk mode).
#gi = GeoIP.new(GeoIP.GEOIP_MEMORY_CACHE)
gi = GeoIP.open("/usr/share/GeoIP/GeoLiteCity.dat", GeoIP.GEOIP_STANDARD)
gi6 = GeoIP.open("GeoLiteCityv6.dat", GeoIP.GEOIP_STANDARD)

with open('ips.txt') as f:
    for IP in f:
        IP = IP.rstrip('\n')
        # IPv6 addresses contain colons; look those up in the v6 database.
        if ':' in IP:
            gir = gi6.record_by_addr_v6(IP)
        else:
            gir = gi.record_by_addr(IP)
        if gir is None:
            # The lookup can fail for addresses not in the database.
            print '%s: no record' % IP
            continue
        print '%s: %s, %s, %s' % (IP, gir['city'], gir['region_name'], gir['country_name'])

GAE accepts both IPv4 and IPv6 connections, so you may want to look up IPv6 addresses as well, or you will need to filter out the IPv6 addresses and drop them. You can download the free city database for IPv6. Note that the Python binding may need to be version 1.2.7 for IPv6 support; I know 1.2.4 does not support it.
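
If you only have the IPv4 database, a minimal sketch of the filtering approach mentioned above could look like this; it just drops anything containing a colon before the lookup, and assumes the same ips.txt file used by the snippet above:


ipv4_only = []
with open('ips.txt') as f:
    for IP in f:
        IP = IP.rstrip('\n')
        # Skip IPv6 addresses, which contain colons.
        if ':' in IP:
            continue
        ipv4_only.append(IP)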

Before you run this snippet, you need to process access.txt to get the unique IPs:


cut -f 1 -d ' ' access.txt | sort | uniq > ips.txt

On Linux, it's as simple as that; you can also do it in Python, which only takes a few lines (a rough sketch follows).
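
Here is an untested sketch of the Python route, assuming the same access.txt and ips.txt file names as above, and that the client IP is the first space-delimited field of each log line (which is what the cut command relies on too):


with open('access.txt') as f:
    # Take the first space-delimited field (the client IP) of each line, deduplicated and sorted.
    ips = sorted(set(line.split(' ', 1)[0] for line in f))

with open('ips.txt', 'w') as out:
    out.write('\n'.join(ips) + '\n')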

So, I ran:


script.py | grep '<CITY NAME>'

I got no results. I am sure that if the requests were made, they must have been within those 3 days. Something is really fishy.

Although I didn't find that IP, I did get this post out of it.