infosec python technical

Parsing and Merging Nmap XML Report Files in Python

Here are a couple of tools I wrote in Python to parse and merge/join Nmap .xml report files.

TL;DR:

  • gnxparse.py outputs discovered host and port info from nmap .xml reports, optionally in the form of nmap command(s) to re-scan those hosts.
  • gnxmerge.py glues the <host> sections from multiple nmap XML reports together.
  • You can download them from my GNXTools git repo (linked at the end of this post).

Problem:

Nmap is great for network auditing. Scanning from an internal, privileged, and/or fast network location (e.g. inside your firewall) is straightforward and fast, but doesn’t give you the whole picture: it can’t tell you which of the discovered hosts and services are also exposed from a different, e.g. external/public, network.

To get this info you could do a firewall config audit, but if you don’t have that access, or just want a functional test of the firewall, you need to run another scan. For the same accuracy you’ll want a full-range (1-65535) port scan, and this takes time. This kind of scan is also noisy and may generate a lot of firewall/IPS logs. Lastly, if you traverse an IPS/IDS with such a noisy scan, it may drop you as malicious, and the rest of your results are lost.

Solution:

An alternative approach is to do a full scan internally, and use the results to make a much quieter external scan targeting only known live hosts and services. gnxparse.py can generate nmap ‘rescan’ commands to run from outside the firewall, and gnxmerge.py helps tidy up the results by merging the multiple output files back into a single report.
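
To make the idea concrete, here is a minimal Python sketch of the ‘rescan’ step. This is not the actual gnxparse.py, just an illustration using the standard xml.etree.ElementTree module; the -Pn flag and the rescan_<addr>.xml output naming are my own assumptions for the example.

import sys
import xml.etree.ElementTree as ET

def rescan_commands(xml_path):
    # Walk each <host> in the nmap XML report and collect its open ports.
    for host in ET.parse(xml_path).findall('host'):
        addr = host.find('address').get('addr')  # first <address> entry
        ports = [p.get('portid') for p in host.findall('.//port')
                 if p.find('state').get('state') == 'open']
        if ports:
            # -Pn skips host discovery: we already know the host is up.
            yield 'nmap -Pn -p%s -oX rescan_%s.xml %s' % (
                ','.join(ports), addr, addr)

if __name__ == '__main__':
    for cmd in rescan_commands(sys.argv[1]):
        print(cmd)

Each emitted command scans one host on exactly the ports found open internally, which is what keeps the external pass quick and quiet.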

The workflow goes something like this:

  1. Perform a thorough scan of your publicly routable subnets from a location inside the firewall. (Using some fairly well tuned host discovery options in nmap, it takes me about 3 hours to scan the full port range on ~1000 hosts on a fast internal network.)
  2. Run gnxparse.py with the ‘rescan’ option on the .xml file generated from the internal nmap scan. This will output a bash script with individual nmap commands to probe only those hosts and services found to be up.
  3. Copy the script to an external host with nmap and run it. (For all discovered services on the ~1000 hosts scanned earlier, the re-scan takes me only about five minutes.)
  4. Run gnxmerge.py on the folder of individual .xml reports generated (one per host) if you need to produce a single Nmap XML report file for any reason, for example to load up in Zenmap[1] and review which of your services are exposed externally. (A sketch of the merge step follows this list.)
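
For the curious, the merge step boils down to splicing <host> elements from many reports into one tree. Again, this is only a hedged sketch of the idea, not the actual gnxmerge.py; it assumes the reports sit in one folder, and it doesn’t fix up the host counts in <runstats> or the DOCTYPE/stylesheet header that nmap normally writes.

import glob
import xml.etree.ElementTree as ET

def merge_reports(folder, out_path):
    paths = sorted(glob.glob(folder + '/*.xml'))
    base = ET.parse(paths[0])
    root = base.getroot()
    # nmap writes <runstats> last, so insert the extra hosts just
    # before it to keep the element order nmap itself produces.
    idx = list(root).index(root.find('runstats'))
    for path in paths[1:]:
        for host in ET.parse(path).findall('host'):
            root.insert(idx, host)
            idx += 1
    base.write(out_path, encoding='utf-8', xml_declaration=True)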

Hopefully someone else finds these tools useful. You can download gnxparse.py and gnxmerge.py from my GNXTools repo on Bitbucket.

 

[1] I am aware Zenmap can also load multiple nmap report files for viewing, though it does not merge or save them.

technical website

Posting domain mapped permalinks via the wp-to-twitter plugin

A really useful WordPress feature is support for hosting multiple sites from the same core install. I used to run WordPress-MU for this purpose before it was rolled into the main version. If you have control of DNS you can easily have sites as subdomains, or even map a separate domain to a subsite using a plugin such as WordPress MU Domain Mapping.

Since I use SSL with a wildcard certificate, this also means I can securely administer WordPress subsites via https://subsite-example.glenscott.net while having a public non-SSL URL of http://subsite-example.tld.

I recently installed the WP-To-Twitter plugin on a subsite, and found that when posting updates to Twitter, WordPress unfortunately hands it the http://subsite-example.glenscott.net/link-to-post permalink instead of the http://subsite-example.tld/link-to-post one. This doesn’t seem to be the fault of either plugin: MU Domain Mapping uses some smoke-and-mirrors rewriting to display the mapped http://subsite-example.tld domain, while plugins running under the secure WordPress admin URL (including WP-To-Twitter) rightly see the site as if it were located at https://subsite-example.glenscott.net. I tried a few combinations of changing the site_url and home_url values in the Network Admin → Sites panel, but only succeeded in breaking my domain mapping. Until domain mapping is baked into the WordPress core I suspect this will continue to be an issue.

I’m not a WordPress/plugin dev by any stretch, but I spent a couple of hours looking through the WP-To-Twitter code and concocted a (temporary) hack which now posts the correct permalink to Twitter. I would have been happy with hardcoding the domain value, since the plugin is only being used on one site, but in the end I learned a few things about core WordPress PHP functions and came up with a solution which should work for multiple subsites/domains.

I based part of this solution on the post here:
http://premium.wpmudev.org/forums/topic/permalinks-converted-to-those-domain-based

Here’s the code:

Add the following function to wp-content/plugins/wp-to-twitter/wpt-functions.php

function get_permalink_dom( $id ){
    global $wpdb;
    // Strip the site's home URL off the permalink, leaving just the path.
    $linkpath = str_replace( home_url(), '', get_permalink( $id ) );
    // Look up the active mapped domain for this blog in the table
    // maintained by the MU Domain Mapping plugin. $thisblogid comes
    // from get_current_blog_id(), so it is a safe integer.
    $thisblogid = get_current_blog_id();
    $thisblogdomain = $wpdb->get_var( "SELECT domain FROM wp_domain_mapping WHERE blog_id = $thisblogid AND active = 1" );
    // Rebuild the permalink on the mapped (public, non-SSL) domain.
    return "http://" . $thisblogdomain . $linkpath;
}

Edit the following in wp-content/plugins/wp-to-twitter/wpt-truncate.php

    // comment out/remove the line below (line 15) and replace it with the call to get_permalink_dom
    //$thisposturl = trim($shrink);
    $thisposturl = trim(get_permalink_dom($post_ID));

That’s it; the ‘Tweet Now’ box in the post editor should use the domain mapped link.

infosec linux sysadmin technical

Scanning and Reporting on SSL Cert Expiry Dates – an SSL Certificate Scanner using bash, PHP and jQuery

A while ago I cooked up a bash script to scan relevant internal subnets for SSL certs, save and parse a copy of the X.509 data, and list all the discovered info in a delimited text file for analysis in a spreadsheet.
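
The scanner itself is a bash/openssl pipeline (see the components below), but the core step is easy to illustrate. Here is a hedged Python sketch that grabs one endpoint’s certificate and prints a delimited line; the field layout and ';' delimiter are invented for the example, and it leaves certificate verification on, so self-signed certs will raise an SSLError instead of being listed.

import socket
import ssl
from datetime import datetime, timezone

def cert_line(host, port=443, timeout=5):
    # Fetch the peer certificate over a normal TLS handshake.
    ctx = ssl.create_default_context()
    with socket.create_connection((host, port), timeout=timeout) as sock:
        with ctx.wrap_socket(sock, server_hostname=host) as tls:
            cert = tls.getpeercert()
    # 'notAfter' is a string like 'Jun  1 12:00:00 2025 GMT'.
    expiry = datetime.fromtimestamp(
        ssl.cert_time_to_seconds(cert['notAfter']), tz=timezone.utc)
    cn = dict(item[0] for item in cert['subject']).get('commonName', '?')
    return '%s:%d;%s;%s' % (host, port, cn, expiry.date())

if __name__ == '__main__':
    print(cert_line('example.com'))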

The scan script works well by itself, but for the convenience of quick lookups without involving Excel or LibreOffice, a web page can be useful. PHP provides a simple method for converting a delimited file into a table (fgetcsv()), and jQuery has a great plugin called tablesorter which allows you to do some quick sorting and filtering right there in the browser. It didn’t take long to mash these together into a one-script web page to display the sortable certificate data at a glance.

Sample screenshot:

[Screenshot: sample of the scancerts web page]

The sample only shows the three dummy values I’ve included in the demo, but I’ve used this in production with 600+ scanned certs and it works well.

Scancerts has two main components:

  1. A bash script which eats a text file containing a list of networks to scan, and uses openssl, sed, awk, grep, cut, etc. to generate another text file containing a delimited list of discovered certs.
  2. A PHP script which turns the delimited text file into an HTML table, and augments it with some jQuery so your browser can sort and filter the table on the fly. (A rough Python sketch of this step follows the list.)
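
The shipped page does this conversion in PHP with fgetcsv(); purely as an illustration of the same transformation, here is a rough Python equivalent (the 'tablesorter' class name matches the jQuery plugin, and the ';' delimiter is an assumption).

import csv
from html import escape

def csv_to_table(path, delimiter=';'):
    # Read the delimited cert list; the first row is taken as headers.
    with open(path, newline='') as f:
        rows = list(csv.reader(f, delimiter=delimiter))
    head = ''.join('<th>%s</th>' % escape(col) for col in rows[0])
    body = '\n'.join(
        '<tr>%s</tr>' % ''.join('<td>%s</td>' % escape(col) for col in row)
        for row in rows[1:])
    # tablesorter only needs a well-formed table with <thead>/<tbody>.
    return ('<table class="tablesorter">\n'
            '<thead><tr>%s</tr></thead>\n'
            '<tbody>\n%s\n</tbody>\n</table>' % (head, body))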

Installation Steps

  1. Create a web-accessible folder on your linux box
  2. Unpack the files in the provided archive to the web folder
  3. Make sure file/folder permissions are set correctly (and you can run PHP!)
  4. Add the subnets you want to scan into ‘subnets.txt’
  5. Make ‘scancerts’ executable
  6. Run scancerts and optionally add it to cron
  7. View a nice sortable html list of discovered certs

Download: scancerts_v0.1.tar.gz