Wednesday, July 22, 2020

Creating My OWASIS - Part 3 (Putting the pieces together and wrap-up)


In this third and final post, I will walk through the various components that went into making OWASIS work. In case you missed them, here are the links to Part One and Part Two.

This was the fun part of the process: writing and assembling the scripts into a semi-cohesive package that could be run repeatedly to refresh the inventory information. I had figured out in Part Two that I would rely on a combination of Bash and Perl scripting to make this all work. There were still a few minor obstacles to overcome, though.

For one, I wanted all of the data output in a consistent manner, and some of the commands needed to do this would not render properly when called through a remote, interactive session. So I wrote a script that could be uploaded and then run locally on any of the servers, which I called Remote.sh. This really formed the core of the inventory system: it could run on any server version and would return the data formatted in a consistent manner. The challenge was how to get this script onto all of the servers.
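
To give a flavor of what Remote.sh collected, here is a minimal sketch. The field layout, delimiter, and exact commands are my reconstruction for illustration only - the original script gathered hardware, OS, and WebSphere details across several Unix flavors:

    #!/bin/sh
    # Remote.sh (sketch) - emit basic machine info as one pipe-delimited line
    # so every server, whatever its OS version, reports in the same format
    HOST=`hostname`
    OS=`uname -sr`
    # core count is platform-specific; this form works on Linux, AIX/Solaris differ
    CORES=`grep -c '^processor' /proc/cpuinfo 2>/dev/null`
    IP=`hostname -i 2>/dev/null`
    echo "${HOST}|${OS}|${CORES}|${IP}"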

I decided to tackle the Telnet-only systems first. Since Telnet does not support file transfers, I decided to FTP (ugh, yep, not sFTP since that wasn't available) the Remote.sh script to the server first, then call the script from the Telnet session. This worked nicely and returned the information to the terminal.
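
Scripting the upload is easy with the ftp client's -n flag, which suppresses auto-login so the credentials can be supplied inside a here-document. Roughly like this - the host argument and the credential environment variables are placeholders:

    #!/bin/sh
    # push Remote.sh to one server over plain FTP
    # (unencrypted - see the security disclaimer in Part Two)
    HOST=$1
    ftp -n "$HOST" <<END_FTP
    user $FTP_USER $FTP_PASS
    ascii
    put Remote.sh
    bye
    END_FTP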

The next step was to write a script that would automatically log in over Telnet and then execute the Remote.sh script that had previously been sent to the user's home directory via FTP - I called this script AutoTelnet.pl. It used the previously mentioned "Expect.pm" module to handle sending over the username and password (see security disclaimer in Part Two).
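
Expect.pm waits for the actual login prompts before answering them, which is what makes it reliable. As a crude illustration of the same idea - not the Perl I actually wrote - you can fake it in shell with the old timed-pipe trick:

    #!/bin/sh
    # crude stand-in for Expect.pm: feed responses to telnet on a timer
    # instead of waiting for prompts; fragile and timing-dependent
    HOST=$1
    ( sleep 2;  echo "$TN_USER"
      sleep 2;  echo "$TN_PASS"
      sleep 2;  echo "sh ./Remote.sh"
      sleep 30; echo "exit"
    ) | telnet "$HOST" >> "TN_connections-`date +%Y%m%d`.log"   # datestamp format assumed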

The last piece was a loader script to tie the other two together. All this script did for the Telnet systems was upload Remote.sh to each server and then execute it by running the AutoTelnet.pl script - I named this script Ftp_Remote.sh (for obvious reasons).
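
In sketch form, Ftp_Remote.sh boils down to a loop over the Telnet host list. The helper script name and the AutoTelnet.pl calling convention here are assumptions, not the original code:

    #!/bin/sh
    # Ftp_Remote.sh (sketch) - stage Remote.sh on every Telnet-only host,
    # then hand off to AutoTelnet.pl to log in and run it
    while read HOST; do
        ./ftp_put.sh "$HOST"          # the FTP upload sketched above (hypothetical name)
    done < tn_ips.txt
    perl AutoTelnet.pl tn_ips.txt     # calling convention assumed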

For the SSH servers, I still used Remote.sh to run the commands remotely so that I could capture the data in a consistent manner, but since SSH supports file transfers as well, the process of moving the file and then executing it was very streamlined - and it too leveraged the "Expect.pm" module for automating the login process. I called this script AutoSSH.pl.
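
Ignoring the password automation (which Expect.pm handled in the real script) and assuming key-based auth just to keep the sketch short, the whole SSH flow is roughly:

    #!/bin/sh
    # AutoSSH flow (sketch) - copy, run, collect; the real AutoSSH.pl used
    # Expect.pm to answer password prompts rather than relying on keys
    DATESTAMP=`date +%Y%m%d`          # datestamp format assumed
    while read HOST; do
        scp -q Remote.sh "$SSH_USER@$HOST:"
        ssh "$SSH_USER@$HOST" "sh ./Remote.sh" >> "SSH_connections-$DATESTAMP.log"
    done < ssh_ips.txt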

These scripts collectively represented the real bones of the OWASIS system. I had to write some additional supporting scripts, though, to make this as fully automated as possible. These included nslook.sh, which I used to perform an nslookup across all valid hostname ranges (the bank named their servers sequentially, fyi); listing.pl, which parsed the output of nslook.sh and determined which systems supported SSH and which only supported Telnet; and Parse2csv.pl, which scraped the output files from the Remote.sh runs into a comma-separated value (CSV) file.
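
Conceptually, the discovery pair looked like this. The hostname pattern and range are invented (I'm obviously not publishing the bank's real naming scheme), and the port probe is a shell stand-in for however listing.pl actually decided - here, port 22 open means SSH, otherwise try 23:

    #!/bin/sh
    # nslook.sh (sketch) - walk a sequential hostname range, record what resolves
    for N in `seq -w 1 500`; do                      # range is illustrative
        nslookup "srv${N}.example.com" 2>/dev/null |
            awk '/^Address/ && $2 !~ /#/ { print $2 }' >> resolved_ips.txt
            # the $2 !~ /#/ skips the DNS server's own Address line;
            # nslookup output varies by platform
    done

    # listing.pl logic, shown here in shell for brevity
    while read IP; do
        if nc -z -w 3 "$IP" 22 2>/dev/null; then
            echo "$IP" >> ssh_ips.txt                # SSH-capable
        elif nc -z -w 3 "$IP" 23 2>/dev/null; then
            echo "$IP" >> tn_ips.txt                 # Telnet only
        fi
    done < resolved_ips.txt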

As I mentioned in Part Two - and in hindsight - there were many security issues with the way all of this worked. For one, while I played around with collecting the username and password interactively from the person running the scripts (to avoid hardcoding these values into the files), I still had to use a configuration file (called ftpers.txt) to store them for running the Ftp_Remote.sh script. If you mistyped the password in either the config file or the interactive prompts, it would lock the account, which meant a call to the security team (plus a mea culpa) to get it unlocked. That worked fine for the most part - except on the systems that were Telnet-only, because I could not access FTP until a successful Telnet authentication took place. So I wrote another script, AutoTelPasswd.pl, that was my get-out-of-jail/unlock-account script. Let that run on all of the Telnet servers and I was back in business!

For anyone who has not lost total interest at this point (anyone? Bueller? Bueller?), here are the original instructions I wrote up on how to run OWASIS:

Open-source WAS Inventory Package Instructions

Note: When doing the following, be SURE to use your correct password - failure to do so WILL lock your account on all of the machines it attempts to log into
  1. Replace "<username>" and "<password>" in ftpers.txt
  2. Run "./Ftp_Remote.sh"
    1. After it has automatically ftp'd the Remote.sh script to all of the servers in tn_ips.txt, it will prompt you for a username and password to use to telnet into all of the machines and run the Remote.sh script
  3. Run "perl AutoSSH.pl ssh_ips.txt"
    1. This can be run concurrently with ./Ftp_Remote.sh, as all of the processing is done remotely, so it will not slow down your local machine.
  4. When Ftp_Remote.sh completes, view the log file in an editor that allows block-select mode (TextPad or UltraEdit-32), block-select only the first character in every line of the file, and then delete that block. (This way both log files have the same format)
  5. Run "cat SSH_connections-<datestamp>.log TN_connections-<datestamp>.log > Master_Inventory.txt"
    1. This will populate a single file with all of the output from Telnet and SSH servers
  6. Run "perl Parse2csv.pl Master_Inventory.txt > <output file.csv"
    1. I usually make an output file with a datestamp similar to the tn and ssh_connections files
  7. Open your <output file>.csv file in Excel
    1. There will be three distinct partitions/ranges in the file
    2. Add text labels above the first row in each partition as follows:
      1. Partition 1: Hostname, Brand, Model#, OS Version, Processor Cores, IP Address
      2. Partition 2: Hostname, WAS Versions
      3. Partition 3: Hostname, WAS Home, Server Name, Server Status
    3. Select all of the cells in the first partition/range, go to Data > Filter > Advanced Filter, check "Unique records only," and click OK
      1. Repeat for each of the three partitions
    4. Copy and paste each partition (text labels included) into its own sheet of a new Excel Workbook
    5. Rename the three sheets in the new workbook as follows:
      1. Sheet 1: Machine Info
      2. Sheet 2: WAS Versions
      3. Sheet 3: Server Info
    6. Proceed with any formatting, sorting, etc. of your choice
  8. If you so choose, now that you have a well-formatted Excel "database" file, you can import it into Access to run queries against - each sheet is equivalent to a table in the database, with hostname as the primary key.



