Mabuhay

Hello world! This is it. I've always wanted to blog. I don't want fame; I just want to be heard. No! Just to express myself. So I don't really care if anyone believes what I write here, or if anyone even gets interested in reading it. My posts may be novel-length or one-liners; it doesn't matter. Still, I'm willing to listen to your views, as long as you can justify them... Well, enjoy your stay, and I hope you'll learn something new, because I just did and I'm sharing it with you.. Welcome!

Wednesday, November 18, 2009

Perl-icious: Function that reads from a config file

I was asked by my colleague to make a function he'll use for one of our scripts - to make our lives easier, which I don't think will happen unless THEY get rid of the night effin' shift! Anyway, it took me some time to figure this out (I'm learning, ok?!) and then I sought the help of a resident guru, BE (his initials, you idiot!).

Background:

I was to read the NetAgent configuration file and determine if the server is an MDH or a P2PS (this is about RMDS). The other function was to do the same from the RMDS config file, which means I had to make two.

Apologies, but I cannot post the config files here. I believe you can easily see how it processes things, though.

Here is the code - as a standalone:

sub inNetAgentConfig {

    my $host      = shift;
    my $conf_file = "/path/to/config.file";

    open my $fh, '<', $conf_file or die "Error: $!\n";
    local $/ = "ATTRIBUTES";    # set the record separator to ATTRIBUTES instead of the default newline; see perlvar
    my @records = <$fh>;
    close $fh;

    # Check the file's contents (not the filename!) for the host.
    foreach my $record ( grep { /$host/ } @records ) {
        if ( $record =~ /GROUP=(\w+)/ ) {
            return $1;
        }
    }
    return "Non-existent";
}
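As a side note, the same record-splitting idea works in shell, using awk's RS variable the way the function uses Perl's $/. This is only a sketch against a made-up two-record config (I can't show the real NetAgent format, remember), so treat the path and fields as placeholders:

```shell
#!/bin/sh
# Hypothetical config: two "ATTRIBUTES" stanzas, one per server.
conf=/tmp/netagent.conf.$$
cat > "$conf" <<'EOF'
ATTRIBUTES host1 GROUP=MDH other=stuff
ATTRIBUTES host2 GROUP=P2PS other=stuff
EOF

lookup() {
    # RS=ATTRIBUTES makes each stanza one awk record; print the GROUP of
    # the first record mentioning the host, or Non-existent if none does.
    awk -v host="$1" 'BEGIN { RS = "ATTRIBUTES" }
        $0 ~ host && match($0, /GROUP=[A-Za-z0-9_]+/) {
            print substr($0, RSTART + 6, RLENGTH - 6); found = 1; exit }
        END { if (!found) print "Non-existent" }' "$conf"
}

result=$(lookup host2)     # P2PS
missing=$(lookup host9)    # Non-existent
echo "$result / $missing"
rm -f "$conf"
```

Multi-character RS is a gawk/mawk feature, so this won't fly on a strict POSIX awk.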


Another function that I made:


sub inRmdsConfig {

    my $host      = shift;
    my $rmds_file = "/path/to/config.file";

    open my $fh, '<', $rmds_file or die "Error: $!\n";
    while ( my $line = <$fh> ) {
        chomp $line;

        if ( $line =~ /^$host\*(\w+)\*serverId/ ) {
            return $1;
        }
    }
    return "Non-existent";    # only give up after scanning the whole file
}
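The second sub can be mimicked in plain shell, too. Again just a sketch: the two-line file below is invented to match the regex, not a real RMDS config.

```shell
#!/bin/sh
# Hypothetical lines shaped like host*COMPONENT*serverId ...
rmds=/tmp/rmds.conf.$$
cat > "$rmds" <<'EOF'
host1*MDH*serverId 1
host2*P2PS*serverId 7
EOF

# sed pulls out whatever sits between the asterisks on the host's line;
# fall back to Non-existent when the host never appears.
comp=$(sed -n "s/^host2\*\([A-Za-z0-9_]*\)\*serverId.*/\1/p" "$rmds")
[ -n "$comp" ] || comp="Non-existent"
echo "$comp"     # P2PS
rm -f "$rmds"
```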

Saturday, November 14, 2009

MySQL - an old friend: DELETE and OPTIMIZE

Hey, how have you been? I know, it's been some time. Oops, I'm talking to myself again! A lot of things have happened and I could have posted some of them, but I wasn't so excited. Not until now.

Background:

We keep receiving disk space alerts for a certain file system on a server owned by our team. What we usually do is run a script (from crontab) which basically cleans up the MySQL log files. Yes, the nasty log files. But, I think, they forgot to consider having a script that trims the database/s - with entries that date back to 2006-07! Well, a colleague gave us a script that supposedly does this, but it has some errors, so for now it's useless.

After checking with the senior engineers (read: oldies), it was agreed to delete some data and leave 30 days' worth of records. As I dug deeper, I found out that one of the tables has 177M rows - yes, that is M-illion! Hmm, it looked like I'd bitten off more than I could chew. But as usual, I am hard-headed (sometimes, ok?) and I wouldn't give up just yet.

I used to do some maintenance in my first job. I did some dumping, checking replication (my boss set it up, not me), deleting, optimizing, etc. And it helped me - made me feel confident; that's what experience can give you.

After some experimenting - deletion, converting UNIX timestamps - I started my work. I deleted rows by the millions, about 10M in each run (I used ORDER BY and LIMIT here). You may wonder why not in one shot: the problem with DELETE is that it locks the table, and the table is used by a monitoring tool which cannot be stopped for long. From SHOW PROCESSLIST, I could see a lot of INSERT statements queueing - and deleting 10M rows takes 16+ minutes.

I was so happy that I forgot that deleting doesn't automatically reduce disk usage. Just a side note: in the back of my mind, something tells me this depends on how the table was created (?), not sure though. I'll read up on this more, I guess. Going back: I did get it down to 55M+ rows and still saw 96% disk space usage. Outrageous! After a little research, I came up with a fragmented-rows theory - well, not originally mine, of course; I read it somewhere. And the way to clean up this mess is OPTIMIZE TABLE.

But before doing it, I did some homework. What is the effect of optimization? I felt that behind this good thing an evil was lurking! Again, the table is locked, and it will create a .TMD file (a temporary .MYD file - please read more on this) that can wreak havoc. It can fill up the file system, depending on how big the table is and how much space is left. So be very careful when optimizing - I learned this first hand. I also ran CHECK TABLE [STATUS], which could give me some indication of whether the table needs repair, or anything else. Though, at times, this won't tell you anything at all. In my case, it said everything was good - and yet I had a ton of fragmented rows. Could be some limitation - again, read up on it.

After all these crappy steps, I was ready to corrupt the database. Oh, before I forget: P-L-E-A-S-E, if you have a test database or something, do these steps there and not on prod. Trim the database there, then take a short downtime (stopping applications) to move it back. So here are the simple steps I took:

1. I used this to get my target date, to set my limit later for the deletion.

SELECT UNIX_TIMESTAMP('YYYY-MM-DD HH:mm:ss');

2. Now, we're ready to delete some rows, with limits. This is IMPORTANT, or the next thing you know, you've deleted the entire data set!

DELETE FROM <tablename> WHERE timestamp < <number from SELECT statement here> LIMIT 10000000;

3. On this part, I optimize. You might wonder why now, and not after the whole thing. Well, whether you can leave it until the end depends on the number of rows you just deleted. In this case, it's just too darn big, which could lock my table for a very long time (INSERTs queue up) and create a .TMD file so big it could overwhelm my file system - with a domino effect on the other applications/processes that use it.

OPTIMIZE TABLE <tablename>;

Let this run to completion, or you could end up with an unusable table or corrupted data! You've been warned. Of course, you can run it again to fix it, or do a REPAIR TABLE. But who knows, you might also lose it all. As the S.A. says, "He who laughs last has a backup."

4. And then, I am "maarte" (fussy), so I added this. It tells MySQL to close the table and flush its cached handles.

FLUSH TABLE <tablename>;
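Step 2 can be driven in batches from the shell. This is only a sketch: the mysql call is left commented out (credentials and the table/column names are placeholders), so the loop here just prints the statements it would run. In real life you'd loop until the DELETE reports zero affected rows instead of guessing the count up front.

```shell
#!/bin/sh
CUTOFF=1167609600        # the number you got from SELECT UNIX_TIMESTAMP(...)
BATCH=10000000           # 10M rows per run, as above
REMAINING=55000000       # pretend count of rows older than the cutoff
runs=0
while [ "$REMAINING" -gt 0 ]
do
    echo "DELETE FROM tablename WHERE timestamp < $CUTOFF LIMIT $BATCH;"
    # mysql -u user -p dbname -e "DELETE FROM tablename WHERE timestamp < $CUTOFF LIMIT $BATCH;"
    runs=$((runs + 1))
    REMAINING=$((REMAINING - BATCH))
done
echo "$runs batches"     # 6 batches for 55M rows
```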

That pretty much sums up what I just did. So long. And yes, I'd like to hear from you - corrections are always welcome. Cheers!

Thursday, June 11, 2009

Scripting 101: Modification on the previous topic

Before I modified the original script, I was advised by my colleague to take note of the date format, especially for single-digit days, which was not handled before. So, the corrections include re-assigning variables, deleting unwanted lines, throwing errors to null, etc.

Here they are:

1. CSV_DATE=`date '+%a %b %d'`
2. LIST_DATE was removed.
3. echo-es were deleted.
4. Errors were thrown to /dev/null; when added to cron, this will be re-directed to /dev/null as well.
5. mail was not working so I used EOF for the body:

/usr/ucb/mail -s "Subject here..." $EMAIL_ADD <<\
EOF

# Whatever sits between the EOFs will be treated as regular formatted text (including blank lines and spaces), except for variable substitution such as here, which will output the result - the current date.

This will be a regular text.
$(date)

EOF

6. Since LIST_DATE was removed, I used 'find' to locate the most recent files that were modified, as:

for CSV in `find /path/of/files -name "some*.csv" -mtime 0\
2> /dev/null`
do
cp -p $CSV /some/httpd/location/
done
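Back on point 5: whether the shell expands things inside the heredoc depends on how the delimiter is written. An unquoted EOF expands $(...) and $VAR; quoting it (\EOF or 'EOF') keeps the body literal. A quick standalone demo:

```shell
#!/bin/sh
# Unquoted delimiter: the command substitution runs.
body=$(cat <<EOF
Today is $(date +%Y)
EOF
)

# Quoted delimiter: the body is taken verbatim, no expansion at all.
literal=$(cat <<'EOF'
Today is $(date +%Y)
EOF
)

echo "$body"        # e.g. "Today is 2009"
echo "$literal"     # "Today is $(date +%Y)"
```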

I hope everything will be ok after this week's test. It'll then be submitted for cron-nization!

Saturday, May 23, 2009

Scripting 101: Shell Script that uses Perl

Good morning (very sleepy now!).. I promised myself that I would modify the script we're using to generate a report. I'm not familiar - yet - with its contents, but the thing is, we run this manually every night and then copy the output to another directory, which makes it available to the user/s. It's just another basic modification, but I added some "twists" as a safety precaution to stop unwanted operations - well, that's what I hope, at least.

The main script was made by "Someone Else" (There! I ain't saying it's mine.). In time, I'll interpret the Perl part here.


#! /usr/bin/sh

#
# This is a modified version.
# Author: Someone Else
# Modified by: ME
# Renamed: badname_PT.sh
# Date: 23 May, 2009
# Version: 1
#

CSV_DATE=`date | awk '{print $1,$2,$3}'`
E_NOTCD=43
E_OK=0
E_OTHER=45
EMAIL_ADD="name@domain.com"
FILE_DATE=`date '+%d_%m_%y'`
HOSTS="/path/to/host_list_PT"
LIST_DATE=`date | awk '{print $2,$3}'`
SCRIPTDIR="/var/tmp/jf"
export all

# On top of the original script, hour condition was placed to make sure that it runs only after 17:59 daily.
if [ `date +%H` -gt "17" ]
then

cd $SCRIPTDIR || {
/usr/bin/mailx -s "Can't change to $SCRIPTDIR; Please \
check permissions..." $EMAIL_ADD
exit $E_NOTCD
}

# This generates the reports; the heart of the script
for H_LIST in `cat $HOSTS`
do
echo "Extracting bad names from P2PS: $H_LIST"

# No more manual intervention in changing the dates (CSV_DATE was used - from current date)
rsh $H_LIST cat /path/to/some.log* | grep \
"$CSV_DATE" | grep ptrade | perl -n -e \
'if (m/^.* ([0-9]*:[0-9]*:[0-9]*).*Open Failure for (\(.*:.*\)) \
by (\(.*\)) at (\(.*\/net\)).*/) {$_= "$1,$2,$3,$4"; $_ =~ s/[\)|\(]//g;\
print "$_\n";}' > /var/tmp/PT_BadRequests.$H_LIST.csv

sleep 1

echo "Copying /var/tmp/PT_BadRequests.$H_LIST.csv /var/tmp/jf"
cp /var/tmp/PT_BadRequests.$H_LIST.csv /var/tmp/jf
echo "Copy completed for $H_LIST"
done

sleep 3

chmod 666 /var/tmp/jf/PT_*

rm /tmp/PT_BadNames_*
rm /some/httpd/html/PT_BadNames*

tar cvf /tmp/PT_BadNames_$FILE_DATE.tar ./PT*csv
compress /tmp/PT_BadNames*.tar
cp /tmp/PT_BadNames* /some/httpd/html
chmod 666 /some/httpd/html/PT_BadNames*

sleep 2

rm /tmp/PT_BadNames*
rm /var/tmp/jf/PT_BadReq*

# This was added to copy the newly generated CSVs from /var/tmp to /app/httpd/html site
if cd /var/tmp
then
for csv in `ls -l *csv | grep PT_BadRequests | grep "$LIST_DATE" \
| awk '{print $9}'`
do
cp -p $csv /some/httpd/html/Primetrade_BadRequests/
ls -l /some/httpd/html/Primetrade_BadRequests/$csv
done

# On completion, a mail will be sent to intended recipient/s.
/usr/bin/mailx -s "Bad Name PRIMETRADE Report is DONE" $EMAIL_ADD
exit $E_OK

else
/usr/bin/mailx -s "Can't change to /var/tmp; Please check \
permissions..." $EMAIL_ADD
exit $E_NOTCD
fi

fi

echo "Not yet..."
exit $E_OTHER
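The `cd $SCRIPTDIR || { ...; exit; }` guard near the top of the script is worth showing on its own: the block after || only runs when cd fails, so the script never carries on in the wrong directory. A minimal sketch (the bad path is deliberate, and a plain echo plus return code stands in for the mailx + exit above):

```shell
#!/bin/sh
guard() {
    cd /no/such/dir 2>/dev/null || {
        echo "Can't change directory; please check permissions..."
        return 43     # stand-in for mailx + exit $E_NOTCD
    }
    echo "never reached"
}

guard || echo "guard returned $?"    # prints the message, then 43
```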

Also, if this copy step is run manually, under csh, copying each file would be "tedious". So, I made a foreach loop to do it - very basic, but the syntax is tricky (csh's foreach has no do keyword, just the body and end).

% set LIST_DATE=`date +%b" "%d`
% foreach csv (`ls -l /var/tmp/*csv | grep PT_BadRequests | grep "$LIST_DATE" | awk '{print $9}'`)
? cp -p $csv /some/httpd/html/Primetrade_BadRequests/
? ls -l /some/httpd/html/Primetrade_BadRequests/$csv
? end
%

Saturday, May 09, 2009

Parents' Wish

A little tribute to our parents on Mother's Day.

To Our Dear Child:

On the day when you see us old, weak and weary ...
Have patience and try to understand us ...
If we get dirty when eating ... If we can't dress on our own ...
Please bear with us and remember the times we spent feeding you and dressing you up.
If, when we speak to you, we repeat the same thing over and over again ... do not interrupt us... listen to us.
When you were small, we had to read to you the same story a thousand and one times until you went to sleep.
When we do not want to have a shower, neither shame nor scold us ...
Remember when we had to chase you with your thousand excuses to get you to the shower?
When you see our ignorance of new technologies ... help us navigate our way through those worldwide webs.
We taught you how to do so many things ... to eat the right foods, to dress appropriately, to fight for your rights ...
When at some moments we lose the memory or thread of our conversation ... Let us have the necessary time to remember ... and if we cannot, do not become nervous ...
as the most important thing is not our conversation but surely to be with you and to have you listening to us ...
If we ever do not feel like eating, do not force us.
We know well when we need to and when not to eat.
When our tired legs give way and do not allow us to walk without a cane.
Lend us your hand, the same way we did when you tried your first faltering steps.
And when someday we say to you that we do not want to live any more, that we want to die.
Do not get angry. Some day you will understand. Try to understand that our age is not just lived but survived.
Some day you will realize that, despite our mistakes,
we always wanted the best for you and we tried to prepare the way for you.
You must not feel sad, angry nor ashamed for having us near you.
Instead, try to understand us and help us like we did when you were young.
Help us to walk... Help us to live the rest of our life with love and dignity.
We will pay you with a smile and by the immense love we have always had for you in our hearts.

We love you, child.
Mom and Dad

Sunday, April 26, 2009

VirtualBox 101: Enable USB in Winhoes XP on Mac OS X host (same for Unices)

Hello! Why do I let others keep forcing me to do things that are against my will (even though I enjoy them)? Crap!

I don't want to jinx it, but my target for the year is a Solaris certification. I know, I know, I've got a very long way to go and a steep hill to climb, but I don't give up that easily!

You might wonder, what the hell is the connection? Does anyone care? Certainly no one, but I just want to shout out a piece of my mind anyway - remember, I get to write whatever I want, ayt? This is to "force" me to really go for it, for I will be judged from here on. Crap! Stressful!

Going back: I got some materials - take note, legally obtained - in preparation, but the sad part is, they will run only in Winhoes (now, this is really crap!). I really had no choice but to install XP in my VirtualBox. I planned to share folders from my Mac, but from what I read, it is still not supported ("Note however that Shared Folders are only supported with Windows (2000 or newer), Linux and Solaris guests."). So, I'll use my flash drive (SanDisk Cruzer Micro - 4GB; 16GB looks awesome BUT it burns a deep hole in your pocket!). With this, I had to activate USB in the settings of my guest OS. After I checked the box to enable it and ran the guest OS, nothing showed up in XP. Nor was it detected in VirtualBox, yet I could see it mounted on my Mac. I tried to create a Filter with empty details, for I knew nothing about it, and still nothing. Finally, I was forced to read the user guide (when nothing works, check the manual.. perfect), and I found this command to get the info I needed for the Filter:


$ VBoxManage list usbhost
VirtualBox Command Line Management Interface Version 2.2.0
(C) 2005-2009 Sun Microsystems, Inc.
All rights reserved.

Host USB Devices:

UUID: 98ac0131-bc9e-48f1-8dd9-9512cba98fba
VendorId: 0x045e (045E)
ProductId: 0x0737 (0737)
Revision: 1.0 (0100)
Manufacturer: Microsoft
Product: Compact Optical Mouse 500
Address: p=0x0737;v=0x045e;s=0x000005f7ad04cd23;l=0x06200000
Current State: Unavailable

...

UUID: 796f5590-1ed2-45fc-9c98-6baec073ad0e
VendorId: 0x0781 (0781)
ProductId: 0x5406 (5406)
Revision: 2.0 (0200)
Manufacturer: SanDisk
Product: U3 Cruzer Micro
SerialNumber: 08777107D092855A
Address: p=0x5406;v=0x0781;s=0x0000022d92a681f1;l=0x26200000
Current State: Unavailable

Armed with this info, I filled in the Filter details and restarted XP, and voila, it's now working. Couldn't be happier.
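The filter fields can also be pulled out of that listing with a bit of awk instead of retyping them. A sketch: the heredoc below is just the SanDisk stanza from the output above, and awk grabs the hex IDs that go into the VendorId/ProductId boxes.

```shell
#!/bin/sh
# Stanza copied from `VBoxManage list usbhost` output above.
listing=$(cat <<'EOF'
UUID: 796f5590-1ed2-45fc-9c98-6baec073ad0e
VendorId: 0x0781 (0781)
ProductId: 0x5406 (5406)
Manufacturer: SanDisk
Product: U3 Cruzer Micro
EOF
)

vendor=$(echo "$listing" | awk '/^VendorId:/ {print $2}')
product=$(echo "$listing" | awk '/^ProductId:/ {print $2}')
echo "VendorId=$vendor ProductId=$product"   # VendorId=0x0781 ProductId=0x5406
```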

Sunday, April 19, 2009

Scripting 101: Monitoring Autosys Jobs - Correction in SU_MAILER

Whoa! My prayer was answered! Problem is solved!

A helpful shell scripting guru, from a forum I'm on, was able to address my dilemma with the duplicate entries. I had created a confusion which, unknowingly, had a BIG impact on its evaluation of the regex. Without much ado, here it is:

sed '/'"$JOBNAME"'/{x;/Y/!{s/^/Y/;h;d;};x;}' JOBLIST.txt

Perfect! I'm just waiting for the explanation. I'm too lazy right now to search for it myself.
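While waiting, here is my reading of what the one-liner does, on a toy JOBLIST: for every line matching the job name, it checks a flag kept in sed's hold space. The first match sets the flag and gets deleted; later matches pass through. So with two duplicate entries, only the later one survives (which suits a file that gets appended to over time).

```shell
#!/bin/sh
# Toy JOBLIST with a duplicate jobA entry.
printf 'jobA\njobB\njobA\n' > /tmp/JOBLIST.$$

# x swaps pattern and hold space; /Y/! fires only while the flag is unset.
kept=$(sed '/jobA/{x;/Y/!{s/^/Y/;h;d;};x;}' /tmp/JOBLIST.$$)
echo "$kept"     # jobB, then jobA
rm -f /tmp/JOBLIST.$$
```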

Wednesday, April 01, 2009

Scripting 101: Monitoring Autosys Jobs - Changes

When I got to the office today, I started working on the planned changes to the script. Though not all of them are done, I was able to find a way to implement number 4 from my earlier list. So here it goes.

Let me first establish that this is primarily designed for the current setup of the jobs we're monitoring. We already know when each will be executed, so the list is pre-arranged by execution time. That's the basis of the changes I made. And as always, this is a work in progress, so bear with me.

First change that I did was to add a couple of lines in the JOB_CHECKER function after the SU_MAILER.

cp -p JOBLIST.txt TEMP.out
sed '1d' TEMP.out > JOBLIST.txt

It deletes job 1 from JOBLIST.txt so as not to waste time looping over jobs already marked as SU. It's kinda lame, really, but I'm looking for a better way to clean up this mess.

The second one is changing the condition in JOB_CHECKER function.

if [ "$DONE_JOB" = "$ONQUEUE_JOB" ]

was changed to [and DONE_JOB is removed]:

if grep -w $ONQUEUE_JOB EXCLUDEJOB.txt

Third is the WHILE-loop. The script now exits when there's nothing left [to read] in JOBLIST.txt. Hence, COUNTER is no longer needed.

while [ "$NO_JOBS" -gt "0" ]
...

It's still consuming a lot of CPU and I haven't figured this one out yet. To give "others" a chance, I put the script to sleep after each FOR-loop.

And lastly, a safety net was added in the SU_MAILER function. It'll erase duplicate entries from the exclusion list.

SU_MAILER() {
/sbcimp/run/tp/CA/Autosys/v4.0/bin/autorep -J $JOBNAME | grep $JOBNAME | awk '{sub (/:/,""); print $1$2$3}' >> EXCLUDEJOB.txt

# Dedup via a temp file; redirecting awk straight back onto EXCLUDEJOB.txt
# would truncate the file before awk gets a chance to read it.
awk '!x[$0]++' EXCLUDEJOB.txt > EXCLUDEJOB.tmp && mv EXCLUDEJOB.tmp EXCLUDEJOB.txt

I hope I'm on track with improving my scripting skills; I'm always looking forward to a new challenge.
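On that awk dedup: the `!x[$0]++` idiom does keep exactly one copy of each line (the first), but you cannot read and overwrite the same file in one shot, because the `>` truncates it before awk opens it. Going through a temp file works. A toy run with made-up job entries:

```shell
#!/bin/sh
list=/tmp/EXCLUDEJOB.$$
printf 'job1SU0100\njob2SU0200\njob1SU0100\n' > "$list"

# x[$0]++ is 0 (falsy) the first time a line is seen, so only first
# occurrences are printed; write to a temp file, then move it back.
awk '!x[$0]++' "$list" > "$list.tmp" && mv "$list.tmp" "$list"

deduped=$(cat "$list")
echo "$deduped"      # job1SU0100 and job2SU0200, one copy each
rm -f "$list"
```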

********
This part was added on 15 April, 2009 at 21:33 SST.

The variable $NO_JOBS was removed. Instead, its value is computed right in the WHILE-loop condition. It was late when I found out that it was NOT being updated. Also, there are times when some of the listed jobs may be invalid for one reason or another, so a condition was added for that as well. The code is now:

while [ `cat JOBLIST.txt | wc -l` -gt "0" ]
do
for JOBNAME in `cat JOBLIST.txt`
do

if /sbcimp/run/tp/CA/Autosys/v4.0/bin/autorep -J $JOBNAME | grep "Invalid Job Name" > /dev/null 2>&1
then
cp -p JOBLIST.txt VALJOB.out
sed -e "\|^$JOBNAME\$|d" VALJOB.out > JOBLIST.txt
continue
fi

...
done

exit $E_DONE

SU_MAILER() is still in the works. I still can't find how to delete duplicate entries while leaving one of them. The "awk" part I previously used was scrapped; it's NOT up to the job. Of course, it's my fault. ;)

I really hope I'll keep the flame to continue with this, for I've started my new job and am being "forced" to take on Perl, which seems great. Now I know that it isn't purely a scripting language but rather sits in the grey area between interpreted and compiled. Still a long wayyyyyy to go. See'ya round.

Scripting 101: Monitoring Autosys Jobs - Evolved Form

Finally, here it is! I had this tested last night - on a PROD server [Linux only, for some commands might NOT run on Solaris] - and it went well. After making some adjustments, it behaved as expected. Wow, I can't believe it. After a long while, I did it. Not, of course, without the valuable insights from my mentors - CJ and Mors - who gave their all [I hope!?]. I am a student and a beginner at scripting. For now, I'm still analyzing the logs to maximize performance, for I have seen that it wastes so much time in loops. So anyone out there, please help me!

Here are some of what I've observed and future plans:
1. Run via cron; OR run it via "screen"
2. High CPU utilization
3. Integrate mutt in the script [not sure where but I'll try; it's a bit cool, ain't it?]
4. Instead of having an exclusion file, after a job is done it will be removed from JOBLIST.txt, and the script will exit when there's nothing left to read.

**********
#! /bin/bash

# Calling AUTOSYS environment
. /sbcimp/shared/config/CA/Autosys/v4.0/prod/autouser/shellPLN

# Definition and initialization of VARIABLES.
E_NOTFOUND=66
E_MISSING=43
E_DONE=0
EMAIL_ADD=`whoami`
COUNTER=0
NO_JOBS=`cat JOBLIST.txt | wc -l`


# Initialization of EXCLUSION file; this will be used to exclude successful job
> EXCLUDEJOB.txt
/bin/touch EXCLUDEJOB.txt
if [ ! -w EXCLUDEJOB.txt ]
then
/bin/echo "EXCLUDEJOB.txt file does not exist and/or not writeable! Check write permissions on the directory." | /bin/mail -s "EXCLUDEJOB.txt Missing" $EMAIL_ADD
exit $E_NOTFOUND
fi


# This is a mailer function for successful jobs; once the job ended successfully, it will write to EXCLUDEJOB.txt before sending a mail
SU_MAILER() {
/sbcimp/run/tp/CA/Autosys/v4.0/bin/autorep -J $JOBNAME | grep $JOBNAME | awk '{sub (/:/,""); print $1$2$3}' >> EXCLUDEJOB.txt

/bin/mail -s "FOS Job Monitor Update" $EMAIL_ADD <<EOF
****** Feed Status ******

Job ended SU-ccessfully!
$(/sbcimp/run/tp/CA/Autosys/v4.0/bin/autorep -d -J $JOBNAME)

***** End of Status *****
EOF
} # End of SU_MAILER function


FA_MAILER() {
/bin/mail -s "FOS Job Monitor Alert" $EMAIL_ADD << EOF
***** PLEASE check Job Status *****

$(/sbcimp/run/tp/CA/Autosys/v4.0/bin/autorep -d -J $JOBNAME)
$(/sbcimp/run/tp/CA/Autosys/v4.0/bin/autorep -q -J $JOBNAME)

***** End of Status ****
EOF
} # End of FA_MAILER function


JOB_CHECKER() {
DONE_JOB=`grep $JOBNAME EXCLUDEJOB.txt`
ONQUEUE_JOB=`/sbcimp/run/tp/CA/Autosys/v4.0/bin/autorep -J $JOBNAME | grep $JOBNAME | awk '{sub (/:/,""); print $1$2$3}'`

if [ "$DONE_JOB" = "$ONQUEUE_JOB" ]
then
continue
else
JOB_TIMESTAMP=`/sbcimp/run/tp/CA/Autosys/v4.0/bin/autorep -J $JOBNAME | grep $JOBNAME | awk '{sub (/:/,""); print $3}'`

if [ "$DATE_STAMP" = "$JOB_DATE" ] && [ "$TIME_STAMP" -ge "$JOB_TIMESTAMP" ]
then
if /sbcimp/run/tp/CA/Autosys/v4.0/bin/autorep -J $JOBNAME | grep $JOBNAME | grep -w SU > /dev/null 2>&1
then

SU_MAILER
COUNTER=$((COUNTER+1))

elif /sbcimp/run/tp/CA/Autosys/v4.0/bin/autorep -J $JOBNAME | grep $JOBNAME | egrep -w "ST|AC|RU" > /dev/null 2>&1
then
break
else

FA_MAILER
COUNTER=$((COUNTER+1))

fi
fi
fi
}
# End of JOB_CHECKER function

# This is basically the main function of the script; it will continue to run until all jobs listed in the JOBLIST.txt are successful

while [ "$COUNTER" -lt "$NO_JOBS" ]
do
for JOBNAME in `cat JOBLIST.txt`
do
JOB_TZ=`/sbcimp/run/tp/CA/Autosys/v4.0/bin/autorep -J $JOBNAME -q | grep timezone | awk '{print $2}' | cut -d/ -f2`

if [ -z "$JOB_TZ" ]
then
/bin/echo "Timezone is NOT defined in $JOBNAME properties. Please check." | /bin/mail -s "Missing TIMEZONE" $EMAIL_ADD
continue
else

JOB_DATE=`/sbcimp/run/tp/CA/Autosys/v4.0/bin/autorep -J $JOBNAME | grep $JOBNAME | awk '{print $2}'`

case $JOB_TZ in
London)
DATE_STAMP=`TZ=GMT date +%m/%d/%Y`
TIME_STAMP=`TZ=GMT date +%X | cut -d: -f1-2 | awk '{sub (/:/,""); print}'`;;

Zurich)
DATE_STAMP=`TZ=GMT-1 date +%m/%d/%Y`
TIME_STAMP=`TZ=GMT-1 date +%X | cut -d: -f1-2 | awk '{sub (/:/,""); print}'`;;

Eastern)
DATE_STAMP=`TZ=GMT+5 date +%m/%d/%Y`
TIME_STAMP=`TZ=GMT+5 date +%X | cut -d: -f1-2 | awk '{sub (/:/,""); print}'`;;

Singapore|HongKong)
DATE_STAMP=`TZ=GMT-8 date +%m/%d/%Y`
TIME_STAMP=`TZ=GMT-8 date +%X | cut -d: -f1-2 | awk '{sub (/:/,""); print}'`;;

Tokyo)
DATE_STAMP=`TZ=GMT-9 date +%m/%d/%Y`
TIME_STAMP=`TZ=GMT-9 date +%X | cut -d: -f1-2 | awk '{sub (/:/,""); print}'`;;

Sydney)
DATE_STAMP=`TZ=GMT-11 date +%m/%d/%Y`
TIME_STAMP=`TZ=GMT-11 date +%X | cut -d: -f1-2 | awk '{sub (/:/,""); print}'`;;

*)
/bin/echo "What else could go wrong? Notify the script owner... Stressful." | /bin/mail -s "FOlSe Alert. Something is missing..." $EMAIL_ADD
exit $E_MISSING;;
esac
fi

JOB_CHECKER

done # End of FOR-loop

# Sleep here for about 5 minutes before the next pass
sleep 300

done
# End of WHILE-loop

exit $E_DONE
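A standalone look at the TZ trick used in the case statement above: POSIX inverts the sign, so TZ=GMT-8 means eight hours AHEAD of GMT (Singapore/Hong Kong), not behind. The epoch doesn't change; only the rendering does. This sketch assumes GNU date for the -d and %-H options.

```shell
#!/bin/sh
now=$(date +%s)                           # one fixed instant, so both calls agree
utc_hour=$(TZ=GMT date -d "@$now" +%-H)
sgt_hour=$(TZ=GMT-8 date -d "@$now" +%-H)
offset=$(( (sgt_hour - utc_hour + 24) % 24 ))
echo "GMT hour=$utc_hour, GMT-8 hour=$sgt_hour, offset=$offset"   # offset=8
```

One caveat: fixed offsets like GMT+5 ignore daylight saving, so the Eastern timestamps will be an hour off in summer.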
