This document provides information and background on the following tools:
- EM Miner
- Control-M Log Summary report
- EM User Authorization Report
- Export all job definitions to Excel
- Migrate (move/copy) tool
For any questions, comments, or suggestions, please send me a note (). As we discussed on the conference call, these are not supported products of BMC or any other company. They are an accumulation of ideas, reports, and routines that were developed for particular reasons and in some cases have been generalized. If you use them, you are responsible for any and all results.
EM MINER
I hope you find the information presented by this tool both enlightening and useful. Its primary goal is to show you how you are defining tasks.
It began with the realization that when I visit sites I don't work with often, the same questions always come up: "how many jobs do you have", "do you use calendars", "how many EM users are there", and so on.
Many times the site staff could only estimate, and I would end up running a DB query to get the actual numbers for both the client and myself.
So this routine is the accumulation of those questions. Many sites now run it periodically to see whether there are any issues in their job definitions that warrant research (like a maxwait value of 99 days on jobs, or discovering that more than 1,400 calendars have been created).
Let me know your thoughts on the routine once you have finished, and if you don't mind sending me a copy of the created spreadsheet, it would give me both additional exposure to how clients define jobs within Control-M and additional test data for the routine itself.
The routine is written in PERL and will create an Excel spreadsheet as output.
So the requirements to run it normally are:
- A PC which has EM installed on it (since PERL is installed with EM in most versions)
- A PC with Excel on it
- A PC that can access the EM DB via an SQL-type command, so a DB client is required (if you have the Control-M Reporting Facility, one was installed with it). If you can't get to the DB, you may need one of the DBA staff at your site to install a simple DB client on your PC (most Control-M shops don't need this, as a client already exists).
That's about it. Here are some preliminary tests to see if you're ready to run it.
The EM Miner routine just runs SQL queries in their simplest form. If you use Sybase or MSSQL for your DB, an isql query like "select count(*) from ....." is executed. If Oracle is your DB, it is the same type of query run through sqlplus.
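As a rough illustration (this is not the actual emminer code, and "def_job" is only my assumption for the name of the EM table that holds job definitions), one of those queries run from PERL against an MSSQL DB might look like this:

    # minimal sketch of an emminer-style count query (MSSQL/osql flavor)
    # "def_job" is an assumed table name; emminer's real queries may differ
    my $db   = 'HOU-TCANNON-71';
    my $user = 'emuser';
    my $pass = 'empass';
    my $sql  = 'select count(*) from def_job';
    my @rows = `osql -U$user -P$pass -S"$db" -Q "$sql" -h-1`;
    print "job definition count: @rows";

The Sybase and Oracle variants just swap in isql or sqlplus with their own argument styles.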
Things needed to run it.
Note: I tried to document this as cleanly as I could, so it looks like a lot of steps. If you like, we can do a "live meeting" and set it up together, then just leave an icon on your desktop you can click to run it anytime.
1) Save the emminer.pl routine on your PC in the path where your EM has PERL. On my PC that's "C:\EM 6.3.01\bmcperl\bin", but on yours a default install of the EM client probably used "C:\bmc software\EM 6.3.01\bmcperl\bin".
2) Verify that we can access the EM database from your PC.
Start --> RUN --> cmd (this will open a DOS prompt on your machine)
Enter the command isql if you use Sybase for a DB,
sqlplus if you use Oracle for a DB,
or osql if you use MSSQL (you might also try isql for MSSQL).
If all that comes back is a new line with your cursor blinking after a ">", or if you see text about needing a password or ID, then you have a DB client (that's good). If you get something like "file or command isql not found", then you will need to contact one of your DBAs to install a client on this machine (or use another machine to run this report).
3) Now that we know we have a DB client and can run queries against the DB, we need the logon details: ID, password, and the name of the DB.
On my machine (Windows with an MSSQL DB named HOU-TCANNON-71) I use the EM account "emuser", with a password of "empass".
To test the DB connection, from the DOS command prompt enter the relevant command for your DB type:
isql -U<em user account> -P<em user password> -S<db name>
or osql -U<em user account> -P<em user password> -S<db name>
or sqlplus <em user account>/<em user password>@<db name>
On my machine it's osql -Uemuser -Pempass -S"HOU-TCANNON-71"
You should get some flavor of a prompt, like SQL> (sqlplus) or perhaps 1> (isql/osql).
You can exit from SQL by entering "quit" and hitting Enter.
We're ready.
4) From your DOS prompt, change to the directory where we put the emminer.pl routine (for me that was C:\EM 6.3.01\bmcperl\bin):
cd "C:\EM 6.3.01\bmcperl\bin"
but for you it might be the default which would then be
cd "C:\bmc software\EM 6.3.01\bmcperl\bin"
5) Enter perl emminer.pl
Here are the prompts you will see. Note that once you have run the routine, it will remember your answers, and to reuse a value you just hit Enter. If you put a numeric value in the "max licensed tasks" prompt, the spreadsheet will simply show whether you are over that number and by how many jobs. It will accept any value for that prompt.
C:\EM 6.3.01\bmcperl\bin>perl c:\terry-d\code\perl\emminer.pl
emminer.pl the data miner is collecting needed info ...
---> em version number 6.3, 6.2, or 6.1.3 (6.3):
---> em userid (emuser):
---> db server ("hou-tcannon-71"):
---> db type {m for MSSQL, e for MSDE, s for SYBASE, or o for ORACLE} (E):
If you know you are licensed for a maximum number of tasks for Control-M
and you would like this shown on the Excel spreadsheet, the routine will
indicate this number and let you know if your daily usage is over that maximum.
If you are tier licensed or do not know or care about this indicator, just enter
something like "Not Measured" at the prompt
---> max licensed tasks (NOT MEASURED):
Then the screen clears, and as it runs the DB queries it provides some feedback on which query it is on.
The final screen looks like this:
emminer.pl starting Data Mining ...
--> DB access verified
--> Now mining the EM DB for job info
Exploring definitions
query# description
------
1/47 Doclib
2/47 Memlib
3/47 Override Library
4/47 SNMP settings
5/47 Components
6/47 Tables by User Daily
7/47 Owners
8/47 Authors
9/47 Intervals
10/47 Priority
11/47 Time Zones
12/47 Max Waits
13/47 From times
14/47 To times
15/47 Global prefixes
--> identify jobs with Global In and Out Conditions
--> total of 0 global prefixes to process, showing % completion on next line
16/47 Alerts
17/47 Shouts
18/47 Command Line queries
19/47 On statements
20/47 By Weekly Days string
21/47 By Monthly Days string
22/47 By Monthly calendars
23/47 By Weekly calendars
24/47 By Holiday calendars
25/47 Calendars by Data Center
--> identify Highest year defined for each calendar and any duplicate calendars
--> total of 5 calendars to process, will show % of completion on next line
--- 20%--- 40%--- 60%--- 80%---100%
26/47 Application type
27/47 Def cond In Odates
28/47 Def cond Out Odates
29/47 Counting tables by Data Center
30/47 Jobs per table
31/47 Task type
32/47 Group
33/47 Application
34/47 EM users
35/47 Agent (shows all agents defined in job definitions)
--> resolving all agent host and showing its IP address also
--> total of 1 agents to process, will show % of completion on next line
---100%---200%
36/47 Historical Job Counts for EM (will take a moment)
37/47 Prereq conds
39/47 Total jobs
40/47 Cyclic
41/47 Jobs with Autoedits
42/47 Critical
43/47 Confirm required
44/47 Multi Agent
45/47 Active From & Until
46/47 Pre or Post commands
47/47 Retro
--> Enter your company name or abbreviation (with no spaces): terry
--> Find the EM report in the Excel file c:\temp\terry.emminer.rpt.2008.01.16.11.58.xls
Final notes: the "company name" you enter is used as the first part of the file name for the Excel spreadsheet. The routine will try to pop Excel open when it's finished (if it isn't already open). Sometimes this works and your spreadsheet appears; often it will just blink for a second and not appear, in which case you open it manually like any other spreadsheet.
I am sending the one I just created on my machine as an attachment. Tabs for the various queries run across the bottom of the spreadsheet. The third tab, labeled "Jobs in EM AJF", shows how many jobs were in your Control-Ms over the past few days. How many days is actually a parameter, which I show on the "SNMP" tab (near the right end of all the tabs) as "maxolddays". You could adjust this if you wanted more EM history kept.
Again, please call if you would like to run through this together. And send me a copy of the spreadsheet if you would like to do a quick conference call to discuss all of the tabs, what they mean, and how your usage relates to or differs from other sites.
Control-M Log Analyzer (summary)
This routine looks at all of the Control-M Log information and produces a clean report of all activity by day.
It uses the output from the "ctmlog" command as input, so you must have a log for it to process.
You can get one easily enough by going to a machine where the Control-M Server is installed and typing:
ctmlog list "*" > myctmlog.txt
That will dump the contents of the Control-M Log into your text file “myctmlog.txt”.
Now feed this log into the routine (you could either copy the routine to the machine where you just created the text file, or move your text file to a machine where you will run the analysis).
Just as with the EM Miner, this routine needs PERL installed on the machine where you will be running it.
When you're ready, from a DOS/command-line window, enter:
ctmlog.summary.pl -v 6.3 -log myctmlog.txt -out myrpt.txt
You can give the version (-v) as 6.2 or 6.1.3 also.
And you will see the report produced on your screen (as well as in the -out file).
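To give you an idea of what the routine does under the covers, here is a much-simplified sketch (this is not the real ctmlog.summary.pl; the phrases it matches and the file name are just taken from the examples in this document):

    # toy sketch: tally a few job events from a ctmlog dump
    # the real routine tracks many more message ids and breaks everything out by day
    open my $log, '<', 'myctmlog.txt' or die "cannot open log: $!";
    my %count;
    while (my $line = <$log>) {
        $count{submitted}++   if $line =~ /SUBMITTED TO/;
        $count{ended_ok}++    if $line =~ /ENDED OK/;
        $count{ended_notok}++ if $line =~ /ENDED NOTOK/;
    }
    close $log;
    printf "Jobs submitted -- %d\n", $count{submitted}   || 0;
    printf "Ended OK       -- %d\n", $count{ended_ok}    || 0;
    printf "Ended NotOK    -- %d\n", $count{ended_notok} || 0;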
For each day contained in the log, it will produce a page that looks something like this:
ctmlog.summary report started at 08/08/2008 10:39:02
*starting with ctmlog beginning of Aug 06 at 10:16
0806-1016 NEW DAY PROCEDURE STARTED
0806-1016 CONTROL-M LOG CLEANUP STARTED
0806-1016 CONTROL-M LOG CLEANUP ENDED. DAYS=2
0806-1017 ACTIVE JOBS FILE CLEANUP ENDED
stats: Jobs submitted -- 54
Ended OK -- 55
Ended NotOK --
Unique tasks -- 53
Cyclic runs -- 0
User interactions --
created --
held --
free --
Rerun --
Forced ok --
Killed --
Deleted --
Undeleted --
# of occurrences   msg id   example messages (actually the 1st occurrence of each msgid text)
------
1 8 SHOUT TO ECS FAILED The user name could not be found.
1 5006 CONTROL-M LOG CLEANUP ENDED. DAYS=2
1 5011 CONTROL-M LOG CLEANUP STARTED
1 5012 CONTROL-M JOBINF CLEANUP STARTED
1 5013 CONTROL-M JOBINF CLEANUP ENDED. DAYS=1
1 5018 CONTROL-M AGENT HOU-TCANNON-71 CLEANED SUCCESSFULLY. DAYS=2
1 5019 STATISTICS INFORMATION CLEANUP STARTED.
2 5030 ACTIVE JOBS FILE DOWNLOAD TO CONTROL-M/EM STARTED
1 5034 STATISTICS INFORMATION CLEANUP ENDED. MODE=0. LEAVE=20
1 5040 NEW DAY PROCEDURE STARTED
1 5041 ACTIVE JOBS FILE CLEANUP ENDED
1 5061 USER DAILY SYSTEM STARTED
1 5062 USER DAILY SYSTEM ENDED
70 5065 ORDERED JOB:641; DAILY SYSTEM, ODATE 20080806
53 5100 ENDED AT 20080806101726. OSCOMPSTAT 0. RUNCNT 1
54 5101 STARTED AT 20080806101715 ON HOU-TCANNON-71
54 5105 SUBMITTED TO HOU-TCANNON-71
227 5120
21 JOBSTATE CHANGED TO Analyzed
1 JOB STATE CHANGED TO Post processed
171 JOBSTATE CHANGED TO Executing
55 5133 ENDED OK
2 5164 DUMMY. STATUS CHANGED TO OK
3 5208 QUANTITATIVE RESOURCE payroll_server QUANTITY 1 ALLOCATED
53 5209 CONDITION maint-ENDED 0806 ADDED
42 5210 CONDITION ag_sysout_clean-ENDED 0801 DELETED
11 5212 JOB Start deposit report, TABLE bimdemoTbl FORCED
3 5214 QUANTITATIVE RESOURCES RELEASED
59 5216 REMOVED FROM ACTIVE JOB FILE BY GENERAL DAILY
21 5524 START OF TRACKING ALL ACTIVE JOBS
1 5525 CTMJSA STARTED. MODE=JOBNAME
Statistics for the complete jobs (submit & end)
Percent Job Duration Total
------
0 1 second or less --> 0
0 between 1 & 5 secs --> 0
0 between 5 & 10 secs --> 9
0 between 10 & 20 --> 33
0 between 20 & 30 --> 7
0 between 30 & 60 --> 0
0 between 1 & 5 min --> 4
0 between 5 & 10 min --> 0
0 between 10 & 60 min --> 0
0 more than 1 hr --> 0
EM User Authorization Report
The simple goal of this routine is to produce an Excel spreadsheet (or a comma-separated file if you prefer) that shows all of the authorizations for a user on one line.
Same requirements as the EM Miner (PERL, DB access, …)
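Conceptually, all the routine does is pivot the user/authorization rows it pulls from the DB so that each user ends up on a single line. Here is a toy sketch of that idea (the sample data below is made up for illustration and is not the real EM schema):

    # toy sketch: collapse (user, authorization) pairs onto one line per user
    # @rows stands in for what the real DB query would return
    my @rows = (
        [ 'emuser',    'Full Control on CALENDARS' ],
        [ 'emuser',    'Browse on ALERTS'          ],
        [ 'operator1', 'Update on ACTIVE JOBS'     ],
    );
    my %by_user;
    push @{ $by_user{ $_->[0] } }, $_->[1] for @rows;
    for my $user (sort keys %by_user) {
        print join(',', $user, @{ $by_user{$user} }), "\n";   # one CSV line per user
    }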
Here is a screen shot of it having been run on my PC:
C:\program files\bmc software\Control-M EM\bin>perl authrpt.pl
authrpt.pl starting ...
---> em userid (emuser):
---> em user password (empass):
---> EM DB server or instance name, in double quotes if the name has dashes ("tcannon71"):
---> db type {m for MSSQL, e for MSDE, s for SYBASE, o for ORACLE, p for Postgres} (e):
---> Excel spreadsheet or comma separated file for the results (E for Excel, C for comma separated) (E):
Percentage done
************100%
Find your Excel file at c:\temp\auth.report.2008.08.8.10.46.csv.2008.08.8.10.46.xls 10:46:37
Export job definitions to Spreadsheet (EXPRALL.PL)
This routine is slightly different from the others. It does its reporting against an "export" of the EM DB information if you have one available (many people take a daily export using the EM utility "util"). But if you don't, not to worry; it can take one itself (given the EM ID and password).
One other note on this routine: I have noticed that when I run it and give it bad data, I get bad data out. It's not subtly bad; the output looks bizarre and the routine tells you it is having a problem. To show you what I mean, here is the bad output:
C:> exprall.pl
exprall.pl starting ...
---> Do you have an export file you want to run against? (n): y
---> What EM Export file is to be used(texpsmall):testexportfile
--> Starting Excel 10:53:11
--> Accessing export file
-- 54 jobs
-- 8 Data Centers
-- xxx In Conditions
-- xxx Quantitative Resources
-- xxx Out Conditions
-- xxx Set Variables
-- xxx Shouts
-- xxx On Statements
-- xxx DO Statements
-- xxx DO Mails
Error: Noted that at least one of the export record type totals does not match the program's tallies. Printing both and exiting
NR_DEF_TABLES :def_tables_index xxx. :4.
NR_JOB :job_index 3. :79.
NR_LNKI_C :lnki_c_index 0. :0.
NR_LNKI_P :lnki_p_index 0. :14.
NR_LNKI_Q :lnki_q_index 0. :16.
NR_LNKO_P :lnko_p_index 0. :13.
NR_SETVAR :setvar_index 0. :15.
NR_SHOUT :shout_index 0. :1.
NR_ON :on_index 0. :12.
NR_DO :do_index 0. :8.
NR_DOFORCEJ :do_forcej_index 0. :0.
NR_DOCOND :do_cond_index 0. :0.
NR_DOSETVAR :do_setvar_index 0. :0.
NR_DOSHOUT :do_shout_index 0. :0.
NR_DOSYSOUT :do_sysout_index 0. :0.
NR_DOIFRERUN :do_ifrerun_index 0. :0.
NR_STEP_RANGE :do_step_range_index 0. :0.
NR_DOMAIL :do_mail_index 0. :6.
NR_DOREMEDY :do_remedy_index 0. :0.
NR_DOCTBRULE :do_ctbrule_index 0. :0.
NR_CTB_STEP :ctb_step_index 0. :0.
NR_PIPES :do_pipes_index 0. :0.
NR_DEF_TAGS :do_def_tags_index xxx. :0.
NR_SHOUT :do_shout_index 0. :1.
NR_DEF_JOB_TAGS :do_def_job_tags_index xxx. :0.
Now creating Excel worksheet 10:53:13
--- 0--- 66---100---133
--> Find your Excel file at c:\temp\exprall.2008.08.8.10.53.xls 10:53:14
The reason for the "bad" results is that the file I fed it was not a valid export file. I just wanted to show you that in case you run into the issue; that is why I added the section that tests the validity and prints out those messages.
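The check itself is nothing fancy: conceptually it just compares the record totals the export file claims (the NR_* values) against what the routine counted while reading the file. A rough sketch of the idea (the hash names and sample numbers below are mine, borrowed from the bad run above, and which of the two numbers is "claimed" is my guess, not the real code):

    # rough sketch of the validity test: claimed totals vs. what we tallied
    my %claimed = ( NR_JOB => 3,  NR_LNKI_P => 0  );   # what the export file says
    my %tallied = ( NR_JOB => 79, NR_LNKI_P => 14 );   # what we counted while reading
    my $mismatch = 0;
    for my $type (sort keys %claimed) {
        my $got = $tallied{$type} || 0;
        next if $claimed{$type} == $got;
        $mismatch++;
        printf "%-12s claimed %d, tallied %d\n", $type, $claimed{$type}, $got;
    }
    die "export record totals do not match tallies, exiting\n" if $mismatch;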
With a valid export file, the output should resemble this:
C:\terry-d\code\perl\fareport>exprall.pl
exprall.pl starting ...
---> Do you have an export file you want to run against? (n): y
---> What EM Export file is to be used(testexportfile):texp1
--> Starting Excel 11:00:07
--> Accessing export file
-- 4 Data Centers
-- 8377 jobs ......
-- 1129 Control Resources .
-- 3291 In Conditions ...
-- 11307 Quantitative Resources ......
-- 8370 Out Conditions ......
-- 5457 Set Variables .....
-- 7825 Shouts ......
-- 15783 On Statements ......
-- 7407 DO Statements ......
-- 513 DO Force Jobs
-- 1 DO Shouts
-- 15587 DO SYSOUTs ......
Now creating Excel worksheet 11:02:32
--- 10 --20 --30 --40 --50 --60 --70 --80 --90 --100%
You can find your spreadsheet in …….
MIGJOB (migrate/copy)
This routine was targeted at a specific client need. It automates the process of pulling job definitions from an EM, applying any requested changes (via a text file that indicates the type of changes to make, like changing OWNER=Terry to OWNER=Frank…), and then pushing the jobs back into the same or a different EM. Thus it is "migrating" data.
Same requirements as the above tools.
If you want job definitions changed when you're doing the move, you need to supply a file with those changes. Here is a sample file called newchanges.txt that I used:
C:\temp>type newchanges.txt
#this is a comment line because it starts with a #
#
#general syntax for request of an xml change is
# <xml field>|<old value>|<new value>
#with each change request starting in the first column
#
TABLE_NAME|aft|newtablename
NODEID|HOU-TCANNON-71|bravoo
APPLICATION|aft|newappl
AUTHOR|terry|stoneman
#have a blank or comment line at the bottom.
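For the curious, here is a minimal sketch (not the real migjob code) of how a change file in that pipe-delimited format could be read and applied to job definitions held as XML text. The file names are placeholders, and it assumes the fields appear as FIELD="value" attributes in the exported XML:

    # minimal sketch: apply <field>|<old>|<new> changes to an exported XML text file
    open my $chg, '<', 'newchanges.txt' or die "cannot open changes file: $!";
    my @rules;
    while (my $line = <$chg>) {
        chomp $line;
        next if $line =~ /^\s*(#|$)/;                 # skip comments and blank lines
        push @rules, [ split /\|/, $line, 3 ];        # field, old value, new value
    }
    close $chg;

    local $/;                                         # slurp the whole definition file
    open my $xml, '<', 'exported_jobs.xml' or die "cannot open xml: $!";
    my $text = <$xml>;
    close $xml;

    # change only the named attribute, e.g. NODEID="HOU-TCANNON-71" -> NODEID="bravoo"
    for my $r (@rules) {
        my ($field, $old, $new) = @$r;
        $text =~ s/\Q$field\E="\Q$old\E"/$field="$new"/g;
    }

    open my $out, '>', 'changed_jobs.xml' or die "cannot write output: $!";
    print {$out} $text;
    close $out;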
If you invoke the routine with no parameters, it will echo the syntax as:
C:\terry-d\code\perl>migjob
Syntax Error, You must specify a configuration file (i.e. migjob.pl -configfile="c:\temp\migconfig")
This file is used to hold values you supply for later runs
Syntax: migjob.pl (invokes as interactive)
migjob.pl -configfile=<configfile> {options}
or with a fully qualified path to where perl is installed, like:
c:\program files\bmc software\control-m em\default\bmcperl\perl migjob.pl
c:\program files\bmc software\control-m em\default\bmcperl\perl migjob.pl -silent
Options:
-emuser=<emuser> (the EM user id you're moving jobs into)
-empass=<empass> (the EM user password)
-emhost=<emhost> (the DB name, gui server name, or hostname needed to connect)
-schedtbl=<tblname> (the scheduling table to be moved)
-changes=<changes file> (the file which holds the changes to be made, if any)
-altemhost=<altemhost> (the DB name, gui server, or hostname needed to connect)
-altemuser=<altemuser> (that EM user id)
-altempass=<altempass> (that EM user password)
-d (debugging)
-silent (use input values from the given configfile and run with no prompts)
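Putting the options together, a typical pair of runs might look something like this (the paths and table name are just placeholders based on the examples above); the first run supplies the values, which are saved in the config file, and the second reuses them with no prompts:

    perl migjob.pl -configfile="c:\temp\migconfig" -schedtbl=aft -changes="c:\temp\newchanges.txt"
    perl migjob.pl -configfile="c:\temp\migconfig" -silent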