FullAuto Process Automation – How it Works
The modern world really LOVES “gadgets”. There is a gadget for literally everything – from peeling apples, to changing light bulbs in hard-to-reach places, to removing lint from your sweater. The goal of nearly all these wonders of the modern age is to save the user time and aggravation. A lint shaver, for example, saves the user countless rounds of pulling those annoying and embarrassing fuzz balls off of sweaters. Who can possibly denigrate such a “wonder”?
When is the last time YOU used one?
There are other issues of course that dilute the effectiveness of such a device. The shaver needs to be stored where it can be easily retrieved when it is needed. It requires a battery that has the nasty habit of being dead right when you most need it. It is far from a perfect instrument – it works only with the “low hanging fruit” lint. The really embedded stuff still needs to be dug out by fingernail. And finally, maybe it’s just plain easier to put the lint-filled sweater back on the hanger, and grab the new sweater you purchased last week instead.
“Computer process automation” has been, and continues to be, a lot like the plethora of gadgets that litter every kitchen, bathroom closet and garage from “sea to shining sea”. It sounds GREAT, rivaling the very best late night infomercials. But oftentimes, it’s just easier to put the lint-filled sweater back on the hanger and grab the new sweater instead. I’m reminded of the “in the egg scrambler” introduced in the 70’s. It works great, but really, how hard is it to just break the egg into a bowl and mix it with a fork? Not to mention that the fork and bowl are far easier to keep clean than the wonder device – and no batteries are needed!
This is precisely why, in an age where computers can literally do “anything” – countless tasks continue to be performed manually. When subjected to a cost-benefit analysis, the “gadget” (computer process automation) fails to deliver enough value to justify the time and effort it takes to implement. For countless tasks, “computer process automation” has been and continues to be more trouble and expense than it’s worth.
Yet there are gadgets that do deliver. And there are gadgets that once were mere novelties that are now ubiquitous. In the early 1980’s, a cordless screwdriver was almost useless – expensive, bulky, heavy, low torque, short battery life and long recharge times. Today they are used as much as, if not more than, manual ones. Today, computer process automation is a lot like the early cordless screwdriver. The technology for computer process automation available today is expensive, difficult to implement, fragile, challenging to trouble-shoot and maintain, and on top of that, it’s slow.
But all that is about to CHANGE. Today’s cordless screwdrivers are not just better than their 30-year-old ancestors; they are orders of magnitude better. For computer process automation software to really deliver on the promise of significantly reducing manual activities and associated costs, the innovation has to be orders of magnitude better than what is now available.
After 15 years of intense development, FullAuto Process Automation Software is now ready to deliver on that promise. It is not just “better” than the state of the art currently available; it is orders of magnitude better.
To illustrate that, we’ll use an analogy. Consider for a moment, that there are two HUGE warehouses on opposite sides of a very busy road. The warehouses represent two computers, and the road between them, the network that separates them. Imagine that a shipment is to go from one warehouse to the other. Even though it is “only” across the street (literally), it is still easier to load and use a truck than any other method of conveyance. The truck then has to leave its home facility, checking out at the exit gate; and then crossing the street and entering the other facility, now checking in at their entry gate. Imagine that each load has to be treated independently, inspected, itemized and unloaded, regardless of its relationships to other loads, or the fact that it comes literally from across the street. That is a LOT of overhead, time and expense that somehow seems unnecessary given the close proximity and similarity of the two complexes.
Modern computer systems, interacting with each other programmatically over a network, work a lot like the analogy presented. Security is of course very important, which is why the security exit and entry gates are discussed in the example. One modern and widely used method for sending “instructions” and “data” (truck loads in our analogy) to other computers is a software utility called secure shell, or SSH. SSH is a command line tool that is now available for all computer systems, including the mainframe. Think of SSH as the truck from our example.
The black window on the box of the truck is what’s called a terminal emulator window. Just to illustrate how close to reality we are with our analogy, consider that trucks load and unload cargo at terminals. We also load data into databases. We can continue with the analogies – but you get the point.
A computer process is nothing more (when you break it down) than a series of commands. When an IT employee engages in “work”, what they are really doing in essence is typing in and running commands. Think of commands, and the output from commands, as the packages that are transported by the truck. In fact, at the network level, data is called “packets”. (But I digress). Cost savings come into play, when you can get the computers themselves to run more commands, and IT employees fewer commands. That is computer process automation defined as simply as it can be.
Unfortunately, we are all familiar with a rather sinister form of “computer process automation” that costs billions of dollars a year to protect ourselves from – the computer virus. Anti-virus software is a countervailing form of “computer process automation” aimed at defeating the computer virus. So computer process automation is all around us every day and has been for quite some time. So why are manual computer processes still such a prevalent practice in nearly all IT organizations the world over?
The answer is rather simple: computer process automation ... is hard. Writing a good computer virus is hard. Writing good anti-virus software is just as hard, if not harder. Considering there are now billions of computer users worldwide, the number of skilled virus writers is a tiny, TINY minority. The number of developers working on anti-virus software is also small. The people engaged in these activities are some of the most skilled programmers on the planet. Very unfortunate for all of us of course, and a tremendous waste of resources and talent – but it is what it is.
The average IT employee is orders of magnitude less skilled than virus and anti-virus programmers. Kind of like the difference between doctors and nurses. Again, unfortunate – but it is what it is. So if computer process automation is to become (by an “order of magnitude”, of course) more widespread than it is today, then the software available to do it must be easy enough for the average IT employee to master.
FullAuto software was created to precisely address that need. But before we look more closely at FullAuto itself, it would be helpful to explore a couple of reasons why computer process automation is so inherently difficult. The first one (sticking to a theme) is naturally those wicked “virus writers”. Computer process automation is fundamentally hampered by the need for security. Protections put in place to protect computers from nefarious “automation” also, unfortunately, work against the successful implementation of beneficial automation. Walls impede both sinners and saints. Barbed wire will entangle both the good and the bad.
In the late 90s, Secure Shell (or SSH) arose out of the need for a secure means of successfully connecting to, and interacting with a remote computer. With SSH, one can perform just about any activity on a remote system that one can perform on their local computer.
Which leads us to the second reason computer process automation is hard. Common sense would have us ask this: if SSH is secure and with it we can perform just about any activity on a remote system – then why not just automate the running of commands over SSH? Problem solved? … Right?
There is just one small issue with finding the means of accomplishing that. Currently, no one has found a way to run multiple serial commands, via a program, over a persistent SSH connection.
Huh? Care to explain that in English?
Let’s return to our truck and warehouse analogy. The truck starts out empty. Packages are loaded into the truck at the base warehouse, the truck passes through two security check points, is unloaded at the destination warehouse, loaded again at the destination warehouse (program output), and then passes through two security check points again before it returns to the base warehouse, where it is unloaded and returned to empty status. Now imagine that the truck, instead of carrying multiple packages loaded separately, instead carries only one big pallet – for that is actually closer to the way computer communication over SSH takes place. The pallet can contain one item or multiple items. But it is still just one pallet. If the individual boxes were computer commands, the “shrink wrap” that binds them together would be semi-colons:
ls;hostname;id;uname
The problem with “packaging” commands with semi-colons is that all the output returned from all those commands will be sent back to the “base” computer all appended together – with no clear separation. So even though with this method multiple commands can be sent and run via SSH, this is rarely done because of the problem of sorting out and properly parsing the output from those commands.
Note how the output from all four commands appears in the terminal window completely lacking any kind of consistent and identifiable separator. This is the core problem encountered when attempting to run multiple commands over a single SSH connection. In actual practice, a new SSH connection is made for each single command – just to avoid this problem. So instead of a pallet, imagine the truck discussed above making a completely new trip to the destination warehouse for every package!!!
Imagine the truck has to go through security four times, just to deliver that single package and return to base. Not very efficient, is it? Now imagine that a process you wish to automate over SSH contains a hundred commands or more. This means a hundred new “trips” and four hundred additional encounters with security. Are you beginning to see why automation that spans computers is hard?
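To make the problem (and one well-known partial workaround) concrete, here is a minimal sketch in Python. It is not FullAuto’s method – merely an illustration of the “sentinel” trick automators often resort to: wedge a unique marker between the commands, then split the combined output on that marker. The helper name `run_commands` and the marker format are invented for this example, and a local `sh -c` stands in for the SSH connection so the sketch runs without a remote host:

```python
import subprocess
import uuid

def run_commands(commands):
    """Join commands with an echoed sentinel, run them in one shell
    invocation, and split the combined output back apart.
    Over SSH this would be: ssh host 'cmd1; echo MARK; cmd2; ...'"""
    marker = "MARK-" + uuid.uuid4().hex      # unlikely to appear in real output
    joined = f"; echo {marker}; ".join(commands)
    result = subprocess.run(["sh", "-c", joined],
                            capture_output=True, text=True)
    # The output comes back as one blob; split it on the sentinel lines.
    parts = result.stdout.split(marker + "\n")
    return [p.rstrip("\n") for p in parts]

outputs = run_commands(["echo first", "echo second", "echo third"])
print(outputs)   # ['first', 'second', 'third'] – one entry per command
```

Even this trick is brittle: if any command’s own output happens to contain the marker, or a command hangs, the parsing silently breaks – one reason per-command SSH connections remained the norm.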
Programs attempting to use SSH for automation often use an operating system component called a “pseudo-terminal”, or PTY. Pseudo-terminals have the advantage of enabling a programmer to send individual commands over a persistent connection, and retrieve output before sending another command. The problem, though, is precisely the same issue described above – there is no reliable way to accurately separate the output from multiple commands. Many computer scientists have considered this an impossible problem to solve. Consider the opinion of Randal L. Schwartz, expert programmer and author of numerous books and manuals, on the perlmonks.org collaboration site:
I don't believe there's any way that a program that sits on a pty[1] can ever distinguish where the data is coming from. This is the blessing and curse of Unix I/O.
So, no, I think you're just gonna have to consider the various things to wait for, some of which will be error-like messages coming at unexpected times.
– Randal L. Schwartz
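To see what Schwartz means, consider a simulated capture of what a program sitting on a pty actually receives: one undifferentiated byte stream, mixing the echoed commands, their output, and the shell prompts. The stream below and the `$ ` prompt are assumptions for illustration; the only way to split the stream is to pattern-match on landmarks like the prompt, which is exactly the unreliable guessing Schwartz describes:

```python
# A pty delivers one undifferentiated stream: the command echo, its
# output, and the next prompt all arrive mixed together.  This is a
# simulated capture (the prompt "$ " is an assumption for illustration):
stream = (
    "$ hostname\r\n"
    "web01\r\n"
    "$ id\r\n"
    "uid=1000(brian)\r\n"
    "$ "
)

def split_by_prompt(stream, prompt="$ "):
    """Recover per-command output by scanning for the prompt string --
    the only landmark the stream offers.  Fragile: any command whose
    output happens to contain the prompt will confuse the parser."""
    results = []
    for chunk in stream.split(prompt)[1:]:       # drop text before first prompt
        lines = chunk.replace("\r\n", "\n").split("\n")
        if len(lines) > 1:                       # first line is the echoed command
            results.append((lines[0], "\n".join(lines[1:]).strip()))
    return results

print(split_by_prompt(stream))
# [('hostname', 'web01'), ('id', 'uid=1000(brian)')]
```

The parser works on this tidy example, but a command whose output contains `$ `, a customized prompt, or an unexpected error message defeats it – hence the expert consensus that the problem was unsolvable in the general case.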
This problem is so vexing that process automation innovators have developed some very expensive work-arounds – expensive both in terms of money and the effort required. The most notorious is IBM’s Tivoli ESM (Enterprise Systems Management) software. Tivoli pre-existed the advent of SSH, and so far appears not threatened by it. It uses what is known as a client-server architecture: a Tivoli program (or agent) is literally installed on every computer needed by a process. Each installation is a license expense. Tivoli is challenging to install, maintain, program and successfully utilize. IT departments the world over have whole teams of personnel whose entire positions consist of just managing Tivoli and its automated jobs in the organization. The following link provides a good historical overview of IBM’s Tivoli:
Another approach to process automation across multiple computers is the “process scheduler”. BMC Software’s flagship product is CONTROL-M Enterprise Manager. Like IBM’s Tivoli, Control-M uses a client-server architecture. BMC Software grosses nearly two billion dollars a year and employs nearly 5,000 consultants, who work principally on supporting Control-M. Control-M’s approach to process automation is simply that of a “terminal agent”. It schedules individual “jobs” (or trucks, if sticking with our analogy) to execute at precise intervals. It is truly nothing more than a glorified “dispatcher” – a truly robust and full-featured dispatcher to be sure, but still, at the end of the day, a mere “dispatcher”. The “trucks” it dispatches suffer all the problems and limitations discussed above.
Finally, to complete our survey of competing automation technology approaches, we will examine a more recent entry into the field – a product called “Chef”. Chef differs from Tivoli and Control-M in its efforts to cope with the “timing” and “roll-back” problems. Again, a process is nothing more than a series of commands. But of course, it is never really that simple. There are frequent cases where a command cannot execute successfully unless a previous command ran successfully. An example of this is a command that reads a file: if that file doesn’t exist, the command will fail. Suppose the file it is to read is dynamically created by a different process run earlier in time. How is the new command supposed to be sure that the file was successfully created by the earlier process?

In our analogy, how does the truck driver KNOW that an earlier delivery that his load depends on was successfully delivered? In the real world, the truck driver would use a radio. The “Chef” software implements a kind of radio check by using a database. Output from commands is written to a database, which is accessible by all “truck drivers” (or “command wrappers” in computing parlance) – so that they can discover the status of earlier “dependent” loads. If they find out that the earlier load was not successfully or completely delivered on time, the driver can delay his trip, or cancel it until the situation with the earlier load is resolved.

With roll-back, imagine that three loads were delivered before the fourth delivery failed (the fourth truck had an accident and the load was destroyed). Not only does driver five now cancel his trip, but the three earlier loads have to be picked up from the destination warehouse and returned to the base warehouse. That, in a nutshell, is “roll-back”, and it is a common need in large organizations.
Even when not deemed an absolute necessity, roll-back capability is still highly desired, as it increases reliability and saves IT organizations time, effort and money. But … it’s hard – which is why it is used mainly in mission-critical processes where cost is not an issue.
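The dependency-check (“radio check”) and roll-back ideas can be sketched in a few lines of Python. This is a toy illustration, not Chef’s actual implementation: the in-memory `status` dictionary stands in for the shared database, and the step names and `RollbackError` class are invented for the example:

```python
# A toy sketch of the "radio check" and roll-back ideas described above.
class RollbackError(Exception):
    pass

def run_pipeline(steps):
    """Run steps in order; each step is (name, action, undo).  On failure,
    undo every completed step in reverse order -- that is roll-back."""
    status = {}                      # shared "database" of delivery status
    done = []
    for name, action, undo in steps:
        try:
            action(status)
            status[name] = "ok"      # later steps can "radio check" this
            done.append((name, undo))
        except Exception:
            status[name] = "failed"
            for prev, prev_undo in reversed(done):
                prev_undo(status)    # return the earlier "loads" to base
                status[prev] = "rolled-back"
            raise RollbackError(f"step {name!r} failed; rolled back") from None
    return status

log = []
steps = [
    ("create_file", lambda s: log.append("created"), lambda s: log.append("deleted")),
    ("load_data",   lambda s: log.append("loaded"),  lambda s: log.append("unloaded")),
    ("bad_step",    lambda s: 1 / 0,                 lambda s: None),
]
try:
    run_pipeline(steps)
except RollbackError as e:
    print(e, log)   # undo ran in reverse: created, loaded, unloaded, deleted
```

Each step records its status where later steps can see it; on failure, every completed step’s undo action runs in reverse order, returning the earlier “loads” to the base warehouse.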
FullAuto changes ALL this. ALL of the problems listed above stem principally from a failure to establish a persistent and secure connection between two computers (warehouses in our analogy), over which multiple commands can run and return output in a reliable fashion. Rather than trucks, what is actually desired is a connecting catwalk between the two warehouses:
With such a structure, passing packages back and forth between the two warehouses becomes as easy as moving packages within one warehouse. It is not a fallacy to assert that with such a catwalk, the two warehouses essentially become ONE BIG WAREHOUSE. By logical extension, two computers connected with a persistent SSH connection essentially become ONE BIG COMPUTER.
FullAuto is the world’s FIRST and currently ONLY successful implementation of “computer process automation” software utilizing persistent SSH connections over which multiple commands can be executed in a reliable fashion – without reliance on remote host configuration. But hold on! – wasn’t it stated earlier that experts considered this an impossible problem to solve?
Indeed it was so assumed – but it turns out, after 15 years of intense development, software author Brian Kelly has achieved the impossible. It turns out the “experts” were WRONG, and FullAuto proves it! FullAuto is not a “theory”; it is not an “idea on a napkin”. It is fully developed, fully tested, world class software ready to use TODAY. Recall that earlier it was said that:
For computer process automation software to really deliver on the promise of significantly reducing manual activities and associated costs, the innovation has to be orders of magnitude better than what is now available.
FullAuto is indeed orders of magnitude better than any other computer process automation software currently available anywhere. FullAuto does not require the skills of a virus author, or anti-virus developer. FullAuto can be successfully and efficiently used by IT administration technicians with average script and batch file writing skills. This is important because scripts and batch files are used by every IT organization everywhere. Every IT organization employs admin technicians – from one to thousands in number. Anyone who has the skills to write a basic script or batch file is skilled enough to write instruction sets for FullAuto. FullAuto was created with one of the world’s oldest and most respected scripting languages – Perl. In fact, it can be said that FullAuto is really a Perl extension that enables the Perl scripting language to be multi-computer capable.