Enterprise-Wide IT Analysis and Automation Should Be a Reality

In late 2014, I made the decision to expand the markets that I cover to include IT Automation. Since making that decision, I have spent many late nights studying and ranking the technology from the various IT Automation vendors, talking to the CEOs who support and understand this market, and traveling to the sites of many of the Global 1000 to talk to the users and see their “issues” first hand. I have come to the conclusion that enterprise-wide IT Operations Analytics (ITOA) and the associated and integrated IT Automation should be a reality.

I must be on to something, as Gartner estimated that worldwide spending in this market subsector would surpass $800 million in 2013, a $500 million increase from the $300 million spent in 2012, and expected this more-than-100% growth rate to continue through 2014. As a result of these spending trends, Gartner assigned Will Cappelli, a well-known Gartner research VP, to cover the ITOA market. And most of the other global IT analyst firms, such as Forrester and IDC, are also covering the evolution of the IT Automation market in some way, shape or form.

Cappelli explains that “The volume and variety of data required to monitor and manage complex IT systems has grown exponentially, which increases the need for new technologies to ingest, store and analyze the data. In fact, most IT operations teams are overwhelmed by the volume, velocity and variety of change and configuration data. And most of them lack insight or actionable information, making change and configuration problems a chronic challenge for IT departments.”

Cappelli goes on to explain that “The primary goal of ITOA platforms is to deliver inferences about IT operations data to its users — an inference being an explicit addition to the information present in the data itself, generated by deductive and inductive processes applied to the data.”

Traditional IT management tools were not designed to deal with the complexity and dynamics of the modern data center. They were not built to collect data down to granular details, analyze every change and consolidate the results into something actionable. Without systems to manage and organize this growth, IT will drown in its own data. By taking a different perspective on the abundant data and complexity confronting operations teams, IT operations analytics tools use mathematical algorithms and other innovations to extract meaningful information from the sea of raw change and configuration data.
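To make Cappelli's notion of an “inference” concrete: it is a conclusion the raw data never states directly. As a purely illustrative sketch (not any vendor's implementation — the event stream, hostnames and threshold are all assumptions), an ITOA-style inference could be as simple as flagging hosts whose change frequency is a statistical outlier:

```python
# Illustrative only: derive an "inference" -- an explicit conclusion not
# present in the raw data -- from a stream of change/configuration events.
from collections import Counter
from statistics import mean, pstdev

# Hypothetical raw event stream: (host, change_type) records.
events = [
    ("web-01", "config"), ("web-01", "patch"), ("web-02", "config"),
    ("db-01", "config"), ("db-01", "config"), ("db-01", "patch"),
    ("db-01", "config"), ("db-01", "patch"), ("db-01", "config"),
]

def infer_outlier_hosts(events, threshold=1.0):
    """Flag hosts whose change count sits more than `threshold` standard
    deviations above the mean. No single event says "db-01 is unstable";
    the conclusion is generated by a process applied to the data."""
    counts = Counter(host for host, _ in events)
    avg, sd = mean(counts.values()), pstdev(counts.values())
    return [h for h, c in counts.items() if sd and c > avg + threshold * sd]

print(infer_outlier_hosts(events))  # db-01 dominates the change volume
```

Real ITOA platforms apply far richer deductive and inductive models, but the shape is the same: raw operational data in, explicit conclusions out.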

A More Fundamental Explanation of ITOA

In an article entitled “Give in to data centre automation and change your life,” written by Aaron Milne and published on February 24, 2015, Mr. Milne states, “To put it simply, it is about automating every process involved in the day-to-day running of your data centre so teams can shift their focus from a reactive maintenance cycle to working on the projects that deliver greater business value.”

The full text of Mr. Milne’s article is as follows:

As an IT professional, unless you’ve been living under a rock you are probably familiar with automation, even if only in passing.

Automation processes have been in use in the business world for many years, but somewhat paradoxically IT is usually the least automated department in any organisation.

Whole data centre automation and orchestration has been one of those topics that divide IT teams. Some speak of automation in the hushed tones reserved for discussing the holiest of holies.

Others find just speaking its name brings the sound of wailing and gnashing of teeth from the IT department.

Whole data centre orchestration has been pushed by vendors as the one true goal of any IT department in charge of web-scale infrastructure. In truth, orchestration is often poorly understood before, during and after any attempt at implementation. It needn’t be this way.

Talking cure

The traditional IT department model of the past 30-odd years has been silos. Systems manage the server infrastructure and try their level best not to talk to storage. Storage manage everything storage and often try to classify anything with more than 1GB of internal storage as part of their domain.

Networking administrators are seen as little more than semi-intelligent cable monkeys. More often than not the job is done these days by a pimply-faced youth who has yet to realise that systems is a better place to be.

I won’t talk about the Exchange and unified comms teams. Those guys and gals are a breed apart.

Communication between these teams is often minimal at best, with each jealously guarding its own demarcated little fiefdom. In a modern world, as budgets and head counts shrink, responsibilities are being shouldered by staff already exceeding the 80 per cent threshold for a reasonable break-fix workload.

Whole data centre orchestration is about maximising operational predictability, security and maintainability through inter-team communication and IT process automation.

To put it simply, it is about automating every process involved in the day-to-day running of your data centre so teams can shift their focus from a reactive maintenance cycle to working on the projects that deliver greater business value.

If you are liking the sound of automation so far, then it is time to take a look at the tools available on the market to help us achieve these goals.

Knowing where to start when picking an automation solution (or combination of solutions) can be difficult unless you are familiar with DevOps.

El Reg has decided to make things easier by providing a brief overview of three of the major automation and orchestration software packages.

Puppet

The first is probably the most popular of all the available offerings. Puppet has its roots in the open-source DevOps movement and has become a market leader in data centre orchestration and infrastructure management.

Whether you are managing just a few servers or a fleet of thousands, Puppet has the ability to scale. It offers a range of features that can automate every step of the software delivery process, including the provisioning of physical or virtual infrastructure, as well as test orchestration and reporting.

Focusing on reliability and stability, Puppet’s features go beyond simple configuration management. Its greatest strength, though, is its training resources.

One of the biggest barriers to entry for those looking at automation and orchestration products is the often steep learning curve associated with creating runbooks. Puppet acknowledges this is a big issue and provides the training resources needed to tackle it.

Orchestrator

No list of automation solutions would be complete without Microsoft’s System Center 2012 Orchestrator. It is billed as a complete workflow management solution for the whole data centre, but it is with Windows Server infrastructure that Orchestrator really shines.

Chances are that if you are managing Windows Server infrastructure above a certain scale you are familiar with other parts of the System Center suite. Like everything else, Orchestrator is reliant on other parts of the suite to deliver the most benefits.

Unless your budget is extremely constrained, this solution is well worth considering.

Chef

Chef’s offering differs fundamentally from Orchestrator and Puppet. Built with hybrid and hosted environments in mind, it has been embraced in a big way by those running wholly or partially hosted AWS clouds.

With an emphasis on cloud-centric features such as high availability, as well as zone-based replication and failover, Chef has put a lot of effort into catering for those running critical infrastructure in a cloud environment.

The company also offers privately hosted Chef for a premium subscription.

Follow the script

As the examples above show, whole data centre and hybrid cloud automation comes in a number of flavours and there are an entire ocean’s worth of resources available online that can further assist in deciding which tools are right for you.

Those tools are great, but what if you are not running a data centre or web-scale infrastructure? Can you still implement an automation and orchestration solution?

Are there some easy things you can do right now to show management that automation is not just for enterprises?

As an IT mercenary working with SMEs, it is not unusual for me to be supporting 15 to 20 servers and 300 to 500 endpoints spread across as many campuses. Like any IT professional supporting servers I handle many menial repetitive tasks which are easily orchestrated.

For example I hate it when end-users leave their workstations on over the weekend. Not only is it a waste of electricity (even with today’s reasonably green workstations), but it can lead to complications when you need to perform server maintenance.

Automating workstation shutdown via scripting is a simple yet effective answer and there are a number of ways it can be achieved.

  • You can run a batch script to execute shutdown via command prompt as a local scheduled task.
  • You can run a PowerShell script to execute shutdown as a local scheduled task.
  • You can run a PowerShell script locally on a server and have it execute remotely on the endpoints of your choice.
  • You can use IT process (runbook) automation or orchestration software.
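As a hedged sketch of the third option above (a script run on a server that shuts down remote endpoints), here is the idea expressed in Python rather than PowerShell; the hostnames and grace period are illustrative assumptions, and it relies on the standard Windows `shutdown` utility:

```python
# Sketch: build the remote-shutdown command for each endpoint, then run it.
# Hostnames and the 60-second grace period are illustrative assumptions.
import subprocess

ENDPOINTS = ["ws-accounts-01", "ws-reception-02"]  # hypothetical names

def shutdown_command(host, grace_seconds=60):
    """Return the Windows remote-shutdown command for one endpoint.

    Uses the built-in shutdown.exe: /s = shut down, /m = target machine,
    /t = delay in seconds, /c = comment shown to any logged-in user.
    """
    return ["shutdown", "/s", "/m", f"\\\\{host}",
            "/t", str(grace_seconds),
            "/c", "Weekend maintenance: workstation shutting down."]

def shutdown_all(endpoints, dry_run=True):
    """Shut down every endpoint; with dry_run=True just return the commands."""
    commands = [shutdown_command(h) for h in endpoints]
    if not dry_run:
        for cmd in commands:
            subprocess.run(cmd, check=False)
    return commands

if __name__ == "__main__":
    for cmd in shutdown_all(ENDPOINTS):
        print(" ".join(cmd))
```

Schedule a script like this (or its PowerShell equivalent) to run on Friday evenings and the weekend power problem largely disappears.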

Scripted automation provides a number of easy ways to perform simple tasks. Learning to automate administrative tasks on endpoints really is such small potatoes that junior help desk staff can take on some of the planning and implementation.

These small tasks are a great way both to learn about automation and to create examples that show management the benefits of automation in your clients’ workplaces.

Just the job

Whole data centre automation does a good job of covering the bases when it comes to web-scale, hybrid and fully hosted server infrastructures, but it is not always the best option when it comes to automating the everyday minor tasks that keep your endpoints humming.

While you can automate your servers with batch and PowerShell scripting, once your infrastructure grows beyond a certain level those tools won’t cut it. They don’t have the same power or flexibility to automate as Puppet, Chef or Microsoft System Center Orchestrator.

Choosing the balance point at which you switch between scripted automation and an automation package is a subjective decision. There is no specific moment where you absolutely have to move from one to the other. Generally, you will find out where that point is during the proof-of-concept stage of your project.

Automation is an integral tool for IT administrators as budgets and teams continue to shrink and infrastructures continue to grow in complexity, size and scope.

If you are running a hybrid or wholly hosted infrastructure, automation can save you an enormous amount of time and stress when it comes to monitoring and management. ®

IT Operations Analytics (ITOA) and Automation is the Goal

Both Cappelli and Milne make really great points about the evolution of IT Automation. However, I want to take the discussion to another level and state that the real “pain,” and therefore the real requirement, that I have heard from the CIOs of the Global 1000 is the integration of ITOA and IT Automation across the entire enterprise (i.e., all platforms and applications) into what I am calling Information Technology Operations Analysis and Automation (ITOAA). The mission will be the ability to take the analysis delivered by ITOA, and its associated actions, and integrate it with IT Automation. The result will be dramatically increased IT and business productivity, and reduced costs the likes of which have never been seen before. What more could a CEO or board of the Global 1000 want?


About Charles Skamser
Charles Skamser is an internationally recognized technology sales, marketing and product management leader with over 25 years of experience in Information Governance, eDiscovery, Machine Learning, Computer Assisted Analytics, Cloud Computing, Big Data Analytics, IT Automation and ITOA. Charles is the founder and Senior Analyst for eDiscovery Solutions Group, a global provider of information management consulting, market intelligence and advisory services specializing in information governance, eDiscovery, Big Data analytics and cloud computing solutions. Previously, Charles served in various executive roles with disruptive technology start-ups and well-known industry technology providers. Charles is a prolific author and a regular speaker on the technology that the Global 2000 require to manage the accelerating increase in Electronically Stored Information (ESI). Charles holds a BA in Political Science and Economics from Macalester College.