In this Writer’s Room blog, Andela Community member Shreeharsha G N discusses his experience in DevOps and explains how he developed a Jenkins framework to streamline processes and improve his client’s automation!
Yes, the title may be a small exaggeration, but the technology we’re about to discuss is no misnomer!
As part of our enterprise Solution R&D division, I was assigned the task of automating the “execution of a workload” on different SUTs (Systems Under Test) across four metrics, a process that takes nearly 2-3 days to complete. Seven of my teammates were scheduling these workloads on 15+ different SUTs almost every week, manually administering each metric one after another, and runs that spanned weekends added further delays in getting the necessary results.
This exercise was repeated roughly every three months, three times a year, costing the client a lot of money in server purchases (for their on-prem/colo/hybrid cloud environments).
So, I developed a Jenkins framework that executes these workload metrics on different SUTs as requests from team members arrive in a queue, which is drained every 5 minutes. It was essentially a ‘wrapper’ over an already-automated Python package that did not support multi-tenant use cases. This was challenging: each execution run generated reports that had to be tagged to the end user, SUT, and metric all together, with debugging logs required for each pipeline stage.
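To make the queue idea concrete, here is a minimal Python sketch of a request queue that is drained on a fixed interval and hands each pending request to a dispatcher. The `WorkloadRequest` fields and function names are illustrative assumptions, not the framework’s actual API (in the real setup, Jenkins reads requests from text files).

```python
import queue

# Illustrative request shape: which workload/metric to run on which SUT,
# and for whom. Field names are assumptions, not the framework's real schema.
class WorkloadRequest:
    def __init__(self, user, sut_ip, metric):
        self.user = user
        self.sut_ip = sut_ip
        self.metric = metric

POLL_INTERVAL_SECONDS = 5 * 60  # the queue is drained every 5 minutes

request_queue = queue.Queue()

def drain_queue(dispatch):
    """Pop every pending request and hand it to a dispatch callback.

    Returns the list of requests dispatched in this polling cycle.
    """
    dispatched = []
    while True:
        try:
            req = request_queue.get_nowait()
        except queue.Empty:
            break  # queue is empty; wait for the next polling cycle
        dispatch(req)
        dispatched.append(req)
    return dispatched
```

In the real framework, the dispatch callback would trigger a Jenkins pipeline run for the chosen SUT and metric; here it is just a parameter so the polling logic stands alone.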
Objective and Goal of the Automation:
SUTs –> Systems Under Test
The flow chart I designed for this use case is below:
1. Even a less-experienced person can execute these applications, which delivers cost savings.
2. Sanity tests – iLO, SUT, OS, network, etc.
3. Regression tests: Early performance bugs detection across stacks.
4. Optimization pipelines – RFI (Request for Information), RFP (Request for Proposal).
5. In-house pipelines tools – Python/Java plugins and more…
These pipelines helped successfully close RFP deals worth 5 to 8+ million US dollars. (Recent example: a 173-million SSA customer bid won in 2 months – an 8-year contract.)
Fast turnaround time for RFP requests and customer delight.
Although I am a Linux geek, I had to learn Windows batch scripting, as the automation framework is used to execute the application, collect logs at the system level, and dump the results on an NTFS share accessible to the team. The first step was to capture user input as key-value pairs covering the SUT (System Under Test) IP, credentials, BIOS tunes, OS tunes, etc. The Jenkins framework shown in the flow chart above reads these key-value pairs, applies the BIOS and OS tunes to the SUT, and then executes the application, with the choice of application/metric made via a UI-style drop-down in the Jenkins framework itself.
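A key-value input file like the one described can be parsed with only a few lines of Python. This is a minimal sketch of that step; the key names (`SUT_IP`, `BIOS_TUNES`, and so on) are assumed examples, since the framework defines its own keys.

```python
def parse_run_parameters(text):
    """Parse a user's run-parameter file of simple KEY=VALUE lines.

    Blank lines and '#' comments are skipped. Key names such as SUT_IP or
    BIOS_TUNES below are illustrative; the real framework defines its own.
    """
    params = {}
    for line in text.splitlines():
        line = line.strip()
        if not line or line.startswith("#"):
            continue  # skip blanks and comments
        key, _, value = line.partition("=")
        params[key.strip()] = value.strip()
    return params
```

For example, `parse_run_parameters("SUT_IP=10.0.0.5\nBIOS_TUNES=turbo off")` yields a dictionary the pipeline can use to apply the tunes before launching the workload.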
Users edit their run parameters in a text file and copy it to the Input folder; the Jenkins framework reads these files every 5 minutes and schedules the application executions on different SUTs automatically. Once the runs are complete, the results are dumped into user-specific folders with tuning-comment strings that identify which user executed a particular application on a given SUT, and with which tuning applied. This history of tunes and application results can later be used to build machine learning models that generate recommendations for RFP/RFI requests. Execution errors are not yet handled automatically, but they are reported through Slack notifications from the Jenkins workflow.
Some screenshots of the automation are shown below:
Since the SUTs on which the application is launched had to be on an isolated network (for performance tests), the Jenkins agents were launched with the JVM parameters below, so that the agents themselves have no impact on the performance results.
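The exact parameters were shown in a screenshot; as a hedged illustration only, a lightweight inbound Jenkins agent launch might look like the fragment below. The URL, agent name, secret, and work directory are placeholder assumptions, and the heap/GC flags are typical choices for minimizing the agent’s footprint, not the values from the original setup.

```shell
# Illustrative config fragment only - not the post's actual parameters.
# Small fixed heap and the serial collector keep the agent JVM's CPU and
# memory footprint low on a performance-isolated SUT.
java -Xms64m -Xmx256m \
     -XX:+UseSerialGC \
     -jar agent.jar \
     -url http://jenkins.example.internal:8080/ \
     -name perf-sut-01 \
     -secret @/opt/jenkins-agent/secret-file \
     -workDir /opt/jenkins-agent
```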
The rest of the Python, Windows batch, and Jenkins Groovy scripts can be found at this GitHub link.
Want to be part of a vibrant tech community? Then join the Andela Talent Network!