Rationalization of a large Application Portfolio
A large health sciences multinational offering a wide range of instruments, software, consumables, reagents, and content for the fields of cell biology, gene expression, protein purification, protein quantitation, drug discovery and manufacture, food safety, and science education.
The client had been growing through the addition of new services and product offerings and through acquisitions. Over time, the application portfolio had grown to over 700 separate applications supported by IT. The business required IT to reduce its annual outlay by close to 40%; reducing the application footprint by 50% was a key requirement for achieving this goal.
The goal of the project was to rationalize the IT application portfolio in order to reduce the footprint by 50% and improve manageability and cost effectiveness. The work required studying and understanding the current state architecture and developing a future state architecture for the enterprise. We used a best-of-breed, customized IT portfolio rationalization and integration methodology, and carried out the following in this project:
⇗Studied the current state of the client's application environment from the following perspectives: application redundancy, application usage, and application health; studied the key applications of the organization from the following perspectives: enterprise applications and operational applications
⇗Determined the key strategic application vendors with which to form long-term relationships; worked with the US, Europe, and APAC-based IT teams to identify the best-value future state applications using an objective, quantitative model
⇗Created a future state application portfolio based on the rationalization outcomes, illustrating: a To-Be application architecture wiring diagram; a To-Be application architecture transition plan illustrating the yearly pace of transition; and an infrastructure impact analysis with remediation recommendations
⇗Created a specific change management plan and communication plan to help end users transition from one application to another
⇗Created a governance and application rationalization sustainability framework that took into account the application lifecycle (including procurement), to-be-introduced technologies such as cloud-based and mobile applications, exception processing, communication and management, and the ability to serve M&A work
⇗Created effective communication plans to address the actual project implementation, governance procedures and outcomes, and ongoing decision support related to application rationalization
⇗Developed a disposition plan for each application incorporating human factors, organizational change management, financial/ legal/ regulatory impact, and technology sunset designs for data, integration, and infrastructure
We went on to assist the client with the implementation of the rationalization. The rationalization effort saved the client over $40 million in annual outlay.
Migration of COBOL reports
A major futures exchange in North America
The client was planning to phase out a mainframe platform running COBOL applications. The COBOL programs generated 150 reports that were in use by the business. Once the mainframe data was moved to departmental data marts, we had to generate matching reports from the new data sources ahead of the phase-out of the mainframe.
The client had to migrate the existing COBOL reports to an open platform since the COBOL platform was being sunset. The inputs available were information about the source data required for each report, the COBOL source code for the report program, and a sample generated report for reference. Report development included writing database queries to obtain data from a departmental “datamart”, writing Java code to obtain data from an extract file, and then implementing formatting logic using either Spring Batch or BIRT to generate a text or PDF file. All 150-plus reports were successfully migrated on schedule.
The scope of the project was limited to generating the 150 reports in Java using the Spring Batch/ BIRT frameworks; the programs obtained source data from an Oracle database or a fixed-record-length extract file. The reports were tested for data accuracy and for visual compliance, as close as feasible, with the current mainframe reports.
We developed a source mapping spreadsheet for each report to be migrated, with suitable information inputs from a client SME. We also utilized the report template provided by the client, keeping each report in close adherence to the template to the extent feasible. We created a test database schema and dummy sample data so that report development could be done off-site. The reports were developed using Java/ Spring Batch or Java/ BIRT, depending on the output format requirements of each report.
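The fixed-record-length extract files mentioned above are read by slicing each line at known column offsets. The following is a minimal sketch of that idea only; the record layout and field names here are hypothetical assumptions, not the actual client extract layouts.

```java
// Illustrative sketch: parse one fixed-width record into trimmed fields.
// The layout (cols 0-9 key, 10-19 quantity, 20-29 amount) is an assumption.
public class FixedRecordParser {
    public static String[] parseRecord(String line) {
        String key      = line.substring(0, 10).trim();   // report key
        String quantity = line.substring(10, 20).trim();  // right-padded number
        String amount   = line.substring(20, 30).trim();  // decimal amount
        return new String[] { key, quantity, amount };
    }

    public static void main(String[] args) {
        String record = "T000012345       100     98.75";
        System.out.println(String.join("|", parseRecord(record)));
        // prints: T000012345|100|98.75
    }
}
```

In practice a source mapping spreadsheet like the one described above supplies the offsets and field names for each report's extract, so the same slicing pattern is repeated per layout.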
We did an initial POC with 5 reports to generate feedback and set the development standards for the rest of the reports. The reports were delivered in batches, and the project was completed in six months with a team of 3. The project was successful in terms of the quality of the reports and was accomplished on time and on budget.
Custom Development of a Logistics Solution/ ERP
North American unit of a water purification multinational
The North American unit of the client required a complete revamp of its in-house developed systems due to a confluence of factors: a barely maintainable old system, unmet business requirements, process inefficiencies, and new units acquired by the company. The old system had no documentation, and a good part of the business knowledge was embedded in the system, with no person having complete knowledge of it.
DivIHN carried out a reverse engineering effort to extract business logic from the old system code and developed new business system requirements. It was decided to implement a phased, modular replacement of the old system, with initial emphasis on data migration. The new system was built on the .Net framework and used a component-based DNA architecture. The application behavior was divided into layers, viz. a Presentation layer, a Business layer, and a Data layer. The system had an interface to Great Plains financials and a mobile systems interface based on Windows CE. The system included a Business Intelligence layer to provide relevant information to management and power users. Detailed system and user documentation, direct training, and training through trainers trained by DivIHN helped with change management. Business ownership of data was ensured.
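The three-layer separation described above can be illustrated with a minimal sketch. The actual system was built on .Net; this example is in Java purely for brevity, and every class and method name is a hypothetical stand-in, not taken from the client's system.

```java
// Illustrative layering sketch; all names are assumptions.
import java.util.ArrayList;
import java.util.List;

class Order { final String id; Order(String id) { this.id = id; } }

// Data layer: the only code that touches storage.
class OrderRepository {
    private final List<Order> store = new ArrayList<>();
    void save(Order o) { store.add(o); }
    int count() { return store.size(); }
}

// Business layer: rules and validation, no UI or storage details.
class OrderService {
    private final OrderRepository repo;
    OrderService(OrderRepository repo) { this.repo = repo; }
    void placeOrder(String id) {
        if (id == null || id.isEmpty()) throw new IllegalArgumentException("missing id");
        repo.save(new Order(id));
    }
}

// Presentation layer: turns user input into service calls.
class OrderController {
    private final OrderService service;
    OrderController(OrderService service) { this.service = service; }
    String handle(String id) {
        service.placeOrder(id);
        return "order accepted: " + id;
    }
}

public class LayeringSketch {
    public static void main(String[] args) {
        OrderRepository repo = new OrderRepository();
        OrderController ui = new OrderController(new OrderService(repo));
        System.out.println(ui.handle("A-100")); // prints: order accepted: A-100
    }
}
```

The value of this separation in a phased replacement is that each layer can be swapped or extended (for example, adding the Windows CE mobile interface as another presentation client) without rewriting the layers beneath it.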
The system, developed with over 300 person-months of effort over 18 months, eliminated the requirements backlog that had piled up. The phased implementation was a success, with gradual handover of the post-implementation support activities. The system, along with GP for Finance, functioned as the complete organizational IS.
Reverse Engineering and Documentation of a legacy system
A leading manufacturer of products used by utilities for building transmission and distribution lines and substations
The client was embarking on a business transformation effort that required the information technology capabilities of the company to become a catalyst for growth. The homegrown legacy system that had served the company very well for decades was to be replaced by a system better suited to the current state and desired future. The homegrown system lacked proper documentation, which created additional risks in the transition – both for maintaining the old system during the long transition and for getting at the business logic contained in the old code. We addressed this problem by carrying out a reverse engineering effort and then documenting the legacy system in a maintainable manner. The code base was around half a million lines of COBOL.
DivIHN utilized a reverse engineering process to rebuild the knowledge. The process entailed investigation, analysis, and documentation of the internal workings of the legacy system. The team extracted detailed workings of the legacy system, such as rules, interfaces, transaction processing, authorizations, data mapping, batch processing, and file management. The team then captured the information in component-level documents that could become the basis of the forward engineering effort.
We started with a 4-week Proof of Concept (POC) phase. During this phase, our team performed a high-level walkthrough of the code artefacts and documented the code organization, the rules extraction strategy, and the templates and checklists to be used. Along with the client, we identified a set of representative programs and documented them using the templates to ensure that the output was in line with the client's expectations.
During the rest of the project – the Reverse Engineering phase – the team performed detailed code mining to identify elements such as structure, data flow, and business rules, and documented them in the prescribed format.
The project was completed on time and on budget, in less than 6 months, with a peak team size of 8. The documentation was delivered in an easily accessible and searchable repository, and it continues to be updated on a regular basis as the client marches ahead with the transformation.
ESB Architecture development and implementation
A services organization supporting the compliance and reporting needs of thousands of Small Business Investment Companies (SBICs), as a leading fund administrator.
The client’s business strategy required outsourcing certain non-proprietary, non-core business technology solutions, such as accounting and CRM, to external providers. Strategically, this required an application interface technology that enables and manages the movement of application transaction data (inter-process communication) between applications hosted and managed by the client and those hosted and managed externally.
The project started with an evaluation of the client’s current environment. We defined the application interface solution requirements and employed leading practices to help the client move toward an appropriate application-to-application interfacing environment that would handle not only current needs but also anticipated future needs in terms of automated interfaces with partner firms. The client envisioned that a central message-handling component of the architecture’s implementation would be a proprietary Data Broker. We documented the findings and the functional and technical requirements, and also designed the Data Broker.
The deliverables of the project included the Current State Architecture, Requirements for Future Interface Architecture, and Future Interface Architecture recommendations.
The DivIHN team worked to develop an architecture that met the requirements of a loosely coupled, flexible design while accommodating the specific implementation requirements of the external vendors.
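The loose coupling at the heart of such a broker-based design can be sketched in a few lines. This is only an illustration of the publish/subscribe routing idea; the client's actual Data Broker design is proprietary, and the class, topic, and message names here are hypothetical.

```java
import java.util.ArrayList;
import java.util.HashMap;
import java.util.List;
import java.util.Map;
import java.util.function.Consumer;

// Minimal message-routing sketch; all names are illustrative assumptions.
public class DataBrokerSketch {
    private final Map<String, List<Consumer<String>>> subscribers = new HashMap<>();

    // An application (hosted internally or externally) registers interest in a topic.
    public void subscribe(String topic, Consumer<String> handler) {
        subscribers.computeIfAbsent(topic, t -> new ArrayList<>()).add(handler);
    }

    // The broker forwards each message to every subscriber of its topic,
    // so senders and receivers never reference each other directly.
    public void publish(String topic, String message) {
        for (Consumer<String> h : subscribers.getOrDefault(topic, List.of())) {
            h.accept(message);
        }
    }

    public static void main(String[] args) {
        DataBrokerSketch broker = new DataBrokerSketch();
        broker.subscribe("accounting.invoice", m -> System.out.println("CRM got: " + m));
        broker.publish("accounting.invoice", "INV-42 posted");
        // prints: CRM got: INV-42 posted
    }
}
```

Because applications depend only on the broker and on agreed topics, an outsourced accounting or CRM system can be swapped out without changing the applications it exchanges data with, which is the essence of the loosely coupled requirement.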
We developed the ESB architecture and then helped the client implement it. The design project was accomplished on time and on budget. Subsequently, we were involved in the implementation of one of the external applications for the client.