In the 2019 NASCIO survey of State CIO Priorities for 2020, modernizing legacy applications ranked seventh among the top ten priorities. For many government organizations, legacy systems are a financial and technical obstacle to digital, and ultimately government, transformation. The applications running on these outdated systems are inefficient, costly, and limit the organization's agility and flexibility. The pandemic and the need for government to work from home, in some cases literally overnight, highlighted the limitations of these systems. Stay-at-home orders and social distancing also dramatically increased reliance on systems of digital engagement, which become bogged down by the legacy systems behind them, further making the case that modernization of core IT systems is a vital piece of any digital transformation. The US federal government has drawn a direct link between legacy modernization, data security and digital transformation when it comes to improved citizen service delivery.
This growing understanding of the effect of modernization on digital transformation was reflected in Gartner's Legacy Modernization Survey, where government leaders cited business objectives rather than IT objectives as the top five reasons for modernization.
Top Reasons for Governments to Modernize Core IT Systems
[Chart] Survey question: "What are the top two reasons your organization chose to modernize its core IT systems?" Percentage of respondents; sum of top two ranks. Base: completed or will complete plans to modernize, n=105. Source: 2019 Gartner Legacy Modernization Survey.
As the need for legacy modernization and digital engagement is being recognized across the organization and at all levels of leadership, State and local government CIOs are increasingly engaging in cross-organization collaboration between IT and program and agency leadership. Together they are setting the vision and building modernization roadmaps that convey the links between modernization and the agency’s desired outcomes.
Successful modernization requires an approach carefully tailored to each organization and a roadmap covering the transformation of both technology and business processes, co-designed by cross-functional teams using approaches such as agile and user-centric design. Increasingly, CIOs are turning to experts in the practice of Application Modernization for help.
Application Modernization is an established practice that helps customers leverage microservices, serverless architecture, and containers to remove tight dependencies within products and create smaller, independently deployable components. Containers help modernize applications faster by bringing together everything an organization needs to run more efficiently, including products, services and third-party applications. Organizations can take advantage of advanced cloud services such as DevOps tooling and API frameworks to improve scalability, reliability and cost-efficiency while adding new capabilities to their mission-critical software. This allows for greater agility, faster innovation, and quick, easy access to data, leading to better decisions and value for customers. Organizations can choose application migration strategies depending on their priorities and the applications they want to migrate. Some common scenarios used to modernize legacy applications with cloud are:
- Cloud Infrastructure-based Applications
- Cloud Optimized Applications
- Cloud-Native Applications
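These three scenarios can be viewed as a decision about how much change an application can absorb. The sketch below makes that concrete; the attribute names, thresholds and rules are illustrative assumptions, not an actual assessment rubric:

```python
def suggest_migration_path(app):
    """Suggest a cloud migration scenario for an application.

    `app` is a dict of illustrative attributes; the rules below are a
    simplified sketch, not a real assessment methodology.
    """
    if app.get("containerized") or app.get("built_for_cloud"):
        return "Cloud-Native"
    if app.get("refactor_budget", 0) > 0 and app.get("strategic", False):
        return "Cloud Optimized"  # worth refactoring toward managed services
    return "Cloud Infrastructure-based"  # rehost ("lift and shift")

# A legacy app with no refactoring budget is simply rehosted.
legacy_app = {"containerized": False, "refactor_budget": 0}
print(suggest_migration_path(legacy_app))  # Cloud Infrastructure-based
```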
We explore some of these key concepts and the GCOM approach to them below.
Application Assessment and Cloud Migration
As more organizations look to build resiliency and scale, cloud computing becomes a more attractive model. Legacy applications that are not cloud native can inhibit an organization's ability to move to the cloud and require validation, modernization and migration. Additionally, decisions on government, public or private cloud must be made quickly when there is a need to migrate legacy applications. GCOM's application assessment and cloud migration approach includes an overall assessment of the cloud readiness, health and open risk of each application. GCOM creates a prioritized list of applications to migrate, including on-premises applications moving to the cloud (Platform as a Service, Infrastructure as a Service). GCOM's application assessment provides specific recommendations, using tools and templates, on how to modernize an application and on software health issues and open-source risks. We also examine the existing technical workforce and how it can transition to new technologies, as well as the volume of legacy applications, including their complexity and mix of custom, COTS and MOTS products and frameworks. Most importantly, we explore the customer's budget and how to align the team with available funding.
GCOM has developed templates for Application Modernization:
- Application Inventory collection and prioritization: Record application list categorized by business functionality, identify applications for migration and mark applications that need to be deprecated.
- Data Migration and Data Quality Assessment.
- Infrastructure Assessment.
- Budget Assessment: Prioritize applications based on business needs and efforts.
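As a sketch of how an inventory template like this might feed prioritization, the scoring fields and weights below are illustrative assumptions, not GCOM's actual template:

```python
# Illustrative weights: favor high business value and cloud readiness,
# penalize complexity. Scores are on a 1-5 scale in this sketch.
WEIGHTS = {"business_value": 3, "cloud_readiness": 2, "complexity": -2}

def migration_score(app):
    """Weighted score used to rank an application for migration."""
    return sum(w * app[k] for k, w in WEIGHTS.items())

def prioritize(inventory):
    """Drop applications marked for deprecation, rank the rest."""
    active = [a for a in inventory if not a.get("deprecate", False)]
    return sorted(active, key=migration_score, reverse=True)

# Hypothetical inventory entries, invented for illustration.
inventory = [
    {"name": "licensing-portal", "business_value": 5, "cloud_readiness": 3, "complexity": 2},
    {"name": "mainframe-batch", "business_value": 4, "cloud_readiness": 1, "complexity": 5},
    {"name": "old-reporting", "business_value": 1, "cloud_readiness": 1, "complexity": 3, "deprecate": True},
]
for app in prioritize(inventory):
    print(app["name"], migration_score(app))
```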
Application refactoring is central to Application Modernization when your applications reside on the infrastructure of your cloud provider. Refactoring involves modifying your existing applications, or a large portion of the codebase, to take better advantage of cloud-based services and the extra flexibility that comes with them. Application refactoring is more complex than the other cloud migration approaches because, while making application code changes, you must also ensure that they do not affect the external behavior of the application.
During application refactoring, GCOM validates whether your existing application is resource intensive. The application may drive higher cloud usage because of extensive data processing or rendering of images and videos. In that case, redesigning the application for better resource utilization is required before moving to the cloud. Application refactoring can be time-consuming and resource-intensive, but it can save future monthly costs. Our approach to application refactoring takes into account the following points:
- Operations cost reduction by identifying resources that are highly valuable and provide better ROI.
- Decoupling of application components (microservices) and managed services by leveraging the elasticity offered by the cloud.
- Supporting application demand by using auto-scaling features based on business needs.
- Assembling a team of experts.
- Enterprise-level planning, given the time-intensive and complicated nature of application refactoring.
How do we refactor? Based on customer needs, budget and ROI, we suggest the following options:
1. Full refactoring, which targets code and database changes to utilize cloud-native features, leading to improved performance, reduced operating costs and better support for business needs.
2. Cloud-ready feature refactoring, to incorporate cloud features like security, cloud utilities and cloud databases and utilize their capabilities.
3. Containerization refactoring, moving the application with few or no modifications to utilize cloud-ready features.
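The constraint noted above, that code changes must not affect external behavior, is typically guarded by characterization tests that compare old and new implementations. A minimal sketch (the notice-formatting functions are hypothetical, invented for illustration):

```python
# Before: a monolithic function that formats a renewal notice inline.
def renewal_notice_v1(name, fee):
    return "Dear " + name + ", your renewal fee is $" + str(fee) + "."

# After: the same external behavior, decomposed into small,
# independently testable pieces (a step toward extracting a service).
def format_fee(fee):
    return "$" + str(fee)

def renewal_notice_v2(name, fee):
    return f"Dear {name}, your renewal fee is {format_fee(fee)}."

# A characterization test guards the refactor: both versions must
# produce identical output for the same inputs.
for name, fee in [("Ada", 120), ("Lin", 45)]:
    assert renewal_notice_v1(name, fee) == renewal_notice_v2(name, fee)
print("refactor preserves external behavior")
```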
DevOps Assessment, Strategy and CI/CD Pipelines
DevOps practice adoption takes time; we completely understand that. DevOps practices that are not implemented correctly can do more harm than good for an organization. It is important to verify that the practices are in line with the organization-level DevOps initiative. GCOM's DevOps assessment is a set of questions that personnel from different organizational areas can answer. Based on an analysis of the answers, the organization can see whether it is moving in the right direction. The DevOps assessment helps plan existing investment and identify a roadmap of improvement areas.
We have noticed that most organizations already use DevOps in some capacity. An organization with changing IT and operational needs should take advantage of a DevOps assessment. Our assessment checklist includes the following questions:
- Does the organization really need DevOps? Create a business case covering the production-like test environment, test cases and test data, and the level of automation, complexity, visibility and proficiency in the dev-test-release-deploy-manage processes.
- Does the organization have everything that is needed for DevOps? Create a tools and infrastructure checklist.
- What value would it add to the organization? Create a business value proposition statement.
- Are DevOps strategies right for the organization? Create an ROI statement to show how the strategy can help.
A DevOps assessment is a necessity to understand where the organization is today and to set up a roadmap for its DevOps journey. Our DevOps assessment report provides an evaluation of an organization's current DevOps capability and, based on the responses, recommendations that help:
- The recruitment team build the workforce
- The management team target investment areas
- The operations team understand the most critical operations
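A question-based assessment like this can be distilled into a simple maturity score. The questions and weights below are illustrative assumptions, not the actual GCOM assessment:

```python
# Illustrative assessment questions with weights; a real assessment
# would cover far more practice areas.
QUESTIONS = {
    "version_control_everywhere": 3,
    "automated_builds": 2,
    "automated_tests": 2,
    "infra_as_code": 2,
    "continuous_monitoring": 1,
}

def assess(answers):
    """Return a maturity percentage and the practices still missing."""
    earned = sum(w for q, w in QUESTIONS.items() if answers.get(q))
    total = sum(QUESTIONS.values())
    gaps = [q for q in QUESTIONS if not answers.get(q)]
    return round(100 * earned / total), gaps

score, gaps = assess({"version_control_everywhere": True,
                      "automated_builds": True})
print(score, gaps)  # 50 ['automated_tests', 'infra_as_code', 'continuous_monitoring']
```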
GCOM’s DevOps strategy includes the following areas:
- Continuous Release Planning - Release planning is done using iterative Agile methodologies and practices that align well with a DevOps culture.
- Continuous Integration and Deployment - Codebase control using repositories such as Azure DevOps, GitHub or GitLab. Continuous integration using a continuous integration server and deployment to virtual cloud computing services.
- Continuous Delivery and Continuous Build - Continuous application delivery using the Git, Build, Test and Release tools from Azure DevOps.
- Continuous Testing - Automated unit, integration and regression tests run on every build as part of the pipeline, so defects are caught before release.
- Continuous Monitoring - Monitor application status on virtual servers using an Application Performance Monitoring (APM) tool in conjunction with a log aggregation framework such as the EFK (Elasticsearch, Fluentd, Kibana) stack.
- Continuous Security - Continuous Security approach includes Static Code Analysis (SCA), dependency scanning, container hardening (CIS Benchmarks), Infrastructure as Code, immutable deployments through containerization, infrastructure compliance checks, security smoke tests (ZAP Baseline Scan), secrets management and credential rotation, integrated development environment security plugins, a peer review process, the continuous monitoring approach from above and compliance policies including, but not limited to, encryption of data both at rest and in transit.
- Continuous Improvement - Metrics and feedback from monitoring, testing and deployments are reviewed regularly and fed back into release planning to improve both the pipeline and the application.
- Blue Green Deployment - To achieve zero downtime, we use blue-green deployment with an A/B testing technique. 80% of production traffic is sent to the existing production environment, called blue, and the remaining 20% is sent to a new production environment, called green, to test new features of the application. If the green environment, with the new features, is working correctly, we can place the blue environment on standby or terminate it. While blue remains on standby, we can easily switch back to the previous version.
- Automate Infrastructure - To automate infrastructure, we use Terraform, which integrates with any cloud provider's infrastructure. Our solution follows Infrastructure as Code (IaC) to provision infrastructure resources in the cloud.
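The 80/20 blue-green traffic split described above can be sketched as a weighted router. The function below is an illustration of the routing decision only, not a production load balancer:

```python
import random

def route_request(green_weight=0.2, rng=random.random):
    """Route one request to the blue (existing) or green (new) environment.

    green_weight is the share of traffic sent to the new environment;
    the 80/20 split corresponds to green_weight=0.2.
    """
    return "green" if rng() < green_weight else "blue"

# Simulate 10,000 requests; roughly 20% should land on green.
random.seed(1)
hits = sum(route_request() == "green" for _ in range(10_000))
print(f"green share: {hits / 10_000:.1%}")
```

Once the green environment proves healthy, raising `green_weight` to 1.0 completes the cutover; dropping it back to 0.0 is the rollback to blue.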
Application containerization is an OS-level virtualization method used to deploy and run distributed applications without launching an entire virtual machine (VM) for each app. Multiple isolated applications or services run on a single host and access the same OS kernel. Containers work on bare-metal systems, cloud instances and virtual machines, across Linux and select Windows and macOS versions. Application containers include the runtime components (such as files, environment variables and libraries) necessary to run the desired software. Application containers consume fewer resources than a comparable deployment on virtual machines because containers share resources without a full operating system to underpin each app.
Application containerization works with microservices and distributed applications, as each container operates independently of others and uses minimal resources from the host.
Each microservice communicates with others through application programming interfaces, with the container virtualization layer able to scale up microservices to meet rising demand for an application component and distribute the load. With virtualization, the developer can present a set of physical resources as disposable virtual machines. This setup also encourages flexibility; for example, a developer who needs a variation from the standard image can create a container that holds only the new library. GCOM has application containerization experience, specifically with Docker.
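A minimal illustration of microservices communicating through an API: the hypothetical "records" service below exposes one HTTP endpoint, and a second process consumes it over the network, exactly as independently deployed containers would. The service name and payload are invented for the sketch:

```python
import json
import threading
import urllib.request
from http.server import BaseHTTPRequestHandler, HTTPServer

# A hypothetical "records" microservice exposing one JSON API endpoint.
class RecordsHandler(BaseHTTPRequestHandler):
    def do_GET(self):
        body = json.dumps({"record_id": self.path.strip("/"), "status": "active"})
        self.send_response(200)
        self.send_header("Content-Type", "application/json")
        self.end_headers()
        self.wfile.write(body.encode())

    def log_message(self, *args):  # silence per-request logging
        pass

server = HTTPServer(("127.0.0.1", 0), RecordsHandler)  # port 0 = any free port
threading.Thread(target=server.serve_forever, daemon=True).start()

# A second service (here, just a client call) consumes the API over HTTP.
port = server.server_address[1]
with urllib.request.urlopen(f"http://127.0.0.1:{port}/lic-42") as resp:
    data = json.load(resp)
print(data)  # {'record_id': 'lic-42', 'status': 'active'}

server.shutdown()
```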
Today, more and more software products and platforms participate in a larger ecosystem to collaborate and leverage one another. Companies are opening their platforms through Application Programming Interfaces (APIs) to external developers and partners to explore new technological possibilities and business models. Although this creates business opportunities, it also comes with engineering challenges.
API Enablement Solutions & Services are designed to help you create API layers, infrastructure and a developer engagement platform for your services by addressing key areas such as security, scalability, monitoring, monetization and developer adoption.
GCOM API services include:
- A scanning API to scan paper documents over the internet and convert them into PDF or JPG format.
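Two of the API-layer concerns mentioned above, security and scalability, commonly start with key-based authentication and rate limiting. A minimal token-bucket sketch; the API keys and limits are invented for illustration:

```python
import time

class TokenBucket:
    """Minimal rate limiter: `rate` requests/second, burst of `capacity`."""
    def __init__(self, rate, capacity):
        self.rate, self.capacity = rate, capacity
        self.tokens = capacity
        self.last = time.monotonic()

    def allow(self):
        # Refill tokens for the time elapsed, capped at capacity.
        now = time.monotonic()
        self.tokens = min(self.capacity, self.tokens + (now - self.last) * self.rate)
        self.last = now
        if self.tokens >= 1:
            self.tokens -= 1
            return True
        return False

# Hypothetical partner keys, each with its own limiter.
API_KEYS = {"partner-abc": TokenBucket(rate=5, capacity=3)}

def handle_request(api_key):
    """Return an HTTP-style status for one incoming API call."""
    if api_key not in API_KEYS:
        return 401  # unauthenticated
    if not API_KEYS[api_key].allow():
        return 429  # rate limited
    return 200

print(handle_request("partner-abc"))  # 200
print(handle_request("unknown"))      # 401
```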
Data Quality, Data Migration Assessment and Implementation
Data quality is of utmost importance throughout the lifecycle of data, including data migration to the cloud. Data is generally considered high quality if it is “fit for [its] intended uses in operations, decision making and planning.” Key attributes of such high-quality data are: 1) Accuracy, 2) Completeness, 3) Consistency, 4) Credibility and 5) Currentness. This simple definition forms the basis of GCOM’s Data Quality Methodology, as illustrated below.
GCOM has devised its Data Quality Methodology based on the proven six-sigma DMAIC principles: Define, Measure, Analyze, Improve and Control. We apply these fundamentals in our various software engineering processes as well.
Leveraging fundamentals from six-sigma DMAIC, the GCOM Data Engineering team has devised a data quality methodology with the following key steps:
- Define: DQ goals, data owners, data rules
  - DQ Goals:
    - Ensure all master records are unique and accurate.
    - Dedupe any detail/transactional records.
  - Data Owners: client's data stewards, SMEs and division heads
  - Impacted Business Processes: e.g., eligibility, license issuance, renewal notices, etc.
  - Data Rules: identify standardization rules.
- Measure: Assess the existing data against the rules specified in the Define step, across multiple dimensions such as:
- Accuracy of key attributes.
- Completeness of all required attributes.
- Consistency of attributes across multiple data sets.
- Timeliness of data.
- Analyze: Analyze the assessment results on multiple fronts including data and data types.
- Improve: Design and develop improvement plans based on prior analysis.
- Implement/Data Migration: Implement solutions determined in the Improve step.
- Application for data rules, testing and validation of the fixed data.
- Achieving target state and target disposition as needed for data migration.
- Carrying out data migration through bulk as well as incremental loads.
- Control: Verify at periodic intervals that the data remains consistent with the business goals and the data rules specified in the Define step.
- Training: At GCOM we understand that data quality is generally not a one-time project but a continuous process, one that requires the entire organization to be data-driven and data-focused. We carry out knowledge and responsibility transition to the client's data subject matter experts at each key milestone in the project schedule.
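The Measure and Improve steps above can be sketched with simple completeness and dedupe checks. The records, field names and rules below are invented for illustration:

```python
# Illustrative records; field names are assumptions for the sketch.
records = [
    {"id": 1, "name": "Ada Lovelace", "state": "WA", "updated": "2024-01-10"},
    {"id": 2, "name": "Ada Lovelace", "state": "WA", "updated": "2024-01-10"},  # duplicate
    {"id": 3, "name": "Grace Hopper", "state": "",   "updated": "2023-02-01"},  # incomplete
]

REQUIRED = ("name", "state", "updated")

def completeness(recs):
    """Measure: share of records with every required attribute populated."""
    ok = sum(all(r.get(f) for f in REQUIRED) for r in recs)
    return ok / len(recs)

def dedupe(recs):
    """Improve: drop records whose key attributes repeat an earlier record."""
    seen, unique = set(), []
    for r in recs:
        key = tuple(r[f] for f in REQUIRED)
        if key not in seen:
            seen.add(key)
            unique.append(r)
    return unique

print(f"completeness: {completeness(records):.0%}")       # 67%
print(f"records after dedupe: {len(dedupe(records))}")    # 2
```

Running checks like these at the periodic intervals mentioned in the Control step turns a one-time cleanup into a continuous process.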
Benefits of Application Modernization:
GCOM isn’t your typical government solutions provider. GCOM combines the scale to support large, complex projects with the agility and accessibility of a boutique solutions provider, giving state and local government leaders a third option when looking for a partner to help modernize operations and optimize digital engagement. And we’ve earned a reputation for innovation and reliability by helping clients leverage cutting-edge technology while mitigating risk. Whether it’s helping governments transition to virtual working, incorporating biometric ID to give physicians anywhere, anytime access to vital records, providing local law enforcement with complete criminal histories on demand, or data integration platforms that monitor community health, GCOM’s innovative, next-generation government solutions improve operations and deliver more value to the communities they serve.