SDLC at Enterprise Level

SDLC at the enterprise level is a process that involves many human resources and spans several stages: the first stage is requirement analysis, the second is design, and the third is the development phase, which is the crux of the SDLC.

In the development phase, the front-end code will be developed by the front-end team and the back-end code by the back-end team. The APIs will be developed and integrated by the API team; both first-party and third-party API integrations will be managed by the API team.

The data governance framework needs to be well integrated by the data ops team. Data management protocols and procedures will be laid down by the data ops and data security teams, and version control will be managed via VERSION MANAGEMENT SOFTWARE.

GITHUB -> a utility for raising pull requests and committing code; necessary rollbacks can also be carried out if needed.
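The commit-and-rollback flow above can be sketched as follows. This is a minimal illustration that only builds the git command lines a script might execute (via subprocess); the commit message and commit hash are hypothetical placeholders, not values from a real repository.

```python
# Minimal sketch: constructing the git commands behind a commit and a
# rollback. The message and hash below are hypothetical placeholders.

def commit_cmd(message):
    """Command to commit staged changes with a message."""
    return ["git", "commit", "-m", message]

def rollback_cmd(bad_commit):
    """Command to revert a faulty commit without rewriting history."""
    return ["git", "revert", "--no-edit", bad_commit]

# These argument lists could be passed to subprocess.run(...) in a script.
print(commit_cmd("fix: patch login bug"))
print(rollback_cmd("a1b2c3d"))
```

Using `git revert` rather than `git reset` keeps history intact, which matters on a shared production branch.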

Version control systems need to be integrated into the SDLC in order to carry out continuous integration and continuous delivery.

The GitHub version control will be integrated with GitHub Actions in order to carry out continuous integration and continuous delivery processes. CI/CD tools need to be integrated along with data integration tools so that BAs, DAs, PMs, the marketing team and the ops team can consume actionable analytics, and data teams can run ML algorithms to understand data domains more deeply.
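The gating logic a CI pipeline enforces can be sketched in a few lines. Real GitHub Actions pipelines are defined in YAML workflow files; the Python below is only an illustration of the underlying idea, and the stage names and checks are made up for the example.

```python
# Hedged sketch of CI gating: run each check in order and only declare
# the artifact deliverable if every check passes. Checks are stand-ins.

def run_pipeline(checks):
    """Run each named check in order; stop at the first failure."""
    for name, check in checks:
        if not check():
            return f"pipeline failed at: {name}"
    return "pipeline passed: artifact ready for delivery"

checks = [
    ("lint", lambda: True),        # stand-in for a real linter run
    ("unit tests", lambda: True),  # stand-in for the test suite
    ("build", lambda: True),       # stand-in for compiling/packaging
]
print(run_pipeline(checks))
```

The fail-fast ordering mirrors how CI services short-circuit a job once any step exits non-zero.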

All the code will be compiled and rendered on PUBLIC CLOUD | PRIVATE CLOUD | ON-PREM.

Major enterprises, be they medium or large sized, will adopt a mix of infrastructure; highly sensitive data will be stored in a highly encrypted fashion on ON-PREM INFRA.

Major state-owned and privately owned bodies leverage all three infrastructure deployment models. Sensitive data is kept on-prem, where the organization has complete control over the data and its processing paradigm.

Data security enforcement is far stricter on-prem than in the public cloud, and the private cloud is also considerably more private and secure. Major enterprises adopt all three infra deployment models, as each of them satisfies certain use cases and business cases.

Each deployment model has its own pros and cons, and based on business cases and use cases, decisions are taken at the enterprise level on how much coverage each deployment model gets in a mixed deployment approach.

It is observable that enterprises widely leverage MULTI-CLOUD setups due to their fair advantage, flexibility and cost efficiency.

The development pipeline will be automated by integrating CI/CD (continuous integration and continuous deployment); the CI/CD tool will automate the integration, delivery and deployment processes. Along with this CI/CD integration, the code base and the services will also be integrated with a DATA PIPELINE such as ETL/ELT, through which the data ecosystem can be controlled, monitored and secured. At the enterprise level, with the right set of DATA PIPELINES and INTEGRATION PIPELINES, the enterprise will be able to achieve higher productivity, efficiency and scalability.
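The "controlled, monitored" property described above usually comes from wrapping every pipeline stage so each run leaves an audit trail. Here is a small illustrative sketch; the stage names and the in-memory log are assumptions for the example, not a specific tool's API.

```python
# Illustrative sketch: wrap each pipeline stage so every run is recorded,
# which is what makes the pipeline controllable and monitorable.
import time

audit_log = []  # in a real system this would be a metrics/logging backend

def run_stage(name, fn):
    """Run one pipeline stage and record its outcome and duration."""
    start = time.time()
    ok = fn()
    audit_log.append({"stage": name, "ok": ok,
                      "duration_s": round(time.time() - start, 3)})
    return ok

# Hypothetical stages, in the order the text describes.
run_stage("integration", lambda: True)
run_stage("delivery", lambda: True)
run_stage("deployment", lambda: True)

for entry in audit_log:
    print(entry["stage"], "ok" if entry["ok"] else "FAILED")
```

With every stage emitting a structured record, dashboards and alerts can be built on top without touching the stages themselves.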

The development stage will involve the front-end dev team, data engineering team, network team and a few other tech teams. If the software solution involves many tech teams, then centralized integration via version control and an automation framework will optimize the development phase. Be it requirement analysis, the design phase or the development phase, all the logs need to be maintained via a project management suite or MS tenant where all the processes are tracked, monitored and measured.

The development phase is the crux of the SDLC; hence the right set of protocols, processes and due diligence is needed to ship high-quality software solutions.

THE DEV TEAM

The developers will develop the code base, and then the cloud architects, along with cloud engineers, will manage cloud scaling, provisioning, load balancing, CDN management and the cloud infra setup, integration and optimization tasks. The processes of IAM, RBAC and MFA will be integrated into the SDLC by the API and DEV teams.
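The RBAC part of that integration can be sketched very compactly: permissions attach to roles, and access checks consult the role rather than the individual user. The role and permission names below are hypothetical examples.

```python
# Minimal RBAC sketch: roles map to permission sets, and every access
# check goes through the role. Role/permission names are made up.

ROLE_PERMISSIONS = {
    "developer": {"read_code", "write_code"},
    "data_ops":  {"read_data", "run_pipeline"},
    "admin":     {"read_code", "write_code", "read_data",
                  "run_pipeline", "manage_users"},
}

def is_allowed(role, permission):
    """Return True if the given role grants the requested permission."""
    return permission in ROLE_PERMISSIONS.get(role, set())

print(is_allowed("developer", "write_code"))    # True
print(is_allowed("developer", "manage_users"))  # False
```

Because users are granted roles rather than raw permissions, onboarding and offboarding reduce to role assignment, which is what makes RBAC tractable at enterprise scale.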

THE DATA OPS TEAM

The data ops team will work in tandem with the data engineers to develop and maintain the DATA PIPELINES; these can follow ETL, ELT or any other data pipeline design pattern that needs to be integrated.
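The ETL pattern mentioned above can be sketched end to end in a few functions. The records, field names and in-memory "warehouse" below are invented for illustration; a real pipeline would read from and write to actual source and target systems.

```python
# Hedged ETL sketch: extract raw rows, transform (clean and retype),
# load into a target store. All data and names are illustrative.

def extract():
    """Pull raw records from a source system (here, an in-memory stub)."""
    return [{"name": " Alice ", "amount": "120.5"},
            {"name": "Bob",     "amount": "80"}]

def transform(rows):
    """Trim whitespace and convert amounts to numbers."""
    return [{"name": r["name"].strip(), "amount": float(r["amount"])}
            for r in rows]

def load(rows, target):
    """Append transformed rows into the target 'warehouse' table."""
    target.extend(rows)
    return len(rows)

warehouse = []
loaded = load(transform(extract()), warehouse)
print(f"loaded {loaded} rows")  # loaded 2 rows
```

In ELT, the same transform step would instead run inside the target warehouse after loading; the division of labour between data ops and data engineering stays the same.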

THE DESIGN TEAM

The designers will design the GUI of the application. APIs will be utilized to connect the back end with the front-end infra. The back-end integration team, along with software engineers, will integrate the OLTP and OLAP systems in order to carry out scalable transaction and analytics processing.

NOTE : High-volume businesses will need to integrate OLAP and OLTP at scale.
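The OLTP/OLAP split can be illustrated on a single in-memory SQLite database: OLTP is many small atomic writes, OLAP is one analytical query aggregating across rows. The table, columns and data are made up for the example; in production the two workloads usually run on separate, purpose-built systems.

```python
# Sketch contrasting OLTP (row-level transactional writes) with OLAP
# (aggregate analytical reads). Table and data are illustrative.
import sqlite3

conn = sqlite3.connect(":memory:")
conn.execute(
    "CREATE TABLE orders (id INTEGER PRIMARY KEY, region TEXT, amount REAL)"
)

# OLTP side: many small, individually committed transactions.
for region, amount in [("north", 10.0), ("south", 25.0), ("north", 15.0)]:
    with conn:  # each insert commits atomically
        conn.execute(
            "INSERT INTO orders (region, amount) VALUES (?, ?)",
            (region, amount),
        )

# OLAP side: one analytical query aggregating over all rows.
rows = conn.execute(
    "SELECT region, SUM(amount) FROM orders GROUP BY region ORDER BY region"
).fetchall()
print(rows)  # [('north', 25.0), ('south', 25.0)]
```

At scale, the OLTP rows would be replicated into a columnar OLAP store via the data pipelines described earlier, so analytical queries never contend with transactional traffic.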

THE TESTING TEAM

The testing team will carry out the QA and testing of the application. The testing environment and test cases will differ based on the type of testing being carried out; here is a list of the basic testing that needs to be carried out before the code hits the production stage. Testing can be bifurcated into FUNCTIONAL TESTING and NON-FUNCTIONAL TESTING.

Unit Testing | Functional Testing | End-to-End Testing | Stress Testing | System Testing | Usability Testing
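Unit testing, the smallest tier in the list above, can be sketched with Python's standard unittest module. The function under test is a made-up example, not code from any real system.

```python
# Illustrative unit test using the standard library's unittest module.
# apply_discount is a hypothetical function under test.
import unittest

def apply_discount(price, percent):
    """Return the price after applying a percentage discount."""
    if not 0 <= percent <= 100:
        raise ValueError("percent must be between 0 and 100")
    return round(price * (1 - percent / 100), 2)

class TestApplyDiscount(unittest.TestCase):
    def test_basic_discount(self):
        self.assertEqual(apply_discount(200.0, 10), 180.0)

    def test_invalid_percent_rejected(self):
        with self.assertRaises(ValueError):
            apply_discount(100.0, 150)

suite = unittest.defaultTestLoader.loadTestsFromTestCase(TestApplyDiscount)
unittest.TextTestRunner(verbosity=2).run(suite)
```

In a CI/CD setup, a suite like this would run on every pull request, so failing tests block the merge before the code ever reaches production.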

THE CLOUD OPS

All cloud-computing-based operations and processes are managed by the cloud ops team, which consists of network admins, cloud engineers and cloud architects. From cloud integration to cloud testing and deployments, all are carried out by the cloud ops team; cloud ops is also responsible for capacity planning and horizontal/vertical scaling of the cloud infra.
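A horizontal-scaling policy of the kind cloud ops encodes can be sketched as a simple threshold rule. The CPU thresholds and instance bounds below are illustrative assumptions, not values from any particular cloud provider's autoscaler.

```python
# Hedged sketch of a horizontal-scaling rule: scale out when average
# utilization is high, scale in when low. All thresholds are made up.

def desired_instances(current, avg_cpu, scale_out_at=0.75, scale_in_at=0.25,
                      min_instances=2, max_instances=20):
    """Return the instance count after applying a threshold policy."""
    if avg_cpu > scale_out_at:
        current += 1   # horizontal scale-out: add a replica
    elif avg_cpu < scale_in_at:
        current -= 1   # horizontal scale-in: remove a replica
    # Clamp to the capacity-planning floor and ceiling.
    return max(min_instances, min(max_instances, current))

print(desired_instances(4, 0.90))  # 5 (scale out under load)
print(desired_instances(4, 0.10))  # 3 (scale in when idle)
print(desired_instances(2, 0.10))  # 2 (never below the floor)
```

Vertical scaling, by contrast, would change the size of each instance rather than the count; real autoscalers also add cooldown periods so the policy does not oscillate.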

THE NETWORK ADMIN AND SYSTEM ADMIN

The network admin will manage the network configs, updates, upgrades and operations. The system admin will administer the systems in place, managing the on-prem software and hardware systems in order to ensure service levels and keep both the software and hardware systems in the best condition.

A large- to mid-scale SDLC can involve a mid- to large-sized team working on the same production code base while handling the core and supporting operations in a logical and abstracted fashion. The project, at both large and mid scale, is managed by project managers and senior managers.

The article above was rendered by integrating the outputs of 1 HUMAN AGENT and 3 AI AGENTS, an amalgamation of HGI and AI to serve technology education globally.

(Article By : Himanshu Nair)