About Me
Professional Summary
- I am an AWS Certified Solutions Architect – Professional with over 14 years of experience in AWS Cloud; more than 5 years of hands-on experience with Git (distributed version control), Terraform (infrastructure as code), Ansible (configuration management), Docker (containerization), Kubernetes (container orchestration), Jenkins (continuous integration), and Maven (build automation); and over 2 years of experience with Hadoop/Big Data and Airflow.
- I have designed, built, and deployed numerous applications using a wide range of AWS services, including EC2, S3, IAM, SNS, VPC, SES, CloudWatch, EFS, Elastic Load Balancer, Route53, Auto Scaling, CloudTrail, AWS Glue, Athena, Redshift, and the AWS CLI.
- I have experience implementing CI/CD pipelines, strong interpersonal and communication skills, and the ability to engage with clients and team members.
- I am committed, results-driven, and eager to learn new technologies and take on challenging tasks. I assist peers with technical and functional clarifications to ensure timely product delivery, and I review tasks functionally and technically to improve accuracy and reduce rework.
- I report daily task statuses to all stakeholders, including clients, and engage in exploring new technologies and sharing insights with the team.
- I am a Wipro-certified Solution Architect and an AWS Community Builder specializing in Serverless.
Technical Experience
- Hands-on experience with EC2, S3, IAM, VPC, SES, SNS, CloudWatch, CloudTrail, Elastic Load Balancer, Auto Scaling, EFS, Glue, CloudFront, VPC Flow Logs, and Route53.
- Well versed in EC2, including launching Windows and Linux instances, and in all five types of EBS (Elastic Block Store) volumes.
- Launched all three kinds of load balancers, attached them to web servers, and used Auto Scaling extensively to provide high availability for EC2 machines.
- Good knowledge of creating snapshots, volumes, and AMIs; attaching and detaching volumes; and building custom AMIs to replicate the same environment across availability zones and regions.
- Hands-on experience with S3 buckets, configured for durability and security.
- Familiar with ACLs (access control lists), bucket policies, and Transfer Acceleration.
- Good understanding of the different storage classes/tiers and effective use of lifecycle management.
- Hands-on experience with IAM roles to access AWS resources without long-term credentials, and with enabling MFA (multi-factor authentication).
- Hands-on experience with Route53, including purchasing domain names from AWS as well as from other registrars, and creating record sets to provide alias names for load balancers' DNS names.
- Well versed in creating all five routing policies, and specifically configured health checks for the failover routing policy.
- Hands-on experience in creating and managing VPCs.
- Used SNS (Simple Notification Service) extensively at the Auto Scaling and Route53 levels to receive notifications on server and VPC failures.
- Configured CloudWatch to monitor AWS resources, raise alarm alerts, maintain high availability, and reduce downtime.
- Well versed in provisioning AWS resources using CloudFormation and Terraform.
- Hands-on experience with AWS Lambda, AWS Glue, and AWS Glue Studio.
- Well versed in creating AWS Redshift clusters, running SQL queries, and optimizing queries.
- Experience creating bases and tables in Airtable using Python.
- Experience creating DDLs in Redshift and Athena using Python scripts in AWS Lambda functions.
Work
Projects Accomplished:
Project: APDP
Client: Apple
Technology Stack: AWS Cloud, DevOps, Kafka, Hadoop, Airflow
Responsibilities:
- Responsible for setting up an EKS cluster for running Kubernetes containers.
- Responsible for setting up Kafka pipelines using Terraform.
- Responsible for setting up and monitoring Airflow DAGs.
- Responsible for setting up Oracle-S3, S3-Oracle, HDFS-S3, and S3-HDFS jobs.
- Responsible for troubleshooting issues and providing solutions to users while on call.
- Responsible for setting up SONAR/OWASP.
- Responsible for setting up Kerberos authentication.
- Automated the whole jobs catalogue into a single dashboard using Python.
- Automated self-service deployment for different types of jobs using Python.
- Automated SONAR/OWASP pipeline status reporting as a dashboard.
- Automated service restarts in case of failures.
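The failure-driven restart automation can be sketched roughly as follows (the service names are hypothetical, and the real job drew its list from the team's own catalogue):

```python
import subprocess

# Hypothetical service list; the production catalogue came from config.
SERVICES = ["airflow-scheduler", "kafka-connect"]


def needs_restart(status_output: str) -> bool:
    """Decide from `systemctl is-active` output whether to restart."""
    return status_output.strip() not in ("active", "activating")


def check_and_restart(service: str, run=subprocess.run) -> bool:
    """Restart a systemd unit when it is not active; True if restarted."""
    probe = run(["systemctl", "is-active", service],
                capture_output=True, text=True)
    if needs_restart(probe.stdout):
        run(["sudo", "systemctl", "restart", service])
        return True
    return False
```

Injecting `run` makes the restart decision testable without touching systemd.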
Project: Dalton
Client: Apple, USA
Technology Stack: AWS Cloud & DevOps
Responsibilities:
- Pulled data from Teradata, McQueen, and Airtable into S3 buckets using AWS Lambda and SQS.
- Pushed data from the Singular platform into S3 buckets.
- Worked extensively with AWS Glue and created several Glue jobs to cleanse and transform data.
- Created DDLs in AWS Redshift and Athena using Lambda.
- Configured a Redshift cluster to store the processed data.
- Configured an SMTP service in AWS Glue to send mail over port 25.
- Created a dashboard in Airtable using Lambda to show the status of processed data.
- Deployed AWS services such as S3, Lambda, SQS, SNS, Glue, and Redshift using CloudFormation templates.
- Set up CI/CD pipelines using RIO.
- Set up an automation test framework using a Docker image.
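A minimal sketch of the SQS-to-S3 landing pattern described above (the bucket name and key layout are assumptions, not the production values):

```python
import json
from datetime import datetime, timezone


def object_key(source: str, message_id: str, now=None) -> str:
    """Partition landing keys by source and ingestion date (assumed layout)."""
    now = now or datetime.now(timezone.utc)
    return f"landing/{source}/{now:%Y/%m/%d}/{message_id}.json"


def lambda_handler(event, context):
    # Triggered by SQS; writes each record body to the landing bucket.
    import boto3
    s3 = boto3.client("s3")
    for record in event["Records"]:
        body = json.loads(record["body"])
        s3.put_object(
            Bucket="my-landing-bucket",  # hypothetical bucket name
            Key=object_key(body["source"], record["messageId"]),
            Body=record["body"].encode("utf-8"),
        )
    return {"batchItemFailures": []}
```

Date-partitioned keys keep the landing zone friendly to downstream Glue crawlers and Athena partitions.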
Project: Dexture
Client: Citigroup
Technology Stack: AWS Cloud & DevOps
Responsibilities:
- Launched EC2 instances and configured Auto Scaling with web servers to provide high availability.
- Worked extensively with IAM user and group accounts, granting limited privileges to individual users and groups to keep expenditure low and security high.
- Well versed in IAM roles, enabling multi-factor authentication, and all password policies.
- Hands-on experience with S3 buckets: ACLs, enabling versioning at the bucket level to recover objects, and Object Lock to protect objects from accidental deletion.
- Enabled CRR (cross-region replication) to keep an exact copy of data in another region's bucket and to replicate the same environment in another region.
- Used Transfer Acceleration extensively while migrating data to S3 buckets.
- Worked with Route 53, including purchasing domain names from AWS as well as from other registrars.
- Created record sets to map the load balancer's unwieldy DNS name to a user-friendly domain name.
- Integrated SNS at the Auto Scaling and Route 53 levels to receive notifications.
- Worked with Chef cookbooks and recipes to automate infrastructure as code (IaC).
- Good experience with all kinds of Dockerfile instructions.
- Worked extensively with Docker images: pulling images from Docker Hub, and creating images from Dockerfiles as well as from running containers.
- Well experienced in creating containers from Docker Hub images as well as from our own images.
- Experienced with volume mappings and port mappings.
- Set up a Docker registry server.
- Used CloudWatch monitoring extensively and configured CloudWatch alarms to take the necessary actions whenever required, maintaining high availability by reducing downtime.
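The versioning and cross-region replication setup can be sketched with boto3 along these lines (the role and destination bucket ARNs are placeholders; versioning must be enabled on both buckets before CRR works):

```python
def versioning_request(bucket: str) -> dict:
    """Arguments for put_bucket_versioning; CRR requires versioning on."""
    return {
        "Bucket": bucket,
        "VersioningConfiguration": {"Status": "Enabled"},
    }


def replication_rule(role_arn: str, dest_bucket_arn: str) -> dict:
    """A minimal cross-region replication configuration (illustrative ARNs)."""
    return {
        "Role": role_arn,
        "Rules": [{
            "ID": "crr-all-objects",
            "Status": "Enabled",
            "Priority": 1,
            "Filter": {},  # empty filter replicates every object
            "DeleteMarkerReplication": {"Status": "Disabled"},
            "Destination": {"Bucket": dest_bucket_arn},
        }],
    }


def enable_crr(bucket: str, role_arn: str, dest_bucket_arn: str) -> None:
    import boto3
    s3 = boto3.client("s3")
    s3.put_bucket_versioning(**versioning_request(bucket))
    s3.put_bucket_replication(
        Bucket=bucket,
        ReplicationConfiguration=replication_rule(role_arn, dest_bucket_arn),
    )
```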
Project: Stratos
Client: Sciquest, USA
Technology Stack: AWS Cloud & DevOps
Responsibilities:
- Took the lead in migrating servers and data from an on-premises data centre to AWS Cloud.
- Responsible for complete administration of the cloud infrastructure in my organization.
- Created a VPC from scratch and defined IP ranges in the VPC for better control over it.
- Created public and private subnets to properly segregate web servers and database servers for high security.
- Migrated all object storage to S3 buckets, and created an IAM role with S3 full access attached to EC2 instances to access data without credentials.
- Configured SNS (Simple Notification Service) at the Auto Scaling and Route53 levels to receive notifications, mainly on server and VPC failures.
- Configured Auto Scaling groups and launch configurations.
- Launched web servers in public subnets through Auto Scaling and connected them to the load balancer.
- Provided internet access to the public subnets through internet gateways and route tables.
- Launched DB servers in private subnets and provided internet access securely through a NAT server.
- Opened the MySQL port on the database servers to all the public subnets.
- Launched a bastion/jump server in the public subnets for SSH access to the DB servers in the private subnets.
- Enabled NACLs (network access control lists) at both public and private subnet levels to allow and restrict ports for high security.
- Installed and configured Git and GitHub in the organization.
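A rough boto3 sketch of the VPC and subnet bootstrapping described above (CIDR ranges are illustrative; internet gateways, NAT, and route tables would be created with the same client calls):

```python
import ipaddress


def subnet_plan(vpc_cidr: str, new_prefix: int = 24) -> dict:
    """Split a VPC CIDR into public/private subnet CIDRs (first two blocks)."""
    nets = list(ipaddress.ip_network(vpc_cidr).subnets(new_prefix=new_prefix))
    return {"public": str(nets[0]), "private": str(nets[1])}


def build_vpc(vpc_cidr: str = "10.0.0.0/16"):
    # Illustrative only: creates the VPC plus one public and one private subnet.
    import boto3
    ec2 = boto3.client("ec2")
    vpc_id = ec2.create_vpc(CidrBlock=vpc_cidr)["Vpc"]["VpcId"]
    plan = subnet_plan(vpc_cidr)
    for name, cidr in plan.items():
        ec2.create_subnet(VpcId=vpc_id, CidrBlock=cidr)
    # An internet gateway, NAT gateway, and route tables would follow here.
    return vpc_id, plan
```

Computing the subnet plan separately makes the IP layout reviewable before anything is created.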
Project: T-Mobile
Client: T-Mobile, USA
Technology Stack: AWS Cloud
Responsibilities:
- Understand client requirements, propose solutions, and ensure delivery.
- Experience designing and developing comprehensive cloud computing solutions on the AWS platform.
- Experience with EC2 Auto Scaling, EBS, S3, ELB, and RDS.
- Implemented AWS infrastructure services (IaaS) such as EC2, VPC, ELB, EBS, S3, and RDS.
- Set up new server (EC2) instances/services in AWS, configured security groups, and set up Elastic IPs.
- Monitored the health of Amazon EC2 instances and other AWS services.
- Managed hosted zones and the domain naming service using Route53.
- Worked in close collaboration with development teams, product management, enterprise architects, operations, executives, and finance teams to achieve optimum-performance, low-cost solutions.
- Interfaced with various levels of executives, management, and technical staff of customers.
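The Route53 hosted-zone work can be illustrated with a boto3 alias-record upsert (the zone ID and DNS names below are placeholders, not real values):

```python
def alias_change(domain: str, elb_dns: str, elb_zone_id: str) -> dict:
    """UPSERT change batch mapping a friendly name to an ELB's DNS name."""
    return {
        "Changes": [{
            "Action": "UPSERT",
            "ResourceRecordSet": {
                "Name": domain,
                "Type": "A",
                "AliasTarget": {
                    "HostedZoneId": elb_zone_id,  # the ELB's zone, not yours
                    "DNSName": elb_dns,
                    "EvaluateTargetHealth": True,
                },
            },
        }]
    }


def upsert_alias(hosted_zone_id, domain, elb_dns, elb_zone_id):
    import boto3
    boto3.client("route53").change_resource_record_sets(
        HostedZoneId=hosted_zone_id,
        ChangeBatch=alias_change(domain, elb_dns, elb_zone_id),
    )
```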
Work Experience
• Currently working as a Technical Lead/AWS Engineer at Wipro since January 2022.
• Worked as a Senior AWS Engineer at WEB-II-TECH from August 2018 to December 2021.
• Worked as a Cloud Support Engineer L2 at WEB-II-TECH from April 2015 to July 2018.
• Worked as a Cloud Support Engineer L1 at WEB-II-TECH from January 2013 to March 2015.
• Worked as a Cloud Trainer at WEB-II-TECH from March 2009 to December 2012.
Contact
LET’S TALK.
Please feel free to contact me for any assistance or recruitment.