Job Description
Software Developer is needed to perform the following duties:
· Create business models, logical specifications and/or user requirements to develop solutions for the application environment.
o Create a Lambda deployment function and configure it to receive events from an S3 bucket.
o Convert existing Terraform modules that have version conflicts to use CloudFormation templates during deployments.
o Create stacks in AWS and update the Terraform scripts regularly as requirements change.
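A minimal sketch of the Lambda side of this S3 event wiring (bucket and object names are hypothetical, and the actual deployment step is elided; the notification configuration itself lives on the S3 bucket):

```python
import json

def lambda_handler(event, context):
    """Handle S3 ObjectCreated events delivered to the Lambda function."""
    results = []
    for record in event.get("Records", []):
        bucket = record["s3"]["bucket"]["name"]
        key = record["s3"]["object"]["key"]
        # Deployment logic for the uploaded artifact would go here.
        results.append({"bucket": bucket, "key": key})
    return {"statusCode": 200, "body": json.dumps(results)}

# Sample S3 event as delivered by an event notification,
# trimmed to the fields used above.
sample_event = {
    "Records": [
        {"s3": {"bucket": {"name": "deploy-artifacts"},
                "object": {"key": "builds/app-1.2.3.zip"}}}
    ]
}

print(lambda_handler(sample_event, None))
```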
· Design software applications, create system procedures and ensure that the developed applications function normally.
o Design and create multiple deployment strategies using Continuous Integration and Continuous Deployment (CI/CD) pipelines and configuration-management tools with remote execution, ensuring zero downtime and shortened deployment cycles via automated deployments.
o Automate, configure, migrate, and deploy instances in AWS cloud environments using services including ECS, EKS, CloudFormation, EFS, RDS, S3, ELB, ALB, IAM, CloudWatch, Elastic IPs, EC2, Glacier, and Route 53.
o Administer Production, Development, and Test environments running Windows, Ubuntu, Red Hat Linux, SUSE Linux, and CentOS; support a Chef environment with 200+ servers, including developing manifests.
· Develop, publish, and present innovative, insightful, and actionable research on DevOps topics, such as managing the deployment of AI models using Amazon Bedrock.
· Ensure that the CI/CD pipeline supports rapid deployment of GenAI agents. This includes writing well-researched, concise reports, generating forecasts, and presenting findings to clients.
· Develop, enhance and maintain the build, deployment and configuration for Continuous Integration and Continuous Deployment (CI/CD) Pipeline.
· Implement and modify programs; make approved changes by amending flow charts, developing detailed programming logic, and coding changes.
· Implement a serverless architecture using API Gateway, Lambda, and DynamoDB, and deploy AWS Lambda code from Amazon S3 buckets.
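A minimal sketch of a Lambda handler behind an API Gateway proxy integration in such a serverless stack (field names are hypothetical; the DynamoDB write is stubbed out so the sketch stays self-contained):

```python
import json

def handler(event, context):
    """API Gateway (proxy integration) handler: parse the request body and
    build the item that a real deployment would write to DynamoDB."""
    body = json.loads(event.get("body") or "{}")
    item = {"id": body.get("id", "unknown"), "payload": body}
    # A real function would call DynamoDB's put_item here (e.g. via boto3).
    return {"statusCode": 200,
            "headers": {"Content-Type": "application/json"},
            "body": json.dumps(item)}

resp = handler({"body": json.dumps({"id": "42", "name": "widget"})}, None)
print(resp["statusCode"], resp["body"])
```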
· Ensure the availability and performance of AI-driven features by automating model retraining and updates using AWS services.
· Work extensively on Ansible deployments; write Ansible playbooks with multiple roles, tasks with loops, templates, service management, host variables, group variables, etc.
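A short sketch of the kind of playbook this describes (host group, variables, and file paths are hypothetical), showing a loop, a template, and service management:

```yaml
# Hypothetical playbook: hosts, packages, and paths are illustrative.
- name: Deploy application servers
  hosts: app_servers
  vars:
    packages:
      - nginx
      - git
  tasks:
    - name: Install required packages (task with a loop)
      ansible.builtin.package:
        name: "{{ item }}"
        state: present
      loop: "{{ packages }}"

    - name: Render the service config from a template
      ansible.builtin.template:
        src: app.conf.j2
        dest: /etc/app/app.conf

    - name: Ensure the service is running (service management)
      ansible.builtin.service:
        name: nginx
        state: started
        enabled: true
```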
· Participate in scrum meetings and coordinate with Business Analysts to understand the business needs and implement the same into a functional design.
· Deploy machine learning models to production in a scalable, secure environment.
· Maintain data pipelines for real-time and batch processing. Ensure high availability and performance of AI/ML workflows. Collaborate with data scientists to streamline model deployment.
· Automate and configure HashiCorp Vault in an EKS cluster to store secrets, with authentication configured for LDAP, IAM, and Kubernetes.
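A sketch of the Vault CLI steps behind this setup (paths and the Kubernetes host URL are illustrative; real automation would typically drive these via Terraform or scripts):

```shell
# Enable the auth methods referenced above on the Vault running in EKS.
vault auth enable ldap
vault auth enable aws         # IAM-based authentication
vault auth enable kubernetes

# Point Kubernetes auth at the cluster's API server so service-account
# tokens can be verified.
vault write auth/kubernetes/config \
    kubernetes_host="https://kubernetes.default.svc"

# Enable a KV v2 secrets engine for application secrets.
vault secrets enable -path=secret kv-v2
```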
· Manage source code, prepare test data, tests and debug programs; revise and refine programs to improve performance of the application software.
· Set up baselines, branching, merging, and automation processes using Shell, Python, and Bash scripts.
· Set up and manage CI/CD pipelines for deploying Amazon Bedrock-based GenAI agents and models.
· Automate EKS cluster provisioning with Terraform, creating the EKS cluster and an autoscaling group of worker nodes, with Grafana, Prometheus, Alertmanager, and the Kubernetes Dashboard running on top of it.
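A minimal Terraform sketch of this provisioning step (cluster name, role references, and sizes are hypothetical; the monitoring stack would be layered on afterwards, e.g. via Helm):

```hcl
# Hypothetical names and variables; a sketch, not a production module.
resource "aws_eks_cluster" "main" {
  name     = "demo-eks"
  role_arn = aws_iam_role.eks_cluster.arn

  vpc_config {
    subnet_ids = var.subnet_ids
  }
}

# Managed node group: EKS creates the autoscaling group of workers.
resource "aws_eks_node_group" "workers" {
  cluster_name    = aws_eks_cluster.main.name
  node_group_name = "demo-workers"
  node_role_arn   = aws_iam_role.eks_nodes.arn
  subnet_ids      = var.subnet_ids

  scaling_config {
    desired_size = 2
    min_size     = 1
    max_size     = 4
  }
}
```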
· Perform execution of functional test plan, validate test results, prepare documentation & data for analysis.
· Build a continuous integration and continuous deployment system with Jenkins on a Kubernetes container environment, utilizing Kubernetes and Docker as the runtime environment to build, test, and deploy.
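A sketch of a declarative Jenkins pipeline for this build/test/deploy flow (pod template file, image names, and manifest paths are hypothetical):

```groovy
// Hypothetical pipeline; stage contents are illustrative.
pipeline {
  // Run the build in a pod on the Kubernetes cluster.
  agent { kubernetes { yamlFile 'build-pod.yaml' } }
  stages {
    stage('Build') {
      steps { sh 'docker build -t app:${BUILD_NUMBER} .' }
    }
    stage('Test') {
      steps { sh 'docker run --rm app:${BUILD_NUMBER} ./run-tests.sh' }
    }
    stage('Deploy') {
      steps { sh 'kubectl apply -f k8s/deployment.yaml' }
    }
  }
}
```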
· Provide critical analysis under pressure and ramp up quickly on both new and existing technologies.
· Optimize cloud infrastructure to support scalable and secure deployment of AI models.
A Bachelor's Degree is required in Computer Science, Computer Engineering, Computer Information Systems, or Information Technology.
Job Type: Full-time
Schedule:
- Monday to Friday
Education:
- Bachelor's (Preferred)
Work Location: In person
Visit Original Source:
http://www.indeed.com/viewjob