Data Solutions Architect (Any Cloud: AWS/Azure/Google Cloud)

OpsGuru, a Carbon60 Company
Canada

Job Description

OpsGuru is a global engineering and consulting group. We are experts in the container ecosystem, data processing and analytics, and cloud-native technologies. Our team is made up of network, data, security, and DevOps specialists, as well as application developers. OpsGuru empowers customers with technology to solve their business problems and provides the tools to ensure success in their digital transformation.

OpsGuru's value to our customers centers on our ability to provide deep technical guidance based on their business needs. We achieve this by assigning small, virtual teams of highly skilled individuals to each client. Within these teams, the Data Solutions Architect is responsible for providing technical expertise and leadership to Data and Cloud Engineers, while maintaining a systems view that reconciles technical decisions with broader project goals. Data Solutions Architects work alongside our Principal Consultants to ensure our project deliverables meet stakeholders' needs while upholding OpsGuru's standards for quality and operational maturity.

Roles and Responsibilities

  • Provide deep technical expertise and leadership across a range of cloud data technologies. You will be the go-to person for driving tool selection, resolving complex engineering issues, and guiding best practices during engagements.
  • Lead the design and implementation of data platforms to meet customers' business requirements.
  • Lead whiteboard design sessions with internal and external team members.
  • Identify and communicate technical risks as they emerge over the course of a project.
  • Work closely with Principal Consultants to extract project requirements during technical discovery sessions, define deliverables to meet those requirements, and break down those deliverables into a technical roadmap.
  • Lead teams of data and cloud engineers to execute project roadmaps. Provide guidance on technical tasks and priorities, and hands-on technical assistance when needed.
  • Manage scope within customer engagements. Identify changing requirements as they arise, determine their impact on scope, and ensure all stakeholders are aware and agree with the changes.
  • Maintain a close working relationship with the customer as a "Trusted Advisor". Set clear expectations, challenge assumptions, solicit feedback, and take ownership of project deliverables.
  • Build complex ETL code using technologies such as Spark, NiFi, and Glue.
  • Build real-time streaming solutions leveraging technologies such as Kafka, Kinesis, Pub/Sub, and Spark Streaming.
  • Develop code in Python, Java, and Scala.
  • Create complex data solutions and build data pipelines.
  • Establish credibility and build impactful relationships with our customers.
  • Maintain relevant certifications on cloud technologies and stay informed of key industry trends.

Qualifications & Experience

  • 3+ years of experience in public cloud environments (AWS, GCP, or Azure).
  • 3+ years of experience developing code in Python, Java, or Scala
  • Experience building complex ETL code using technologies such as Spark, NiFi, and Glue
  • Experience building real-time streaming solutions leveraging technologies such as Kafka, Kinesis, Pub/Sub, or Spark Streaming
  • Strong SQL skills
  • Experience creating complex data solutions and building data pipelines
  • Ability to establish credibility and build impactful relationships with customers
  • Strong communication skills, written and verbal

Job Types: Full-time, Permanent

Salary: $110,000.00-$150,000.00 per year

Experience:

  • At least one non-shell scripting language: 1 year (preferred)
  • Technical leadership: 2 years (preferred)
  • Advanced knowledge of Microsoft Azure: 2 years (preferred)
  • Advanced knowledge of AWS: 2 years (preferred)
  • Advanced knowledge of Google Cloud: 2 years (preferred)
  • Building complex ETL code using Spark, NiFi, or Glue: 1 year (required)
  • Kafka, Kinesis, Pub/Sub, or Spark Streaming: 1 year (preferred)
  • Building data pipelines: 1 year (preferred)
  • Data lakes: 1 year (preferred)
  • Developing code in Python, Java, or Scala: 2 years (required)
  • Public or private cloud environments: 3 years (preferred)

Source

https://ca.indeed.com/jobs
