Job Description
Chippewa Government Solutions, a subsidiary of Sault Tribe, Inc. ("STI Federal"), is part of a growing federal contracting division established to drive economic growth and improve the quality of life for the Sault Ste. Marie Tribe of Chippewa Indians. Founded in 2020, STI Federal operates as the independent business arm of the Sault Tribe, creating opportunities to expand Tribal enterprises and enhance the lives of its members. Through STI Federal, we strive to lead impactful federal contracting efforts, fostering meaningful growth and delivering exceptional service to our government partners.
We have an opportunity for you to use your analytical skills in support of the Department of Veterans Affairs (VA) Veterans Health Administration (VHA) to improve the way healthcare is delivered and to extend access to quality healthcare for Veterans nationwide.
Role Overview:
With the abundance of structured and unstructured data, you understand the importance of transforming complex data sets into useful information to solve challenges. As a data engineer, you'll work closely with your team and clients to plan, design, and implement various data processes, including extract, transform, load (ETL) processes for transactional systems and analytical environments, as well as data migration processes that leverage Microsoft Azure services.
Key Responsibilities:
- Design and implement ETL/ELT processes for structured and unstructured data.
- Develop scalable data pipelines using Azure technologies and Databricks.
- Collaborate with teams to optimize data architecture and storage solutions.
- Conduct data migration processes ensuring quality and security.
- Provide technical support and scripting for converting raw data into actionable formats.
- Maintain documentation for data workflows and best practices.
Minimum Qualifications:
- Bachelor’s degree
- 4+ years of experience with custom or structured ETL design, implementation, and maintenance
- 3+ years of experience in developing and deploying data ingestion, processing, and distribution systems on and with Azure technologies and/or Databricks
- 2+ years of experience with Python libraries, including PySpark
- Experience with data modeling, data lake, or data warehousing
- Experience with developing scalable ETL/ELT workflows for reporting and analytics
- Experience creating solutions within a collaborative, cross-functional team environment
- Ability to develop scripts and programs for converting various types of data into usable formats, and to support the project team in scaling, monitoring, and operating data platforms
- Ability to obtain and maintain a Public Trust determination
Preferred Qualifications:
- Experience with VHA data
- Experience with cloud platforms, including AWS or Azure
- Experience with data architecture design, with a focus on end-to-end enterprise solutions
- Experience building and managing a medallion architecture
- Experience with Delta and/or Parquet file formats
- Experience with automating, managing, and monitoring data pipeline operations
- Experience with Agile engineering practices
Salary Range: The salary range for this position is $125,000 - $160,000. Final compensation will be determined based on the successful candidate’s experience, education, and skills relevant to the role.
STI Federal is an Equal Employment Opportunity Employer. All qualified applicants will receive consideration for employment and will not be discriminated against on the basis of age, disability, ethnicity/race, national origin, religion, gender, gender identity, sexual orientation, or veteran status.
STI Federal participates in the Electronic Employment Verification Program. Please click the E-Verify link for more information: https://www.e-verify.gov/sites/default/files/everify/posters/EVerifyParticipationPoster.pdf