Job Description
Overview:
At Cotiviti, we are custodians of data for our clients. Using their technical experience in ETL processes, Data Engineers ensure that operational functions run as expected. This includes, but is not limited to, managing data implementations, data integrations, data production, data quality, and data security.
Responsibilities:
- Create, maintain, and execute intermediate to advanced Spark scripts for data management, data validation, and data integration.
- Create, maintain, and execute basic to intermediate SQL scripts for data management and data validation.
- Optimize queries to improve the efficiency of daily tasks.
- Perform data analysis and identify any issues.
- Work with other groups such as Engineering, DBA, and Cloud Ops to troubleshoot and resolve any environmental or network issues that impact your work. Extend your support to after-hours or weekends as needed.
- Create and maintain data pipelines as needed.
- Validate task results to ensure that all requirements are met.
- Adhere to all industry-level and organization-level compliance rules and regulations to maintain data integrity.
- Complete individual productivity tracking.
- Complete task assignments in the department ticketing system within assigned deadlines.
- Achieve organizational and individual goals as identified in performance reviews and goal setting exercises.
- Complete all special projects and other duties as assigned.
- Must be able to perform duties with or without reasonable accommodation.
This job description is intended to describe the general nature and level of work being performed and is not to be construed as an exhaustive list of responsibilities, duties and skills required. This job description does not constitute an employment agreement and is subject to change as the needs of Cotiviti and requirements of the job change.
Qualifications:
- Bachelor’s degree in Computer Science, Information Technology or equivalent work experience.
- 3+ years of working knowledge of big data technologies (Spark, S3, Kafka, Ray, Hadoop, etc.)
- 3+ years of working knowledge of cloud (AWS, Azure, GCP, OCI etc.)
- 3+ years of working knowledge of RDBMS (Oracle, MS SQL, Vertica, etc.) and experience using SQL, PL/SQL, or other data integration/ETL tools.
- 3+ years of data analysis experience, preferably in the healthcare industry with enrollment, medical claims, and/or pharmacy claims data.
- Proficient in Microsoft Office Suite applications: PowerPoint, Word, Excel, and Outlook.
- Ability to work a flexible schedule.
- Experience with project management tools like JIRA.
- Familiarity with Databricks and/or Snowflake environments is a plus.
Mental Requirements:
- Strong analytical skills.
- Excellent verbal, listening and written communication skills.
- Ability to multitask and prioritize projects to meet scheduled deadlines and tight turnaround times.
- Ability to work well independently or in a team environment.
Working Conditions and Physical Requirements:
- Remaining in a stationary position, often standing or sitting for prolonged periods.
- Repeating motions that may include the wrists, hands and/or fingers.
- Must be able to provide a dedicated, secure work area.
- Must be able to provide high-speed internet access/connectivity and maintain an office setup.