72626: Columbus, OH – IT – MCD – Technical Specialist 3/TS3

Job Title: Technical Specialist 3/TS3
NFP Rate: $59.80
Will close to submissions: Friday 6/26/2020 at 10:00am EST
Skype interviews

SCOPE OF WORK SUMMARY
The Technical Specialist will be responsible for Medicaid Enterprise data warehouse design, development, implementation, migration, maintenance, and operation activities. The candidate will work closely with the Data Governance and Analytics team and will be one of the key technical resources for various Enterprise data warehouse projects, building critical data marts and ingesting data into the Big Data platform for analytics and exchange with State and Medicaid partners. This position is a member of Medicaid ITS and works closely with the Business Intelligence & Data Analytics team.
Responsibilities:
• Participate in team activities, design discussions, stand-up meetings, and planning reviews with the team.
• Perform data analysis, data profiling, data quality checks, and data ingestion across various layers using database queries, Informatica PowerCenter, Informatica Analyst scorecards, PySpark programs, and UNIX shell scripts.
• Follow the organization's coding standards document; create mappings, sessions, and workflows per the mapping specification document.
• Update the production support run book and Control-M schedule document with each production release.
• Create and update design documents, providing detailed descriptions of workflows after every production release.
• Continuously monitor production data loads, fix issues, update the tracker document with those issues, and identify performance problems.
• Tune long-running ETL jobs by creating partitions, enabling bulk loads, increasing commit intervals, and applying other standard approaches.
• Perform quality assurance checks and post-load reconciliation, and communicate with the vendor to receive corrected data.
• Participate in ETL code reviews and design reusable frameworks.
• Create PySpark programs to ingest historical and incremental data.
• Create Sqoop scripts to ingest historical data from the EDW Oracle database into Hadoop IOP, and create Hive tables and Impala view creation scripts for dimension tables.
• Write complex SQL queries and perform tuning based on explain plan results.
• Extract unstructured and semi-structured data using the data processor transformation in Informatica Data Quality (IDQ).
• Participate in meetings to continuously upgrade functional and technical expertise.
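To illustrate the "ingest historical and incremental data" responsibility above, here is a minimal, hypothetical sketch of the common high-water-mark pattern for delta loads. It is plain Python (a production version would typically be a PySpark job); the names `load_increment` and `updated_at` are assumptions for illustration only, not part of any actual Medicaid ETL codebase.

```python
from datetime import datetime

def load_increment(source_rows, target_rows, watermark):
    """Append only source rows newer than the last-loaded watermark.

    source_rows / target_rows: lists of dicts with an 'updated_at' key
    (illustrative schema). Returns the new watermark after the load.
    """
    # A delta load picks up only rows changed since the previous run.
    delta = [r for r in source_rows if r["updated_at"] > watermark]
    target_rows.extend(delta)
    if delta:
        watermark = max(r["updated_at"] for r in delta)
    return watermark

# Usage: the first (historical) load takes everything; later runs take only deltas.
source = [
    {"id": 1, "updated_at": datetime(2020, 6, 1)},
    {"id": 2, "updated_at": datetime(2020, 6, 15)},
]
target = []
wm = load_increment(source, target, datetime.min)   # historical load: 2 rows
source.append({"id": 3, "updated_at": datetime(2020, 6, 20)})
wm = load_increment(source, target, wm)             # incremental load: 1 new row
```

The same idea carries over directly to a PySpark job: filter the source DataFrame on a timestamp or sequence column greater than the stored watermark, append the result to the target, and persist the new watermark.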

REQUIRED Skill Sets:
• 8 years of experience with Informatica PowerCenter on data warehousing or data integration projects
• Proven ability to write high-quality code
• 7 years of expertise implementing complex ETL logic
• 3 years of experience developing and enforcing strong reconciliation processes
• Accountable for ETL design documentation
• 5 years of strong SQL experience (Oracle preferred)
• 5 years of good knowledge of relational database, d
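As a concrete reading of the "strong reconciliation process" requirement, the sketch below compares row counts and a control total between source and target after a load, a common post-load reconciliation check. The function name `reconcile` and the `amount` field are illustrative assumptions, not a prescribed implementation.

```python
def reconcile(source_rows, target_rows, amount_key="amount"):
    """Return (ok, report) comparing row counts and summed control totals.

    source_rows / target_rows: lists of dicts carrying a numeric control
    column (here 'amount', an assumed illustrative field name).
    """
    src_count, tgt_count = len(source_rows), len(target_rows)
    src_sum = sum(r[amount_key] for r in source_rows)
    tgt_sum = sum(r[amount_key] for r in target_rows)
    # The load reconciles only if both the counts and the totals match.
    ok = (src_count == tgt_count) and (src_sum == tgt_sum)
    report = {
        "source_count": src_count, "target_count": tgt_count,
        "source_total": src_sum, "target_total": tgt_sum,
    }
    return ok, report

# Usage: a matching load passes; a dropped row fails and the report shows why.
src = [{"amount": 10.0}, {"amount": 25.5}]
tgt = [{"amount": 10.0}, {"amount": 25.5}]
ok, rpt = reconcile(src, tgt)
```

In practice the counts and totals on each side would come from SQL aggregates against the source and target tables, with mismatches logged and escalated to the data vendor, as the responsibilities above describe.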
