GrandRapidsRecruiter Since 2001
the smart solution for Grand Rapids jobs

Data Quality and Integration Developer (Remote Position)

Company: Stefanini
Location: Grand Rapids
Posted on: January 16, 2022

Job Description:

Stefanini is looking for a Data Quality and Integration Developer (Remote Role).

Summary:

We're seeking a collaborative, data-driven Application Development Analyst who is passionate about designing, implementing, and owning an enterprise-wide Data Quality solution comprised of various applications and integrations. This highly influential role will be responsible for building a Data Quality product that meets the needs of our organization and enables the use of data assets to drive value. This individual will need strong familiarity with data models and warehousing, data connections and integrations, data governance and quality, and dashboarding. The work will involve both on-premise and cloud solutions. In this role, you'll use modern data architecture tools, including cloud solutions (AWS/Snowflake) and connection/integration technologies such as Informatica, Python, Kafka, REST, Java, JSON, JDBC, and OLE DB. In addition to building a next-generation Data Quality solution, you'll work with some of the most forward-thinking teams in data and analytics.

Success:

Success will require understanding the value and criticality of establishing decision rights and accountabilities related to data management, the ability to communicate concepts and initiatives to internal and external audiences at all levels, implementing the solutions, and tracking and monitoring progress.
This Data Quality Developer will serve as a process and technology expert and support our Data Governance community as needed.

Job Requirements:

Required:

- Bachelor's degree in Computer Engineering, Computer Science, Data Science, or a related discipline
- Minimum 3-7 years of relevant experience
- Critical thinking (process, analytical, and outcomes based)
- Understanding and conveying of data models and data architecture
- Relational database experience: understanding of data modeling concepts (dimensional and normalization), knowledge of OLAP structures, SQL performance tuning
- Experience and working knowledge of programming languages such as SQL, PL/SQL, Python, and Java
- Proficiency with Informatica PowerCenter
- Experience setting up and operating ETL/ELT pipelines
- Data integration experience (using Talend, Informatica Cloud, MuleSoft, etc.)
- Understanding of different types of storage (relational, MPP, NoSQL) and experience working with different kinds of data (structured, unstructured, metrics, logs, etc.)

Preferred:

- Experience in Data Governance
- 3 years' experience with data quality and related tools like Informatica IDQ/CDQ
- Experience working with Informatica IDQ, CDQ, or similar data quality applications
- Experience using Informatica Analyst for data profiling, data mapping, data validation, data manipulation, data analysis, use cases, test cases, building rules, etc.
- Proficiency with data quality practices: profiling, cleansing, rule creation, deduplication, standardization, and address validation using Address Doctor; exception monitoring and handling
- Experience using core accelerators and Data Domains, and suggesting builds for new custom accelerators/domains for reuse
- Design and development of rules with IDQ; use of Informatica Developer, Informatica PowerExchange, Informatica Metadata Manager, and Informatica Analyst to design and develop custom objects, rules, and reference data tables
- Experience with performance optimization of IDQ mappings/workflows
- Informatica University Data Quality 10: Developer (professional certification)
- Experience deploying and monitoring data quality rules/code and reporting data quality results against data quality dimensions
- Responsibility for designing, testing, deploying, and documenting data quality procedures and their outputs
- Experience with Informatica MDM

Integrations:

- Cloud experience on Snowflake, Azure, AWS, and GCP
- API integrations
- Hands-on experience building and consuming web services from IDQ
- Event-driven message streaming experience with technologies such as Kafka, Kinesis, Event Hubs, or Pub/Sub
- Experience with Data Governance and related tools like Collibra
- 3 years' experience working with Collibra, Alation, Informatica EDC/Axon, or similar data intelligence applications
- Collibra Ranger (certification) or Collibra Solution Architect (certificate of completion)
- Administrative-level knowledge and expertise with Collibra Cloud
- Collibra workflow development experience with a Business Process Management (BPM) tool
- Understanding of REST API concepts and scripting
- Ability to work with business and technical stakeholders in capturing and modeling assets
- Previous consulting experience


