Metadata / Data Lineage Data Analyst - Contract
*This is a contract role*
BILL RATE: $140.00 - $160.00/HOUR
LOCATION: CHICAGO, IL
3 DAYS ONSITE AND 2 DAYS REMOTE
This role suits Java developers who have grown into data analysis and now focus on metadata, data flows, data mappings, and data lineage solutions. The work spans multiple data sources across on-prem and cloud environments, including Protocol Buffers, REDIS, APIs, databases, Kafka, and other data platform technologies; experience with data governance platforms is a plus. You will not be doing application programming, but you will be developing custom metadata connectors/scanners.
- Develop metadata and data lineage solutions for multiple data sources across on-prem and cloud environments, including but not limited to Kafka, Protocol Buffers, REDIS, APIs, databases, flat files, JSON, ETL/BI tools, and other data platform technologies.
- Work with technical SMEs/developers to understand application/system design and development, and create data flow diagrams and data mappings.
- Create source-to-target mapping documents by reverse engineering the application Java code, BI tools, and SQL queries for the identified data flows.
- Develop custom metadata connectors/scanners using programming tools to automate metadata extraction from disparate data sources (a minimal sketch follows this list).
- Develop programs that automate metadata extraction and the production of data flow, data lineage, and source-to-target mapping documents for complex applications, systems, and BI tools.
- Perform metadata management, administration, and support; ingest data management assets using extension mappings, custom data assets, connectors, and third-party metadata bridges, ensuring data lineage and source-to-target data mapping.
- Create, develop, configure, and execute end-to-end business and technical data lineage across disparate sources in accordance with data governance standards, policies, and procedures.
- Design and build data capabilities such as data quality, metadata, data catalog, and data dictionary.
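For illustration only, here is a minimal sketch of the kind of custom metadata scanner referenced above, assuming a Protocol Buffers source compiled to a descriptor set (e.g., via `protoc --descriptor_set_out`). The file name, function name, and output shape are hypothetical; a real connector would emit metadata in the target governance platform's ingestion format.

```python
# Minimal sketch of a custom metadata scanner for a Protocol Buffers source.
# Assumes a descriptor set produced by `protoc --descriptor_set_out=descriptors.pb`;
# the path and output structure are illustrative, not a specific platform's schema.
import json
from google.protobuf import descriptor_pb2

def scan_protobuf_descriptors(path: str) -> list[dict]:
    """Extract message- and field-level metadata from a compiled descriptor set."""
    fds = descriptor_pb2.FileDescriptorSet()
    with open(path, "rb") as fh:
        fds.ParseFromString(fh.read())

    assets = []
    for proto_file in fds.file:
        for message in proto_file.message_type:
            assets.append({
                "source": proto_file.name,
                "package": proto_file.package,
                "entity": message.name,
                "fields": [
                    {
                        "name": field.name,
                        "number": field.number,
                        "type": descriptor_pb2.FieldDescriptorProto.Type.Name(field.type),
                    }
                    for field in message.field
                ],
            })
    return assets

if __name__ == "__main__":
    # Hypothetical input file; in practice this would come from the application build.
    print(json.dumps(scan_protobuf_descriptors("descriptors.pb"), indent=2))
```

The same pattern (read a source's native schema artifact, normalize it into catalog-ready field metadata) extends to REDIS, APIs, databases, and flat files.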
SKILLS AND EXPERIENCE REQUIRED:
- 6 or more years of data analysis experience with a strong understanding of metadata, data flows, and mappings.
- Ability to understand a Java codebase and to read and/or write code in a programming language (e.g., Java, Python). Proficiency with SQL, experience working with Git, and experience with data analysis using Python/PySpark.
- Hands-on experience with Java 8 onward, Spring, Spring Boot, microservices, REST APIs, and Kafka Streams.
- Experience working with various database types, including relational, NoSQL, and object-based.
- Ability to review application development code to ensure it meets functional requirements as well as architectural and data standards.
- Proficiency in writing technical documentation for Java based applications that process data in real time and batch.
- Ability to develop and implement data quality solutions for multiple data sources across on-prem and cloud environments, including but not limited to databases, flat files, JSON, APIs, and Kafka (see the PySpark sketch after this list).
- Experience working with Protobuf, APIs, and Kafka as data sources is preferred.
- Experience with draw.io or similar tools for creating architecture or data flow diagrams.
- Ability to multitask and meet aggressive deadlines efficiently and effectively.
- Experience in object-oriented design and software design patterns.
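As a hedged illustration of the data quality point above, the following PySpark sketch profiles per-column null rates for a JSON source. The input path and the 5% threshold are assumptions for the example, not requirements of the role.

```python
# Minimal sketch of a column-level data quality check with PySpark.
# The input path and the 5% null-rate threshold are illustrative assumptions.
from pyspark.sql import SparkSession
from pyspark.sql import functions as F

spark = SparkSession.builder.appName("dq-profile-sketch").getOrCreate()

# Hypothetical JSON source; the same checks apply to CSV, Parquet, or JDBC reads.
df = spark.read.json("landing/customer_events/*.json")

total_rows = df.count()
null_counts = df.select([
    F.sum(F.when(F.col(c).isNull(), 1).otherwise(0)).alias(c) for c in df.columns
]).collect()[0].asDict()

for column, nulls in null_counts.items():
    null_rate = nulls / total_rows if total_rows else 0.0
    status = "FAIL" if null_rate > 0.05 else "PASS"
    print(f"{status}  {column}: {null_rate:.2%} null ({nulls}/{total_rows})")

spark.stop()
```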
Employers have access to artificial intelligence language tools (“AI”) that help generate and enhance job descriptions and AI may have been used to create this description. The position description has been reviewed for accuracy and Dice believes it to correctly reflect the job opportunity.
- Dice Id: napil006
- Position Id: 8434051