THE ROLE:
The Data Engineer is responsible for data modeling, design, optimization, security, and administration of ETL/ELT tools and data engineering interfaces. The role builds data interfaces with a focus on data accuracy, completeness, and availability, and will analyze source and target systems, then design, build, and test solutions that make data available to business intelligence systems, websites, mobile apps, and other applications.
HOW YOU WOULD CONTRIBUTE:
• Design, build, test, and migrate ETL/ELT systems and data interfaces.
• Analyze and design source and target system data architectures.
• Design and implement data tables, functions, views, procedures, and routines that provide accurate and complete data to BI and data applications (for example, a reporting view like the sketch following this list).
• Identify opportunities for technical innovation that add value to the platform.
• Assist in establishing and embedding data management and governance processes.
• Enhance performance in the applications environment.
• Meet service level agreements for production support response and resolution.
• Research, design, and develop technical solutions to pre-defined requirements, building components including extensions, views, customizations, modifications, reports, and workflows, independently or as part of a team.
• Follow standards for documentation, software development methodology, version control, testing, and migration.
• Develop and improve the current data architecture, emphasizing data security, data quality and timeliness, scalability, and extensibility.
• Provide technical guidance and mentoring to others in areas of expertise.
• Deploy and use various technologies and run pilots to design low-latency data architectures at scale.
• Collaborate with BI teams, business analysts, product managers, and application teams to provide data for BI, web, and mobile applications.
• Collaborate with business analysts, data scientists, product managers, and BI teams to develop, implement, and validate KPIs, statistical analyses, data profiling, prediction, forecasting, and clustering.
• Develop a collaborative environment that encourages the exchange of ideas.
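As a concrete illustration of the view-building work in this list, a data engineer might expose warehouse data to BI tools through a curated view such as the hypothetical sketch below. The fact and dimension names (fact_sales, dim_date) are assumptions for illustration only, not part of this role's actual schema.

    -- Hypothetical reporting view: joins a sales fact to a date dimension so
    -- BI tools can query revenue by month without knowing warehouse internals.
    CREATE VIEW vw_monthly_revenue AS
    SELECT d.year_month,
           SUM(f.revenue_amount)      AS total_revenue,
           COUNT(DISTINCT f.order_id) AS order_count
    FROM   fact_sales f
    JOIN   dim_date   d ON d.date_key = f.date_key
    GROUP BY d.year_month;

A view like this keeps the warehouse schema encapsulated, so the underlying tables can evolve without breaking downstream dashboards.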
QUALIFICATIONS FOR SUCCESS:
• Expert proficiency in building data pipelines and ETL/ELT processes using tools such as Informatica, ODI, and Azure Data Factory.
• Experience in administration and migration activities of data movement and transformation tools.
• Expert knowledge of advanced SQL, including stored procedures, views, functions, and indexes.
• Expert knowledge of entity-relationship diagrams (ERDs).
• Expert knowledge of dimensional modeling techniques, including slowly changing dimensions (types 1-4), facts, dimensions, and aggregations (see the first sketch after this list).
• Proficiency with relational and non-relational data platforms such as OLTP databases, MPP appliances, NoSQL stores, DaaS, and cloud databases.
• Proficiency in data movement techniques such as replication, switching, and pipelines.
• Proficiency in data integrity, profiling, storage, and mining techniques.
• Proficiency in change data capture (CDC) techniques, including timestamp-, log-, and trigger-based mechanisms (see the second sketch after this list).
• Experience working with large and small data sets of all types: structured, semi-structured, and unstructured.
• Proficiency in working with scheduling tools.
• Proficiency in working with change management tools and processes, including source control, versioning, defect tracking, and release management.
• Proficiency in analyzing the impact of both small- and large-scale initiatives.
• Good working knowledge of Unix/Linux/PowerShell scripting.
• Ability to manage multiple priorities.
• Excellent written and verbal communication skills.
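To illustrate the slowly-changing-dimension techniques referenced above, the following sketch shows a type 2 update in generic ANSI SQL, which preserves history by closing out the current row and inserting a new version. The table and column names (dim_customer, stg_customer, and so on) are hypothetical, and the staging table is assumed to hold only changed rows.

    -- Minimal type 2 SCD sketch (hypothetical schema): expire the current
    -- version of each customer that changed in staging...
    UPDATE dim_customer
    SET    row_end_date = CURRENT_DATE,
           is_current   = 0
    WHERE  is_current = 1
      AND  customer_id IN (SELECT customer_id FROM stg_customer);

    -- ...then insert the new version so full history is preserved.
    INSERT INTO dim_customer (customer_id, customer_name, segment,
                              row_start_date, row_end_date, is_current)
    SELECT customer_id, customer_name, segment,
           CURRENT_DATE, DATE '9999-12-31', 1
    FROM   stg_customer;

A type 1 change, by contrast, would simply overwrite the attribute in place and keep no history.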
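The timestamp-based change data capture mechanism mentioned above can likewise be sketched as an incremental extract driven by a watermark; src_orders and etl_watermark are assumed names for illustration.

    -- Timestamp-based CDC sketch (hypothetical tables): pull only the rows
    -- modified since the last successful load.
    SELECT o.*
    FROM   src_orders o
    WHERE  o.last_modified > (SELECT last_loaded_at
                              FROM   etl_watermark
                              WHERE  table_name = 'src_orders');

    -- After the load commits, advance the watermark for the next run.
    UPDATE etl_watermark
    SET    last_loaded_at = CURRENT_TIMESTAMP
    WHERE  table_name = 'src_orders';

Log-based CDC reads changes from the database transaction log instead, and trigger-based CDC captures them into a shadow table via insert/update/delete triggers; both avoid depending on a reliable last-modified column.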
EXPERIENCE:
• 5+ years' experience working with data in data mart, data warehouse, and OLTP environments.
• 5+ years' experience working with ETL/ELT and data movement tools.
EDUCATION REQUIRED:
• Bachelor's degree in information technology, computer science, or a related field.