Job Summary
We are seeking a skilled Data Engineer to design, develop, and maintain robust ETL/ELT data pipelines that support business analytics and reporting. The ideal candidate has hands-on experience with Oracle Data Integrator (ODI) and strong expertise in SQL, data modeling, and database performance optimization. The role involves close collaboration with data scientists, analysts, and BI teams to deliver high-quality, well-structured data that drives actionable insights, while ensuring data integrity, monitoring pipeline health, and applying data engineering best practices.
Key Responsibilities
- Design, develop, and maintain ETL/ELT data pipelines to facilitate efficient business analytics and reporting.
- Build and implement data models optimized for performance, scalability, and alignment with business requirements.
- Utilize Oracle Data Integrator (ODI) to automate and streamline data integration workflows.
- Ensure data quality, integrity, and consistency across multiple systems and platforms.
- Optimize SQL queries and enhance database performance to support fast and reliable data retrieval.
- Collaborate closely with data scientists, analysts, and business intelligence teams to provide clean, structured data sets for analysis.
- Monitor data pipeline operations, troubleshoot failures promptly, and minimize downtime to maintain continuous data flow.
- Follow and promote best practices in data engineering, including thorough documentation, version control, and automation of processes.
Required Qualifications
- Strong proficiency in SQL, including data transformation, query optimization, and performance tuning.
- Hands-on experience with Oracle Data Integrator (ODI) for data integration and process automation.
- Solid understanding of data modeling concepts such as dimensional modeling, normalization, and denormalization.
- Familiarity with relational database systems, including Oracle, Microsoft SQL Server, or similar platforms.
- Excellent problem-solving skills with the ability to diagnose and optimize complex data workflows.
- Ability to work independently while effectively collaborating with cross-functional teams.
- Good knowledge of data governance principles, including data security and compliance best practices.
- Experience with scripting languages such as Python or Shell for automation purposes.
- Exposure to Continuous Integration/Continuous Deployment (CI/CD) pipelines for data workflows and familiarity with version control tools like Git.
Preferred Qualifications and Benefits
Candidates with additional experience in cloud data platforms, advanced automation frameworks, or BI tool integration will be highly valued. Our organization offers a dynamic work environment that encourages innovation and collaboration, along with opportunities for professional growth and skill development.
If you are passionate about building scalable data solutions and enjoy working in a collaborative, fast-paced setting, we invite you to apply and contribute to our data-driven decision-making processes.