Tripshepherd is looking for a talented Data Analytics Engineer with expertise in Python dashboards and ETL processes to join its team. The successful candidate will develop interactive dashboards, build reporting pipelines, and implement data cleaning workflows. The role also involves ongoing data maintenance and the design of scalable data storage solutions that support performance monitoring and data-driven decision-making across departments. A hands-on approach to data engineering is essential, with a strong focus on optimizing systems and fostering collaboration across teams.
Key Responsibilities
- Develop and maintain interactive dashboards using Python frameworks such as Plotly Dash or Streamlit.
- Build and optimize reporting pipelines to ensure accurate and timely data delivery.
- Design, implement, and manage ETL pipelines for data ingestion, transformation, and cleaning from diverse sources, including APIs, databases, and flat files.
- Create and maintain scalable data storage solutions by applying data warehousing concepts and schema design best practices.
- Perform data cleaning, anomaly detection, and normalization to maintain high data quality.
- Collaborate closely with product, engineering, and operations teams to understand data requirements and deliver actionable insights.
- Use version control systems such as Git for effective code management and team collaboration.
- Conduct debugging, testing, and performance tuning to optimize data workflows and dashboard functionality.
- Deploy dashboards on platforms such as Streamlit Cloud or in containerized environments such as Docker.
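To give a flavor of the ETL and data-quality work this role involves, here is a minimal sketch using only the Python standard library. The field names (`booking_id`, `city`, `revenue`) and sample data are purely illustrative, not an actual Tripshepherd schema:

```python
import csv
import io
from statistics import mean, stdev

# Hypothetical raw export; values are illustrative only.
RAW = """booking_id,city,revenue
101,Toronto,120.50
102,toronto,95.00
103,Niagara ,not_available
104,Toronto,88.25
"""

def extract(text):
    """Parse CSV text into a list of dicts."""
    return list(csv.DictReader(io.StringIO(text)))

def transform(rows):
    """Clean and normalize: trim and case-fold city names, coerce
    revenue to float, and drop rows whose revenue cannot be parsed."""
    cleaned = []
    for row in rows:
        try:
            revenue = float(row["revenue"])
        except ValueError:
            continue  # drop unparseable records
        cleaned.append({
            "booking_id": int(row["booking_id"]),
            "city": row["city"].strip().title(),
            "revenue": revenue,
        })
    return cleaned

def flag_anomalies(rows, threshold=2.0):
    """Mark rows whose revenue is more than `threshold` standard
    deviations from the mean -- a simple z-score check."""
    values = [r["revenue"] for r in rows]
    mu, sigma = mean(values), stdev(values)
    for r in rows:
        r["anomaly"] = sigma > 0 and abs(r["revenue"] - mu) / sigma > threshold
    return rows

pipeline = flag_anomalies(transform(extract(RAW)))
print(len(pipeline))                   # 3 rows survive cleaning
print({r["city"] for r in pipeline})   # city names normalized
```

In practice the extract step would pull from APIs, databases, or flat files, and the transformed output would feed a warehouse table or a dashboard data source, but the extract/transform/validate shape stays the same.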
Required Qualifications
- Advanced proficiency in Python with at least 2–3 years of professional experience.
- Strong SQL skills for querying, data transformation, and query optimization.
- Proven experience building interactive dashboards with tools such as Plotly Dash or Streamlit.
- Hands-on experience designing and managing ETL pipelines and data cleaning processes.
- Solid understanding of data warehousing concepts, including star and snowflake schema design.
- Familiarity with version control systems, particularly Git.
- Experience with Python data manipulation libraries such as Pandas, NumPy, or Polars.
- Ability to implement dashboard features including filtering, drill-downs, export options, and responsive user interfaces.
- Knowledge of workflow automation and orchestration tools such as Airflow or Prefect.
- Experience with modern data warehouse technologies such as Amazon Redshift, Google BigQuery, or Snowflake, or with traditional RDBMSs such as PostgreSQL and MySQL.
- Strong debugging, testing, and performance tuning capabilities.
- Excellent written and verbal communication skills for effective cross-team collaboration.
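As an illustration of the star-schema design mentioned above, the following minimal sketch uses Python's built-in `sqlite3` module; the `fact_bookings` and `dim_city` tables and their contents are hypothetical examples, not a production schema:

```python
import sqlite3

# Build a tiny in-memory star schema: one fact table joined to a
# dimension table via a surrogate key.
conn = sqlite3.connect(":memory:")
cur = conn.cursor()
cur.executescript("""
CREATE TABLE dim_city (city_id INTEGER PRIMARY KEY, name TEXT);
CREATE TABLE fact_bookings (
    booking_id INTEGER PRIMARY KEY,
    city_id INTEGER REFERENCES dim_city(city_id),
    revenue REAL
);
INSERT INTO dim_city VALUES (1, 'Toronto'), (2, 'Niagara Falls');
INSERT INTO fact_bookings VALUES
    (101, 1, 120.50), (102, 1, 95.00), (103, 2, 88.25);
""")

# Aggregate the fact table by a dimension attribute -- the core
# star-schema access pattern behind most reporting queries.
cur.execute("""
    SELECT c.name, ROUND(SUM(f.revenue), 2)
    FROM fact_bookings AS f
    JOIN dim_city AS c ON c.city_id = f.city_id
    GROUP BY c.name
    ORDER BY c.name
""")
result = cur.fetchall()
print(result)  # [('Niagara Falls', 88.25), ('Toronto', 215.5)]
conn.close()
```

The same fact/dimension pattern scales up to warehouses like Redshift, BigQuery, or Snowflake; only the dialect and the surrounding tooling change.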
Preferred Qualifications and Benefits
- Experience deploying dashboards in cloud environments or containerized setups is highly valued.
- Ability to thrive in a fast-paced, collaborative environment working with cross-functional teams.

This is a full-time, in-person position offering the opportunity to contribute directly to Tripshepherd’s data-driven initiatives and system optimizations.
This role provides an excellent opportunity to leverage your Python and data engineering skills to influence critical business decisions and enhance operational efficiency across multiple departments. If you are passionate about building scalable data solutions and interactive analytics tools, Tripshepherd offers a dynamic and supportive environment to advance your career.