Job Opening

Data Scientist (Delta Lake/Lakehouse)

We are looking for a talented and passionate Data Scientist to join our dynamic team and help us harness the power of data to make impactful decisions.

Last updated: August 25, 2024

Job location: Remote

Job Summary:

We are seeking a highly skilled professional Data Scientist with a strong background in computer science or data science and specialized expertise in Delta Lake/Lakehouse technologies. The ideal candidate has hands-on working experience with Delta Lake/Lakehouse, holds the Databricks Certified Data Engineer Professional certification, and has built interactive dashboards on top of these technologies.

Key Responsibilities:

  • Design, develop, and maintain data pipelines using Delta Lake/Lakehouse architectures (a minimal sketch follows this list).

  • Utilize Databricks for data processing, analysis, and machine learning model development.

  • Create and manage interactive dashboards to visualize data and provide actionable insights.

  • Collaborate with cross-functional teams to understand data requirements and deliver solutions that meet business needs.

  • Conduct data analysis to identify trends, patterns, and anomalies.

  • Implement best practices for data governance, security, and performance optimization.

  • Stay current with industry trends and emerging technologies in data science and Delta Lake/Lakehouse ecosystems.
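
To ground the pipeline responsibility above, here is a minimal sketch of a Delta Lake write/read step in PySpark. It is an illustration only: the table path, sample data, and column names are hypothetical, and it assumes a Spark session with the delta-spark package available (on Databricks the Delta configuration shown below is already in place).

    from pyspark.sql import SparkSession

    spark = (
        SparkSession.builder.appName("delta-pipeline-sketch")
        # These two settings enable Delta Lake outside Databricks;
        # on Databricks they are preconfigured.
        .config("spark.sql.extensions", "io.delta.sql.DeltaSparkSessionExtension")
        .config("spark.sql.catalog.spark_catalog",
                "org.apache.spark.sql.delta.catalog.DeltaCatalog")
        .getOrCreate()
    )

    # Hypothetical raw events standing in for a real source system.
    events = spark.createDataFrame(
        [(1, "signup", "2024-08-01"), (2, "purchase", "2024-08-02")],
        ["user_id", "event", "event_date"],
    )

    # Write the data as a Delta table (path is illustrative).
    events.write.format("delta").mode("overwrite").save("/tmp/delta/events")

    # Read it back for a downstream step and inspect.
    spark.read.format("delta").load("/tmp/delta/events").show()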

Required Skillsets:

  • Master’s degree in Computer Science or Data Science.

  • 0-2 years of working experience with Delta Lake/Lakehouse technologies.

  • Databricks Certified Data Engineer Professional certification required.

  • Proven experience developing interactive dashboards using Streamlit or Solara (a brief sketch follows this list).

  • Hands-on programming experience in Python, SQL, and other relevant languages.

  • Direct working experience with Unix, Git, and Jenkins.

  • Excellent problem-solving skills and attention to detail.
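
As a rough illustration of the dashboarding requirement above, the following is a minimal Streamlit sketch (run with `streamlit run dashboard.py`). The data is hypothetical; in a real dashboard it might be read from a Delta table or a SQL warehouse query rather than constructed inline.

    import pandas as pd
    import streamlit as st

    st.title("Daily Events")

    # Hypothetical aggregated data; in practice this could come from a
    # Delta table or warehouse query.
    data = pd.DataFrame({
        "day": pd.date_range("2024-08-01", periods=7),
        "events": [5, 8, 6, 9, 12, 7, 10],
    })

    # A simple interactive control plus a chart.
    metric = st.selectbox("Metric", ["events"])
    st.line_chart(data.set_index("day")[metric])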

Preferred Qualifications:

  • Experience with cloud platforms such as AWS, Azure, or Google Cloud.

  • Knowledge of big data frameworks (e.g., Apache Spark).

  • Experience with machine learning and statistical analysis.

  • Familiarity with ETL processes and data integration techniques.




Get in touch with us

We’re here to answer your questions


Want to be considered for future job openings?

Build a better workforce with Entagile

No more headaches scaling your teams, and stay ahead of the competition.

© 2015-2025 Entagile LLC All rights reserved.
