Lagozon is a premier Data and AI company, renowned for delivering cutting-edge solutions that empower businesses across a broad spectrum of industries, including Healthcare, Retail, Manufacturing, Automotive, Logistics & Distribution, Telecom & Media, and BFSI. Leveraging deep industry expertise and advanced technological capabilities, Lagozon crafts innovative, data-driven solutions that deliver tangible business outcomes. Our solutions not only accelerate revenue growth but also enhance risk management, enabling our clients to thrive in an increasingly data-centric world. Through a steadfast commitment to innovation, quality, and excellence, we have built long-lasting partnerships and earned a reputation as a trusted advisor to global enterprises.
Responsibilities:
• Design, develop, and maintain ETL/ELT data pipelines using Databricks and PySpark.
• Work with Delta Lake and implement efficient data lakehouse solutions.
• Optimize Spark jobs for scalability, performance, and reliability.
• Collaborate with data scientists and analysts to prepare clean, structured datasets.
• Automate workflows and manage data quality across projects.
Requirements:
• 3–4 years of hands-on experience with Databricks and Apache Spark.
• Proficiency in Python (PySpark) and SQL.
• Solid understanding of data engineering concepts, ETL pipelines, and data modeling.
• Experience with Delta Lake, Databricks Workflows, and version control (Git).
• Strong problem-solving and collaboration skills.
Email: hr@lagozon.com
Mobile: +91 98105 15823
