Job Description: We are seeking a skilled Data Warehouse (DWH) and Data Lake Developer to join our team and contribute to the design, development, and maintenance of data platforms for our clients. In this role, you will work closely with stakeholders to understand their data requirements and design solutions that meet their needs while ensuring data quality, reliability, and performance. You will work on challenging projects and collaborate with cross-functional teams to deliver best-in-class data solutions.
Responsibilities:
- Collaborate with stakeholders to gather and analyze requirements for Data Warehouse (DWH) and Data Lake solutions.
- Design, develop, and implement Data Warehouses and Data Lakes using industry best practices and methodologies.
- Develop ETL (Extract, Transform, Load) processes to extract data from source systems, transform it according to business rules, and load it into the data platform.
- Design and implement data models, schemas, and structures to support analytical and reporting requirements.
- Optimize data storage, retrieval, and processing performance using indexing, partitioning, and other optimization techniques.
- Implement data governance and security controls to ensure data privacy, compliance, and integrity.
- Develop and maintain documentation, including data dictionaries, data lineage, and metadata management.
- Collaborate with data analysts, data scientists, and business users to understand data needs and deliver actionable insights.
- Stay up to date with the latest trends, technologies, and best practices in data warehousing and data lake development.
Requirements:
- Bachelor’s degree in Computer Science, Engineering, or a related field (or equivalent work experience).
- Proven experience as a Data Warehouse (DWH) and Data Lake Developer, with a strong understanding of data warehousing concepts and principles.
- Proficiency in SQL and experience with relational database management systems (e.g., Oracle, SQL Server, PostgreSQL).
- Experience with big data technologies and platforms (e.g., Hadoop, Spark, Kafka) is highly desirable.
- Strong understanding of ETL tools and processes, including data integration, transformation, and loading.
- Experience with cloud data platforms (e.g., AWS, Azure, Google Cloud) and related services (e.g., Redshift, BigQuery, S3) is a plus.
- Excellent problem-solving skills and attention to detail.
- Strong communication and collaboration skills, with the ability to work effectively in a team environment.
- Ability to manage multiple projects simultaneously and prioritize tasks effectively.
Benefits:
- Competitive salary and benefits package.
- Flexible work hours and remote work options.
- Opportunities for professional development and career advancement.
- Dynamic and collaborative work environment with passionate and talented colleagues.
- Chance to work on cutting-edge projects that have a meaningful impact on businesses and industries.
Job Category: Product design
Job Type: Professional Service
Job Location: Pune