Country/Region:  IN
Requisition ID:  30222
Location:  INDIA - PUNE - BIRLASOFT OFFICE - HINJAWADI

Title:  Technical Specialist-Data Engg

Description: 

Area(s) of responsibility

About Us:

Birlasoft, a global leader at the forefront of Cloud, AI, and Digital technologies, seamlessly blends domain expertise with enterprise solutions. The company's consultative and design-thinking approach empowers societies worldwide, enhancing the efficiency and productivity of businesses. As part of the multibillion-dollar diversified CKA Birla Group, Birlasoft, with its 12,000+ professionals, is committed to continuing the Group's 170-year heritage of building sustainable communities.

• Job Title: Data Engineer (DBT + Snowflake)

• Experience: 5-9 years

• Location: Bangalore, Noida, Chennai, Mumbai, Hyderabad, Pune

• Shift Time: CET hours (12:30 PM to 9:30 PM IST)

Job Description:

We are seeking a highly skilled and motivated Data Engineer to join our data team. As a Data Engineer, you will be primarily responsible for designing and developing data transformations and data models to ensure reliable and efficient data processing and analysis. You will work closely with cross-functional teams to support data-driven decision-making and contribute to the overall success of our insights teams.

Key Proficiency & Responsibilities:

• Expertise in DBT (Data Build Tool) for data transformation and modelling.

• Proficiency in Snowflake, including experience with Snowflake SQL and data warehousing concepts.

• Strong understanding of data architecture, data modelling, and data warehousing best practices.

• Design, develop, and maintain robust data pipelines using DBT and Snowflake.

• Implement and optimize data ingestion processes to ensure efficient and accurate data flow from various sources.

• Collaborate with data scientists, analysts, and other stakeholders to understand data requirements and ensure data integrity and quality.

• Perform data analysis and profiling to identify and address data quality issues.

• Maintain comprehensive documentation of data workflows, models, and ETL processes.

• Stay up to date with the latest trends and technologies in data engineering, DBT, and Snowflake.

• Proven experience as a Data Engineer or similar role, with a focus on data ingestion and ETL processes.

• Experience with other ETL tools and technologies is a plus (e.g., Apache Airflow, Talend, Informatica).

• Proficient in SQL and experience with programming languages such as Python or Java.

• Familiarity with cloud platforms and services (e.g., AWS) and experience with AWS Lambda is a must.

• Adhere to and promote development best practices, including version control using Git and branching models.

• Perform code reviews to ensure consistent coding standards and practices.

• Participate in Scrum ceremonies, including daily stand-ups, sprint planning, and retrospectives.

• Communicate effectively with team members and stakeholders to understand requirements and provide updates.

• Take ownership of assigned tasks and work independently to complete them.