Title: Data Architect
Job Description:
We are seeking an experienced Data Architect with deep expertise in Databricks and Snowflake to design scalable data solutions and provide technical leadership across our data platforms.
Responsibilities:
- Design and implement scalable data architectures using Databricks and Snowflake.
- Develop and optimize ETL/ELT pipelines to ensure efficient data processing and integration.
- Collaborate with data engineers, data scientists, and business stakeholders to gather data requirements and deliver solutions that meet them.
- Ensure data quality, governance, and security across all data platforms.
- Provide technical leadership and mentorship to team members on best practices and emerging technologies.
- Troubleshoot and resolve data-related issues and performance bottlenecks.
- Stay current with industry trends and advancements in Databricks, Snowflake, and related technologies.
Requirements:
- Bachelor's or Master's degree in Computer Science, Information Technology, or a related field.
- Proven experience as a Data Architect or similar role with expertise in Databricks and Snowflake.
- Strong understanding of data warehousing concepts, data modeling, and ETL/ELT processes.
- Proficiency in SQL and experience with programming languages such as Python or Scala.
- Familiarity with cloud platforms (e.g., AWS, Azure, GCP) and their data services.
- Excellent problem-solving skills and attention to detail.
- Strong communication and collaboration abilities.
- Experience implementing data governance and security best practices is a plus.
Benefits:
- Competitive salary and performance-based bonuses.
- Comprehensive health, dental, and vision insurance.
- Flexible working hours and remote work options.
- Opportunities for professional growth and development.
- Inclusive and collaborative work environment.