Job Details
Job Description

Role: Snowflake Architect
Experience: 7+ years
Location: Hyderabad/Pune/Gurgaon
Technical Skills: Snowflake, Data Build Tool (dbt), Spark/PySpark, Java/Python, Scala, any cloud platform

Responsibilities: Snowflake Data Warehouse Architect (preferably immediate joining)
- Minimum 6+ years of experience as an Architect on analytics solutions and around 2 years of experience with Snowflake
- Design and implement effective analytics solutions and models with Snowflake
- Examine and identify data warehouse structural necessities by evaluating business requirements
- Assess data warehouse implementation procedures to ensure they comply with internal and external regulations
- Prepare accurate data warehouse design and architecture reports for management and executive teams
- Oversee the migration of data from legacy systems to new solutions
- Monitor system performance through regular troubleshooting and integration of new features
- Recommend solutions to improve new and existing data warehouse solutions
- Understand and document data flows in and between different systems/applications
- Guide developers in preparing functional/technical specs to define reporting requirements and the ETL process

Required Skills
- Working experience with, and communicating with, business stakeholders and architects
- Industry experience developing big data/ETL data warehouse solutions and building cloud-native data pipelines
- Experience in Python, PySpark, Scala, Java and SQL
- Strong object-oriented and functional programming experience in Python
- Experience working with REST and SOAP-based APIs to extract data for data pipelines
- Extensive experience with Hadoop and related processing frameworks such as Spark, Hive, Sqoop, etc.
- Experience working in a public cloud environment; AWS experience is mandatory
- Ability to implement solutions with AWS Virtual Private Cloud, EC2, AWS Data Pipeline, AWS CloudFormation, Auto Scaling, AWS Simple Storage Service (S3), EMR and other AWS products, Hive, Athena
- Experience working with real-time data streams and the Kafka platform
- Working knowledge of workflow orchestration tools such as Apache Airflow, including designing and deploying DAGs
- Hands-on experience with performance and scalability tuning
- Professional experience in Agile/Scrum application development using JIRA