Job Title

Snowflake Developer

  • Salary: $70-85
  • Location: Remote
  • Work Eligibility: USC, GC, GC-EAD, TN, H1, H4-EAD, OPT-EAD, CPT
  • Job ID: 06338

Job Description

Key Responsibilities:

Design, develop, and maintain scalable data pipelines
Develop data ingestion and integration processes (REST, SOAP, SFTP, MQ, etc.)
Take ownership of building data pipelines
Actively engage in technology discovery and implementation, both on-prem and in the cloud (e.g., Azure or AWS), to build solutions for future systems
Develop high-performance scripts in SQL, Python, etc. to meet enterprise data, BI, and analytics needs
Incorporate standards and best practices into engineering solutions
Manage code versions in source control and coordinate changes across team
Participate in architecture design and discussions
Provide logical and physical data design, and database modeling
Be part of the Agile team to collaborate and to help shape requirements
Solve complex data issues around data integration, unusable data elements, unstructured data sets, and other data processing incidents
Support the development and design of the internal data integration framework
Work with system owners to resolve source data issues and refine transformation rules
Partner with enterprise teams, data scientists, and architects to define requirements and solutions

Key Qualifications:

Have a B.A./B.S. and 5-8 years of relevant work experience, or an equivalent combination of education and experience
Extensive hands-on experience with Snowflake
Hands-on experience with the Microsoft stack (SSIS, SQL, etc.)
Possess strong analytical skills with the ability to analyze raw data, draw conclusions, and develop actionable recommendations
Experience with the Agile development process preferred
Proven track record of consistently delivering projects successfully
Hands-on experience with Azure Data Factory V2, Azure Databricks, SQL DW or Snowflake, Azure Analysis Services, and Cosmos DB
Experience with Python or Scala.
Understanding of continuous integration and continuous deployment on Azure
Experience with a large-scale data lake or warehouse implementation on any of the public clouds (AWS, Azure, GCP)
Have excellent interpersonal and written/verbal communication skills
Manage financial information in a confidential and professional manner
Be highly motivated and flexible
Effectively handle multiple projects simultaneously and pay close attention to detail
Have experience in a multi-dimensional data environment
