Job Title

Data Engineer

  • Position:
  • Salary: $55
  • Location:
      Remote
  • Work Eligibility: H1, H4-EAD, OPT-EAD, CPT
  • Job ID: 06124

Job Description

SUMMARY

The FinERP data lake team is vital to the platform, bringing all Finance data into the data lake used for Finance reporting, planning, and analytics. We partner closely with our business teams to understand our customers' needs and continue delivering high-value solutions. We take pride in developing these mission-critical systems, optimized for the best customer experience.

Successful candidates will be strong leaders who prioritize well, communicate clearly, and have a consistent track record of delivering and supporting eCommerce solutions. Implementing eCommerce systems requires strong critical thinking and creative engineering balanced with high quality and customer focus. The ideal candidate is not only passionate about software development and software architecture but also focused on business needs and customer experience. The day-to-day work is interesting, challenging, and fast-paced!

ESSENTIAL DUTIES AND RESPONSIBILITIES include the following. Other duties may be assigned.

• As a developer on this team, you will work with business stakeholders, project managers, business analysts, and other IT teams to understand business needs and requirements. You will work closely with a nimble team of data engineers and enterprise architects to make the best architecture and design decisions, and to find and develop innovative, practical solutions that meet our business needs in a fast-paced environment.

QUALIFICATIONS

To perform this job successfully, an individual must be able to perform each essential duty satisfactorily. The requirements listed are representative of the knowledge, skill, and/or ability required. Reasonable accommodations may be made to enable individuals with disabilities to perform the essential functions.

• Hands-on technical experience:

The team uses a variety of technologies to deliver the best solution for each requirement, including AWS S3 (data lake), Lambda, Glue jobs, Snowflake for the data warehouse, and PySpark for data processing.
Experience with data lakes, including ingestion, ETL, and Spark processing (preferably PySpark).
Experience developing Lambdas and Glue jobs is required.
Experience with or exposure to DynamoDB is an advantage.
Experience enhancing and maintaining existing data lake applications.
Experience with DataStage 11.7 is a plus.
Experience with data at scale (streaming with Kafka, unstructured data) is a plus.
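The stack above centers on landing Finance records in an S3 data lake via Lambda and Glue jobs. As a minimal, purely illustrative sketch (all field names, schemas, and paths here are hypothetical, not taken from this posting), this is the kind of record normalization and Hive-style date partitioning such an ingestion job might apply before writing to S3:

```python
# Illustrative sketch only: normalizing a raw finance event and deriving a
# date-partitioned S3 prefix, as a Lambda or Glue ingestion step might.
# Field names ("ledger_id", "posted_on", etc.) are hypothetical.
from datetime import date
from decimal import Decimal

def normalize_record(raw: dict) -> dict:
    """Coerce a raw finance event into a consistent, typed shape."""
    return {
        "ledger_id": str(raw["ledger_id"]).strip(),
        "amount": Decimal(str(raw["amount"])),  # Decimal avoids float rounding on money
        "currency": raw.get("currency", "USD").upper(),
        "posted_on": date.fromisoformat(raw["posted_on"]),
    }

def partition_key(rec: dict, prefix: str = "finance/ledger") -> str:
    """Build a Hive-style partition path (year=/month=/day=) for the lake."""
    d = rec["posted_on"]
    return f"{prefix}/year={d.year}/month={d.month:02d}/day={d.day:02d}/"

raw = {"ledger_id": " GL-001 ", "amount": 19.99, "posted_on": "2024-03-07"}
rec = normalize_record(raw)
print(partition_key(rec))  # finance/ledger/year=2024/month=03/day=07/
```

In a real pipeline the same normalization would typically run as a PySpark transformation over Parquet files rather than record-by-record Python, but the partitioning convention is the same.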
• Communication Skills

Exceptional customer relationship skills including the ability to discover the true requirements underlying feature requests, recommend alternative technical and business approaches, work with architects to come to an agreed technical approach, and lead development efforts to meet aggressive timelines with optimal solutions.
Ability to work collaboratively with or lead cross-functional teams with minimal supervision.
Ability to drive consensus within a team and influence outcomes in significant technical decision-making.
Excellent interpersonal, written, and verbal communication skills to work with different business groups as well as IT partners (enterprise architects, vendors, etc.).
• Problem Solving

Attention to detail and organization in all aspects of the system development along with a solid understanding of the customer need.
Ability to understand, master, and be able to prototype with existing and new technologies quickly.
• Time Management

Ability to effectively plan, organize and prioritize multiple streams of activity. Adapt well to changes.
Demonstrated ability to meet commitments and multi-task in a fast-paced work environment with a high-level of accuracy and efficiency.
• Analytic Skills

Demonstrated critical thinking skills.
Open-minded, willing to consider multiple options, sources, perspectives, and possible solutions. Careful assessment of the importance, relevance, and validity of all options.
Inquisitive. Ask probing questions and research as a basis for making design decisions and judging quality; understand the true reason behind the request rather than just accepting an initial thought or proposed solution.
Proactive and willing to contribute ideas. Not afraid to ask questions.
A quick learner who adapts well to change.
EDUCATION and/or EXPERIENCE

Bachelor’s degree in a technical discipline or equivalent experience/training
7-10 years of overall IT experience.
At least 4 years of experience working with large-scale datasets in data lakes (AWS S3 experience preferred).
At least 3 years of hands-on experience in Spark development; PySpark preferred.
Experience understanding and working with Parquet files, Lambdas, and Glue jobs is required.
Experience creating and configuring deployments in Jenkins (nice to have)
Demonstrated knowledge in building, debugging and maintaining mission critical enterprise applications.
