Job Title

Software Engineer – Java/Big Data

  • Position: Software Engineer – Java/Big Data
  • Salary: $60-65
  • Location: Remote
  • Work Eligibility: USC, GC, GC-EAD, TN, H1, H4-EAD, OPT-EAD, CPT
  • Job ID: 08019
Job Description

Requirements

• BS degree in Computer Science, Computer Engineering, or equivalent

• 5–6 years of experience delivering enterprise software solutions as a developer

• Proficient in Java, Spark, Kafka, Python, and AWS cloud technologies

• Must have current, hands-on experience with Scala, Java, Python, Oracle, Cassandra, HBase, and Hive

• 3+ years of experience across multiple Hadoop/Spark technologies, such as Hadoop, MapReduce, HDFS, Cassandra, HBase, Hive, Flume, Sqoop, Spark, Kafka, and Scala

• Familiarity with AWS scripting and automation

• A flair for data, schemas, and data modeling, and for bringing efficiency to the big data life cycle

• Must be able to quickly understand technical and business requirements and translate them into technical implementations

• Experience with Agile Development methodologies

• Experience with data ingestion and transformation

• Solid understanding of secure application development methodologies

• Experience developing microservices using the Spring Framework is a plus

• Understanding of automated QA needs related to big data

• Strong object-oriented design and analysis skills

• Excellent written and verbal communication skills

Responsibilities

• Utilize your software engineering skills, including Java, Spark, Python, and Scala, to analyze disparate, complex systems and collaboratively design new products and services

• Integrate new data sources and tools

• Implement scalable and reliable distributed data replication strategies

• Mentor and provide direction in architecture and design to onsite/offshore developers

• Collaborate with other teams to design, develop, and deploy data tools that support both operations and product use cases

• Perform analysis of large data sets using components from the Hadoop ecosystem

• Own product features from development and testing through to production deployment

• Evaluate big data technologies and prototype solutions to improve our data processing architecture

• Automate everything

Notes: This is a backfill for one of our contractors, who is ending next Friday; the manager is giving us the next 48 hours to come through with candidates before notifying the competition. Please help us fill this ASAP!
