Calance Job Opening

Job Title: Sr. Hadoop Data Engineer - ID:36086
Duration: 6 Months
Start Date: ASAP
Job Skills: Hadoop architecture, HDFS commands, Hive, Shell Script, Unix, Hadoop Concepts
Location: Hartford, CT
Date Posted: 09/03/2020


Our client is seeking a Sr. Hadoop Data Engineer for a contract opportunity in Hartford, CT that can be worked remotely from your home office until COVID restrictions are lifted. Please review the job description and, if interested, apply within.

Job description
We are looking for a savvy Hadoop Data Engineer to join our growing team of analytics experts. The contractor will be responsible for building and optimizing our data and data pipeline architecture, as well as optimizing data flow and collection for cross-functional teams. The ideal candidate is an experienced data pipeline builder. The Hadoop Data Engineer will support our software developers' and database architects' initiatives and will ensure optimal data delivery architecture is consistent throughout ongoing projects. They must be self-directed and comfortable supporting the data needs of multiple teams, systems, and products.

This role will start remotely, but once COVID restrictions are lifted, the goal is to have this person onsite in Hartford, CT.

Fundamental Components:
• Develops large scale data structures and pipelines to organize, collect and standardize data that helps generate insights and addresses reporting needs.
• Collaborates with other data teams to transform data and integrate algorithms and models into automated processes.
• Uses knowledge in Hadoop architecture, HDFS commands and experience designing & optimizing queries to build data pipelines.
• Builds data marts and data models to support Data Science and other internal customers.
• Analyzes current information technology environments to identify and assess critical capabilities and recommend solutions.
• Experiments with available tools and advises on new tools in order to determine the optimal solution given the requirements dictated by the model/use cases.

Background/Experience desired:
• 3 or more years of progressively complex related experience.
• Has strong knowledge of large scale search applications and building high volume data pipelines.
• Experience building data transformation and processing solutions.
• Knowledge in Hadoop architecture, HDFS commands and experience designing & optimizing queries against data in the HDFS environment.
• Ability to understand complex systems and solve challenging analytical problems.
• Ability to leverage multiple tools and programming languages to analyze and manipulate data sets from disparate data sources.
• Strong collaboration and communication skills within and across teams.
• Strong problem-solving skills and critical thinking ability.

SKILL SET desired:
• Hive
• Shell Script
• Unix
• Hadoop Concepts (Sqoop, YARN, MapReduce, etc.)
• Python

ADDED MUST HAVES: Kafka and NiFi skills, plus experience with Azure and GCP (cloud).

Who is Calance?
Calance is a global IT company with operations in the United States, Canada and India. Over the years, Calance has grown organically and has acquired numerous successful IT Services firms along the way. As a result, the company today is a mix of diverse cultures, talents and expertise that collaborate globally to bring our best capabilities and thinking to clients. Calance also offers benefits, including medical, dental, and vision care and a 401(k).

Calance - the place to grow.




Mission Viejo, CA ~ (800) 732-4680
Atlanta, GA ~ (866) 951-1151

Calance is a global IT Services firm specializing in end-to-end solutions for Development, Managed Service,
Security, SAP, Project Control Integration and IT Staffing. Operating in the United States and India,
Calance helps clients bring their ideas and strategies to life through talent, technology and tenacity.

© 2020. All rights reserved.