Job Requirements for Data Engineer:
Employment Type: Full-Time
Location: New York, NY (Onsite)
Data Engineer
Essential Functions:
Design, develop, and implement end-to-end solutions on Google Cloud Platform (BigQuery, Cloud Composer, Apache Airflow); strong ability to translate business requirements into a technical design plan (a Composer/Airflow sketch follows this list).
Automate, deploy, and support solutions scheduled via crontab or Control-M. Deployment includes proper error handling, dependency controls, and necessary alerts (a scheduling-wrapper sketch also follows this list). Triage and resolve production issues and identify preventive controls.
Build rapid prototypes or proof of concepts for project feasibility.
Document technical design specifications explaining how business and functional requirements are met. Document operations runbook procedures with each solution deployment.
Identify and propose improvements for analytics ecosystem solution design and architecture.
Participate in product support such as patches and release upgrades. Provide validation support for Google Cloud and SAS products, including any changes to other infrastructure, systems, or processes that impact the Analytics infrastructure.
Participate in the full SDLC framework using Agile/Lean methodology.
Support non-production environments with the Operations and IT teams.
Consistently demonstrate regular, dependable attendance & punctuality.
Manage offshore resources' assignments and tasks.
Perform other duties as assigned.
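
For illustration, a minimal sketch of the kind of pipeline the first item describes: an Airflow DAG, deployable on Cloud Composer, that runs a daily BigQuery transformation with retries and failure alerts. The DAG id, dataset, table names, and alert address are all invented for the example.

    # Sketch: a daily Cloud Composer / Airflow DAG that runs a BigQuery job.
    # All names (DAG id, datasets, tables, alert address) are placeholders.
    from datetime import datetime, timedelta

    from airflow import DAG
    from airflow.providers.google.cloud.operators.bigquery import (
        BigQueryInsertJobOperator,
    )

    default_args = {
        "retries": 2,                              # retry transient failures
        "retry_delay": timedelta(minutes=5),
        "email_on_failure": True,                  # necessary alerts
        "email": ["data-eng-oncall@example.com"],  # hypothetical address
    }

    with DAG(
        dag_id="daily_sales_rollup",               # hypothetical pipeline
        start_date=datetime(2024, 1, 1),
        schedule_interval="@daily",
        catchup=False,
        default_args=default_args,
    ) as dag:
        # Aggregate raw orders into a reporting table; the SQL is illustrative.
        BigQueryInsertJobOperator(
            task_id="rollup_sales",
            configuration={
                "query": {
                    "query": """
                        CREATE OR REPLACE TABLE reporting.daily_sales AS
                        SELECT order_date, SUM(amount) AS total
                        FROM raw.orders
                        GROUP BY order_date
                    """,
                    "useLegacySql": False,
                }
            },
        )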
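
And a small sketch of the crontab deployment standard in the second item: a Python wrapper with the error handling and failure alerting the posting calls for. The crontab entry, job command, and alert address are hypothetical.

    # Sketch: a cron-friendly wrapper with error handling and alerting.
    # Hypothetical crontab entry:
    #   0 2 * * * /usr/bin/python3 /opt/jobs/run_load.py >> /var/log/load.log 2>&1
    import logging
    import smtplib
    import subprocess
    import sys
    from email.message import EmailMessage

    ALERT_TO = "data-eng-oncall@example.com"  # hypothetical alert address
    JOB_CMD = ["/opt/jobs/load_orders.sh"]    # hypothetical downstream job

    logging.basicConfig(level=logging.INFO,
                        format="%(asctime)s %(levelname)s %(message)s")

    def alert(subject: str, body: str) -> None:
        # Send a failure alert through the local mail relay.
        msg = EmailMessage()
        msg["Subject"] = subject
        msg["To"] = ALERT_TO
        msg["From"] = "cron@localhost"
        msg.set_content(body)
        with smtplib.SMTP("localhost") as smtp:
            smtp.send_message(msg)

    def main() -> int:
        try:
            # check=True raises on a non-zero exit, so failures never pass silently
            subprocess.run(JOB_CMD, check=True, capture_output=True,
                           text=True, timeout=3600)
            logging.info("job completed")
            return 0
        except Exception as exc:
            logging.error("job failed: %s", exc)
            alert("load_orders FAILED", str(exc))
            return 1

    if __name__ == "__main__":
        sys.exit(main())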
Qualifications:
Education/Experience
Bachelor's degree in Computer Science/Engineering, Analytics, Statistics or equivalent work experience.
6+ years of work experience in Data Engineering, ETL Development and Data Analytics.
6+ years of hands-on experience using SQL and a scripting language such as Unix shell or Python (see the sketch after this list).
5+ years of hands-on experience developing on a Linux platform.
4+ years of hands-on experience working in a traditional RDBMS such as Oracle or DB2.
3+ years of hands-on experience working with HDFS, Tez, MapReduce, and Sqoop.
1+ years of hands-on experience working with cloud technology.
1+ years of hands-on experience working with a scripting language such as Python, or with SAS (Base SAS, SAS Macro, and SAS/STAT).
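
As a small illustration of the SQL-plus-scripting combination above, a self-contained Python sketch using the standard-library sqlite3 driver; the table and values are invented.

    # Sketch: SQL driven from a Python script, via the standard-library
    # sqlite3 driver so it runs anywhere. Table and values are invented.
    import sqlite3

    conn = sqlite3.connect(":memory:")
    conn.executescript("""
        CREATE TABLE orders (order_date TEXT, amount REAL);
        INSERT INTO orders VALUES
            ('2024-01-01', 10.0), ('2024-01-01', 5.5), ('2024-01-02', 7.25);
    """)

    # Parameterized aggregate query: total sales for one day
    (total,) = conn.execute(
        "SELECT SUM(amount) FROM orders WHERE order_date = ?",
        ("2024-01-01",),
    ).fetchone()
    print(f"2024-01-01 total: {total}")  # -> 15.5
    conn.close()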
Communication Skills
Excellent written and verbal communication and presentation skills
Ability to collaborate with internal and cross-functional teams
Mathematical Skills (if applicable)
Basic math functions such as addition, subtraction, multiplication, and division, along with analytical skills.
Reasoning Ability
Must be able to work independently with minimal supervision, and demonstrate strategic thinking and organizational planning skills
Physical Demands (if applicable)
This position involves regular ambulating, sitting, hearing, and talking.
May occasionally involve stooping, kneeling, or crouching.
Other Skills
Experience with Spark, PySpark, Zeppelin, and Jupyter Notebook is nice to have (see the PySpark sketch after this list).
Demonstrated experience implementing and automating ETL processes on large data sets
Experience with report development and supporting data requirements for reporting
Knowledge of Hadoop/Big Data architecture and operational workings is nice to have.
Ability to multi-task and meet deadlines.
Ability to work with diverse teams and multiple technologies.
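
For the Spark and ETL items above, a minimal PySpark sketch of one batch ETL step: read raw files, cast and aggregate, and write partitioned Parquet. Paths and column names are placeholders.

    # Sketch: a batch PySpark ETL step. Read raw CSVs, cast and aggregate,
    # then write partitioned Parquet. Paths and column names are placeholders.
    from pyspark.sql import SparkSession
    from pyspark.sql import functions as F

    spark = SparkSession.builder.appName("orders_etl").getOrCreate()

    raw = spark.read.option("header", True).csv("/data/raw/orders/")

    daily = (
        raw.withColumn("amount", F.col("amount").cast("double"))
           .withColumn("month", F.substring("order_date", 1, 7))
           .groupBy("month", "order_date")
           .agg(F.sum("amount").alias("total"))
    )

    # Partitioning by month keeps large outputs cheap to scan selectively
    daily.write.mode("overwrite").partitionBy("month").parquet(
        "/data/curated/daily_sales/"
    )
    spark.stop()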