Job Requirements of Data Specialist:
- Employment Type: Full-Time
- Location: San Francisco, CA (Onsite)
Data Specialist
Job Description
Design, develop, and implement end-to-end solutions on Google Cloud Platform (BigQuery, Cloud Composer, Apache Airflow), translating business requirements into a technical design plan.
Automate, deploy, and support solutions scheduled on Crontab or Control-M, including proper error handling, dependency controls, and necessary alerts (a minimal scheduling sketch follows this list). Triage and resolve production issues and identify preventive controls.
Build rapid prototypes or proof of concepts for project feasibility.
Document technical design specifications explaining how business and functional requirements are met. Document operations run book procedures with each solution deployment.
Identify and propose improvements for analytics eco-system solution design and architecture.
Participate in product support such as patches and release upgrades. Provide validation support for Google Cloud and SAS products, including any changes to other infrastructure, systems, or processes that impact the Analytics infrastructure.
Participate in full SDLC framework using Agile/Lean methodology.
Support non-production environments with the Operations and IT teams.
Consistently demonstrate regular, dependable attendance & punctuality.
Manage offshore resources' assignments and tasks.
Perform other duties as assigned.
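
The duties above center on scheduling BigQuery work through Cloud Composer / Apache Airflow (or Crontab / Control-M) with error handling, dependency controls, and alerts. Below is a minimal, illustrative Airflow DAG sketch of that pattern; the DAG name, query, and alert address are hypothetical placeholders, not details taken from this posting.

# Minimal Apache Airflow DAG sketch for a Cloud Composer environment.
# Illustrative only: pipeline name, SQL, and alert email are placeholders.
from datetime import datetime, timedelta

from airflow import DAG
from airflow.providers.google.cloud.operators.bigquery import BigQueryInsertJobOperator

default_args = {
    "owner": "data-specialist",
    "retries": 2,                               # basic error handling: retry transient failures
    "retry_delay": timedelta(minutes=10),
    "email": ["analytics-alerts@example.com"],  # hypothetical alert address
    "email_on_failure": True,                   # alert when the job fails
}

with DAG(
    dag_id="daily_sales_rollup",                # hypothetical pipeline name
    start_date=datetime(2024, 1, 1),
    schedule_interval="0 6 * * *",              # cron-style schedule, comparable to a Crontab entry
    catchup=False,
    default_args=default_args,
) as dag:
    # Run a BigQuery job; downstream tasks would be chained with >> to express dependency controls.
    rollup = BigQueryInsertJobOperator(
        task_id="bq_daily_rollup",
        configuration={
            "query": {
                "query": "SELECT CURRENT_DATE() AS run_date",  # placeholder SQL
                "useLegacySql": False,
            }
        },
        location="US",
    )

For a plain Crontab deployment, the equivalent scheduling intent would be a single crontab entry (for example, one running at 0 6 * * * that calls a wrapper script), with error handling and alerting implemented inside the script itself.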
Education/Experience
Bachelor's degree in Computer Science/Engineering, Analytics, Statistics or equivalent work experience.
6+ years of work experience in Data Engineering, ETL Development and Data Analytics.
6+ years of hands-on experience using SQL and a scripting language such as Unix Shell or Python.
5+ years of hands-on experience developing on a Linux platform.
4+ years of hands-on experience working with a traditional RDBMS such as Oracle or DB2.
3+ years of hands-on experience working with HDFS, Tez, MapReduce, and Sqoop.
1+ years of hands-on experience working with cloud technology.
1+ years of hands-on experience working with a scripting language such as Python or SAS (BASE SAS, SAS MACRO, and SAS/STAT).