Data Engineer, Analytic Systems
Bayone Solutions Inc
San Francisco, CA (Onsite)
Full-Time
Job Overview
The Marketing Analytic Systems data pipeline delivers value to various business partners by providing tools and capabilities to analyze large data sets for research, analytics, campaign management, reporting, and tactical/strategic decision making.
Our systems development team is looking for a Developer who is passionate about working with data and building solutions that support our Analytic Systems solution stack, which includes Google Cloud Platform (GCP), SAS (Linux environment), and Tableau Server/Desktop.
The successful candidate will have extensive experience as a data engineer or ETL developer building and automating data transformation and loading procedures.
Strong knowledge of and experience using BigQuery, SQL, Cloud Composer, Apache Airflow, and SAS to conduct data profiling/discovery, data modeling, and process automation is required (a rough profiling sketch follows this overview).
The candidate must be comfortable working with data from multiple sources: Hadoop, DB2, Oracle, and flat files.
The projects are detail intensive, requiring the accurate capture and translation of data requirements (both tactical and analytical needs) and validation of the working solution.
We work in a highly collaborative environment, working closely with cross-functional team members: Business Analysts, Product Managers, Data Analysts, and Report Developers.
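For illustration only (not part of the posting), a minimal sketch of the kind of BigQuery data profiling/discovery described above, using the google-cloud-bigquery Python client; the project, dataset, table, and column names are placeholders, not details from the employer.

from google.cloud import bigquery

client = bigquery.Client(project="my_project")  # assumes default GCP credentials

# Profile one column: total rows, distinct values, and null count.
sql = """
    SELECT
      COUNT(*)                     AS row_count,
      COUNT(DISTINCT customer_id)  AS distinct_customers,
      COUNTIF(customer_id IS NULL) AS null_customer_ids
    FROM `my_project.marketing.campaign_events`
"""

for row in client.query(sql).result():
    print(dict(row))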
Essential Functions
• Design, develop, and implement end-to-end solutions on Google Cloud Platform (BigQuery, Cloud Composer, Apache Airflow); strong ability to translate business requirements into a technical design plan (see the DAG sketch after this list).
• Automate, deploy, and support solutions scheduled via crontab or Control-M. Deployment includes proper error handling, dependency controls, and necessary alerts. Triage and resolve production issues and identify preventive controls.
• Build rapid prototypes or proofs of concept for project feasibility.
• Document technical design specifications explaining how business and functional requirements are met. Document operations runbook procedures with each solution deployment.
• Identify and propose improvements to the analytics ecosystem solution design and architecture.
• Participate in product support such as patches and release upgrades. Provide validation support for Google Cloud and SAS products, including any changes to other infrastructure, systems, or processes that impact the Analytics infrastructure.
• Participate in the full SDLC using Agile/Lean methodology.
• Support non-production environments with the Operations and IT teams.
• Consistently demonstrate regular, dependable attendance & punctuality.
• Manage offshore resources' assignments and tasks.
• Perform other duties as assigned.
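As a rough, hypothetical illustration of the orchestration work listed above (not the employer's actual pipeline), a minimal Cloud Composer / Apache Airflow DAG with a BigQuery load step, retries, and failure alerts; all project, dataset, schedule, and email values are placeholder assumptions.

from datetime import datetime, timedelta

from airflow import DAG
from airflow.providers.google.cloud.operators.bigquery import BigQueryInsertJobOperator

default_args = {
    "owner": "analytics-systems",               # hypothetical team name
    "retries": 2,                               # basic error handling via retries
    "retry_delay": timedelta(minutes=10),
    "email": ["analytics-oncall@example.com"],  # hypothetical alert address
    "email_on_failure": True,                   # alert when a task fails
}

with DAG(
    dag_id="marketing_analytics_daily_load",    # hypothetical DAG name
    start_date=datetime(2024, 1, 1),
    schedule_interval="0 6 * * *",              # daily run, cron-style schedule
    catchup=False,
    default_args=default_args,
) as dag:
    # Load raw events into a reporting table; all names are placeholders.
    load_campaign_facts = BigQueryInsertJobOperator(
        task_id="load_campaign_facts",
        configuration={
            "query": {
                "query": "SELECT * FROM `my_project.raw.campaign_events`",
                "destinationTable": {
                    "projectId": "my_project",
                    "datasetId": "reporting",
                    "tableId": "campaign_facts",
                },
                "writeDisposition": "WRITE_TRUNCATE",
                "useLegacySql": False,
            }
        },
    )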
Years Of Experience
Minimum Years of Experience - 5 Years
Competencies
Education/Experience
Bachelor's degree in Computer Science/Engineering, Analytics, Statistics, or equivalent work experience.
6+ years of work experience in Data Engineering, ETL Development, and Data Analytics.
6+ years of hands-on experience using SQL and a scripting language such as Unix shell or Python.
5+ years of hands-on experience developing on a Linux platform.
4+ years of hands-on experience working with traditional RDBMSs such as Oracle and DB2.
1+ years of hands-on experience working with HDFS, Tez, MapReduce, and Sqoop.
1+ years of hands-on experience working with a scripting language such as Python or Base SAS.