IBM

Business Manager

Data Engineer Big Data-AWS

Job Description

RTH: Yes
Total exp: 6+ yrs; Rel exp: 6+ yrs (with 2 yrs mandatory in AWS)

Mandatory skills: The BAU lead can have either of the skill-set options below and must be able to work in BAU production support. Do inform candidates that this role is for BAU and not development work.

Preferred skills (either of the below):
• Option 1: AWS (Glue, S3), Airflow, Snowflake, with a good understanding of DMS
• Option 2: AWS (Glue, S3, DMS), Airflow, with a good understanding of Snowflake
(A minimal, illustrative sketch of this stack appears after the soft-skills list below.)

Job Description: BAU Lead - Data Engineering

Key Skills:
• Strong AWS knowledge in designing new architectures and providing optimized solutions for existing ones (S3, Glue, DMS, MWAA, AMS, IAM).
• In-depth knowledge of Snowflake and its architecture.
• Prior experience in a BAU environment is preferred.
• Good knowledge of Airflow and MWAA.
• Hands-on experience in SQL, Python, and PySpark.
• Expertise in optimization techniques in cloud environments.
• Should have a vision for data strategy and the ability to deliver on it.
• Should be able to lead the design and implementation of data management processes, including data sourcing, integration, and transformation.
• Able to manage and lead a team of data professionals, providing guidance and mentoring, and fostering a collaborative and innovative team culture focused on continuous improvement.
• Evaluate and recommend data-related technologies, tools, and platforms.
• Collaborate with IT teams to ensure seamless integration of data solutions.
• Experience implementing and enforcing data security protocols and ensuring compliance with relevant regulations.

Required Qualifications:
• Experience as a lead: 6+ years; should demonstrate strong leadership of a small team.
• Bachelor's qualification in computer science or a STEM (science, technology, engineering, or mathematics) related field.
• At least 8+ years of strong data warehousing experience using RDBMS and non-RDBMS databases.
• At least 5 years of recent hands-on professional experience (actively coding), working as a lead handling support and production issues.
• Professional experience working in an agile, dynamic, and customer-facing environment is required.
• Understanding of distributed systems and cloud technologies (AWS) is highly preferred.
• Understanding of data streaming and scalable data processing is preferred.
• Experience with large-scale datasets and data lake/data warehouse technologies such as AWS Redshift, Google BigQuery, and Snowflake; Snowflake is highly preferred.
• At least 2+ years of experience with ETL (AWS Glue), Amazon S3, Amazon RDS, Amazon Kinesis, AWS Lambda, Apache Airflow, and AWS Step Functions.
• Strong knowledge of scripting languages such as SQL, Python, and UNIX shell, plus Spark, is required.
• Understanding of RDBMS, data ingestion, data flows, data integration, etc.
• Technical expertise with data models, data mining, and segmentation techniques.
• Experience with the full SDLC and Lean or Agile development methodologies.
• Knowledge of CI/CD and Git deployments.
• Ability to work in a team in a diverse, multi-stakeholder environment.
• Ability to communicate complex technology solutions to diverse audiences: technical, business, and management teams.

Responsibilities:
• Work with stakeholders to understand needs for data structure, availability, scalability, and accessibility.
• Develop tools to improve data flows between internal/external systems and the data lake/warehouse.
• Build robust and reproducible data ingest pipelines to collect, clean, harmonize, merge, and consolidate data sources.
• Understand existing data applications and infrastructure architecture.
• Build and support new data feeds for various data management layers and data lakes.
• Evaluate business needs and requirements.
• Support migration of existing data transformation jobs in Oracle and MS SQL to Snowflake.
• Lead the migration of existing data transformation jobs in Oracle, Hive, Impala, etc. into Spark/Python on Glue.
• Document processes and steps.
• Develop and maintain datasets.
• Improve data quality and efficiency.
• Lead business requirements and deliver accordingly.
• Collaborate with data scientists, architects, and the team on several data analytics projects.
• Collaborate with DevOps engineers to improve system deployment and monitoring processes.
• Experience with critical production support and how BAU functions is preferred.

Soft Skills:
• Ability to work in a collaborative environment and coach other team members on coding practices, design principles, and implementation patterns that lead to high-quality, maintainable solutions.
• Excellent communication and stakeholder management skills are required.
• Ability to engage senior leadership and report to them if required.
• Ability to work in a dynamic, agile environment within a geographically distributed team.
• Ability to focus on promptly addressing customer needs.
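For candidates unfamiliar with how the pieces of this stack fit together, below is a minimal, purely illustrative sketch (not part of the role description) of the Option 1 toolchain: an Airflow DAG, as it might run on MWAA, that triggers an AWS Glue (PySpark) job to write curated data to S3 and then loads that output into Snowflake. The DAG name, Glue job name, connection ID, stage, and table are all hypothetical placeholders.

# Illustrative only: an Airflow DAG (as would run on MWAA) that triggers an
# AWS Glue job and then loads the curated S3 output into Snowflake.
# DAG name, Glue job name, connection ID, stage, and table are hypothetical.
from datetime import datetime

from airflow import DAG
from airflow.providers.amazon.aws.operators.glue import GlueJobOperator
from airflow.providers.snowflake.operators.snowflake import SnowflakeOperator

with DAG(
    dag_id="bau_daily_ingest",            # hypothetical DAG name
    start_date=datetime(2024, 1, 1),
    schedule_interval="@daily",
    catchup=False,
    tags=["bau", "glue", "snowflake"],
) as dag:

    # Trigger an existing Glue (PySpark) job that cleans raw files and
    # writes curated Parquet back to S3.
    curate = GlueJobOperator(
        task_id="run_glue_curation",
        job_name="bau_curation_job",      # hypothetical Glue job name
        script_args={"--run_date": "{{ ds }}"},
        wait_for_completion=True,
    )

    # Load the curated S3 output into Snowflake through an external stage.
    load = SnowflakeOperator(
        task_id="load_to_snowflake",
        snowflake_conn_id="snowflake_default",
        sql="""
            COPY INTO ANALYTICS.PUBLIC.DAILY_FACTS
            FROM @ANALYTICS.PUBLIC.CURATED_STAGE/{{ ds }}/
            FILE_FORMAT = (TYPE = PARQUET)
            MATCH_BY_COLUMN_NAME = CASE_INSENSITIVE;
        """,
    )

    curate >> load

In a BAU production-support context, the day-to-day work around a DAG like this is less about writing it and more about monitoring runs, clearing and rerunning failed tasks, checking Glue run logs, and verifying Snowflake load history.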

Offered Salary

₹12.00 LPA
