A Data Engineer at MightyHive, with a thorough understanding of cloud technologies, helps deploy Big Data solutions and builds and manages data pipelines, ETL processes, and cloud data migrations in a secure, scalable, production-ready manner.
The data engineer should be a data specialist with strong experience in deploying and automating data pipelines, data development, data cleansing, data warehousing, and monitoring data processing systems, and in working with large and varied sets of structured and unstructured data.
The data engineer will have experience ingesting real-time data from various data sources and designing new data platforms.
Responsibilities
Design and build data pipelines using a cloud platform.
Manage and provision the cloud solution infrastructure.
Design for data security and compliance.
Manage and automate ETL and cloud deployment implementations.
Ensure solution and operations reliability.
Design and implement Big Data and Data Warehousing solutions with their corresponding Data Governance processes.
Manage cloud databases.
Provide domain expertise around public cloud and enterprise technology.
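As a rough illustration of the pipeline design-and-build work described above, here is a minimal extract/transform/load sketch in plain Python. All names and records are hypothetical; a production pipeline at this level would typically use a framework such as Apache Beam or Airflow and managed cloud sources and sinks rather than in-memory lists.

```python
# Minimal ETL sketch (hypothetical data; not a real MightyHive system).

def extract():
    # In production this might read from Pub/Sub, Cloud Storage, or an API.
    return [
        {"user": "alice", "amount": "12.50"},
        {"user": "BOB ", "amount": "3.00"},
        {"user": None, "amount": "9.99"},  # dirty record, to be filtered out
    ]

def transform(records):
    # Cleansing: drop incomplete rows, normalize names, parse amounts.
    cleaned = []
    for r in records:
        if not r["user"]:
            continue
        cleaned.append({"user": r["user"].strip().lower(),
                        "amount": float(r["amount"])})
    return cleaned

def load(records, warehouse):
    # In production this might write to a warehouse such as BigQuery;
    # here the "warehouse" is just an in-memory list.
    warehouse.extend(records)

warehouse = []
load(transform(extract()), warehouse)
print(warehouse)
```

The same extract → cleanse → load shape carries over directly when the stages become managed cloud services; only the connectors change.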
Required skills and qualifications
Demonstrable, deep knowledge of and experience in cloud migration, cloud strategy and transformation, and cloud architecture and engineering.
Broad knowledge of the major cloud vendors with deep knowledge in GCP.
One or more GCP certifications: Cloud Architect, Data Engineer, or Cloud Engineer.
Understanding of the high-level levers for cost-effective cloud delivery.
Deep hands-on experience with cloud orchestration tooling and infrastructure-as-code (Terraform, Chef, Ansible, or Puppet).
Deep hands-on experience building scalable ETL pipelines (Apache Beam, Airflow).
Technical depth and experience with SQL.
Strong problem-solving and analytics experience.
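To illustrate the kind of SQL depth the requirements above call for, here is a small self-contained sketch of an analytical aggregation query, run against an in-memory SQLite database via Python's built-in sqlite3 module. The table and column names are made up for illustration; in practice the same query shape would run against a cloud warehouse such as BigQuery.

```python
import sqlite3

# Hypothetical example: per-user totals over an events table,
# using an in-memory SQLite database so the snippet is self-contained.
conn = sqlite3.connect(":memory:")
conn.execute("CREATE TABLE events (user TEXT, amount REAL)")
conn.executemany("INSERT INTO events VALUES (?, ?)",
                 [("alice", 12.5), ("bob", 3.0), ("alice", 7.5)])

# Aggregate, then order by the computed total, largest first.
rows = conn.execute("""
    SELECT user, SUM(amount) AS total
    FROM events
    GROUP BY user
    ORDER BY total DESC
""").fetchall()
print(rows)  # [('alice', 20.0), ('bob', 3.0)]
conn.close()
```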