Are you a Data Analytics specialist? Do you have Data Warehousing and / or Hadoop / Data Lake experience? Do you like to solve the most complex and highest-scale (billions+ records) data challenges in the world today?
Would you like a career that gives you opportunities to help customers and partners use cloud computing to do big new things faster and at lower cost?
Do you want to be part of history and transform businesses through cloud computing adoption? Do you like to work on-site in a variety of business environments, leading teams through high impact projects that use the newest data analytic technologies?
Would you like a career path that enables you to progress with the rapid adoption of cloud computing?
At Amazon Web Services (AWS), we’re hiring highly technical cloud computing architects to collaborate with our customers and partners on key engagements.
Our consultants will develop and deliver proof-of-concept projects, technical workshops, and support implementation projects.
These professional services engagements involve emerging technologies like AI, IoT, and Data Analytics. This role will specifically focus on Data and Analytics capabilities, helping our customers and partners remove the constraints that prevent them from leveraging their data to develop business insights.
At AWS, we are hiring the best Data / Analytics cloud computing consultants who can help our clients and partners derive business value from data in the cloud.
Our consultants will collaborate with partner and client teams to deliver proof-of-concept projects, conduct topical workshops, and lead implementation projects.
These professional services engagements will focus on large-scale data warehousing and database migration capabilities and on customer solutions such as batch / real-time data processing and data / business intelligence.
Our team collaborates across the entire AWS organization, bringing in product and service teams to get the right solution delivered and to drive feature innovation based on customer needs.
Responsibilities include:
Expertise - Collaborate with AWS field sales, pre-sales, training and support teams to help partners and customers learn and use AWS services such as Redshift, Database Migration Service, Glue, EMR, Elastic Compute Cloud (EC2), S3, Relational Database Service (RDS) / Aurora, and Amazon Kinesis.
Solution - Deliver on-site technical engagements with partners and customers. This includes participating in pre-sales on-site visits, understanding customer requirements, generating consulting proposals, contributing to internal Area of Depth (AoD) programs, authoring AWS Data Analytics best practice blogs / whitepapers, and creating packaged data service offerings.
Delivery - Engagements include short on-site projects proving the use of AWS services to support new distributed computing solutions that often span private cloud and public cloud services.
Engagements will include migration of existing applications and development of new applications using AWS cloud services.
Insights - Work with AWS engineering and support teams to convey partner and customer needs and feedback as input to technology roadmaps.
Share real world implementation challenges and recommend new capabilities that would simplify adoption and drive greater value from use of AWS cloud services.
Push the envelope - Cloud computing is reducing the historical IT constraints on businesses. Imagine bold possibilities and work with our clients and partners to find innovative new ways to satisfy business needs through Data / Business Intelligence cloud computing.
Ability to travel to client locations to deliver professional services, as needed.
Bachelor’s degree, in Computer Science, Engineering, Mathematics or a related field or equivalent professional or military experience
5+ years of experience in IT platform implementation in a technical and analytical role
3+ years of experience in Data Lake / Hadoop platform implementation
2+ years of hands-on experience implementing and performance-tuning Hadoop / Spark deployments
Experience with Apache Hadoop and the Hadoop ecosystem
Experience with one or more relevant tools (Sqoop, Flume, Kafka, Oozie, Hue, Zookeeper, HCatalog, Solr, Avro)
Experience with one or more SQL-on-Hadoop technologies (Hive, Impala, Spark SQL, Presto)
Experience developing software code in one or more programming languages (Java, Python, etc.)
Current hands-on implementation experience required
Ability to travel to client locations when needed (up to 50%, regionally)
Ability to communicate fluently in Spanish and English
Master's or PhD in Computer Science, Physics, Engineering, or Math.
Hands-on experience leading large-scale, full-cycle MPP enterprise data warehousing (EDW) and analytics projects (including migrations to Amazon Redshift).
Ability to lead effectively across organizations and partners.
At least one of the AWS Associate level certifications or higher.
Industry leadership in the fields of database, data warehousing or data sciences.
Ability to think strategically about business, product, and technical challenges in an enterprise environment.
Ability to communicate fluently in Spanish, English and Portuguese