Cargill provides food, agriculture, financial and industrial products and services to the world. Together with farmers, customers, governments and communities, we help people thrive by applying our insights and over 150 years of experience.
We have 150,000 employees in 70 countries who are committed to feeding the world in a responsible way, reducing environmental impact and improving the communities where we live and work.
Cargill possesses and generates large volumes of data on a regular basis, and when that data is combined with external market data it has huge potential to generate revenue and/or save costs for the company.
All businesses and functions at Cargill are investing heavily to create data and analytics strategies and roadmaps, stand up analytics teams, develop data lakes and enable self-service analytics to make Cargill a data-driven company. The Analytics Hub at Cargill Business Services was set up to enable and expedite the company's analytics journey and help make it a success.
As a Data Engineer, you will work as part of the Analytics Hub team to identify and prepare datasets and build tables/views on the Hadoop platform for data discovery and exploration by Data Analysts, Data Scientists or business teams.
You will work directly with Business/Function teams, Enterprise Architecture and Business Analysts to understand business requirements and build solutions that meet their needs and objectives.
The solutions will need to integrate data from structured and unstructured sources using ETL tools or open source Hadoop technologies into the Cargill Data Platform.
The Data Engineer will ensure solutions developed are aligned to the EA/D&BI architecture standards, frameworks and principles, leverage common solutions and services, and meet financial targets.
75% Data Engineering
Design, build, test and deploy data integration solutions to move data from production systems (ERP & non-ERP) to Cargill’s data platform using ETL, ELT or Hadoop technologies.
Work with business and functional teams to understand requirements and create technical specifications, as appropriate, to design and implement the data integration solution.
Leverage architecture standards, patterns and APIs that enable simple use of complex datasets.
Ensure adherence to development and architecture standards and best practices.
Work with managed service partners to build, test and deploy the solution.
Work with Enterprise Architects/Data Architects to define a strategy for creating and maintaining metadata about Cargill's diverse applications and data sources.
Support the design, build, testing and maintenance of the Data Catalog, a key component of Cargill's Big Data and Advanced Analytics platform.
Effectively communicate insights, through written and oral presentations, to multiple levels of management across business units and functional areas.
Leverage strong business acumen and business domain knowledge to anticipate business inquiries and act accordingly and decisively.
10% Consulting and Relationship Management
Liaise with the respective Business or Function teams to build trust and relationships through efficient delivery and engagement.
Regularly interface with architects, analysts, process designers, and BU/Function subject matter experts to understand and evaluate business opportunities.
10% Create Artifacts and Build Knowledge Base for CoE
While working with a business or function, continue to contribute to the Analytics Hub community of practice by building reusable artifacts and sharing knowledge and best practices.
Establish contacts with analytics experts within and outside of Cargill to keep pace with the latest trends and developments in the analytics industry, and apply them to Cargill as relevant.
5% Miscellaneous Duties
As assigned on an ad-hoc basis.
Project demands may, from time to time, require work during non-standard business hours.
Provide consultation on project work and data-related problems.
Bachelor’s Degree in MIS, Statistics, Business or related field
At least 4 years of BI/data warehouse experience, including: analysis, technical design, coding, testing, deployment, and transition to support.
2+ years of experience working with Hadoop or another Big Data platform.
2+ years of experience working with RDBMSs such as SQL Server and MySQL.
Experience building Big Data solutions using Hadoop and / or NoSQL technology
Experience loading external data to Hadoop environments using tools like MapReduce, Sqoop, and Flume
Experience working with very large datasets and building programs that leverage the parallel capabilities of Hadoop and MPP platforms.
Experience interfacing with data-science products and creating tools for easier deployment of data-science tools
Experience with Scala and / or Spark.
Curious about data and passionate about the business value of big data and advanced analytics.
Experience with object-oriented programming.
Consulting experience with a demonstrated ability to multi-task, apply initiative and creativity on challenging projects, and keep customers engaged with innovative solutions.
Solid work experience in at least one of these analytics domains: Supply Chain, HR, IT, Finance, Trading.
Strong experience working directly with business stakeholders from across the globe.
Ability to ask next-level questions, anticipating business inquiries and performing root-cause analysis.
Ability to work in a globally dispersed team and matrix organization
Advanced level of English.
Preferred requirements
Experience with front-end BI tools (Tableau, Power BI, Business Objects).
Experience with statistical tools such as R