The Data Warehouse Developer & Data Engineer will be part of the FM Intelligence team within the Facilities / Property Management Engineering organization at JLL Technologies.
This person will collaborate with data scientists, business analysts, fellow data engineers, application owners, the tech operations team, and various other stakeholders in a dynamic and rapidly changing environment.
This person will be responsible for key processes that enable embedding data-driven intelligence in JLLT technology products.
Required Skills :
Minimum of 4 years of experience in data engineering and data operations roles; ideally 7+ years.
Bachelor’s degree in computer science, engineering, or a similar quantitative field
Expert at writing SQL queries and working with databases (such as Snowflake)
Expert user of programming languages for data analytics (such as Python and R)
Expert user of various ELT tools and technologies (such as Fivetran, DBT, among others)
Knowledgeable about macro and template languages (such as SAS Macros, Jinja templates, etc.)
Knowledgeable about cloud technologies (such as Azure Machine Learning Service)
Knowledge of big-data technologies such as Spark is a strong plus
Knowledgeable about data visualization platforms and tools (such as Tableau Server and Tableau Desktop)
Proficient user of Microsoft Office applications (Excel, Word, PPT)
Ability to quickly learn the technology stack of the Corrigo AI / BI team (Snowflake, DBT, Python, Azure, etc.)
Strong attention to quality and to detail
Strong problem-solving skills
Excellent communication skills
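The SQL-from-code skill listed above is of the everyday-analytics kind. As a minimal, illustrative sketch (using Python's built-in SQLite in place of Snowflake; the `events` table and its columns are hypothetical), the work often looks like querying and aggregating data directly from a script:

```python
import sqlite3

# In-memory database standing in for a real warehouse connection.
conn = sqlite3.connect(":memory:")
conn.execute("CREATE TABLE events (user_id INTEGER, action TEXT)")
conn.executemany(
    "INSERT INTO events VALUES (?, ?)",
    [(1, "login"), (1, "click"), (2, "login"), (1, "login")],
)

# Aggregate actions per user -- the bread and butter of data exploration.
rows = conn.execute(
    "SELECT user_id, COUNT(*) AS n FROM events "
    "GROUP BY user_id ORDER BY n DESC"
).fetchall()
```

In production the same pattern applies with a Snowflake connector and far larger tables; only the connection object and SQL dialect change.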
Job Responsibilities :
Design, develop, and improve high-quality data pipelines, essential for best-in-class ML / AI applications, at scale.
Design, develop, and improve ELT processes, tables, views, SQL reports, and other data warehouse elements.
Collaborate with other members of the intelligence team, and with the broader engineering team, to implement large-scale solutions that deliver business value.
Partner with internal clients, data scientists, and fellow data engineers on requirements grooming.
Partner with application owners to identify data sources and to manage change, integration, and productization of the intelligence.
Partner with the technical operations team on security requirements, infrastructure requirements, daily operational execution, and support.
Troubleshoot data issues, find their root causes, and either fix them directly or take the lead in working with others to fix them.
Ensure adequate performance of the data warehouse and data pipelines. Follow best practices and maintain simplicity while abiding by all architectural requirements aligned with our business goals.
Ensure quality of the input data, intermediate tables, and results to be delivered. Test any code / system changes in a validation environment before implementing them in production.
Develop automated tests and reports to ensure data quality is maintained when the data changes daily. Compare source and destination systems to ensure replication is successful.
Monitor logs and notifications. Participate in peer reviews. Support data quality improvement projects.
Ensure the processes and changes are sustainable from an operational standpoint. Understand how the capabilities being developed affect the jobs of those who run, monitor, and support them in production.
Learn new tools and technologies as needed for the job. Continue to learn and adopt DataOps best practices.
Write and run database queries to explore data, answer questions, research business issues, and develop metrics.
Automate data processes to run with the frequency needed, in both batch and real time scenarios.
Optimize the data flows from a performance, data quality, cost, and availability standpoint.
Keep code organized and documented, and ensure all results are always reproducible. Ensure ease of maintenance, readability, and reproducibility, among other best practices.
Review the data model with both technical and business audiences, as well as with Data Governance.
Perform various other related activities as determined by management.
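One responsibility above, verifying that replication from a source system to a destination succeeded, can be sketched as a simple row-count comparison. This is a minimal illustration, not the team's actual tooling: two in-memory SQLite connections stand in for the real source and destination databases, and the `orders` table is hypothetical.

```python
import sqlite3

def replication_check(source_conn, dest_conn, table):
    """Compare row counts between a source and a destination table.

    A minimal data-quality check; real pipelines would also compare
    checksums, column aggregates, or sampled rows.
    """
    src = source_conn.execute(f"SELECT COUNT(*) FROM {table}").fetchone()[0]
    dst = dest_conn.execute(f"SELECT COUNT(*) FROM {table}").fetchone()[0]
    return {"table": table, "source_rows": src, "dest_rows": dst,
            "match": src == dst}

# Simulated source and destination databases.
source = sqlite3.connect(":memory:")
dest = sqlite3.connect(":memory:")
for conn in (source, dest):
    conn.execute("CREATE TABLE orders (id INTEGER, amount REAL)")
source.executemany("INSERT INTO orders VALUES (?, ?)",
                   [(1, 9.99), (2, 19.99), (3, 4.50)])
dest.executemany("INSERT INTO orders VALUES (?, ?)",
                 [(1, 9.99), (2, 19.99)])  # one row missing: replication lag

result = replication_check(source, dest, "orders")
```

Scheduled daily (per the responsibility above), a check like this flags the mismatch so the discrepancy can be traced to its root cause before downstream reports consume the data.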
What We Offer
Exciting Projects : Come take your place at the forefront of digital transformation! With clients across all industries and sectors, we offer an opportunity to work on market-defining products using the latest technologies.
Collaborative Environment : Expand your skills by collaborating with a diverse team of highly talented people in an open, laid-back environment, or even abroad in one of our global centers or client facilities!
Work-Life Balance : GlobalLogic prioritizes work-life balance, which is why we offer flexible work schedules, opportunities to work from home, and paid time off on your birthday.
Professional Development : Our dedicated Learning & Development team regularly organizes English classes, professional certifications, and technical and soft skill trainings.
We also offer the chance to travel internationally.
Excellent Benefits : We provide our employees with competitive salaries, family medical insurance, extended paternity leave, annual performance bonuses, and referral bonuses.
Fun Perks : We want you to love where you work, which is why we cater breakfast daily and offer discounts for popular stores and restaurants!