A global market leader in analytics for resilience planning and enterprise climate risk management, particularly for financial services, industry, the public sector, and NGOs.
Led by pioneers in data, climate, and earth and ocean sciences, as well as technology, risk management, company building, and public policy. Their climate risk modeling solutions save lives and mitigate potentially catastrophic impacts inflicted by hurricanes, floods, heat waves, wildfires, drought, and other extreme weather events on homes, businesses, infrastructure, food and water supplies, and entire economies.
This position is in a fast-growing company created to meet the global demand for local climate and weather information to protect and develop assets and to manage risk in operations. You will work with our exceptional scientific and technical staff, whose experience spans environmental modeling, impacts, and machine learning, and you will be part of the team deploying models in an elastic computing environment.
Your critical responsibilities include, but are not limited to:
- Design, develop, implement, optimize, and maintain data engineering pipelines into Snowflake (see the sketch after this list).
- Be the in-house expert on all things Snowflake and build a cloud data warehouse for analytical and data science workloads.
- Identify, prioritize and execute tasks in the software development life cycle.
- Analyze requirements and translate them into technical solutions.
- Develop high-quality software solutions that deliver the required product features.
- Learn independently to keep up with new technologies, schedules, and deliverables.
- Apply appropriate tools to analyze, identify, and resolve technical problems.
- Develop and enforce coding guidelines and best practices.
- Write design documentation, user documentation and test information as required.
- Participate in team meetings: daily stand-ups, bi-weekly sprint reviews, design meetings.
- Work closely with product, science, and delivery teams.
- Proactively communicate status on all projects and releases.
- Mentor junior and mid-level engineers.
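For a concrete flavor of the pipeline work above, here is a minimal sketch of a batch load into Snowflake using the snowflake-connector-python package. The credentials, warehouse, database, table, and file path are hypothetical placeholders, not a prescribed setup.

```python
import os

import snowflake.connector  # pip install snowflake-connector-python


def load_daily_extract() -> None:
    # Credentials come from the environment; every object name below
    # (warehouse, database, schema, table) is a hypothetical placeholder.
    conn = snowflake.connector.connect(
        account=os.environ["SNOWFLAKE_ACCOUNT"],
        user=os.environ["SNOWFLAKE_USER"],
        password=os.environ["SNOWFLAKE_PASSWORD"],
        warehouse="ANALYTICS_WH",
        database="CLIMATE_DB",
        schema="RAW",
    )
    try:
        cur = conn.cursor()
        # Upload the local extract to the table's internal stage (@%TABLE)...
        cur.execute("PUT file:///tmp/daily_extract.csv @%OBSERVATIONS")
        # ...then bulk-load the staged file into the table.
        cur.execute(
            "COPY INTO OBSERVATIONS FROM @%OBSERVATIONS "
            "FILE_FORMAT = (TYPE = 'CSV' SKIP_HEADER = 1)"
        )
    finally:
        conn.close()


if __name__ == "__main__":
    load_daily_extract()
```

In production this kind of load would more likely run as continuous ingestion via Snowpipe rather than an ad hoc PUT/COPY, but the sketch shows the core moving parts.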
Qualifications:
- Bachelor's degree or equivalent practical experience.
- 4+ years of experience with Snowflake, Snowpipe, SnowSQL, and data sharing.
- Previous experience with an end-to-end implementation of Snowflake or a similar cloud-based data warehouse.
- Excellent understanding of and practical experience working with petascale data.
- Good written and verbal communication skills.
- Proficiency in object-oriented design and programming.
- Proficiency in client-server architecture, containers and microservices.
- Proficiency in programming languages such as Python, Java, or Go.
- Proficiency in configuring and deploying applications on Linux-based systems.
- Knowledge of and proficiency with cloud-based systems such as AWS (EC2, S3, RDS, Lambda), GCP, or Azure.
- Experience architecting solutions on distributed processing frameworks such as Spark or Flink is a plus.
- Experience with NoSQL databases and distributed SQL query engines (e.g., AWS DynamoDB, Presto, or AWS Athena) is a plus.
- Experience with geospatial data, map projections, and multidimensional data structures is a plus (see the sketch below).
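As a small illustration of the geospatial point above, the sketch below reprojects a coordinate between reference systems with the pyproj library; the EPSG codes and the sample point are illustrative choices only.

```python
from pyproj import Transformer  # pip install pyproj

# Build a transformer from WGS 84 lon/lat (EPSG:4326) to Web Mercator
# (EPSG:3857); always_xy=True pins the axis order to (lon, lat).
transformer = Transformer.from_crs("EPSG:4326", "EPSG:3857", always_xy=True)

lon, lat = -74.0060, 40.7128  # illustrative point (New York City)
x, y = transformer.transform(lon, lat)
print(f"EPSG:3857: x={x:.1f} m, y={y:.1f} m")
```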