Flex Engineering is seeking senior engineers to join the Data Infrastructure team, which is shaping the future of our finance data platforms. The team is dedicated to building a scalable, modern data platform that supports Flex's current and future financial products. Our goal is to empower every facet of the business with Machine Learning and Business Intelligence, making it easier for end users to put our data to work. Our impact extends beyond reporting and analytics: our tools directly influence Flex's growth and strategic direction. At Flex, we have a culture of data-driven decision-making, and we demand data that is timely, accurate, and actionable.
What You’ll Do
- Design, implement, and maintain high-quality data infrastructure services, including but not limited to Data Lake, Kafka, Amazon Kinesis, and data access layers.
- Develop robust and efficient DBT models and jobs to support analytics reporting and machine learning modeling.
- Collaborate closely with the Analytics team on data modeling, reporting, and data ingestion.
- Create scalable real-time streaming pipelines and offline ETL pipelines.
- Design, implement, and manage a data warehouse that provides secure access to large datasets.
- Continuously improve data operations by automating manual processes, optimizing data delivery, and redesigning infrastructure for greater scalability.
- Create engineering documentation for design, runbooks, and best practices.
What We’re Looking For
- A minimum of 4 years of industry experience in the data infrastructure/data engineering domain.
- A minimum of 4 years of experience with Python and SQL. Java experience is a plus.
- A minimum of 2 years of industry experience using DBT.
- A minimum of 2 years of industry experience using Snowflake and its core features.
- Familiarity with AWS services, with industry experience using Lambda, Step Functions, Glue, RDS, EKS, DMS, EMR, etc.
- Industry experience with different big data platforms and tools such as Snowflake, Kafka, Hadoop, Hive, Spark, Cassandra, Airflow, etc.
- Industry experience working with relational and NoSQL databases in a production environment.
- Strong fundamentals in data structures, algorithms, and design patterns.
Core Competencies
- Prior experience working on cross-functional teams.
- Industry experience in using Infrastructure as Code tools, specifically CDK and Terraform.
- Experience with CI/CD to improve code stability and code quality.
- Motivated to help other engineers succeed and be effective.
- Excited to work in an ambiguous, fast-paced, and high-growth dynamic environment.
- Strong written and verbal communication skills.
For working locations in NY/NJ/CA, the base salary pay range is $197,000-$213,000.
For all other states, the base salary pay range is $177,000-$192,000.