
Portfolio Data Engineer

Remote, UK
Data

The Role

Portfolio Data Integration is part of the broader Addepar Platform team. The Addepar Platform provides a single source of truth, a “data fabric” used across the Addepar product set, including a centralised, self-describing repository (a.k.a. Data Lake), a set of API-driven data services, an integration pipeline, analytics infrastructure, warehousing solutions, and operating tools. The team is responsible for all data acquisition, conversion, cleansing, disambiguation, modelling, tooling, and infrastructure related to the integration of client portfolio data.

Addepar’s core business relies on the ability to quickly and accurately ingest data from a variety of sources, including third-party data providers, custodial banks, data APIs, and even direct user input. Portfolio data integrations and feeds are a highly critical cross-section of this set, giving our users automatically updated and reconciled information on their latest holdings within the platform.

As a data engineer on this team, you will develop new data integrations and maintain existing processes to expand and improve our data platform. You’ll add automation and functionality to our distributed data pipelines by writing PySpark code and integrating it within our Databricks Data Lake. As you gain experience, you’ll contribute to increasingly challenging engineering projects within our platform, with the ultimate goal of dramatically increasing the throughput of data ingestion for Addepar. This is a crucial, highly visible role within the company. Your team plays a major part in growing and serving Addepar’s client base with minimal manual effort required from our clients or from our internal data operations team.

What You’ll Do

  • Own individual project priorities, deadlines, and deliverables
  • Build pipelines that support the ingestion, analysis, and enrichment of financial data in partnership with business data analysts
  • Improve the existing pipeline to increase the throughput and accuracy of data
  • Develop and maintain efficient process controls and accurate metrics to ensure quality standards and organisational expectations are met
  • Partner with members of Product and Engineering to design, test, and implement new processes and tooling features that improve data quality as well as increase operational efficiency
  • Identify areas of automation opportunities and implement improvements
  • Understand data models and schemas, and work with other engineering teams to recommend extensions and changes

Who You Are

  • 1-3 years of professional data engineering/analysis experience
  • Experience with object-oriented programming (we use PySpark/Python)
  • Knowledge of SQL or relational database concepts
  • Experience in data modelling, visualisation, and ETL pipelines
  • Knowledge of financial concepts (e.g., stocks, bonds, etc.) is helpful but not necessary
  • Passion for the world of FinTech and solving previously intractable problems at the heart of investment management

Addepar
