Role: Client Reporting Technical Analyst
Employment Type: Contract
Location: San Francisco, CA
Description:
Overview
We are seeking a Technical Analyst to support the design, delivery, and continuous improvement of Client Reporting technology. The person in this role will serve as an expert on the end-to-end client reporting data flow - from upstream data ingestion and pipeline management through to final report generation and distribution - ensuring accuracy, timeliness, and operational resilience across all reporting channels.
The ideal candidate brings hands-on experience with reporting platforms such as Seismic or Vermilion, a strong command of data pipelines and warehouse structures, and a working knowledge of the broader investment operations data ecosystem. This role is well suited for a self-starter who can independently diagnose data issues, partner with technology and business teams, and drive continuous improvement in a lean, high-ownership environment.
Key Responsibilities Include
• Serve as the subject matter expert on the end-to-end client reporting workflow, from data sourcing and transformation through report generation and distribution to institutional, intermediary, and internal stakeholders.
• Develop deep familiarity with report templates, delivery schedules, and client-specific data
requirements across standard and custom reporting packages.
• Manage and triage production report incidents, proactively identifying and resolving failures, data anomalies, or delivery exceptions before they impact clients.
• Collaborate with stakeholders to translate business reporting requirements into technical data
and platform specifications.
• Maintain a thorough understanding of the client reporting data flow - from source systems (vendor systems, internal data warehouse) through ingestion, transformation, and final report output.
• Support and troubleshoot data pipelines feeding reporting platforms, including identifying root
causes of data quality issues, mapping upstream data dependencies, and coordinating
resolution with data and platform teams.
• Document data lineage, pipeline logic, and table/schema structures for client reporting datasets,
contributing to a maintainable and auditable data environment.
• Support platform upgrades, integrations, and user acceptance testing (UAT) as part of the broader SDLC process.
• Contribute to data governance practices including metadata documentation, data dictionary
maintenance, and lineage tracking within platforms such as Ataccama or Collibra.
• Maintain and prioritize a backlog of reporting enhancements, automation opportunities, and
data quality improvements.
• Provide hands-on support for ad hoc data requests and analysis in support of client deliverables,
regulatory filings, and business initiatives.
Qualifications
• 3–7 years of experience in financial services, preferably within a buy-side asset manager supporting client reporting, investment operations, or related data functions.
• Demonstrated understanding of the end-to-end client reporting data flow, including data
sourcing from accounting and operations platforms, pipeline transformations, and report
distribution.
• Proficiency in SQL and the ability to independently query, analyze, and troubleshoot complex
datasets across relational databases or cloud data warehouses (Snowflake preferred).
• Working knowledge of data pipeline concepts, including ingestion patterns, staging layers,
transformation logic, and table/schema design.
• Strong analytical and problem-solving skills with a demonstrated ability to identify data issues,
trace root causes, and drive resolution independently.
• Experience managing or supporting technology projects within the SDLC framework.
• Excellent communication and stakeholder management skills.
• Demonstrated ability to take initiative, work independently, and remain accountable for delivering results in a lean, low-structure environment.
• Experience with Eagle PACE, Eagle SRM, or comparable fund accounting and security master
platforms strongly preferred.
• Familiarity with Snowflake or similar cloud data warehouse environments, including an understanding of data models supporting performance, holdings, and characteristics, preferred.
• Experience with scripting or automation (Python or equivalent) to support pipeline monitoring,
report validation, or workflow automation preferred.
• Bachelor's degree required.