
Financial Data Engineer
Angel Studios
Provo, UT
This listing was removed by the employer on 5/1/2026 6:36:00 PM PST.
This is a Full Time Job
Office: Provo
You'll take ownership of Angel's financial data foundation
You'll take ownership of how Angel understands, reconciles, and operationalizes its financial data: building the datasets, integrations, and metric definitions that make our numbers accurate, traceable, and actionable. From Stripe and Shopify through NetSuite and into Snowflake, you'll ensure financial transactions are auditable from source events to journal-ready outputs, automate the workflows that keep Finance moving fast, and create a financial data foundation the company can rely on as we scale in a public-company environment.
Why Join Angel
• Massive Impact: Your work will shape how audiences experience the stories they love, and help amplify light around the globe.
• Extreme Ownership: We are a team of owners and entrepreneurs. This is your chance to operate like a startup founder inside a fast-scaling media company.
• Future of Streaming: Help build a streaming platform that competes with giants, without playing by their rules.
What You'll Do
As our Financial Systems Data Engineer, you'll work directly with our finance team to turn finance and operational data into trusted datasets, automated integrations, and clear reporting that helps Angel run with accuracy and speed. You'll work with focused depth, so you can deliver high-velocity, high-impact work on financial data foundations, reconciliations, and the metrics that power decision-making.
This is a hub-and-spoke role: you'll be a core, cross-functional member of the Data team, embedded day-to-day with Finance to accelerate outcomes, improve data quality at the source, and make financial metrics consistent and auditable.
• Design and maintain financial data pipelines - build and maintain pipelines that ingest, normalize, and reconcile data across payment, commerce, and accounting systems (Stripe, Shopify, NetSuite, etc.), ensuring transactions are traceable from source events through ledger/journal-ready outputs. Monitor and maintain pipeline reliability by handling upstream schema changes, data anomalies, and operational issues to ensure consistent data delivery.
• Model end-to-end transaction lifecycles - reconstruct and model complex financial flows (payments, refunds, fees, taxes, adjustments, chargebacks), including edge cases like prorated refunds, gross vs. net logic, and tax jurisdiction nuances, translating them into auditable datasets suitable for accounting and reporting.
• Build trusted financial datasets - write complex SQL in Snowflake (and dbt where appropriate) to create clean, reusable datasets that power reporting, reconciliations, forecasting support, and ad hoc analysis.
• Own cross-system integrations and identifiers - use Python to build and maintain integrations (REST APIs and event-driven workflows where applicable), including schema change handling, metadata mapping, and reconciliation of cross-system identifiers.
• Investigate and resolve discrepancies - trace transactions across multiple systems and data sources to identify root causes, resolve mismatches, and improve upstream data correctness and monitoring.
• Establish trust through financial metrics - define, refine, and communicate the metrics that matter (revenue, refunds/chargebacks, COGS, margin, deferred revenue concepts, cash timing), building stakeholder confidence in how we measure performance.
• Use AI to move faster (without lowering the bar) - leverage AI tools to accelerate SQL/dbt development, debugging, documentation, testing, and carefully scoped workflow automation, while maintaining high standards for correctness, traceability, and security.
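To make the reconciliation work above concrete, here is a minimal Python sketch of the pattern: normalize processor-style payment events into journal-ready rows, then flag any transaction whose computed net disagrees with the general ledger. All field names, ids, and amounts are hypothetical illustrations, not Angel's actual schema.

```python
from decimal import Decimal

# Hypothetical payment events as a processor's API might report them
# (amounts in cents, Stripe-style; refunds arrive as negative amounts).
events = [
    {"id": "ch_1", "type": "charge", "amount": 1999, "fee": 88},
    {"id": "ch_1", "type": "refund", "amount": -500, "fee": 0},
    {"id": "ch_2", "type": "charge", "amount": 4999, "fee": 175},
]

def to_ledger_rows(events):
    """Normalize raw events into journal-ready rows: gross, fee, and net
    dollar amounts keyed by the source transaction id for traceability."""
    rows = {}
    for e in events:
        row = rows.setdefault(e["id"], {"gross": Decimal(0), "fee": Decimal(0)})
        row["gross"] += Decimal(e["amount"]) / 100
        row["fee"] += Decimal(e["fee"]) / 100
    for row in rows.values():
        row["net"] = row["gross"] - row["fee"]
    return rows

def reconcile(ledger_rows, gl_balances):
    """Return the ids whose computed net disagrees with the GL balance."""
    return [
        txn_id for txn_id, row in ledger_rows.items()
        if row["net"] != gl_balances.get(txn_id)
    ]

rows = to_ledger_rows(events)
# Hypothetical balances pulled from the accounting system's GL.
gl = {"ch_1": Decimal("14.11"), "ch_2": Decimal("48.24")}
print(reconcile(rows, gl))  # → [] (every net ties out to the ledger)
```

In practice this logic would live in SQL/dbt models over Snowflake tables rather than an in-memory script, but the shape is the same: keep the source id on every derived row so each number can be traced back to its originating events.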
What Success Looks Like
• Finance and leadership trust the numbers: your Snowflake datasets (and dbt models where used) become the go-to source for financial reporting and analysis.
• Financial transactions are traceable end-to-end from Stripe/Shopify source events to NetSuite-ready outputs, with clear documentation and reconciliation paths.
• Manual reconciliations and recurring reporting tasks are automated, reducing close time and operational risk.
• Discrepancies across systems are identified earlier, debugged faster, and fixed at the root, improving correctness and confidence.
• Teams across the organization confidently use the metrics and data foundations you've built in Rill, Lightdash, and Metabase.
What You'll Need
• Strong SQL foundation - you can build reliable, reusable datasets from messy real-world tables, and you know how to validate results when numbers don't match expectations.
• Python fluency for integrations and automation - you can write production-quality scripts/services to move, transform, and reconcile data across systems (APIs, scheduled jobs, lightweight pipelines), with solid engineering hygiene.
• Systems integration experience - experience integrating systems through REST APIs and/or event-driven workflows, including handling schema changes, metadata reconciliation, and cross-system identifiers.
• Financial modeling and accounting literacy - you understand core accounting principles and terminology (GL, accruals, revenue recognition concepts, debits/credits) and can translate transaction flows into ledger-ready, auditable logic.
• Auditability and traceability mindset - you design datasets so transactions can be reconciled back to source systems and support compliance requirements (especially important in a public-company environment).
• High-velocity execution with deep focus - you deliver impact quickly and maintain momentum on deep technical work without constant context switching, while knowing when to build a durable solution versus a quick one.
• Strong ownership and initiative - you spot problems in the data, define an approach, and drive it to completion with urgency and accountability, raising the bar for data trust.
• Exceptional communicator - you can explain financial data logic clearly to both technical and non-technical stakeholders, and you write documentation that makes your work easier to adopt and audit.
• AI-enabled productivity mindset - you're comfortable (and ideally experienced) using AI to speed up development, improve code quality, generate documentation and tests, and support carefully scoped workflow automation, without compromising data correctness, auditability, or control.
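The systems-integration and schema-change requirements above can be sketched with a small Python example: map raw records from a source system onto a canonical schema via field aliases, and fail loudly at ingestion when an expected field disappears. The alias table and record fields here are hypothetical, chosen only to illustrate the pattern.

```python
# Hypothetical mapping from source-system field names to canonical names,
# illustrating how a pipeline can absorb upstream schema drift: when a
# vendor renames a field, you add an alias instead of rewriting the model.
FIELD_ALIASES = {
    "order_id": ["order_id", "orderId", "order_number"],
    "amount_cents": ["amount_cents", "amount", "total_cents"],
}

def normalize(record):
    """Map a raw record onto the canonical schema, raising loudly when a
    required field is missing so drift is caught at ingestion, not at close."""
    out = {}
    for canonical, aliases in FIELD_ALIASES.items():
        for alias in aliases:
            if alias in record:
                out[canonical] = record[alias]
                break
        else:  # no alias matched: surface the schema break immediately
            raise KeyError(f"missing required field: {canonical}")
    return out

print(normalize({"orderId": "A-100", "total_cents": 2599}))
# → {'order_id': 'A-100', 'amount_cents': 2599}
```

Failing fast at the boundary like this is what keeps a rename in an upstream API from silently zeroing out a revenue column three models downstream.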
Experience
• 6 years in a data engineering, analytics engineering, data science, or adjacent role with substantial SQL and Python responsibility (title flexible if skill fit is strong).
• Demonstrated ability to write complex SQL and build/maintain durable datasets in Snowflake; familiarity with dbt is a plus.
• Demonstrated ability