Senior Engineer - Data, Platform & Analytics

Element14 · Remote / Washington DC

Data + Analytics
Poverty Alleviation & Economic Development
Public Infrastructure
Public Service & Civic Engagement
$140,000 - $180,000 Per Year


Element14 is hiring a Senior Engineer to work across the full surface of our financial analytics practice for a public sector organization focused on housing and community development. One sprint you might be building data pipelines that bring commercial data into the lakehouse. The next, a backend service that surfaces analytical outputs to program staff. The week after, wiring an LLM into a workflow that turns unstructured documents into structured signals. This is a generalist seat for someone who is comfortable moving across data, backend, and AI tooling without needing a handoff — and senior enough to be effective in each.

What you would work on

Element14 is building a production financial analytics capability for a public sector organization focused on housing and community development. The organization manages a large grant and program portfolio, and the team's job is to help leadership and program offices understand it — where the money is going, how programs are performing, where the patterns in the data warrant attention, and what the modeling work can do to support better decisions.

The team produces three flavors of analytics, each with its own users and its own data. Portfolio analytics gives leadership the macro view of the program landscape. Entity analytics gives program officers and analysts the per-firm view that supports oversight and program management. Transaction analytics gives operating staff a per-payment view at the point of action. We start producing real value on day one, using the organization's own data and public sources.

The work is data science and engineering applied to financial data. Building statistical and ML models on disbursement and recipient data. Designing analytical products that program staff and leadership actually use. Connecting the organization's internal systems to public datasets in ways that reveal patterns the organization could not see on its own. Careful, defensible work that holds up under scrutiny.

We are AI-first by default. Modern data science and ML are part of the toolkit on every engagement — LLMs for parsing unstructured documents and free-text fields, ML for scoring and classification, agentic workflows for repetitive analytical work. We are not chasing AI for its own sake, but we are not doing 2018-era data science either. We expect the people on this team to be fluent in current tools and to use them to be faster and sharper than the consulting median.

The data sources span the organization's own systems, commercial data, and public records. Organizational systems include the general ledger, grant tracking, and program disbursement systems. Commercial sources include major entity and identity data providers. Public sources include USASpending.gov, FFATA sub-awards, SAM.gov, and IRS Form 990s. The interesting analytical questions almost always live at the intersection.

Beyond this engagement, we expect this team to grow with the firm. As Element14 wins additional federal and state work, the people we hire now will help shape future engagements and the capabilities we build.

What you will do

  • Move across the stack as the work demands. One week is data pipelines, another is backend services or APIs, another is AI tooling. You do not need to be expert in all of it, but you need to be effective in most of it and willing to fill in the rest. We are a small, senior team — versatility is what makes it work.
  • Contribute to the integration layer. Help bring data in from the organization's internal systems where access is granted, major commercial data providers, and public records (USASpending.gov, FFATA, SAM.gov, IRS 990s). APIs, SFTP, bulk file drops — handle the full range.
  • Help shape the cloud lakehouse. Work alongside the team on schema design, storage layout, and the bronze/silver/gold structure that makes analytical work fast and audit-defensible. You do not need to own the platform end-to-end, but you should be able to contribute to it and reason about the choices.
  • Build backend services and APIs that put analytical work in front of the people who use it. Model scoring endpoints, data product surfaces, integrations between the lakehouse and the application layer. Python or Node.js; REST or GraphQL; async jobs where the work calls for it.
  • Contribute to entity resolution work. The same legal entity often appears under multiple identifiers across programs; the same recipient appears in different forms across datasets. Help ship the linkage layer the rest of the team builds on — Splink, Senzing, dedupe, or comparable.
  • Help maintain data lineage and governance. Versioned transformations, PII handling controls, documentation that holds up to an A&A review and an audit. Not your sole domain — but you treat it as part of doing the work properly.
  • Use modern AI tooling as a standard part of your engineering workflow. LLMs and agentic coding tools to accelerate code, generate transformations, and explore unfamiliar data. We expect this on every engineering role on the team.

Who we are looking for

In addition to the qualities we look for in everyone on the team:

  • Genuine interest in public service. You are excited about helping government agencies operate more effectively and advance their mission. The variety appeals to you — one quarter you might be working on financial integrity for housing programs, the next on agricultural data, the next on health programs.
  • Commitment to technical craft. You stay current. You can point to tools and techniques you have picked up in the last twelve months and explain why they matter. Cloud, data, and AI are central to what we do, and we expect that to be central to how you think too.
  • Strong communication and relationship-building. You build trust with government stakeholders by listening carefully, explaining clearly, and following through. You learn the program — not just the data — deeply enough that clients want you in the room.
  • Drive and ownership. This is not a clock-in, clock-out role. We want people who care, and that shows up in their work — anticipating the next question, fixing what is broken before being asked, and treating the mission as their own.
  • Senior engineering depth and range. Five or more years building production systems at meaningful scale. Strong Python and SQL. Comfortable across data pipelines and backend services — you have written both and you know which one a problem needs. You write code other engineers can read and maintain, and you have shipped systems that real users depended on.
  • Data engineering literacy. You have built data pipelines before. Hands-on experience with Apache Spark for distributed processing. You understand partitioning, broadcast joins, and why a slow Spark job is slow — even if you are not the person who lives in Spark internals day to day.
  • AWS comfort. Hands-on experience with the AWS stack — some combination of S3, Glue, Athena, Lambda, ECS or EKS, API Gateway, RDS, IAM, VPC. You can architect a service or a pipeline end-to-end and ship to a FedRAMP-aligned environment without surprises. Familiarity with GovCloud is a plus.
  • Lakehouse and modeling literacy. You can read and contribute to a schema designed for analytical workloads. You know the difference between star schemas, wide denormalized tables, and lakehouse layouts, and you can hold your own in a design conversation about file formats (Parquet, Iceberg, Delta), partitioning, and clustering.
  • Integration range. You have integrated heterogeneous sources — REST APIs, SFTP feeds, bulk file drops, change-data-capture from operational systems — and built the orchestration to make them reliable. You have opinions about Airflow, Dagster, and Step Functions even if you do not pick favorites.
  • Governance and lineage as part of the craft. You take audit-defensibility seriously even when it is not the most exciting part of the work. You version your transformations. You document. You build for the auditor who will read your lineage two years from now and need to trust it.
  • Applied, not academic. You roll up your sleeves and ship. You build for the actual scale and complexity of the problem, not the one in the architecture diagram. You leave systems easier to work with than you found them.
  • AI-first. Comfortable using modern LLMs and agentic coding tools as part of your daily engineering workflow. You use them to be faster and more thorough — not as a substitute for understanding.
  • Preferred: evidence of work you have done outside a paid job — open-source contributions, side projects, academic research, hackathons, or a portfolio you can walk us through. We read these as signals of curiosity and craft.
  • Also a plus: backend or API development experience (Python, Node.js, or comparable); experience with Databricks Unity Catalog or comparable governance tooling (Collibra, Alation); experience with entity resolution at scale (Splink, Senzing, or comparable); experience with FedRAMP or AWS GovCloud; experience with government financial data; experience integrating major commercial data vendors; orchestration and transformation tooling such as Airflow, Dagster, or dbt.
  • Required: ability to obtain a U.S. Federal Public Trust clearance. This requires U.S. citizenship or lawful permanent residency and a successful background investigation.

Salary range: $140,000 – $180,000 annual base. The final offer is determined by experience, depth, and the specific seat. We post ranges because we believe in transparency about pay.

Location: Hybrid — Washington, DC metro area preferred for periodic on-site collaboration with the client. Remote candidates within the United States considered for the right fit. U.S. work authorization required.

Perks & benefits

  • Health, dental & vision coverage. Comprehensive medical plans with generous company support toward your premiums.
  • 7 weeks paid time off. 11 federal holidays, 15 vacation days, and 8 sick days each year.
  • 401(k) retirement plan. Save for the future, pre-tax.
  • Remote-friendly across the U.S. Work from anywhere in the United States, with periodic on-site collaboration in the DC metro area where the work calls for it. We value autonomy and trust.
  • Mission-driven work. Meaningful projects that change how government uses data and technology in service of the public.
  • Small, tight-knit team. A senior bench where your ideas matter and your growth is encouraged.

How to apply

Apply through Tech Jobs for Good, or send your resume and a short note about why this work interests you to [email protected]. If you have a portfolio, GitHub, blog, or anything else that shows how you think — please include it. We read everything that comes in.

Element14 is an equal opportunity employer. We are committed to building a team that reflects the public we serve.
