Software Engineer, Data Platform

Munich — Device

About the Company:

World is building a real human network designed to accelerate people in the age of AI. As bots and autonomous agents reshape the internet, people, institutions, and applications need a trusted way to confirm who is a real human while preserving privacy. Our products make this possible: the Orb verifies real people, World ID proves it privately, and World App enables and distributes the new applications made possible by this technology. Together, they form a new layer for the AI internet.

We’re one of the fastest-growing networks in tech. More than 17 million people across 160 countries have verified with World ID, and we complete over 350,000 verifications each week. World App is already among the most used wallets globally. Developers are integrating World ID to build safer online experiences and create spaces where real people can participate, earn, and be recognized in ways AI simply can’t replicate.

World was founded in 2019 and launched globally in 2023. We are more than 400 people across hardware, software, AI, cryptography, mobile engineering, and global operations. Our teams come from OpenAI, Tesla, SpaceX, Apple, Google, Stripe, Meta, Coinbase, Palantir and MIT Media Lab. We’re backed by leading investors, including a16z, Khosla Ventures, Bain Capital Crypto, Blockchain Capital, Variant, Tiger Global, and Coinbase Ventures, as well as prominent operators and founders across fintech and AI.

World has been featured on the cover of TIME Magazine, highlighted in Fast Company’s Next 5 in Fintech, and explored in a Bloomberg deep dive. The New York Times, Bankless, and TechCrunch have all recognized our progress in identity, cryptography, AI, and global-scale hardware deployment. Our leadership has also been named to the TIME AI 100. Learn more about the newest product launches from our Unwrapped event.

This opportunity would be with Tools for Humanity.

About the AI & Biometrics Team:

The AI & Biometrics team is building a biometric recognition system that can work reliably with more than a billion users and enables them to claim their free share of WLD. We use cutting-edge machine learning models deployed on custom hardware to enable high-quality image acquisition, identification, and fraud prevention, all while requiring minimal user interaction.

We are building a biometric recognition and fraud detection engine that works at the scale of one billion people, which means it must outperform all current recognition technologies. We leverage our powerful custom-built iris recognition and presentation attack detection device, the Orb, combined with the latest research in AI and deep learning.

About the Opportunity:

You will join a high-impact team that maintains and evolves the data platform powering our AI pipelines. This is an all-rounder role that combines backend development, data engineering, infrastructure, and lightweight frontend work.

Your work will span the ingestion layer, transformation workflows, and the warehouse itself: designing resilient pipelines, building secure APIs, and creating services that make our datasets reliable, discoverable, and ready for large-scale training. You will also play a key role in rearchitecting existing systems into generic, reusable components, moving away from point solutions toward a data collection platform that can serve multiple programs without being rebuilt each time.

You will be a key contributor to the infrastructure that feeds and monitors our machine learning models in production: ensuring that data flows seamlessly, services run reliably, and governance standards are never compromised. Every solution you build will follow the highest security standards and rigorous data governance principles, ensuring sensitive biometric data is handled with absolute care.

This role is onsite 5 days/week and sits in our Munich office.

Key Responsibilities:

  • Design and operate automated data quality pipelines with human-in-the-loop review stages, combining automated checks with structured labeling workflows to determine whether ingested data meets acceptance criteria

  • Develop and refine transformation processes to deliver clean, well-structured datasets ready for analytics, model training, and evaluation: production-grade, with strong schema contracts

  • Instrument systems with metrics, alerts, and recovery mechanisms, and build internal tooling and dashboards that make dataset health, pipeline state, and operational metrics visible to engineers and stakeholders

  • Build APIs and backend services that provide secure, performant access to large datasets while upholding strict governance and privacy controls

  • Raise engineering standards by improving CI/CD pipelines, integration tests, and dependency management

  • Own the lifecycle of critical data assets, including lineage tracking, access control, and schema enforcement

You will work with both structured and semi-structured data, combining SQL-based platforms like Snowflake with NoSQL sources like MongoDB. You'll build resilient pipelines that handle versioning and schema evolution while remaining GDPR-compliant.

About You:

  • 5+ years of experience with Python, including building production services

  • Strong system design fundamentals, with experience evolving existing systems toward more generic, reusable designs

  • Experience designing and building APIs with security and performance requirements

  • Comfortable with containerization and orchestration tools like Docker and Kubernetes

  • Experienced with AWS services (S3, KMS, IAM) and Terraform for infrastructure as code

  • Skilled in designing and operating data ingestion and transformation workflows, with exposure to Snowflake or other SQL-based analytics platforms

  • Familiar with CI/CD pipelines and version control practices, ideally using GitHub Actions or similar tools

  • Committed to building systems that are secure, observable, and follow strong data governance principles

  • Obsessed with reliability, observability, and data governance; you care deeply about logs, metrics, and traceability

  • Experience with structured logging, metrics instrumentation, and alerting

  • Strong fundamentals in data modeling, schema design, and backward-compatible schema evolution

  • Comfortable working with NoSQL systems like MongoDB, especially for building ingestion frameworks, managing schema evolution, or integrating Change Streams into ETL pipelines

  • Knowledge of data partitioning strategies and large-scale dataset optimization for analytics and ML

  • Experience with event-driven data pipelines using SQS, SNS, Lambda, or Step Functions

Nice to have:

  • Proficiency in Go

  • Familiarity with annotation or labeling workflows and tooling

  • Exposure to monitoring and alerting stacks such as Datadog or Prometheus

  • Proficiency in Rust or an interest in learning it

Apply Now

Worldcoin participates in the E-Verify Program.

Worldcoin is an Equal Opportunity / Affirmative Action employer that values diversity within the company. All qualified applicants will receive equal consideration for employment regardless of their race, color, religion, sex, sexual orientation, age, national origin, disability, protected veteran status, gender identity, or any other factor protected by applicable federal, state, or local law. Worldcoin is also committed to working with and providing reasonable accommodations to individuals with disabilities. Please let your recruiter know if you need assistance with any part of the interview.