
William Hill

Meroxa

William Hill is a staff software engineer at Meroxa with over a decade of experience building full-stack systems in domains ranging from national security and climate science to healthcare and generative AI. He specializes in data engineering, distributed systems, and real-time data processing.

Throughout his career, William has built tools for subject matter experts in nonproliferation, climate science, solar energy, and basketball analytics—translating complex domain needs into reliable, scalable software. He has led high-impact projects including real-time maritime data integrations with the U.S. Space Force, HL7-to-FHIR pipelines for health data interoperability, and tactical communication tools for the Air Force. At Livermore National Lab, he led DevOps efforts for the R&D 100 award-winning ESGF climate data platform, modernizing an 8,000-line Bash release process into a streamlined CI/CD system. At New Relic, he launched features like Saved Views for Log Management and the S3 Log Forwarder used by thousands of customers.

He is an active contributor to the open-source Conduit project, building data connectors that power modern ETL pipelines.

A seasoned speaker, William has presented at conferences such as AWS re:Invent, BiTCON, RenderATL, and Tapia. His talks and workshops cover topics like AI agent workflows, Retrieval Augmented Generation (RAG), and global state management in React. He is also committed to increasing diversity in tech through mentorship and community engagement.

Conduit: Real-Time Data Streams Made Simple (and Fast) with Go

Real-time data pipelines are often associated with complex, distributed infrastructure. But what if you could stream data across systems using just Go and a simple configuration file?

In this talk, I’ll introduce Conduit, an open-source real-time data integration tool written in Go and designed for simplicity, speed, and extensibility. The material draws on my experience at Meroxa, the company that created and maintains Conduit, where I’ve worked hands-on with the project, building custom components and digging into its internal architecture.

I’ll walk through how Conduit uses Go’s concurrency patterns—goroutines, channels, and interfaces—to power modular, high-throughput streaming pipelines. You’ll see how data can be streamed from sources like Postgres or Kafka to destinations like Elasticsearch or S3 with minimal setup, and how the gRPC-based plugin system enables developers to extend functionality using Go.
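
To make the concurrency piece concrete before the talk, here is a minimal, self-contained Go sketch of the pattern described above: a source goroutine producing records onto a channel, a destination draining it, and interfaces keeping the stages swappable. The Record, Source, and Destination types and the run helper are illustrative stand-ins, not Conduit's actual internals.

// Minimal illustration of the goroutine/channel/interface pattern described
// above. Record, Source, and Destination are illustrative stand-ins, not
// Conduit's real types.
package main

import (
    "context"
    "fmt"
)

// Record is a simplified stand-in for a streamed change event.
type Record struct {
    Key     string
    Payload []byte
}

// Source produces records; Destination consumes them. Keeping each stage
// behind an interface is what makes the pipeline modular.
type Source interface {
    Read(ctx context.Context, out chan<- Record) error
}

type Destination interface {
    Write(ctx context.Context, rec Record) error
}

// run wires a source to a destination through a buffered channel, with the
// source running in its own goroutine.
func run(ctx context.Context, src Source, dst Destination) error {
    records := make(chan Record, 64)
    errs := make(chan error, 1)

    go func() {
        defer close(records)
        errs <- src.Read(ctx, records)
    }()

    for rec := range records {
        if err := dst.Write(ctx, rec); err != nil {
            return err
        }
    }
    return <-errs
}

// demoSource and stdoutDestination are toy implementations for the sketch.
type demoSource struct{}

func (demoSource) Read(ctx context.Context, out chan<- Record) error {
    for i := 0; i < 3; i++ {
        select {
        case out <- Record{Key: fmt.Sprintf("k%d", i), Payload: []byte("hello")}:
        case <-ctx.Done():
            return ctx.Err()
        }
    }
    return nil
}

type stdoutDestination struct{}

func (stdoutDestination) Write(_ context.Context, rec Record) error {
    fmt.Printf("%s -> %s\n", rec.Key, rec.Payload)
    return nil
}

func main() {
    if err := run(context.Background(), demoSource{}, stdoutDestination{}); err != nil {
        fmt.Println("pipeline error:", err)
    }
}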

We’ll then step through a simple pipeline and explore the design tradeoffs that make Conduit developer-friendly and production-ready.
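
As a preview, here is a rough sketch of the kind of YAML pipeline file the talk refers to, wiring a Postgres source to an S3 destination. The keys and plugin identifiers below are illustrative assumptions; the Conduit documentation has the authoritative schema.

# Illustrative pipeline definition (keys approximate; see the Conduit docs).
version: "2.2"
pipelines:
  - id: postgres-to-s3
    status: running
    connectors:
      - id: pg-source
        type: source
        plugin: builtin:postgres        # plugin identifier shown as an assumption
        settings:
          url: postgres://user:pass@localhost:5432/app
          tables: orders
      - id: s3-destination
        type: destination
        plugin: builtin:s3              # plugin identifier shown as an assumption
        settings:
          aws.bucket: my-bucket
          aws.region: us-east-1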

Whether you’re a Go developer exploring systems design or a data engineer seeking flexibility without operational overhead, this session will show how Go makes real-time streaming accessible.

Takeaways:

Understand how Go’s concurrency model powers Conduit’s architecture

Learn how to build streaming pipelines using simple YAML configs

See how plugins extend functionality using Go and gRPC

Gain insight into the tradeoffs behind designing a real-time system in Go
