
Reporting 2 - Event Streaming & Core Data Sync

Backend (.NET + Kafka + AWS Lambda)


About

A Kafka-based event ingestion and data lake pipeline that captures retail events via a secure Events API, streams them into Redshift and S3, and keeps core data tables synchronized through scheduled consumers and AWS Lambda.
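The streaming path described above can be sketched as a minimal Kafka consumer that lands each event in S3. This is an illustrative sketch only: the broker address, topic name, bucket, and key layout are all hypothetical, and the production pipeline also loads events into Redshift.

```csharp
using System;
using System.Threading.Tasks;
using Amazon.S3;
using Amazon.S3.Model;
using Confluent.Kafka;

// Sketch of the event-streaming consumer: read retail events from a Kafka
// topic and write each one to the S3 data lake. Topic, bucket, and key
// layout are placeholders, not the project's real configuration.
class EventLakeConsumer
{
    static async Task Main()
    {
        var config = new ConsumerConfig
        {
            BootstrapServers = "localhost:9092", // placeholder broker
            GroupId = "event-lake-consumer",
            AutoOffsetReset = AutoOffsetReset.Earliest,
            EnableAutoCommit = false,            // commit manually after the S3 write
        };

        using var consumer = new ConsumerBuilder<string, string>(config).Build();
        var s3 = new AmazonS3Client();
        consumer.Subscribe("retail-events"); // hypothetical topic name

        while (true)
        {
            var result = consumer.Consume(TimeSpan.FromSeconds(5));
            if (result == null) continue;

            // Partition objects by date so downstream Redshift COPY jobs can
            // load them selectively (assumed key layout).
            var key = $"events/{DateTime.UtcNow:yyyy/MM/dd}/{result.Message.Key}-{result.Offset}.json";
            await s3.PutObjectAsync(new PutObjectRequest
            {
                BucketName = "example-data-lake", // placeholder bucket
                Key = key,
                ContentBody = result.Message.Value,
            });

            consumer.Commit(result); // commit only after the write succeeds
        }
    }
}
```

Committing offsets only after the S3 write succeeds gives at-least-once delivery, which suits an append-only data lake where duplicates can be deduplicated downstream.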

My Role

Developed a secure Events API and Kafka consumers that stream events into S3 and Redshift. Validated incoming payloads against JSON Schemas hosted in S3, enabling event types to be managed dynamically. Built an AWS Lambda job that keeps Redshift core tables in sync with the transactional SQL systems. Implemented rate limiting, HMAC/AES token validation, and structured logging.
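The HMAC validation mentioned above can be sketched as follows. The token format ("payload.signature" with a Base64 signature) and the shared-secret handling are assumptions for illustration, not the service's actual scheme, which also layers AES and rate limiting on top.

```csharp
using System;
using System.Security.Cryptography;
using System.Text;

// Sketch of HMAC token validation for the Events API. Token format and
// secret source are assumptions made for this example.
static class TokenValidator
{
    public static bool IsValid(string token, byte[] secret)
    {
        // Expected shape: "<payload>.<base64 HMAC-SHA256 signature>"
        var parts = token.Split('.');
        if (parts.Length != 2) return false;

        using var hmac = new HMACSHA256(secret);
        var expected = hmac.ComputeHash(Encoding.UTF8.GetBytes(parts[0]));

        byte[] provided;
        try { provided = Convert.FromBase64String(parts[1]); }
        catch (FormatException) { return false; }

        // Constant-time comparison avoids leaking where the signature differs.
        return CryptographicOperations.FixedTimeEquals(expected, provided);
    }
}
```

The issuing side would sign the payload with the same secret; a fixed-time comparison is used so invalid tokens cannot be probed byte by byte via timing.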

Core Capabilities

Architecture

Tech Stack

ASP.NET Core, Confluent.Kafka, Quartz.NET, AWS S3, Amazon Redshift, AWS Lambda, Terraform