Packer: How to Build a Golden Image behind Corporate Proxy

Packer is a powerful tool for building golden images for multiple platforms. All the configuration is scripted, so you can embrace the Infrastructure-as-Code practice. And the best thing: it is FREE!
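Behind a corporate proxy, the proxy usually has to reach both Packer itself (via the standard `HTTP_PROXY`/`HTTPS_PROXY` environment variables) and the shell provisioners running inside the build instance. A minimal sketch of the provisioner side — the source name `amazon-ebs.golden` and the proxy endpoint are placeholders, not values from the article:

```hcl
build {
  sources = ["source.amazon-ebs.golden"]  # assumed source block

  provisioner "shell" {
    # Provisioners run inside the instance, so the proxy must be
    # passed in explicitly; it is not inherited from the Packer host.
    environment_vars = [
      "http_proxy=http://proxy.example.com:3128",
      "https_proxy=http://proxy.example.com:3128",
      "no_proxy=localhost,169.254.169.254"  # keep the metadata endpoint direct
    ]
    inline = ["sudo -E yum -y update"]
  }
}
```

Note `sudo -E`, which preserves the proxy variables across the privilege boundary — a common gotcha with proxied builds.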

Migration Strategy from Enterprise Scheduling to AWS Hybrid Cloud – Part 2: Implementation

Solution Diagram

After laying out a blueprint for our solution in Part 1, it is time to implement it in code. We use the AWS Cloud Development Kit (CDK) for Infrastructure as Code and Spring Boot for our custom job implementation.

How to Provision AWS EC2 in Private Subnet by Using SSM and Ansible Dynamic Inventory

An example of provisioning EC2 instances in a private subnet using AWS SSM, Ansible Dynamic Inventory, and the AWS community collections.
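A minimal sketch of what such a dynamic inventory could look like — the region, tag filter, and file name are assumptions for illustration. Using the instance ID as `ansible_host` lets the SSM connection plugin reach instances that have no public IP:

```yaml
# inventory/aws_ec2.yml -- hypothetical example
plugin: amazon.aws.aws_ec2
regions:
  - ap-southeast-1
filters:
  tag:Environment: dev          # assumed tag
  instance-state-name: running
# SSM addresses hosts by instance ID, not by IP or DNS name
compose:
  ansible_host: instance_id
```

The matching group vars would then set `ansible_connection: community.aws.aws_ssm` so Ansible tunnels through Session Manager instead of SSH.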

A Case Study: Building A Simple Credit Card Fraud Detection System – Part 2: Mapping Gherkin to Kafka Streams

Part 2 will talk about mapping the Gherkin feature file into Cucumber step definitions and their implementation using Kafka Streams.

A Case Study: Building A Simple Credit Card Fraud Detection System – Part 1: From User Story to Gherkin Feature File

Part 1 will talk about writing a user story and how to translate it into a Gherkin feature file.

How to Unit Test Kafka Streams Application – PART 2 (Processor API)

Part 2 of 2 articles on unit testing a Kafka Streams application. In this second part, I talk about testing the Processor API using MockProcessorContext, as well as how to test a processor scheduler with the two types of Punctuator: STREAM_TIME and WALL_CLOCK_TIME.

How to Unit Test Kafka Streams Application – PART 1 (DSL)

This is Part 1 of 2 articles on unit testing a Kafka Streams application. The first part covers testing DSL transformations, both stateless and stateful, including joins and windowing.

Filtering Nested Array Objects in Elasticsearch Document with Painless Scripting

Painless is a simple, secure scripting language designed specifically for use with Elasticsearch. It is the default scripting language for Elasticsearch and can safely be used for inline and stored scripts.
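As an illustration of the kind of filtering Painless enables, here is a hypothetical query that returns only the "active" entries of an `items` array via a script field — the index name, field names, and status value are assumptions, not taken from the article:

```json
GET my-index/_search
{
  "_source": false,
  "script_fields": {
    "active_items": {
      "script": {
        "lang": "painless",
        "source": "def out = []; for (item in params._source.items) { if (item.status == 'active') { out.add(item) } } return out;"
      }
    }
  }
}
```

Because `script_fields` runs per hit, the whole document still matches; only the returned array is trimmed.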


How to Join One-to-Many and Many-to-One Relationship with Kafka Streams

Yet another Kafka feature, Kafka Streams, allows us to join two streams into a single stream. Kafka Streams provides one-to-many and many-to-one join types.

How to Encrypt and Decrypt Kafka Message using Custom (De)Serializers

Sensitive data always needs to be handled with extra care. Thus, in some cases, we need to encrypt messages before delivering them to a Kafka topic.
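The idea can be sketched as a small AES-GCM codec — a real custom (de)serializer would call this core from Kafka's `Serializer`/`Deserializer` interfaces; the class name and payload here are made up for illustration:

```java
import javax.crypto.Cipher;
import javax.crypto.KeyGenerator;
import javax.crypto.SecretKey;
import javax.crypto.spec.GCMParameterSpec;
import java.nio.ByteBuffer;
import java.nio.charset.StandardCharsets;
import java.security.SecureRandom;

// Hypothetical sketch: the encrypt/decrypt core that a custom Kafka
// serializer/deserializer pair would invoke in serialize()/deserialize().
public class AesGcmCodec {
    private static final int IV_LEN = 12;    // 96-bit IV, recommended for GCM
    private static final int TAG_BITS = 128; // authentication tag length
    private final SecretKey key;

    public AesGcmCodec(SecretKey key) { this.key = key; }

    // Prepend a fresh random IV to the ciphertext so the consumer side
    // can recover it; never reuse an IV with the same key under GCM.
    public byte[] encrypt(byte[] plain) throws Exception {
        byte[] iv = new byte[IV_LEN];
        new SecureRandom().nextBytes(iv);
        Cipher c = Cipher.getInstance("AES/GCM/NoPadding");
        c.init(Cipher.ENCRYPT_MODE, key, new GCMParameterSpec(TAG_BITS, iv));
        byte[] ct = c.doFinal(plain);
        return ByteBuffer.allocate(IV_LEN + ct.length).put(iv).put(ct).array();
    }

    public byte[] decrypt(byte[] wire) throws Exception {
        Cipher c = Cipher.getInstance("AES/GCM/NoPadding");
        c.init(Cipher.DECRYPT_MODE, key, new GCMParameterSpec(TAG_BITS, wire, 0, IV_LEN));
        return c.doFinal(wire, IV_LEN, wire.length - IV_LEN);
    }

    public static void main(String[] args) throws Exception {
        KeyGenerator kg = KeyGenerator.getInstance("AES");
        kg.init(256);
        AesGcmCodec codec = new AesGcmCodec(kg.generateKey());
        byte[] wire = codec.encrypt("card=4111-xxxx".getBytes(StandardCharsets.UTF_8));
        // Round-trips back to the original payload
        System.out.println(new String(codec.decrypt(wire), StandardCharsets.UTF_8));
    }
}
```

In a real deployment the serializer's `configure()` method would load the key from a KMS or vault rather than generating one per process, so producers and consumers share the same key material.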