7 Comments
Runay Dhaygude

What we are trying to achieve is a write to the DB and also a publish to Kafka. What if we first publish to Kafka and make the DB one of the consumers of that topic? Is this viable?

Munashe Njanji

That's event-driven architecture. It might not be a good fit if you need immediate consistency, because the database may not reflect the latest state right after the event is published. The result is stale reads.
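To make the tradeoff concrete, here is a minimal sketch of that publish-first flow. It assumes confluent-kafka and SQLite purely for illustration; the topic, table, and function names are hypothetical and not from the article:

```python
# Sketch of "publish to Kafka first, let the DB be a consumer."
import json
import sqlite3

from confluent_kafka import Consumer, Producer

producer = Producer({"bootstrap.servers": "localhost:9092"})
db = sqlite3.connect("orders.db")
db.execute("CREATE TABLE IF NOT EXISTS orders (id TEXT PRIMARY KEY, status TEXT)")


def place_order(order_id: str) -> None:
    # 1. Publish the event first; nothing is written to the DB here.
    event = {"order_id": order_id, "status": "PLACED"}
    producer.produce("orders", value=json.dumps(event).encode())
    producer.flush()
    # 2. Any DB read between this point and the consumer catching up
    #    will not see the order yet: that gap is the stale-read window.


def run_db_consumer() -> None:
    # The database is just another consumer of the topic.
    consumer = Consumer({
        "bootstrap.servers": "localhost:9092",
        "group.id": "db-writer",
        "auto.offset.reset": "earliest",
    })
    consumer.subscribe(["orders"])
    while True:
        msg = consumer.poll(1.0)
        if msg is None or msg.error():
            continue
        event = json.loads(msg.value())
        db.execute(
            "INSERT OR REPLACE INTO orders (id, status) VALUES (?, ?)",
            (event["order_id"], event["status"]),
        )
        db.commit()
```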

Runay Dhaygude

Got it!

Raul Junco

Thanks, Runay, for the great question.

Munashe, you beat me to this one. Thanks for the great answer!

Priyansh Agrawal

So you have "listen-to-yourself". Jokes apart. We call this approach "listen to yourself" . You can read more about here

https://www.confluent.io/blog/dual-write-problem/

Saurabh Dashora

Great article, Raul.

We implemented dual writes using CDC for one of our projects, and it worked out well overall.
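For readers who haven't seen the CDC approach, here is a minimal sketch of the application side of a transactional-outbox setup. It assumes PostgreSQL via psycopg2, and the table, column, and function names are hypothetical; a CDC tool such as Debezium would tail the outbox table and publish each row to Kafka:

```python
# Sketch of the transactional-outbox pattern that CDC picks up.
import json
import uuid

import psycopg2

conn = psycopg2.connect("dbname=shop user=app")


def place_order(order_id: str, amount: int) -> None:
    # Single transaction: the business row and the outbox row either
    # both commit or both roll back, so there is no dual write.
    with conn:
        with conn.cursor() as cur:
            cur.execute(
                "INSERT INTO orders (id, amount, status) VALUES (%s, %s, 'PLACED')",
                (order_id, amount),
            )
            cur.execute(
                "INSERT INTO outbox (id, topic, payload) VALUES (%s, %s, %s)",
                (str(uuid.uuid4()), "orders",
                 json.dumps({"order_id": order_id, "amount": amount})),
            )
    # No direct Kafka call here: the CDC pipeline reads the outbox table's
    # change log and publishes the event for us.
```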

Also, thanks for the mention!

Raul Junco

Thanks for sharing, Saurabh!!!