Yes, Oracle always strives to integrate the latest industry trends into its Database. The newest one is Kafka integration with the Oracle Database through the DBMS_KAFKA package, with which you can publish events and consume or load data into tables.
This feature was discussed at CodeOne 2018 by Melli Annamalai, Senior Principal Product Manager at Oracle, as covered in Lucas's blog post. I am waiting to see the feature and get my hands dirty with it. Very recently I joined a project where I have to publish events back from the Oracle database, and I am relying mostly on workaround solutions. :) This will help.
Read the full article directly in Lucas Jellema's post.
BEGIN
  dbms_kafka.register_cluster(
    'SENS2',
    '<Zookeeper URL>:2181',
    '<Kafka broker URL>:9092',
    'DBMSKAFKA_DEFAULT_DIR',
    'DBMSKAFKA_LOCATION_DIR',
    'Testing DBMS KAFKA');
END;
/
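Until DBMS_KAFKA is generally available, one workaround for publishing events from inside the database is to POST to the Confluent REST Proxy with the standard UTL_HTTP package. The sketch below is a minimal, hypothetical example: the proxy host and port, the topic name `sensor-events`, and the JSON payload are all assumptions, and the database user needs a network ACL permitting outbound HTTP to the proxy.

```sql
DECLARE
  l_req  UTL_HTTP.req;
  l_resp UTL_HTTP.resp;
  l_body VARCHAR2(4000);
BEGIN
  -- JSON envelope expected by the REST Proxy v2 "produce" API
  l_body := '{"records":[{"value":{"sensor":"SENS2","temp":21.5}}]}';

  -- Hypothetical proxy host/port and topic name
  l_req := UTL_HTTP.begin_request(
             url    => 'http://kafka-rest-proxy:8082/topics/sensor-events',
             method => 'POST');
  UTL_HTTP.set_header(l_req, 'Content-Type',
                      'application/vnd.kafka.json.v2+json');
  UTL_HTTP.set_header(l_req, 'Content-Length', LENGTHB(l_body));
  UTL_HTTP.write_text(l_req, l_body);

  l_resp := UTL_HTTP.get_response(l_req);
  DBMS_OUTPUT.put_line('HTTP status: ' || l_resp.status_code);
  UTL_HTTP.end_response(l_resp);
END;
/
```

You could wrap this in a procedure fired by a trigger or a DBMS_SCHEDULER job, but keep in mind the HTTP call happens synchronously inside the transaction.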
Thanks
Suresh
Hi,
Is there some "step by step" guide on how to send messages to Kafka directly from the Oracle database?
Thanks
Hi Marcelo,
Sorry for the delay in responding. I hope you have already found what you were looking for; otherwise this may help. I have not written anything up on it yet,
but this is what you can do:
Oracle Database --> GoldenGate for Big Data or Kafka Connect --> Kafka (with Streams API) --> Target (data warehouse, Elasticsearch, big data)
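For the Kafka Connect leg of that pipeline, the Confluent JDBC source connector can poll an Oracle table and stream new rows into a topic. A sketch of a standalone connector properties file follows; the connection URL, credentials, table name `SENSOR_EVENTS`, and key column `EVENT_ID` are placeholders you would replace with your own:

```properties
name=oracle-jdbc-source
connector.class=io.confluent.connect.jdbc.JdbcSourceConnector
# Hypothetical Oracle connection details
connection.url=jdbc:oracle:thin:@//dbhost:1521/ORCLPDB1
connection.user=kafka_connect
connection.password=secret
# Stream only this table (placeholder name)
table.whitelist=SENSOR_EVENTS
# Incrementing mode: fetch rows with a key greater than the last seen value
mode=incrementing
incrementing.column.name=EVENT_ID
# Rows from SENSOR_EVENTS land in topic "ora-SENSOR_EVENTS"
topic.prefix=ora-
poll.interval.ms=5000
```

Note that incrementing mode only captures inserts; for updates and deletes you would need timestamp+incrementing mode or a log-based tool such as GoldenGate.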
I found this one interesting: https://www.confluent.io/blog/ksql-in-action-real-time-streaming-etl-from-oracle-transactional-data
and also this one: https://www.rittmanmead.com/blog/2016/07/introduction-oracle-stream-analytics/
Thanks
Suresh