kafka-elasticsearch-standalone-consumer
Is it possible to provide a Dockerfile for this project, given environment conflicts?
Hi,
I'm very glad to see this project, it's really useful. I tried to set it up in our environment, but I failed to build it. I used Maven 3 and Elasticsearch 2.1, and I'm not sure what caused the build failure, so I think it would be great if you could provide a Dockerfile; then everyone could use this project very easily.
Thanks
Thanks for the suggestion. I'll work on that.
Hi @dhyaneshm: Can I go ahead and write shell and bat scripts for this? Or do you have something else in mind?
One note: please be aware that all active development has moved to this project: https://github.com/BigDataDevs/kafka-elasticsearch-consumer
It is Spring-based and uses Gradle for building and running.
We still have to update all docs to point to this Git repo, and provide updated documentation for the building/running/configuration options.
Just a quick note until that happens, the build/run can be done as follows:
To build from the command line: `cd [project_root]` and then `./gradlew clean jar`
Result: `./build/libs/kafka-elasticsearch-consumer-0.0.2.0.jar`
To run from the command line, using the default properties from the project's `config` dir: `./gradlew run`
To use your own properties: `./gradlew run -Dindexer.properties=/[your_dir]/my_indexer.properties -Dlogback.configurationFile=/[your_dir]/my_logback.xml`
There are also changes in how one can extend `IndexManager` and `BasicMessageHandler`; documentation updates are coming soon! :)
@reachkrishnaraj: As @ppine7 stated, we don't need additional start scripts; the Gradle tasks should take care of that.
NOTE: Currently the Gradle and Spring changes are maintained in the `spring-conversion` branch (https://github.com/BigDataDevs/kafka-elasticsearch-consumer/tree/spring-conversion).
Regarding the request from @zousheng:
I'm thinking of just adding a Docker Compose file to bring up the dependent services, such as Kafka and Elasticsearch.
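As a rough sketch of what such a Compose file could look like, here is a minimal `docker-compose.yml` that starts Elasticsearch 2.1 (the version mentioned above) plus Kafka and ZooKeeper. The image names, tags, and port mappings are assumptions for illustration only, not part of this repo; adjust them to match the versions your consumer build expects:

```yaml
# Hypothetical docker-compose.yml for the consumer's dependencies.
# Images/versions below are assumptions -- tune them to your environment.
version: '2'
services:
  zookeeper:
    image: wurstmeister/zookeeper   # assumed image; any ZooKeeper image works
    ports:
      - "2181:2181"
  kafka:
    image: wurstmeister/kafka       # assumed image; pin a tag matching your Kafka client
    ports:
      - "9092:9092"
    environment:
      KAFKA_ADVERTISED_HOST_NAME: localhost
      KAFKA_ZOOKEEPER_CONNECT: zookeeper:2181
    depends_on:
      - zookeeper
  elasticsearch:
    image: elasticsearch:2.1        # matches the ES version mentioned in this thread
    ports:
      - "9200:9200"
      - "9300:9300"
```

With this in place, `docker-compose up -d` would bring up both dependencies, and the indexer could then be pointed at `localhost:9092` (Kafka) and `localhost:9200` (Elasticsearch) via its properties file.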
Thanks for the information, looking forward to the updates.