Priyank Bagrecha

24 comments by Priyank Bagrecha

@walterddr looks like you are correct. I see two entries each for `dataDir` and `segmentTarDir` in the Pinot server configmap:

```
apiVersion: v1
data:
  pinot-server.conf: |-
    pinot.server.netty.port=8098
    pinot.server.adminapi.port=8097
    pinot.server.instance.dataDir=/var/pinot/server/data/index
    pinot.server.instance.segmentTarDir=/var/pinot/server/data/segment
    pinot.set.instance.id.to.hostname=true
    pinot.server.instance.realtime.alloc.offheap=true
    pinot.server.instance.currentDataTableVersion=2
    pinot.server.instance.dataDir=/var/pinot/server/data/index
    pinot.server.instance.segmentTarDir=/var/pinot/server/data/segment
    ...
```

https://github.com/apache/pinot/blob/b2da31005b9e958a022150cfaac228a951daf0b3/pinot-server/src/main/java/org/apache/pinot/server/starter/helix/DefaultHelixStarterServerConfig.java#L40 logs the config map:

```
2022-11-17 13:53:41 | External config key: pinot.server.instance.segmenttardir, value: /var/pinot/server/data/segment,/var/pinot/server/data/segment
2022-11-17 13:53:41 | External config key: pinot.server.swagger.use.https, value: true
2022-11-17 13:53:41 | External config key: ...
```

@Jackie-Jiang yes, the issue is resolved after removing the two extra entries.
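For anyone else hitting this: because `pinot.server.instance.dataDir` and `pinot.server.instance.segmentTarDir` were listed twice in `pinot-server.conf`, their values were joined into comma-separated paths (as the log output above shows). The fix was simply deduplicating the configmap; here is a sketch of the cleaned-up section, based on the snippet above:

```
pinot-server.conf: |-
  pinot.server.netty.port=8098
  pinot.server.adminapi.port=8097
  pinot.set.instance.id.to.hostname=true
  pinot.server.instance.realtime.alloc.offheap=true
  pinot.server.instance.currentDataTableVersion=2
  # each of these should appear exactly once
  pinot.server.instance.dataDir=/var/pinot/server/data/index
  pinot.server.instance.segmentTarDir=/var/pinot/server/data/segment
```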

I even tried using `"value.converter.protoMapConversionType": "map"` as mentioned in the documentation of [kafka-connect-protobuf-converter](https://github.com/blueapron/kafka-connect-protobuf-converter/blob/master/README.md#to-connect), but it still writes the protobuf `map` as an `array` in Parquet.
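For context, this is a minimal sketch of where that property sits in a connector config, following that README; the converter class and `protoClassName` key are the ones from the README, and `com.example.MyMessage` is just a placeholder for the actual generated protobuf class:

```
"value.converter": "com.blueapron.connect.protobuf.ProtobufConverter",
"value.converter.protoClassName": "com.example.MyMessage",
"value.converter.protoMapConversionType": "map"
```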

@Enalmada is that still working for you? It doesn't seem to be working for me. This is what I am doing; please let me know if you are doing something...

@Enalmada even with your suggestion I get:

```
imgmin.c:28:29: fatal error: wand/MagickWand.h: No such file or directory
 #include
          ^
compilation terminated.
dssim.c: In function ‘convert_image_row’:
dssim.c:304:55: warning: unused parameter ‘inf’
...
```

That worked, thanks @Enalmada. I had to run `sudo apt-get install libmagickcore-dev libmagickwand-dev`, which got me `/usr/include/ImageMagick-6/wand/MagickWand.h`.
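In case it helps anyone else, this is the sequence that got me past the missing-header error above (on Debian/Ubuntu; package names will differ on other distros):

```
# install the ImageMagick 6 development headers and libraries
sudo apt-get install libmagickcore-dev libmagickwand-dev

# confirm the header the compiler was complaining about is now present
ls /usr/include/ImageMagick-6/wand/MagickWand.h
```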

I was able to build locally by following the instructions at https://github.com/twitter/hadoop-lzo/tree/9ab0565b74e4ac11172c29acf1b398f7aacfb767. I also had to follow the suggestion from https://github.com/twitter/hadoop-lzo/issues/35#issuecomment-2687776 to get it to build. Hope this helps.
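For anyone following along, here is a rough sketch of the kind of steps involved, assuming the Maven-based build described in that README; the exact workaround I used is in the linked issue comment, so treat this as illustrative only, and adjust paths and package names for your system:

```
# lzo development headers are required before building (brew install lzo on macOS)
sudo apt-get install liblzo2-dev

# build the jar and the native library; if lzo lives in a non-standard prefix,
# pointing C_INCLUDE_PATH/LIBRARY_PATH at it is one way to help the build find it
C_INCLUDE_PATH=/usr/local/include LIBRARY_PATH=/usr/local/lib mvn clean package
```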

I see jars for all versions of `hadoop-lzo` at https://maven.twttr.com now; thanks for fixing the issue!