elasticsearch-jdbc

fields.raw were not created

Open lhenry2k opened this issue 9 years ago • 6 comments

I successfully ran the JDBC importer and it created (for every string-type field) another field.raw (not analyzed), which seems necessary for aggregations in Kibana.

Then I moved the script to production (same ES/jdbc 2.1.1.0 environment) and ran it, but this time the .raw fields were not created, and I have no idea why. Are those fields created by ES itself, or created and imported by jdbc? Thanks.

lhenry2k avatar Jan 26 '16 23:01 lhenry2k

Can you show how you run the JDBC importer?

jprante avatar Jan 27 '16 07:01 jprante

I just changed the statement in the MySQL geo-point example and it worked for various indexes and queries. Both ES instances are 2.1.1 and both run Java 1.8.0_72.

I run it with $ bash bin/script.sh

Anyway, I got very confused about the REST command you use: why would an importer delete an index which is not supposed to exist yet? If I run the script with (your) XDELETE, I get an "index not found" error and jdbc stops. If I change it to XPOST, everything is OK.
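For reference, a common pattern in ES 2.x scripts is to issue the DELETE but ignore the "index not found" error, then recreate the index, so the importer always starts from a clean index. A minimal sketch (the index name is taken from the script below; the exact flags are illustrative):

```shell
# Delete the index if it exists. curl exits successfully even when
# Elasticsearch answers 404, so the script simply continues.
curl -s -XDELETE 'localhost:9200/logstash-my2008' > /dev/null

# Recreate the index (PUT is the canonical way to create an index
# in ES 2.x; POST on the index URL also works).
curl -s -XPUT 'localhost:9200/logstash-my2008'
```

This way the script is idempotent: it behaves the same whether or not the index already exists.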

#!/bin/bash

DIR="$( cd "$( dirname "${BASH_SOURCE[0]}" )" && pwd )"
bin=${DIR}/../bin
lib=${DIR}/../lib

curl -XPOST 'localhost:9200/logstash-my2008'

echo '
{
    "type" : "jdbc",
    "jdbc" : {
        "url" : "jdbc:mysql://127.0.0.1:3306/db",
        "user" : "lhenry",
        "password" : "lhenry",
        "locale" : "en_US",
        "sql" : [
            {
                "statement" : "select table.id as _id, \"logstash-my2008\" as _index, \"preno\" as _type, .... [etc etc]"
            }
        ],
        "elasticsearch" : {
            "host" : "localhost",
            "port" : 9300
        },
        "index" : "logstash-my2008",
        "type" : "preno",
        "index_settings" : {
            "index" : {
                "number_of_shards" : 2
            }
        }
    }
}
' | java \
    -cp "${lib}/*" \
    -Dlog4j.configurationFile=${bin}/log4j2.xml \
    org.xbib.tools.Runner \
    org.xbib.tools.JDBCImporter

lhenry2k avatar Jan 27 '16 10:01 lhenry2k

The scripts are just examples to demonstrate things. They are not for production use.

jprante avatar Jan 27 '16 10:01 jprante

Can you clarify those .raw fields? Is jdbc creating them during import? Does it always create them when it finds string fields, or only under certain conditions? I do not believe I should have to add the same field twice in the SQL and then work with the mapping to instruct ES not to analyze it. Thanks.

lhenry2k avatar Jan 27 '16 11:01 lhenry2k

Replying to myself, here is why:

https://discuss.elastic.co/t/confused-about-how-to-use-raw-fields-and-not-analyze-string-fields/28106/27
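In short: the .raw subfields are not created by the JDBC importer at all, but by an index template, and the template Logstash installs only applies to indices whose names match logstash-*. On a cluster without that template (or with an index name that doesn't match its pattern), no .raw multi-fields appear. A minimal ES 2.x template reproducing the behaviour might look like this (the template name is illustrative):

```shell
curl -XPUT 'localhost:9200/_template/raw_strings' -d '
{
  "template" : "logstash-*",
  "mappings" : {
    "_default_" : {
      "dynamic_templates" : [
        {
          "strings" : {
            "match_mapping_type" : "string",
            "mapping" : {
              "type" : "string",
              "fields" : {
                "raw" : { "type" : "string", "index" : "not_analyzed" }
              }
            }
          }
        }
      ]
    }
  }
}'
```

Note that a template only affects indices created after it is installed, which is why renaming an existing index does not change its mapping.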

lhenry2k avatar Feb 02 '16 21:02 lhenry2k

@lhenry2k how did you solve the issue? The thread you quoted implies renaming the index to logstash-%date. However, when I rename my indices by adding the date, e.g. logstash-2015, nothing changes even after reindexing. I can neither set "analyzed" : "false" before indexing nor access any .raw data fields. Thank you in advance
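For what it's worth, in ES 2.x the mapping switch is "index" : "not_analyzed" (there is no "analyzed" : "false" setting), and the mapping of an existing string field cannot be changed in place, so it has to be supplied when the index is created. A sketch, with the type and field name ("some_field") assumed for illustration:

```shell
curl -XPUT 'localhost:9200/logstash-2015' -d '
{
  "mappings" : {
    "preno" : {
      "properties" : {
        "some_field" : {
          "type" : "string",
          "fields" : {
            "raw" : { "type" : "string", "index" : "not_analyzed" }
          }
        }
      }
    }
  }
}'
```

After creating the index with such a mapping (or installing a matching template) and reindexing into it, the some_field.raw subfield should be available for aggregations.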

zehbauer avatar Oct 17 '16 13:10 zehbauer