
Error while converting json to features for ST_Buffer

Open SrinivasRIL opened this issue 8 years ago • 6 comments

Hi, my objective is to create a 1000 m buffer around each eNodeB point. I am able to obtain a JSON file using the Copy From HDFS tool, but I am unable to create a feature class with the JSON To Features tool. I used the following queries:

```sql
CREATE EXTERNAL TABLE enodebuffer (shape binary);

INSERT OVERWRITE TABLE enodebuffer
SELECT ST_Buffer(enodeaggregate.shape, 100) FROM enodeaggregate;
```
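As a quick sanity check before exporting, the buffered geometries can be inspected directly in Hive (a sketch, assuming the `ST_AsText` UDF from the Esri spatial framework is registered alongside `ST_Buffer`):

```sql
-- Hypothetical sanity check: confirm the buffers read back as valid polygon WKT
SELECT ST_AsText(shape)
FROM enodebuffer
LIMIT 5;
```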

I am able to obtain the count and retrieve the columns in Hive. Copy From HDFS also produces a JSON file successfully.

When I convert the JSON to features using the GitHub tool, I get "Failed to execute (JSON To Features)" with the error description shown in the screenshot below.

[screenshot: JSON To Features error]

I tried the same query creating a table with multiple columns alongside the shape column, but I get the same error as shown above.

Any info would be greatly appreciated

Thanks

SrinivasRIL avatar May 04 '16 10:05 SrinivasRIL

FYI, the error pasted here and the one that you posted on gis.stackexchange (json-to-features-not-working-while-performing-st-buffer) are not the same.

It looks like you've tried to run the JSON to Features tool using both the Enclosed and Unenclosed JSON options. Try to stay away from the Enclosed option.

Can you add the entire `CREATE EXTERNAL TABLE enodebuffer` statement? I want to verify that it was created using the Unenclosed option as well.

climbage avatar May 04 '16 15:05 climbage

Thank you for your valuable input. I took all your suggestions into account, but I am still not able to get the desired result when creating buffers for thousands of points.

1. For the table, as you had suggested:

```sql
CREATE EXTERNAL TABLE enodebuffer1 (EnodeBStatus string, shape binary)
ROW FORMAT SERDE 'com.esri.hadoop.hive.serde.JsonSerde'
STORED AS INPUTFORMAT 'com.esri.json.hadoop.UnenclosedJsonInputFormat'
OUTPUTFORMAT 'org.apache.hadoop.hive.ql.io.HiveIgnoreKeyTextOutputFormat';
```

2. Then I inserted the values into enodebuffer1 using the SELECT query below:

```sql
INSERT INTO TABLE enodebuffer1
SELECT enodeaggregate.EnodeBStatus, ST_Buffer(enodeaggregate.shape, 1000)
FROM enodeaggregate;
```

3. The result showed that the data was loaded into the table successfully, and I am also able to retrieve the values using SELECT with LIMIT.

4. Now the main part: I want to be able to see the above results in ArcMap. So I used the Esri GIS Tools for Hadoop, Copy From HDFS and JSON To Features. Copy From HDFS was successful with no error, but JSON To Features shows the following error (I hope you can see the images):

[screenshot: JSON To Features error]

[screenshot: error details]

5. As a few people had suggested, I tested it with 100 points and it worked fine. Possibly the error is that it can't handle too many vertices? But there has to be a solution, because we are planning to migrate terabytes of data into HDFS and run multiple geometry queries to retrieve the desired results.
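One thing I could try next (just a sketch, not a confirmed fix) is exporting progressively larger subsets to see at what row count JSON To Features starts failing:

```sql
-- Hypothetical bisection: the 100-point export works, so try larger slices
-- (increase the LIMIT step by step: 1000, 5000, 10000, ...)
INSERT OVERWRITE TABLE enodebuffer1
SELECT EnodeBStatus, ST_Buffer(shape, 1000)
FROM enodeaggregate
LIMIT 1000;
```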

Your help is highly appreciated.

Thanks

SrinivasRIL avatar May 05 '16 04:05 SrinivasRIL

How many rows are there?

randallwhitman avatar May 05 '16 15:05 randallwhitman

There are 30230 rows.

SrinivasRIL avatar May 05 '16 16:05 SrinivasRIL

@azhigimont thoughts? It looks like it's trying to extract the wkid and it isn't there.
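If the wkid really is missing from the exported records, one possible workaround (an untested sketch; it assumes the `ST_SetSRID` UDF from the spatial framework is registered, and the `4326` below is only a placeholder for whatever wkid actually matches the source data) is to stamp the spatial reference onto the geometry before writing it out:

```sql
-- Hypothetical: set the spatial reference id on the buffered geometry so the
-- exported JSON carries a wkid. Replace 4326 with your data's actual wkid;
-- note the buffer distance is interpreted in that coordinate system's units.
INSERT OVERWRITE TABLE enodebuffer1
SELECT EnodeBStatus, ST_SetSRID(ST_Buffer(shape, 1000), 4326)
FROM enodeaggregate;
```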

climbage avatar May 05 '16 16:05 climbage

Yes. When I tried it with 100 rows, I was able to see the results in ArcMap, but it showed a warning that the spatial reference was missing and the data could not be projected. That's the reason I tried the Enclosed JSON option.

SrinivasRIL avatar May 06 '16 04:05 SrinivasRIL