Hive-JSON-Serde

Read/Write JSON SerDe for Apache Hive.

Results: 68 Hive-JSON-Serde issues, sorted by recently updated.

After building the develop branch on OS X 10.9.3, the testSerializeWithMapping test fails. Test output: Tests run: 13, Failures: 1, Errors: 0, Skipped: 0, Time elapsed: 0.184 sec

My sample table definition is as follows: create external table if not exists test (ctx string, test int) PARTITIONED BY (game_id STRING) ROW FORMAT SERDE 'org.openx.data.jsonserde.JsonSerDe'; Input data: {"ctx":{"reason":184,"initial":105,"name":62,"delta":5},...
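One likely cause here is that the nested ctx object is declared as a plain string. A hedged sketch of a struct-based declaration instead, with the field list inferred only from the sample row shown in the issue:

```sql
-- Sketch: declare the nested JSON object as a named struct so Hive can
-- address its fields directly (ctx.reason, ctx.initial, ...).
-- Field names and types are guessed from the single sample row above.
CREATE EXTERNAL TABLE IF NOT EXISTS test (
  ctx STRUCT<reason:INT, initial:INT, name:INT, delta:INT>,
  test INT
)
PARTITIONED BY (game_id STRING)
ROW FORMAT SERDE 'org.openx.data.jsonserde.JsonSerDe';
```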

Could we have another SerDe property that allows ignoring all exceptions, for cases where we are willing to receive a null row when something is wrong? ignore.malformed only ignores...
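For context, the existing property referred to above is set per table; a minimal sketch of how it is enabled (the table name is hypothetical):

```sql
-- Existing behavior: ignore.malformed.json makes the SerDe return null
-- rows for lines that are not valid JSON instead of failing the query.
-- The issue asks for a broader property that would also cover other
-- exceptions raised during deserialization.
ALTER TABLE events SET SERDEPROPERTIES ("ignore.malformed.json" = "true");
```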

Below is the stack trace. When the execution engine is Tez I get the exception below, but when I change the execution engine to MR it works fine. Table...

select w.* from external_json_table w left outer join hive_managed_table s on w.uid=s.uid where s.uid is null; Task with the most failures(4): Task ID: task_1419996056279_0188_m_000000 URL: http://0.0.0.0:8088/taskdetails.jsp?jobid=job_1419996056279_0188&tipid=task_1419996056279_0188_m_000000 Diagnostic Messages...

While parsing data columns for Hive from a very large file, some wrong/corrupted values (which raise NumberFormatException) should not break the MR job, so this change parses all the...
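Independent of the change proposed above, a query-side workaround exists: declare suspect numeric fields as STRING and cast at read time, since in Hive a CAST of a non-numeric string yields NULL rather than an error. A minimal sketch with hypothetical table and column names:

```sql
-- Hypothetical table raw_events declares price as STRING; malformed
-- values become NULL on cast instead of raising NumberFormatException.
SELECT ctx, CAST(price AS INT) AS price
FROM raw_events
WHERE CAST(price AS INT) IS NOT NULL;
```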

If I process an incorrect JSON object, it throws an exception like the one below: Caused by: org.apache.hadoop.hive.ql.metadata.HiveException: Hive Runtime Error while processing writable {"duration":1929,"errorcode":"","errortype":0,"iaddr":1,"methodname":"getPushInformation","nettype":"wifi","timestamp":"1415848814278","trader":{"clientAppVersion":"3.2.1","clientSystem":" at org.apache.hadoop.hive.ql.exec.MapOperator.process(MapOperator.java:513) at org.apache.hadoop.hive.ql.exec.mr.ExecMapper.map(ExecMapper.java:157) ... 8 more...

Hi, I am using this project to see if I can use it to query many large nested documents. So far I have only tried with small documents and I...