Hive-JSON-Serde

ClassCastException on java.lang.String to HiveVarchar


I'm getting a weird error about an invalid cast. First I create a table with the following schema, then add some data to it. That all works fine, but when I query the table with 'select * from data', I get the following error:

```
hive> select value from data;
OK
Failed with exception java.io.IOException:org.apache.hadoop.hive.ql.metadata.HiveException: java.lang.ClassCastException: java.lang.String cannot be cast to org.apache.hadoop.hive.common.type.HiveVarchar
```

Do you have any idea what is going on, or what I am doing wrong?

Greets, Milco

------------------------- creation of the schema, loading data, goes well ------------------------------------

```sql
ADD JAR /opt/hive-serde-rcongiu/json-serde-1.3-jar-with-dependencies.jar;

DROP TABLE IF EXISTS Impressions;

CREATE TABLE data (
  context VARCHAR(72),
  value ARRAY<STRUCT<
    CustomerID: VARCHAR(5),
    CompanyName: VARCHAR(19),
    ContactName: VARCHAR(12),
    ContactTitle: VARCHAR(20),
    Address: VARCHAR(13),
    City: VARCHAR(6),
    Region: VARCHAR(20),
    PostalCode: VARCHAR(5),
    Country: VARCHAR(7),
    Phone: VARCHAR(11),
    Fax: VARCHAR(11)
  >>
) ROW FORMAT SERDE 'org.openx.data.jsonserde.JsonSerDe';

LOAD DATA LOCAL INPATH 'customer.json' INTO TABLE data;
```

ALL OK,
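For what it's worth, one possible workaround sketch while VARCHAR handling is in question: declare the JSON-backed columns as STRING (matching the plain java.lang.String values the SerDe produces) and cast at query time where a bounded type is really needed. This is only an assumption about the cause, not a confirmed fix; the table and column definitions below mirror the script above, and the final SELECT is purely illustrative.

```sql
-- Workaround sketch (assumption: the SerDe hands back java.lang.String, so STRING
-- columns avoid the HiveVarchar object inspector entirely).
DROP TABLE IF EXISTS data;
CREATE TABLE data (
  context STRING,
  value ARRAY<STRUCT<
    CustomerID: STRING,
    CompanyName: STRING,
    ContactName: STRING,
    ContactTitle: STRING,
    Address: STRING,
    City: STRING,
    Region: STRING,
    PostalCode: STRING,
    Country: STRING,
    Phone: STRING,
    Fax: STRING
  >>
) ROW FORMAT SERDE 'org.openx.data.jsonserde.JsonSerDe';
LOAD DATA LOCAL INPATH 'customer.json' INTO TABLE data;

-- Cast only where a bounded type is actually required downstream.
SELECT CAST(context AS VARCHAR(72)) AS context, value FROM data;
```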

mvklingeren avatar Apr 10 '15 08:04 mvklingeren

Sorry, I saw this issue only today.
HiveVarchar is a new type; are you using Hive 1.2?

rcongiu avatar May 28 '15 15:05 rcongiu

Hi,

I am also getting the above issue. Hive: 1.2.1, Phoenix: 4.4.0, Hadoop: HDP 2.3.4.

Please refer to the logs below.

```
16/07/25 06:02:09 INFO mapreduce.Job: Task Id : attempt_1465458660016_137863_m_000004_1, Status : FAILED
Error: java.lang.ClassCastException: java.lang.String cannot be cast to org.apache.hadoop.hive.common.type.HiveVarchar
	at org.apache.hadoop.hive.serde2.objectinspector.primitive.JavaHiveVarcharObjectInspector.getPrimitiveWritableObject(JavaHiveVarcharObjectInspector.java:54)
	at org.apache.hadoop.hive.serde2.lazy.LazyUtils.writePrimitiveUTF8(LazyUtils.java:235)
	at org.apache.hadoop.hive.serde2.lazy.LazySimpleSerDe.serialize(LazySimpleSerDe.java:307)
	at org.apache.hadoop.hive.serde2.lazy.LazySimpleSerDe.serializeField(LazySimpleSerDe.java:262)
	at org.apache.hadoop.hive.serde2.lazy.LazySimpleSerDe.doSerialize(LazySimpleSerDe.java:246)
	at org.apache.hadoop.hive.serde2.AbstractEncodingAwareSerDe.serialize(AbstractEncodingAwareSerDe.java:50)
	at org.apache.hive.hcatalog.mapreduce.FileRecordWriterContainer.write(FileRecordWriterContainer.java:122)
	at org.apache.hive.hcatalog.mapreduce.FileRecordWriterContainer.write(FileRecordWriterContainer.java:54)
	at org.apache.hadoop.mapred.MapTask$NewDirectOutputCollector.write(MapTask.java:658)
	at org.apache.hadoop.mapreduce.task.TaskInputOutputContextImpl.write(TaskInputOutputContextImpl.java:89)
	at org.apache.hadoop.mapreduce.lib.map.WrappedMapper$Context.write(WrappedMapper.java:112)
	at com.lendingpoint.hadoop.hbasetohive.PhoenixToHiveMapper.map(PhoenixToHiveMapper.java:52)
	at com.lendingpoint.hadoop.hbasetohive.PhoenixToHiveMapper.map(PhoenixToHiveMapper.java:1)
	at org.apache.hadoop.mapreduce.Mapper.run(Mapper.java:146)
	at org.apache.hadoop.mapred.MapTask.runNewMapper(MapTask.java:787)
	at org.apache.hadoop.mapred.MapTask.run(MapTask.java:341)
	at org.apache.hadoop.mapred.YarnChild$2.run(YarnChild.java:168)
	at java.security.AccessController.doPrivileged(Native Method)
	at javax.security.auth.Subject.doAs(Subject.java:422)
	at org.apache.hadoop.security.UserGroupInformation.doAs(UserGroupInformation.java:1657)
	at org.apache.hadoop.mapred.YarnChild.main(YarnChild.java:162)
```
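Judging from the frames, this failure happens while HCatalog serializes the mapper output into a Hive table whose columns are VARCHAR: LazySimpleSerDe expects HiveVarchar objects there, but the mapper wrote plain strings. A hedged sketch of one way around it on the Hive side, assuming a hypothetical target table named phoenix_export with a VARCHAR column called name (both names invented for illustration):

```sql
-- Sketch only: relax the offending column on the (hypothetical) target table to STRING
-- so the plain java.lang.String values written by the mapper serialize cleanly.
ALTER TABLE phoenix_export CHANGE COLUMN name name STRING;

-- Or create the target table with STRING columns up front (layout is illustrative):
CREATE TABLE IF NOT EXISTS phoenix_export (
  id   STRING,
  name STRING
)
STORED AS TEXTFILE;
```

Alternatively the mapper could wrap each value in org.apache.hadoop.hive.common.type.HiveVarchar before writing it, but that fix lives in the Java code rather than the HiveQL shown here.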

Thanks,
Kanav Narula

kanav-narula avatar Jul 25 '16 12:07 kanav-narula