Spark-The-Definitive-Guide

ch10 - Spark SQL - "Inserting into Tables" query not working: Cannot safely cast 'count': string to bigint

Open · shanmugavel04 opened this issue on Jun 20, 2021 · 0 comments

The query in the book:

```sql
INSERT INTO partitioned_flights PARTITION (DEST_COUNTRY_NAME="UNITED STATES")
SELECT count, ORIGIN_COUNTRY_NAME FROM flights
WHERE DEST_COUNTRY_NAME='UNITED STATES' LIMIT 12
```

In Spark 3.0, this query fails with the following error:

```
AnalysisException: Cannot write incompatible data to table 'default.partitioned_flights':
- Cannot safely cast 'count': string to bigint
```

So I modified the query as below to cast the count column to an integer (note that this version also lists the columns in the target table's order, with ORIGIN_COUNTRY_NAME first):

```sql
INSERT INTO partitioned_flights PARTITION (DEST_COUNTRY_NAME = "UNITED STATES")
SELECT ORIGIN_COUNTRY_NAME, cast(count as int) count1 FROM flights
WHERE DEST_COUNTRY_NAME = "UNITED STATES" LIMIT 12
```

Could you please check on this?
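For context, here is a minimal sketch of what appears to be the root cause and fix. It assumes the two tables were created as in the chapter's earlier examples (the DDL below is reproduced from ch10, using the book's sample data path). Because the book's INSERT lists count before ORIGIN_COUNTRY_NAME, Spark maps the columns by position and tries to write the string ORIGIN_COUNTRY_NAME values into the bigint count column, which Spark 3.x rejects at write time; reordering the SELECT list (with an explicit cast for good measure) resolves it:

```sql
-- Assumed setup, reproduced from earlier in ch10 of the book.
CREATE TABLE flights (
  DEST_COUNTRY_NAME STRING,
  ORIGIN_COUNTRY_NAME STRING,
  count LONG)
USING JSON OPTIONS (path '/data/flight-data/json/2015-summary.json');

CREATE TABLE partitioned_flights USING parquet PARTITIONED BY (DEST_COUNTRY_NAME)
AS SELECT DEST_COUNTRY_NAME, ORIGIN_COUNTRY_NAME, count FROM flights LIMIT 5;

-- The data columns of partitioned_flights are (ORIGIN_COUNTRY_NAME, count),
-- so the SELECT list must match that order. Casting to BIGINT matches the
-- target column type directly, as an alternative to cast(count as int).
INSERT INTO partitioned_flights PARTITION (DEST_COUNTRY_NAME = 'UNITED STATES')
SELECT ORIGIN_COUNTRY_NAME, CAST(count AS BIGINT) AS count
FROM flights
WHERE DEST_COUNTRY_NAME = 'UNITED STATES'
LIMIT 12;
```

As a side note, the stricter write-time check is governed by `spark.sql.storeAssignmentPolicy`, whose default changed to `ANSI` in Spark 3.0 (pre-3.0 behavior corresponds to `Legacy`), which is likely why the book's query worked on older Spark versions. Fixing the column order is the cleaner solution, though.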
