Unexpected error Error: generator raised StopIteration
Good afternoon, everyone. I've encountered an unusual error:
root@apache-superset-app-654d87964d-dj4ql:/usr/local/lib/python3.9/site-packages/sqlalchemy_drill/drilldbapi# pip show sqlalchemy-drill
Name: sqlalchemy-drill
Version: 1.1.4
Summary: Apache Drill for SQLAlchemy
Home-page: https://github.com/JohnOmernik/sqlalchemy-drill
Author: John Omernik, Charles Givre, Davide Miceli, Massimo Martiradonna, James Turton
Author-email: [email protected], [email protected], [email protected], [email protected], [email protected]
License: MIT
Location: /usr/local/lib/python3.9/site-packages
Requires: ijson, requests, sqlalchemy
Required-by:
LOGS:
"""
Query SELECT source_ip_type AS source_ip_type,
COUNT(*) AS count
FROM mongo.events_base.events
WHERE org_id = 'non_exists_value'
GROUP BY source_ip_type
ORDER BY count DESC
LIMIT 10000 on schema mongo.events_base failed
Traceback (most recent call last):
File "/app/superset/connectors/sqla/models.py", line 1795, in query
df = self.database.get_df(sql, self.schema, mutator=assign_column_label)
File "/app/superset/models/core.py", line 614, in get_df
data = self.db_engine_spec.fetch_data(cursor)
File "/app/superset/db_engine_specs/base.py", line 788, in fetch_data
raise cls.get_dbapi_mapped_exception(ex) from ex
File "/app/superset/db_engine_specs/base.py", line 763, in fetch_data
data = cursor.fetchall()
File "/usr/local/lib/python3.9/site-packages/sqlalchemy_drill/drilldbapi/_drilldbapi.py", line 73, in func_wrapper
return func(self, *args, **kwargs)
File "/usr/local/lib/python3.9/site-packages/sqlalchemy_drill/drilldbapi/_drilldbapi.py", line 294, in fetchall
return self.fetchmany(-1)
File "/usr/local/lib/python3.9/site-packages/sqlalchemy_drill/drilldbapi/_drilldbapi.py", line 73, in func_wrapper
return func(self, *args, **kwargs)
File "/usr/local/lib/python3.9/site-packages/sqlalchemy_drill/drilldbapi/_drilldbapi.py", line 267, in fetchmany
row_dict = next(self._row_stream)
RuntimeError: generator raised StopIteration
"""
Could it be that the error is not being caught by the try/except block in fetchmany (shown below)?
@is_open
def fetchmany(self, size: int = None):
    .......
    fetch_until = self.rownumber + (size or self.arraysize)
    results = []
    try:
        while self.rownumber != fetch_until:
            row_dict = next(self._row_stream)
            # values ordered according to self.result_md['columns']
            row = [row_dict[col] for col in self.result_md['columns']]
            if self._typecaster_list is not None:
                row = (f(v) for f, v in zip(self._typecaster_list, row))
            results.append(tuple(row))
            self.rownumber += 1
            if self.rownumber % api_globals._PROGRESS_LOG_N == 0:
                logger.info(f'streamed {self.rownumber} rows.')
    except StopIteration:
        self.rowcount = self.rownumber
        logger.info(
            f'reached the end of the row data after {self.rownumber}'
            ' records.'
        )
        # restart the outer parsing loop to collect trailing metadata
        self._outer_parsing_loop()
    return results
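My guess (not verified against the driver internals) is that this is the PEP 479 behaviour: since Python 3.7, a StopIteration that escapes from inside a generator is converted into "RuntimeError: generator raised StopIteration". If self._row_stream is itself a generator and a bare next() inside it runs out of data, the caller sees RuntimeError rather than StopIteration, so the except StopIteration branch above never fires. Here is a minimal sketch of what I mean; the generator names are made up for illustration and are not the real sqlalchemy_drill internals:

# A minimal sketch of the suspected PEP 479 behaviour.
# inner_events and row_stream are hypothetical stand-ins, not driver code.

def inner_events():
    """Stands in for the underlying ijson event stream; yields one row."""
    yield {"source_ip_type": "ipv4", "count": 1}


def row_stream():
    """Stands in for self._row_stream."""
    events = inner_events()
    while True:
        # Unguarded next() inside a generator: once `events` is exhausted,
        # the StopIteration it raises is converted into
        # "RuntimeError: generator raised StopIteration" (PEP 479, Python 3.7+).
        yield next(events)


def fetchmany_like(stream):
    """Same try/except shape as fetchmany above."""
    results = []
    try:
        while True:
            results.append(next(stream))
    except StopIteration:
        # With the RuntimeError conversion this branch is never reached,
        # so the error bubbles up to Superset's cursor.fetchall().
        print("reached the end of the row data")
    return results


try:
    fetchmany_like(row_stream())
except RuntimeError as exc:
    print(f"{type(exc).__name__}: {exc}")
    # -> RuntimeError: generator raised StopIteration

If that assumption is right, the empty result set for org_id = 'non_exists_value' would be exactly the case where the row stream ends immediately and the conversion happens.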