PyHive

Python interface to Hive and Presto. 🐝

136 PyHive issues, sorted by recently updated

Hi, I am trying to connect to HiveServer2 with PyHive using the code below: conn = hive.Connection(host='host', port=port, username='id', password='paswd', **auth='LDAP'**, database='default') and it generates the error below. File "C:\Users\userabc\AppData\Local\Continuum\Anaconda3\lib\site-packages\thrift_sasl\__init__.py",...
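For reference, a minimal sketch of an LDAP connection like the one above. The host and credentials are placeholders, and LDAP auth pulls in the `sasl`/`thrift_sasl` packages, which is where the traceback originates; this cannot run without a reachable HiveServer2, so treat it as a configuration sketch rather than a tested example.

```python
from pyhive import hive

# Hypothetical host and credentials -- substitute your own.
# The password argument is only honored for LDAP/CUSTOM auth modes.
conn = hive.Connection(
    host='hive-host.example.com',
    port=10000,            # default HiveServer2 binary-transport port
    username='id',
    password='paswd',
    auth='LDAP',
    database='default',
)

cursor = conn.cursor()
cursor.execute('SELECT 1')
print(cursor.fetchall())
```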

Recently PrestoSQL rebranded as Trino, and as part of this change they renamed their code base and client protocols. The old client headers will no longer work with the new version...

Support for the decimal type is missing from ParamEscaper.escape_item. Reproduce: ``` engine = create_engine(url) q = text('select * from test where a = :param') q = q.bindparams(bindparam('param', Decimal('1.0'))) # Raises exc.ProgrammingError...
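The missing escaping logic is straightforward to sketch. The standalone function below is an illustration of the idea, not PyHive's actual ParamEscaper: a Decimal has an exact string representation, so it can be rendered directly as a SQL numeric literal.

```python
from decimal import Decimal

def escape_item(item):
    """Render a Python value as a SQL literal (illustrative subset only)."""
    if isinstance(item, Decimal):
        # str() of a Decimal is exact, so it is safe as a numeric literal
        return str(item)
    if isinstance(item, (int, float)):
        return repr(item)
    if isinstance(item, str):
        # Naive quoting for illustration; a real escaper must also handle
        # backslashes and other special characters
        return "'" + item.replace("'", "''") + "'"
    raise TypeError(f"Unsupported type: {type(item)}")

print(escape_item(Decimal('1.0')))  # → 1.0
```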

When the token is invalid, just an EOFError is raised, without any explanation, even though the response from the server contains a meaningful message: TBinaryProtocol.trans contains code=403, message='forbidden' and headers....

Hello, I'm using PyHive with SQLAlchemy in my web application to fetch data from Hive. At first my queries seem to work just fine, but after I leave it...

The PyHive SQLAlchemy dialect cannot do 'where' filtering for dates/timestamps with a year lower than 1000, because leading zeros are omitted when %Y is used. Years 1-999 are rare in real...

Our server is configured with hive.server2.transport.mode set to HTTP. When switching to binary, everything seems to work perfectly. Is there a way to enable PyHive to work with HTTP...
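PyHive's Connection accepts a pre-built Thrift transport via the thrift_transport parameter, so one commonly suggested workaround is to hand it an HTTP transport directly. A sketch, assuming the default HTTP-mode port 10001 and /cliservice path (both are configurable server-side, and this cannot run without a live server):

```python
from pyhive import hive
from thrift.transport import THttpClient

# Hypothetical endpoint; HiveServer2's HTTP mode defaults to port 10001
# and path /cliservice, but check your server configuration.
transport = THttpClient.THttpClient(
    'http://hive-host.example.com:10001/cliservice'
)

# Any authentication headers must be set on the transport yourself,
# since Connection skips its own auth handling when given a transport.
conn = hive.Connection(thrift_transport=transport)
```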

enhancement

``` TExecuteStatementResp(status=TStatus(statusCode=3, infoMessages=["*org.apache.hive.service.cli.HiveSQLException:Error running query: org.apache.spark.sql.AnalysisException: cannot resolve '`AB`' given input columns: [spark_catalog.default.************************************ ``` The above exception is very cryptic. The exception is correctly thrown, because the SQL error...

When accessing Hive through SQLAlchemy or the Jupyter Hive Kernel, using Kerberos and the auth-conf setting, we get the following exception. It occurs on a simple select statement listing...