
The graphql engine cannot process GraphQL queries against OpenLink Virtuoso's GraphQL endpoint.

E-Babkin opened this issue 1 year ago • 5 comments

OpenLink Virtuoso exposes its GraphQL schema in a different format, without a NON_NULL top-level type.

As a result, an exception is raised both in a standalone program and inside Apache Superset.

The details of this case are also described on Stack Overflow: https://stackoverflow.com/questions/78405538/how-to-achieve-compliance-between-graphql-in-virtuoso-and-python-graphql-db-api
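For illustration, here is a sketch of the schema-shape difference (the field and type names below are only illustrative, not taken from the actual endpoints). A typical GraphQL API wraps its list fields in a top-level NON_NULL, while Virtuoso exposes a plain list, which apparently trips up graphql-db-api's type resolution and leads to the "Unable to resolve item_container_type" error shown in the trace below.

# Typical GraphQL API: the field's top-level type is NON_NULL wrapping a LIST
type Query {
  allPets: [Pet!]!
}

# Virtuoso-style schema: the top-level type is a plain LIST, with no NON_NULL wrapper
type Query {
  Concepts: [Concept]
}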

E-Babkin avatar Apr 30 '24 11:04 E-Babkin

What issue are you running into when trying to use this library with the OpenLink Virtuoso API? If there is a stack trace, could you post it?

cancan101 avatar May 01 '24 06:05 cancan101

Sure, here is the program:

from sqlalchemy import create_engine
from sqlalchemy import text


#engine = create_engine('graphql://pet-library.moonhighway.com/')
#query = "select id, name from 'allPets?is_connection=0'"

# Virtuoso's hosted GraphQL endpoint
engine = create_engine('graphql://linkeddata.uriburner.com/graphql')
query = "select name from 'Concepts?is_connection=0'"

with engine.connect() as connection:
    for row in connection.execute(text(query)):
        print(row)

And the stack trace:

/home/PycharmProjects/pythonProject/.venv/bin/python /home/PycharmProjects/pythonProject/graphql_client.py 
/home/PycharmProjects/pythonProject/graphql_client.py:8: SADeprecationWarning: The dbapi() classmethod on dialect classes has been renamed to import_dbapi().  Implement an import_dbapi() classmethod directly on class <class 'graphqldb.dialect.APSWGraphQLDialect'> to remove this warning; the old .dbapi() classmethod may be maintained for backwards compatibility.
  engine = create_engine('graphql://linkeddata.uriburner.com/graphql')
Traceback (most recent call last):
  File "/home/PycharmProjects/pythonProject/.venv/lib/python3.11/site-packages/shillelagh/backends/apsw/db.py", line 229, in execute
    self._cursor.execute(operation, parameters)
  File "src/cursor.c", line 959, in APSWCursor_execute.sqlite3_prepare_v3
apsw.SQLError: SQLError: no such table: Concepts?is_connection=0

During handling of the above exception, another exception occurred:

Traceback (most recent call last):
  File "/home/PycharmProjects/pythonProject/graphql_client.py", line 21, in <module>
    for row in connection.execute(text(query)):
               ^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^
  File "/home/PycharmProjects/pythonProject/.venv/lib/python3.11/site-packages/sqlalchemy/engine/base.py", line 1422, in execute
    return meth(
           ^^^^^
  File "/home/PycharmProjects/pythonProject/.venv/lib/python3.11/site-packages/sqlalchemy/sql/elements.py", line 514, in _execute_on_connection
    return connection._execute_clauseelement(
           ^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^
  File "/home/PycharmProjects/pythonProject/.venv/lib/python3.11/site-packages/sqlalchemy/engine/base.py", line 1644, in _execute_clauseelement
    ret = self._execute_context(
          ^^^^^^^^^^^^^^^^^^^^^^
  File "/home/PycharmProjects/pythonProject/.venv/lib/python3.11/site-packages/sqlalchemy/engine/base.py", line 1850, in _execute_context
    return self._exec_single_context(
           ^^^^^^^^^^^^^^^^^^^^^^^^^^
  File "/home/PycharmProjects/pythonProject/.venv/lib/python3.11/site-packages/sqlalchemy/engine/base.py", line 1990, in _exec_single_context
    self._handle_dbapi_exception(
  File "/home/PycharmProjects/pythonProject/.venv/lib/python3.11/site-packages/sqlalchemy/engine/base.py", line 2360, in _handle_dbapi_exception
    raise exc_info[1].with_traceback(exc_info[2])
  File "/home/PycharmProjects/pythonProject/.venv/lib/python3.11/site-packages/sqlalchemy/engine/base.py", line 1971, in _exec_single_context
    self.dialect.do_execute(
  File "/home/PycharmProjects/pythonProject/.venv/lib/python3.11/site-packages/sqlalchemy/engine/default.py", line 919, in do_execute
    cursor.execute(statement, parameters)
  File "/home/PycharmProjects/pythonProject/.venv/lib/python3.11/site-packages/shillelagh/backends/apsw/db.py", line 87, in wrapper
    return method(self, *args, **kwargs)
           ^^^^^^^^^^^^^^^^^^^^^^^^^^^^^
  File "/home/PycharmProjects/pythonProject/.venv/lib/python3.11/site-packages/shillelagh/backends/apsw/db.py", line 240, in execute
    self._create_table(uri)
  File "/home/PycharmProjects/pythonProject/.venv/lib/python3.11/site-packages/shillelagh/backends/apsw/db.py", line 304, in _create_table
    self._cursor.execute(
  File "src/vtable.c", line 907, in VirtualTable.xCreate
  File "/home/PycharmProjects/pythonProject/.venv/lib/python3.11/site-packages/shillelagh/backends/apsw/vt.py", line 294, in Create
    adapter = self.adapter(*deserialized_args)
              ^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^
  File "/home/PycharmProjects/pythonProject/.venv/lib/python3.11/site-packages/graphqldb/adapter.py", line 419, in __init__
    raise ValueError("Unable to resolve item_container_type")
ValueError: Unable to resolve item_container_type

Process finished with exit code 1

E-Babkin avatar May 01 '24 07:05 E-Babkin

  1. You should indicate that this is a non-Connection query: query = "select name from 'Concepts?is_connection=0'"
  2. You need the changes in this PR: #377; however, I am not okay with just merging that as-is without adding some additional error reporting, etc.

cancan101 avatar May 05 '24 22:05 cancan101

OK, thank you for the response. Indeed, the non-Connection option should be used in this case; I have updated the original code and the stack trace above accordingly. Also, the comments in https://github.com/cancan101/graphql-db-api/pull/377 correctly note the need to support the IRI data type.
So, after the error reporting is added, will these fixes be available in the main branch?

E-Babkin avatar May 06 '24 10:05 E-Babkin

I don't have the bandwidth to complete #377 and merge to main. That being said, you can use pip to install directly from the PR branch if you want to work with the functionality.
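For example, pip can install straight from the pull request's head ref on GitHub. This is a generic pip + GitHub mechanism rather than anything specific to this project, so treat it as a sketch and adjust the ref or branch as needed:

pip install "git+https://github.com/cancan101/graphql-db-api.git@refs/pull/377/head"

Note that refs/pull/377/head always points at the current head of the PR, so a later reinstall may pick up different code than what you originally tested.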

cancan101 avatar May 06 '24 14:05 cancan101