neomodel
Multiple label relation traversal
As per #533 I added support for multiple relation labels in relationship traversal. It's used like this:
```python
definition = dict(node_class=Person, direction=INCOMING,
                  relation_type=('FATHER', 'MOTHER'), model=None)
relations_traversal = Traversal(jim, Person.__label__, definition)
all_jims_children = relations_traversal.all()
```
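For context, multiple relation types typically appear in the generated Cypher as a type union in the relationship pattern. A rough sketch of that rendering (`render_rel_types` is a hypothetical helper for illustration, not neomodel's internal code):

```python
def render_rel_types(relation_type):
    # Accept a single type or a tuple/list of types and join them
    # with '|', Cypher's relationship-type union operator.
    if isinstance(relation_type, (tuple, list)):
        return '|'.join(relation_type)
    return relation_type

print(render_rel_types(('FATHER', 'MOTHER')))  # FATHER|MOTHER
print('(jim)<-[:%s]-(child:Person)' % render_rel_types(('FATHER', 'MOTHER')))
# (jim)<-[:FATHER|MOTHER]-(child:Person)
```

So the INCOMING traversal above would match children related to `jim` by either a `FATHER` or a `MOTHER` relationship.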
I had to add support for DISTINCT modifiers in return statements. I am not sure how this would best be done. I tried to modify the _ast['return'] field, but it is used in a lot of places and changing it broke a few things. I then introduced a new field _ast['return_mod'], which is added to the query like this:
```python
query += ' RETURN ' + self._ast.get('return_mod', '') + ' ' + self._ast['return']
```
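As a standalone illustration of that line (assuming `_ast` is a plain dict; `append_return` is just a stand-in wrapper, not part of neomodel):

```python
def append_return(query, ast):
    # Mirrors the line above: inserts the optional modifier between
    # RETURN and the return expression. Note a double space appears
    # when no modifier is set, which Cypher tolerates.
    return query + ' RETURN ' + ast.get('return_mod', '') + ' ' + ast['return']

print(append_return('MATCH (p:Person)', {'return': 'p'}))
# MATCH (p:Person) RETURN  p
print(append_return('MATCH (p:Person)', {'return': 'p', 'return_mod': 'DISTINCT'}))
# MATCH (p:Person) RETURN DISTINCT p
```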
Furthermore, we need to consider its impact on the _count, _contains and _execute methods:

- _contains just introduces a new constraint that matches a single item; the DISTINCT modifier makes no difference here.
- _execute wraps _ast['return'] in an id() call. If we want unique results, we keep the modifier.
- _count requires the most rework. Here we need to pull the modifier into the count() call.
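To illustrate the _count rework, here is a minimal sketch of pulling the modifier inside count() (`build_count_query` and the 'match' key are simplified stand-ins for illustration, not neomodel's actual internals):

```python
def build_count_query(ast):
    """Build a Cypher COUNT query, pulling any return modifier
    (e.g. DISTINCT) inside the count() call."""
    modifier = ast.get('return_mod', '')
    inner = (modifier + ' ' + ast['return']).strip()
    return ast['match'] + ' RETURN count(' + inner + ')'

# Without a modifier:
print(build_count_query({'match': 'MATCH (p:Person)', 'return': 'p'}))
# MATCH (p:Person) RETURN count(p)

# With DISTINCT pulled into count():
print(build_count_query(
    {'match': 'MATCH (p:Person)', 'return': 'p', 'return_mod': 'DISTINCT'}))
# MATCH (p:Person) RETURN count(DISTINCT p)
```

The key point is that `RETURN DISTINCT count(p)` would not deduplicate anything, since count() returns a single row; the modifier has to move inside the aggregation.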
I believe I cover all of this in the test I wrote, but a critical eye would be highly appreciated (it's my first Python unit test).
I am not sure why the build failed for Python 3.5; the log reads:
```
Traceback (most recent call last):
  File "setup.py", line 33, in <module>
    "Topic :: Database",
  File "/home/travis/virtualenv/python3.5.6/lib/python3.5/site-packages/setuptools/__init__.py", line 143, in setup
    return distutils.core.setup(**attrs)
  File "/opt/python/3.5.6/lib/python3.5/distutils/core.py", line 148, in setup
    dist.run_commands()
  File "/opt/python/3.5.6/lib/python3.5/distutils/dist.py", line 955, in run_commands
    self.run_command(cmd)
  File "/opt/python/3.5.6/lib/python3.5/distutils/dist.py", line 974, in run_command
    cmd_obj.run()
  File "/home/travis/build/neo4j-contrib/neomodel/.eggs/pytest_runner-5.2-py3.5.egg/ptr.py", line 209, in run
    return self.run_tests()
  File "/home/travis/build/neo4j-contrib/neomodel/.eggs/pytest_runner-5.2-py3.5.egg/ptr.py", line 220, in run_tests
    result_code = __import__('pytest').main()
  File "/home/travis/build/neo4j-contrib/neomodel/.eggs/pytest-6.1.2-py3.5.egg/pytest/__init__.py", line 3, in <module>
    from . import collect
  File "/home/travis/build/neo4j-contrib/neomodel/.eggs/pytest-6.1.2-py3.5.egg/pytest/collect.py", line 8, in <module>
    from _pytest.deprecated import PYTEST_COLLECT_MODULE
  File "/home/travis/build/neo4j-contrib/neomodel/.eggs/pytest-6.1.2-py3.5.egg/_pytest/deprecated.py", line 11, in <module>
    from _pytest.warning_types import PytestDeprecationWarning
  File "/home/travis/build/neo4j-contrib/neomodel/.eggs/pytest-6.1.2-py3.5.egg/_pytest/warning_types.py", line 7, in <module>
    from _pytest.compat import final
  File "/home/travis/build/neo4j-contrib/neomodel/.eggs/pytest-6.1.2-py3.5.egg/_pytest/compat.py", line 57, in <module>
    import importlib_metadata # noqa: F401
  File "/home/travis/build/neo4j-contrib/neomodel/.eggs/importlib_metadata-3.1.1-py3.5.egg/importlib_metadata/__init__.py", line 43, in <module>
    class PackageNotFoundError(ModuleNotFoundError):
NameError: name 'ModuleNotFoundError' is not defined
```
The error seems to be inside the importlib_metadata package, which declares a class PackageNotFoundError extending ModuleNotFoundError (see the code). The problem is that ModuleNotFoundError is not available in Python 3.5, as it was only introduced in Python 3.6 (see the docs).
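For reference, a common workaround for this kind of incompatibility is a small shim that aliases the missing exception on older interpreters. This is a generic sketch of the pattern, not a patch actually applied to importlib_metadata:

```python
try:
    ModuleNotFoundError  # built-in since Python 3.6
except NameError:
    # On Python 3.5 and earlier, fall back to its parent class.
    ModuleNotFoundError = ImportError

class PackageNotFoundError(ModuleNotFoundError):
    """Now safe to declare on Python 3.5 as well."""

# The subclass relationship holds on both old and new interpreters,
# since ModuleNotFoundError itself derives from ImportError:
print(issubclass(PackageNotFoundError, ImportError))  # True
```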
The same error is present in a run from ten days ago, so I think this might not be a problem with my PR but rather an error on another end?
Hello - thank you for your contribution. We are working on updating this repo and reviewing PRs but, to set expectations, maintenance is moving at a slow pace.
Hey @whatSocks, thanks for the update! If you don't mind, I'd be interested in helping out the team. Is there a process to apply as a collaborator?
@AntonLydike hi Anton, the repo has been updated since Jan - can you rebase and check on the failing tests?
Marking this as "Review to be done", since the feature looked approved by the team, and just lacked some actions (rebase, fix tests).
@AntonLydike Any chance you could review this so that we can decide if it can be added to an upcoming release?
Kudos, SonarCloud Quality Gate passed! 
- 0 Bugs
- 0 Vulnerabilities
- 0 Security Hotspots
- 3 Code Smells
- No Coverage information
- 0.0% Duplication