graphql-engine
`_in` operator in `where`-clause isn't working with special characters like `ü`, `ö`, `ä`
Version Information
Server Version: >2.11.0 (all versions above 2.11.0 seem to be broken)
CLI Version (for CLI related issue):
Environment
On-Premise (deployed Docker image in K8s); Database: PostgreSQL 13.8
What is the current behaviour?
The `_in` operator in the `where` clause isn't working with special characters like `ü`, `ö`, `ä`. Maybe there are more characters; I didn't test this.
Used GraphQL-Query:
query IssueQuery {
test(limit: 10, where: {text_w_special_chars: {_in: "Ökologisch"}}){
text_w_special_chars
}
}
Generated SQL query with Hasura versions >2.11.0:
SELECT
coalesce(json_agg("root"), '[]') AS "root"
FROM
(
SELECT
row_to_json(
(
SELECT
"_e"
FROM
(
SELECT
"_root.base"."text_w_special_chars" AS "text_w_special_chars"
) AS "_e"
)
) AS "root"
FROM
(
SELECT
*
FROM
"public"."test"
WHERE
(
("public"."test"."text_w_special_chars") = ANY((E '{"\\214kologisch"}') :: text [])
)
LIMIT
10
) AS "_root.base"
) AS "_root"
Note: The SQL query that Hasura generates for the GraphQL query seems to have an additional space in the `ANY` statement. If you paste the query into pgAdmin, it returns a syntax error. Referring to the pg docs (SQL Syntax Strings Escape), it should look like `E'foobar'`, not `E 'foobar'`.
If you correct the space, the query returns an empty result, even though the table does contain a matching row. See "How to reproduce the issue?"
GraphQL Result:
{
"data": {
"test": []
}
}
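The `\214` in the broken literal looks like an octal byte escape: PostgreSQL `E'...'` strings treat `\NNN` as an octal byte value, and in a UTF8-encoded database `Ö` is the two bytes octal `303 226`, so a single `\214` byte can never match it. A small Python sketch of the mismatch (my own illustration, not Hasura code):

```python
# Illustration (not Hasura code): PostgreSQL E'...' strings interpret
# \NNN as an octal byte escape, so E'\214' denotes the single byte 0x8C.
umlaut_o = "Ö"
utf8_bytes = umlaut_o.encode("utf-8")

# In a UTF8-encoded database, 'Ö' is two bytes: octal 303 and 226.
print([oct(b) for b in utf8_bytes])   # ['0o303', '0o226']

# The byte the broken literal actually contains:
emitted = 0o214
print(hex(emitted))                   # 0x8c

# A lone 0x8C byte is not a valid UTF-8 encoding of 'Ö',
# so the comparison can never match the stored value.
assert bytes([emitted]) != utf8_bytes
```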
What is the expected behaviour?
The result of the query should not be empty, just as it was not in Hasura versions < 2.11.1.
We expect this result from the GraphQL query above:
{
"data": {
"test": [
{
"text_w_special_chars": "Ökologisch"
}
]
}
}
For comparison, the working SQL generated by Hasura 2.11.0:
SELECT
coalesce(json_agg("root"), '[]') AS "root"
FROM
(
SELECT
row_to_json(
(
SELECT
"_e"
FROM
(
SELECT
"_root.base"."text_w_special_chars" AS "text_w_special_chars"
) AS "_e"
)
) AS "root"
FROM
(
SELECT
*
FROM
"public"."test"
WHERE
(
("public"."test"."text_w_special_chars") = ANY(('{Ökologisch}') :: text [])
)
LIMIT
10
) AS "_root.base"
) AS "_root"
How to reproduce the issue?
- Create a table with special-character data. I will share my example here:
CREATE DATABASE hasura_issue
WITH
OWNER = postgres
ENCODING = 'UTF8'
LC_COLLATE = 'de_DE.utf8'
LC_CTYPE = 'de_DE.utf8'
TABLESPACE = pg_default
CONNECTION LIMIT = -1;
CREATE SEQUENCE IF NOT EXISTS test_id_seq;
CREATE TABLE IF NOT EXISTS public.test
(
id bigint NOT NULL DEFAULT nextval('test_id_seq'::regclass),
text_w_special_chars text COLLATE pg_catalog."default" NOT NULL,
CONSTRAINT test_pkey PRIMARY KEY (id)
);
INSERT INTO test (text_w_special_chars)
VALUES ('Überfahrt'), ('Änderung'), ('Ökologisch'), ('Röhre'), ('Spezialität'), ('wühlen');
- After updating your metadata, use the Console to query the table `test`:
{
"data": {
"test": [
{
"text_w_special_chars": "Ökologisch"
}
]
}
}
Screenshots or Screencast
Query Console Hasura Version 2.11.0
Query Console Hasura Version 2.11.1
Please provide any traces or logs that could help here.
No errors in the logs.
Any possible solutions?
Maybe there is something wrong in this fix: "server: format text literals correctly in postgres array literals." (see the changelog)
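For reference, here is a hedged sketch (in Python, not Hasura's actual Haskell source) of what correct `text[]` array-literal formatting looks like: each element is double-quoted with backslashes and quotes escaped, while multibyte UTF-8 characters like umlauts pass through untouched. The helper name is hypothetical:

```python
def to_pg_text_array_literal(items):
    """Build a PostgreSQL text[] array literal, e.g. {"Ökologisch"}.

    Hypothetical helper for illustration only; it mirrors the behaviour
    the changelog entry describes, not Hasura's implementation.
    """
    quoted = []
    for s in items:
        # Escape backslashes first, then double quotes; umlauts need
        # no escaping at all inside a quoted array element.
        escaped = s.replace("\\", "\\\\").replace('"', '\\"')
        quoted.append(f'"{escaped}"')
    return "{" + ",".join(quoted) + "}"

print(to_pg_text_array_literal(["Ökologisch"]))   # {"Ökologisch"}
```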
Can you identify the location in the source code where the problem exists?
No
If the bug is confirmed, would you be willing to submit a PR?
No
Keywords
_in, special characters, query, where clause, ä, ö, ü
No umlauts permitted in the WHERE clause sounds like a serious issue to me. I would also like to see this working again.
The issue seems related to #8961, which has been fixed recently and will be part of "v2.13.0-beta.1", due to be released today/tomorrow.
@OleRoel Could you please subscribe to #8961? I'll close this one as a duplicate.