
feat: pyspark support json array operation

Open karta0807913 opened this issue 8 months ago • 0 comments

Is your feature request related to a problem?

No response

What is the motivation behind your request?

In my usage scenario I often work with JSON, but Ibis's PySpark backend does not support parsing a JSON value into an array. For example:

import ibis
ibis.pyspark.connect().compile(ibis.literal("[1,2,3]", type="json").array)

The output is:

Traceback (most recent call last):
  File "<stdin>", line 1, in <module>
  File "/usr/local/lib/python3.10/dist-packages/ibis/backends/sql/__init__.py", line 135, in compile
    query = self.compiler.to_sqlglot(expr, limit=limit, params=params)
  File "/usr/local/lib/python3.10/dist-packages/ibis/backends/sql/compilers/base.py", line 597, in to_sqlglot
    sql = self.translate(table_expr.op(), params=params)
  File "/usr/local/lib/python3.10/dist-packages/ibis/backends/sql/compilers/base.py", line 665, in translate
    results = op.map(fn)
  File "/usr/local/lib/python3.10/dist-packages/ibis/common/graph.py", line 305, in map
    results[node] = fn(node, results, **kwargs)
  File "/usr/local/lib/python3.10/dist-packages/ibis/backends/sql/compilers/base.py", line 645, in fn
    result = self.visit_node(node, **kwargs)
  File "/usr/local/lib/python3.10/dist-packages/ibis/backends/sql/compilers/base.py", line 691, in visit_node
    return method(op, **kwargs)
  File "/usr/local/lib/python3.10/dist-packages/ibis/backends/sql/compilers/base.py", line 739, in visit_Literal
    return self.visit_DefaultLiteral(op, value=value, dtype=dtype)
  File "/usr/local/lib/python3.10/dist-packages/ibis/backends/sql/compilers/base.py", line 835, in visit_DefaultLiteral
    raise NotImplementedError(f"Unsupported type: {dtype!r}")

Describe the solution you'd like

I can open a new PR to implement this feature using Spark's FROM_JSON function, which has been available since Spark 2.1.0.
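For illustration, here is a minimal PySpark sketch (not the eventual Ibis implementation) of how from_json parses a JSON string into an array column; the column name raw and the array<int> schema are assumptions chosen for this example:

from pyspark.sql import SparkSession
from pyspark.sql import functions as F

spark = SparkSession.builder.getOrCreate()

# A single-row DataFrame holding the JSON string from the example above.
df = spark.createDataFrame([("[1,2,3]",)], ["raw"])

# from_json parses the string into an ARRAY<INT> column; malformed input becomes NULL.
df.select(F.from_json(F.col("raw"), "array<int>").alias("arr")).show()

The Spark SQL expression the compiler could emit would be roughly from_json('[1,2,3]', 'array<int>'), though the exact schema handling would depend on how the Ibis JSON type is mapped.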

What version of ibis are you running?

10.3.1

What backend(s) are you using, if any?

pyspark

Code of Conduct

  • [x] I agree to follow this project's Code of Conduct

karta0807913 · Mar 23 '25, 08:03