node-oracledb
Automatic Parsing of JSON Structures when receiving from Query
I am following the guide for JSON usage with OracleDB. I have added a new column to one of my tables:
```sql
ALTER TABLE "companies"
  ADD ("geojson" VARCHAR2(1000), CONSTRAINT "geojson_check" CHECK ("geojson" IS JSON))
```
Now, when retrieving the GeoJSON structure, I receive it back as a string, as expected due to the VARCHAR2(1000) setup.
Basically, I execute the following read operation in node:
SELECT "c".*,
"c"."geojson"."coordinates"[0] "latitude",
"c"."geojson"."coordinates"[1] "longitude"
FROM "companies" "c"
WHERE "c"."id" = :id
And I asked myself if I could use the geojson column directly and return it to the caller. For that, I would need to do a JSON.parse since I do want to provide the full JSON and not a string.
Is there a way to attach a specific column type, so that I can hook up JSON.parse automatically on some column types?
Database version: 18.0.0.0 (18c)
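For reference, a minimal sketch of what I do today (inside an async function, assuming an already-open connection `connection`); the `JSON.parse` at the end is the step I would like to happen automatically:

```javascript
const oracledb = require('oracledb');

const result = await connection.execute(
  `SELECT "c".*,
          "c"."geojson"."coordinates"[0] "latitude",
          "c"."geojson"."coordinates"[1] "longitude"
     FROM "companies" "c"
    WHERE "c"."id" = :id`,
  { id: 1 },  // placeholder bind value
  { outFormat: oracledb.OUT_FORMAT_OBJECT }
);

// "geojson" is stored as VARCHAR2, so it comes back as a string and
// has to be parsed by hand before handing it to the caller.
const company = result.rows[0];
company.geojson = JSON.parse(company.geojson);
```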
@PeterMerkert Be sure to give this a read: Using GeoJSON Geographic Data
> Now, when retrieving the GeoJSON structure, I receive it back as a string, as expected due to the VARCHAR2(1000) setup.
JSON is a string. It's only when you parse it that you end up with an in-memory object or array.
> And I asked myself if I could use the geojson column directly and return it to the caller.
Who is the "caller" here? The caller of a function within your application or the caller of a REST API you've created?
In the case of a function call, yes, you would need to parse the JSON to give the caller the in-memory object. However, if it's a REST call, then the object would need to be serialized into a string to send across the wire. In that case, you would just skip the parse and send it along.
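A rough, hypothetical sketch of the two cases (the `getCompany` helper and the Express endpoint are made up purely for illustration):

```javascript
const express = require('express');
const app = express();

// Case 1: an in-application caller -- parse so the caller gets an in-memory object.
async function getCompanyGeometry(id) {
  const row = await getCompany(id);   // hypothetical helper returning a row with a geojson string
  return JSON.parse(row.geojson);     // JS object for the caller
}

// Case 2: a REST endpoint -- the response goes out as a string anyway,
// so skip the parse and send the JSON text straight through.
app.get('/companies/:id/geometry', async (req, res) => {
  const row = await getCompany(req.params.id);
  res.type('application/json').send(row.geojson);
});
```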
> For that, I would need to do a JSON.parse since I do want to provide the full JSON and not a string.
Once you do a parse, you no longer have JSON. You have the language-specific, in-memory representation of the JSON data. In JavaScript, the structure is nearly identical (because JSON comes from JavaScript). But in other languages, take PL/SQL for example, the in-memory representation may be quite different.
> Is there a way to attach a specific column type, so that I can hook up JSON.parse automatically on some column types?
@anthony-tuininga has pondered this one more than once. One option is to provide a callback function to the execute method that would do post-processing of the rows before returning them. But when you think about it, whether it's a function passed to execute or some code right after execute, it's the same logic just in two different places. I'm not sure there's much of an advantage to one over the other.
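As an illustration of the "code right after execute" variant, a minimal sketch, assuming an open connection `connection` and the table from your question:

```javascript
const oracledb = require('oracledb');

const result = await connection.execute(
  `SELECT "c".* FROM "companies" "c" WHERE "c"."id" = :id`,
  { id: 1 },  // placeholder bind value
  { outFormat: oracledb.OUT_FORMAT_OBJECT }
);

// Post-process the fetched rows: parse the JSON-as-VARCHAR2 column by hand.
const rows = result.rows.map((row) => ({
  ...row,
  geojson: row.geojson === null ? null : JSON.parse(row.geojson),
}));
```

Moving that `map()` into a callback passed to `execute` would relocate, but not remove, the same logic.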
One alternative (future) design possibility is that fetchInfo could be used with some new type constant to indicate JSON columns should be fetched as JS objects. Something similar could be done with binds. Since these would save just a line or two of application code, I'd like to be sure the added user complexity (documentation, testing, user cognitive overload etc) is worth it.
If you're playing with the JSON features in Oracle DB, you may be interested in the JSON Office Hours.
When using node-oracledb 5.1 with Oracle Client 21 and Oracle DB 21, the queried results from a JSON column will be a JS object (with some edge cases if you use the old, no-longer-default storage options like BLOB, or have different client or server versions). See https://oracle.github.io/node-oracledb/doc/api.html#jsondatatype
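For example, a minimal sketch assuming node-oracledb 5.1+, Oracle Client 21c, Oracle Database 21c, and a table with a native JSON column (the table name here is just for illustration, e.g. `CREATE TABLE j_purchaseorder (po_document JSON)`):

```javascript
const result = await connection.execute(
  `SELECT po_document FROM j_purchaseorder`
);

// The JSON column is fetched as a JavaScript object -- no JSON.parse() needed.
const doc = result.rows[0][0];
console.log(doc);
```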
Along with the native type handling for JSON, you can now use a node-oracledb 6.0 output type handler converter function to change data being fetched.
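For example, a minimal sketch of a global fetch type handler, assuming node-oracledb 6.0+ and the `"companies"."geojson"` VARCHAR2 column from the question:

```javascript
const oracledb = require('oracledb');

// Called once per column in a query; return a converter for the JSON-as-VARCHAR2 column.
oracledb.fetchTypeHandler = (metaData) => {
  if (metaData.name === 'geojson' && metaData.dbType === oracledb.DB_TYPE_VARCHAR) {
    return { converter: (v) => (v === null ? null : JSON.parse(v)) };
  }
};

const result = await connection.execute(
  `SELECT "c"."geojson" FROM "companies" "c" WHERE "c"."id" = :id`,
  { id: 1 },  // placeholder bind value
  { outFormat: oracledb.OUT_FORMAT_OBJECT }
);
console.log(result.rows[0].geojson);  // already a JS object
```

The same handler can also be supplied per-statement via the `fetchTypeHandler` option of `execute()`.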
Yessssss, thanks for the follow up @cjbj ! We have also directly upgraded today to 6.0! 🥰