
Node Client - load from file throwing error

Open AlexLeonte opened this issue 2 years ago • 4 comments

Load from file is not working when using the Node.js client.

href: 'http://0.0.0.0:9050/upload/bigquery/v2/projects/<projectId>/jobs?uploadType=multipart' message: 'runtime error: invalid memory address or nil pointer dereference'

AlexLeonte avatar Jan 17 '23 13:01 AlexLeonte

@AlexLeonte Since it is not possible to investigate from this information, please include the version of the emulator you used and the reproduction code.

goccy avatar Jan 17 '23 14:01 goccy

const { BigQuery } = require('@google-cloud/bigquery')
const path = require('path')

async function createTable() {
  const bigquery = new BigQuery({
    apiEndpoint: 'http://0.0.0.0:9050/',
    projectId: 'gccstart'
  })
  
  const datasetId = 'my_dataset';
  const tableId = 'my_table';
  
  const options = {
    location: 'US'
  }

  // Create a new table in the dataset
  const [table] = await bigquery
    .dataset(datasetId)
    .createTable(tableId, options)

  console.log(`Table ${table.id} created.`);
}
async function createDataset(){
  const bigquery = new BigQuery({
    apiEndpoint: 'http://0.0.0.0:9050',
    projectId: 'gccstart'
  })

  const datasetId = 'my_dataset'

  const [dataset] = await bigquery.createDataset(datasetId)
  console.log(`Dataset ${dataset.id} created.`)
}
async function populateTable() {
  const bigquery = new BigQuery({
    apiEndpoint: 'http://0.0.0.0:9050',
    projectId: 'gccstart'
  })

  const datasetId = 'my_dataset';
  const tableId = 'my_table';

  const metadata = {
    sourceFormat: 'NEWLINE_DELIMITED_JSON',
    schema: {
        fields: [
        {name: 'name', type: 'STRING'},
        {name: 'id', type: 'STRING'},
        ],
    },
    location: 'US',
  };

  const filePath = path.join('bqvalues.json')
  await bigquery
    .dataset(datasetId)
    .table(tableId)
    .load(filePath, metadata)
}

bqvalues.json:

{"name":"Alice","id":"12"}
{"name":"Bob","id":"34"}
{"name":"Charles","id":"45"}

For auth, a valid account is used via GOOGLE_APPLICATION_CREDENTIALS.
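
For reference, a minimal runner for the reproduction above might look like the sketch below. It assumes the emulator is already listening on 0.0.0.0:9050 and that the requires shown at the top of the snippet are in scope; the load() call in populateTable() is what triggers the nil pointer error.

// Run the repro functions in order against the emulator.
async function main() {
  await createDataset()
  await createTable()
  await populateTable() // fails with 'invalid memory address or nil pointer dereference'
}

main().catch(console.error)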

AlexLeonte avatar Jan 17 '23 15:01 AlexLeonte

@goccy I'm running into the exact same issue using the 'latest' docker image as of 2/24/2023.

  await bigquery.dataset(datasetId).table(tableId).insert([{
    policy: 'test',
    uuid: 'uuid',
    timestamp: new Date(),
    result: {
      id: 'result'
    },
    input: {
      id: 'input'
    },
  }])

Update: the JSON data type is not working. As a workaround, I changed the JSON data type to STRING; I'll change it back when running against real GCP. Hopefully the JSON data type will be supported soon. :)

From:
    fields.push({name: 'result'   , type: 'JSON',   mode: 'REQUIRED'})
    fields.push({name: 'input'    , type: 'JSON',   mode: 'REQUIRED'})

    const result = { result: 'result'}
    const input  = { input: 'input'}
    
    const decision = {
      policy: 'test',
      uuid: uuidv4(),
      timestamp: new Date(),
      result: result,
      input: input
    }   

To:

    fields.push({name: 'result'   , type: 'STRING',   mode: 'REQUIRED'})
    fields.push({name: 'input'    , type: 'STRING',   mode: 'REQUIRED'})

    const result = JSON.stringify({ result: 'result'})
    const input  = JSON.stringify({ input: 'input'})

    const decision = {
      policy: 'test',
      uuid: uuidv4(),
      timestamp: new Date(),
      result: result,
      input: input
    }   
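
With that workaround, reading the rows back just needs a JSON.parse on the stringified columns. A minimal sketch, assuming the same bigquery client, datasetId and tableId as above:

    // Read rows back and restore the objects that were stored as STRING.
    const [rows] = await bigquery
      .dataset(datasetId)
      .table(tableId)
      .getRows()

    for (const row of rows) {
      const result = JSON.parse(row.result) // stored via JSON.stringify above
      const input  = JSON.parse(row.input)
      console.log(row.policy, result, input)
    }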

sklakken avatar Feb 24 '23 20:02 sklakken

I have the same problem; I hope it will be supported :)

SebHeuze avatar Mar 29 '23 15:03 SebHeuze