label-studio-transformers

not showing predictions after training

Open hienvantran opened this issue 3 years ago • 17 comments

Describe the bug

There are several problems:

  1. After training the ML backend model, I cannot find the model's predictions in the UI when labelling.
  2. Training often cannot complete all 100 epochs; the system crashes in the middle (around epochs 30-70), even though the dataset is small (50 samples) and a GPU is available.
  3. The error shows: get latest job results from work dir doesn't exist
  4. Sometimes, when 3 does not occur, another issue appears: unable to load weights from pytorch checkpoint file. [screenshots]

To reproduce

Steps to reproduce the behaviour:

  1. Import pre-annotated data.
  2. Manually label some of the tasks.
  3. Go to the ML tab in Settings, connect the model (a BERT classifier), and start training.
  4. After training finishes, return to the labeling UI. In the predictions tab, only the pre-annotated predictions are shown.

Expected behaviour

ML training should complete and new predictions should be shown in the UI.

hienvantran avatar Jun 02 '21 04:06 hienvantran

Facing a similar issue. Note: I am a beginner with Django and Label Studio, so please forgive my naivety.

  1. Using BERT NER with the sample data from here.
  2. At the end of training, the log says "POST /train HTTP/1.1" 201, which I understand is confirmation of successful training.
  3. In the UI filters, I select prediction results and prediction score. [See screenshot]

Questions:

  1. Is this the correct way of viewing the predictions?
  2. If so, how do I get the prediction scores?

[screenshot]

kbv72 avatar Jun 23 '21 17:06 kbv72

Check your code to see whether the variables pred_labels and pred_scores are actually set in your code.
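
For reference, a minimal sketch of how those variables might be used in a label-studio-ml backend's predict() method; run_model() is a hypothetical stand-in for your model, and the from_name/to_name values must match your labeling config:

from label_studio_ml.model import LabelStudioMLBase

class MyBertBackend(LabelStudioMLBase):
    def predict(self, tasks, **kwargs):
        predictions = []
        for task in tasks:
            # self.run_model() is a hypothetical helper standing in for your model
            pred_labels, pred_scores = self.run_model(task["data"]["text"])
            predictions.append({
                "result": [{
                    "from_name": "label",
                    "to_name": "text",
                    "type": "choices",
                    "value": {"choices": [pred_labels[0]]},
                }],
                "score": float(pred_scores[0]),  # root-level score
            })
        return predictions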

china-zyy avatar Aug 13 '21 07:08 china-zyy

@kbv72 sorry for a late answer :slightly_frowning_face:

Is this the correct way of viewing the predictions?

Yes, it looks ok.

If so, how do I get the prediction scores?

You have to put a float "score" field into the root of the prediction dict.
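
For example, a minimal sketch (the from_name/to_name/type values are placeholders that must match your labeling config):

prediction = {
    "score": 0.92,  # float, at the root of the prediction dict
    "result": [{
        "from_name": "label",
        "to_name": "text",
        "type": "choices",
        "value": {"choices": ["Positive"]},
    }],
}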

makseq avatar Aug 19 '21 01:08 makseq

Hello there, I am facing the same issue with the prediction scores.

[screenshot]

As you can see, I have converted my scores to float, but they are still not showing up in my frontend.

[screenshot]

aczy99 avatar Mar 06 '22 15:03 aczy99

@aczy99 Try setting the model version in Project Settings => Machine Learning. It's also better to include a "model_version" field in the prediction root.
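
For example (a rough sketch; "bert-v1" is just a placeholder string):

prediction = {
    "model_version": "bert-v1",  # also select this version in Project Settings => Machine Learning
    "score": 0.87,               # root-level score, as above
    "result": [...],             # same result format as before
}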

makseq avatar Mar 11 '22 20:03 makseq

Facing the same issue: scores are provided but not shown in the tasks list. Selecting the model version doesn't seem to do anything; it displays "Saved!" but the selection is removed upon refreshing the page. It is also confusing where exactly the model version is retrieved from, as it is not the same as the one I provide together with the predictions.

35grain avatar Dec 29 '22 20:12 35grain

is not the same as the one I provide together with the predictions.

What do you mean by this?

Please show your tasks with predictions.

makseq avatar Jan 05 '23 04:01 makseq

What do you mean by this?

Please show your tasks with predictions.

I am using the format provided as an example in the docs (https://labelstud.io/guide/predictions.html#Example-JSON). Scores per result (individual label) are shown in the Labeling view; however, the overall score of a task (predictions.score) is not displayed in the Prediction score column of the project's Tasks view (at least, this is where I would expect it to be displayed).

As for the model version, I would also expect the string set in predictions.model_version to be listed under Project Settings => Machine Learning; however, there are instead some numeric combinations and 'INITIAL'. A new version (a numeric combination) appears every time I rebuild the backend Docker container.

35grain avatar Jan 05 '23 13:01 35grain

Most likely you have a mistake in the prediction/task format.

makseq avatar Jan 05 '23 13:01 makseq

Most likely you have a mistake in the prediction/task format.

So I pulled out the list of predictions made on my tasks, and this is one example stored in the Label Studio database:

{
    "id": 2053,
    "model_version": "INITIAL",
    "created_ago": "1 day, 19 hours",
    "result": [
      {
        "from_name": "label",
        "image_rotation": 0,
        "original_height": 720,
        "original_width": 1280,
        "score": 0.7231900691986084,
        "to_name": "image",
        "type": "rectanglelabels",
        "value": {
          "height": 9.309395684136284,
          "rectanglelabels": [
            "Visual defects"
          ],
          "rotation": 0,
          "width": 8.777930736541748,
          "x": 35.5985426902771,
          "y": 25.251723395453556
        }
      },
      {
        "from_name": "label",
        "image_rotation": 0,
        "original_height": 720,
        "original_width": 1280,
        "score": 0.6727777123451233,
        "to_name": "image",
        "type": "rectanglelabels",
        "value": {
          "height": 9.74336412217882,
          "rectanglelabels": [
            "Visual defects"
          ],
          "rotation": 0,
          "width": 5.079851150512695,
          "x": 24.418318271636963,
          "y": 33.53565639919705
        }
      },
      ...
    ],
    "score": 0.5981751024723053,
    "cluster": null,
    "neighbors": null,
    "mislabeling": 0,
    "created_at": "2023-01-03T18:40:23.579377Z",
    "updated_at": "2023-01-03T18:40:23.579422Z",
    "task": 600
}

As we can see, the score is in fact stored in the database, so it is simply not being displayed. This leads me to believe that the issue is not caused by a formatting mistake in the backend output. It is odd, though, that the model version string is not stored in the database, as it is defined right after the score in my backend output.

35grain avatar Jan 05 '23 14:01 35grain

@35grain I see you missed the "id" field in the results:

"result": [
      {
        "id": "random123", <====
        "from_name": "label",
        "image_rotation": 0,
        "original_height": 720,
        "original_width": 1280,
        "score": 0.7231900691986084,
        "to_name": "image",
        "type": "rectanglelabels",
        "value": {
          "height": 9.309395684136284,
          "rectanglelabels": [
            "Visual defects"
          ],
          "rotation": 0,
          "width": 8.777930736541748,
          "x": 35.5985426902771,
          "y": 25.251723395453556
        }
      },

makseq avatar Jan 11 '23 01:01 makseq

Fixed it for me using this code:

        # Append a prediction whose single result carries an explicit "id" field
        prediction.append({
            "result": [{
                "value": {
                    "start": 1593866042000,
                    "end": 1593966345000,
                    "instant": False,
                    "timeserieslabels": ["test"],
                },
                "id": "jT4DkFmczt",  # unique result id
                "from_name": "label",
                "to_name": "ts",
                "type": "timeserieslabels",
            }],
        })

But the score is not working yet, or I haven't found out where to put the score correctly.

Developer66 avatar Jan 23 '23 22:01 Developer66

@35grain I see you missed the "id" field in the results:

"result": [
      {
        "id": "random123", <====
        "from_name": "label",
        "image_rotation": 0,
        "original_height": 720,
        "original_width": 1280,
        "score": 0.7231900691986084,
        "to_name": "image",
        "type": "rectanglelabels",
        "value": {
          "height": 9.309395684136284,
          "rectanglelabels": [
            "Visual defects"
          ],
          "rotation": 0,
          "width": 8.777930736541748,
          "x": 35.5985426902771,
          "y": 25.251723395453556
        }
      },

Finally had time to play around with this, and unfortunately adding an ID did not solve it. Both the score and the model version are still missing/invalid.

{
   "id":3353,
   "model_version":"INITIAL",
   "created_ago":"0 minutes",
   "result":[
      {
         "from_name":"label",
         "id":"0",
         "image_rotation":0,
         "original_height":720,
         "original_width":1280,
         "score":0.9651663303375244,
         "to_name":"image",
         "type":"rectanglelabels",
         "value":{
            "height":12.63888888888889,
            "rectanglelabels":[
               "Visual defects"
            ],
            "rotation":0,
            "width":12.65625,
            "x":30.390625,
            "y":24.444444444444443
         }
      },
      {
         "from_name":"label",
         "id":"1",
         "image_rotation":0,
         "original_height":720,
         "original_width":1280,
         "score":0.9236611127853394,
         "to_name":"image",
         "type":"rectanglelabels",
         "value":{
            "height":10.555555555555555,
            "rectanglelabels":[
               "Visual defects"
            ],
            "rotation":0,
            "width":7.8125,
            "x":36.796875,
            "y":74.86111111111111
         }
      },
      ...
      {
         "from_name":"label",
         "id":"23",
         "image_rotation":0,
         "original_height":720,
         "original_width":1280,
         "score":0.3958974778652191,
         "to_name":"image",
         "type":"rectanglelabels",
         "value":{
            "height":7.638888888888889,
            "rectanglelabels":[
               "Visual defects"
            ],
            "rotation":0,
            "width":4.84375,
            "x":52.5,
            "y":83.19444444444444
         }
      }
   ],
   "score":0.8146782057980696,
   "cluster":null,
   "neighbors":null,
   "mislabeling":0.0,
   "created_at":"2023-01-24T15:30:13.252432Z",
   "updated_at":"2023-01-24T15:30:13.252489Z",
   "task":77
}

35grain avatar Jan 24 '23 15:01 35grain

@35grain did you ever find a solution? I'm having the same issue of no prediction score, despite the prediction result structure looking correct.

croche2574 avatar Apr 26 '23 03:04 croche2574

@35grain did you ever find a solution? I'm having the same issue of no prediction score, despite the prediction result structure looking correct.

Unfortunately not. Didn't have time to dig any deeper either :/

35grain avatar Apr 26 '23 10:04 35grain

I've found that if you attach scores both to each result item and at the root alongside the result list, the score will appear in the interface. I think there are some recently introduced bugs that we're working on sorting out, and hopefully they will be cleared up soon.
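
For example, a rough sketch based on the JSON examples earlier in this thread (names and numbers are placeholders):

prediction = {
    "model_version": "my-model-v1",
    "score": 0.81,  # overall task score, alongside the result list
    "result": [{
        "id": "abc123",  # unique region id
        "from_name": "label",
        "to_name": "image",
        "type": "rectanglelabels",
        "score": 0.81,  # per-region score, attached to the result itself
        "value": {
            "x": 30.4,
            "y": 24.4,
            "width": 12.7,
            "height": 12.6,
            "rotation": 0,
            "rectanglelabels": ["Visual defects"],
        },
    }],
}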

hogepodge avatar May 06 '23 18:05 hogepodge