Now we’ll retrain the model. You can either annotate selected instances from the file you already uploaded, or upload annotated data as a new file.

Option 1: Annotate selected instances from the uploaded file

Select instances to annotate from the uploaded text

  1. Select Action: “Model”

  2. Select your model

  3. Select Action: “Update”

  4. Select Mode: “request” and select how many instances you want to annotate.

  5. Select sampler:

    1. Random: Returns random text instances from your dataset

    2. Margin (recommended): Selects the instances the underlying model is most uncertain about, based on the margin between its top two predicted classes (see the sketch after this list)

  6. Then click the “Request” button.
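
The tool does not expose its sampler implementation, but margin sampling is a standard active-learning heuristic: rank instances by the gap between the model’s top two class probabilities and pick the smallest gaps. A minimal Python sketch of the idea (the function name and data are illustrative, not the tool’s API):

  import numpy as np

  def margin_sample(probs, n):
      """Return indices of the n instances with the smallest margin.

      probs: (num_instances, num_classes) predicted class probabilities.
      A small gap between the two most likely classes means the model
      is uncertain, so annotating that instance is most informative.
      """
      sorted_probs = np.sort(probs, axis=1)                # ascending per row
      margins = sorted_probs[:, -1] - sorted_probs[:, -2]  # top-1 minus top-2
      return np.argsort(margins)[:n]                       # smallest margins first

  # Example: 4 instances, 3 classes
  probs = np.array([
      [0.34, 0.33, 0.33],   # very uncertain -> tiny margin
      [0.90, 0.05, 0.05],   # confident      -> large margin
      [0.50, 0.45, 0.05],
      [0.70, 0.20, 0.10],
  ])
  print(margin_sample(probs, 2))  # -> [0 2], the two most uncertain rows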

Annotate selected instances

Now you want to annotate the requested instances. To do that, proceed as follows:

...

You can repeat this procedure multiple times.

Option 2: Upload a file of annotated instances

  1. Upload a file that has a column with post annotations (an example layout is sketched after this list)

  2. Select the text column

  3. Select the label column (post annotations)

  4. Click “Update” to retrain the model. Retraining takes a little while; until it finishes you may see the message “Model not ready. Come back later”.
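
For illustration, a minimal annotated file could look like the sketch below (the column names and data are examples; you choose the actual text and label columns in the UI):

  import pandas as pd

  # Hypothetical annotated upload: one text column, one label column.
  annotated = pd.DataFrame({
      "text":  ["great product, works perfectly",
                "arrived broken, very disappointed"],
      "label": ["positive", "negative"],
  })
  annotated.to_csv("annotated_instances.csv", index=False)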

Estimated quality performance

After the model has been trained at least once, you can monitor how its estimated quality evolves.

...

  1. With your model selected, select Action: “inspect”.

  2. After each iteration, a new point is added to the “estimated quality performance” line plot. When the curve starts to flatten, the model’s quality is converging and it is unlikely to learn much more from further iterations.
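
Reading “flattening” off the plot is a judgment call. One way to make it concrete, sketched here as a heuristic rather than a feature of the tool, is to stop once the average gain over the last few iterations drops below a threshold:

  def is_converging(quality_per_iteration, window=3, min_gain=0.01):
      """Heuristic: the curve is flat when the average gain over the
      last `window` iterations falls below `min_gain`."""
      if len(quality_per_iteration) <= window:
          return False
      recent = quality_per_iteration[-(window + 1):]
      gains = [b - a for a, b in zip(recent, recent[1:])]
      return sum(gains) / len(gains) < min_gain

  scores = [0.55, 0.68, 0.74, 0.76, 0.765, 0.768]  # example quality values
  print(is_converging(scores))  # True: recent gains are tiny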

Evaluate model performance

You can evaluate the model’s performance on a dataset different from the one it was trained on.

  1. Prepare your data file with the following columns:

    1. Text: Your raw text instances

    2. Label: Label annotations for each text

  2. Select Action: “evaluate”

  3. Upload your file

  4. Select the text column and label column

  5. As a result, you’ll see a table with different performance metrics and a list of text instances where the predicted label and the annotated label disagree.
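
The exact metrics the tool reports are not specified here, but the evaluation conceptually resembles the following sketch (scikit-learn is used for illustration and the data is made up):

  from sklearn.metrics import accuracy_score, classification_report

  texts     = ["loved it", "waste of money", "it's fine", "never again"]
  annotated = ["positive", "negative", "positive", "negative"]
  predicted = ["positive", "negative", "negative", "negative"]

  print(accuracy_score(annotated, predicted))         # overall accuracy: 0.75
  print(classification_report(annotated, predicted))  # per-class precision/recall/F1

  # Instances where predicted and annotated labels disagree
  mismatches = [(t, a, p) for t, a, p in zip(texts, annotated, predicted)
                if a != p]
  print(mismatches)  # [("it's fine", 'positive', 'negative')]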

...

...

Next:

...

K4. Inference