Exporting Annotations


You can use RedBrick AI's Python SDK to export your annotations with a Python script.

Within the Python SDK, annotations are exported in two ways:

  1. The export_tasks function returns a Python object containing metadata and any vector annotations (measurements, landmarks, etc.). Please see the format of the object here.

  2. By default, segmentation data is written to your disk in NIfTI format. Segmentation data can also be exported in PNG or RT Struct format by adjusting the parameters of the export_tasks function. Please view the detailed export_tasks reference here.

If you're attempting a one-time export or don't have intensive requirements for your export, the CLI also provides a simple and optimized workflow for exporting a Project's annotations.

Export Folder Structure

RedBrick AI exports annotations in a JSON structure, accompanied by NIfTI-1 masks for segmentations. All data will be exported within a folder named after your project_id, with the following structure:

project_id/
├── segmentations
│   ├── study01
│   │   └── series1.nii
│   └── study02
│       ├── series1.nii
│       └── series2.nii
└── tasks.json

The above structure is for a standard export (i.e. not semantic, not binary mask, etc.) and assumes no overlapping segmentations.
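To work with an export programmatically, you can walk this folder from Python. The snippet below is a minimal sketch, not part of the SDK: it assumes the export landed in a local directory named after your project_id and that tasks.json is a list of Task entries, as described above.

import json
from pathlib import Path

# Path to the export folder; replace with your own project_id directory
export_root = Path("project_id")

# tasks.json holds the metadata and vector annotations for every exported Task
with open(export_root / "tasks.json") as f:
    tasks = json.load(f)
print(f"Exported {len(tasks)} Tasks")

# List the NIfTI segmentation files grouped by Task sub-directory
for task_dir in sorted((export_root / "segmentations").iterdir()):
    nifti_files = sorted(p.name for p in task_dir.glob("*.nii*"))
    print(f"{task_dir.name}: {nifti_files}")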

Segmentations Subdirectory

The segmentations directory will contain a single sub-directory for each Task in your export. The sub-directories will be named after the Task name. A single Task (depending on whether it was single-series or multi-series) can have one or more segmentations.

The individual segmentation files will be in NIfTI-1 format and will be named after the user-defined series name. If no series name is provided on upload, RedBrick will assign a unique name. Corresponding metadata (e.g. category names) will be provided in tasks.json.

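To inspect one of the exported masks, you can read it with any NIfTI-aware library. The snippet below is a minimal sketch that assumes the third-party nibabel package is installed (pip install nibabel) and uses the example path from the folder structure above.

import numpy as np
import nibabel as nib  # third-party package: pip install nibabel

# Load one exported mask, e.g. project_id/segmentations/study01/series1.nii
mask = nib.load("project_id/segmentations/study01/series1.nii")
data = np.asarray(mask.dataobj)

# Each non-zero voxel value corresponds to an annotated structure; the mapping
# of values to category names is provided in tasks.json
values, counts = np.unique(data[data > 0], return_counts=True)
for value, count in zip(values, counts):
    print(f"label value {int(value)}: {int(count)} voxels")
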
Code Examples

As always, you should first perform the standard RedBrick AI SDK setup to create a Project object.

project = redbrick.get_project(org_id, project_id, api_key)

With a new Project object created, you can export your Project's Tasks in various ways. Please see some common examples below.

Export All Tasks

The export_tasks() function exports segmentation files for all Ground Truth Tasks by default. To export All Tasks, set the only_ground_truth parameter to False.

annotations = project.export.export_tasks(only_ground_truth=False)

Export Only Ground Truth

You can export only the Tasks in Ground Truth, i.e., Tasks that have successfully made it through all Label and Review Stages.

gt_annotations = project.export.export_tasks(only_ground_truth=True)

Export Specific Tasks

Export selected Tasks by specifying Task IDs.

specific_annotations = project.export.export_tasks(task_id="...")
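If you want to keep a copy of the returned annotations alongside the segmentation files written to disk, you can serialize the object yourself. This is a minimal sketch, assuming the object returned by export_tasks is a JSON-serializable list of Task dictionaries and reusing the project object created above; the output filename is arbitrary.

import json

# Export all Tasks and write the returned metadata/vector annotations to disk
annotations = project.export.export_tasks(only_ground_truth=False)

with open("annotations_export.json", "w") as f:
    json.dump(annotations, f, indent=2)

print(f"Wrote {len(annotations)} Task entries to annotations_export.json")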

Generate an Audit Trail

An audit trail can be useful for regulators interested in your quality control processes, as well as for managing your internal QA processes. Please see a detailed reference for get_task_events here.

First, perform the standard RedBrick AI SDK setup to create a Project object.

Audit Trail - All Tasks

If you'd like to generate an audit trail for all Tasks (not only those in the Ground Truth Stage), be sure to include the only_ground_truth=False parameter.

# Return an audit trail for all Tasks in all Stages
audit_trail = project.export.get_task_events(only_ground_truth=False)

Audit Trail - Ground Truth Tasks Only

Retrieve an audit trail for all Ground Truth Tasks. Please note that by default, get_task_events only returns audit information for Tasks in the Ground Truth Stage.

project = redbrick.get_project(org_id, project_id, api_key)

# Return an audit trail for all Tasks currently in the Ground Truth Stage
audit_trail = project.export.get_task_events()

The returned object will contain data similar to the code snippet below, where each entry will represent a single Task (uniquely identified by taskId). The events array contains all key events/actions performed on the Task, with events[0] being the first event.

[
  {
    "taskId": "...",
    "currentStageName": "Label",
    "events": [
      {
        "eventType": "TASK_CREATED",
        "createdAt": "...",
        "isGroundTruth": false,
        "createdBy": "..."
      },
      {
        "eventType": "TASK_ASSIGNED",
        "createdAt": "...",
        "assignee": "...",
        "stage": "Label"
      }
    ]
  }
]
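As a quick way to review the audit trail, you can flatten each Task's event history into a single line. The sketch below uses only the fields shown in the snippet above (taskId, currentStageName, events, eventType) and reuses the project object created earlier.

# Summarize the audit trail for all Tasks in all Stages
audit_trail = project.export.get_task_events(only_ground_truth=False)

for task in audit_trail:
    event_types = [event["eventType"] for event in task["events"]]
    print(task["taskId"], task["currentStageName"], "->", " / ".join(event_types))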

Additional Capabilities

The following is a non-exhaustive list of other functionalities available when using the Export class. A full list of the capabilities of our Export class can be found here.

Track labeler or reviewer time spent on a Task with get_active_time();

Fetch Task events from a specific timestamp to the present day using get_task_events() and the from_timestamp parameter (see the sketch after this list);

Easily search for Tasks based on a wide variety of criteria using list_tasks();

Perform a semantic export (that exports a single file per category name) using export_tasks();

Configure Hanging Protocols;

Upload a script for Custom Label Validation.
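As an illustration of the from_timestamp parameter mentioned above, the sketch below fetches only recent Task events. It assumes from_timestamp accepts a Unix timestamp in seconds; please confirm the expected type in the get_task_events reference.

import datetime

# Fetch Task events from the last 7 days to the present day
one_week_ago = int((datetime.datetime.now() - datetime.timedelta(days=7)).timestamp())

recent_events = project.export.get_task_events(
    only_ground_truth=False,
    from_timestamp=one_week_ago,
)
print(f"{len(recent_events)} Tasks had events in the last 7 days")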
