Welcome to Xtreme1, the open-source platform for multisensory training data.
Xtreme1 is the world's first open-source platform for multisensory training data.
Xtreme1 provides deep insight into data annotation, data curation, and ontology management to solve ML challenges in 2D image and 3D point cloud datasets.
The built-in AI-assisted tools take your annotation efforts to the next level of efficiency for your 2D/3D Object Detection, 3D Instance Segmentation, and LiDAR-Camera Fusion projects.
1️⃣ Supports data labeling for image, 3D LiDAR, and 2D/3D sensor fusion datasets
2️⃣ Built-in pre-labeling and interactive models support 2D/3D object detection, segmentation, and classification
3️⃣ Configurable Ontology Center for general classes (with hierarchies) and attributes for use in your model training
4️⃣ Data management and quality monitoring
5️⃣ Find and fix labeling errors
6️⃣ Results visualization to help you evaluate your model
Getting Started
You can install Xtreme1 on a Linux, Windows, or macOS machine.
Prerequisite details and built-in model installation are explained here.
Get started with the Quick Start:
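As a minimal sketch of the Docker-based setup covered in the installation guide (the release tag below is a placeholder; use an actual release from the GitHub Releases page):

```bash
# Download and extract a release package from GitHub Releases
# (replace <version> with a real release tag from
# https://github.com/xtreme1-io/xtreme1/releases).
wget https://github.com/xtreme1-io/xtreme1/releases/download/<version>/xtreme1-<version>.zip
unzip xtreme1-<version>.zip -d xtreme1 && cd xtreme1

# Start all services in the background with Docker Compose,
# then open the web UI in your browser once the containers are up.
docker compose up -d
```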
Xtreme1, the first open-source labeling, annotation, and visualization project, is debuting at the Linux Foundation AI & Data Global Landscape.
Support and Community
Join our community to chat with other members.
Issues: https://github.com/xtreme1-io/xtreme1/issues
Medium: https://medium.com/multisensory-data-training
GitHub: https://github.com/xtreme1-io/xtreme1
Twitter: https://twitter.com/Xtreme1io
Subscribe to the latest video tutorials on our YouTube channel.
Please refer to the Linux Foundation Trademark Usage page to learn about the usage policy and guidelines: https://www.linuxfoundation.org/trademark-usage.