TensorLabs AI maintains a comprehensive portfolio on Hugging Face, establishing itself as a major contributor to the open-source AI ecosystem. Our presence includes cutting-edge models, curated datasets, and research that moves the industry forward.
Architecture: Fine-tuned from the Qwen3-4B base model
Performance: Optimized for web-based AI operating systems
Quantizations: Two quantized versions available for different deployment scenarios
Use Cases: Specialized for Julia OS integration and system-level AI tasks
We created a dataset for our custom Solana developer model. It consists of 25,000 question-answer pairs, generated in batches of 5,000 using a rotating set of templates.
We based the content on key sources such as the official Solana documentation, Solana Cookbook, Anchor docs, SPL references, blog posts, and GitHub repositories.
To ensure variety and depth, we used seven distinct templates.
Templates were rotated either sequentially, for balanced coverage, or randomly, for natural variation; we chose the rotation mode to keep template usage evenly distributed across the full dataset.
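The rotation scheme described above can be sketched as follows. This is a minimal illustration, not our production pipeline; the template identifiers and function names are hypothetical, since the actual seven templates are not reproduced here.

```python
import random

# Hypothetical placeholders for the seven question-answer templates.
TEMPLATES = [f"template_{i}" for i in range(7)]

def pick_template(index, mode="sequential"):
    """Choose a template for the QA pair at position `index`.

    "sequential" cycles through templates for balanced coverage;
    "random" samples uniformly for natural variation.
    """
    if mode == "sequential":
        return TEMPLATES[index % len(TEMPLATES)]
    return random.choice(TEMPLATES)

def generate_batches(total=25_000, batch_size=5_000, mode="sequential"):
    """Yield (batch_number, template assignments) for each batch of QA pairs."""
    for start in range(0, total, batch_size):
        assignments = [pick_template(i, mode) for i in range(start, start + batch_size)]
        yield start // batch_size, assignments
```

With sequential rotation over 25,000 pairs, each of the seven templates is used either 3,571 or 3,572 times, so coverage stays within one pair of perfectly balanced.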
A quality-control system rated each QA pair on relevance, correctness, completeness, clarity, and code quality to ensure a high standard across the dataset.