RapidFire AI today announced the open‑source release of its “rapid experimentation” engine designed to dramatically speed up and simplify one of the most critical, yet underserved, stages of AI development: customizing large language models (LLMs) through fine‑tuning and post‑training.
Released under the Apache 2.0 license, RapidFire AI lets you launch and compare many fine-tuning/post-training configurations at once on a single GPU or across multiple GPUs, spanning data, model/adapter choices, trainer hyperparameters, and reward functions. It does this by training on dataset chunks and efficiently swapping adapters or base models between chunks, while the scheduler automatically reallocates GPUs for high utilization. Live metrics stream to an MLflow dashboard, from which you can stop, resume, and clone-modify configurations, enabling faster, cheaper exploration toward better eval metrics.
Built for Hyperparallel Exploration and Interactive Control
RapidFire AI enables users to launch as many training/tuning configurations as they want in parallel, even on a single multi‑GPU machine, spanning variations of base model architectures, hyperparameters, adapter specifics, data preprocessing, and reward functions. Live metrics and Interactive Control (IC) Ops allow users to stop weak configurations early, clone high performers, and warm‑start new configurations in real time right from the dashboard, enabling more impactful results without needing more GPU resources. In the same wall time as a few sequential comparisons, teams can explore far more paths and reach better metrics, often realizing 20× higher experimentation throughput.
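To make the scale of such a sweep concrete, here is a minimal sketch of enumerating a configuration grid of the kind a hyperparallel run might launch. The dimension names (base_model, lora_rank, learning_rate) and model IDs are illustrative assumptions, not RapidFire AI's actual API:

```python
# Hypothetical config grid; the keys and values below are assumptions
# for illustration, not RapidFire AI's real configuration schema.
from itertools import product

grid = {
    "base_model": ["meta-llama/Llama-3.1-8B", "mistralai/Mistral-7B-v0.3"],
    "lora_rank": [8, 16, 32],
    "learning_rate": [1e-5, 5e-5],
}

# Cartesian product of all grid dimensions -> one dict per candidate run.
configs = [dict(zip(grid, values)) for values in product(*grid.values())]
print(len(configs))  # 2 * 3 * 2 = 12 candidate configurations
```

Run sequentially, each of these twelve candidates would need its own full training pass; launched together and compared on live metrics, weak ones can be pruned after the first data chunks.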
Key Capabilities
- Hyperparallel configuration comparison on a single machine: compare 20+ variants in parallel; expand or prune on the fly based on data- and use case-specific constraints.
- Interactive Control (IC) Ops: Stop, Resume, Clone‑Modify, and warm‑start new configurations directly from the dashboard on the fly to double down on what works.
- Chunk‑wise scheduling: Adaptive engine cycles configurations over chunks of the data to maximize GPU utilization, while ensuring sequential-equivalent metrics and minimizing runtime overheads.
- Hugging Face‑native workflow: Works natively with PyTorch, Transformers, TRL; supports PEFT/LoRA and quantization.
- Supported TRL workflows: SFT, DPO, and GRPO.
- MLflow‑based dashboard: Unified tracking and visualization for all metrics, metadata management, and control panel for IC Ops—no extra MLOps wiring needed.
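The chunk‑wise scheduling idea above can be sketched with a toy round‑robin scheduler in plain Python. This is an assumption-laden simplification (one GPU, a fixed chunk order); RapidFire AI's actual scheduler is adaptive and multi‑GPU. The point it illustrates is why cycling configurations over chunks yields early, directly comparable metrics instead of finishing runs one at a time:

```python
# Toy round-robin chunk scheduler -- an illustrative sketch, not
# RapidFire AI's real implementation.
from collections import defaultdict

def chunk_schedule(configs, num_chunks):
    """Yield (config, chunk_index) pairs: every config trains on chunk k
    before any config moves on to chunk k + 1."""
    for chunk in range(num_chunks):
        for cfg in configs:
            yield cfg, chunk

progress = defaultdict(list)
for cfg, chunk in chunk_schedule(["cfg-A", "cfg-B", "cfg-C"], num_chunks=2):
    progress[cfg].append(chunk)

# After the first cycle, all three configs have trained on chunk 0,
# so their metrics are comparable at the same point in the data --
# weak configs can be stopped before consuming more GPU time.
print(dict(progress))  # {'cfg-A': [0, 1], 'cfg-B': [0, 1], 'cfg-C': [0, 1]}
```

Because every configuration reaches the same data point together, a dashboard can rank them apples-to-apples after each chunk, which is what makes early stop/clone decisions meaningful.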
RapidFire AI’s technology is rooted in award-winning research by its Co-founder, Professor Arun Kumar, a faculty member in both the Department of Computer Science and Engineering and the Halicioglu Data Science Institute at the University of California, San Diego.
The company has raised $4 million in pre-seed funding from leading deep‑tech investors including .406 Ventures, AI Fund, Willowtree Investments, and Osage University Partners.
Availability
RapidFire AI’s open‑source package, documentation, and quickstart guides are available now: rapidfire.ai/docs
AI developers and researchers are invited to try out the package, share feedback, showcase their use cases, and contribute to extensions. For more information on the company, visit www.rapidfire.ai.
TELUS Reaches Historic Planting of 25 Million Tree Milestone During National Forest Week
Posted in Commentary with tags Telus on September 23, 2025 by itnerd
In celebration of National Forest Week, TELUS has achieved a landmark environmental milestone by planting its 25 millionth tree. When fully mature, these 25 million trees will absorb 7.5 million metric tons of CO2, equivalent to removing 1.8 million cars from roads, while creating vital wildlife habitats across an area 50 times larger than New York’s Central Park. For over 25 years, TELUS has been a global leader in sustainability, investing in innovative technology and sustainable business practices. This achievement exemplifies TELUS’ commitment to meaningful environmental action.
Leading Through Science-Based Climate Action
As a globally recognized sustainability leader, TELUS has established ambitious science-based targets aligned with the Paris Climate Agreement.
Beyond tree planting, TELUS has demonstrated comprehensive environmental stewardship by diverting 15 million devices from landfills since 2005, investing nearly $52.4 million through the TELUS Pollinator Fund for Good since 2020, and accelerating reforestation efforts, with over eight million trees planted across Canada in 2024 alone, restoring more than 5,300 hectares of natural habitats.
Comprehensive Nature-Based Solutions
Central to these achievements is TELUS Environmental Solutions, which offers comprehensive nature-based climate solutions including strategic tree planting, innovative kelp afforestation, and critical mangrove restoration projects. These initiatives contribute to enhanced biodiversity, accelerated carbon sequestration, and ecosystem restoration while empowering customers and partners to take meaningful steps towards a healthier planet.
To learn more about TELUS’ commitment to a more sustainable future, visit telus.com/sustainability.