
AI Tech Connect


OpenAI Open Weights: gpt-oss-120b and gpt-oss-20b Guide

Originally published on AI Tech Connect.

What OpenAI released

OpenAI released two open-weight models under the Apache 2.0 licence: gpt-oss-120b and gpt-oss-20b. Both use a mixture-of-experts (MoE) architecture, support full-parameter fine-tuning, and are available for download from HuggingFace.

Key characteristics verified at release:

- Licence: Apache 2.0. Commercial use, modification, and redistribution are permitted, with no copyleft requirement on derivative works.
- Architecture: Mixture-of-experts (MoE), consistent with the broader trend toward MoE for inference efficiency at scale.
- Fine-tuning: Full-parameter fine-tuning supported; LoRA and QLoRA adapters also work with standard HuggingFace PEFT tooling (see the sketch below).
- Download: Available on HuggingFace under the OpenAI organisation namespace.

Apache 2.0 matters

Llama 4 uses the Llama Community…
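Picking up the fine-tuning point from the list above, here is a minimal sketch of loading the 20B model and attaching a LoRA adapter with standard HuggingFace PEFT tooling. The repo id follows the OpenAI organisation namespace mentioned above, but the target module names and hyperparameters are illustrative assumptions; check the model card before running.

```python
# Minimal sketch: load gpt-oss-20b from the HuggingFace hub and wrap it
# in a LoRA adapter via PEFT. Repo id, target modules, and hyperparameters
# are assumptions for illustration, not verified defaults.
from transformers import AutoModelForCausalLM, AutoTokenizer
from peft import LoraConfig, get_peft_model

model_id = "openai/gpt-oss-20b"  # assumed repo id under the OpenAI namespace

tokenizer = AutoTokenizer.from_pretrained(model_id)
model = AutoModelForCausalLM.from_pretrained(
    model_id,
    torch_dtype="auto",   # use the checkpoint's native precision
    device_map="auto",    # spread layers across available devices
)

# Standard PEFT LoRA config; rank, alpha, and module names are illustrative.
lora_config = LoraConfig(
    r=16,
    lora_alpha=32,
    target_modules=["q_proj", "v_proj"],  # assumed attention projection names
    lora_dropout=0.05,
    task_type="CAUSAL_LM",
)
model = get_peft_model(model, lora_config)
model.print_trainable_parameters()  # confirm only adapter weights are trainable
```

For QLoRA, the same adapter config would sit on top of a quantised base load (for example, a 4-bit load via bitsandbytes) rather than the full-precision weights shown here.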


Read the full article on AI Tech Connect →
