Originally published on AI Tech Connect.
## What OpenAI released

OpenAI released two open-weight models under the Apache 2.0 licence: gpt-oss-120b and gpt-oss-20b. Both use a mixture-of-experts (MoE) architecture, support full-parameter fine-tuning, and are available for download from HuggingFace.

Key characteristics verified at release:

- **Licence:** Apache 2.0. Commercial use, modification, and redistribution permitted, with no copyleft requirement on derivative works.
- **Architecture:** Mixture-of-experts (MoE), consistent with the broader trend toward MoE for inference efficiency at scale.
- **Fine-tuning:** Full-parameter fine-tuning supported; LoRA and QLoRA adapters also work with standard HuggingFace PEFT tooling.
- **Download:** Available on HuggingFace under the OpenAI organisation namespace.

## Apache 2.0 matters

Llama 4 uses the Llama Community…