{"id":6934,"date":"2025-08-05T17:31:31","date_gmt":"2025-08-05T21:31:31","guid":{"rendered":"https:\/\/guardianglobe.org\/?p=6934"},"modified":"2025-08-05T17:31:33","modified_gmt":"2025-08-05T21:31:33","slug":"openai-releases-first-open-weight-models-since-2019","status":"publish","type":"post","link":"https:\/\/guardianglobe.org\/?p=6934","title":{"rendered":"OpenAI Releases First Open-Weight Models Since 2019"},"content":{"rendered":"\n<h2 class=\"wp-block-heading\">New models aim for accessibility and customization<\/h2>\n\n\n\n<p>OpenAI has launched two open-weight language models, <strong>gpt-oss-120b<\/strong> and <strong>gpt-oss-20b<\/strong>, marking its first open-weight release since GPT-2 in 2019. These models are text-only and designed to offer developers, researchers, and businesses a lower-cost, customizable alternative to closed models. The release is part of OpenAI\u2019s broader effort to promote transparency and decentralize access to powerful AI tools.<\/p>\n\n\n\n<p>Open-weight models expose their parameter data, allowing third parties to run and tailor them as needed. However, unlike open-source models, they don\u2019t provide access to full source code. The announcement follows similar moves by competitors such as Meta, Microsoft-backed Mistral AI, and China\u2019s DeepSeek.<\/p>\n\n\n\n<h2 class=\"wp-block-heading\">Focus on safety and hardware compatibility<\/h2>\n\n\n\n<p>OpenAI collaborated with <strong>Nvidia, AMD, Cerebras,<\/strong> and <strong>Groq<\/strong> to ensure the models perform efficiently across a wide range of hardware. Nvidia CEO Jensen Huang praised OpenAI\u2019s contribution, stating it would further drive innovation in open software built on Nvidia\u2019s AI infrastructure.<\/p>\n\n\n\n<p>Before releasing the models, OpenAI ran rigorous safety testing. It filtered out sensitive data involving chemical, biological, radiological, and nuclear material and tested the models against malicious fine-tuning attempts. 
According to OpenAI, none of the maliciously altered versions reached a \u201chigh capability\u201d threshold under its internal Preparedness Framework.<\/p>\n\n\n\n<p>Three independent expert groups also evaluated the safety measures and provided feedback to help ensure responsible deployment.<\/p>\n\n\n\n<h2 class=\"wp-block-heading\">Availability and use cases<\/h2>\n\n\n\n<p>Developers can now download the model weights from <strong>Hugging Face<\/strong> and <strong>GitHub<\/strong> under an <strong>Apache 2.0 license<\/strong>. The models will also be available on platforms such as <strong>LM Studio<\/strong> and <strong>Ollama<\/strong>, and through cloud providers such as <strong>Amazon<\/strong>, <strong>Microsoft<\/strong>, and <strong>Baseten<\/strong>.<\/p>\n\n\n\n<p>Both models support advanced reasoning, tool usage, and chain-of-thought reasoning. OpenAI noted that the smaller model, gpt-oss-20b, can run directly on consumer hardware such as laptops and assist with tasks like file searches and content creation. This flexibility positions the models for a range of uses, from enterprise systems to personal assistants.<\/p>\n\n\n\n<h2 class=\"wp-block-heading\">Strategic delay, now live<\/h2>\n\n\n\n<p>The release follows several delays. OpenAI CEO Sam Altman had previously stated that more time was needed for safety testing. With the models now launched, OpenAI says it is focused on distributing the benefits of AI research as broadly as possible. Altman emphasized the company\u2019s goal of putting powerful AI tools \u201cin the hands of the most people possible.\u201d<\/p>\n","protected":false},"excerpt":{"rendered":"<p>New models aim for accessibility and customization OpenAI has launched two open-weight language models, gpt-oss-120b and gpt-oss-20b, marking its first open-weight release since GPT-2 in 2019. These models are text-only and designed to offer developers, researchers, and businesses a lower-cost, customizable alternative to closed models. 
The release is part of OpenAI\u2019s broader effort to promote<\/p>\n","protected":false},"author":5,"featured_media":6935,"comment_status":"closed","ping_status":"closed","sticky":false,"template":"","format":"standard","meta":{"footnotes":""},"categories":[22],"tags":[4561,1933,4563,4560,4565,4562,241,4564,305,961],"class_list":{"0":"post-6934","1":"post","2":"type-post","3":"status-publish","4":"format-standard","5":"has-post-thumbnail","7":"category-tech","8":"tag-ai-customization","9":"tag-ai-safety","10":"tag-apache-2-0-license","11":"tag-gpt-oss-120b","12":"tag-gpt-oss-20b","13":"tag-hugging-face","14":"tag-nvidia","15":"tag-open-weight-models","16":"tag-openai","17":"tag-sam-altman"},"_links":{"self":[{"href":"https:\/\/guardianglobe.org\/index.php?rest_route=\/wp\/v2\/posts\/6934"}],"collection":[{"href":"https:\/\/guardianglobe.org\/index.php?rest_route=\/wp\/v2\/posts"}],"about":[{"href":"https:\/\/guardianglobe.org\/index.php?rest_route=\/wp\/v2\/types\/post"}],"author":[{"embeddable":true,"href":"https:\/\/guardianglobe.org\/index.php?rest_route=\/wp\/v2\/users\/5"}],"replies":[{"embeddable":true,"href":"https:\/\/guardianglobe.org\/index.php?rest_route=%2Fwp%2Fv2%2Fcomments&post=6934"}],"version-history":[{"count":1,"href":"https:\/\/guardianglobe.org\/index.php?rest_route=\/wp\/v2\/posts\/6934\/revisions"}],"predecessor-version":[{"id":6936,"href":"https:\/\/guardianglobe.org\/index.php?rest_route=\/wp\/v2\/posts\/6934\/revisions\/6936"}],"wp:featuredmedia":[{"embeddable":true,"href":"https:\/\/guardianglobe.org\/index.php?rest_route=\/wp\/v2\/media\/6935"}],"wp:attachment":[{"href":"https:\/\/guardianglobe.org\/index.php?rest_route=%2Fwp%2Fv2%2Fmedia&parent=6934"}],"wp:term":[{"taxonomy":"category","embeddable":true,"href":"https:\/\/guardianglobe.org\/index.php?rest_route=%2Fwp%2Fv2%2Fcategories&post=6934"},{"taxonomy":"post_tag","embeddable":true,"href":"https:\/\/guardianglobe.org\/index.php?rest_route=%2Fwp%2Fv2%2Ftags&post=6934"}],"curies":[{"name":"wp","href":"https:\/\/api.w.org\/{rel}","templated":true}]}}