
GLM-130B/MODEL_LICENSE at main · zai-org/GLM-130B · GitHub
Subject to the terms and conditions of this License, the Licensor hereby grants to you a non-exclusive, worldwide, non-transferable, non-sublicensable, revocable, royalty-free copyright …
GLM-130B Reviews, Alternatives, and Pricing updated February 2026
Trained on 400B tokens with the GLM library, it delivers strong results on a range of NLP benchmarks and ships with downloadable checkpoints and inference/deployment code for …
GLM-130B: Specifications and GPU VRAM Requirements
Aug 3, 2022 · GLM-130B supports fast inference, making it suitable for real-time large-scale language processing tasks. It is designed to enable inference on a single A100 (40G * 8) or …
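A quick back-of-the-envelope check (an illustrative sketch, not an official sizing guide) shows why an 8× A100-40G server can hold the FP16 weights:

```python
# Rough FP16 weight-memory estimate for GLM-130B (illustration only;
# real deployments also need room for activations and the KV cache).
n_params = 130e9            # 130 billion parameters
bytes_fp16 = 2              # 2 bytes per parameter in FP16

total_gib = n_params * bytes_fp16 / 2**30   # ~242 GiB of weights
per_gpu_gib = total_gib / 8                 # split across 8 GPUs

print(f"total: {total_gib:.0f} GiB, per GPU: {per_gpu_gib:.1f} GiB")
# -> total: 242 GiB, per GPU: 30.3 GiB  (fits under 40 GB per A100)
```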
GLM-130B download | SourceForge.net
Oct 4, 2025 · GLM-130B is an open bilingual (English and Chinese) dense language model with 130 billion parameters, released by the Tsinghua KEG Lab and collaborators as part of the …
GLM-130B: A Truly Open, Bilingual 130B-Language Model That …
Jan 5, 2026 · GLM-130B redefines what’s possible with open, large-scale language models. It’s not just another research artifact: it’s a production-ready, bilingual LLM that delivers GPT-3 …
MODEL_LICENSE · silver/chatglm-6b-slim at main - Hugging Face
The MODEL_LICENSE file (2.35 kB) in the silver/chatglm-6b-slim repository on Hugging Face, added by Sengxian in commit d11c6aa.
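For the ChatGLM-6B family referenced here, the Hugging Face repos document loading through transformers with remote code. A minimal sketch follows the THUDM/chatglm-6b model card (the model ID and `chat` helper are assumed to carry over to the slim variant, and a transformers version contemporary with the model card is assumed):

```python
# Minimal ChatGLM-6B inference sketch, following the pattern documented
# on the THUDM/chatglm-6b model card. Requires a CUDA GPU with roughly
# 13 GB of memory for FP16 weights.
from transformers import AutoModel, AutoTokenizer

tokenizer = AutoTokenizer.from_pretrained("THUDM/chatglm-6b", trust_remote_code=True)
model = AutoModel.from_pretrained("THUDM/chatglm-6b", trust_remote_code=True).half().cuda()
model = model.eval()

# `chat` is a convenience method provided by the model's remote code,
# not part of the core transformers API.
response, history = model.chat(tokenizer, "Hello, introduce GLM-130B briefly.", history=[])
print(response)
```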
Learning-Develop-Union/modelscope - modelscope/models/nlp/glm_130b …
To apply the Apache License to your work, attach the following boilerplate notice, with the fields enclosed by brackets "[]" replaced with your own identifying information.
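For reference, the boilerplate notice in question is the standard appendix of the Apache License 2.0, with "[yyyy]" and "[name of copyright owner]" being the bracketed fields to replace:

```
Copyright [yyyy] [name of copyright owner]

Licensed under the Apache License, Version 2.0 (the "License");
you may not use this file except in compliance with the License.
You may obtain a copy of the License at

    http://www.apache.org/licenses/LICENSE-2.0

Unless required by applicable law or agreed to in writing, software
distributed under the License is distributed on an "AS IS" BASIS,
WITHOUT WARRANTIES OR CONDITIONS OF ANY KIND, either express or implied.
See the License for the specific language governing permissions and
limitations under the License.
```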
GLM-130B: An Open Bilingual Pre-Trained Model - GitHub
GLM-130B is an open bilingual (English & Chinese) bidirectional dense model with 130 billion parameters, pre-trained using the General Language Model (GLM) algorithm. It is designed …
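The GLM pre-training objective is autoregressive blank infilling: contiguous spans of the input are blanked out and then regenerated left to right. A toy illustration of how one training example is laid out (the [MASK]/[sop]/[eop] token names follow ChatGLM-style conventions and are only indicative; this is not real training code):

```python
# Toy illustration of GLM-style autoregressive blank infilling.
tokens = ["GLM-130B", "is", "an", "open", "bilingual", "model"]
start, end = 3, 5                      # span to blank out: "open bilingual"

part_a = tokens[:start] + ["[MASK]"] + tokens[end:]   # corrupted context
part_b = ["[sop]"] + tokens[start:end] + ["[eop]"]    # span to regenerate

# Part A is attended bidirectionally; Part B is generated autoregressively,
# conditioned on Part A and on previously generated Part B tokens.
print(part_a + part_b)
```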