Lines matching defs:description

- "Meta's Llama 3.2 model with 3 billion parameters."
- "Meta's Llama 3.2 model with 1 billion parameters."
- "Llama 3.1 is a new state-of-the-art model from Meta. 405 billion parameters."
- "New state-of-the-art 70B model from Meta that offers similar performance compared …"
- "Llama 3.1 is a new state-of-the-art model from Meta. 8 billion parameters."
- "Google Gemma 2 is a high-performing and efficient model. 27 billion parameters."
- "Google Gemma 2 is a high-performing and efficient model. 9 billion parameters."
- "Google Gemma 2 is a high-performing and efficient model. 2 billion parameters."
- "Qwen2.5 models are pretrained on Alibaba's latest large-scale dataset. 72 billion …"
- "Qwen2.5 models are pretrained on Alibaba's latest large-scale dataset. 32 billion …"
- "Qwen2.5 models are pretrained on Alibaba's latest large-scale dataset. 14 billion …"
- "Qwen2.5 models are pretrained on Alibaba's latest large-scale dataset. 7 billion p…"
- "Qwen2.5 models are pretrained on Alibaba's latest large-scale dataset. 3 billion p…"
- "Qwen2.5 models are pretrained on Alibaba's latest large-scale dataset. 1.5 billion…"
- "Qwen2.5 models are pretrained on Alibaba's latest large-scale dataset. 0.5 billion…"
- "A high-performing open embedding model with a large token context window."
- "State-of-the-art large embedding model from mixedbread.ai"
- "The project aims to train sentence embedding models on very large sentence level d…"
- "Embedding model from BAAI mapping texts to vectors."
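The matches above appear to come from a JSON catalog of model entries, each carrying a "description" string. As a minimal sketch of how such fields could be collected, assuming the catalog is a JSON array of objects (the entry names and the overall shape here are hypothetical, not taken from the actual file):

```python
import json

# Hypothetical catalog shaped like the file these matches came from:
# a JSON array of model entries, each with a "description" string.
# Entry names ("llama3.2:3b", "bge-base") are illustrative assumptions.
catalog_json = """
[
  {"name": "llama3.2:3b",
   "description": "Meta's Llama 3.2 model with 3 billion parameters."},
  {"name": "bge-base",
   "description": "Embedding model from BAAI mapping texts to vectors."}
]
"""

def list_descriptions(raw: str) -> list[str]:
    """Return the "description" value of every entry that has one."""
    entries = json.loads(raw)
    return [e["description"] for e in entries if "description" in e]

for desc in list_descriptions(catalog_json):
    print(desc)
```

With a real catalog file, the same function could be fed `open(path).read()`; entries without a "description" key are simply skipped.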