Architecture data sourced from HuggingFace config.json files, ArXiv papers, and vendor technical reports. Registry sync powered by Raschka's LLM Architecture Gallery (Apache 2.0).

LLM Architectures

A comprehensive catalog of large language model architectures — decoder types, attention mechanisms, parameter counts, and context windows — curated for enterprise AI research and evaluation.

Colaberry AI catalogs 79+ large language model architectures from 23 organizations including Meta, Google, OpenAI, DeepSeek, Alibaba, and Mistral. The gallery covers Dense Transformers, Mixture-of-Experts (MoE), Hybrid SSM-Transformer, and Recurrent models with architecture specifications, attention mechanisms, and context window sizes.

  • 79 architectures
  • 23 organizations
  • 4 decoder types
  • LLM-indexed
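Since the gallery's architecture data is sourced from HuggingFace config.json files, the decoder type and attention mechanism can often be read straight off those fields. Below is a minimal sketch of that classification, assuming common field conventions (`num_local_experts` for Mixtral-style MoE, `num_experts` / `n_routed_experts` for other MoE families, `mamba_*` keys for hybrid SSM-Transformer configs); real configs vary by model family and this key list is not exhaustive.

```python
def classify_decoder(config: dict) -> str:
    """Rough decoder-type classification from HuggingFace config.json fields.

    The expert-count and Mamba key names below are assumptions covering
    common families (Mixtral, Qwen-MoE, DeepSeek, Jamba), not a complete list.
    """
    moe_keys = ("num_local_experts", "num_experts", "n_routed_experts")
    if any(config.get(k, 0) for k in moe_keys):
        return "Mixture-of-Experts"
    if any(k.startswith("mamba_") for k in config):
        return "Hybrid SSM-Transformer"
    return "Dense Transformer"

def attention_summary(config: dict) -> str:
    """Infer MHA / GQA / MQA from head counts, if present."""
    heads = config.get("num_attention_heads")
    kv_heads = config.get("num_key_value_heads", heads)
    if kv_heads == heads:
        return "MHA"   # multi-head: one KV head per query head
    if kv_heads == 1:
        return "MQA"   # multi-query: a single shared KV head
    return f"GQA ({heads}/{kv_heads})"  # grouped-query attention

# Example: an abridged Mixtral-8x7B-style config.
cfg = {
    "num_hidden_layers": 32,
    "num_attention_heads": 32,
    "num_key_value_heads": 8,
    "num_local_experts": 8,
    "max_position_embeddings": 32768,
}
print(classify_decoder(cfg), attention_summary(cfg))
# → Mixture-of-Experts GQA (32/8)
```

The `max_position_embeddings` field is likewise the usual source for the context window sizes listed in the gallery, though some models advertise longer effective windows via RoPE scaling.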

Search and filter

Find architectures by decoder type, organization, parameters, and features.


Enterprise AI platform

Compare, evaluate, and deploy LLM architectures at scale

Colaberry AI provides architecture specifications, benchmark comparisons, and deployment guidance across dense transformers, MoE, hybrid, and recurrent models.
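One recurring comparison between dense and MoE architectures is total versus active parameters: a sparse MoE routes each token through only a few experts, so far fewer weights fire per token than the headline count suggests. A minimal sketch of that arithmetic, with illustrative numbers chosen here (the `expert_share` fraction is an assumption you would derive from a model card, not a gallery field):

```python
def moe_active_params(total_params_b: float, n_experts: int,
                      experts_per_token: int, expert_share: float) -> float:
    """Estimate billions of parameters active per token in a sparse MoE.

    expert_share: assumed fraction of total weights living in expert FFNs;
    the remainder (attention, embeddings, shared layers) is always active.
    """
    expert_params = total_params_b * expert_share
    shared_params = total_params_b - expert_params
    # Only experts_per_token of the n_experts FFNs run for each token.
    return shared_params + expert_params * experts_per_token / n_experts

# Illustrative: a 47B-total model with 8 experts, 2 routed per token,
# and ~96% of weights in the expert FFNs.
print(round(moe_active_params(47.0, 8, 2, 0.96), 1))
# → 13.2 (billions active, vs 47.0 total)
```

This is why MoE models can match much larger dense models on per-token compute while still needing memory for the full parameter set at deployment time.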

Catalog Workspace

Discover agents, MCP servers, and skills in one governed surface

Use structured catalog views to compare readiness, ownership, integrations, and deployment posture before rollout.