LLaMA

A foundational, 65-billion-parameter large language model

LLaMA is a collection of foundation language models ranging from 7B to 65B parameters, demonstrating that it is possible to train state-of-the-art models using publicly available datasets exclusively, without resorting to proprietary and inaccessible datasets.

You might also like

Replit Code V1.5 3B

A new code generation language model

PaLM 2

Google's next generation large language model

Carton

Run any ML model from any programming language

StableLM

Stability AI's language models

ChatGPT

Optimizing language models for dialogue

Galactica

The Language Model that Wrote Its Own Scientific Paper

Introducing DBRX

Revolutionizing Language Models with DBRX

Ollama

The easiest way to run large language models locally

Visual ChatGPT

Talking, drawing and editing with visual foundation models

Dramatron

Script writing tool that leverages large language models

Language Atlas

Learn a new language for free

Language Learning with Netflix

Use movies as study material for language learning

Mixedname

Find a cross-language compatible name for your baby

No Language Left Behind by Meta

An open-source language translation system for 200 languages

ESIF

Educational Sensational Inspirational Foundational

SemanticDiff

Programming language aware diffs for Visual Studio Code

Wayback Machine

Explore more than 664 billion web pages saved over time

Crystal

The Crystal Programming Language

Pure Data

Open source visual programming language for multimedia

Traveler Map

Explore national parks around the world