
DEMO

EFFECTIVE LEARNING


The model is run on the WikiText-2 dataset to showcase its learning methods as a graph. In only a few minutes the model managed to: lower perplexity by 10k, keep token entropy stable, drastically reduce hallucinations without any RAG, exploit token similarity in the Semantic Graph, and repeatedly predict what is coming before it sees the ground truth from the dataset.
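For reference, perplexity on a corpus like WikiText-2 is the exponential of the mean per-token cross-entropy. A minimal, model-agnostic sketch (the `token_nlls` values below are illustrative, not measured):

```python
import math

def perplexity(token_nlls):
    """Perplexity = exp(mean negative log-likelihood per token, in nats)."""
    return math.exp(sum(token_nlls) / len(token_nlls))

# Illustrative per-token negative log-likelihoods from a language model
nlls = [5.2, 4.8, 6.1, 3.9, 5.0]
print(round(perplexity(nlls), 2))
```

Lowering the average negative log-likelihood per token is what drives the perplexity curve down in the graph.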

Running Models on the CPU


You are seeing a model larger than Mistral running on a CPU. The model runs on a MacBook Pro (M3 Max) with:

 

  • 16k context length

  • 32 heads

  • 32 layers

  • 2048 embedding size

  • an effective size of ONLY 149 MB on disk at full 32-bit precision
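As a rough sanity check of what the disk figure implies (our arithmetic, not a statement about the model's internals): at full 32-bit precision each stored value takes 4 bytes, so 149 MB corresponds to roughly 39 million stored parameters.

```python
# Rough arithmetic: how many 32-bit values fit in 149 MB on disk.
BYTES_PER_PARAM = 4            # full 32-bit precision = 4 bytes per value
disk_bytes = 149 * 1024**2     # 149 MB, using binary megabytes
params = disk_bytes // BYTES_PER_PARAM
print(f"{params:,} stored parameters")
```

This is why "effective size" matters: the stored footprint can be far smaller than a dense transformer of the same context length, depth, and width would naively require.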

Let’s Connect

The Open Machine HQ

E-MAIL: info@theopenmachine.com

ADDRESS: Vatrogasna 5, 32100 Vinkovci, Croatia

OIB: 62039729450

MB: 06094830

Founded: 2024

  • LinkedIn
  • Facebook
  • Instagram
  • Youtube

© 2025 The Open Machine Inc.
