A decentralized cloud for artificial intelligence.

Releasing v1 of OBTranslate GPT-1


OBTranslate GPT-1 is optimised on over 2 billion tokens and could surpass most 100B+ parameter models on classification tasks.
OBTranslate GPT-1 is trained with a decentralised algorithm over a 1 Gbps interconnect, in contrast to the 100 Gbps-1.6 Tbps networks of typical data centres.
We are working hard to bring computational power together to support the open-models ecosystem, while developing decentralised training algorithms for heterogeneous GPU hardware connected over slow (1 Gbps) internet links.
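To make the idea concrete, here is a minimal, hypothetical sketch of one way to cut communication over slow links: local-SGD-style training, in which each worker runs several local update steps and parameters are averaged only once per round. The task, sizes and hyperparameters below are illustrative assumptions, not OBTranslate's actual training recipe.

```python
import numpy as np

# Hypothetical local-SGD sketch: each worker takes `local_steps` gradient
# steps on its own data shard, then all workers average parameters once.
# Only one synchronisation per round crosses the slow (1 Gbps) link,
# instead of one per step.
rng = np.random.default_rng(0)
dim, workers, local_steps, rounds, lr = 10, 4, 8, 50, 0.1

# Synthetic least-squares task, sharded across workers.
X = [rng.normal(size=(100, dim)) for _ in range(workers)]
true_w = rng.normal(size=dim)
y = [x @ true_w for x in X]

w = np.zeros(dim)                      # globally synchronised parameters
for _ in range(rounds):
    local = []
    for k in range(workers):
        wk = w.copy()
        for _ in range(local_steps):   # cheap local computation, no network
            grad = 2 * X[k].T @ (X[k] @ wk - y[k]) / len(y[k])
            wk -= lr * grad
        local.append(wk)
    w = np.mean(local, axis=0)         # one parameter average per round

print("distance to optimum:", np.linalg.norm(w - true_w))
```

With local_steps = 8, each round exchanges parameters once instead of eight times; this is the kind of compute-for-communication trade-off that matters on 1 Gbps internet links rather than data-centre fabrics.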

We are building state-of-the-art models at much larger scale that can outperform many 100B+ parameter models on classification benchmarks. Training foundation models such as GPT can be extremely expensive, often involving tens of thousands of GPUs running continuously for months.

These models are typically trained in specialized clusters featuring fast, homogeneous interconnects and using carefully designed software systems that support both data parallelism and model/pipeline parallelism.
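To illustrate the pipeline-parallel part of that recipe, the toy sketch below splits a "model" into two stages and streams micro-batches through them. On real hardware each stage would live on its own device and the stages would work on different micro-batches at the same time; the stage functions and data here are purely hypothetical.

```python
# Toy pipeline-parallelism sketch (hypothetical, single-process).
# The model is split into two stages; micro-batches flow through the
# pipeline so that, on real hardware, both stages stay busy at once.

def stage_one(batch):   # in practice: first half of the layers, on GPU 0
    return [v * 2 for v in batch]

def stage_two(batch):   # in practice: second half of the layers, on GPU 1
    return [v + 1 for v in batch]

micro_batches = [[1, 2], [3, 4], [5, 6]]

in_flight = None        # micro-batch sitting between the two stages
outputs = []
for mb in micro_batches + [None]:   # trailing None drains the pipeline
    if in_flight is not None:
        outputs.append(stage_two(in_flight))   # stage 2: older micro-batch
    in_flight = stage_one(mb) if mb is not None else None  # stage 1: newer

print(outputs)  # [[3, 5], [7, 9], [11, 13]]
```

Data parallelism, by contrast, replicates the whole model on every device and splits the batch instead, which is essentially what the parameter-averaging sketch above does.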


Join Us

We collaborate with universities, researchers, developers and businesses to harness and enhance artificial intelligence with an intuitive platform that combines data, models and computation.
  • Research
  • Publications
  • Project Competition
  • Weekly Meeting
  • STEM Education