
🧠 'Distributed' Deep Neural Network

MIT License · C++ · PRs Welcome · Build Status · Stars


Project Logo

Deep Neural Network Implementation via Distributed Computing

An implementation of a distributed neural network architecture that leverages consensus mechanisms for robust, scalable machine learning


📖 Overview · ⚡ Quick Start · 🛠️ Usage · 🎯 Features · 🤝 Contributing · 📧 Contact


🌟 Overview/Abstract

Large language models and other machine learning workloads are computationally heavy processes that typically run in a single, local multithreaded environment. Keeping all of that computation on one system is notoriously resource-intensive and places a heavy load on the processors and machines involved. Large companies can absorb and monitor these costs because they have the capital for dedicated infrastructure; smaller companies that want to build their own models often cannot afford it.

Matrix operations are expensive, and a network performs hundreds of thousands of them, spread across multiple layers and compute blocks. The big idea here is to distribute these operations across a network of participating computers around the globe, layer by layer, and return the computed values to the network, effectively spreading the layers and compute blocks across many devices.
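As a rough illustration, the sketch below shows how a single layer's matrix-vector product could be partitioned into row blocks, computed independently by participating nodes, and merged back by a coordinator. The types and function names (`LayerBlock`, `compute_block`, `merge_block`) are illustrative assumptions for this sketch, not the repository's actual API.

```cpp
// Hypothetical sketch: splitting one layer's matrix-vector product into
// row blocks so each participating node computes only a slice of the output.
#include <cstddef>
#include <vector>

// One unit of work a remote node would receive: a block of weight rows,
// its bias slice, and the offset of the block in the layer output.
struct LayerBlock {
    std::size_t row_offset;
    std::vector<std::vector<double>> weights;  // rows [row_offset, row_offset + weights.size())
    std::vector<double> bias;                  // one bias per row in this block
};

// What a participating node computes locally and sends back to the coordinator.
std::vector<double> compute_block(const LayerBlock& block,
                                  const std::vector<double>& input) {
    std::vector<double> out(block.weights.size(), 0.0);
    for (std::size_t r = 0; r < block.weights.size(); ++r) {
        double acc = block.bias[r];
        for (std::size_t c = 0; c < input.size(); ++c)
            acc += block.weights[r][c] * input[c];
        out[r] = acc > 0.0 ? acc : 0.0;  // ReLU, used here only as an example activation
    }
    return out;
}

// The coordinator stitches returned blocks back into the full layer output.
void merge_block(std::vector<double>& layer_output,
                 std::size_t row_offset,
                 const std::vector<double>& block_result) {
    for (std::size_t i = 0; i < block_result.size(); ++i)
        layer_output[row_offset + i] = block_result[i];
}
```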

This project is a Proof of Concept and aims to demonstrate a possible cheaper alternative to current model development strategies.

This implementation brings together:

  • Distributed Computing: Each neural network layer is computed across the network
  • Consensus Mechanisms: Computed layer packets are validated across the network (see the sketch below)
  • Scalable Architecture: Designed for horizontal scaling and high performance
  • Smart Propagation: Optimized forward propagation with validation checkpoints
Architecture Overview
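A minimal sketch of the consensus checkpoint idea follows: each layer's result is computed redundantly by several nodes, accepted only when a quorum of replicas agree, and then fed forward to the next layer. Names such as `LayerPacket`, `packet_key`, `validate_by_majority`, and `dispatch_layer` are assumptions made for this sketch, not the repository's actual protocol or API.

```cpp
#include <cstddef>
#include <functional>
#include <map>
#include <optional>
#include <utility>
#include <vector>

// A layer result as returned by one participating node.
struct LayerPacket {
    std::size_t layer_index;
    std::vector<double> activations;
};

// Collapse a packet into a comparable key so identical results vote together.
// A real deployment would hash a canonical byte encoding of the activations.
std::size_t packet_key(const LayerPacket& p) {
    std::size_t h = std::hash<std::size_t>{}(p.layer_index);
    for (double v : p.activations)
        h ^= std::hash<double>{}(v) + 0x9e3779b97f4a7c15ULL + (h << 6) + (h >> 2);
    return h;
}

// Accept a layer result only if at least `quorum` replicas returned it.
std::optional<LayerPacket> validate_by_majority(const std::vector<LayerPacket>& replicas,
                                                std::size_t quorum) {
    std::map<std::size_t, std::size_t> votes;
    for (const auto& p : replicas)
        if (++votes[packet_key(p)] >= quorum)
            return p;          // consensus reached for this layer
    return std::nullopt;       // no quorum: the layer must be recomputed
}

// Forward propagation with a validation checkpoint after every layer.
// `dispatch_layer` stands in for whatever networking code sends the input to
// the participating nodes and collects their replica packets.
using DispatchFn = std::function<std::vector<LayerPacket>(std::size_t layer,
                                                          const std::vector<double>& input)>;

std::optional<std::vector<double>> forward_with_checkpoints(std::size_t num_layers,
                                                            std::vector<double> input,
                                                            const DispatchFn& dispatch_layer,
                                                            std::size_t quorum) {
    for (std::size_t layer = 0; layer < num_layers; ++layer) {
        auto accepted = validate_by_majority(dispatch_layer(layer, input), quorum);
        if (!accepted)
            return std::nullopt;                       // checkpoint failed: abort or retry
        input = std::move(accepted->activations);      // validated output feeds the next layer
    }
    return input;
}
```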
