What Is GNY (GNY)?



GNY Distributed Deep Learning breaks the pattern of one large platform library by creating two of the smallest configurable self-learning unsupervised neural-net nodes – an ETL node and an ML node – and distributing these nodes into each block of the blockchain so that they teach themselves the solution to each problem. One conceptual problem of machine learning, however, is that error detection requires global knowledge that is backpropagated to its constituents. This requirement is addressed in gny.io’s Distributed Deep Learning system.

They therefore prefer the term ‘localized models’ rather than ‘model-free models’, even though both imply a degree of lack of intelligence in the constituent parts. In general this is similar to the concept of ‘parsimony’, in that it seeks the simplest mechanisms that give rise to emergent predictive properties. Gny.io is not a crushed-up small machine learning library; it is one node of a huge distributed brain. GNY has configurable ETL microservices and configurable machine learning microservices that can read either the entire chain of data or only the current block data.
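The ETL/ML split can be sketched in plain Python. The block schema and field names below are hypothetical stand-ins, since the article does not show GNY's actual data model; the point is only that the same extraction step can run over the whole chain or a single block:

```python
# Hypothetical block payloads -- illustrative stand-ins, not GNY's real schema.
chain = [
    {"height": 1, "txs": [{"amount": 5.0}, {"amount": 12.5}]},
    {"height": 2, "txs": [{"amount": 3.2}]},
    {"height": 3, "txs": [{"amount": 7.7}, {"amount": 1.1}, {"amount": 4.0}]},
]

def etl_features(blocks):
    """ETL step: extract simple per-block features for a downstream ML node."""
    rows = []
    for block in blocks:
        amounts = [tx["amount"] for tx in block["txs"]]
        rows.append({
            "height": block["height"],
            "tx_count": len(amounts),
            "total": sum(amounts),
            "mean": sum(amounts) / len(amounts),
        })
    return rows

full_history = etl_features(chain)      # read the entire chain of data
latest_only = etl_features(chain[-1:])  # or read only the current block
print(latest_only[0]["tx_count"])       # 3
```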

GNY microservices use deep learning, a class of machine learning algorithms in the form of a neural network that uses a cascade of layers (tiers) of processing units to extract features from data and make predictive guesses about new data. The smallest ML node system varies the weights and biases to see whether a better outcome is obtained by the neural network.
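That "vary the weights and biases and keep what improves the outcome" idea can be shown as a minimal random-perturbation search on a single neuron. This is a generic illustration of the principle, not GNY's node code; the target function and step sizes are arbitrary:

```python
import random

random.seed(0)

# Tiny single-neuron task: learn y = 2x + 1 from samples.
data = [(x, 2 * x + 1) for x in range(-5, 6)]

def loss(w, b):
    return sum((w * x + b - y) ** 2 for x, y in data) / len(data)

# Vary the weight and bias at random; keep a perturbation only if the
# outcome (the loss) improves.
w, b = 0.0, 0.0
best = loss(w, b)
for _ in range(5000):
    dw, db = random.gauss(0, 0.1), random.gauss(0, 0.1)
    candidate = loss(w + dw, b + db)
    if candidate < best:
        w, b, best = w + dw, b + db, candidate

print(round(w, 1), round(b, 1))  # close to 2.0 and 1.0
```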

GNY Storage Key Points

Coin Basic Information

Coin Name: GNY
Short Name: GNY
Circulating Supply: 192,376,657.00 GNY
Total Supply: 400,000,000
Source Code: Click Here To View Source Code
Explorers: Click Here To View Explorers
Twitter Page: Click Here To Visit Twitter Group
Whitepaper: Click Here To View
Official Project Website: Click Here To Visit Project Website

Neural node in a neural network

At the heart of GNY is a special backpropagation algorithm. Backpropagation of errors and gradient descent are optimization methods used to calculate the error contribution of each blockchain node after a batch of data is processed. Gny.io uses a variation of structured streaming AI and the Parquet data format – this sets the weights in the neurons. The neurons change their connections and their weights to lessen the error. With unsupervised learning, gny.io determines the normal rules of the system by itself. The adjusted weights are what is configurable.

Backpropagation is an expression for the partial derivative ∂C/∂w of the cost function C with respect to any weight w (or bias b) in the network. The expression tells us how quickly the cost changes when we change the weights and biases. So backpropagation isn’t just a fast algorithm for learning: it gives detailed insight into how changing the weights and biases changes the overall behavior of the network. Let w^l_jk be the weight for the connection from the kth neuron in the (l−1)th layer to the jth neuron in the lth layer. So, for example, w^3_24 is the weight on the connection from the fourth neuron in the second layer to the second neuron in the third layer of a network.
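The meaning of ∂C/∂w can be checked numerically on a tiny 1-1-1 sigmoid network: the chain-rule (backpropagation) value should match a finite-difference estimate of how fast the cost changes as the weight changes. This is a generic backpropagation illustration with made-up weights, not GNY-specific code:

```python
import math

def sigmoid(z):
    return 1 / (1 + math.exp(-z))

# Network: a1 = sigmoid(w1*x + b1), a2 = sigmoid(w2*a1 + b2),
# cost C = 0.5 * (a2 - y)^2.  All values below are arbitrary.
x, y = 1.0, 0.0
w1, b1, w2, b2 = 0.6, 0.9, -0.4, 0.2

def cost(w2_val):
    a1 = sigmoid(w1 * x + b1)
    a2 = sigmoid(w2_val * a1 + b2)
    return 0.5 * (a2 - y) ** 2

# Backpropagation: chain rule gives dC/dw2 = (a2 - y) * a2*(1 - a2) * a1.
a1 = sigmoid(w1 * x + b1)
a2 = sigmoid(w2 * a1 + b2)
analytic = (a2 - y) * a2 * (1 - a2) * a1

# Finite-difference estimate of the same derivative.
eps = 1e-6
numeric = (cost(w2 + eps) - cost(w2 - eps)) / (2 * eps)

print(abs(analytic - numeric) < 1e-8)  # True
```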

Parquet data flow

GNY’s gradient-descent optimization is deep learning. Deep learning is an advanced form of artificial intelligence and a powerful set of techniques for learning in neural networks. Neural networks and deep learning currently provide the best solutions to many problems in image recognition, speech recognition, and natural language processing. Gny.io is a fusion of deep learning technology onto a blockchain – a distributed, transparent, consensus-based peer-to-peer network that has been shown to be extremely resilient to adversarial attacks.

The blockchain itself creates the neural net. Gny.io is the first to apply this technique on a blockchain. GNY uses a Probabilistic Graphical Model (PGM) approach to deep learning. In the PGM approach, the neural nodes construct a probabilistic graph that defines the relationships between the different variables. The approach uses Monte-Carlo sampling to construct Bayesian-consistent distributions for the variables, then uses deep learning to learn from this synthesized data.
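The PGM idea can be sketched with a toy two-variable graph: Monte-Carlo sampling from assumed conditional probabilities yields a synthetic dataset whose statistics are consistent with the graph, and a learner can then train on that data. The variables and probabilities below are purely illustrative:

```python
import random

random.seed(1)

# Toy probabilistic graph: Rain -> WetGround, with illustrative
# (made-up) conditional probabilities.
P_RAIN = 0.3
P_WET_GIVEN_RAIN = 0.9
P_WET_GIVEN_DRY = 0.1

def sample():
    """Monte-Carlo sample one (rain, wet) pair from the graph."""
    rain = random.random() < P_RAIN
    p_wet = P_WET_GIVEN_RAIN if rain else P_WET_GIVEN_DRY
    return rain, random.random() < p_wet

# Synthesize a training set for a downstream learner.
synthetic = [sample() for _ in range(100_000)]

# The synthesized data reproduces the graph's marginal:
# P(wet) = 0.3*0.9 + 0.7*0.1 = 0.34
p_wet_est = sum(wet for _, wet in synthetic) / len(synthetic)
print(round(p_wet_est, 2))  # about 0.34
```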


A GNY system needs to be sufficiently flexible and agile to adjust to changes in the environment. Adaptive approaches that involve simulation, selection, and amplification of successful strategies are important, and self-learning is a requirement for achieving adaptability. Decoupling of components acts like a firewall between components and helps mitigate against total collapse: damage to an individual component can be tolerated while the integrity of the other components is preserved. In general, a distributed, loosely coupled system has higher survivability than a centralized, tightly coupled system.
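The "decoupling as firewalls" point can be illustrated by running components in isolation, so one component's failure is contained rather than cascading. The node names and functions here are hypothetical:

```python
# Each node runs isolated from the rest: a crash in one is recorded
# and contained, and the remaining nodes keep working.
def run_all(nodes, payload):
    results = {}
    for name, fn in nodes.items():
        try:
            results[name] = fn(payload)
        except Exception as exc:
            results[name] = f"failed: {exc}"  # damage limited to this node
    return results

nodes = {
    "etl": lambda data: [d * 2 for d in data],
    "broken": lambda data: 1 / 0,  # simulated component failure
    "ml": lambda data: sum(data) / len(data),
}

out = run_all(nodes, [1, 2, 3])
print(out["ml"])  # 2.0 -- the healthy nodes survive the broken one
```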