Neuton: A New Disruptive Neural Network Framework Far More Effective Than Any Other Framework

Bell Integrator

Much of the progress made in the AI field is owed to deep learning neural networks. Neuton prides itself on being a new framework that is far more compact and faster.

It also requires less training and fewer skills than the frameworks offered by companies like Facebook, Google, and AWS.

Deep learning is one of the most revolutionary technologies today. A framework like Neuton being launched by a company outside the FAANG group (Facebook, Apple, Amazon, Netflix, and Alphabet's Google) is sure to elicit some speculation.

Bell Integrator claims that its new neural network framework, Neuton, surpasses the other frameworks and non-neural algorithms currently on the market.

According to released benchmarks, Neuton serves as an AutoML solution whose resulting models are both self-growing and self-learning.

What's more, Bell Integrator says that Neuton is easy to use, eliminating the need for a specialized AI background.

In the past few days, we have witnessed a new release of PyTorch, one of the leading neural network frameworks, as well as the introduction of fast.ai. Furthermore, MLflow, a meta-framework developed by Databricks, is getting close to version 1.0.

Bell Integrator is the company behind Neuton. It boasts more than 2,500 employees spread across 10 locations, and lists names such as Societe Generale, Deutsche Bank, Citibank, Juniper, Cisco, Ericsson, and CenturyLink among its clientele.

According to Bell Integrator's CTO Blair Newman, Neuton was delivered by a team of scientists with over 700 years' worth of combined experience as scientific researchers.

These experts are known for having solved advanced algorithmic issues in various fields such as blockchain, internet of things, video analytics, machine learning, artificial intelligence, augmented reality, and neural networks.

Although we can only speculate on how Neuton was developed, its claimed features are nothing short of impressive. Few people, even machine learning enthusiasts, were aware of this framework.

According to Bell Integrator, Neuton is both self-growing and self-learning. What this means is that it eliminates the need to work with neurons and layers directly.

Instead, all you have to do is provide a dataset, and a model will be developed automatically. The model also requires fewer training samples.

Alongside the benchmarks, you can now download such models. They are claimed to be 10-100 times faster and smaller than those built with existing non-neural algorithms and frameworks, while also utilizing 10-100 times fewer neurons.

Neuton claims higher accuracy, far lower relative and absolute errors on the validation samples, and AutoML capabilities. It requires only basic technical skills.

If all that does not puzzle you, Neuton's FAQ mentions that the initial release will be devoid of RNN/CNN (recurrent/convolutional neural network) support.

With all that in mind, the question is whether Neuton is a neural network at all. Even though RNNs/CNNs are not included in the first release of the framework, Neuton is a neural network, one with the potential to solve classification and regression problems.

It does require training, though with fewer training samples than other algorithms.

The resulting models come in the open HDF5 format, which most modern frameworks and languages can consume: Keras, Python, Java, and others all support HDF5.
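To illustrate why the open HDF5 format matters for portability, here is a minimal sketch using the `h5py` library. The dataset names (`dense_1/weights`, `dense_1/bias`) are hypothetical stand-ins; Neuton's actual file layout is not public.

```python
# Write and read a toy "model" file in HDF5 format with h5py.
# The layer/dataset names here are illustrative, not Neuton's real schema.
import h5py
import numpy as np

# Write a toy weights file, as a training tool might.
with h5py.File("toy_model.h5", "w") as f:
    f.create_dataset("dense_1/weights",
                     data=np.arange(6, dtype="float32").reshape(2, 3))
    f.create_dataset("dense_1/bias", data=np.zeros(3, dtype="float32"))
    f.attrs["framework"] = "example"

# Read it back, as a downstream framework in any language could.
with h5py.File("toy_model.h5", "r") as f:
    weights = f["dense_1/weights"][...]
    bias = f["dense_1/bias"][...]

print(weights.shape, bias.shape)  # (2, 3) (3,)
```

Because HDF5 is a self-describing binary container, the same file can be opened from Java, C, or any other language with HDF5 bindings, which is what makes the model format framework-agnostic.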

Newman said: “Neuton is an independent method of machine learning; it is our proprietary development. Neuton’s workflow is very simple and consists of a few steps. In the first step, a user uploads their data. In the second, they specify which data to use for training and which for validation. In the third, they select a metric for their task and the criteria for stopping training. After training is complete, the user can verify the model’s accuracy by forecasting results on unseen data. In the final step, the user chooses how to use the model.

We provide the option of downloading the model or hosting it in the cloud. For large enterprise clients who do not feel secure uploading their data to a public cloud, we roll out the model on premises.

Neuton’s model can be used either as a standalone solution or to build an ensemble of various algorithms. Models based on Neuton can automatically be rolled out as a REST-API service in one click. They can also be downloaded with a code sample for local use in Python.”
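A one-click REST-API deployment implies that a deployed model is scored over HTTP. The sketch below shows what such a call could look like from Python using only the standard library; the endpoint URL and JSON payload schema are assumptions for illustration, not Neuton's documented interface.

```python
# Hypothetical sketch of calling a model deployed as a REST API.
# The URL and payload format below are assumed, not taken from Neuton docs.
import json
import urllib.request

def build_request(url, features):
    """Package one feature row as a JSON POST request."""
    body = json.dumps({"features": features}).encode("utf-8")
    return urllib.request.Request(
        url, data=body, headers={"Content-Type": "application/json"}
    )

req = build_request("https://example.com/api/v1/predict",
                    [5.1, 3.5, 1.4, 0.2])
print(req.get_method(), req.get_full_url())
# To actually score, you would send it and parse the JSON response:
#   with urllib.request.urlopen(req) as resp:
#       prediction = json.load(resp)
```

The network call itself is left commented out, since the endpoint is hypothetical; the point is that a REST-deployed model needs nothing more than an HTTP client on the consumer side.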

Newman explained: “Thanks to our proprietary algorithm and disruptive machine learning technology, models built on Neuton are super compact, meaning that they consist of relatively few neurons and coefficients. The actual algorithm is our IP, therefore, we cannot disclose it. Neuton results were compared against Caffe2, TensorFlow+Keras, CNTK, Torch, Theano. Those networks showed very similar results.

“The results are also reproducible by third parties, and the trained models, together with the datasets and TensorFlow configurations used, can be downloaded from the website for offline use. We have also demonstrated features of Neuton’s future releases. We conducted a few experiments showing that using Neuton’s models in an ensemble dramatically improves on the results of a single model. We compared these results with some traditional algorithms that are themselves ensembles (xgb, random forest, etc.),” he said.
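The claim above is about ensembling. As a toy illustration of the general idea (not Neuton's actual method, which is proprietary), here is the simplest form of an ensemble: averaging the predictions of several regression models. The component "models" below are stand-in functions.

```python
# Toy illustration of a prediction-averaging ensemble.
# The component models are stand-ins, not Neuton models.
def ensemble_predict(models, x):
    """Average the outputs of several regression models on input x."""
    preds = [m(x) for m in models]
    return sum(preds) / len(preds)

# Three stand-in "models" whose individual errors partly cancel out.
models = [
    lambda x: 2 * x,        # unbiased
    lambda x: 2 * x + 0.3,  # overestimates
    lambda x: 2 * x - 0.3,  # underestimates
]
print(ensemble_predict(models, 1.0))  # 2.0
```

Gradient boosting (xgb) and random forests, which the quote mentions, are themselves ensembles built this way from many weak learners, which is why they make a natural baseline for the comparison.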

We cannot say whether Neuton uses deep learning or not, since we are not familiar with its internal architecture. However, this does not in any way change how impressive it sounds.

Newman said: “Unlike Neuton, PyTorch and Fast.ai require some coding and knowledge of neural network architectures, which means that our target audience is much wider and model setup time shorter, regardless of the level of expertise. We also offer our users all the necessary infrastructure elements, including storage for user data and models, virtual machines with GPUs for training, and virtual machines for rolling out in the cloud, while simultaneously empowering enterprise customers to use Neuton on their premises where desired. From the performance and effectiveness perspective, the new libraries mentioned above remain much the same and do not affect our benchmarks.”

He added: “Neuton makes AI available to everyone and augments human ingenuity, which will have a transformative impact on the economy, every industry, scientific breakthroughs, and the quality of everyday life for current and future generations through wider usage and adoption of artificial intelligence. We believe that intelligence makes the world a better place.”

Source: ZDNet
