Biological learning curves outperform existing ones in artificial intelligence algorithms

Recently, deep learning algorithms have outperformed human experts in various tasks across several domains; however, their characteristics are distant from current knowledge of neuroscience. The simulation results of biological learning algorithms presented herein outperform state-of-the-art optimal learning curves in supervised learning of feedforward networks. The biological learning algorithms comprise asynchronous input signals with decaying input summation, weight adaptation, and multiple outputs for each input signal. In particular, the generalization error for such biological perceptrons decreases rapidly with an increasing number of examples and is independent of the input size. This is achieved using either synaptic learning or solely through dendritic adaptation with a mechanism of swinging between reflecting boundaries, without learning steps. The proposed biological learning algorithms outperform the optimal scaling of the learning curve in a traditional perceptron. They also yield considerable robustness to disparity between the weights of two networks with very similar outputs in biological supervised learning scenarios. The simulation results indicate the potency of neurobiological mechanisms and open opportunities for developing a superior class of deep learning algorithms.
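
The article's exact update rules are not reproduced in this summary, but the ingredients named in the abstract can be illustrated with a toy model. Below is a minimal sketch, assuming a teacher-student perceptron in which inputs arrive asynchronously and decay exponentially before being summed; the time constant, learning rate, and sign-based correction step are illustrative assumptions, not the algorithm from the paper.

    import numpy as np

    # Toy sketch only: asynchronous inputs with decaying input summation feeding
    # a perceptron that learns from a teacher network's labels.
    rng = np.random.default_rng(0)
    n_inputs, tau, lr = 100, 5.0, 0.05
    w = rng.normal(0.0, 0.1, n_inputs)        # student weights
    teacher = rng.normal(0.0, 1.0, n_inputs)  # teacher weights (label source)

    for _ in range(1000):
        x = rng.normal(0.0, 1.0, n_inputs)                 # one input pattern
        arrival = rng.uniform(0.0, 10.0, n_inputs)         # asynchronous arrival times
        decay = np.exp(-(arrival.max() - arrival) / tau)   # earlier inputs decay more
        summed = x * decay                                 # decaying input summation
        y = np.sign(teacher @ summed)                      # label from the teacher
        if np.sign(w @ summed) != y:                       # perceptron-style correction
            w += lr * y * summed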


read full article: https://www.nature.com/articles/s41598-019-48016-4

Deep neural network chip from Intel®

Prototype and deploy deep neural network (DNN) applications smarter and more efficiently with a tiny, fanless, deep learning development kit designed to enable a new generation of intelligent devices.

The new, improved Intel® Neural Compute Stick 2 (Intel® NCS 2) features Intel’s latest high-performance vision processing unit: the Intel® Movidius™ Myriad™ X VPU. With more compute cores and a dedicated hardware accelerator for deep neural network inference, the Intel® NCS 2 delivers up to eight times the performance boost compared to the previous generation Intel® Movidius™ Neural Compute Stick (NCS).
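
The product page excerpt doesn't cover the software workflow; deployment to the NCS 2 typically goes through Intel's OpenVINO™ toolkit, where a TensorFlow or Caffe model is first converted to an Intermediate Representation and then loaded onto the "MYRIAD" device. A rough sketch using the pre-2022 OpenVINO Python API is shown below; the model and image file names are placeholders.

    import cv2
    from openvino.inference_engine import IECore

    # Load a model already converted by the OpenVINO Model Optimizer
    # (model.xml / model.bin are placeholder file names).
    ie = IECore()
    net = ie.read_network(model="model.xml", weights="model.bin")
    exec_net = ie.load_network(network=net, device_name="MYRIAD")  # NCS 2

    input_blob = next(iter(net.input_info))
    n, c, h, w = net.input_info[input_blob].input_data.shape

    # Prepare a placeholder image in NCHW layout and run inference on the stick.
    image = cv2.imread("test.jpg")
    blob = cv2.resize(image, (w, h)).transpose(2, 0, 1).reshape(n, c, h, w)
    result = exec_net.infer(inputs={input_blob: blob})
    print(result)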

Technical Specifications

  • Processor: Intel® Movidius™ Myriad™ X Vision Processing Unit (VPU)
  • Supported frameworks: TensorFlow* and Caffe*
  • Connectivity: USB 3.0 Type-A
  • Dimensions: 2.85 in. x 1.06 in. x 0.55 in. (72.5 mm x 27 mm x 14 mm)
  • Operating temperature: 0° C to 40° C
  • Compatible operating systems: Ubuntu* 16.04.3 LTS (64 bit), CentOS* 7.4 (64 bit), and Windows® 10 (64 bit)

source: https://software.intel.com/en-us/neural-compute-stick

Contextual Chatbots with Tensorflow

In conversations, context is king! We'll build a chatbot framework using TensorFlow and add some context handling to show how this can be approached.

Ever wonder why most chatbots lack conversational context?

How is this possible given the importance of context in nearly all conversations?

We’re going to create a chatbot framework and build a conversational model for an island moped rental shop. The chatbot for this small business needs to handle simple questions about hours of operation, reservation options and so on. We also want it to handle contextual responses such as inquiries about same-day rentals. Getting this right could save a vacation!

We’ll be working through 3 steps:

  • We’ll transform conversational intent definitions to a Tensorflow model
  • Next, we will build a chatbot framework to process responses
  • Lastly, we’ll show how basic context can be incorporated into our response processor

We’ll be using tflearn, a layer above tensorflow, and of course Python. As always we’ll use iPython notebook as a tool to facilitate our work.

We’ll be using tflearn, a layer above tensorflow, and of course Python. As always we’ll use iPython notebook as a tool to facilitate our work.  …. 

Full Source: https://chatbotsmagazine.com/contextual-chat-bots-with-tensorflow-4391749d0077

ChatterBot


ChatterBot is a Python library that makes it easy to generate automated responses to a user’s input. ChatterBot uses a selection of machine learning algorithms to produce different types of responses. This makes it easy for developers to create chat bots and automate conversations with users. For more details about the ideas and concepts behind ChatterBot see the process flow diagram.

An example of typical input would be something like this:

user: Good morning! How are you doing?
bot: I am doing very well, thank you for asking.
user: You're welcome.
bot: Do you like hats?

Language Independence

The language-independent design of ChatterBot allows it to be trained to speak any language. Additionally, the machine-learning nature of ChatterBot allows an agent instance to improve its own knowledge of possible responses as it interacts with humans and other sources of informative data.

How ChatterBot Works

ChatterBot is a Python library designed to make it easy to create software that can engage in conversation.

An untrained instance of ChatterBot starts off with no knowledge of how to communicate. Each time a user enters a statement, the library saves the text that they entered and the text that the statement was in response to. As ChatterBot receives more input, the number of responses it can reply with, and the accuracy of each response in relation to the input statement, both increase.

The program selects the closest matching response by searching for the known statement that most closely matches the input; it then chooses a response from the selection of known responses to that statement.
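
To make this flow concrete, here is a minimal usage sketch; the trainer import reflects ChatterBot 1.x (older releases used bot.set_trainer(...) instead), and the training lines are just sample data.

    from chatterbot import ChatBot
    from chatterbot.trainers import ListTrainer

    # An untrained bot; by default, statements are stored in a local SQLite database.
    bot = ChatBot("ExampleBot")

    # Each statement is saved along with the statement it was in response to.
    trainer = ListTrainer(bot)
    trainer.train([
        "Good morning! How are you doing?",
        "I am doing very well, thank you for asking.",
        "You're welcome.",
        "Do you like hats?",
    ])

    # The closest matching known statement is found, then one of its known
    # responses is returned.
    print(bot.get_response("Good morning! How are you doing?"))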

https://chatterbot.readthedocs.io