Is Reinforcement Learning a Slow Learner?

A prominent AI researcher recently gave a webinar with the ACM (Association for Computing Machinery) expressing dismay at the current performance of AI systems and giving his thoughts on the directions research should take. Yann LeCun, Chief AI Scientist at Facebook and Professor at NYU, gave a talk titled “The Power and Limits of Deep Learning”. Current AI systems have no ability to model the real world. For example, babies learn quite early that a truck that drives off a platform and hovers in the air is unexpected. Current AI systems do not have this ability – after many, many training examples, they might be able to predict this type of behavior for very specific vehicles. “Sure, I know a red fire truck will fall down, but I have no idea what this Prius is going to do. Let’s watch…”

The same problem shows up in the simpler task of image recognition. A human can get the idea of an elephant from a few images, but our most sophisticated image recognition systems need many thousands of training examples to recognize a new object. And even then, they will have difficulty recognizing a different view (an elephant’s rear end? an elephant with its trunk hidden behind a wall?) if they have not specifically been trained on those types of views.

Similarly, Reinforcement Learning (RL), a technique used to train AI systems to do things like play video games at (or above) human levels, is a slow learner. It takes 83 hours of real-time play for an RL system to achieve a level a human player can reach in 15 minutes.
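To make the idea concrete, here is a toy RL sketch (my own illustration, not from the talk): a Q-learning agent on a five-cell corridor that is rewarded only for reaching the rightmost cell. Even this trivial task takes hundreds of episodes of trial and error, which hints at why RL needs so much experience on real games.

```python
import random

random.seed(0)
N_STATES = 5              # cells 0..4; reward only for reaching cell 4
ACTIONS = [0, 1]          # 0 = move left, 1 = move right
ALPHA, GAMMA, EPSILON = 0.5, 0.9, 0.1

Q = [[0.0, 0.0] for _ in range(N_STATES)]

for episode in range(500):
    state = 0
    while state != N_STATES - 1:
        # Epsilon-greedy action selection
        if random.random() < EPSILON:
            action = random.choice(ACTIONS)
        else:
            action = 0 if Q[state][0] > Q[state][1] else 1
        next_state = max(0, state - 1) if action == 0 else state + 1
        reward = 1.0 if next_state == N_STATES - 1 else 0.0
        # Standard Q-learning update
        Q[state][action] += ALPHA * (
            reward + GAMMA * max(Q[next_state]) - Q[state][action])
        state = next_state

# After hundreds of episodes, "move right" dominates in every non-terminal state.
print(all(Q[s][1] > Q[s][0] for s in range(N_STATES - 1)))
```

Five hundred episodes for a five-cell corridor; scale the state space up to an Atari screen and the appetite for experience becomes clear.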

Two basic approaches used in machine learning are (1) supervised learning and (2) unsupervised learning. In supervised learning, an algorithm is trained by showing it an image (or other training example) along with the desired response. “Hello computer. This image is a car. This next image is a bird.” This goes on for millions of images (the ImageNet dataset, used in a lot of benchmark tests, has over 15 million images, and often a subset of over 1 million images is used for training). The training is also repeated over that same set many times (many “epochs”). Unsupervised learning, on the other hand, tries to make sense of the data without any human “supervised” advice. An example is a clustering algorithm that tries to group items into clusters, or groups, so that items within each group are similar to each other in some way.
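As a concrete example of the unsupervised side, here is a minimal k-means clustering sketch (my own illustration): given unlabeled 1-D points, it discovers two group centers with no human-supplied answers at all.

```python
import random

def kmeans(points, k, iters=20, seed=0):
    """Very small 1-D k-means: alternate assign / update steps."""
    rng = random.Random(seed)
    centers = rng.sample(points, k)
    for _ in range(iters):
        # Assign each point to its nearest center...
        clusters = [[] for _ in range(k)]
        for p in points:
            nearest = min(range(k), key=lambda c: (p - centers[c]) ** 2)
            clusters[nearest].append(p)
        # ...then move each center to the mean of its cluster.
        centers = [sum(c) / len(c) if c else centers[i]
                   for i, c in enumerate(clusters)]
    return sorted(centers)

# Two obvious groups (around 0 and around 10) -- and no labels anywhere.
data = [0.1, 0.3, -0.2, 0.0, 9.8, 10.1, 10.3, 9.9]
print(kmeans(data, 2))
```

The algorithm recovers the two groups on its own, which is the sense in which it is “unsupervised.”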

Prof. LeCun’s suggestion is that unsupervised learning, or what he calls self-supervised learning, might provide a better approach. He said “Prediction is the essence of intelligence.” We will see whether computers will be able to generate predictions from just a few examples.


  1. Karen Hao, “The AI technique that could imbue machines with the ability to reason,” MIT Technology Review
  2. Yann LeCun, “The Power and Limits of Deep Learning” (ACM webinar)

Deep Trouble – Neural Networks Easily Fooled

Deep Neural Networks (DNNs) always leave me with a vague uneasy feeling in my stomach.  They seem to work well at image recognition tasks, yet we cannot really explain how they work.  How does the computer know that image is a fish, or a piano, or whatever?  Neural networks were originally modeled after how biological neurons were thought to work, although they quickly became a research area of their own without regard to biological operation.

Well, it turns out that DNNs don’t work at all like human brains.  A paper by Nguyen et al. [1] explores how easy it is to create images that look like nothing (essentially white noise) to a human, and yet are categorized with 99%+ certainty as a cheetah, peacock, etc. by DNNs that perform at human levels on standard image classification benchmarks.  Here is a small sample of images from their paper:

The researchers created the false images with an evolutionary algorithm that modifies existing images through mutation and combination in order to improve on a goal – in this case, finding images that score highly with a DNN classifier (“fooling images”).  They used a couple of methods, both of which worked.  The direct method, illustrated in the top two images, works by directly manipulating the pixels in an image file.  The indirect method, illustrated in the bottom two images, generates the pixels from a series of formulas, and the formulas were then evolved.  The idea was to create images that looked more like natural images and less like random noise.  In both cases, the researchers found it easy to come up with images that fooled the DNN.
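The flavor of that evolutionary search can be sketched in a few lines.  This is a toy stand-in, not the authors’ code: `toy_confidence()` below is an invented placeholder for a real DNN’s confidence score (here it just rewards a hidden pixel pattern), and the loop simply keeps any random mutation that raises the score.

```python
import random

random.seed(0)
SIZE = 64
# Hidden pattern our fake "DNN" happens to like (pure invention for the demo).
TARGET = [random.random() for _ in range(SIZE)]

def toy_confidence(img):
    # Invented stand-in for a DNN's class confidence: closer to the hidden
    # pattern -> higher score.  A real DNN is far messier, but the loop
    # below treats the scorer the same way: as a black box to maximize.
    err = sum((p - t) ** 2 for p, t in zip(img, TARGET)) / SIZE
    return 1.0 - err

def mutate(img, step=0.1, rate=0.2):
    # Randomly nudge ~20% of the pixels, clamped to [0, 1].
    return [min(1.0, max(0.0, p + random.uniform(-step, step)))
            if random.random() < rate else p
            for p in img]

img = [random.random() for _ in range(SIZE)]   # start from pure noise
best = start = toy_confidence(img)
for _ in range(3000):
    child = mutate(img)
    score = toy_confidence(child)
    if score > best:   # keep a mutation only if the black box scores it higher
        img, best = child, score

print(f"confidence: {start:.3f} (random noise) -> {best:.3f} (evolved noise)")
```

The evolved image still looks like noise to us, but the scorer is now very confident about it – the same asymmetry the paper exploits against real DNNs.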

Their results also seemed robust.  They performed various trials with random starting points, and they even added their fooling images to the training sets, so as to warn the DNN that these were incorrect.  Even after doing that, they were still able to find other fooling images that were misclassified by the new “improved” DNN.  They repeated this process as many as 15 times, all to no avail.  The DNNs were still easily fooled.

As the authors point out, this ability of DNNs to be fooled has some serious implications for safety and security, for example in the area of self-driving cars.  For real world results, see the Self-driving car hack.

There is something going on here that we do not fully understand.  Researchers are starting to look into what features a DNN is really considering – which may help us to improve or alter the game for image recognition. Until then, pay attention to that pit in your stomach.


  1. A. Nguyen, J. Yosinski, and J. Clune, “Deep Neural Networks are Easily Fooled: High Confidence Predictions for Unrecognizable Images,” Computer Vision and Pattern Recognition (CVPR ’15), IEEE, 2015

Building a Multiple GPU Computer for Grid Computing

Computer with three GPUs running


Want to search for evidence of extraterrestrial life? Want to find your very own prime number?  One way to do this is to join a grid computing project.  The Berkeley Open Infrastructure for Network Computing (BOINC) is a framework for people creating grid computing projects.

A video from Matt Parker and Numberphile, 383 is cool, sparked huge interest in the PrimeGrid project on BOINC.  You can join BOINC and PrimeGrid with just about any computer (I will give some instructions in a later post), but for doing real supercomputing, you will want to use a GPU (Graphics Processing Unit).  GPUs have traditionally been used for gaming, but as scientists realized the computing power inherent in the GPU, uses outside gaming started to proliferate.  A single GPU can perform some calculations 50-200 times faster than a single CPU.

This post will discuss how to build a computer with 3 powerful GPUs that is still climbing up the contributors’ rank list at PrimeGrid – I hope to make it to the top 3.

To incorporate multiple GPUs in a single case, there are a number of things to worry about that don’t come up in a basic CPU-and-motherboard build:

  • CPU – CPUs have a limited number of PCIe “lanes” available that have to be split up among all the GPUs.  The latest generation (7th) of Intel processors, such as the i7-7700, have 16 lanes of PCIe, so they could run one GPU at x8 and two at x4.  We want to go faster, so we picked an i7-6850K, which, while an older generation and lower clock speed, has 40 lanes of PCIe; thus, we will be able to run 3 GPUs at x16, x16, and x8.
  • Motherboard – you need to be able to fit all the GPUs into PCIe slots.  GPUs are typically double wide (take up 2 slot spaces).  You want enough PCIe lanes to keep each GPU humming at full capacity.  If you are going to spend all those dollars on top-end GPUs, you want to fully utilize them.  The exact number of lanes you need will depend on the project you are working on.  For this computer build, I decided I wanted to keep as many GPUs running at x16 (full bandwidth) as I could – experiments later could help determine if they are all needed.  Of course, the motherboard must be socket-compatible with the CPU you choose.  For the 6850K we will need a socket LGA 2011-v3 motherboard.
  • Case – the case needs to hold the motherboard you chose (obviously) and have lots of fans to dissipate all the heat that will be generated.  It will also be nice to have extra room to maneuver as we fit all the cards, fans, and the power supply inside.
  • Power supply – you need enough power for all the parts you have selected.  Each GPU will use up to about 200W (check the specs).  Since we will be using lots of power and want to keep operating costs down, efficiency is worth paying for, and running a power supply at a smaller fraction of its capacity (maybe 50%) generally improves efficiency as well.  Also consider whether you will be adding more components later, as taking the machine apart to upgrade the power supply might not be convenient.
  • GPUs – the muscles of the machine.  Nvidia-based GPUs are popular.  The most powerful one currently is the GTX 1080, but since I already had a GTX 1070, I decided to stick with that model and add 2 more.  You can check with your particular project to see what the cost/performance tradeoff is for various GPUs.  The ASUS cards have a design that sends the airflow out the back of the case, where the IO connectors are, which seems better in a multi-GPU setup than the EVGA design that blows hot air onto the GPU next to it.
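The lane and wattage arithmetic above can be sanity-checked with a few lines.  These are my own back-of-the-envelope numbers – substitute the specs of your actual parts:

```python
# PCIe lane budget: can the CPU feed three GPUs at the desired link widths?
CPU_LANES = 40                  # i7-6850K
gpu_links = [16, 16, 8]         # desired link width per GPU
assert sum(gpu_links) <= CPU_LANES, "not enough PCIe lanes"

# Rough power budget (estimates -- check the specs of your own parts).
GPU_WATTS = 200                 # per-GPU draw under load
CPU_WATTS = 140                 # i7-6850K TDP
OTHER_WATTS = 100               # drives, fans, motherboard, margin
PSU_WATTS = 1200                # Thermaltake Grand Platinum

load = len(gpu_links) * GPU_WATTS + CPU_WATTS + OTHER_WATTS
print(f"estimated load: {load} W ({load / PSU_WATTS:.0%} of PSU capacity)")
```

With these numbers the 1200W supply runs well under full capacity, which is good for both headroom and efficiency.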

The Machine

Before we get into building the computer, here are the parts we selected:

  • Case – Corsair 750D Full-tower.  Anyone used to wimpy mid-tower cases will be totally impressed with this monster.
  • CPU – i7-6850K 40 lane processor.  Uses socket LGA-2011
  • Motherboard – EVGA X99 FTW.  This is compatible with the 6850K, and can hold 128GB of RAM
  • RAM – 16GB of 2400 MHz DDR4.  We don’t need huge amounts of memory for the BOINC projects we are running, but your needs might vary.
  • CPU cooler – Hyper 212X Turbo
  • Power supply – Thermaltake Grand Platinum 1200W.  Probably overkill, but better safe than sorry.
  • Hard drive – I just used a 1 TB HDD I had lying around.  GPU computing projects generally don’t need speedy disks, but if you want to boot faster, you could replace it with an SSD (solid state drive).
  • DVD drive – not completely necessary, but makes it easier to boot up and install your operating system
  • Monitor – you will want a monitor to install and configure the OS, even though you will not need it once you are chugging away looking for aliens.  You will also need it to configure the BIOS.  Keep in mind that the motherboard used here does not have built-in graphics, so a VGA monitor has nowhere to plug in; you will need an inexpensive HDMI or DVI monitor.  You will also need a keyboard for setup.
  • GPUs – I used one EVGA GTX1070 and two ASUS GTX1070 Turbo VR Ready editions.  This ASUS edition has the fans that exhaust out the back.
  • OS – I chose Debian Linux.  It’s free, and in the next post I will show how to install it and get the GPUs working on BOINC.

Building It

The order in which you assemble this computer can reduce the hassles, so here is what I found worked for me.

This picture shows the computer with the motherboard and power supply installed, before the CPU cooler and GPUs go in.





  1. Remove the second HDD cage (the one closest to the power supply area); otherwise the 3rd GPU won’t fit
  2. Install the HDD and DVD drive and wire them up
  3. Install the power supply.  This heavy item should go in before the more delicate components to come.
  4. Insert the CPU
  5. Insert the memory (do this before the CPU cooler, because the RAM sits below the heatsink)
  6. Install the CPU cooler
  7. Wire up the fans (do this before the GPUs go in).  One of the front fan’s wires was tucked under the HDD cage and I overlooked it the first time.
  8. Insert the GPUs
  9. Connect the power supply cables to the motherboard and to the GPUs.  Make sure you follow your motherboard and GPU instructions.  You can also add an extra PCIe bus power connector to the motherboard, which seems like a good idea given all the PCIe boards we just installed.

Here is what the finished install looks like.  You can see that we have room for one more GPU, if we want to go all out!

The CPU cooler fans are oriented to blow air towards the back of the case.  This is a must since the front fans and back fan are oriented the same way.  There is a very small, yet positive, clearance between the CPU cooler and the EVGA GPU.

Once everything is wired up and double checked, hook up a DVI/HDMI monitor and keyboard and fire it up.

You should adjust any BIOS settings you need before installing the OS.
Next post: installing BOINC on a multi-GPU computer
