
Using a combination of spatial and non-spatial inputs for convolutional neural networks

Problem Detail: 

I'm working on training a game AI with deep reinforcement learning, where the agent acts based on pixel input and some additional state information.

Naturally, I'm using a convolutional neural network to deal with the pixel information, which has been working well so far. However, I still have additional information available, such as numeric values associated with current health and ammo.

I know it must be possible to create a network architecture to take advantage of both the spatial information provided by the screen buffer as well as non-spatial information such as the game states I mentioned earlier. Is there any way to create a neural net architecture that can handle both spatial and non-spatial state?

Asked By : XtremeCheese
Answered By : D.W.

Yes. One way is to start with several convolutional layers and end with one or two fully connected layers. You can feed the other features as additional inputs to the first of those fully connected layers, by concatenating them with the flattened convolutional output. This is basically a serial composition: convolutional layers followed by fully connected layers.
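A minimal sketch of this architecture in PyTorch, assuming an 84x84 grayscale screen buffer and two extra scalars (health, ammo); the layer sizes here follow a common DQN-style layout and are illustrative, not prescribed by the answer:

```python
import torch
import torch.nn as nn

class SpatialNonSpatialNet(nn.Module):
    """CNN over the screen buffer; non-spatial features (health, ammo)
    are concatenated into the first fully connected layer."""

    def __init__(self, num_extra_features=2, num_actions=4):
        super().__init__()
        # Convolutional trunk for an 84x84 single-channel screen buffer
        # (adjust kernel/stride sizes to your actual input).
        self.conv = nn.Sequential(
            nn.Conv2d(1, 32, kernel_size=8, stride=4), nn.ReLU(),
            nn.Conv2d(32, 64, kernel_size=4, stride=2), nn.ReLU(),
            nn.Conv2d(64, 64, kernel_size=3, stride=1), nn.ReLU(),
            nn.Flatten(),
        )
        conv_out = 64 * 7 * 7  # flattened size for an 84x84 input
        # The first fully connected layer sees conv features + extra state.
        self.fc = nn.Sequential(
            nn.Linear(conv_out + num_extra_features, 512), nn.ReLU(),
            nn.Linear(512, num_actions),
        )

    def forward(self, screen, extra):
        x = self.conv(screen)             # (batch, conv_out)
        x = torch.cat([x, extra], dim=1)  # append health, ammo, ...
        return self.fc(x)

net = SpatialNonSpatialNet()
screen = torch.randn(8, 1, 84, 84)  # batch of screen buffers
extra = torch.randn(8, 2)           # batch of [health, ammo] values
q_values = net(screen, extra)
print(q_values.shape)  # torch.Size([8, 4])
```

The key design point is the `torch.cat` in `forward`: the spatial input goes through the convolutional trunk alone, and the non-spatial values only join at the fully connected stage, where spatial structure no longer matters.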

There are other options for network architectures, but that would be a reasonable one to try first. The choice of architecture is an art, and depends partly on how you think the network might learn from the features, and on how the value of one feature might affect what you'd want to look for in the others. The architecture suggested above works well if you want to extract essentially the same kinds of information from the pixel input regardless of the value of health, ammo, etc. If you think health, ammo, etc. would dramatically change what the network should extract from the pixel input, other architectures are worth considering.

Best Answer from StackOverflow




