
How to encode date as input in neural network?

Problem Detail: 

I am using neural networks to predict a time series. The question I'm facing now is how do I encode date/time/serial no. of each input set as an input to the neural network?

Should I use 1 of C encoding (used for encoding categories) as described here?

Or Should I just feed it the time (in milliseconds since 1-1-1970)?

Or is feeding it the time unnecessary as long as I feed it the rest of the data chronologically?

Asked By : Shayan RC

Answered By : alto

Neural Networks are not magic. If you treat them like they are and just throw data at them without thinking you're going to have a very bad time.

You need to stop and ask yourself "Is milliseconds since 1970 actually going to be predictive of the event I'm interested in?" The answer you should arrive at immediately is no. Why? For every instance you actually care about (events in the future; the past already happened), the time variable will take on a value greater than any value it took in your training data. Such a variable is very unlikely to help. Even worse, it is likely to cause overfitting (a serious problem for powerful non-linear models like neural networks) if you aren't careful.
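To make the extrapolation point concrete, here is a minimal sketch with hypothetical timestamps: any prediction-time value of a raw epoch-millisecond feature is necessarily larger than everything the model saw during training.

```python
# Hypothetical example: training timestamps (ms since 1970) vs. a future one.
train_ts = [1_600_000_000_000, 1_610_000_000_000, 1_620_000_000_000]
future_ts = 1_700_000_000_000  # any prediction-time timestamp

# The feature value at prediction time lies outside the training range,
# so the model is always forced to extrapolate on this input.
out_of_range = future_ts > max(train_ts)
print(out_of_range)  # True
```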

Now what might make sense is a variable like week of the year or month, which could help you model seasonal or annual effects. I've done some work in agricultural disease prediction where Julian day ended up being a very important variable. Based on this experience, I suspect you'd be better off encoding this type of variable as categorical rather than ordinal, though your experience may vary. Notice that month or week of the year are repeatable events that you are likely to see many times in your training data, and it is possible to explain why such a variable could impact a financial outcome. Contrast this with milliseconds since 1970, which is just a monotonically increasing value.
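Treating month as categorical is the 1-of-C encoding the question mentions. A minimal sketch (the helper name is mine, not from the original):

```python
from datetime import date

def month_one_hot(d):
    """Encode the month of a date as a 1-of-12 (one-hot) categorical vector."""
    vec = [0] * 12
    vec[d.month - 1] = 1
    return vec

print(month_one_hot(date(2013, 7, 15)))
# [0, 0, 0, 0, 0, 0, 1, 0, 0, 0, 0, 0]
```

The point of the one-hot form is that the network learns a separate weight per month, so it isn't forced to assume December (12) is "larger than" January (1) the way an ordinal encoding would.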

Lastly, from your statement "Or is feeding it the time unnecessary as long as I feed it the rest of the data chronologically?" it sounds like you might not have a very good grasp of how neural networks work. With a standard feedforward neural network, the order in which you feed it your data has no impact on the predictions. Order may impact training if you're using stochastic or mini-batch gradient descent, but that is only an artifact of the iterative (as opposed to batch) training method. If you want to model temporal dependence with a neural network, you'll need something like a sliding window or a recurrent neural network.
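The sliding-window idea can be sketched as follows: turn a univariate series into (recent history, next value) pairs, so a plain feedforward network sees the temporal context explicitly as inputs. The function name and window size here are illustrative, not from the original.

```python
def sliding_windows(series, window):
    """Build (input window, next value) training pairs from a series,
    so a feedforward network can condition on the last `window` values."""
    pairs = []
    for i in range(len(series) - window):
        pairs.append((series[i:i + window], series[i + window]))
    return pairs

xs = [1, 2, 3, 4, 5]
print(sliding_windows(xs, 3))
# [([1, 2, 3], 4), ([2, 3, 4], 5)]
```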

Best Answer from Stack Exchange

Question Source : http://cs.stackexchange.com/questions/14634
