G. Hinton wants to train Neural Nets with Trillions of Parameters

The focus of the recent Wired article was again the deep learning research being done at Google's X labs. In that same article, G. Hinton mentions that he would like to train neural nets with trillions of parameters:

“I’d quite like to explore neural nets that are a thousand times bigger than that,” Hinton says. “When you get to a trillion [parameters], you’re getting to something that’s got a chance of really understanding some stuff.”

Source: Wired Article

1 comment

  • If anyone else is interested in developing the practical hardware needed to deal with learning trillions of parameters, please get in touch with me via my LinkedIn page or by replying to this message.