# VS298 (Fall 06): Suggested projects

• Feedforward vs. recurrent weights. As we discussed in class, a given input-output mapping can be implemented in a neural network using just feedforward weights: ${\displaystyle y=Wx}$, using just recurrent weights: ${\displaystyle \tau \,dy/dt+y=x+My}$, or using both: ${\displaystyle \tau \,dy/dt+y=Wx+My}$. There is probably a trade-off here in terms of minimizing overall wiring length versus settling time - i.e., feedforward networks compute their output instantly but require many (possibly long-range) synapses, while recurrent networks settle more slowly but can implement more complex functions with only local connections. Explore these trade-offs for a particular problem - e.g., implementing an array of Gabor filters in a model of V1.
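As a starting point, the two dynamics above can be simulated and compared directly. The sketch below (in NumPy, with random placeholder weights rather than actual Gabor filters, and simple Euler integration - all assumptions, not part of the project spec) verifies that the recurrent network ${\displaystyle \tau \,dy/dt+y=Wx+My}$ settles to the fixed point ${\displaystyle y^{*}=(I-M)^{-1}Wx}$, which is the mapping an equivalent feedforward network would have to implement in one step:

```python
import numpy as np

rng = np.random.default_rng(0)
n_in, n_out = 20, 10

# Placeholder weights for illustration (a real project would use Gabor filters).
W = rng.standard_normal((n_out, n_in)) * 0.3    # feedforward weights
M = rng.standard_normal((n_out, n_out))         # recurrent weights

# Rescale M so its spectral radius is < 1, guaranteeing a stable fixed point.
rho = np.max(np.abs(np.linalg.eigvals(M)))
M *= 0.5 / rho

x = rng.standard_normal(n_in)

# Purely feedforward: one matrix multiply, instantaneous but needs
# n_out * n_in synapses from input to output.
y_ff = W @ x

# Recurrent network: tau dy/dt = -y + W x + M y, integrated by Euler steps.
tau, dt = 1.0, 0.1
y = np.zeros(n_out)
for _ in range(500):
    y += (dt / tau) * (-y + W @ x + M @ y)

# At the fixed point, y* = (I - M)^{-1} W x: the recurrent dynamics
# implement an effective feedforward map (I - M)^{-1} W.
y_star = np.linalg.solve(np.eye(n_out) - M, W @ x)
print("settled vs. fixed point, max error:", np.max(np.abs(y - y_star)))
```

From here one could count the nonzero (or local) synapses needed in each scheme and measure the number of time steps required to settle within a given tolerance, making the wiring-vs-speed trade-off quantitative.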