One Day Workshop on Neural Networks

Organised by ACM Trivandrum




August 27, 2016 (Saturday, 9.30am – 4.00pm)


Venue: Indian Institute of Information Technology and Management-Kerala (IIITM-K)

Technopark, Trivandrum




Dr. Sebastian Basterrech

Department of Computer Science

Faculty of Electrical Engineering and Computer Science

VSB-Technical University of Ostrava, Ostrava, Czech Republic

About Neural Networks


Neural Networks (NNs) are among the most useful and popular computational models in the areas of Machine Learning and Artificial Intelligence. When NNs are applied to Machine Learning problems, a numerical optimization problem arises, which consists of minimizing a non-convex function over a multidimensional space. Several algorithms based on the first-order derivative of this non-convex function (gradient-type algorithms) have been applied successfully for training Feedforward NNs (FNNs). Even though these methods can work well for training FNNs, they can fail in the case of networks with recurrences (Recurrent Neural Networks, RNNs), as well as in the case of networks with many layers (Deep Networks). The gradient can become unstable in large and recurrent networks; this fundamental drawback of gradient-type algorithms is known as the vanishing and exploding gradient problem. Much effort has been devoted to developing algorithms for optimizing RNNs. At the beginning of the 2000s, a paradigm named Reservoir Computing (RC) was introduced, which avoids the problems of gradient-based learning algorithms in RNNs. RC models have been applied successfully to temporal learning tasks, even though they also present well-identified drawbacks. Recently, a new method has been introduced for training deep and recurrent NNs under the name of Hessian-free (HF) optimization, and it has shown success on deep learning problems. This workshop aims to provide an overview of neural networks.
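The instability mentioned above can be seen in a few lines of NumPy. The following toy sketch (an illustration for this announcement, not part of the workshop material) backpropagates a gradient through a linear recurrent network: the gradient norm scales roughly like the spectral radius of the recurrent weight matrix raised to the number of time steps, so it either vanishes or explodes.

```python
import numpy as np

# Toy illustration of the vanishing / exploding gradient problem.
# In a linear RNN, backpropagating through T steps multiplies the gradient
# by W^T repeatedly; its norm shrinks or grows geometrically with the
# spectral radius of W.
rng = np.random.default_rng(0)

def gradient_norm_after(steps, spectral_radius, size=20):
    W = rng.standard_normal((size, size))
    # Rescale W so its largest eigenvalue magnitude equals spectral_radius.
    W *= spectral_radius / np.abs(np.linalg.eigvals(W)).max()
    g = np.ones(size)            # stand-in for an initial gradient
    for _ in range(steps):
        g = W.T @ g              # backpropagate through one time step
    return np.linalg.norm(g)

print(gradient_norm_after(200, 0.9))  # tiny: the gradient has vanished
print(gradient_norm_after(200, 1.1))  # huge: the gradient has exploded
```

Gradient clipping, careful initialization, and the reservoir and second-order methods covered in the workshop are all responses to this phenomenon.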


Tentative outline



  • Introduction to NNs: historical motivations, biological description, revisiting interesting classic literature
  • Different types of NNs (FNNs, RNNs, etc.)
  • Stochastic NNs: a brief introduction to Random Neural Networks
  • Optimization algorithms: first-order methods
  • The vanishing and exploding gradient problem
  • Optimization algorithms: second-order methods
  • Reservoir computing paradigm
  • Quasi-Newton optimization
  • Metaheuristic techniques for improving NN performance
  • Applications to real-world problems:
      • Quality of voice over the Internet
      • Internet traffic prediction
      • Weather forecasting
      • Brain-computer interfaces
      • Emotion recognition
  • A discussion of new trends and challenges in the area for the near future
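As a taste of the reservoir computing topic in the outline, the sketch below implements a minimal echo state network: a fixed random recurrent "reservoir" whose only trained part is a linear readout fitted by ridge regression. All sizes and scales here (100 units, spectral radius 0.9, the sine toy task) are illustrative choices, not prescriptions from the workshop.

```python
import numpy as np

# Minimal echo state network (reservoir computing) sketch.
# The recurrent weights W and input weights W_in stay fixed and random;
# only the linear readout W_out is trained, via ridge regression.
rng = np.random.default_rng(1)

n_in, n_res = 1, 100
W_in = rng.uniform(-0.5, 0.5, (n_res, n_in))
W = rng.standard_normal((n_res, n_res))
W *= 0.9 / np.abs(np.linalg.eigvals(W)).max()   # spectral radius < 1

def run_reservoir(u):
    """Collect reservoir states for an input sequence u of shape (T, n_in)."""
    x = np.zeros(n_res)
    states = []
    for u_t in u:
        x = np.tanh(W @ x + W_in @ u_t)
        states.append(x)
    return np.array(states)

# Toy task: one-step-ahead prediction of a sine wave.
t = np.arange(400)
u = np.sin(0.1 * t)[:, None]
X = run_reservoir(u[:-1])
y = u[1:, 0]

washout = 50          # discard the initial transient states
ridge = 1e-6          # regularization for the readout fit
A = X[washout:]
W_out = np.linalg.solve(A.T @ A + ridge * np.eye(n_res), A.T @ y[washout:])

pred = A @ W_out
print("mean squared error:", np.mean((pred - y[washout:]) ** 2))
```

Because training reduces to a linear least-squares problem, there is no backpropagation through time at all, which is how reservoir computing sidesteps the vanishing and exploding gradient problem.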


Biography of the Speaker


Sebastián Basterrech obtained his Ph.D. in Computer Science in November 2012 from the University of Rennes I, Bretagne, France. During 2009-2012 he held a doctoral fellowship from INRIA, France. He received an M.A. in Computer Arts ("Arts et technologies numériques") from the University of Rennes II, Bretagne, France. In addition, he received an M.Sc. in Applied Mathematics in 2008 from Aix-Marseille. From June 2013 until December 2015, he worked as a postdoctoral researcher at the National Supercomputing Center and VŠB-Technical University of Ostrava, Czech Republic. Currently, he is a member of the research staff of the Department of Computer Science at the Faculty of Electrical Engineering and Computer Science, VŠB-Technical University of Ostrava. S. Basterrech is a TC member of the international IEEE SMC Soft Computing Society. He has published more than 30 articles and book chapters in international journals and books. His main research topics are neural computation, human-computer interfaces, dynamical systems, and bio-inspired methods.


How to Register


Target Audience: Faculty members from institutions approved by AICTE & UGC, and individuals working in industry or R&D organizations. Ph.D., M.Tech, M.Sc. (CS), and final-year B.Tech CSE/ECE/IT students are also eligible.


Registration Fee:  Rs. 250                Mode of Payment: By Cash  (Spot registration at the venue on August 27, 2016, 9am)


Online Form:       (Complete the online form before Aug. 19, 2016)


Selection Process: Shortlisted candidates will be informed by e-mail by Aug 23, 2016.